Scratching My Head Over Gigabit Wireless

Over the last few weeks I have seen numerous announcements of companies that plan to deliver gigabit wireless speeds using unlicensed spectrum. For example, RST announced plans to deliver gigabit wireless all over the state of North Carolina. Vivant announced plans to do the same in Utah. And I just scratch my head at these claims.

These networks plan to use the 5 GHz portion of the unlicensed spectrum that we have all come to collectively call WiFi. And these firms will be using equipment that meets the new 802.11ac WiFi standard. That technology has the very unfortunate common name of gigabit WiFi, surely coined by some marketing guru. I say unfortunate because in real life it isn't going to deliver speeds anywhere near a gigabit. There are two ways to deploy this technology to multiple customers: either through hotspots like the ones at Starbucks or on a point-to-multipoint basis. Let's look at the actual performance of 802.11ac in these two cases.

There is no doubt that an 802.11ac WiFi hotspot is going to perform better than the current hotspots that use 802.11n. But how much better in reality? A number of manufacturers have tested the new technology in a busy environment, and with multiple users the new 802.11ac looks to be between 50% and 100% better than the older 802.11n standard. That is impressive, but it is nowhere near gigabit speeds.

But let's look deeper at the technology. One of the biggest improvements is that the transmitters can bond multiple WiFi channels into a single data path as wide as 160 MHz. The downside is that there are only five channels in the 5 GHz range, so only a tiny handful of devices can use that much spectrum at the same time. When there are multiple users the channel size automatically steps down until it ends up at the same 40 MHz channels used by 802.11n.

The most important characteristic of 5 GHz in this application is how quickly the signal dies with distance. In a recent test with a Galaxy S4 smartphone, the phone could get 238 Mbps at 15 feet, 193 Mbps at 75 feet, 154 Mbps at 150 feet and very little at 300 feet. This makes the spectrum ideal for indoor applications, but an outdoor hotspot isn't going to carry very far.

So why do they call this gigabit WiFi if the speeds above are all that you can get? The answer is that the hotspot technology can include something called beamforming and can combine multiple data streams to a device (assuming that the device has multiple receiving antennas). In theory one of those streams can deliver 433 Mbps. However, in the real world there are overheads in the data path, and about the fastest speed that has been achieved for a single stream in a lab is about 310 Mbps. Combine three of those (the most that can be combined), and a device that is right next to the hotspot could get around 900 Mbps. But again, the speeds listed above for the Galaxy S4 test are more representative of what can be obtained in a relatively empty environment. Put a bunch of users in the room and the speeds drop from there.
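
Just to sanity-check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. The per-stream rates and the three-stream limit come straight from the paragraph above; everything else is simple multiplication.

```python
# Rough 802.11ac aggregate-speed arithmetic using the figures cited above.
THEORETICAL_PER_STREAM_MBPS = 433   # spec-sheet rate for one spatial stream
LAB_PER_STREAM_MBPS = 310           # roughly the best seen in a lab after overhead
MAX_STREAMS = 3                     # the most streams that can be combined here

theoretical = THEORETICAL_PER_STREAM_MBPS * MAX_STREAMS
realistic = LAB_PER_STREAM_MBPS * MAX_STREAMS

print(f"Marketing math: {theoretical} Mbps")   # ~1,300 Mbps, hence 'gigabit WiFi'
print(f"Best-case lab:  {realistic} Mbps")     # ~930 Mbps, only right next to the hotspot
```

And even that best case assumes a three-antenna device sitting right beside the hotspot in an otherwise empty room.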

But when companies talk about delivering rural wireless they are not talking about hotspots, but about point-to-multipoint networks. How does this spectrum do on those networks? When designing a point-to-multipoint network the engineer has two choices. They can open up the spectrum to deliver the most bandwidth possible, but if they do that, the point-to-multipoint network won't do any better than the hotspot. Or, through techniques known as wave shaping, they can design the whole system to maximize the bandwidth at the furthest point in the network. In the case of 5 GHz, about the best that can be achieved is to deliver just under 40 Mbps out to 3 miles. You can get a larger throughput if you shorten that to one or two miles, but anybody who builds a tower wants to reach as far as they can, so 3-mile networks are what will likely get built.

However, once you engineer for the furthest point, that is the same amount of bandwidth that can be delivered anywhere, even right next to the transmitter. Further, that 40 Mbps is total bandwidth and has to be divided between an upload and a download path. This makes a product like 35 Mbps download and 5 Mbps upload a possibility for rural areas.
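
Here is a minimal sketch of what that shared pool means for an individual subscriber. The 40 Mbps total and the 35/5 split come from the discussion above, while the subscriber count and peak-hour activity ratio are purely hypothetical assumptions for illustration.

```python
# Hypothetical illustration: every subscriber on a sector shares the same
# edge-engineered capacity, no matter how close they are to the tower.
TOTAL_SECTOR_MBPS = 40      # engineered for the furthest point (~3 miles)
DOWNLOAD_MBPS = 35          # download portion of the 35/5 split
SUBSCRIBERS = 50            # assumed subscribers on one sector
ACTIVE_SHARE = 0.2          # assumed fraction of them busy at the peak hour

active_users = SUBSCRIBERS * ACTIVE_SHARE
per_user_at_peak = DOWNLOAD_MBPS / active_users
print(f"Peak-hour download per active user: {per_user_at_peak:.1f} Mbps")  # ~3.5 Mbps
```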

If this is brought to an area that has no broadband it is a pretty awesome product. But this is nowhere near the bandwidth that can be delivered with fiber, or even with cable modems. It’s a nice rural solution, but one that is going to feel really tiny five years from now when homes are looking for 100 Mbps speeds at a minimum.

So it’s unfortunate that these companies are touting gigabit wireless. This technology only has this name because it’s theoretically possible in a lab environment to get that much output to one device. But it creates a really terrible public expectation to talk about selling gigabit wireless and then delivering 35 Mbps, or 1/28th of a gigabit.

More on White Space Wireless

Last July I wrote about the Google database that shows the availability of white space radio spectrum in the US. This is spectrum that has been used for years by UHF television stations. In some rural places it was never used and in others it has been freed up as stations have moved elsewhere.

I've been hearing about this spectrum a lot lately so I thought I'd talk a little more about it. There are now several trials of the spectrum going on in the US. The first test market was the City of Wilmington, NC, which implemented this in its municipal network in 2010. The city uses it to control traffic lights, feed public surveillance cameras and serve other municipal uses. Probably the biggest US test so far is a campus-wide deployment at West Virginia University in Morgantown that launched in July 2013. There look to be only a few dozen of these trials going on worldwide.

So what are the pros and cons of this technology and why isn’t it being deployed more? Consider some of the following:

  • It's not available everywhere. That's why Google and others have put together the maps. Where there are still TV stations using some of the bandwidth, only the unused portion of spectrum is available. Around most major metros, large parts of the spectrum are still in use.
  • The spectrum is still provisional and the FCC has to approve each trial use. I'm not sure why this is taking so long, because the Wilmington test has been going on since 2010 and supposedly has no interference issues. But I guess the FCC is being very cautious about letting WISPs interfere with television signals.
  • We are at that awkward point that happens with every new use of spectrum, where there is equipment that will work with the spectrum, but that equipment won't get really cheap until there is a lot of demand for it. And until manufacturers believe that demand is real, not much happens. It was this equipment cost barrier that killed the use of LMDS and MMDS spectrum in the 90s. There is no equipment on the market yet that would let white space be used directly by laptops, cell phones or tablets. Instead it must feed a traditional WiFi router.
  • One use of the spectrum is that it can make a better hotspot. I don't think most people understand the short distances that can be achieved with hotspots today. A 2.4 GHz WiFi signal can deliver just under 100 Mbps out to about 300 feet. But it dies quickly after that; there may be 30 Mbps left at 600 feet and nothing much after that. If white space receivers were put into laptops, this spectrum could deliver just under 50 Mbps out to 600 feet and 25 Mbps out to 1,200 feet. And there is an additional advantage to white space in that it passes fairly freely through walls and other barriers.
  • The real potential for the spectrum is to extend point-to-multipoint radio systems. With white space you can deliver a little less than 50 Mbps up to about 6 miles from the transmitter. That's easily twice as far as the distances that can be achieved today using other unlicensed spectrum, and a 12-mile circle around a transmitter can make for viable economic returns on an investment. Physics limits this to about 45 Mbps of total bandwidth, meaning that a product of 40 Mbps download and 5 Mbps upload is possible. That is certainly not fiber speeds, but it would be a great rural product (see the sketch after this list for a rough side-by-side of these reach figures). The problem comes in the many places where part of the spectrum is still in use; in those places the radios would have to work around the used spectrum and the speeds would be correspondingly slower.
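
For a rough side-by-side of the reach figures quoted in the bullets above, here is a small Python sketch; the numbers are simply the approximate ones cited in this post.

```python
# Side-by-side of the approximate reach figures cited in the bullets above.
reach = [
    # (technology, distance, rough speed)
    ("2.4 GHz WiFi hotspot",  "300 feet",   "just under 100 Mbps"),
    ("2.4 GHz WiFi hotspot",  "600 feet",   "~30 Mbps"),
    ("White space hotspot",   "600 feet",   "just under 50 Mbps"),
    ("White space hotspot",   "1,200 feet", "~25 Mbps"),
    ("White space PtMP",      "6 miles",    "~45 Mbps total (a 40/5 product)"),
]

for tech, distance, speed in reach:
    print(f"{tech:<22} {distance:<12} {speed}")
```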

It seems like this is a spectrum with a lot of potential, especially in rural places where there are no existing uses of the spectrum. This could be used for new deployments or for supplementing existing WiFi deployments by WISPs. There is equipment that works on the spectrum today, and I guess we are now waiting for the FCC here and regulatory bodies around the world to open this up to more use. The US isn't the only place that used this spectrum for TV, and much of the rest of the world shares the same interference concerns. But if this is ever released from its regulatory holds I think we would quickly hear a lot more about it.

A Great Time to Love Technology

I love science and technology and I read dozens of different on-line publications to see what is going on in the telecom and related science worlds. I find something interesting almost every day in my reading because there is so much R&D happening around the world. But last Sunday, in a very short period of reading, I was struck by the sheer volume of new stuff that scientists and engineers are working on. I want to share some of what I found in one short hour of reading as a way to highlight how amazing the world is right now. I read recently that all of human knowledge is now doubling every two years and I can believe it.

This article talks about how slime molds can be used as a more efficient way of designing complex circuits. I've done a lot of hiking in my life and slime molds are those veiny orange-yellow molds that you find growing in dark damp places in old woods. But it turns out that slime molds 'move' by following nutrition, and they can be tricked into tracing the most efficient path to take in a circuit. Now that we are trying to make smaller, faster and more energy efficient chips and boards, every breakthrough like this helps.

There is also now a technology for building a circuit that can be applied to your skin like a temporary tattoo. This means that almost any technology that can be put onto a small circuit could be cheaply adhered to your skin for personal use. The initial application is probably going to be medical, like the sensors people wear for sleep tests. But soon you might be able to apply a cell phone or a host of other electronics to your skin to wear while you go hiking or running.

How about a biodegradable battery that can melt inside your body? Currently, a medical device that needs power means an operation to put it in and another to take it out. But this concept means that a wider array of devices can be implanted for things like treating cancer and will simply be absorbed by the body when they are done.

In Australia they have come up with a robotic kangaroo that can recapture much of the energy used to propel it. And that recapture of energy is the breakthrough because it means that we can build robots and other machines that can be made to need far less external power to operate. Obviously we can’t build a perpetual motion machine, but this is getting closer.

There is a new 3D printer that only costs $200 and that can print a coffee mug in half an hour. That is getting to the point where people can actually 3D print things they need like replacement parts for an appliance. You don’t hear a lot about this, but 3D printing frees the world from reliance on factories and might be the most transformational technology out there soon.

And finally, there is now a working model of a battery charger that uses biological semiconductors that can charge a smartphone in 30 seconds. This kind of technology could finally free us from worrying about keeping our portable devices charged since things could be recharged in the time it takes to go get a cup of coffee.

These are just the wow sort of technology things that I found in a short stretch of reading. In addition to these items I ran across a long list of more telecom-specific things that happened last week, including Sprint offering $650 to move large family plans. Xbox is going to be putting out interactive programming where you can see a show from the perspective of different characters. Qualcomm is coming out with really fast new chips that are going to make for blazingly faster smartphones next year (I made a note to hold off on buying a new phone this year). Leaked documents show that Google has plans for a major Android TV. It is a great time to be a techie, for sure.

The Skinny on U.S. 4G Data Speeds

I am a statistics freak and I read any and all statistics I can find about the telecom industry. A lot of statistics are interesting but require a lot of heavy lifting to see what is going on beneath the numbers. But I ran across one set of statistics that sums up the problems of wireless 4G data in this country in a few simple numbers.

A company called OpenSignal has an app that people can use to measure the actual download speeds they see on LTE 4G networks. This app is used worldwide and so we can also compare the US to other parts of the world. In 2014 the comparisons were made from readings from 6 million users of the app.

The first interesting statistic is that the US came in 15th in the world in LTE speeds. In 2014 the US average download speed was a paltry 6.5 Mbps across all US downloads using 4G. At the top of the chart was Australia at 24.5 Mbps, Hong Kong at 21 Mbps, Denmark at 20.1 Mbps, Canada at 19.3 Mbps, Sweden at 19.2 Mbps and South Korea at 18.6 Mbps. Speeds drop pretty significantly after that, and for example Japan was at 11.8 Mbps. So beyond all of the hype from AT&T and Verizon touting their network speeds, they have not done a very good job in the US.

But the second statistic is even more telling. The speeds in the US dropped from 9.6 Mbps in 2013 to 6.5 Mbps in 2014. The US was the only country on the list of the top fifteen countries that saw a significant percentage drop from one year to the next. Sweden did have a drop, but it went from 22.1 Mbps to 19.2 Mbps.
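
To put those two drops on the same footing, here is a quick calculation of the year-over-year percentage change using the OpenSignal figures quoted above.

```python
# Year-over-year change in average LTE download speed, from the figures above.
speeds = {
    "US":     (9.6, 6.5),     # 2013 Mbps, 2014 Mbps
    "Sweden": (22.1, 19.2),
}

for country, (y2013, y2014) in speeds.items():
    change = (y2014 - y2013) / y2013 * 100
    print(f"{country}: {y2013} -> {y2014} Mbps ({change:+.0f}%)")
# The US drops roughly a third; Sweden drops about 13%.
```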

So what does this all mean? First, the drop in speed can probably best be explained by the fact that so many people in this country are using wireless data. Large numbers of users are obviously overwhelming the networks, and as more people use the wireless data networks the speeds drop. Our wireless networks are all based upon the total bandwidth capacity at a given cell site, so when more people want data than a cell site is designed for, speeds drop as the site tries to accommodate everybody.

But for the average 4G speed for the whole year to be only 6.5 Mbps, there has to be a whole lot more to the story. One might expect Canada to be faster than the US simply because we have a lot more large cities that can put strains on wireless networks. But you wouldn't expect that to make the Canadian 4G experience three times faster than the US experience. And there are very few places on earth as densely populated as Hong Kong, yet it has the second fastest 4G networks in the world.

It’s obvious from these numbers that the US wireless carriers are not making the same kinds of investments per customer as other countries are doing. It’s one thing to beef up urban cell sites to 4G, but if those cell sites are too far apart then too many people are trying to use the same site. I would have to guess that our main problem is the number and spacing of cell sites.

But we also have a technology issue, and regardless of what the carriers say, there are a lot of places that don't even have 4G yet. I don't have to drive more than 2 miles outside my own town to drop to 3G coverage, and then only a few more miles past that to be down to 2G. A few weeks ago I was in Carlsbad, California, a nice town halfway between LA and San Diego and right on I-5. I couldn't even find a 2G network there at 5:00 in the evening, probably due to all of the traffic on the interstate.

I hope the FCC looks at these kinds of statistics because they debunk all of the oligopoly hype we get from the wireless carriers. I laugh when people tell me they are getting blazing fast speeds on 4G, because it’s something I look at all of the time when I travel and I have never seen it. When I hear of somebody who claims that they are getting 30 Mbps speeds I know that they must be standing directly under a cell tower at 3:00 in the morning. I like speed, but not quite that much.

Keep People in the Equation

As I keep reading about the coming Internet of Things I keep running into ideas that make me a bit uneasy. And since I am a tech head, I imagine that things that make me a little uneasy might make many people a whole lot uneasy.

For instance, I read about the impending introduction of driverless cars. I have to admit that when I am making a long drive on the Interstate that having the ability to just hand the driving off to a computer sounds very appealing. I would think that the challenge of driving on wide-open highways at a consistent speed is something that is quite achievable.

But it makes me uneasy to think about all cars everywhere becoming driverless. I sit here wondering if I really want to trust my personal safety to traveling in a car in which software is making all of the decisions. I know how easily software systems crash, get into loops and otherwise stutter, and I can't help picturing being in a vehicle when a software glitch rears its ugly head.

I know that a road accident can happen to anybody, but when I drive myself I have a sense of control, however misplaced. I feel like I have the ability to avoid problems a lot better than software might when it comes down to a bad situation.

I am probably wrong, but it makes me uneasy to think about climbing into a cab in a crowded city and trusting my life to an automated vehicle. And I really get nervous thinking about sharing the road with robot tractor-trailers. The human-driven ones are scary enough.

I am probably somewhat irrational in this fear because I would guess that if all vehicles were computer-controlled there would be a lot fewer accidents, and we certainly would be protected from drunk drivers. Yet a nagging part of my brain still resists the idea.

I also worry about hacking. Perhaps one of the easiest ways to bump somebody off would be to hack their car and make it have an accident at a fast speed. You know it’s going to happen and that will make people not trust the automated systems. Hacking can break our faith in a whole lot of the IoT since there will be ample opportunities to hurt people by interfering with their car or their medicine or other technology that can harm as easily as it can help.

I don't think I am untypical in this kind of fear. And I think that as we make these big changes, people somehow have to be part of the equation. I don't have an answer to this and frankly this blog just voices the concern. But it's something we need to consider and talk about as a society.

The people issue is going to spring up around a lot of the aspects of IoT. It has already surfaced with Google Glass and many people have made it clear that they don’t want to be recorded by somebody else surreptitiously. As the IoT grows past its current infancy there are bound to be numerous clashes coming where tech confronts human fears, feelings and emotions.

There are certainly many aspects of the IoT that excite me, but as I think about them I would bet these same changes will frighten others. For instance, I love the idea of nanobots in my bloodstream that will tell me days early if I am getting sick or that will be able to kill pre-cancerous cells before they get a foothold in my body. But I am sure the idea of having technology in our blood scares the living hell out of other people.

I don’t know how it’s going to happen, but the human equation must become part of the IoT. It has to. If nothing else, people will boycott the technology if it doesn’t make us feel safe.

5G Already?

We knew it was coming and the wireless industry is already bandying about the term 5G. Nobody knows exactly what it is going to be, but the consensus is that it's going to be fast. The South Koreans are devoting $1.5 billion in research to developing the next generation of wireless. And there are vendors like Samsung who are already starting to claim that the upgrades in their labs today are 5G.

And of course, all of this is hype. There is still not any broadband anywhere that complies with the original 4G specifications. This all got out of hand when the marketing groups started to tout 3G performance for networks that were not yet at the 3G specs. And then came 3.5G and 4G, and now I guess 5G.

But let’s look at the one claim that it seems 5G is going to have, which is blistering fast speeds, perhaps up to 1 gigabit per second. What would it really take to provide a 1 gigabit cell phone data link? The answers can all be derived by looking at the basic physics of the spectrum.

Probably the first characteristic is going to be proximity to the transmitter. When you look at spectrum between 3 GHz and 6 GHz, the likely candidates for US deployment, the math tells you that it's going to be hard to send a 1 gigabit signal very far, maybe 150 feet from the transmitter. After that the signal is still fast, but the speeds quickly drop with distance. Unless we are going to place a mini cell site in every home and on every floor of a business, it is not very likely that we are going to get people close enough to transmitters to achieve gigabit speeds.
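
For readers who want to see why proximity matters so much, here is an illustrative sketch based on the Shannon-Hartley capacity formula. The channel widths are hypothetical examples, and real radios fall well short of the Shannon limit, so treat this as a lower bound on how hard the problem is.

```python
import math

# Shannon-Hartley: capacity = bandwidth * log2(1 + SNR).
# Question: what signal-to-noise ratio would a single radio channel need
# to carry 1 Gbps?  (Channel widths below are illustrative choices.)
TARGET_BPS = 1e9

for bandwidth_mhz in (40, 80, 160):
    bandwidth_hz = bandwidth_mhz * 1e6
    required_snr = 2 ** (TARGET_BPS / bandwidth_hz) - 1
    snr_db = 10 * math.log10(required_snr)
    print(f"{bandwidth_mhz:>4} MHz channel needs an SNR of about {snr_db:.0f} dB")
# A 40 MHz channel would need an absurd ~75 dB; even 160 MHz needs ~19 dB,
# which you only see close to the transmitter.
```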

It certainly is possible to generate speeds that fast at the transmitter. But such a network would need fiber everywhere to feed cell phones. A network with fiber that dense probably wouldn’t even need to be cellular and could handle nearby phones using WiFi.

We certainly need new antenna technologies and those things are being worked on in labs. I've written previous blog posts about the various breakthroughs in antenna technology, such as arrays using very large numbers of MIMO antennas. I think we can believe that antennas will get better with more research.

We need better processors and chips. A chip capable of receiving and processing a gigabit of data is going to be an energy hog in terms of the power available in a cell phone. Such chips are already here, but they are deployed in bigger devices with enough power to run them. So we are going to need next-generation chipsets that require less energy and generate less heat before any cell phone can actually use a gigabit of data.

We need carriers willing to supply that much data. Let's face it, cellular networks are designed to provide something okay to many rather than something great to a few. Perhaps making cell sites smaller would help alleviate this issue, but it is a real one. If somebody is really dragging a gigabit out of a cell site there is not a whole lot left for anybody else. And this would require increasing the backhaul to cell sites to 100 Gbps or even terabit speeds if gigabit phones became the norm.

Finally, we need a new FCC, because the way that spectrum is divvied up in the US makes these kinds of speeds nearly impossible. Gigabit speeds would be easily achievable today if there were some giant swaths of bandwidth available. But our bandwidth is split into little discrete pieces and most of those pieces are further divided into channels. This makes it really hard to cobble together a big consistent bandwidth delivery system. We tend to think of wireless as a big pipe in the same way that a fiber is. But it's really a whole lot of discrete little signals that somebody has to join together to get a huge throughput.

Beyond Cookies

This is not a blog entry about cakes and pies, but rather more discussion about how companies are tracking people on the web. A few weeks back I wrote a primer on cookies, which are the small files left on your machine to store facts about you. Cookies can store almost anything, from something as simple as remembering your login and password to something as complex as all sorts of other information about what you are doing on the web.

But many people have become very conscious of cookies and routinely delete them from their computers. Further, our web habits have changed and we access the web from multiple platforms. Cookies are only good for the device they are stored on and are not particularly effective in today’s multi-device environment. So there are new techniques being used to track what you do on the web including authenticated tracking, browser footprinting and cross-device tracking.

We make it easy for big companies to track us without cookies because we basically tell them who we are when we log onto our devices. You routinely authenticate who you are when you use sites like Facebook, iTunes, Gmail and others. An example of how you do this is your Android phone. The first thing you are asked to do when you buy an Android phone is to log on with a Gmail account as part of the activation process. It never asks you for this again, but every time you turn on your phone it automatically logs you in to that Gmail account, and so Google always knows who you are. Apple phones and tablets have something similar in that each device is given a unique identifier code known as a UDID.

So Google is tracking Android phones, Apple is tracking iPhones and I have to guess that Microsoft is tracking its phones. Since you 'authenticate' yourself by logging onto a cell phone, you have basically given permission for somebody to learn a lot about you without the need for cookies – where you are and what you are doing on your cell phone.

The next tool that can be used to identify you is browser footprinting. This is interesting because each one of us basically creates our own digital fingerprint telling the world who we are through our browser footprint. The browser footprint is the sum total of all of the things that are stored in your browser. Some of this is pretty basic data like your screen size, the fonts you prefer, your time zone and your screen settings. But there are other identifying features like plugins or any other program that creates a place on one of your toolbars.

As it turns out, almost everybody has a unique browser footprint. You can test this yourself by going to the website Panopticlick, which will tell you if your browser footprint is unique. It will show the kind of information that others can see online about you and your machine. One would think that most people have the same sort of stuff on their computers, but it only takes one thing different to give you a unique browser footprint, and almost everybody is unique. And the people who are not unique still share a browser footprint with only a small number of other people.
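
As a toy illustration of how footprinting works (and not how any real tracker is implemented), the sketch below combines a handful of made-up browser attributes and hashes them into a single identifier.

```python
import hashlib

# Toy illustration of browser footprinting: combine ordinary browser
# attributes and hash them into one identifier.  The attribute values
# below are invented; real trackers collect far more signals than this.
attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1; rv:28.0) Gecko/20100101 Firefox/28.0",
    "screen": "1920x1080x24",
    "timezone_offset": "-300",
    "fonts": "Arial, Calibri, Comic Sans MS, Georgia, Tahoma",
    "plugins": "Flash 13.0, QuickTime 7.7, Silverlight 5.1",
}

summary = "|".join(f"{key}={value}" for key, value in sorted(attributes.items()))
footprint = hashlib.sha256(summary.encode()).hexdigest()

print(footprint[:16])  # change any single attribute and the footprint changes completely
```

One different font or plugin is enough to make the hash unique, which is why so few people share a footprint.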

Finally there is cross-device tracking, and Google is at the forefront of this effort. Over time, as you log onto Google from different devices, or as you authenticate who you are on multiple devices, Google and others can note that the information coming from these various devices is all from you. And so when you browse from home and are looking at new cars, it will become possible for them to tell an auto dealer what research you have already done once Google notices from your cellphone GPS that you are at a car dealer. They aren't doing this quite yet, and for now they are just linking and tracking you across your multiple devices. But this tracking effort gives them a more complete picture of who you are, which is what big data is all about.

The New Satellite Internet

A new satellite Internet service launched earlier this year. I’ve been meaning to write about this and was prompted by seeing them in a booth at a rodeo I went to on Saturday. The service is provided by ViaSat under the brand name of Exede. They launched a new satellite, the ViaSat-1, last October for the sole purpose of selling rural broadband.

The broadband they are selling is a big step up from previous satellite broadband, including earlier products offered by ViaSat. The basic broadband product offers up to 12 Mbps download and 3 Mbps upload. I went to the web and read reviews, and people are saying that they are actually getting those speeds and in some cases even a little more. I would caution that, like any broadband system, as they get more customers the satellite will get contention and the speeds will slow down.

The base product is priced at $50 per month and is a huge improvement over other satellite products. Exede's older base product was also $50 but offered 512 kbps download and 128 kbps upload. For $79 you could get 1.5 Mbps download and 256 kbps upload.

But like everything there is a catch, and that catch is data caps. The speeds are a great improvement, because even web browsing at 512 kbps is nearly impossible. But the caps are a killer. For the $50 product the cap is 7 gigabytes of total download for the month. To put that into perspective, that is around 4 HD movies per month.

You can buy larger caps. For $80 per month you can get a 15 GB cap and for $130 per month you can get a 25 GB cap. If you hit the cap Exede doesn’t shut you down, but instead sets you to a very slow crawl for the rest of the month.
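
To see how far those caps stretch, here is a rough sketch; the figure of roughly 1.75 GB per hour of HD video is my own assumption, and actual streaming rates vary widely.

```python
# How far the Exede caps stretch for HD video, assuming roughly 1.75 GB
# per hour of HD streaming (an assumption; actual rates vary widely).
GB_PER_HD_HOUR = 1.75

plans = {"$50 / 7 GB": 7, "$80 / 15 GB": 15, "$130 / 25 GB": 25}

for plan, cap_gb in plans.items():
    hours = cap_gb / GB_PER_HD_HOUR
    print(f"{plan}: about {hours:.0f} hours of HD video per month")
# Even the top plan buys roughly 14 hours a month -- under half an hour a day.
```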

So obviously the satellite program is not going to be useful for anybody who wants to use the Internet for watching video or doing the kinds of things that most families use the Internet for. There can be no real gaming over a satellite connection, due both to the cap and to the latency, since the signal bounces high above the earth and back. The latency also plays hell with voice over the Internet. You can do a mountain of emails and web surfing within that cap, but you have to always be cautious about downloading too much. Imagine if you worked from home and one of your kids watched too many videos, and for the rest of the month you just crawled along at dial-up speeds.
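
The latency problem is pure physics. A geostationary satellite sits roughly 35,786 km above the equator, and a request and its reply each have to travel up to the satellite and back down, so the speed of light alone imposes nearly half a second of delay before any network processing is counted:

```python
# Minimum physics-imposed latency for geostationary satellite Internet.
GEO_ALTITUDE_KM = 35_786        # altitude of a geostationary orbit
SPEED_OF_LIGHT_KM_S = 299_792

# A request travels user -> satellite -> ground station, and the reply
# comes back the same way: four trips through that altitude at minimum.
one_way_s = GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
round_trip_ms = 4 * one_way_s * 1000
print(f"Best-case round-trip latency: {round_trip_ms:.0f} ms")  # roughly 480 ms
```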

For now this is only available on the east and west coasts and won’t be available in the middle of the country until they launch another satellite. Exede has a product in the Midwest that is $50 for up to 5 Mbps download and 1 Mbps upload, but reports are that most people there are not getting those speeds.

I am the first to say that this is a big step up in the rural areas. If I was on dial-up this would feel wonderful. But any home that gets this is not getting the same Internet that the rest of us get. One of my employees has four kids and they watch 4 – 5 hours per day of Internet video. We estimated that some months he is probably using a terabyte of total download. His speeds are only half of this satellite service, but the unlimited download makes a huge difference in the way his family can use the Internet.

The scariest thing about this product is that I know that one of these days some policy-head at the FCC is going to announce that the whole country has broadband, and then they can wash their hands of the rural broadband gap. These are the fastest download speeds that anybody has brought to much of rural America. But anybody on this service is going to be so throttled by the data caps that they are not going to be able to use the Internet like the rest of us. So this is a good service, but it's not broadband – it's something else.

Where Will We Draw the Privacy Line?

The efficiencies, convenience and societal cost savings that will be realized from the IoT are so enormous that it is inevitable that the future will eventually become just what the IoT developers imagine – a seamlessly networked world that brings a lot of Star Trek into our lives. But we are not just going to magically pop into that great future, and my gut tells me that there are going to be some gigantic growing pains for the technology and some major setbacks on the way to that inevitable future.

One only has to peek behind the curtain at some of the early attempts at developing IoT devices to understand where some of the snafus and problems are going to come from. One area where I foresee the possibility for a lot of backlash is privacy. In order for the IoT to work people are going to have to sacrifice some privacy. The question that I don’t see being asked is how much privacy the average person is going to be willing to give up to gain the convenience of using numerous IoT devices.

Already today we can see a little of how social sharing interfaces with privacy. For example, when running monitors first hit the market my Facebook feed got filled with maps showing how far and how fast my various runner friends had run each day. But over a few months these all disappeared and I haven't seen one in a while. This is not because they have ditched the monitors, but rather that after the novelty wore off people realized they didn't want to share. They didn't want their friends to notice that they took a day off from running or that they ran slowly or only did a short route on a given day. It turns out that people don't want to automatically share things that might reflect negatively on them.

And if people quickly edited their sharing over something like a jogging monitor I can’t help but wonder how people are going to react when they realize that one of the biggest aspects of the IoT is that we will be constantly watched and monitored.

I heard this concern when it was announced that Google was buying Nest, the maker of smart thermostats, smoke detectors and other home devices. The promise is being made that IoT devices are going to be smart (or at least that the network that controls them will be smart). And this means that our every movement will be tracked. It doesn't sound particularly threatening if Google finds out what time of day we turn various lights on and off or when we enter certain rooms. But the technology is at the bare beginning, and the fact is that eventually our devices will let companies like Google know more about us than we often know about ourselves.

The whole point of big data analytics is to look for patterns. Knowing how and when a certain person moves around the house is data that can be used to see a pattern. Google can compare the way you move to the way other people move and can see that there are 10,000 other people just like you in the US and that you also have a lot of other traits in common.

I know this sounds simplistic, and it would be a big stretch to understand you just from monitoring a few devices in your home. But eventually it's not going to be just a few devices. It's likely that there will be enough monitors in the average home that an outside company like Google could understand your sleep patterns, your eating habits, what you watch and read, who you talk to, how you exercise – basically everything about you.

And I just wonder if at some point there will be a big rebellion against that kind of invasion of privacy. I foresee a huge pushback coming against the IoT until companies can solve the privacy issue and give control to each person over how their own data is shared with the world. This is contrary to the goals of Google and others, and it will be very interesting to see where society draws the line.

Software Defined Networks

AT&T announced last week that they are going to implement software defined networking (SDN) in their network and that over a few years they will replace other kinds of telecom gear. They say that over time this is going to save them billions on hardware costs. This announcement probably is a watershed moment for the telecom industry and is going to have huge implications for the way we build our networks and the vendors we use for routers and switches.

For those who are not familiar with the term, SDN is an idea that got started at UC Berkeley in 2008 and is now starting to hit the market. Its core concept is to use generic low cost routers, switches and other network hardware and to control them with specialized and centralized software. Today the routers that operate our networks come as packages of combined hardware and software, of which software is the more expensive component. Each vendor has their own way of doing things and you will find networks that are Cisco centric or Juniper centric, and network technicians become proficient with a specific brand of equipment.

But SDN is going to change all of that. With SDN a company like AT&T will be able to buy one set of centralized software and control their devices all over the network. The equipment becomes secondary in this configuration, and AT&T could mix and match different brands of equipment. The most obvious savings come from not having to buy the software again each time they buy a router.

But there are even bigger savings promised with SDN over time. The promise of the technology is that companies can tailor their networks on the fly by making a software change rather than swapping or upgrading hardware systems. For a company that is as decentralized and huge as AT&T this could be transformational. I am sure many of you have waited before for AT&T to make facilities available because they were in the middle of a network upgrade. AT&T says that it is not unusual today for complex network changes to take 18 months. With SDN, even after allowing time for testing and double checks, they will be able to effectuate major changes in weeks instead of many months. And if circumstances dictate, such as in an emergency, they could make changes on the fly.

SDN will give a whole new set of tools to network engineers. Today traffic is forwarded using industry standards such as MPLS, BGP or OSPF. With SDN a network engineer will be able to get extremely granular with traffic. For example, they might shuttle all traffic that is experiencing jitter to a specific place in the network. Since an SDN network is programmable, it is going to give engineers flexibility they have never had.
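
As a hypothetical illustration of that kind of granular control, the sketch below shows a controller-style rule that steers jitter-sensitive traffic onto an alternate path. The rule format, field names and path name are invented for illustration and do not reflect any particular SDN controller's API.

```python
from dataclasses import dataclass

# Hypothetical illustration of SDN-style granular traffic steering: a central
# controller pushes match/action rules to generic switches.  The rule format
# and path names are invented, not any real controller's API.
@dataclass
class FlowRule:
    match: dict        # which traffic the rule applies to
    action: str        # what the switch should do with matching packets
    priority: int = 100

def steer_jittery_traffic(measured_jitter_ms: float, threshold_ms: float = 30.0) -> list:
    """Return rules that shunt jitter-sensitive traffic onto a cleaner path."""
    if measured_jitter_ms <= threshold_ms:
        return []                                   # network is healthy, no change
    return [FlowRule(match={"dscp": "EF"},          # voice/video traffic class
                     action="forward:backup_path",  # hypothetical alternate path
                     priority=500)]

for rule in steer_jittery_traffic(measured_jitter_ms=42.0):
    print(rule)
```

The point is that the steering decision lives in centrally written software rather than in per-box vendor configuration, which is where the flexibility comes from.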

This announcement has to be putting fear into the large telecom vendors like Cisco, Juniper and Alcatel. These companies supply the majority of the gear to the large network providers, and the companies who are pioneering SDN are much smaller start-ups. Cisco and others are already climbing onto the SDN bandwagon and developing products, but there is no doubt that SDN will hurt these vendors. The billions of dollars of savings envisioned by AT&T have to come from somewhere. Carriers will buy cheap generic switches and routers, will be able to keep them longer and are not likely to be as loyal to specific vendors as they were in the past.

This announcement should not send you out quite yet to change your own network to SDN. The industry is still in its infancy and the cost of the master SDN software is really steep today. But like every change of this magnitude the product will eventually get cheaper and work its way down into the rest of the industry. Let’s let AT&T figure out the bugs and at some point this will become the industry norm.
