Those Damned Statistics

One of my biggest pet peeves in life is the misuse of statistics. I am a math guy and I sometimes tackle math problems just for the fun of it. I understand statistics pretty well, and my firm performs surveys. I think I disappoint a lot of my clients when I try to stop them from interpreting the results in a survey to prove something that the responses really don't prove. Surveys are a really useful tool, but too often I see survey results used to support conclusions the data doesn't back up.

A week ago the NTIA (National Telecommunications and Information Administration) released their latest poll looking at broadband usage in the US. The survey asked a lot of good questions and some of the results are very useful. For example, they show that overall broadband penetration in the US is up to 72% of households. But even that statistic is suspect, as I will discuss below.

The problem with this survey is that they didn't ask the right questions, and this largely invalidates the results. The emphasis of this particular survey was on how people use cellphones for data access. And so they asked about the various activities people now use their phones for, such as browsing the web or reading email. As one would expect, more people are using their cellphones for data, largely due to the widespread introduction of smartphones over the last few years.

There is nothing specifically wrong with any of the individual results. For example, the report notes that 42% of phone users browse the web on their phone compared to 33% in 2011. I have no doubt that this is true. It's not the individual statistics that are a problem, but rather the way the statistics were used to reach conclusions. In reading this report one gets the impression that cellphone data usage is just another form of broadband and that using your cellphone to browse the web is more or less the same as browsing off a wired broadband connection.

The worst example of this is in the main summary where the NTIA concluded that “broadband, whether fixed or mobile, is now available to almost 99% of the U.S. population”. This implies that broadband is everywhere and with that statement the NTIA is basically patting themselves on the back for a job well done. But it’s a load of bosh and I expect better from government reports.

As I said, the main problem with this report is that they didn’t ask the right questions, and so the responses can’t be trusted. Consider data usage on cellphones. In the first paragraph of the report they conclude that the data usage on cellphones has increased exponentially and is now deeply ingrained in the American way of life. The problem I have with this conclusion is that they are implying that cellphone data usage is the same as the use of landline data – and it is not. The vast majority of cell phone data is consumed on WiFi networks at work, home or at public hot spots. And yes, people are using their cellphones to browse the web and read email, but most of this usage is carried on a landline connection and the smartphone is just the screen of choice.

Cellular data usage is not growing exponentially, or maybe just barely so. Sandvine measures data usage at all of the major Internet POPs, and they show that cellular data is growing at about 20% per year, or doubling roughly every five years, while landline data usage is doubling every three years. I trust the Sandvine data because they look at all of the usage that comes through the Internet and not just at a small sample. The cell carriers have trained us well to go find WiFi. Sandvine shows that on average a landline connection today uses almost 100 times more data than a cellphone connection. This alone proves that cellphones are no substitute for a landline.
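For readers who want to check the compound-growth arithmetic, the doubling times follow directly from the annual growth rates. This is a quick sketch; note that round numbers like "20% per year" and "doubling every five years" are approximations that don't line up exactly:

```python
import math

def doubling_time(annual_growth):
    """Years needed for usage to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

# 20% per year works out to a doubling roughly every 3.8 years
print(round(doubling_time(0.20), 1))

# Conversely, a doubling every 3 years (the landline figure)
# implies roughly 26% annual growth
print(round((2 ** (1 / 3) - 1) * 100))
```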

I have the same problems with the report when it quantifies the percentage of households on landline broadband. The report assumes that if somebody has a cable modem or DSL then they have broadband, and we know for large parts of the country that having a connection is not the same thing as having broadband. The report counts somebody on dial-up as not having broadband, but when it says that 72% of households have landline broadband, what it really means is that 72% of homes have a connection faster than dial-up.

I just got a call yesterday from a man on the eastern shore of Maryland. He lives a few miles outside of a town and has a 1 Mbps DSL connection. The people a little farther out than him have even slower DSL or can only get dial-up or satellite. I get these kinds of calls all of the time from people wanting to know what they can do to get better broadband in their community.

I would challenge the NTIA to go to rural America and talk to people rather than stretching the results of a survey to mean more than they do. I would like them to tell the farmer who is trying to run a large business with only cellphone data that he has broadband. I would like them to tell the man on the eastern shore of Maryland that he and his neighbors have broadband. And I would like them to tell all of the people who are about to lose their copper lines that cellular data is the same as broadband. Because in this report that is what they have told all of us.

What Makes Cellphone Coverage Vary?

It seems I have been writing about cellphones for a few days, so I thought I would cover a question that I have been asked many times. I travel a lot and it's not unusual to sit next to somebody and note that the two of you are having a very different cellular experience. One of you may be getting one bar for data and voice while the other might be getting four, sitting only a few feet apart. What explains this difference in cellular performance? I will start with the obvious explanations, but sometimes the differences are due to more subtle issues.

Who is your carrier? Both people might have an iPhone, but if one has Verizon and the other has AT&T the experience is different because both are connected to completely different technologies and totally separate networks. AT&T and T-Mobile use GSM (Global System for Mobile) technology, the technology that is used in most of the rest of the world. But Verizon and Sprint use CDMA (Code Division Multiple Access) technology. These technologies are so different that a handset that is made only for one technology won’t work on the other. This is why you can’t take your Verizon handset to most of the rest of the world when you travel.

Who's on the nearest tower? I've often been driving with somebody and heard them say they're glad to see an upcoming cell tower because they assume it means they'll get better coverage. But you can't assume this, because not every carrier is on every cell tower. There are a large number of cell towers in the country. Some of these are owned by the wireless carriers, but many are leased. The cellular companies look at available towers and then cobble together the combination of towers that makes the most effective and cost-efficient network for them.

This task has gotten harder for the carriers because cellphones now carry data. The original cell tower network, with all of the giant towers, was created back when cellphones only carried voice. But now that the networks are deploying data and using higher frequencies, it turns out that a more ideal network would place the towers closer together than the traditional locations. This is causing massive reconfigurations of the networks as the carriers try to improve data reception.

Cell sites get busy. Or said another way, any one carrier on a tower might get busy while another carrier might not be busy. As cell sites get busy they do a couple of things to handle the excess traffic. Most carriers still give preference to voice over data, so as more voice calls are added to a network the amount of bandwidth allocated to data is often choked down (but not always). And eventually the tower site refuses to add new customers. But when sites get busy the performance normally degrades.

You might be roaming. Roaming is when a customer is riding a different network than the one to which they subscribe. If you are an AT&T customer and are roaming on a T-Mobile site, you will not get the same priority as a T-Mobile customer. This might mean getting slower data speeds if the site becomes busy, and it could also mean being booted from the site as it becomes full.

Spectrum is not created equal. There is not just one spectrum being used for cellular data. There are already nearly a dozen different slices of spectrum being used and the FCC is going to be auctioning more over the next two years. Every time you move to a different cell site you might be changing the frequency you are using. Carriers not only cobble together a network of the ideal cell sites, but they also play a chess game of deciding which spectrum to deploy at each tower. None of the carriers owns all of the different spectrum available, and the spectrums they own in different cities can be drastically different. This means getting four bars at your home might not give you the same experience as getting four bars when you are traveling.

What your phone allows. Perhaps one of the biggest differences in reception is that each cellphone maker decides what spectrum a given handset is going to receive. It costs a lot of energy, meaning battery time, for a phone to always search on all of the different frequencies, so different handsets allow different frequency bands. This is why LTE coverage differs so widely: many handsets don't even see some of the LTE frequencies. All handsets look for the basic cellular bands, but only the most expensive ones are designed to look for almost everything out there. And as more cellular bands are allowed into the mix this will get more pronounced. Of course, you have to read very deep into the specifications of your phone to understand what it does and does not receive. Good luck asking that question at the cellphone store.

Plain old interference. Every cellular frequency has a different propagation characteristic. If you and the guy next to you are talking on different frequencies then you each will be dealing with a different set of interference. This is one of the reasons that cellular coverage is so wonky in complicated buildings like airports and hospitals. Each cellular frequency is likely to find a different set of problems in a complex environment and one frequency might get killed in a given part of the airport while another is fine. This is why you might find yourself walking around trying to find a signal while people around you are still talking.

The Upcoming AWS Spectrum Auction

The FCC's auction for new cellular data spectrum will begin on November 13. This is the first big spectrum auction in six years, so it's worth watching. The spectrum being auctioned is being referred to as AWS, or Advanced Wireless Spectrum. There are three separate bands being auctioned: 1,695 MHz to 1,710 MHz, 1,755 MHz to 1,780 MHz, and 2,155 MHz to 2,180 MHz.
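A quick bit of arithmetic on the band edges listed above shows how much new spectrum is actually on the table:

```python
# The three AWS bands being auctioned (MHz), from the figures above
bands_mhz = [(1695, 1710), (1755, 1780), (2155, 2180)]

# 15 + 25 + 25 MHz
total = sum(hi - lo for lo, hi in bands_mhz)
print(f"{total} MHz of new spectrum in total")
```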

The FCC has set a reserve price for the auction of $10.5 billion. That means that if they don't receive bids totaling at least that much in the first round, the FCC has the right to cancel the auction. Assuming that price is met, the normal FCC bidding process will take place and one would expect the auction to go for a few more rounds.

The AWS spectrum is expected to be used almost entirely for data, and both Verizon and AT&T already own some spectrum that sits next to these new blocks. That is going to make it fairly easy for carriers to incorporate the spectrum into handsets. Further, this same spectrum is used in Europe for wireless data, meaning that there are already a wide array of handsets capable of using the spectrum.

Because it’s high frequency, this spectrum is capable of handling a lot of data. However, like other high frequencies it’s not great at penetrating building walls and other obstacles. Contrast this to the next auction that’s on the horizon. In two years the FCC will be auctioning chunks of the 600 MHz spectrum that is being vacated by television stations. This frequency can penetrate into elevators but doesn’t carry as much data per channel as the higher frequencies.

As you would expect, the bulk of the spectrum is going to be auctioned to the largest carriers. It is expected that T-Mobile is going to be aggressive in the auction, with AT&T and Verizon also buying a lot of spectrum. Sprint is expected to sit out the auction since they already own a lot of high-frequency bandwidth. The wildcard is going to be Dish Network, which may go after a lot of this spectrum. Dish has announced plans to offer a fixed data product using wireless spectrum that will also be used to deliver a cable TV line-up. This spectrum would give them more bandwidth for that offering.

The AWS spectrum is not immediately available since the Department of Defense and a few other government agencies still occupy some of the spectrum. It is expected that the bulk of the government usage will be gone in about two years, but these kinds of transitions almost invariably take longer than expected. This means that it’s unlikely that the bandwidth will have much of an impact on wireless data speeds until the two to three year time frame.

The spectrum is being auctioned off by market, and as you would expect this means a wide variance in carrier interest in any given market. In similar auctions in the past some markets went unclaimed, meaning that nobody was willing to pay the FCC's minimum bid for the market, and if that happens again you can expect a second auction of the leftover markets, certainly including the rural ones. This auction does have some incentives for small bidders, and while the big carriers will grab the vast majority of the spectrum, you can expect to see smaller companies going after secondary and rural markets.

The auction is expected to be tactical in that each carrier has holes it is trying to fill in certain markets. And the big carriers are keeping the upcoming 600 MHz auction in mind and may hold off on bidding now in markets where they would rather have that spectrum. This makes the auction a big chess game by market. The funny thing is that the carriers know exactly what spectrum each of the others already owns, so they know basically what each is most interested in. But because there are two auctions close together for very different spectrum, nobody is going to know anyone else's strategy until the first-round bidding is done. The auction is often finished after the first round for a lot of markets, and the following rounds are usually only for the prime markets.

Late last week I looked at the amount of data that cellphone users consume. The current statistics show that the average landline connection uses almost 100 times more aggregate data in a month (download and upload combined) than the average cell phone. With that said, Cisco has predicted that the amount of wireless data usage will triple over the next five years, and many analysts think this is conservative.

It's obvious that cellphone data usage is never going to rival landline data usage or even come close. I chuckle whenever I see somebody say that wireless data will win the bandwidth battle. There just is not enough wireless spectrum for that to ever happen. While cellular data usage is now doubling every five years, landline data is doubling every three years, and one only has to carry that trend out twenty years to see that the average home landline connection might be using a terabyte or more of data each month.
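The twenty-year projection is easy to verify. This sketch assumes a starting average of about 40 GB per household per month, which is my own rough assumption rather than a figure from this post; with any starting point in that neighborhood, a three-year doubling compounds into terabyte-scale usage within twenty years:

```python
# Doubling every three years for twenty years multiplies usage by 2^(20/3)
multiplier = 2 ** (20 / 3)
print(round(multiplier))  # roughly a 100x increase

# Assumed starting point (an illustrative guess, not from the article)
start_gb = 40
projected_tb = start_gb * multiplier / 1000
print(round(projected_tb, 1))
```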

But we like using data on our cellphones. The wireless carriers have trained us to be very cautious in that usage because of the severe data caps and the horrendously high price for exceeding your data cap. But even with those restrictions, the wireless carriers need more spectrum and are expected to make this an interesting auction.

Some Tiny Steps for Web TV

There were several announcements in the last week from programmers who are going to put their content onto the Internet. I've had several people ask me if I think this means that OTT is finally here, and unfortunately I have to say no. But from these tiny cracks might eventually come bigger fissures. What people are hoping for is the ability to buy only the channels they want without having to buy the big cable bundles. But we still have a long way to go to get to that.

The first announcement was from HBO. They plan to roll out a still-undefined OTT product in 2015. HBO and the other movie channels are unique in the programming world since they are always sold as premium channels and are always expensive. HBO was reported to have over 28 million US subscribers in mid-2013 through terrestrial or satellite TV subscriptions.

But HBO also has the most pirated show in Game of Thrones, and they have gotten a lot of requests to sell their content on an a la carte basis. HBO has not announced the details of the planned offering, but one can picture it being something like the HBO Go product that comes with most cable subscriptions. It would not be surprising to see the offering consist of one streaming live channel along with access to the HBO library of content. There has also been no talk of price, but it won't be cheap. HBO sells its content wholesale to cable companies in the range of $12 per month, so one would expect them to charge an OTT price at least as high as the cable companies do, meaning a price of between $15 and $20. Such a product is going to appeal to some cord-cutters and cord-nevers who want to get Game of Thrones and Bill Maher without having to pirate them. But it's going to be easier and cheaper for most people to buy HBO from their cable company. It's a smart move by HBO, which will probably be able to add a few million new subscribers. But in doing so they are not going to damage the traditional cable market.

The other announcement this week was from CBS, which announced an OTT package for $5.99 per month. This would consist of a live network stream from major market affiliates as well as a library of older content on demand. But for now it won't include football. This product is a bit more of a puzzle from an OTT perspective. Currently, if you buy content from the big cable companies like Comcast, you normally get access to the CBS library online on any device. For example, I pay my cable company about $20 for a basic package that gets me access to the libraries of all four major networks. If ABC, NBC and Fox match the CBS offering, then a person wanting all four networks online would be paying more than they pay for basic cable.

The only real advantage of the CBS package is that it comes with a live stream online, and this is the first time that a network has offered live content online. But one has to ask if that is really worth $6 per month. This is about triple what CBS gets from cable companies that carry their content, so one can see why they want to sell their content at a premium price. But are that many people willing to pony up $6 just to get one channel on the Internet? There will be some, but I can't see this being very popular. After all, in most of the US I can get this on a TV for the cost of a pair of rabbit ears.

It's becoming obvious that any OTT programming that makes it to the web is not going to be cheap. And it's money that drives the cord cutters. The New York Post reported a week ago that the upcoming Sony OTT package was going to offer 100 channels on the web for $80, while others are reporting a price of between $60 and $65. Those prices are not going to lure many people off cable in metropolitan markets due to the bundling from the big cable companies. Most people are in a position where the cost of their cable internet product rises if they ditch cable TV. In my own case, Comcast would only sell me a 50 Mbps connection if I bought at least basic cable.

One has to ask if any of the packages mentioned to date are going to have much appeal. There are going to be stray customers who think these products are great. The one with the most chance of success is HBO, because it's going to appeal to some of those with no cable subscription. But the CBS offer is a head scratcher to me. While there will be some who would love to get network TV on any device, the $6 monthly price tag feels like a lot for one channel. And Sony's plans are even odder to me. There are certainly people who hate their cable company and would love to change to somebody else. Having 100 channels available on any device sounds attractive (assuming that this won't only be available on Sony smart TVs). But it's really hard to go up against the bundle in metropolitan areas, so it seems that selling packages for about the same price as the cable companies won't be that attractive. Sony might do better in rural areas with people who want to get off satellite, but those are the areas that often have the worst broadband, where people might not be able to subscribe to OTT programming.

None of these announced products is going to make a big crack in the cable market, but they are all starts toward that change. Somebody is going to have to come up with packages that a lot of people find attractive to get any market traction, and that is going to take the willingness of the programmers. They are still making too much from the traditional cable packages to want to rock the boat. A lot of these early attempts at OTT will probably fail, but that's what happens to those willing to go first in a new market – a market that consumers want if it can ever be done right.

Our Cellphone Data Usage

Last week I looked at what we are downloading on home Internet connections. Today I am taking the same look at the latest statistics about what we download on our cellphones. These numbers represent downloading only through cellular connections and don't include using cellphones on WiFi.

The first thing that strikes you about the numbers is how small they are compared to landline data connections. We are literally using 100 times more data, on average, on a landline connection than on the average cell phone. It's obvious that people are still very cautious about using their cellphone data. But the overall use of data on cellphones is rising, and it is not going to take very many years of growth until the average person is bumping up against the normal 2 gigabyte monthly cap on smaller cellphone plans. It's clear in looking at these numbers that the cellular carriers have trained us not to use cellular data, but as I recently wrote, they are now trying to figure out ways to reverse that trend.
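How many years of growth would it take the average user to hit a 2 GB cap? This sketch uses the average download and growth rate from the figures discussed in this post; treating the growth rate as constant is of course a simplifying assumption:

```python
import math

avg_mb = 404          # average monthly download, 1st half of 2014
annual_growth = 0.20  # the 44% two-year increase works out to 20% per year
cap_mb = 2 * 1024     # a common 2 GB monthly data plan

# Years until avg_mb * (1 + growth)^years reaches the cap
years_to_cap = math.log(cap_mb / avg_mb) / math.log(1 + annual_growth)
print(round(years_to_cap, 1))
```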

I compared the statistics from the first half of 2014 to the first half of 2012 just to see how things have changed. Consider the following chart that compares average cellular downloads in the US for the two periods.

Rank    1st Half 2012          Pct      1st Half 2014          Pct

1       YouTube                27.2%    YouTube                17.6%
2       HTTP                   19.9%    Facebook               14.0%
3       Facebook                8.7%    HTTP                   12.7%
4       MPEG                    7.2%    MPEG                    8.6%
5       Pandora                 5.4%    SSL                     6.5%
6       SSL                     4.8%    Google Market           5.3%
7       Google Market           3.5%    Pandora                 5.2%
8       Netflix                 2.2%    Netflix                 5.1%
9       Flash Video             1.7%    Instagram               3.5%
10      Windows                 1.7%    iTunes                  3.1%

Total                          82.3%                           81.6%

To put this into perspective it’s also important to look at average Internet usage per cellular customer. Consider the following numbers:

                                1st Half 2012    1st Half 2014    Pct Increase

Average Monthly Download        280 MB           404 MB            44%
Average Monthly Upload           33 MB            69 MB           109%
Aggregate Monthly Usage         313 MB           465 MB            50%

This shows that average cellphone download increased only 44% over a two-year period. This is much slower than the 33% annual increase we see in landline data use. But uploads doubled over two years, and maybe it's all those selfies! Combining these two sets of statistics tells the real story. Following is the average download in megabytes used by cellphone users for each of the top data applications in 2012 and 2014.

Rank    1st Half 2012               1st Half 2014

1       YouTube          76.7 MB    YouTube          71.1 MB
2       HTTP             55.7 MB    Facebook         56.6 MB
3       Facebook         24.4 MB    HTTP             51.3 MB
4       MPEG             20.2 MB    MPEG             34.7 MB
5       Pandora          15.1 MB    SSL              26.3 MB
6       SSL              13.4 MB    Google Market    21.4 MB
7       Google Market     9.8 MB    Pandora          21.0 MB
8       Netflix           6.2 MB    Netflix          20.6 MB
9       Flash Video       4.8 MB    Instagram        14.1 MB
10      Windows           4.8 MB    iTunes           12.5 MB

If you looked only at the first table you would suppose that YouTube downloads are way down. But the second table shows YouTube is down only about 7% over two years. There are a few uses of cellphones that are way up. Netflix is up about 232% over two years. Facebook is up 132% over two years. And Google Market is up 118% over two years. The only thing other than YouTube that is slightly down over the last two years is HTTP, or web browsing.
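The percentage changes quoted here follow directly from the per-app megabyte figures in the two tables; a quick sketch to recompute them:

```python
# Average download per app in MB (copied from the two tables above)
h1_2012 = {"YouTube": 76.7, "HTTP": 55.7, "Facebook": 24.4,
           "Google Market": 9.8, "Netflix": 6.2}
h1_2014 = {"YouTube": 71.1, "HTTP": 51.3, "Facebook": 56.6,
           "Google Market": 21.4, "Netflix": 20.6}

# Percent change over the two years for each application
for app, old in h1_2012.items():
    change = (h1_2014[app] / old - 1) * 100
    print(f"{app}: {change:+.0f}%")
```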

If you trended this two years forward, Facebook would clearly become the predominant use on cellphones, followed by Netflix, with YouTube and MPEG video trending toward third and fourth. It's obvious that video overall is growing on the cellphone faster than anything else, with social networking also a significant use.

It will be interesting to see what impact is felt over time as the wireless carriers push more data usage. Both AT&T and Verizon have been pushing bigger family plans in an attempt to get people off WiFi and back onto their 4G networks. It's pretty obvious that, on average, people are not using a lot more data on their cellphones and continue, on the whole, to be cautious. This is not to say that there aren't many people who use a lot of data, or that there aren't already a lot of people who exceed their monthly data plans. These statistics represent nationwide averages. Cellular companies report that data sales are way up, while these numbers show that average usage is not. One has to think that perhaps people are buying more data than they actually use.

Who Will Own the Internet of Things?

Yesterday's blog talked about how the current Internet is falling under the control of a handful of large corporations – Apple, Amazon, Facebook, Google and Microsoft. This leads me to ask if the upcoming Internet of Things is also going to be owned by a handful of companies.

This is not an idle question, because it has become clear lately that you don't necessarily own a connected device even though you might pay for it. As an example, there was recently an article in the New York Times that reported that a car company was able to disable cars whose owners were late in making payments. The idea of Ford or General Motors still having access to the brains of your vehicle even after you buy it is unsettling. It's even more unsettling to think that access is in the hands of somebody at your local car dealer. Imagine them turning off your car when you are far away from home or when you have a car full of kids. But even worse to me is that if somebody can turn off your car, then somebody else can hack it.

The car companies are able to do this because they maintain access to the root directory of your car's computer system. Whether you financed the car with them or paid cash, they still maintain a backdoor that lets them get into your car's computer remotely. They might use this backdoor to disable the vehicle, as in this example, or to download software upgrades. But as long as they have that ability, then to some degree they still have some control over your car and you. You have to ask if you truly own your own car. As an aside, most people don't realize that almost all cars today also contain a black box, much like the flight recorder in an airplane, that records a lot of data about your car and your specific driving habits. It records data such as how fast you drive and whether you are wearing your seatbelt – and this data is available to the car companies.

Perhaps the car is an extreme example, because a car is probably the most complicated device that you own. But it's likely that every IoT device is going to have the same kind of backdoor access to its root directory. This means that the company that made an IoT device is going to have a way to gain access to it, and that every smartphone, appliance, thermostat, door lock, burglar alarm and security camera can be controlled to some degree by somebody else. It makes you seriously ask whether you entirely own any smart device.

Over time it is likely that the IoT industry will consolidate and that a handful of companies will control the vast majority of IoT devices, just like the big five companies control a lot of the Internet. And it might even be the same companies. Certainly Apple, Google and Microsoft are all making a big play for the IoT.

I've written before about the lack of security in most IoT devices. My prediction is that it's going to take a few spectacular failures and security breaches of IoT devices before the companies that make them pay real attention to security. But even should they tighten up every security breach, if Google or Apple maintains backdoor access to your devices, then they are not truly secure.

I think that eventually there will be a market for devices that a buyer can control and that don't keep backdoor access. It certainly would be possible to set up an IoT network that doesn't communicate outside the home, but where devices all report to a master controller within the home. But it's going to take people asking for such devices to create the market for them.

If people are happy to have Apple or Google spy on them in their homes then those companies will be glad to do it. One of the first things that crossed my mind when Google bought Nest was that Google was going to be able to start tracking a lot of behavior about people inside their homes. They will know when you wake and sleep and how you move around the home. That may not sound important to you, but every smart device you add to your house will report something else about you. With the way that the big companies mine big data, the more they know about you the better they can profile you and the easier it is for them to sell to you. I don’t really want Google to know my sleep habits and when I go to the bathroom. To be truthful, it sounds creepy.

A New Kind of Internet?

Something very interesting happened over the last few weeks that provides a glimpse into a different future for the Internet. The students and others who were protesting in Hong Kong were able to set up a private network that bypassed the Chinese authorities using a blockchain. I will describe what that is below.

There are currently a lot of people who don’t like the way the Internet operates today. First, we have the NSA surveillance and everything that implies. Numerous countries around the world are in the process of requiring that local data be kept on servers within their borders so that it doesn’t leave the country. In effect we are looking at a world where each country may have its own Internet cloud and a firewall around its data.

Probably even more intrusive, we have a few large companies controlling a large percentage of what happens on the web. Science fiction writer and futurist Bruce Sterling calls these large companies the ‘Stacks’ and his current list includes Apple, Amazon, Google, Facebook and Microsoft. We know that these large companies each have their own agenda for tracking each of us, mostly for marketing purposes, but each of them also has cooperated to some degree with the NSA.

Finally, we have the possibility that the FCC in the US is going to vote against net neutrality, which would free the large companies and the ISPs to do what they wish. It may seem a bit nationalistic to think that what happens here is important to the world, but since the large companies that control the web today are all American, to some extent, as goes America so goes the web.

There are many who are disturbed by these trends and I have seen numerous articles asking if the Internet as we think of it is already dying. To counteract these trends we have seen numerous new browsers, email services and encryption programs introduced in the last year for people who are looking to opt out of the surveillance world.

And now we have Hong Kong, which sits behind the Great Firewall of China where the government micromanages Internet access. Had the Hong Kong protesters used any of the normally available services to communicate, such as email or the various social media sites in China, they would have been quickly squashed. So instead they cleverly established a blockchain.

A blockchain is a software technology that was established as the basis for trading Bitcoins. A simplified explanation of a blockchain is that it’s a distributed consensus network that allows communications to be made securely and without any centralized authority. The Bitcoin world is entirely based upon trading currency, but the same technology can be used to exchange any other kind of communication such as emails, tweets, etc. Every transaction is encrypted, including a unique encrypted code given to each user of the network. This means that even if intercepted by the Chinese authorities, the blockchain communications were coded both in terms of content and the identity of participants.
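To make the chaining idea concrete, here is a minimal toy sketch of my own (nothing like Bitcoin’s actual protocol): each message is bundled with the hash of the message before it, so every new entry commits to the entire history that precedes it.

```python
import hashlib
import json
import time

def make_block(payload, prev_hash):
    """Bundle a payload with the previous block's hash and a timestamp,
    then hash the whole bundle to produce this block's identifier."""
    block = {
        "payload": payload,      # a real system would encrypt this field
        "prev_hash": prev_hash,
        "timestamp": time.time(),
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# Build a tiny chain of messages.
genesis = make_block("genesis", prev_hash="0" * 64)
msg1 = make_block("meet at the square at noon", genesis["hash"])
msg2 = make_block("bring water and umbrellas", msg1["hash"])

# Each block commits to the one before it: altering msg1 would change
# its hash and break the link recorded inside msg2.
assert msg2["prev_hash"] == msg1["hash"]
```

The point of the sketch is only the linking structure; real systems add encryption of the payload and digital signatures on top of it.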

One of the interesting things about blockchain communications is the consensus required for it to work. Each user involved in the blockchain basically validates everybody else. If for some reason there is no communication on the blockchain, then after a set amount of time the whole chain collapses.
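The validation side can be sketched the same way (again a toy model of my own, not the real Bitcoin consensus rules): every participant independently recomputes each block’s hash and checks the links, so no central authority is needed to detect tampering.

```python
import hashlib
import json

def block_hash(block):
    """Hash every field except the block's own stored hash."""
    body = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()

def validate_chain(chain):
    """A participant accepts the chain only if every block's stored hash
    matches its contents and every block links to its predecessor."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False   # contents were altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False   # the link to the previous block is broken
    return True

# Build a two-block chain by hand.
b0 = {"payload": "genesis", "prev_hash": "0" * 64}
b0["hash"] = block_hash(b0)
b1 = {"payload": "march at noon", "prev_hash": b0["hash"]}
b1["hash"] = block_hash(b1)

assert validate_chain([b0, b1])

# Tampering with an earlier block is caught by every honest checker.
assert not validate_chain([dict(b0, payload="stay home"), b1])
```

Because everyone can run this check, agreement about the chain’s contents emerges from the group rather than from any one trusted server.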

So the protesters in Hong Kong established a temporary encrypted peer-to-peer network that was impenetrable by the authorities. This let them communicate and coordinate their activities free from oversight or censorship. And when the protest ends the blockchain will collapse and disappear.

This concept could become the basis for establishing secure group communications in the future that falls outside of the NSA or large company tracking. It’s not hard to imagine those with similar interests of some sort being able to launch their own blockchain that would just look like indecipherable bits as it passed through any Internet hubs or monitoring points. Such networks need not be nefarious and a blockchain network could be used for any group like a college fraternity, a science fiction fan club, the fans of a sports team or band, or anything else. Such networks could spring up and disappear as needed and would only be available to those with some sort of in on how to join. But even those who are insiders in the network have no way to see what others are doing and they can only decode their own transactions.

It’s an interesting concept and is the first effective way that people are bypassing the surveillance world. The vast majority of people in the world have nothing to hide, but that doesn’t mean that they enjoy having large companies or governments track and record everything they do. Expect there to be numerous attempts to create alternatives to today’s Internet. And expect it to be a cat-and-mouse game where new strategies avoid surveillance for a while until they are cracked, with newer ideas waiting to take their place. We could be seeing the start here of a new Internet where people take back privacy by opting out of products offered by the mainstream companies.

Are We Ready for the New Digital Divide?

Yesterday I briefly discussed a few of the major predictions that came out of a Pew Research survey of industry experts asking what we’ll be seeing from broadband applications by 2025. They predicted such things as major use of telepresence, greatly enhanced virtual reality and closer daily ties between us and our computers. Today I want to talk in more detail about one of the negative predictions: many of the experts foresee a new digital divide that will be more extreme than the current one.

The digital divide today is between people who have broadband and those that don’t. Those without broadband fall into a few categories – those that live in rural areas where broadband is not available, those who are too poor to afford broadband and those that don’t want it. I’ve talked about this before, but these experts are saying that the future digital divide will be more extreme because it will separate those who can participate in an all-digital world and those who cannot.

The future digital divide will matter because there are going to be essential services that require big bandwidth. Businesses without enough bandwidth will not be able to take part in telepresence, and this is going to cut them off from much of the world. Both their suppliers and customers are going to expect them to be able to communicate virtually. Homes are going to need big bandwidth for education, medical care and even shopping. Anybody without big bandwidth is going to be left out of the mainstream and will have to accept something less.

Those that have access to the bandwidth and the kinds of applications that are predicted for a decade from now will have a major advantage over those who do not have good enough broadband. This means people with big broadband will get the jobs, enjoy better health, be able to live in their homes to an older age and be better educated than those that don’t have big broadband. The gap today is not nearly this extreme, but with the future that the experts all foresee, broadband becomes a necessity and not something that is nice to have.

Big bandwidth services are going to require a landline broadband connection, be that fiber or an upgraded cable network. Wireless is going to have its place to keep you connected to the basic services while on the move, but telepresence, virtual reality and most IoT services are going to be landline-based.

It is almost certain that a lot more people will fall on the wrong side of the digital divide than today. There are tens of millions of households and businesses for which the broadband they have now will become totally inadequate in the future. Many of the technologies we use today that deliver okay bandwidth – DSL networks, older generation cable networks and WISP wireless networks – are not capable of delivering the kind of bandwidth that will be needed in the future. These technologies can provide speeds that most people currently find acceptable. But when we start using applications that require a hundred megabits or maybe many hundreds of megabits, these technologies are all going to be inadequate.

The only two technologies that can deliver the kind of bandwidth needed in the future are fiber and updated cable networks. We all know that fiber is capable of incredible speeds and normally requires an upgrade in lasers and electronics to go faster. But there are upgrade paths for cable networks that ought to be able to provide gigabit speeds. The problem is that the cable network upgrades are complicated and costly. In many cases it’s not just electronics that needs to be changed for a cable network to go faster. It can mean building a lot more fiber into the cable network and sometimes even having to replace much of the coaxial cable. It means changing the cable headend, the settop boxes and the cable modems. It means almost a whole new network to get to gigabit speeds. But it can be done.

One has to realistically ask how many communities are going to get very fast, yet still affordable broadband. Certainly some of the major cities are getting gigabit fiber from Google and a handful of other providers. But even in those communities it looks like fiber isn’t going everywhere. Fiber is being put into neighborhoods willing to pay for the advanced services but it’s largely bypassing poorer neighborhoods and apartment buildings. In those same communities the cable companies are responding to fiber competition by upgrading speeds.

But what about all of the places that don’t get fiber over the next decade? Will the cable companies make the needed investments in smaller markets to get faster speeds? Much of small-town America has broadband speeds today between 3 Mbps and 15 Mbps due to older technology, and it’s not hard to bet that those networks are not going to be upgraded.

One of the new industry buzzwords is that fiber is a utility, and is something that every community needs to be able to thrive. While this may be somewhat true today, within a decade fast data speeds will be essential for businesses to operate and for homes to partake in the services that come only with speed. The demands for faster broadband will become louder as more and more communities that have okay broadband today find that same broadband to be totally inadequate tomorrow.

What Does a Gigabit Get Us?

Pew Research did a survey of 1,464 industry experts and asked them what killer apps we can expect if the US is able to significantly increase customer bandwidth between now and 2025. About 86% of the experts thought that bandwidth would improve enough by then to provide a platform for supporting widespread new applications.

The question does not suppose that everybody will have a gigabit of download speed, although by then there will be many homes and businesses with that much speed available. But one can also suppose that by then there will be many people with download speeds of hundreds of megabits. The cable companies are on a path with DOCSIS 3.1 to be able to increase speeds significantly on their networks if they so choose. So the biggest chance for fast speeds for the masses is not having fiber built everywhere by 2025, but rather having the cable companies step up over the next decade. Most experts think that they will to some extent (and I agree).

There were a few applications that a number of the experts agreed would become prevalent if download speeds increase:

Telepresence. There was a feeling that telepresence will have come a long way over the next decade. We already see the beginning of this today. For example, Julian Assange from WikiLeaks recently appeared at a summit in Nantucket via hologram. That is the precursor for having routine meetings with people by hologram. This would not just be for speakers at conferences (though it would make it easier to get more impressive speakers when they don’t have to travel). It means having salesmen make calls by telepresence. It means having staff meetings and other business meetings by telepresence. This is going to have a huge impact on business and could represent huge cost savings by reducing travel and the wasted costs and hours that travel entails.

But there is also going to be a huge market for residential telepresence. One of the most popular features today of an iPhone is Facetime that lets people easily see each other while they talk. And Skyping has become wildly popular. One can imagine that people will grab onto telepresence as soon as the associated hardware is affordable, as a way to spend time with family and friends.

The experts also think that telepresence will have a big impact on medicine and education. Telemedicine will have come a long way when a patient can spend time in the ‘presence’ of a doctor. Telepresence also will be huge for shopping since you will be able to get 3D demos of products online. In fact, this might become the most prominent early use of the technology.

Virtual Reality. Somewhat related to telepresence will be greatly improved virtual reality. We have the start of this today with Oculus Rift, but over a decade, with more bandwidth and faster processors we can have improved virtual reality experiences that can be used for gaming or for blending the fantasy world with the real one. There was also news last week that Microsoft demonstrated a 3D hologram gaming platform they are calling GameAlive that brings something akin to a holodeck experience into your living room. Over a decade virtual reality is likely to move beyond the need for a special helmet and will instead move into our homes and businesses.

Imagine being in a gym room and playing a game of tennis or some other sport with a friend who is elsewhere or against an imaginary opponent. Imagine taking virtual tours of foreign tourist destinations or even of visiting imaginary places like other planets or fantasy worlds. It is likely that gaming and virtual reality will become so good that they will become nearly irresistible. So I guess if computers take all of our jobs at least we’ll have something fun to do.

Internet of Things. Within a decade the IoT will become a major factor in our daily lives and the interaction between people and machines will become more routine. We are already starting to see the beginning of this in that we spend a lot of our time connected to the web. But as we become more entwined with technology it means a big change in our daily lives. For example, experts all expect personal assistants like Siri to improve to the point where they become a constant part of our lives.

Just last week we saw IBM roll out their Watson supercomputer platform for use in daily apps. That processing speed along with better conversational skills is quickly going to move the web and computer apps deeper into our lives. Many of the experts refer to this as a future of being ‘always-on’, where computers become such a routine part of life that we are always connected. Certainly wearables and other devices will make it easier to always have the web and your personal assistant with you.

Aside from the many benefits of the IoT which I won’t discuss here, the fact that computers will become omnipresent is perhaps the most important prediction about our future.

Not everything predicted by the experts was positive and tomorrow I am going to look at a few of those issues.

Mr. Watson . . . . come here.

This week IBM cut the ribbon on a “Watson Client Experience Center” in New York City, where along with five other centers it will provide access to the Watson supercomputer. A few weeks ago IBM also announced the availability of what it calls Bluemix, a suite of several cognitive-based cloud services. Several of the articles I read about this announcement say that Watson is bringing artificial intelligence to the world. But it’s not. Watson is a pretty amazing computer system and can do a lot of great things, but the computer is still no smarter than your toaster. You may ask how I can say that since Watson was able to soundly beat the two best Jeopardy champs a few years ago.

Let’s look at how Watson works. First, Watson is a supercomputer, meaning that it has massive computational power and fast input/output. Watson is configured as a cluster of ninety IBM Power 750 servers, each of which uses four 3.5 GHz POWER7 eight-core processors, with four threads per core. In total, the system has 2,880 POWER7 processor cores and 16 terabytes of RAM. Watson has a natural language interface, meaning that it is designed to be queried by conversation, in the same manner as Apple’s Siri.
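As a sanity check, the quoted 2,880-core total follows from the cluster layout, assuming the commonly cited configuration of four eight-core POWER7 chips in each Power 750 server:

```python
servers = 90            # IBM Power 750 servers in the cluster
chips_per_server = 4    # assumed: four POWER7 chips per Power 750
cores_per_chip = 8      # eight cores per POWER7 chip
threads_per_core = 4    # four hardware threads per core

cores = servers * chips_per_server * cores_per_chip
assert cores == 2880    # matches the total quoted for Watson

threads = cores * threads_per_core  # 11,520 hardware threads in all
```

That thread count is what lets Watson run thousands of candidate-answer searches in parallel.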

Watson uses a hypothesis generator. What this means is that when it is asked something, Watson searches its databases and compiles all of the answers that seem to answer the question posed to it. Through sheer blazing computational speed Watson can search this entire database quickly. It then ranks the results according to the frequency with which it encounters each answer. For the Jeopardy challenge Watson was fed multiple reference sources like encyclopedias, textbooks and all of Wikipedia.
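A toy sketch of that frequency-based ranking idea (my own simplification, nothing like IBM’s actual DeepQA pipeline): treat every word that co-occurs with the question terms as a candidate answer, and rank candidates by how often they turn up across the reference text.

```python
from collections import Counter

def rank_hypotheses(question_terms, sentences):
    """Collect every word that co-occurs with all of the question terms
    and rank those candidate answers by raw frequency."""
    candidates = Counter()
    for sentence in sentences:
        words = sentence.lower().split()
        if all(term in words for term in question_terms):
            for word in words:
                if word not in question_terms:
                    candidates[word] += 1
    return candidates.most_common()

corpus = [
    "paris is the capital of france",
    "the capital of france is paris",
    "lyon is a city in france",
]
ranking = rank_hypotheses({"capital", "france"}, corpus)
# 'paris' scores 2 here, but so do filler words like 'is' and 'the';
# a real system scores candidates with hundreds of evidence features
# rather than bare counts.
```

The gap between this toy and the real thing is exactly the point the post makes: the ranking is statistical, not understanding.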

Finally, Watson uses what IBM calls dynamic learning. This means that when Watson makes a mistake, which has to be often when working in something as imprecise as English, Watson can take feedback from the user when told that its answer is wrong. It stores this feedback and uses its ‘learning’ to influence the rankings when it next encounters the same question.
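That feedback loop can also be sketched in a few lines (again a toy model of my own, not IBM’s implementation): start from frequency scores, and let per-question feedback adjust the rankings the next time the same question comes up.

```python
class FeedbackRanker:
    """Toy sketch of 'dynamic learning': rankings start from frequency
    scores, and user feedback nudges a per-question adjustment that is
    applied the next time the same question is asked."""

    def __init__(self, base_scores):
        self.base = dict(base_scores)   # answer -> frequency score
        self.adjust = {}                # (question, answer) -> bonus/penalty

    def rank(self, question):
        scored = {
            answer: score + self.adjust.get((question, answer), 0)
            for answer, score in self.base.items()
        }
        return sorted(scored, key=scored.get, reverse=True)

    def feedback(self, question, answer, was_correct):
        delta = 1 if was_correct else -2   # wrong answers get pushed down hard
        key = (question, answer)
        self.adjust[key] = self.adjust.get(key, 0) + delta

ranker = FeedbackRanker({"toronto": 5, "chicago": 4})
assert ranker.rank("us city")[0] == "toronto"   # initial frequency winner
ranker.feedback("us city", "toronto", was_correct=False)
assert ranker.rank("us city")[0] == "chicago"   # feedback reorders the list
```

Note that nothing here understands *why* the first answer was wrong; the system just remembers that it was, which is the author’s point about Watson.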

But under it all Watson is no smarter than your desktop computer because there is no actual intelligence in the system, artificial or otherwise. What Watson does to simulate intelligence is present a friendly language interface and fast computational power to come up with answers to questions. But Watson is only as ‘smart’ as the databases underneath it. For Jeopardy they did not allow Watson access to the Internet because the Internet is full of incorrect facts. Watson has no way of distinguishing between what is true and what is not, other than through feedback from users who correct its mistakes. Watson would be like many of us and would fall for every Internet hoax that hits the web. For example, there was an Internet hoax earlier this year claiming that Flo from the insurance commercials had been killed, and if Watson were connected to the web it would believe such an untrue rumor based upon the sheer volume of claims made about the hoax.

This is not to say that Watson can’t do amazing things. Imagine Watson paired with Siri. Let’s face it, Siri is okay with driving directions but can quickly get flustered on almost anything else. With Watson’s database behind Siri it would become much more useful in a hurry. And even for driving directions Watson would help Siri be better. Siri is great at getting you between towns, but I’ve noticed that in crowded urban environments that Siri regularly wants you to pull into the wrong parking lot or driveway, and over time Watson would help Siri learn these little nuances of the map through user feedback.

Expect over the next few years to see a flood of new apps that do a better job of working through a spoken interface. Already there are interesting new ventures that plan on incorporating Watson. For example, the founder of Travelocity wants to roll out a service called WayBlazer that will help you figure out things to do when you travel. The goal is to help you find activities that interest you rather than being steered to the normal tourist traps. A start-up called LifeLearn wants to build a tool to help veterinarians diagnose pet ailments better. A company called SparkCognition wants to offer a service to help security people spot security risks by having Watson ‘think like a security expert’. Expect all sorts of new programs and apps that take advantage of Watson’s language interface and its ability to quickly search databases.

This is a big breakthrough in that this is the first time that mass computational power will be brought into our daily lives through apps. Those apps are going to start doing things that we have always wanted computers to do. But let’s not forget how quickly computers are getting better. I reported last month on a company that expects to have a desktop supercomputer by 2017 that will be several orders of magnitude faster than Watson. Within a decade there will be computers everywhere with the power that Watson has today. And let’s also not forget that Watson is not smart and that there is zero cognition in the system. Watson doesn’t think, but rather searches and compiles large databases quickly. That is incredibly useful and I will be glad to use Watson-based services – but this is not yet anything close to artificial intelligence.