Wireless Shorts

Today’s blog covers several interesting bits of news out of the cellular world.

New Cellphone Tracking Rules: The Justice Department recently declared that US prosecutors and federal law enforcement agencies will need to get a search warrant if they want to use tools to track the location of a cellphone.

It’s been reported that the FBI and the DEA have been using cellphone simulators without warrants. These devices mimic a cellphone tower and gather information on all of the cellphones within range without actually processing calls. The ACLU has estimated that at least 53 different agencies have been using cellphone simulators, but the number may be even higher since many agencies have been secretive about the practice.

Privacy advocates have complained that these devices pick up information about every cellphone within range of the simulator and not just the phones of the people being tracked. Under the new guidelines these agencies will have to immediately delete any information they gather that does not apply to the specific person being tracked.

California Requires Warrant to Search a Cellphone: In a related story, California just passed the California Electronic Communications Privacy Act, which now requires police to get a warrant to search a cellphone. California becomes only the third state, after Maine and Utah, to provide this protection for citizens.

Until now, and still in most states, police routinely search the cellphones of people who are arrested, often just fishing for information rather than having any specific reason to check a given phone.

Faster Cellphone Web Browsing: Google has announced a new technology that is going to dramatically speed up web browsing on cellphones. Until now cellphones have been designed to quickly process data coming from apps, but as anybody who has tried to read native web pages from their cellphone knows, the process of downloading and displaying traditional web pages has been painfully slow.

Google is calling the technology Accelerated Mobile Pages (AMP). In practical application AMP will cut the time required to open a web page from many seconds down to milliseconds. Google is going to make the technology available to everyone, but the consensus is that it will still mostly benefit Google. Apps and sites like Facebook have captured more mobile traffic than Google, and this technology can help keep Google’s search engine and other applications relevant on cellphones.

Young People Think it’s Okay to Track a Lover: A survey in Australia showed that over half of young people between 16 and 24 think that using technology to track a lover is okay. This includes practices such as looking through a mate’s phone or installing apps that track and follow their location.

But older people in the survey overwhelmingly, at 84%, thought that tracking somebody’s cellphone usage is a violation of privacy and trust. The researchers conducting the survey were not able to pinpoint why the younger generation largely thinks this is okay, other than to postulate that this is the first generation that has grown up with cellphones available since birth.

Selfies More Deadly Than Sharks: So far in 2015 there have been 8 worldwide deaths due to shark attacks but 15 deaths due to people taking selfies. This includes incidents like the Japanese tourist who fell down the steps of the Taj Mahal while taking a selfie. It also includes a man killed while taking a selfie during the running of the bulls in Pamplona. There have been a few such deaths in Russia this year prompting the government to put out a TV commercial warning people to be careful when taking selfies.

New Video Format

Six major tech companies have joined together to create a new video format. Google, Amazon, Cisco, Microsoft, Netflix, and Mozilla have combined to create a new group called the Alliance for Open Media.

The goal of this group is to create a video format that is optimized for the web. Current video formats were created before there was widespread video viewing through web browsers on a host of different devices.

The Alliance has listed several goals for the new format:

Open Source: Current video codecs are proprietary, making it impossible to tweak them for a given application.

Optimized for the Web: One of the most important features of the web is that there is no guarantee that all of the bits of a given transmission will arrive at the same time. This is the cause of many of the glitches one gets when trying to watch live video on the web. A web-optimized video codec will be allowed to plow forward with less than complete data. In most cases a small amount of missing bits won’t be noticeable to the eye, unlike the fits and starts that often come today when the video playback is delayed waiting for packets.

Scalable to any Device and any Bandwidth: One of the problems with existing codecs is that they are not flexible. For example, consider a time when you wanted to watch something in HD but didn’t have enough bandwidth. The only option today is to fall all the way back to an SD transmission, at far lower quality. But between these two standards is a wide range of possible options where a smart codec could analyze the available bandwidth and then maximize the transmission by choosing among the many variables within a codec. This means it could produce ‘almost HD’ rather than defaulting to something of much poorer quality.

Optimized for Computational Footprint and Hardware: This means that the manufacturers of devices would be able to optimize the codec specifically for their devices. Not all smartphones, tablets, or other devices are the same, and manufacturers would be able to choose settings that maximize the video display for each of their devices.

Capable of Consistent, High-quality, Real-time Video: Real-time video is a far greater challenge than streaming video. Video content is not uniform in quality and characteristics, so there can be a major difference in quality between two different video streams watched on the same device. A flexible video codec could standardize quality, much the way a sound system can level out volume differences between different audio streams.

Flexible for Both Commercial and Non-commercial Content: A significant percentage of videos watched today are user-generated and not from commercial sources. It’s just as important to maximize the quality of Vine videos as it is for showing commercial shows from Netflix.
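The ‘scalable to any device and any bandwidth’ goal is essentially what adaptive quality selection does: pick the best rendition that fits the measured bandwidth instead of falling all the way back to SD. Here is a minimal sketch of the idea in Python; the quality tiers and their bitrates are hypothetical illustrations, not profiles from any real codec:

```python
# Illustrative sketch of bandwidth-adaptive quality selection: choose the
# highest rendition whose required bitrate fits the available bandwidth.
# The tier names and Mbps figures below are made-up examples.

RENDITIONS = [
    ("SD 480p",        2.0),   # (name, required Mbps), lowest first
    ("HD-lite 720p",   5.0),
    ("almost-HD 900p", 8.0),
    ("HD 1080p",      12.0),
]

def pick_rendition(available_mbps: float) -> str:
    """Return the best rendition that fits the available bandwidth."""
    best = RENDITIONS[0][0]          # worst case: fall back to the lowest tier
    for name, required in RENDITIONS:
        if required <= available_mbps:
            best = name              # keep upgrading while bandwidth allows
    return best

print(pick_rendition(9.5))   # enough for 'almost-HD' but not for full HD
```

With 9.5 Mbps available, the sketch lands on the in-between ‘almost-HD’ tier rather than dropping from HD all the way to SD, which is the behavior the Alliance’s goal describes.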

There is no guarantee that this group can achieve all of these goals immediately, because that’s a pretty tall task. But the combined power of these firms certainly is promising. The potential for a new video codec that meets all of these goals is enormous; it would improve the quality of web video on all devices. Personally, quality matters to me, which is why I tend to watch videos from sources like Netflix and Amazon Prime. By definition, streamed video can be of much higher and more consistent quality than real-time video. But I’ve noticed that my daughter has a far lower standard of quality than I do and watches videos from a wide variety of sources. Improving web video, regardless of the source, will be a major breakthrough and will make watching video on the web enjoyable to a far larger percentage of users.

Broadband in Canada

We can get a bit myopic and tend to think that issues in the US are somehow different than the issues elsewhere. But Canada is having the same issues with broadband that we are experiencing in the US. There is considerable activity in Canada to bring gigabit speeds to its major cities. As in the US, various incumbents are building or upgrading networks in cities.

Bell Canada is spending over $20 billion to pass 2.2 million homes with fiber in Ontario and Quebec under the trade name Gigabit Fibe (the ‘r’ intentionally not included). This will bring fiber to major cities like Montreal and Toronto. Unlike pricing in the US, they will upgrade customers to gigabit speed for an additional $10 per month.

Telus has announced it is going to spend several billion to bring fiber to Edmonton and Vancouver. Their plan is to extend fiber over a 5-year period but to eventually cover the cities.

Not to be outdone by Bell Canada, Rogers Communications, the cable incumbent, has announced plans to upgrade to gigabit speeds in Toronto and Atlantic Canada by the end of 2016.

But just like in the US, Canada has poor broadband outside of the cities. Their rural areas are much like the ones here with slow satellite or dial-up access. Last year the Canadian government announced a $305 million plan to bring better broadband over five years to about 280,000 rural households. This is a successor to the Broadband Canada Program which spent $225 million over three years to bring faster broadband to remote northern parts of the country.

These national programs are very analogous to the Connect America Fund in the US which is being used to upgrade rural areas to 10 Mbps download speeds. The funding in Canada is also being largely used to extend rural DSL or wireless and to bring some broadband to rural areas that have little or no broadband today.

Both countries are putting band-aids on rural broadband while large commercial companies bring gigabit speeds to the urban areas. While rural areas in both countries will welcome faster broadband, especially if they never had it before, these areas will soon be even further behind the cities in broadband speed than they were just a few years ago.

The governments of both countries face a major rural dilemma. It is going to cost many billions to bring real broadband to rural places. The governments in both countries are throwing federal money at old broadband technologies in order to take off some of the political heat from rural citizens. But by doing so they are doing a long-term disservice to rural areas.

In the US a lot of rural counties are willing to tackle bringing fiber to their areas. These efforts would be greatly helped if the federal government had made its subsidies available for fiber instead of for tweaking obsolete DSL.

Federal and state governments in the US have further made it harder for rural business plans to succeed by funding broadband to ‘anchor institutions’ like schools and local governments. In a lot of rural America fiber has been built to those entities but to nobody else, thus removing those anchor revenues from any local effort to fund fiber projects.

And in both countries there is an additional swath of citizens who will soon be on the wrong side of the digital divide. While large cities are getting gigabit fiber, there is not nearly as much interest in bringing faster broadband to the smaller cities in either country. While smaller towns and cities in the US have somewhat okay broadband today, they are quickly falling behind the push for urban gigabit speeds. I don’t see a lot of business plans from anybody to fund and build fiber in places under 50,000 in population – which includes many county seats across the country.

I guess it’s not surprising to see Canada’s path so closely paralleling ours. Since they have fewer large cities it is likely that they will have widespread urban gigabit broadband before we do. But in both countries the gulf between urban and rural broadband, between the haves and have-nots, is growing rapidly.

Cities Blast Verizon

In the attached letter, the mayors of twelve northeastern cities representing over 12 million people blasted Verizon for not expanding FiOS. In some of the cities the complaint is that Verizon never finished the expansion it had promised. In other cities on the list the complaint is that Verizon never came at all.

You certainly can understand the pain felt by these mayors. Verizon was first on the scene in the US with fiber and at one point in time most of these cities felt like it was just a matter of time until Verizon brought fiber to their cities. But to a large degree Verizon built in suburbs more than in cities, and last year they announced that they were done expanding FiOS. These mayors, like mayors in cities all across the country, have citizens demanding a broadband solution.

But I think these mayors are barking up the wrong tree; perhaps they know this and the letter is just a way for them to demonstrate their frustration to their constituents. I have been watching Verizon for many years and I am not so sure that Verizon even wants to be in the landline business any longer. There are numerous signs of this:

  • First is the aforementioned end of the FiOS expansion. One has to consider that broadband penetration rates are much higher than when Verizon first started its fiber expansion, and the company is surrounded by money-making expansion opportunities that would land it large numbers of new high-margin fiber customers.
  • They recently sold a large swath of landlines, including a sizable number of FiOS lines, to Frontier. Verizon has been selling copper lines to Frontier for a number of years, but these were the first sizable FiOS sales.
  • One only has to look at their annual report to see that all that they talk about to investors is their cellular business. The entire landline business is buried deep inside the report and gets no emphasis.
  • They are the only major telco to refuse large amounts of free money from the FCC to help expand rural DSL. The Connect America Fund gives telcos six years to upgrade rural DSL, and Verizon’s indifference to this money tells me that they hope to no longer own those properties before the end of the six years.
  • There are numerous documented complaints, including from their employees’ unions, that Verizon is spending the bare minimum needed to keep its copper networks functional. I’m not sure this makes them very different than AT&T or CenturyLink; until the recent Connect America Fund money, rural copper was largely neglected by every large telco.
  • All of their negative press comes from the landline business. This letter from the mayors is just another in a long line of complaints about the way Verizon is ignoring its landline business. Their press from the cellular business is much more positive.

I don’t have any inside information, but it’s my bet that if somebody offered to buy their entire landline business Verizon would take it. They have been selling chunks of landlines to Frontier over the last decade, but I doubt that Frontier can put together the funding to buy the rest of the Verizon landline business. I am not sure that anybody other than AT&T could pull off such a large purchase, and AT&T certainly would not want to inherit all of the copper markets that Verizon has been neglecting for decades.

I doubt that these mayors think that this letter, or any political pressure, is going to change Verizon’s behavior. Verizon has been ignoring these same cities now for literally decades, and in my estimation they will continue to do so. These cities all want better broadband and it’s probably time for them to consider some other solution than Verizon, as is being done by cities all over the rest of the US. The cities with the biggest problems are the ones that Verizon has only partially built – those markets are not very attractive to Google or any other fiber overbuilder. The cities with no FiOS are basically in the same position as thousands of other cities around the US, all of which are pondering if they need to find their own broadband solution.

Finally, Cable TV in the Cloud

There is finally a cable TV solution for small cable providers that does not require them to own and operate their own headend. In fact, this new solution doesn’t even require them to be a regulated cable company.

The solution is provided by Skitter TV. Skitter TV was started by Skitter Inc. based in Atlanta, Georgia and headed by Robert Saunders, one of the pioneers of IPTV. The company has developed proprietary cable hardware and software that is far less costly than other cable headends.

Skitter TV has been operating for a few years as a sort of cooperative, owned by Skitter Inc. and a number of independent telephone companies. The company’s model for the last few years has been to approach a small carrier that offers cable TV and supplant the incumbent product with Skitter TV. Most small cable operators are losing money on their cable product. Skitter TV becomes the cable provider of record and then shares profits with the local provider, which guarantees the provider a small profit on cable.

But Skitter TV just upped their game and has partnered with Iowa Network Services (INS) to bring Skitter TV to more carriers for a lower cost. INS is a consortium of independent telephone companies in Iowa that owns a substantial middle-mile fiber network and provides a number of services to members.

The latest move takes advantage of the INS fiber network and includes plans to interconnect to other telco-owned fiber networks throughout the country. This will allow companies with access to these other fiber networks to get their cable signal from the INS headend. The same economic model still holds and Skitter will offer a revenue share with local providers, who get to disconnect their existing losing cable business.

There are a few key issues to consider for a small provider looking at this opportunity. The primary one is the cost of transport needed to connect to Skitter and INS. It’s likely that companies that can get a connection to one of the other statewide networks can get this transport for a reasonable cost. But providers outside of those networks need to consider the transport costs in looking at the opportunity.

I’ve looked closely at Skitter TV and it’s a very interesting product offering. They don’t have as many standard network channels as the large urban cable systems, and this helps to hold down the costs of providing the service. But Skitter has made up for a smaller lineup by bringing a large number of non-traditional channels to the mix. They also have created channels for many of the popular online services. Overall the Skitter lineup is probably an improvement in rural markets and might even be an interesting alternative in urban markets.

One interesting option that Skitter brings is the possibility of offering a cord cutter package that includes local network channels plus a wide array of non-traditional programming. The Skitter cord cutter programming looks to be one of the more robust non-traditional packages on the market.

Customers can connect to Skitter TV using a Roku box, which is cheaper than traditional settop boxes. But Skitter also can support most traditional IPTV settop boxes that providers already have deployed.

Any small cable provider who is losing money on cable TV ought to take a look at this alternative. Even if transport costs look to be a barrier, Skitter TV is often willing to bring their own headend into a market if the numbers look attractive to them.

I think that Skitter TV will do well in the telco and IPTV cable markets because it’s become nearly impossible for a small provider to be profitable on the cable product. It’s a lot more sensible for a provider to partner with Skitter and get a guaranteed small positive margin from cable customers than to continue to bleed cash on the business line. Other than having to provide settop boxes, the Skitter partnership gets companies out of the headend, hardware, and middleware business, taking a lot of pressure off capital budgets.

Farmers and Big Data

Probably the biggest change coming soon to crop farming is precision agriculture. This applies GPS and sensors to monitor field conditions like water, soil, nutrients, and weeds in order to optimize the application of water, pesticides, and fertilizers and maximize crop yields in different parts of the farm. Anybody who has ever farmed knows that fields are not uniform in nature and that the various factors that produce the best crops differ even within one field.

Precision agriculture is needed if we are to feed the growing world population, which is expected to reach almost 10 billion by 2050. As a planet we will need to get the best possible yield out of each field and farm. This might all have to happen against a backdrop of climate change, which is playing havoc with local weather conditions.

A number of farmers have started the process of gathering the raw data needed to understand their own farms and conditions. Farmers know the best and worst sections of their fields, but they do not understand the subtle differences between all of the acreage. In the past farmers haven’t known the specific yield differences between the various microcosms within their farm. But they are now able to gather the facts needed to know their land better. It’s a classic big data application that will recommend specific treatments for different parts of a field by sifting through and making sense of the large numbers of monitor readings.

In order to maximize precision agriculture new automated farm machinery will be needed to selectively treat different parts of the fields. The large farm equipment manufacturers expect that farming will be the first major application for drones of all types. They are developing both wheeled vehicles and aerial drone systems that can water or treat sections of the fields as needed.

This is a major challenge because farming has historically been a somewhat low technology business. While farms have employed expensive equipment, the thinking part of the business was always the responsibility of each farmer, and the farmers with the best knowledge and experience would typically out-produce their neighbors. But monitoring can level the playing field and dramatically increase yields for everybody.

There are several hurdles in implementing precision agriculture. First is access to the capital needed to buy the monitors and the equipment used to selectively treat fields. This need for capital is clearly going to favor large farms over small ones and will be yet one more factor leading to the consolidation of small farms into larger enterprises.

But the second need is broadband. Gathering all of the needed data, analyzing it, and turning it into workable solutions presupposes the ability to collect data from the fields and send it to a supercomputer somewhere for analysis. And that process needs broadband. A farmer who is still stuck with dial-up or satellite access is not going to have the bandwidth needed to properly gather and crunch the large amount of data required to find the best solutions.

This doesn’t necessitate fiber to the fields because a lot of the data gathering can be done wirelessly. But it does require that farms are fed with high-speed Internet access and good wireless coverage, something that does require rural fiber. I published a blog a few weeks ago that outlined the availability of broadband on farms and it is not currently a pretty picture. Far too many farms are still stuck with dial-up, satellite, or very slow rural DSL.

Some farmers are lucky to live in areas where communications co-ops and rural telcos are bringing them good broadband, but most are in areas where there is neither broadband nor anybody currently planning on expanding broadband. At some point the need for farming broadband will percolate up as a national priority. Meanwhile, in every rural place I visit, the farmers are at the forefront of those asking for better broadband.


Our Lagging 4G Networks

I have to scratch my head when I read about people who rave about the 4G data speeds they get. First, I travel all over the country and I have yet to see a 4G data speed above 20 Mbps, and yet I’ve seen claims in various online forums of speeds as high as 60 Mbps. I’ve been in a number of major cities in the last six months and have not once seen speeds that I would consider fast.

Second, a report just came out from OpenSignal, a company that provides an app that maps cellular coverage. They recently collected data from 325,000 users around the world and used it to compare the 4G networks in 140 different countries. Their data showed that the US has the 14th slowest 4G of all these countries, at a paltry average speed of 10 Mbps.

Hungary, Denmark, South Korea, Romania, Singapore, and New Zealand have the fastest 4G, all with average speeds of above 25 Mbps, with New Zealand seeing an average speed of 36 Mbps download.

I often run speed tests, but the real way to test 4G speeds is by trying to open web pages I often use at home. I know it’s generally far more efficient to use an app rather than the mobile web, but I open web pages just to see how fast coverage is. It’s well known that speed test results can be rigged by your carrier who knows you are using a speed test site. What I generally find is that web pages that leap onto my screen at home seem to take forever to load on my cellphone, and sometimes they never load.

Why does this matter? I think it matters because there are tons of industry pundits who opine that our broadband future is wireless and that we don’t need to be investing in fiber. They say that wireless is going to get so fast that nobody will feel the need for a landline-based internet connection. For a whole list of reasons I think that argument is total bosh. Consider the following:

  • Cellular data speeds drop quickly with distance from the cell tower. Today cell phone towers are not situated for data coverage and were built to handle voice traffic. A cell tower can complete a voice call at a much greater distance from the tower than it can make a strong data connection.
  • We could always build more towers to bring transmitters closer to people. But for those new towers to work they are going to have to be fiber fed, something that very few companies are willing to invest in.
  • Cell phone signals don’t penetrate structures very well. I recently visited my dentist. In the parking lot I was easily able to read news articles on Flipboard. I then walked into the waiting room, which has big windows to the outside world, but the ability to read articles slowed down a lot. Then when I was taken back to an interior room that was only one room further from the outside, I couldn’t even get the app to open. This is not an unusual experience and I see it often.
  • Cell sites can only handle a limited number of customers, and they get overwhelmed and degrade if demand exceeds the optimum. And the more bandwidth that is delivered, the easier it is for a cell site to reach capacity.
  • The various swaths of spectrum used for cellular data each have their own unique limitations. In many cases the spectrum is carved into fairly small channels (done before we conceived of using the spectrum for data), and it’s very hard to cobble together a large wireless data path. It generally means linking several frequencies to a given customer’s data path, which is both complicated and somewhat taxing on a cellphone.
  • Data caps, data caps, data caps. Let’s face it, as long as the cellphone companies want to charge $10 per downloaded gigabyte, they cannot be a serious contender for anybody’s real-life data usage. I estimate that my household downloads at least 500 gigabytes per month at home, and I don’t think we are unusual. If I were paying cellphone data rates, that would cost me an astounding $5,000 per month. Even if they cut their rates by 90%, this would still cost an eye-popping $500 per month. As long as cellphone data rates are 100 times higher than landline rates, they are something you use to casually browse the news, not a real internet connection.
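The back-of-envelope math in that last bullet is easy to verify, using round figures of $10 per gigabyte of cellular data and an estimated 500 GB of monthly household usage:

```python
# Back-of-envelope cost of using cellular data as a home connection,
# with the rough figures from the text: $10 per gigabyte and an
# estimated 500 GB/month of household downloads.

rate_per_gb = 10          # dollars per gigabyte of cellular data
household_usage_gb = 500  # estimated monthly household download

monthly_cost = rate_per_gb * household_usage_gb
print(monthly_cost)            # 5000 -> $5,000 per month

# Even a 90% price cut still leaves an eye-popping bill:
print(monthly_cost * 0.10)     # 500.0 -> $500 per month
```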

What’s Up with 4K Video?

It seems like I can’t read tech news lately without seeing an article mentioning something new going on with 4K video. So I thought I would talk a bit about what 4K is and how it differs from other current types of video.

4K is the marketing term to cover what is officially named Ultra High Definition (UHD) video. UHD video is distinguished from current high definition video by having a higher picture resolution (more pixels) as well as more realistic colors and higher frame rates (meaning more pictures per second).

Let’s start with some definitions. 4K video is defined by the Consumer Electronics Association as a video stream that has at least 3,840 X 2,160 pixels. This contrasts to existing high definition (HD) video that has 1,920 X 1,080 pixels and standard definition video (SD) that has 720 X 480 pixels. These are not precise standards—for example there is some SD video that is broadcast at 540 pixels. There is also an existing standard for some video cameras that record at 4,096 X 2,160 pixels which is also considered 4K.
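Multiplying out those resolutions shows why 4K is such a jump in raw pixel data:

```python
# Raw pixel counts for the resolutions defined above.
sd  = 720  * 480     # standard definition
hd  = 1920 * 1080    # high definition
uhd = 3840 * 2160    # 4K / Ultra High Definition

print(sd, hd, uhd)   # 345600 2073600 8294400
print(uhd // hd)     # 4 -> 4K carries four times the pixels of HD
print(hd // sd)      # 6 -> HD carries six times the pixels of SD
```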

The 4K standard was developed in an attempt to deliver digital media to movie theaters, which would save a lot of money compared to shipping around reels of film. Standard HD does not project well onto the big screens, and 4K overcomes a lot of these shortfalls. But high-action movies require more definition than 4K provides and will need the upcoming 8K video standard to be digitally transmitted for use on the largest screens.

Interestingly, there is not a huge increase in quality in shifting home viewing from HD to 4K. There is a huge improvement between SD and HD, but the incremental improvements between HD and 4K are much harder to discern. The improvements are due more to the number of different colors being projected, because the human eye cannot really see the pixel differences on relatively small computer or home TV screens. It’s easy to get fooled about the quality of 4K by some of the spectacular demo videos of the technology on the web. But these demos are far different than what run-of-the-mill 4K will look like, and if you think back, there were equally impressive demos of HD video years ago.

The major difference between HD and 4K for the broadband industry is the size of the data stream needed to transmit all of the pixel data. Current 4K transmissions online require a data path between 18 Mbps and 22 Mbps. This is just below the FCC’s definition of broadband and according to the FCC’s numbers, only around 20% of homes currently have enough broadband speed to watch 4K video. Google just recently announced that they have developed some coding schemes that might reduce the required size of a 4K transmission by 40% to 50%, but even with that reduction 4K video is going to put a lot of strain on ISPs and broadband networks, particularly if homes want to watch more than one 4K video at a time.
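Applying the claimed 40% to 50% reduction to the 18-22 Mbps range quoted above gives a sense of what that coding improvement would mean for ISPs:

```python
# What a 40-50% compression improvement would do to 4K stream sizes,
# using the 18-22 Mbps range quoted in the text.

def reduced_range(mbps):
    """Return (best, worst) stream size after a 50% and 40% reduction."""
    return (mbps * 0.50, mbps * 0.60)

for mbps in (18, 22):
    low, high = reduced_range(mbps)
    print(f"{mbps} Mbps -> {low:.1f} to {high:.1f} Mbps")
# 18 Mbps -> 9.0 to 10.8 Mbps
# 22 Mbps -> 11.0 to 13.2 Mbps
```

Even at roughly 9-13 Mbps, a home watching two or three simultaneous 4K streams would still strain many of today’s broadband connections.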

I recently read that 15% of the TVs sold in 2015 were capable of 4K, and that percentage is growing rapidly. Lagging behind, however, are 4K-capable settop boxes; anybody who wants to get 4K from their cable provider will need a new box. Most of the large cable providers now offer these boxes, though often at the cost of another monthly fee.

Interestingly, there is a lot of 4K video content on the web, much of it filmed by amateurs and available on sites like YouTube or Vimeo. But there is a quickly increasing array of for-pay content. For instance, most of the Netflix original content is available in 4K. Amazon Prime also has Breaking Bad and other original content in 4K. It’s been reported that the next soccer World Cup will be filmed in 4K. There are a number of movies now being shot in 4K as well as a library of existing IMAX films which fit well into this format. Samsung has even lined up a few movies and series in 4K which are only available to people with Samsung 4K TVs.

One thing is for sure, it looks like 4K is here to stay. More and more content is being recorded in the format and one has to imagine that over the next few years 4K is going to become as common as HD video is today. And along with the growth of 4K demand will come demand for better bandwidth.

Comcast Trying Data Caps Again

Yet again Comcast is trying to introduce data caps. They have introduced what they are calling ‘data usage trials’ in Miami and the Florida Keys. For some reason most of their past trials have also been in the southeast. The new plan gives customers a monthly data cap of 300 gigabytes of downloaded data. After you hit that cap, every additional 50 gigabytes costs $10. For $30 extra you can get unlimited downloads.

When Comcast tried caps a few years ago they used a monthly cap of 250 gigabytes. Since the average household has been doubling the amount of data it uses every three years, the new cap is actually stingier than the old 250 GB cap: households have roughly doubled their usage since the last time Comcast tried this, which means the 300 GB cap is going to affect a lot more people than the old cap did.
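The overage math in the trial plan is simple enough to sketch. This is a minimal model of the pricing described above (300 GB cap, $10 per extra 50 GB block, $30 for unlimited); the function name and the rounding of partial blocks up to a full $10 charge are my assumptions, not Comcast's published billing rules.

```python
import math

def trial_overage(usage_gb, cap_gb=300, block_gb=50, block_price=10,
                  unlimited_price=30):
    """Overage charge under the trial plan, and whether the flat
    unlimited add-on would have been the cheaper choice."""
    over_gb = max(0, usage_gb - cap_gb)
    # Assume partial blocks are billed as full blocks (my assumption).
    overage = math.ceil(over_gb / block_gb) * block_price
    cheaper = "unlimited" if overage > unlimited_price else "capped"
    return overage, cheaper

print(trial_overage(500))  # (40, 'unlimited')
```

The break-even point falls at 450 GB: anything above that and the $30 unlimited add-on is the cheaper option, which is presumably the outcome Comcast expects for heavy users.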

What is probably most annoying about this is that Comcast is this time refusing to call these data caps. Instead they are calling this a ‘data usage trial’ and are trying hard to compare themselves to the plans sold by the cell phone companies. Of course, everybody in the world understands those cellular plans to be data caps.

It’s not hard to understand why Comcast wants to do this. While broadband subscriptions continue to grow, with the overall US market at an 83% broadband penetration there is not a lot of future upside in broadband sales. Further, I know that Comcast is eyeing the financial performance of the cellphone companies with envy since they can see the significant revenues generated by AT&T and Verizon with their data caps.

But Comcast also must understand that customers are absolutely going to hate these caps. Households are watching online video more and more and it is that usage that is driving the vast majority of downloads. There are other households that have big usage due to gaming, and some households that still engage in file-sharing, even though that is often illegal and riskier than it used to be.

The last time Comcast did this they saw a massive customer revolt and I certainly expect that to happen again. Take my case. I estimate that we probably use at least 500 GB per month. So for me this basically means a $30 increase in my data rate. They have already pushed me to the edge of tolerance by forcing me to buy a basic TV package that I don’t use in order to get a 50 Mbps cable modem. If they introduce this cap they would push me over $100 per month just to get a broadband connection. At that point I start taking a very serious look at CenturyLink, the other provider in my neighborhood.

The biggest problem with any data caps is that, no matter where the cap is set, over time more and more customers are going to climb over it. We are just now starting to see the first proliferation of 4K video, and at download requirements of 18–22 Mbps this will blow past the data cap in no time.
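To see just how fast 4K eats through the cap, here is a small sketch of the hours of continuous 4K viewing it takes to hit 300 GB, using the 20 Mbps midpoint of the bitrate range mentioned earlier (the specific viewing pattern is illustrative):

```python
def hours_to_cap(cap_gb=300, bitrate_mbps=20):
    """Hours of continuous streaming before a data cap is reached."""
    megabits = cap_gb * 1000 * 8   # GB -> MB -> megabits
    return megabits / bitrate_mbps / 3600

print(round(hours_to_cap(), 1))  # 33.3
```

At 20 Mbps, a household exhausts the entire 300 GB allowance in about 33 hours of 4K viewing, barely over an hour a day, before counting any other use of the connection.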

What is most ridiculous about data caps either for cellular or landline data is that the US already has the most expensive Internet access of all of the developed countries. ISPs are already reaming us with ridiculously expensive broadband access and are now scheming for ways to make us pay more. The margins on US broadband are astronomical, in the 90% plus profit margin range. So data caps at a company like Comcast are purely greed driven, nothing else. There are zero network or performance issues that could justify penalizing customers who actually use the data they are paying for.

I am not entirely against data caps. For example, I have one client that has a 1 terabyte cap on their basic data product and 2 terabytes on their fastest product. They don’t use these caps to jack up customer prices, but instead use them as an opportunity to discuss usage with customers. For instance, they might convince somebody who is constantly over the 1 terabyte cap to upgrade to a product with a higher one. But mostly they use these caps as a way to force themselves to monitor customers. That monitoring found a few customers who went over the cap because they were operating some kind of commercial retail server out of their home. The client’s terms of service prohibit operating a business service over a residential product, so they upgraded such customers to a business product, which has no data cap.

If you want to get really annoyed, look at this Comcast blog which explains the new ‘data usage trials.’ It is frankly one of the worst cases of corporate doublespeak that I have read in a long time. You have to feel a bit sorry for the corporate communications people who had to write this drivel, but the ones to hate are their corporate bosses who are determined to make us all pay more for using data.


US and Europe at Odds over Privacy

A few weeks ago I wrote about the various battles currently raging that are going to determine the nature of the future Internet. None of these battles is larger than the one between spying and surveillance on one side, and the citizens and countries that want protection from being spied upon on the other.

Recently, we’ve seen this battle manifest in several ways. First, countries like Russia and Thailand are headed down a path to create their own fire-walled Internet. Like the Chinese Great Firewall, these networks aim to retain control of all data originating within a country.

But even where the solution is not this dramatic we see the same battle. For instance, Facebook is currently embroiled in this fight in Europe. Facebook might have been singled out in this fight because they already have a bad reputation with European regulators. That reputation is probably deserved since Facebook makes most of their money from their use of customer data.

But this fight is different. The Advocate-General of the European Court of Justice (their equivalent of the Supreme Court) just ruled against Facebook in a ruling that could affect every US Internet company doing business in Europe. The ruling has to do with the ‘safe harbor’ arrangement that has been used as the basis for transferring European customer data back to US servers. The safe harbor rules come from trade rules negotiated between the US and the European Union in 2000. These rules explicitly allow what Facebook (and almost everybody else) is doing with customer data.

The Advocate-General has ruled that the EU was incorrect in negotiating the safe harbor rules. He says that they contradict some of the fundamental laws of the EU including the Charter of Fundamental Rights, the equivalent to our Constitution. He says the safe harbor rules violate the basic rights of citizens to privacy. He explicitly says that this is due to NSA spying, and that by letting Facebook and others take European data out of the country they are making it available to the NSA.

This ruling is not yet set in stone, since the Court of Justice still has to accept or reject the recommendations from the Advocate-General. However, the Court accepts these recommendations most of the time. If this is upheld it is going to create a huge dilemma for the US. Either the NSA will have to back off from looking at data from US companies, or else US companies won’t be able to bring that data out of Europe.

For companies like Facebook this could be fatal. There are some commercial web services that could be hosted in Europe to operate for Europeans. But social media like Facebook operate by sharing their data with everybody. It would be extremely odd if an American couldn’t friend somebody from Europe on Facebook, or couldn’t post pictures of their vacation while they were still in Europe. And this might put a real hitch in the European business of American companies like Google and Amazon.

Such a final ruling would send US and EU negotiators back to the table, but in new negotiations safe harbor rules would no longer be an option. This ruling could bring about a fundamental change in the worldwide web. And this comes at a time when Facebook, of all companies, is talking about bringing the rest of the human race onto the web. But perhaps, as a consequence of NSA spying and similar surveillance elsewhere, each country or region might end up with a local web, and the worldwide web will be a thing of the past.