Gaming Migrates to the Cloud

We are about to see a new surge in demand for broadband as major players in the game industry have decided to move gaming to the cloud. At the recent Game Developers Conference in San Francisco, both Google and Microsoft announced major new cloud-based gaming initiatives.

Google announced Stadia, a platform that it touts as being able to play games from anywhere with a broadband connection, on any device. During the announcement Google showed a live game stream being handed off from desktop to laptop to cellphone. Microsoft announced its new xCloud platform that lets Xbox gamers play a game from any connected device. Sony has been promoting online play between gamers for many years and now also offers some cloud gaming on the PlayStation Now platform.

OnLive tried this in 2011, offering a platform that ran games in the cloud using OnLive controllers, without needing a gaming computer. The company failed due to the quality of broadband connections in 2011, but also due to limitations at its gaming data centers. Both Google and Microsoft now operate regional data centers around the country that house state-of-the-art whitebox routers and switches capable of handling large volumes of simultaneous gaming sessions. As those companies moved large commercial users to the cloud, they created the capability to also handle gaming.

The gaming world was ripe for this innovation. Current gaming ties gamers to gaming consoles or expensive gaming computers. Cloud gaming brings mobility to gamers and also eliminates the need to buy expensive gaming consoles. This move to the cloud probably signals the beginning of the end for the Xbox, PlayStation, and Nintendo consoles.

Google says it will support some games at the equivalent of an HD video stream, at 1080p and 60 frames per second. That equates to about 3 GB of download per hour. But most of the Google platform is going to operate at 4K video speeds, requiring download speeds of at least 25 Mbps per gaming stream and using 7.2 GB of data per hour. Nvidia has been telling gamers that they need 50 Mbps per 4K gaming connection.
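The relationship between a sustained stream rate and hourly data volume is simple arithmetic. A quick sketch (the specific bit rates are illustrative; note that the 7.2 GB/hour figure implies an average rate of about 16 Mbps, lower than the 25 Mbps peak):

```python
def gb_per_hour(mbps: float) -> float:
    """Convert a sustained stream rate in megabits/second to gigabytes/hour.

    3600 seconds/hour, 8 bits/byte, 1000 MB/GB (decimal units, as ISPs meter).
    """
    return mbps * 3600 / 8 / 1000

# A ~6.7 Mbps average 1080p60 stream works out to about 3 GB/hour:
print(round(gb_per_hour(6.7), 1))   # 3.0
# A 16 Mbps average stream consumes 7.2 GB/hour:
print(gb_per_hour(16))              # 7.2
# A stream sustained at the full 25 Mbps would use more:
print(gb_per_hour(25))              # 11.25
```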

This shift has huge implications for broadband networks. First, streaming causes the most stress on local broadband networks since the usage is continuous over long periods of time. A lot of ISP networks are going to start showing data bottlenecks when significant numbers of additional users stream 4K connections for hours on end. Until ISPs react to this shift, we might return to those times when broadband networks bogged down in prime time.

This is also going to increase the need for download and upload speeds. Households won’t be happy with a connection that can’t stream 4K, so they aren’t going to be satisfied with a 25 Mbps connection that the FCC says is broadband. I have a friend with two teenage sons who both run two simultaneous game streams while watching a streaming gaming TV site. It’s good that he is able to buy a gigabit connection on Verizon FiOS, because his sons alone are using a continuous broadband connection of at least 110 Mbps, and probably more.

We are also going to see more people looking at the latency on networks. The conventional wisdom is that the gamer with the fastest, lowest-latency connection has an edge. Gamers value fiber over cable modems and cable modems over DSL.

This also is going to bring new discussion to the topic of data caps. Gaming industry statistics say that the average serious gamer averages 16 hours per week of gaming. Obviously, many play longer than the average. My friend with the two teenagers is probably looking at at least 30 GB per hour of broadband download usage plus a decent chunk of upload usage. Luckily for my friend, Verizon FiOS has no data cap. Many other big ISPs like Comcast start charging for data usage over one terabyte per month – a number that won’t be hard to reach for a household with gamers.

I think this also opens up the possibility for ISPs to sell gamer-only connections. These connections could be routed straight to peering arrangements with Google or Microsoft to guarantee the fastest connection through the network and wouldn’t mix gaming streams with other household broadband streams. Many gamers will pay extra to have a speed edge.

This is just another example of how the world finds ways to use broadband when it’s available. We’ve obviously reached a time when online gaming can be supported. When OnLive tried this, there were not enough households with fast enough connections, there weren’t fast enough regional data centers, and there wasn’t a peering network in place where ISPs connect directly to big data companies like Google and bypass the open Internet.

The gaming industry is going to keep demanding faster broadband and I doubt they’ll be satisfied until we have a holodeck in every gamer’s home. But numerous other industries are finding ways to use our increasing household broadband capacity and the overall demand keeps growing at a torrid pace.

 

Another Rural Wireless Provider?

T-Mobile announced the start of a trial of a fixed wireless broadband product using LTE, marketed as “T-Mobile Home Internet”. The company will offer the product by invitation only to some existing T-Mobile cellular customers in “rural and underserved areas” and says it might connect as many as 50,000 customers this year. The product is being marketed as 50 Mbps broadband, with a monthly price of $50 and no data cap, although the company warns that speeds may be curtailed during times of network congestion.

The company further says that its ultimate goal is to offer speeds of up to 100 Mbps, but only if it is allowed to merge with Sprint and gain access to Sprint’s huge inventory of mid-band spectrum. T-Mobile says the combination of the two companies would enable it to cover as many as 9.5 million homes with 100 Mbps broadband in about half of US zip codes.

There are positive aspects to the planned deployment, but also a number of issues that make me skeptical. One positive aspect is that some of the spectrum used for LTE passes through trees better than the spectrum used for the fixed wireless technology being widely deployed in the open plains and prairies of the Midwest and West. This opens up the possibility of bringing some wireless broadband to places like Appalachia – with the caveat that heavy woods are still going to slow down data speeds. It’s worth noting that this is still a line-of-sight technology and fixed LTE will be blocked by hills or other physical impediments.

The other positive aspect of the announced product is the price and lack of a data cap. Contrast this to the AT&T fixed LTE product that has a price as high as $70 along with a stingy 160 GB monthly cap, and with overage charges that can bring the AT&T price up to $200 per month.

I am skeptical of a number of the claims made or implied by the announcement. The primary concern is download speeds. Fixed LTE will be the same as any other fixed wireless product and speeds will decrease with the distance of a customer from the serving tower. In rural America distances can mount up quickly. LTE broadband is similar to rural cellular voice and works best where customers can get 4 or 5 bars. Anybody living in rural America understands that there are a lot more places with 1 or 2 bars of signal strength than of 4 or 5 bars.

The 50 Mbps advertised speed is clearly an ‘up-to’ speed, and in rural America it’s doubtful that anybody other than those who live under a tower could actually get that much speed. This is one of the few times I’ve seen AT&T advertise truthfully – they market their LTE product as delivering at least 10 Mbps. I’ve read numerous online reviews of the AT&T product and the typical speeds reported by customers range between 10 Mbps and 25 Mbps, with only a few lucky customers claiming speeds faster than that.

The online reviews of the AT&T LTE product also indicate that signal strength is heavily influenced by rain and can completely disappear during a downpour. Perhaps even more concerning are reports that in some cases speeds remain slow after a rain due to wet leaves on trees that must be scattering the signal.

Another concern is that T-Mobile is touting this as a solution for underserved rural America.  T-Mobile has far less presence in rural America than AT&T and Verizon and is on fewer rural cellular towers. This is evidenced by their claim that even after a merger with Sprint they’d only be seeing 9.5 million passings – that’s really small coverage for a nationwide cellular network. I’m a bit skeptical that T-Mobile will invest in connecting to more rural towers just to offer this product – the cost of backhaul to rural towers often makes for a lousy business case.

The claim also says that the product will have some aspects of both 4G and 5G. I’ve talked to several wireless engineers who have told me that they can’t see any particular advantage for 5G over 4G when deploying as fixed wireless. A carrier already opens up the available data path fully with 4G to reach a customer and 5G can’t make the spectrum perform any better. I’d love to hear from anybody who can tell me how 5G would enhance this particular application. This might be a case where the 5G term is tossed in for the benefit of politicians and marketing.

Finally, this is clearly a ploy to keep pushing for the merger with Sprint. The claim of the combined companies being able to offer 100 Mbps rural broadband has even more holes than the arguments for achieving 50 Mbps. However, Sprint does have a larger rural presence on rural towers today than T-Mobile, although I think the Sprint towers are already counted in the 9.5 million passings claim.

But putting aside all my skepticism, it would be great if T-Mobile can bring broadband to rural customers who otherwise wouldn’t have it. Even if the company doesn’t achieve the full 50 Mbps claim, many rural homes would be thrilled to get speeds at half that level. A wireless product with no data cap would also be welcome. The timing of the announcement is clearly aimed at promoting the merger process with Sprint, and I hope the company’s deployment plans don’t evaporate if the merger doesn’t happen.

Broadband Usage Continues to Grow

The firm OpenVault, a provider of software that measures data consumption for ISPs, reported that average monthly data use by households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. The company also reported that median use per household grew from 103.6 gigabytes in 2017 to 145.2 gigabytes in 2018 – a growth rate of 40%. The median represents the midpoint of users, with half of all households above and half below it.

To some degree, these statistics are not news because we’ve known for a long time that broadband usage at homes, both in total download volume and in desired speeds, has been doubling every three years since the early 1980s. The growth in 2018 is actually a little faster than that historical average, and if the 2018 growth rate were sustained, in three years usage would grow to 235% of today’s level. What I find most impressive about these new statistics is the magnitude of the annual change – the average home used 67 more gigabytes of data per month in 2018 than the year before – a number that would have seemed unbelievable only a decade ago when the average household used a total of only 25 gigabytes per month.
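The doubling arithmetic is easy to verify; a quick sketch using the OpenVault averages quoted above:

```python
# OpenVault average household usage, gigabytes/month
avg_2017, avg_2018 = 201.6, 268.7
growth = avg_2018 / avg_2017 - 1
print(f"annual growth: {growth:.0%}")          # 33%

# Compounding 33%/year for three years grows usage to ~235% of today:
print(f"three-year factor: {1.33 ** 3:.2f}")   # 2.35

# A steady doubling every three years implies a somewhat lower annual rate:
print(f"doubling-every-3-years rate: {2 ** (1/3) - 1:.0%}")  # 26%
```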

There are still many in the industry who are surprised by these numbers. I’ve heard people claim that now that homes are watching all the video they want, the rate of growth is bound to slow down – but if anything, the rate of growth seems to be accelerating. We also know that cellular data consumption is now doubling every two years.

This kind of growth has huge implications for the industry. From a network perspective, this kind of bandwidth usage puts a big strain on networks. Typically the most strained part of a network is the backbones that connect to neighborhood nodes. That’s the primary stress point in many networks, including FTTH networks, and when there isn’t enough bandwidth to a neighborhood then everybody’s bandwidth suffers. Somebody that designed a network ten years ago would never have believed the numbers that OpenVault is reporting and would likely not have designed a network that would still be sufficient today.

One consequence of the bandwidth growth is that it’s got to be driving homes to change to faster service providers when they have the option. A household that might have been happy with a 5 Mbps or 10 Mbps connection a few years ago is likely no longer happy with it. This has to be one of the reasons we are seeing millions of homes upgrade from DSL to cable modem each year in metropolitan areas. The kind of usage growth we are seeing today has to be accelerating the death of DSL.

This growth also should be affecting policy. The FCC set the definition of broadband at 25/3 Mbps in January 2015. If that was a good definition in 2015, then the definition should have been increased to 63 Mbps in 2019 to keep pace with the doubling. At the time the FCC set that threshold I thought it was a little generous. In 2014, as the FCC was having this debate, the average home downloaded around 100 gigabytes per month and the right definition of broadband was probably more realistically 15 – 20 Mbps, so the FCC was being a little forward-looking. Even so, the definition should be increased – if the right definition of broadband in 2014 was 20 Mbps, then today it ought to be 50 Mbps.

The current FCC is ignoring these statistics for policy purposes – if they raise the definition of broadband then huge numbers of homes will be classified as not having broadband. The FCC does not want to do that since they are required by Congressional edict to make sure that all homes have broadband. When the FCC set a realistic definition of broadband in 2015 they created a dilemma for themselves. That 2015 definition is already obsolete, and if they don’t change it, in a few years it is going to be absurd. One only has to look forward three years from now, when the definition of broadband ought to be 100 Mbps.

These statistics also remind us of the stupidity of handing out federal subsidies to build technologies that deliver less than 100 Mbps. We still have two more years of CAF II construction to upgrade speeds to an anemic 10 Mbps. We are still handing out new subsidies to build networks that can deliver 25/3 Mbps – networks that are obsolete before they are completed.

Network designers will tell you that they try to design networks to satisfy demands at least seven years into the future (which is the average life of many kinds of fiber electronics). If broadband usage keeps doubling every three years, then looking forward seven years to 2026, the average home is going to download 1.7 terabytes per month and will expect download speeds of 318 Mbps. I wonder how many network planners are using that target?
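The seven-year projection follows from the same doubling-every-three-years assumption. A sketch of the compounding (the 2019 starting points of ~337 GB/month and 63 Mbps are my assumptions, chosen to reproduce the figures above):

```python
def project(value: float, years: float, doubling_years: float = 3) -> float:
    """Project a value forward assuming it doubles every `doubling_years`."""
    return value * 2 ** (years / doubling_years)

# Over 7 years the doubling assumption compounds to roughly a fivefold increase:
print(round(2 ** (7 / 3), 2))            # 5.04

# From an assumed ~337 GB/month average and 63 Mbps speed demand in 2019:
print(round(project(337, 7) / 1000, 1))  # 1.7 (terabytes/month in 2026)
print(round(project(63, 7)))             # 318 (Mbps in 2026)
```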

The final implications of this growth are for data caps. Two years ago when Comcast set a terabyte monthly data cap they said that it affected only a few homes – and I’m sure they were right at the time. However, the OpenVault statistics show that 4.12% of homes used a terabyte per month in 2018, almost double from 2.11% in 2017. We’ve now reached that point when the terabyte data cap is going to have teeth, and over the next few years a lot of homes are going to pass that threshold and have to pay a lot more for their broadband. While much of the industry has a hard time believing the growth statistics, I think Comcast knew exactly what they were doing when they established the terabyte cap that seemed so high just a few years ago.

The Reality of Rural Broadband

I recently saw the results of several rural surveys that probably tell the best story about the state of rural broadband. The two areas being studied are far apart geographically, but they are similar in many ways. The areas are both rural and are not near a metropolitan area. The areas have some modest manufacturing and some modest amount of tourism, but neither in a big way. Both areas include some small towns, and a few of these towns have cable TV. And in both places, the customers in the rural areas have poor broadband choices. These are not small isolated pockets of people, and the two surveys cover nearly 20,000 homes.

If you listen to FCC rhetoric it’s easy to think that rural broadband is improving – but in areas like these you can’t see it. Both areas were supposed to get some upgrades from CAF II – but from what the locals tell me there have been zero improvements so far. The CAF program still has a few years to go, so perhaps there will be some modest improvement in rural DSL.

For now, the broadband situation in these areas is miserable. There are homes with DSL with speeds of a few Mbps at best, with some of the worst speeds hovering at dial-up speeds. One respondent to a survey reported that it took 8 hours to download a copy of Microsoft Office online.

The other broadband choices are also meager. Some people use satellite broadband but complain about the latency and about the small data caps. These areas both have a smattering of fixed wireless broadband – but this is not the modern fixed wireless you see today in the open plains states that delivers 25 Mbps or faster broadband. Both of the areas in the surveys are heavily wooded with hilly terrain, and fixed wireless customers report seeing speeds of 1-2 Mbps. There are a number of homes using their cell phones in lieu of home broadband – an expensive alternative if there are school kids or if any video is watched. There were customers who reported using public hotspots in nearby small towns. And there were a number of households, including many with school kids, who have given up and have no broadband – because nothing they’ve tried has worked.

As would be expected in rural areas, slow speeds are not the only problem. Even homes that report data speeds that should support streaming video complain that streaming doesn’t work. This indicates networks with problems and it’s likely the networks have high latency, are full of jitter, or are over-subscribed and have a lot of packet loss. People don’t really judge the quality of their broadband connection by the speed they get on a speed test, but instead by the ability to do normally expected activities on the Internet.

Many of these homes can’t do things that the rest of us take for granted. Many report the inability to stream video – even a single stream. This is perhaps the biggest fallacy in the way the FCC measures broadband, because the FCC assumes that a house getting a speed like 5 Mbps ought to be able to do most needed tasks. In real life the quality of many rural connections is so poor that they won’t stream video. Many people in these areas also complained that their Internet often froze and they had to constantly reboot – something that can kill large downloads or kill online sessions for school or work.

One of the biggest complaints in these areas was that their network only supported one device at a time, meaning that members of the family have to take turns using the Internet. I picture a family with a few school kids and can see how miserable that must be.

The surveys produced a long list of other ways that poor broadband hurts households. Number one was the inability to work at home. Many people said they could work at home more often if they had broadband. A few respondents want to start home businesses but are unable to because of the poor broadband. Another common complaint was the inability for kids to do schoolwork, or for adults to pursue college degrees online.

The problems many people reported were even more fundamental than these issues. For instance, there were households saying that they could not maintain a good enough connection to bank online or pay their bills online. There were respondents who say they can’t shop online. Many households complained that they couldn’t offload cellular data at home to WiFi, driving up their cellular bills. A number of homes would like to cut the cord to save money but can’t stream Netflix as an alternative to cable.

When you look at the raw data behind these kinds of surveys you quickly see the real issues with lack of broadband. In today’s society, not having home broadband literally takes a home out of the mainstream of society. It’s one thing to look at the national statistics and be told that the number of homes without broadband is shrinking. But it’s an entirely different story when you see what that means for the millions of homes that still don’t have adequate broadband. My guess is that some of the areas covered by these surveys show as underserved on the FCC maps – when in fact, their broadband is so poor that they are clearly unserved, ignored and forgotten.

The Zero-rating Strategy

The cable companies are increasingly likely to take a page from the cellular carriers by offering zero-rating for video. That’s the practice of providing video content that doesn’t count against monthly data caps.

Zero-rating has been around for a while. T-Mobile first started using zero-rating in 2014 when it introduced its ‘Music Freedom’ plan that provided streaming music that didn’t count against cellular data caps. This highlights how fast broadband needs have grown in a short time – when data caps were at 1 GB per month, even music streaming mattered.

T-Mobile then expanded the zero-rating in November 2015 to include access to several popular video services like Netflix and Hulu. Verizon quickly followed with the first ‘for-pay’ zero-rating product, called FreeBee Data, that let customers (or content providers) pay to zero-rate video traffic. The FreeBee plan was prominent in the net neutrality discussions since it’s a textbook example of Internet fast lanes using sponsored data, where some video traffic is given preferential treatment over other data.

A few of the largest cable companies have also introduced a form of zero-rating. Comcast started offering what it called Stream TV in late 2015. This service allowed customers to view video content that doesn’t count against the monthly data cap. This was a pretty big deal at the time because Comcast was then in the process of implementing a 300 GB monthly data cap, and video can easily push households over that small cap. There was huge consumer pushback against the paltry data caps and Comcast quickly reset the cap to 1 terabyte. But the Stream TV plan is still in effect today.

What’s interesting about the Comcast plan is that the company had agreed to not use zero-rating as part of the terms of its merger with NBC Universal in 2011. The company claims that the Stream TV plan is not zero-rating since it uses cable TV bandwidth instead of data bandwidth – but anybody who understands a cable hybrid fiber-coaxial network knows that this argument is sleight-of-hand, since all data uses some portion of the Comcast data connection to customers. The prior FCC started to look into the issue, but it was dropped by the current FCC as they decided to eliminate net neutrality.

The big cable companies have to be concerned about the pending competition with last-mile 5G. Verizon will begin a slow roll-out of its new 5G technology in October in four markets, and T-Mobile has announced plans to begin offering it next year. Verizon has already announced that they will not have any data caps and T-Mobile is also unlikely to have them.

The pressure will be on the cable companies to not charge for exceeding data caps in competitive markets. Cable companies could do this by eliminating data caps or else by pushing more video through zero-rating plans. In the case of Comcast, they won’t want to eliminate the data caps for markets that are not competitive. They view data caps as a potential source of revenue. The company OpenVault says that 2.5% of homes currently exceed 1 TB in monthly data usage, up from 1.5% in 2017 – and within a few years this could be a lucrative source of extra revenue.

Comcast and the other big cable companies are under tremendous pressure to maintain earnings and they are not likely to give up on data caps as a revenue source. They are also likely to pursue sponsored video plans where the video services pay them to provide video outside of data caps.

Zero-rating is the one net neutrality practice that many customers like. Even should net neutrality be imposed again – through something like the California legislation or by a future FCC – it will be interesting to see how firmly regulators are willing to clamp down on a practice that the public likes.

Metering Broadband

A lot of the controversy about Comcast data caps disappeared last year when the company raised the monthly threshold from 300 gigabytes to 1 terabyte. But lately I’ve been seeing folks complaining about being charged for exceeding the 1 TB data cap – so Comcast is still enforcing its data cap rules.

In order to enforce a data cap an ISP has to somehow meter usage, and it appears that in a lot of cases ISPs do a lousy job of it. Not all ISPs have data caps. The biggest ISPs that have them include Comcast, AT&T, CenturyLink for DSL, Cox and Mediacom. But even these ISPs don’t enforce data caps everywhere – for example, Comcast doesn’t enforce them where it competes directly against Verizon FiOS.

Many customer home routers can measure usage, and there are reports of cases where Comcast’s usage measurements differ massively from what is seen at the home. For example, customers have seen big spikes in measured data at times when their routers were disconnected or when power to the home was out. Many customers claim the Comcast readings always greatly exceed what their home routers show.

Data caps matter because customers that exceed them get charged a fee. Comcast charges $10 for each 50 GB of monthly usage over the cap. Mediacom charges the same fees but has much smaller data caps, such as a 150 GB monthly cap on customers with a 60 Mbps product.
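The overage arithmetic works out like this; a sketch using the terms described above (billing any partial 50 GB block as a whole block is my assumption about how the rounding works):

```python
import math

def overage_fee(usage_gb: float, cap_gb: float = 1000,
                block_gb: float = 50, fee_per_block: int = 10) -> int:
    """Fee in dollars for usage beyond the cap, billed in whole 50 GB blocks."""
    excess = max(0, usage_gb - cap_gb)
    return math.ceil(excess / block_gb) * fee_per_block

print(overage_fee(900))              # $0  - under the 1 TB cap
print(overage_fee(1300))             # $60 - 300 GB over = six 50 GB blocks
# A Mediacom-style 150 GB cap turns modest usage into real money:
print(overage_fee(400, cap_gb=150))  # $50
```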

It’s not hard to imagine homes now exceeding the Comcast data cap limit. Before I left Comcast a year ago they said that my family of three was using 600 – 700 GB per month. Since I didn’t measure my own usage I have no idea if their numbers were inflated. If their measurements were accurate it’s not hard to imagine somebody with several kids at home exceeding the 1 TB. The ISPs claim that only a small percentage of customers hit the data cap limits – but in a world where data usage keeps growing exponentially, more homes will hit the limit as time goes by.

What I find interesting is that there is zero regulation of the ISP data ‘meters’. Every other kind of meter used to bill customers is regulated. Utilities selling water, electricity or natural gas must use meters that are certified to be accurate. Meters on gas pumps are checked regularly for accuracy.

But nobody is monitoring the ISPs and the way they measure data usage. The FCC effectively washed its hands of regulating broadband when it killed Title II regulation. Theoretically the Federal Trade Commission could tackle the issue, but it is not required to do so – the FTC regulates interactions with customers in all industries and can select the cases it wants to pursue.

There are a few obvious reasons why the readings from an ISP would differ from a home, even under ideal conditions. ISPs measure usage at their network hub while a customer measurement happens at the home. There are always packets lost in the network due to interference or noise, particularly on older copper and coaxial networks. The ISP would be counting all data passing through the hub as usage even though many of those packets never make it to customers. But when you read some of the horror stories where homes that don’t watch video see daily readings from Comcast of over 100 GB in usage, you know that there is something wrong in the way that Comcast is measuring usage. It has to be a daunting task to measure the usage directed to thousands of users simultaneously, and obviously Comcast has problems in its measurement algorithms.

I’ve written about data caps before. It’s obvious that the caps are just a way for ISPs to charge more money, and it’s a gigantic amount of extra revenue if Comcast can bill $10 per month extra to only a few percent of its 23 million customers. Anybody that understands the math behind the cost of broadband understands that a $10 charge for an extra 50 GB is almost 100% profit – the incremental cost of delivering 50 GB beyond the first terabyte is tiny. And there certainly is no cost at all if the Comcast meters are billing for phantom usage.

I don’t know that there is any fix for this. However, it’s clear that customers being charged for exceeding data caps will switch to a new ISP at the first opportunity. The big ISPs wonder why many of their customers loathe them, and this is just one more way for a big ISP to antagonize its customers. It’s why every ISP that builds a fiber network to compete against a big cable company understands that it will almost automatically get 30% of the market due to customers who have come to hate their cable ISP.

Is the FCC Disguising the Rural Broadband Problem?

Buried within the FCC’s February Broadband Deployment Report are some tables that imply that over 95% of American homes can now get broadband at speeds of at least 25/3 Mbps. That is drastically higher than the report just a year earlier. The big change in the report is that the FCC is now counting fixed wireless and satellite broadband when compiling the numbers. This leads me to ask if the FCC is purposefully disguising the miserable condition of rural broadband?

I want to start with some examples from this FCC map that derives from the data supporting the FCC’s annual report. I started with some counties in Minnesota that I’m familiar with. The FCC database and map claims that Chippewa, Lyon, Mille Lacs and Pope Counties in Minnesota all have 100% coverage of 25/3 broadband. They also claim that Yellow Medicine County has 99.59% coverage of 25/3 Mbps broadband and the folks there must be wondering who is in that tiny percentage without broadband.

The facts on the ground tell a different story. In real life, the areas of these counties served by the incumbent telcos CenturyLink and Frontier have little or no broadband outside of towns. Within a short distance from each town and throughout the rural areas of the county there is no good broadband to speak of – certainly not anything that approaches 25/3 Mbps. I’d love to hear from others who look at this map to see if it tells the truth about where you live.

Let me start with the FCC’s decision to include satellite broadband in the numbers. When you go to the rural areas in these counties, practically nobody buys satellite broadband. Many tried it years ago and found using it to be a miserable experience. There are a few satellite plans that offer speeds as fast as 25/3 Mbps, but satellite broadband today has terrible latency, as high as 900 milliseconds. Anything over 100 milliseconds makes it hard or impossible to do any real-time computing. That means that on satellite broadband you can’t stream video. You can’t have a Skype call. You can’t connect to a corporate WAN to work from home or connect to online classes. You will have problems staying on many web shopping sites. You can’t even make a VoIP call.

Satellite broadband also has stingy data caps that make it impossible to use as a home broadband connection. Most of the plans come with a monthly data cap of 10 GB to 20 GB, and unlike cellular plans, where you can buy additional data, the satellite plans cut you off for the rest of the month when you hit your cap. And even with all of these problems, it’s expensive and is priced higher than landline broadband. Rural customers have voted with their pocketbooks: satellite broadband is not a service that many people are willing to tolerate.

Fixed wireless is a more mixed bag. There are high-quality fixed wireless providers who are delivering speeds as fast as 100 Mbps. But as I’ve written about, most rural fixed broadband delivers speeds far below this and the more typical fixed wireless connection is somewhere between 2 Mbps and 6 Mbps.

There are a number of factors needed to make a quality fixed wireless connection. First, the technology must be only a few years old, because older radios were not capable of reaching the 25/3 speeds. Customers also need a clear line-of-sight back to the transmitter and must be within some reasonable distance from a tower. This means that there is usually a significant number of homes in wireless service areas that can’t get any coverage due to trees or being behind a hill. Finally, and probably most importantly, the wireless provider needs a properly designed network and a solid backhaul data pipe. Many WISPs pack too many customers on a tower and dilute the broadband. Many wireless towers are fed by multi-hop wireless backhaul, meaning the tower doesn’t have enough raw bandwidth to deliver a vigorous customer product.

In the FCC’s defense, most of the data about fixed wireless that feeds the database and map is self-reported by the WISPs. I am personally a big fan of fixed wireless when it’s done right and I was a WISP customer for nine years. But there are a lot of WISPs who exaggerate in their marketing literature and tell customers they sell broadband up to 25/3 Mbps when their actual product might only be a tiny fraction of those speeds. I have no doubt that these WISPs also report those marketing speeds to the FCC, which leads to the errors in the maps.

The FCC should know better. In the counties listed above I would venture to say that practically no households can get a 25/3 fixed wireless connection, though there are undoubtedly a few. I know people in these counties gave up on satellite broadband many years ago. My conclusion from the new data is that this FCC has elected to disguise the facts by claiming that households have broadband when they don’t. This is how the FCC is letting itself off the hook for fixing the rural broadband shortages that exist in most of rural America. We can’t fix a problem that we won’t even officially acknowledge, and this FCC, for some reason, is masking the truth.

Data Caps Again?

My prediction is that we are going to see more stringent data caps in our future. Some of the bigger ISPs have data caps today, but for the most part the caps are not onerous. But I foresee data caps being reintroduced as another way for big ISPs to improve revenues.

You might recall that Comcast tried to introduce a monthly 300 GB data cap in 2015. When customers hit that mark Comcast was going to charge $10 for every additional 50 GB of download, or $30 extra for unlimited downloading.

There was a lot of public outcry about those data caps, and Comcast backed down from the plan due to pressure from the Tom Wheeler FCC. At the time the FCC probably didn’t have the authority to force Comcast to kill the data caps, but the nature of regulation is that big companies don’t go out of their way to antagonize regulators who can cause them trouble in other areas.

To put that Comcast data cap into perspective, in September of 2017 Cisco predicted that home downloading of video would increase 31% per year through 2021. They estimated the average household data download in 2017 was already around 130 GB per month. You might think that means most people wouldn’t be worried about the data caps. But it’s easy to underestimate the impact of compound growth: at a 31% growth rate, the average household download of 130 GB would grow to 383 GB by 2021 – considerably over Comcast’s proposed data cap.
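The compound-growth arithmetic is easy to check. A minimal sketch, assuming Cisco’s 31% annual growth rate is applied each year from 2017 through 2021:

```python
# Compound growth of average monthly household data usage.
# Assumes Cisco's 31% annual growth rate, starting from the
# estimated 130 GB/month average in 2017.
usage_gb = 130.0
for year in range(2017, 2021):  # four years of growth: 2017 -> 2021
    usage_gb *= 1.31

print(round(usage_gb))  # roughly 383 GB/month by 2021
```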

Even now there are a lot of households that would be over that cap. It’s likely that most cord cutters use more than 300 GB per month – and it can be argued that Comcast’s data cap would punish those who drop their video. My daughter is off to college now and our usage has dropped, but we got a report from Comcast when she was a senior that said we used over 600 GB per month.

So what are the data caps for the largest ISPs today?

  • Charter, Altice, Verizon and Frontier have no data caps.
  • Comcast moved their data cap to 1 terabyte, with $10 for each additional 50 GB or $50 monthly for unlimited download.
  • AT&T has some of the stingiest data caps. The cap on DSL is 150 GB, on U-verse it’s 250 GB, on 300 Mbps FTTH it’s 1 TB, and gigabit service is unlimited. They charge $10 per extra 50 GB.
  • CenturyLink has a 1 TB cap on DSL and no cap on fiber.
  • Cox has a 1 TB cap with $30 for an extra 500 GB or $50 unlimited.
  • Cable One has no overage charge but largely forces customers who go over their caps to upgrade to more expensive data plans. Their caps are stingy – the cap on a 15 Mbps connection is 50 GB.
  • Mediacom has perhaps the most expensive data caps – the cap on its 60 Mbps product is 150 GB and on 100 Mbps it’s 1 TB. The charge for violating the cap is $10 per GB or $50 for unlimited.

Other than AT&T, Mediacom and Cable One, none of the other caps sound too restrictive.

Why do I think we’ll see data caps again? All of the ISPs are looking forward just a few years and wondering where they will find the revenues to meet Wall Street’s demand for ever-increasing earnings. The biggest cable companies are still growing broadband customers, mostly by taking customers from DSL. But they understand that the US broadband market is approaching saturation – much as has happened with cellphones. Once every home that wants broadband has it, these companies are in trouble, because bottom-line growth for the last decade has been fueled by the growth of broadband customers and revenues.

A few big ISPs are hoping for new revenues from other sources. For instance, Comcast has already launched a cellular product and is also seeing good success with security and smart home services. But even they will be impacted when broadband sales inevitably stall – other ISPs will feel the pinch before Comcast.

ISPs only have a few ways to make more money once customer growth has stalled, with the primary one being higher rates. We saw some modest increases earlier this year in broadband rates – something that was noticeable because rates have been the same for many years. I fully expect we’ll start seeing sizable annual increases in broadband rates – which go straight to the bottom line for ISPs. The impact from broadband rate increases is major for these companies – Comcast and Charter, for example, make an extra $250 million per year from a $1 increase in broadband rates.
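The math behind that last figure is straightforward. A quick sketch, using an illustrative subscriber count of 21 million (an assumption for the example, not a reported figure – actual counts for Comcast and Charter differ):

```python
# Annual revenue from a $1/month broadband rate increase.
# The subscriber count is an illustrative assumption, not a reported figure.
subscribers = 21_000_000
increase_per_month = 1.00  # dollars per subscriber per month

annual_revenue = subscribers * increase_per_month * 12
print(f"${annual_revenue:,.0f} per year")  # about $252 million per year
```

Because the increase requires no new network investment, essentially all of it falls to the bottom line.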

Imposing stricter data caps can be as good as a rate increase for an ISP. They can justify it by saying that they are charging more only to those who use the network the most. As we see earnings pressure on these companies I can’t see them passing up such an easy way to increase earnings. In most markets the big cable companies are a near monopoly, and consumers who need decent speeds have fewer alternatives with each passing year. Since the FCC has now walked away from broadband regulation, there will be no regulatory hindrance to the return of stricter data caps.

AT&T’s CAF II Data Caps

AT&T recently launched its CAF II cellular data plan in a number of rural areas. This is being funded by the federal program that is giving AT&T $2.5 billion spread over 6 years to bring broadband to about 1.1 million homes. That works out to about $2,300 per home.

Customers are guaranteed speeds of at least 10 Mbps down and 1 Mbps up. The broadband product is priced at $60 per month with a contract or $70 per month with no contract. Installation is $99. The product comes with a WiFi router that also includes 4 Ethernet ports for wired connections.

For a rural household that has never had broadband this is finally going to get them connected to the web like everybody else. But the 10 Mbps speed of the product is already obsolete and in the footnotes to the product AT&T warns that a customer may not be able to watch two HD video streams at the same time.

But the real killer is the data cap which is set at 160 gigabytes per month. Extra data above this limit will cost a household $10 for each 50 gigabytes (or fraction thereof). AT&T has obviously set the data cap this low because that was the cap suggested by the FCC in the CAF II order.

Let me throw out some statistics that shed some light on how puny the 160 GB monthly cap is. Following are some statistics about data usage for common functions in the home:

  • The average desktop or laptop uses about 3 GB per month for basic functions like email, upgrading software, etc.
  • Cisco says that the average smartphone uses about 8 GB per month on WiFi.
  • Web browsing uses about 150 MB per hour.
  • Streaming music uses 1 GB for 24 hours of streaming.
  • Facebook estimates that its average user uses the service for 20 hours per month, which consumes 2.5 GB.
  • Video is the real bandwidth eater. Netflix says that an SD video uses 0.7 GB per hour or 1.4 GB for a movie. They say HD video uses 3 GB per hour or 6 GB per movie.
  • The average online gamer uses at least 5 GB per month, and for some games much more than this.
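These per-hour figures imply sustained connection speeds. A minimal conversion sketch (assuming decimal units, 1 GB = 8,000 megabits – streaming services don’t publish exact bitrates, so these are approximations):

```python
# Convert a streaming rate in GB per hour to the sustained
# connection speed in Mbps that it implies.
def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    # 1 GB = 8,000 megabits; 1 hour = 3,600 seconds
    return gb_per_hour * 8000 / 3600

print(round(gb_per_hour_to_mbps(0.7), 2))  # SD video ~1.56 Mbps
print(round(gb_per_hour_to_mbps(3.0), 2))  # HD video ~6.67 Mbps
```

Note how modest these rates are compared to the 10 Mbps CAF II speed – the constraint on these connections is the cap, not the speed.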

So how does all of this stack up for an average family of three? It might look something like this:

  • 3 computers / laptops – 9 GB
  • 3 smartphones – 24 GB
  • 60 hours of web browsing – 9 GB
  • 3 social networks – 8 GB
  • 60 hours of streaming music – 3 GB
  • 1 gamer – 5 GB
  • Schoolwork – 10 GB
  • Subtotal – 68 GB

This leaves 92 GB for watching video for the month. That will allow the home to watch 15 HD movies a month, or 30 one-hour shows – one TV show per day for the whole household. Any more than that and you’d go over the data cap. The majority of video content on the web is now only available in HD, and much of the content on Netflix and Amazon no longer comes in SD. To make matters worse, these services are now starting to offer 4K video, which is four times more data intensive than HD video.
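The video budget falls out of simple subtraction, using the family-of-three figures above and Netflix’s 3 GB/hour estimate for HD:

```python
# How much HD video fits under a 160 GB cap after the family's
# non-video usage (figures from the worked example above)?
CAP_GB = 160
# computers + phones + browsing + social + music + gaming + schoolwork
non_video_gb = 9 + 24 + 9 + 8 + 3 + 5 + 10   # = 68 GB
HD_GB_PER_HOUR = 3                            # Netflix's HD estimate

remaining_gb = CAP_GB - non_video_gb
print(remaining_gb)                            # 92 GB left for video
print(remaining_gb // (HD_GB_PER_HOUR * 2))    # 15 two-hour HD movies
print(remaining_gb // HD_GB_PER_HOUR)          # 30 one-hour HD shows
```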

Also note that this subtotal doesn’t include other normal functions. Working from home can use a lot of bandwidth. Taking online courses is data intensive. IoT devices like home security cameras can use a lot of bandwidth. And we are starting to see smart home devices add up to a pile of data that goes on behind the scenes without our knowledge.

The fact is that within a few years the average home is going to likely exceed the AT&T data cap without watching any video. The bandwidth used for everything we do on the web keeps increasing over time.

To show how ridiculously low this cap is, compare it to AT&T’s ‘Access’ program, which supplies broadband to low-income homes at the same 10 Mbps for as little as $10 per month. That low-income plan has a 1 terabyte data cap – over six times higher than the CAF II data cap. Since the company offers both products from the cellular network, it’s impossible for the company to claim that the data caps are due to network constraints or any other technical issue. AT&T set the data cap at the low 160 GB because the FCC stupidly suggested that low amount in the CAF II order. The low data cap is clearly about money.

The last time we measured our home with 3 users we used over 700 GB per month. We are cord cutters and watch all video on the web. We work from home. And our daughter was taking online classes. Under the AT&T CAF II product our monthly bill would be $170. And even then we would have a data product that would not allow us to do the things we want to do, because the 10 Mbps download speed would not allow all three of us to use the web at the same time. If you’ve been reading my blog you’ve heard me say often what a colossal waste of money the CAF II program is. The FCC gave AT&T $2.5 billion to foist this dreadful bandwidth product on rural America.
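Here is how that $170 bill comes together, using the prices quoted above ($60/month with a contract, $10 per 50 GB over the cap or fraction thereof):

```python
import math

# Monthly bill under AT&T's CAF II plan for a household using 700 GB.
BASE_RATE = 60        # dollars/month, with a contract
CAP_GB = 160
OVERAGE_BLOCK_GB = 50
OVERAGE_PRICE = 10    # dollars per 50 GB block (or fraction thereof)

usage_gb = 700
overage_gb = max(usage_gb - CAP_GB, 0)
# "or fraction thereof" means we round the block count up
overage_blocks = math.ceil(overage_gb / OVERAGE_BLOCK_GB)
bill = BASE_RATE + overage_blocks * OVERAGE_PRICE
print(bill)  # 170
```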

Who Wins with Cable Deregulation?

There has been a lot of press lately discussing what might happen if the FCC does away with Title II regulation of broadband. But broadband isn’t the only battle to be fought and we are also in for big changes in the cable industry. Since our new FCC is clearly anti-regulation I think the future of cable TV is largely going to be won by whoever best copes with a deregulated cable world.

Cable companies today are governed by arcane rules that rigidly define how to provide terrestrial cable TV. These rules, for example, define the three tiers of cable service – basic, expanded basic and premium – and it is these rules that have led us to the big channel line-ups that are quickly falling out of favor. Most households regularly watch a dozen or so different channels, and even big cable users rarely watch more than 30 channels – yet we have all been sucked into paying for 150- to 300-channel line-ups.

It’s likely that the existing rules governing cable will either be relaxed or ignored by the FCC. A lot of the cable rules were defined by Congress in bills like the Telecommunications Act of 1996, so only Congress can change those rules. But the FCC can achieve deregulation by inaction. Already today we see some of the big cable providers violating the letter of those rules. For example, Verizon has a ‘skinny’ package that does not fit into the defined FCC definition of the structure of cable tiers. The FCC has turned a blind eye to these kinds of changes, and if they are more overt about this then we can expect cable providers everywhere to start offering line-ups people want to watch – and at more affordable prices if the cable companies can avoid paying for networks they don’t want to carry.

The cable companies are now in a battle with OTT providers like Netflix, Sling TV and others. It’s clear to the cable companies that if they don’t fight back they are going to bleed customers faster and faster, similar to what happened to landline voice.

One way cable companies can fight back is to introduce programming packages that are similar to what the OTT providers are offering. This is going to require a change in philosophy at cable companies, because the larger companies have taken to nickel-and-diming customers to death in the last few years. They sell a package at a low advertised price, then load on a $10 settop box fee and a number of other fees made to look like taxes, and the actual price ends up $20 higher than advertised. That’s not going to work when competing head-to-head with an OTT competitor that doesn’t add any fees.

The cable companies are also going to have to get nimble. I can currently connect and disconnect from a web service like Sling TV at will. Two or three clicks and I can disconnect. And if I come back they make it easy to reconnect. The cable companies have a long way to go to get to this level of customer ease.

Of course, the big ISPs can fight back in other ways. For example, I’ve seen speculation that they will try to get taxes levied on OTT services to become more price competitive. Certainly the big ISPs have a powerful lobbying influence in Washington and might be able to pull that off.

There is also speculation that the big ISPs might try to charge ‘access fees’ to OTT providers. They might try to charge somebody like Netflix to get to their customers, much in the same manner that the big telcos charge long distance carriers for using their networks. That might not be possible without Congressional action, but in today’s political world something like this is conceivable.

Another tactic the cable companies could take would be to reintroduce low data caps. If the FCC eliminates Title II regulation that is a possibility. The cable companies could make it costly for homes that want to watch a lot of OTT content.

And perhaps the best way for the cable companies to fight back against OTT is to join them. Just last week Comcast announced that it will be introducing its own OTT product. The cable companies already have the programming relationships – this is what made it relatively easy for Dish Network to launch Sling TV.

It’s impossible to predict where this might all go. But it seems likely that we are headed towards a time of more competition – which is good for consumers. But some of these tactics could harm competition and make it hard for OTT providers to be profitable. Whichever way it goes it’s going to be an interesting battle to watch.