Is AT&T the 800-pound Gorilla?

For years it’s been understood in the industry that Comcast is the hardest incumbent to compete against. They are still a cable company, and many people dislike cable companies, but Comcast has been the most formidable competitor. The company is reported to have the highest gross margins on cable TV and might be one of the few companies still making a significant profit on cable. Much of that is due to their extensive programming holdings – it’s easier to make money on cable when you own your own programming. Comcast has also been the best in the industry at creating bundles to lock in customers – bundling things like smart home and, more recently, cellular service.

But the new 800-pound gorilla in the industry might be AT&T. The company seems to be finally emerging from the transition period of integrating its purchase of Time Warner. It can be argued that the programming that came from that merger – things like HBO, CNN, and blockbuster movies – will make AT&T a more formidable competitor than Comcast.

AT&T will be launching its new streaming service, AT&T TV, next month. The company already operates one of the largest streaming services with DirecTV Now. It’s been rumored that the new service will start at a price around $18 per month – an amazingly low price considering that HBO alone retails for $15 online today. The company is trying to coax more money out of the millions of current HBO subscribers, and this pricing will also lure customers to drop HBO bought from cable companies and instead purchase it online.

AT&T has also been building fiber for the last four years and says that they now pass 20 million homes and businesses. They recently announced the end of the big fiber push and will likely now concentrate on selling to customers in that big footprint. The company is one of the more aggressive marketers and has sent somebody to my door several times in the last year. That’s a sign of a company that is working hard to gain broadband subscribers.

The one area where AT&T is still missing the boat is in not bundling broadband and cellular service. AT&T is still number one in the country in cellular, with almost 160 million customers at the end of the second quarter. For some reason, they have never tried to sell bundles into that large customer base.

AT&T has most recently been having a customer purge at DirecTV. For years that business bought market share by offering prices significantly below landline cable TV. Over the last year, the company has been refusing to renew promotional pricing deals and is willing to let customers walk. In the first quarter of this year alone the company lost nearly one million customers. The company says they are not unhappy to see these customers leave since they weren’t contributing to the bottom line. This is a sign of a company that is strengthening its position by shedding the cost of dealing with unprofitable customers.

AT&T has also pushed a few net neutrality issues further than other incumbents. As a whole, the industry seems to be keeping a low profile with issues that are identified as net neutrality violations. There is speculation that the industry doesn’t want to stir up public ire on the topic and invite a regulatory backlash if there is a change in administration.

AT&T widely advertised to its cellular customers earlier this year that the company would not count DirecTV Now usage against cellular or landline data caps. The same will likely be true for AT&T TV. Favoring one’s own service over the competition is clearly one of the things that net neutrality was intended to stop. Since there are data caps on both cellular and AT&T landline products, the move puts Netflix and other streaming services at a competitive disadvantage. That disadvantage will grow over time as more landline customers hit the AT&T data caps.

AT&T has made big mistakes in the past. For instance, they poured a fortune into promoting 50 Mbps DSL instead of pushing for fiber a decade sooner. They launched their cable TV product just as that market peaked. The company seemed to lose sight of all landline and fiber-based products for a decade when everything the company did was for cellular – I remember a decade ago having trouble even finding mention of the broadband business in the AT&T annual report.

We’ll have to wait a few years to see if a company like AT&T can reinvent itself as a media giant. For now, it looks like they are making all of the right moves to take advantage of their huge resources. But the company is still managed by the same folks who were managing it a decade ago, so we’ll have to see if they can change enough to make a difference.

Should Satellite Broadband be Subsidized?

I don’t get surprised very often in this industry, but I must admit that I was surprised by the amount of money awarded for satellite broadband in the reverse auction for CAF II earlier this year. Viasat, Inc., which markets as Exede, was the fourth largest winner, collecting $122.5 million in the auction.

I understand how Viasat won – it’s largely a function of the way that reverse auctions work. In a reverse auction, each bidder lowers the amount of their bid in successive rounds until only one bidder is left in any competitive situation. The whole pool of bids is then adjusted to meet the available funds, which could mean an additional reduction of what winning bidders finally receive.
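The mechanics can be sketched with a toy simulation. The bidder names, dollar amounts, and single-round decrement below are hypothetical, and this ignores the real auction’s weighting and multi-area complexity – it only illustrates why the bidder with the lowest walk-away floor wins:

```python
# Toy sketch of a reverse auction (hypothetical bidders and dollar
# amounts, not the actual CAF II rules): the subsidy on offer drops each
# round, and bidders drop out once it falls below the minimum they will
# accept. The last bidder standing wins at the reduced amount.

def reverse_auction(floors, start_bid, decrement):
    """floors maps bidder -> the minimum subsidy that bidder will accept."""
    bid = start_bid
    active = dict(floors)
    while len(active) > 1:
        bid -= decrement
        # bidders drop out once the bid falls below their walk-away floor
        active = {b: f for b, f in active.items() if f <= bid}
    winner = next(iter(active))
    return winner, bid

# A satellite provider that would launch its satellites anyway has a
# floor near zero, so it can outlast a landline ISP that needs a real
# subsidy to justify construction.
bidders = {"landline_isp": 40_000_000, "satellite": 1_000_000}
winner, winning_bid = reverse_auction(bidders, start_bid=100_000_000,
                                      decrement=5_000_000)
print(winner, winning_bid)  # satellite 35000000
```

The landline ISP holds on until the bid drops below its $40 million floor; the satellite bidder, with almost no incremental cost, simply outlasts it.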

Satellite providers, by definition, have a huge unfair advantage over every other broadband technology. Viasat was already in the process of launching new satellites – and they would have launched them with or without the FCC grant money. Because of that, there is no grant level too low for them to accept out of the grant process – they would gladly accept getting only 1% of what they initially requested. A satellite company can simply outlast any other bidder in the auction.

This is particularly galling since Viasat delivers what the market has already deemed to be inferior broadband. Viasat’s download speeds of at least 12 Mbps are fast enough to satisfy the reverse auction requirements, and the other current satellite provider, HughesNet, offers speeds of at least 25 Mbps. The two issues that customers have with satellite broadband are the latency and the data caps.

By definition, the latency for a satellite in a 23,000-mile orbit is at least 476 ms (milliseconds) just to account for the distance traveled to and from the earth. Actual latency is often above 600 ms. The rule of thumb is that real-time applications like VoIP, gaming, or holding a connection to a corporate LAN start having problems when latency is greater than 100-150 ms.
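The floor on that latency is pure physics. A request must travel up to the satellite and down to the ground station, and the response must make the same trip back – four one-way trips at the speed of light:

```python
# Back-of-the-envelope latency floor for a geostationary satellite: the
# request travels up and down, and the response travels up and down
# again, so four one-way trips happen before the user sees anything.

SPEED_OF_LIGHT_MI_S = 186_282   # miles per second, in a vacuum
GEO_ALTITUDE_MI = 22_236        # geostationary altitude above the equator

one_way_ms = GEO_ALTITUDE_MI / SPEED_OF_LIGHT_MI_S * 1000
round_trip_ms = 4 * one_way_ms  # request up/down + response up/down
print(f"{round_trip_ms:.0f} ms")  # 477 ms, close to the 476 ms cited above
```

Real-world latency is higher still because of routing, processing at the ground station, and the distance from the ground station to the wider Internet.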

Exede no longer cuts customers dead for the month once they reach the data cap; instead they reduce speeds for any customer over the cap when the network is busy. Customer reviews say this can be extremely slow during prime time. The monthly data caps are small, with plans ranging from $49.99 per month for a 10 GB cap to $99.95 per month for a 150 GB cap. To put those caps into perspective, OpenVault recently reported that the average landline broadband household used 273.5 GB per month of data in the first quarter of 2019.

Viasat has to be thrilled with the result of the reverse auction. They got $122.5 million for something they were already doing. The grant money isn’t bringing any new option to customers, who were already free to buy these products before the auction. There is no better way to say it – Viasat got free money due to a loophole in the grant process. I don’t think they should have been allowed into the auction since they aren’t bringing any broadband that is not already available.

The bigger future issue is whether the new low-earth orbit satellite companies will qualify for future FCC grants, such as the $20.4 billion grant program starting in 2021. The new grant programs are also likely to be reverse auctions. There is no doubt that Jeff Bezos or Elon Musk will gladly take government grant money, and there is no doubt that they can underbid any landline ISP in a reverse auction.

For now, we don’t know anything about the speeds that will be offered by the new satellites. We know they claim that latency will be around 25 ms – about the same as a cable network. We don’t know about data plans and data caps, although Elon Musk has hinted at unlimited data plans – we’ll have to wait to see what is actually offered.

It would be a tragedy for rural broadband if the new (and old) satellite companies were to win any substantial amount of the new grant money. To be fair, the new low-orbit satellite networks are expensive to launch, with price tags for each of the three providers estimated to be in the range of $10 billion. But these companies are using these satellites worldwide and will be launching them with or without help from an FCC subsidy. Rural customers are going to best be served in the long run by having somebody build a network in their neighborhood. It’s the icing on the cake if they are also able to buy satellite broadband.

Gaming Migrates to the Cloud

We are about to see a new surge in demand for broadband as major players in the game industry have decided to move gaming to the cloud. At the recent Game Developers Conference in San Francisco, both Google and Microsoft announced major new cloud-based gaming initiatives.

Google announced Stadia, a platform that they tout as letting customers play games on any device, from anywhere with a broadband connection. During the announcement they showed a live streaming game being transferred from desktop to laptop to cellphone. Microsoft announced the new xCloud platform that lets Xbox gamers play a game from any connected device. Sony has been promoting online play between gamers for many years and now also offers some cloud gaming on the PlayStation Now platform.

OnLive tried this in 2011, offering a platform that was played in the cloud using OnLive controllers, but without needing a computer. The company failed due to the quality of broadband connections in 2011, but also due to limitations at the gaming data centers. Both Google and Microsoft now operate regional data centers around the country that house state-of-the-art whitebox routers and switches capable of handling large volumes of simultaneous gaming sessions. As those companies have moved large commercial users to the cloud, they have created the capability to also handle gaming.

The gaming world was ripe for this innovation. Current gaming ties gamers to gaming consoles or expensive gaming computers. Cloud gaming brings mobility to gamers and also eliminates the need to buy an expensive gaming console. This move to the cloud probably signals the beginning of the end for the Xbox, PlayStation, and Nintendo consoles.

Google says it will support some games at the equivalent of an HD video stream – 1080p at 60 frames per second. That equates to about 3 GB downloaded per hour. But most of the Google platform is going to operate at 4K video speeds, requiring download speeds of at least 25 Mbps per gaming stream and using 7.2 GB of data per hour. Nvidia has been telling gamers that they need 50 Mbps per 4K gaming connection.
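The relationship between a sustained bitrate and hourly data usage is simple arithmetic. Note that the 7.2 GB per hour figure corresponds to an average stream bitrate of about 16 Mbps – a stream running flat-out at the 25 Mbps peak would use noticeably more:

```python
# Convert a sustained stream bitrate in Mbps into gigabytes per hour:
# megabits/second * 3600 seconds = megabits/hour, then divide by 8 for
# megabytes and by 1000 for gigabytes.

def gb_per_hour(mbps: float) -> float:
    return mbps * 3600 / 8 / 1000

print(gb_per_hour(6.7))   # ~3.0 GB/hour, in line with the 1080p figure
print(gb_per_hour(16))    # 7.2 GB/hour - the 4K usage figure cited above
print(gb_per_hour(25))    # 11.25 GB/hour at a constant 25 Mbps
```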

This shift has huge implications for broadband networks. First, streaming causes the most stress on local broadband networks since the usage is continuous over long periods of time. A lot of ISP networks are going to start showing data bottlenecks when significant numbers of additional users stream 4K connections for hours on end. Until ISPs react to this shift, we might return to those times when broadband networks bogged down in prime time.

This is also going to increase the need for download and upload speeds. Households won’t be happy with a connection that can’t stream 4K, so they aren’t going to be satisfied with the 25 Mbps connection that the FCC says is broadband. I have a friend with two teenage sons who both run two simultaneous game streams while watching a streaming gaming TV site. It’s good that he is able to buy a gigabit connection on Verizon FiOS, because his sons alone use a continuous broadband connection of at least 110 Mbps, and probably more.

We are also going to see more people looking at the latency on networks. The conventional wisdom is that a gamer with the fastest connection has an edge. Gamers value fiber over cable modems and value cable modems over DSL.

This also is going to bring new discussion to the topic of data caps. Gaming industry statistics say that the average serious gamer plays 16 hours per week – and obviously, many play longer than that. My friend with the two teenagers is probably looking at 30 GB or more per hour of broadband download usage, plus a decent chunk of upload usage. Luckily for my friend, Verizon FiOS has no data cap. Many other big ISPs like Comcast start charging for data usage over one terabyte per month – a number that won’t be hard to reach for a household with gamers.
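As a sanity check on those numbers, here is the arithmetic for that household. The per-stream rates and hours per day below are assumptions for illustration, not measured values:

```python
# Rough usage for a household with two gamers each running two
# simultaneous game streams, plus one video stream. Per-stream rates and
# hours per day are illustrative assumptions.

GB_PER_GAME_STREAM_HR = 7.2   # a 4K game stream, per the figure above
GB_PER_VIDEO_STREAM_HR = 3.0  # an HD video stream

game_streams = 4              # two sons x two streams each
hourly = game_streams * GB_PER_GAME_STREAM_HR + GB_PER_VIDEO_STREAM_HR
print(f"{hourly:.1f} GB/hour")     # 31.8 GB/hour, near the ~30 GB cited

hours_per_day = 4             # assumed
monthly = hourly * hours_per_day * 30
print(f"{monthly:,.0f} GB/month")  # well past a 1 TB (1,000 GB) cap
```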

I think this also opens up the possibility for ISPs to sell gamer-only connections. These connections could be routed straight to peering arrangements with Google or Microsoft to guarantee the fastest connection through the network, and wouldn’t mix gaming streams with other household broadband streams. Many gamers will pay extra to have a speed edge.

This is just another example of how the world finds ways to use broadband when it’s available. We’ve obviously reached a time when online gaming can be supported. When OnLive tried this, there were not enough households with fast enough connections, there weren’t fast enough regional data centers, and there wasn’t a peering network in place where ISPs connect directly to big data companies like Google and bypass the open Internet.

The gaming industry is going to keep demanding faster broadband and I doubt they’ll be satisfied until we have a holodeck in every gamer’s home. But numerous other industries are finding ways to use our increasing household broadband capacity and the overall demand keeps growing at a torrid pace.


Another Rural Wireless Provider?

T-Mobile announced the start of a trial for a fixed wireless broadband product using LTE. The product is being marketed as “T-Mobile Home Internet”. The company will offer the product by invitation only to some existing T-Mobile cellular customers in “rural and underserved areas”. The company says they might connect as many as 50,000 customers this year. The company is marketing the product as 50 Mbps broadband, with a monthly price of $50 and no data cap. The company warns that speeds may be curtailed during times of network congestion.

The company further says that their ultimate goal is to offer speeds of up to 100 Mbps, but only if they are allowed to merge with Sprint and gain access to Sprint’s huge inventory of mid-range spectrum. They said the combination of the two companies would enable them to cover as many as 9.5 million homes with 100 Mbps broadband in about half of US zip codes.

There are positive aspects to the planned deployment, but also a number of issues that make me skeptical. One positive aspect is that some of the spectrum used for LTE can better pass through trees compared to the spectrum used for the fixed wireless technology being widely deployed in the open plains and prairies of the Midwest and West. This opens up the possibility of bringing some wireless broadband to places like Appalachia – with the caveat that heavy woods are still going to slow down data speeds. It’s worth noting that this is still a line-of-sight technology and fixed LTE will be blocked by hills or other physical impediments.

The other positive aspect of the announced product is the price and lack of a data cap. Contrast this to the AT&T fixed LTE product that has a price as high as $70 along with a stingy 160 GB monthly cap, and with overage charges that can bring the AT&T price up to $200 per month.

I am skeptical of a number of the claims made or implied by the announcement. The primary concern is download speeds. Fixed LTE will be the same as any other fixed wireless product and speeds will decrease with the distance of a customer from the serving tower. In rural America distances can mount up quickly. LTE broadband is similar to rural cellular voice and works best where customers can get 4 or 5 bars. Anybody living in rural America understands that there are a lot more places with 1 or 2 bars of signal strength than of 4 or 5 bars.

The 50 Mbps advertised speed is clearly an ‘up-to’ speed and in rural America it’s doubtful that anybody other than those who live under a tower could actually get that much speed. This is one of the few times when I’ve seen AT&T advertise truthfully and they market their LTE product as delivering at least 10 Mbps speed. I’ve read numerous online reviews of the AT&T product and the typical speeds reported by customers range between 10 Mbps and 25 Mbps, with only a few lucky customers claiming speeds faster than that.

The online reviews of the AT&T LTE product also indicate that signal strength is heavily influenced by rain and can completely disappear during a downpour. Perhaps even more concerning are reports that in some cases speeds remain slow after a rain due to wet leaves on trees that must be scattering the signal.

Another concern is that T-Mobile is touting this as a solution for underserved rural America. T-Mobile has far less presence in rural America than AT&T and Verizon and is on fewer rural cellular towers. This is evidenced by their claim that even after a merger with Sprint they’d only be seeing 9.5 million passings – a really small coverage footprint for a nationwide cellular network. I’m a bit skeptical that T-Mobile will invest in connecting to more rural towers just to offer this product – the cost of backhaul to rural towers often makes for a lousy business case.

The claim also says that the product will have some aspects of both 4G and 5G. I’ve talked to several wireless engineers who have told me that they can’t see any particular advantage for 5G over 4G when deploying as fixed wireless. A carrier already opens up the available data path fully with 4G to reach a customer and 5G can’t make the spectrum perform any better. I’d love to hear from anybody who can tell me how 5G would enhance this particular application. This might be a case where the 5G term is tossed in for the benefit of politicians and marketing.

Finally, this is clearly a ploy to keep pushing for the merger with Sprint. The claim that the combined companies could offer 100 Mbps rural broadband has even more holes than the argument for achieving 50 Mbps. However, Sprint does have a larger presence on rural towers today than T-Mobile, although I think the Sprint towers are already counted in the 9.5 million passings claim.

But putting aside all my skepticism, it would be great if T-Mobile can bring broadband to any rural customers that otherwise wouldn’t have it. Even should they not achieve the full 50 Mbps claim, many rural homes would be thrilled to get speeds at half that level. A wireless product with no data caps would also be welcome. The timing of the announcement is clearly aimed at promoting the merger process with Sprint and I hope the company’s deployment plans don’t evaporate if the merger doesn’t happen.

Broadband Usage Continues to Grow

The firm OpenVault, a provider of software that measures data consumption for ISPs, reported that the average monthly data use by households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. The company also reported that the median use per household grew from 103.6 gigabytes in 2017 to 145.2 gigabytes in 2018 – a growth rate of 40%. The median represents the midpoint of users, with half of all households above it and half below.

To some degree, these statistics are not news because we’ve known for a long time that broadband usage at homes, both in total download and in desired speeds, has been doubling every three years since the early 1980s. The growth in 2018 was actually a little faster than that historical average – if the 2018 growth rate were sustained, in three years usage would reach 235% of today’s level. What I find most impressive about these new statistics is the magnitude of the annual change – the average home used 67 more gigabytes of data per month in 2018 than the year before – a number that would have seemed unbelievable only a decade ago when the average household used a total of only 25 gigabytes per month.
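The compounding works out as follows, using the 33% OpenVault average-usage growth rate quoted above:

```python
# A 33% annual growth rate sustained for three years compounds to about
# 2.35x - consistent with (and slightly faster than) a doubling every
# three years.

avg_2018_gb = 268.7   # OpenVault 2018 average monthly household usage
growth = 1.33
factor = growth ** 3
print(f"{factor:.2f}x")                        # 2.35x
print(f"{avg_2018_gb * factor:.0f} GB/month")  # ~632 GB by 2021
```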

There are still many in the industry who are surprised by these numbers. I’ve heard people claim that now that homes are watching all the video they want, the rate of growth is bound to slow down – but if anything, the rate of growth seems to be accelerating. We also know that cellular data consumption is now doubling every two years.

This kind of growth has huge implications for the industry. From a network perspective, this kind of bandwidth usage puts a big strain on networks. Typically the most strained part of a network is the backbones that connect to neighborhood nodes. That’s the primary stress point in many networks, including FTTH networks, and when there isn’t enough bandwidth to a neighborhood then everybody’s bandwidth suffers. Somebody who designed a network ten years ago would never have believed the numbers that OpenVault is reporting and would likely not have designed a network that would still be sufficient today.

One consequence of the bandwidth growth is that it’s got to be driving homes to change to faster service providers when they have the option. A household that might have been happy with a 5 Mbps or 10 Mbps connection a few years ago is likely no longer happy with it. This has to be one of the reasons we are seeing millions of homes upgrade from DSL to cable modems each year in metropolitan areas. The kind of usage growth we are seeing today has to be accelerating the death of DSL.

This growth also should be affecting policy. The FCC set the definition of broadband at 25/3 Mbps in January of 2015. If that was a good definition in 2015, then the definition should have been increased to 63 Mbps in 2019. At the time the FCC set that threshold I thought they were a little generous. In 2014, as the FCC was having this debate, the average home downloaded around 100 gigabytes per month, and the right definition of broadband was probably more realistically 15 – 20 Mbps – the FCC was obviously a little forward-looking in setting the definition. Even so, the definition should be increased – if the right definition of broadband in 2014 was 20 Mbps, then today it ought to be at least 50 Mbps.
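The 63 Mbps figure falls straight out of the doubling-every-three-years rule applied to the 2015 threshold:

```python
# If broadband demand doubles every three years, a speed definition set
# in one year should scale the same way:
#   speed(year) = base * 2 ** ((year - base_year) / 3)

def scaled_definition(base_mbps: float, base_year: int, year: int) -> float:
    return base_mbps * 2 ** ((year - base_year) / 3)

# The 25 Mbps threshold set in January 2015, scaled to 2019:
print(f"{scaled_definition(25, 2015, 2019):.0f} Mbps")  # 63 Mbps
```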

The current FCC is ignoring these statistics for policy purposes – if they raise the definition of broadband then huge numbers of homes will be classified as not having broadband. The FCC does not want to do that since they are required by Congressional edict to make sure that all homes have broadband. When the FCC set a realistic definition of broadband in 2015 they created a dilemma for themselves. That 2015 definition is already obsolete, and if they don’t change it, in a few years it is going to be absurd. One only has to look forward three years from now, when the definition of broadband ought to be 100 Mbps.

These statistics also remind us of the stupidity of handing out federal subsidies to build technologies that deliver less than 100 Mbps. We still have two more years of CAF II construction to upgrade speeds to an anemic 10 Mbps. We are still handing out new subsidies to build networks that can deliver 25/3 Mbps – networks that are obsolete before they are completed.

Network designers will tell you that they try to design networks to satisfy demands at least seven years into the future (which is the average life of many kinds of fiber electronics). If broadband usage keeps doubling every three years, then looking forward seven years to 2026, the average home is going to download 1.7 terabytes per month and will expect download speeds of 318 Mbps. I wonder how many network planners are using that target?
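That projection can be checked directly. The starting points below are assumptions taken from figures cited earlier in this piece (the OpenVault Q1 2019 average and the 63 Mbps adjusted definition), and the exact terabyte figure depends heavily on which starting point is chosen:

```python
# Project usage and speed seven years out under the doubling-every-
# three-years rule: value * 2 ** (years / 3). Starting points are
# assumptions drawn from figures cited earlier, not forecasts.

def project(value: float, years: float, doubling_years: float = 3) -> float:
    return value * 2 ** (years / doubling_years)

avg_monthly_gb = 273.5   # OpenVault Q1 2019 average household usage
speed_mbps = 63          # the 2019-adjusted broadband definition

print(f"{project(avg_monthly_gb, 7) / 1000:.2f} TB/month")  # ~1.4 TB
print(f"{project(speed_mbps, 7):.0f} Mbps")                 # ~318 Mbps
```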

The final implications of this growth are for data caps. Two years ago when Comcast set a terabyte monthly data cap they said that it affected only a few homes – and I’m sure they were right at the time. However, the OpenVault statistics show that 4.12% of homes used a terabyte per month in 2018, almost double from 2.11% in 2017. We’ve now reached that point when the terabyte data cap is going to have teeth, and over the next few years a lot of homes are going to pass that threshold and have to pay a lot more for their broadband. While much of the industry has a hard time believing the growth statistics, I think Comcast knew exactly what they were doing when they established the terabyte cap that seemed so high just a few years ago.

The Reality of Rural Broadband

I recently saw the results of several rural surveys that probably tell the best story about the state of rural broadband. The two areas being studied are far apart geographically, but they are similar in many ways. Both are rural and not near a metropolitan area. Both have some modest manufacturing and a modest amount of tourism, but neither in a big way. Both areas include some small towns, and a few of these towns have cable TV. And in both places, customers in the rural areas have poor broadband choices. These are not small isolated pockets of people – the two surveys cover nearly 20,000 homes.

If you listen to FCC rhetoric it’s easy to think that rural broadband is improving – but in areas like these you can’t see it. Both areas were supposed to get some upgrades from CAF II – but from what the locals tell me, there have been zero improvements so far. The CAF program still has a few years to go, so perhaps there will be some modest improvement in rural DSL.

For now, the broadband situation in these areas is miserable. There are homes with DSL with speeds of a few Mbps at best, with some of the worst speeds hovering at dial-up speeds. One respondent to a survey reported that it took 8 hours to download a copy of Microsoft Office online.

The other broadband choices are also meager. Some people use satellite broadband but complain about the latency and the small data caps. These areas both have a smattering of fixed wireless broadband – but this is not the modern fixed wireless you see today in the open plains states that delivers 25 Mbps or faster. Both of the areas in the surveys are heavily wooded with hilly terrain, and fixed wireless customers report speeds of 1-2 Mbps. A number of homes use their cell phones in lieu of home broadband – an expensive alternative if there are school kids or if any video is watched. Some customers reported using public hotspots in nearby small towns. And there were a number of households, including many with school kids, that have given up and have no broadband – because nothing they’ve tried has worked.

As would be expected in rural areas, slow speeds are not the only problem. Even homes that report data speeds that should support streaming video complain that streaming doesn’t work. This indicates networks with problems and it’s likely the networks have high latency, are full of jitter, or are over-subscribed and have a lot of packet loss. People don’t really judge the quality of their broadband connection by the speed they get on a speed test, but instead by the ability to do normally expected activities on the Internet.

Many of these homes can’t do things that the rest of us take for granted. Many report the inability to stream video – even a single stream. This is perhaps the biggest fallacy in the way the FCC measures broadband, because they expect that a house getting a speed like 5 Mbps ought to be able to do most needed tasks. In real life the quality of many rural connections is so poor that they won’t stream video. Many people in these areas also complained that their Internet often froze and they had to constantly reboot – something that can kill large downloads or online sessions for school or work.

One of the biggest complaints in these areas was that their network only supported one device at a time, meaning that members of the family have to take turns using the Internet. I picture a family with a few school kids and can see how miserable that must be.

The surveys produced a long list of other ways that poor broadband was hurting households. Number one was the inability of people to work at home. Many people said they could work at home more often if they had broadband. A few respondents want to start home businesses but are unable to because of the poor broadband. Another common complaint was the inability for kids to do schoolwork, or for adults to pursue college degrees online.

The problems many people reported were even more fundamental than these issues. For instance, there were households saying that they could not maintain a good enough connection to bank online or pay their bills online. There were respondents who say they can’t shop online. Many households complained that they couldn’t offload cellular data at home to WiFi, driving up their cellular bills. A number of homes would like to cut the cord to save money but can’t stream Netflix as an alternative to cable.

When you look at the raw data behind these kinds of surveys you quickly see the real issues with lack of broadband. In today’s society, not having home broadband literally takes a home out of the mainstream of society. It’s one thing to look at the national statistics and be told that the number of homes without broadband is shrinking. But it’s an entirely different story when you see what that means for the millions of homes that still don’t have adequate broadband. My guess is that some of the areas covered by these surveys show as underserved on the FCC maps – when in fact, their broadband is so poor that they are clearly unserved, ignored and forgotten.

The Zero-rating Strategy

The cable companies are increasingly likely to take a page from the cellular carriers by offering zero-rating for video. That’s the practice of providing video content that doesn’t count against monthly data caps.

Zero-rating has been around for a while. T-Mobile first started using zero-rating in 2014 when it provided its ‘Music Freedom’ plan that provided free streaming music that didn’t count against cellular data caps. This highlights how fast broadband needs have grown in a short time – but when data caps were at 1 GB per month, music streaming mattered.

T-Mobile then expanded the zero-rating in November 2015 to include access to several popular video services like Netflix and Hulu. Verizon followed with the first ‘for-pay’ zero-rating product, called FreeBee Data, that let customers (or content providers) pay to zero-rate video traffic. Sponsored-data plans of this kind were prominent in the net neutrality discussions since they are a textbook example of Internet fast lanes, where some video traffic is given preferential treatment over other data.

A few of the largest cable companies have also introduced a form of zero-rating. Comcast started offering what it called Stream TV in late 2015. This service allowed customers to view video content that doesn’t count against the monthly data cap. This was a pretty big deal because Comcast was in the process at the time of implementing a 300 GB monthly data cap, and video can easily push households over that small cap. There was huge consumer pushback against the paltry caps and Comcast quickly reset the cap to 1 terabyte. But the Stream TV plan is still in effect today.

What’s interesting about the Comcast plan is that the company had agreed not to use zero-rating as part of the terms of its merger with NBC Universal in 2011. The company claims that the Stream TV plan is not zero-rating since it uses cable TV bandwidth instead of data bandwidth – but anybody who understands a cable hybrid fiber-coaxial network knows that this argument is sleight-of-hand, since all data uses some portion of the Comcast data connection to customers. The prior FCC started to look into the issue, but the current FCC dropped it when they decided to eliminate net neutrality.

The big cable companies have to be concerned about the pending competition from last-mile 5G. Verizon will begin a slow roll-out of its new 5G technology in October in four markets, and T-Mobile has announced plans to begin offering it next year. Verizon has already announced that it will not have any data caps, and T-Mobile is unlikely to have them either.

The pressure will be on the cable companies not to charge for exceeding data caps in competitive markets. They could do this by eliminating data caps or by pushing more video through zero-rating plans. In Comcast’s case, the company won’t want to eliminate data caps in markets that are not competitive – it views the caps as a potential source of revenue. OpenVault says that 2.5% of homes currently exceed 1 TB in monthly data usage, up from 1.5% in 2017 – and within a few years this could be a lucrative source of extra revenue.

Comcast and the other big cable companies are under tremendous pressure to maintain earnings and they are not likely to give up on data caps as a revenue source. They are also likely to pursue sponsored video plans where the video services pay them to provide video outside of data caps.

Zero-rating is the one net neutrality practice that many customers like. Even if net neutrality is imposed again – through something like the California legislation or by a future FCC – it will be interesting to see how firmly regulators are willing to clamp down on a practice the public likes.

Metering Broadband

A lot of the controversy about Comcast data caps died down last year when the company raised the monthly cap from 300 gigabytes to 1 terabyte. But lately I’ve been seeing folks complaining about being charged for exceeding the 1 TB cap – so Comcast is still enforcing its data cap rules.

In order to enforce a data cap, an ISP has to meter usage somehow – and it appears that in a lot of cases ISPs do a lousy job of it. Not all ISPs have data caps. The biggest ISPs that have them include Comcast, AT&T, CenturyLink (for DSL), Cox and Mediacom. But even these ISPs don’t enforce data caps everywhere – Comcast, for example, doesn’t enforce them where it competes directly against Verizon FiOS.

Many customer home routers can measure usage, and there are reports of cases where Comcast’s measurements differ massively from what is being seen at the home. For example, some customers have seen big spikes in measured usage at times when their routers were disconnected or when power was out at the home. Many customers claim the Comcast readings always greatly exceed what their home routers report.

Data caps matter because customers who exceed them get charged a fee. Comcast charges $10 for each 50 GB of monthly usage over the cap. Mediacom has the same fees, but with much smaller data caps – such as a 150 GB monthly cap on customers with a 60 Mbps product.
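To make the arithmetic concrete, here’s a quick back-of-the-envelope sketch of how those fees add up. This isn’t any ISP’s actual billing logic – in particular, rounding up to whole 50 GB blocks and treating the 1 TB cap as 1,000 GB are my assumptions:

```python
import math

def overage_charge(usage_gb, cap_gb, block_gb=50, fee_per_block=10):
    """$10 per 50 GB block (or fraction of one) over the cap -- assumed rounding."""
    if usage_gb <= cap_gb:
        return 0
    return math.ceil((usage_gb - cap_gb) / block_gb) * fee_per_block

# A household at 1,120 GB against a Comcast-style 1 TB cap pays for 3 blocks:
print(overage_charge(1120, 1000))  # 30
# The same fee schedule against Mediacom's 150 GB cap on a 60 Mbps plan:
print(overage_charge(300, 150))    # 30
```

Note how much harder the small Mediacom cap bites: a fairly ordinary 300 GB month triggers the same $30 fee that takes well over a terabyte to trigger at Comcast.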

It’s not hard to imagine homes exceeding the Comcast cap today. Before I left Comcast a year ago the company said that my family of three was using 600 – 700 GB per month. Since I didn’t measure my own usage I have no idea if their numbers were inflated. But if the measurements were accurate, it’s not hard to imagine somebody with several kids at home exceeding 1 TB. The ISPs claim that only a small percentage of customers hit the data cap limits – but in a world where data usage keeps growing exponentially, more homes will hit the limit as time goes by.

What I find interesting is that there is zero regulation of the ISP data ‘meters’. Every other kind of meter used to bill customers is regulated. Utilities selling water, electricity or natural gas must use meters that are certified to be accurate. Meters on gas pumps are checked regularly for accuracy.

But nobody is monitoring the ISPs and the way they measure data usage. The FCC effectively washed its hands of regulating broadband when it killed Title II regulation. Theoretically the Federal Trade Commission could tackle the issue, but it is not required to do so – the FTC regulates interactions with customers in all industries and can select the cases it wants to pursue.

There are a few obvious reasons why the readings from an ISP would differ from those at a home, even under ideal conditions. The ISP measures usage at its network hub while the customer measures at the home. Packets are always lost in the network due to interference or noise, particularly on older copper and coaxial networks, and the ISP counts all data passing through the hub as usage even though some of those packets never make it to the customer. But when you read the horror stories of homes that don’t watch video seeing daily Comcast readings of over 100 GB, you know that something is wrong in the way Comcast measures usage. It must be a daunting task to measure usage for thousands of users simultaneously, and Comcast obviously has problems in its measurement algorithms.
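The hub-versus-home discrepancy can be illustrated with a toy model. The packet size and loss rate below are made-up illustrative numbers, not anything Comcast has published:

```python
import random

def meter_readings(packets, loss_rate, packet_kb=1.5, seed=42):
    """Toy model: the hub counts every packet it sends toward the home;
    the home router only counts the packets that actually arrive."""
    rng = random.Random(seed)
    hub_kb = packets * packet_kb
    delivered = sum(1 for _ in range(packets) if rng.random() > loss_rate)
    return hub_kb, delivered * packet_kb

hub, home = meter_readings(packets=1_000_000, loss_rate=0.02)
print(f"hub meter:  {hub / 1e6:.2f} GB")   # 1.50 GB
print(f"home meter: {home / 1e6:.2f} GB")  # roughly 2% lower
```

The point of the sketch is that ordinary packet loss only explains a divergence of a few percent – nothing remotely like the phantom 100 GB days that customers report.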

I’ve written about data caps before. It’s obvious that the caps are just a way for ISPs to charge more money – and it’s a gigantic amount of extra revenue if Comcast can bill an extra $10 per month to even a few percent of its 23 million customers. Anybody who understands the math behind the cost of broadband knows that a $10 charge for 50 GB of usage is almost 100% profit. It doesn’t cost the ISP anything close to $10 to deliver the first terabyte, let alone the small incremental amount on top of it. And there is certainly no cost at all if the Comcast meters are billing for phantom usage.
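The revenue math is easy to sketch. Assuming, hypothetically, that 2.5% of those 23 million customers pay a single $10 overage block each month:

```python
def annual_overage_revenue(customers, share_over_cap, fee=10, months=12):
    """Annual revenue if a given share of customers pays one fee each month."""
    paying = int(customers * share_over_cap)  # households billed per month
    return paying * fee * months

# Hypothetical: 2.5% of 23 million customers, one $10 block per month each
print(f"${annual_overage_revenue(23_000_000, 0.025):,}")  # $69,000,000
```

Nearly $70 million a year at near-100% margin, from a share of customers that (per the OpenVault numbers) keeps growing.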

I don’t know that there is any fix for this. But it’s clear that customers charged for exceeding data caps will switch to a new ISP at the first opportunity. The big ISPs wonder why many of their customers loathe them, and this is just one more way for a big ISP to antagonize its customers. It’s why every ISP that builds a fiber network to compete against a big cable company understands that it will almost automatically get 30% of the market from customers who have come to hate their cable ISP.

Is the FCC Disguising the Rural Broadband Problem?

Buried within the FCC’s February Broadband Deployment Report are tables implying that over 95% of American homes can now get broadband at speeds of at least 25/3 Mbps. That is drastically higher than the report from just a year earlier. The big change is that the FCC is now counting fixed wireless and satellite broadband when compiling the numbers. This leads me to ask whether the FCC is purposefully disguising the miserable condition of rural broadband.

I want to start with some examples from the FCC map that derives from the data supporting the annual report. I began with some counties in Minnesota that I’m familiar with. The FCC database and map claim that Chippewa, Lyon, Mille Lacs and Pope Counties all have 100% coverage of 25/3 Mbps broadband. They also claim that Yellow Medicine County has 99.59% coverage – the folks there must be wondering who falls in the tiny percentage without broadband.

The facts on the ground tell a different story. In real life, the areas of these counties served by the incumbent telcos CenturyLink and Frontier have little or no broadband outside of towns. Within a short distance of each town, and throughout the rural areas of the county, there is no good broadband to speak of – certainly nothing that approaches 25/3 Mbps. I’d love to hear from others who look at this map whether it tells the truth about where you live.

Let me start with the FCC’s decision to include satellite broadband in the numbers. In the rural areas of these counties practically nobody buys satellite broadband – many tried it years ago and found it a miserable experience. There are a few satellite plans that offer speeds as fast as 25/3 Mbps, but satellite broadband today has terrible latency, as high as 900 milliseconds. Anything over 100 milliseconds makes real-time computing hard or impossible. On satellite broadband you can’t stream video. You can’t have a Skype call. You can’t connect to a corporate WAN and work from home or connect to online classes. You will have problems staying on many web shopping sites. You can’t even make a VoIP call.

Satellite broadband also has stingy data caps that make it impossible to use as a home broadband connection. Most plans come with a monthly data cap of 10 GB to 20 GB, and unlike cellular plans where you can buy additional data, the satellite plans cut you off for the rest of the month when you hit the cap. And even with all of these problems, satellite is expensive – priced higher than landline broadband. Rural customers have voted with their pocketbooks: satellite is not broadband that many people are willing to tolerate.

Fixed wireless is a more mixed bag. There are high-quality fixed wireless providers delivering speeds as fast as 100 Mbps. But as I’ve written before, most rural fixed wireless delivers speeds far below this, and the more typical connection is somewhere between 2 Mbps and 6 Mbps.

A number of factors are needed to make a quality fixed wireless connection. First, the technology must be only a few years old, because older radios were not capable of reaching 25/3 speeds. Customers also need a clear line-of-sight back to the transmitter and must be within a reasonable distance of a tower. This means there is usually a significant number of homes in a wireless service area that can’t get any coverage due to trees or being behind a hill. Finally, and probably most importantly, the wireless provider needs a properly designed network and a solid backhaul data pipe. Many WISPs pack too many customers onto a tower and dilute the broadband. Many towers are fed by multi-hop wireless backhaul, meaning the tower doesn’t have enough raw bandwidth to deliver a robust customer product.

In the FCC’s defense, most of the fixed wireless data that feeds the database and map is self-reported by the WISPs. I am personally a big fan of fixed wireless when it’s done right, and I was a WISP customer for nine years. But there are a lot of WISPs who exaggerate in their marketing literature and tell customers they sell broadband at up to 25/3 Mbps when the actual product might deliver only a tiny fraction of those speeds. I have no doubt that these WISPs report those marketing speeds to the FCC as well, which leads to the errors in the maps.

The FCC should know better. In the counties listed above I would venture that practically no households can get a 25/3 fixed wireless connection – though there are undoubtedly a few. And I know people in these counties gave up on satellite broadband many years ago. My conclusion from the new data is that this FCC has elected to disguise the facts by claiming that households have broadband when they don’t. This is how the FCC lets itself off the hook for fixing the rural broadband shortages that exist in most of rural America. We can’t fix a problem that we won’t even officially acknowledge, and this FCC, for some reason, is masking the truth.

Data Caps Again?

My prediction is that we are going to see more stringent data caps in our future. Some of the bigger ISPs have data caps today, but for the most part the caps are not onerous. I foresee stricter caps being introduced as another way for big ISPs to improve revenues.

You might recall that Comcast tried to introduce a monthly 300 GB data cap in 2015. When customers hit that mark Comcast was going to charge $10 for every additional 50 GB of download, or $30 extra for unlimited downloading.

There was a lot of public outcry about those data caps and Comcast backed down from the plan under pressure from the Tom Wheeler FCC. At the time the FCC probably didn’t have the authority to force Comcast to kill the data caps, but the nature of regulation is that big companies don’t go out of their way to antagonize regulators who can cause them trouble in other areas.

To put that Comcast data cap into perspective: in September 2017, Cisco predicted that home downloading of video would increase 31% per year through 2021, and estimated that the average household was already downloading around 130 GB per month in 2017. You might think that means most people needn’t worry about data caps. But it’s easy to underestimate the impact of compound growth – at a 31% growth rate, the average household download of 130 GB would grow to 383 gigabytes by 2021, considerably over Comcast’s proposed data cap.
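You can verify the compound-growth arithmetic in a few lines. The extrapolation past 2021 in the second half is my own extension of Cisco’s 31% rate, not a Cisco prediction:

```python
def project_usage(start_gb, annual_growth, years):
    """Compound growth: start * (1 + rate)^years."""
    return start_gb * (1 + annual_growth) ** years

# Cisco's numbers: 130 GB/month in 2017, growing 31% per year through 2021
print(f"{project_usage(130, 0.31, 4):.0f} GB")  # 383 GB

# My own extension of the same rate: when does the *average* home pass 1 TB?
years = 0
while project_usage(130, 0.31, years) < 1000:
    years += 1
print(2017 + years)  # 2025
```

If the growth rate holds, even the average household – not just the cord cutters – crosses today’s 1 TB cap within a few years of 2021.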

Even now there are a lot of households that would be over that cap. It’s likely that most cord cutters use more than 300 GB per month – and it can be argued that Comcast’s data caps would punish those who drop their video. My daughter is off to college now and our usage has dropped, but we got a report from Comcast when she was a senior saying we used over 600 GB per month.

So what are the data caps for the largest ISPs today?

  • Charter, Altice, Verizon and Frontier have no data caps.
  • Comcast moved its data cap to 1 terabyte, with $10 for each extra 50 GB or $50 monthly for unlimited download.
  • AT&T has some of the stingiest data caps. The cap on DSL is 150 GB, on U-verse 250 GB, and on 300 Mbps FTTH 1 TB; gigabit service is unlimited. AT&T charges $10 per extra 50 GB.
  • CenturyLink has a 1 TB cap on DSL and no cap on fiber.
  • Cox has a 1 TB cap with $30 for an extra 500 GB or $50 unlimited.
  • Cable One has no overage charge but largely forces customers who exceed the caps to upgrade to more expensive data plans. Their caps are stingy – the cap on a 15 Mbps plan is 50 GB.
  • Mediacom has perhaps the most restrictive data caps – the cap on a 60 Mbps product is 150 GB, and on 100 Mbps it’s 1 TB. The charge for exceeding the cap is $10 per extra 50 GB or $50 for unlimited.

Other than those at AT&T, Mediacom and Cable One, none of the caps sound too restrictive.
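To see what that difference means in dollars, here’s a sketch comparing what a 600 GB/month household (like mine was) would pay at a few of the ISPs listed above. It’s simplified – the unlimited add-on options are ignored, and all the fees are treated as the $10-per-50 GB schedule described earlier:

```python
import math

# (cap in GB, or None for no cap; fee per 50 GB block) -- from the list above
plans = {
    "Charter":          (None, 0),
    "Comcast":          (1000, 10),
    "AT&T DSL":         (150, 10),
    "Mediacom 60 Mbps": (150, 10),
}

def monthly_overage(usage_gb, cap_gb, fee_per_block):
    if cap_gb is None or usage_gb <= cap_gb:
        return 0
    return math.ceil((usage_gb - cap_gb) / 50) * fee_per_block

# What a 600 GB/month household would owe at each ISP
for name, (cap, fee) in plans.items():
    print(f"{name:18} ${monthly_overage(600, cap, fee)}")
```

A household that pays nothing extra at Charter or Comcast would owe $90 a month on the AT&T DSL or Mediacom 60 Mbps caps – which is why those three stand out.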

Why do I think we’ll see stricter data caps? All of the ISPs are looking forward just a few years and wondering where they will find the revenues to meet Wall Street’s demand for ever-increasing earnings. The biggest cable companies are still growing broadband customers, mostly by taking them from DSL. But they understand that the US broadband market is approaching saturation – much as has happened with cellphones. Once every home that wants broadband has it, these companies are in trouble, because bottom-line growth for the last decade has been fueled by the growth of broadband customers and revenues.

A few big ISPs are hoping for new revenues from other sources. For instance, Comcast has already launched a cellular product and is seeing good success with security and smart home services. But even Comcast will be affected when broadband sales inevitably stall – and other ISPs will feel the pinch sooner.

ISPs only have a few ways to make more money once customer growth has stalled, the primary one being higher rates. We saw some modest increases in broadband rates earlier this year – notable because rates had been flat for many years. I fully expect we’ll start seeing sizable annual increases in broadband rates, which go straight to the bottom line for ISPs. The impact of broadband rate increases is major for these companies – Comcast and Charter, for example, make an extra $250 million per year from a $1 increase in broadband rates.
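Working backward from that figure is straightforward – a $250 million annual gain from a $1 monthly increase implies roughly 21 million broadband subscribers, which is in the right range for either company:

```python
def annual_gain(subscribers, monthly_increase):
    """Extra annual revenue from a flat monthly rate increase."""
    return subscribers * monthly_increase * 12

# Backward: $250M/year from a $1/month hike implies ~21M subscribers
implied_subs = 250_000_000 / 12
print(f"{implied_subs / 1e6:.1f} million subscribers")  # 20.8 million subscribers

# Forward: 21 million subscribers at +$1/month
print(f"${annual_gain(21_000_000, 1):,}")  # $252,000,000
```

That’s why a string of small annual increases is such an attractive lever: each dollar compounds across the entire subscriber base with essentially no added cost.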

Imposing stricter data caps can be as good as a rate increase for an ISP. They can justify it by saying they are only charging more for those who use the network the most. As earnings pressure mounts on these companies, I can’t see them passing up such an easy way to increase earnings. In most markets the big cable companies are a near monopoly, and consumers who need decent speeds have fewer alternatives as each year passes. Since the FCC has now walked away from broadband regulation, there will be no regulatory hindrance to the return of stricter data caps.