Why are US Broadband Prices so High?

I’ve wondered for years why broadband prices are higher in the US than in the rest of the world. The average price in other industrialized countries is significantly lower. In France broadband averages $31, Germany is $35, Japan is $35, South Korea is $33, and the UK is $35. The average price of broadband in the US is approaching $70, so we’re paying roughly twice the price of other countries.
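The arithmetic behind that price gap is easy to sanity-check. Here’s a quick sketch using the per-country figures quoted above (the numbers are as cited in this post, not from any authoritative dataset):

```python
# Average the quoted monthly broadband prices for the peer countries
# and compare to the quoted US price. Figures are from the paragraph
# above, not an authoritative dataset.
peer_prices = {"France": 31, "Germany": 35, "Japan": 35, "South Korea": 33, "UK": 35}
us_price = 70

peer_avg = sum(peer_prices.values()) / len(peer_prices)
print(f"Peer-country average: ${peer_avg:.2f}")         # $33.80
print(f"US price premium: {us_price / peer_avg:.2f}x")  # about 2.07x
```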

Thomas Philippon tackles this question in his new book The Great Reversal: How America Gave Up on Free Markets. He’s an economist at NYU who moved to the US in the 1990s but has kept an eye on Europe. The book covers far more than broadband prices; Philippon also examines other major industries like airlines, pharmaceuticals, and the US food chain.

He says something that was a wake-up call to me. Go back 30-40 years and the situation was reversed. At that time the US had some of the lowest prices in the world for things like telecom, airline tickets, pharmaceuticals, and food – and not just a little cheaper. US prices were 30-40% lower than in Europe. In just a few decades the situation has completely reversed, and US prices in major industries are now much higher than in Europe.

How did this happen? He attributes the cause almost entirely to what he calls corporate concentration. In every one of the industries where US prices have climbed, numerous large corporate mergers have had the net effect of reducing competition. As fewer and fewer giant companies control a market there is less competition. One result of corporate concentration is the ability of industries to squash regulation through corporate lobbying – and weaker regulation inevitably leads to higher profits and higher prices.

It’s not hard to trace the history of consolidation through any of the major industries in this country. Since the readers of this blog are telecom folks, consider the telecom landscape in 1990:

  • At that time the Baby Bell companies were still separate from AT&T.
  • There was a vigorous CLEC market starting to grow led by companies like MCI.
  • There were probably a hundred different medium-sized regional cable companies.
  • There were fewer cellular companies due to the limited number of spectrum licenses granted, but smaller regional carriers still provided strong competition.
  • There were dozens of thriving manufacturers of telecom electronics.
  • In 1990 we had vigorous regulation, and at the state level there was still a lot of telecom rate regulation.

In just thirty years that picture changed. Most of the Baby Bells came back together under the AT&T umbrella. Comcast and Charter went on wild buying sprees and consolidated most of the medium-sized cable companies. Telcos purchased and neutered their competition, like the purchase of MCI by Verizon. Comcast and AT&T went on to merge with giant content providers to further consolidate the industry supply chain.

Telecom regulation has been all but killed in the country. This is almost entirely at the bidding of lobbyists. The current FCC went so far as to write themselves out of regulating broadband. All of these events resulted in US broadband that now costs twice as much as the rest of the industrialized world.

Meanwhile, Europe took the opposite approach. In 1990, regulation in Europe was local to each country and concentrated on protecting local industries in each country – and that led to high prices. However, after the creation of the European Union in 1993, regulators adopted the philosophy of promoting competition in every major industry. From that point forward, European regulators made it extremely difficult for competing corporations to merge. Regulators took special care to protect new market entrants to give them a chance to grow and thrive.

The regulatory policies in the US and Europe have completely flipped since 1990. The US was pro-competition in the 90s, as well-evidenced by the Telecommunications Act of 1996. Today’s FCC is working hard to eliminate regulation. European regulators now put competition first when making decisions.

It’s never too late for the US to swing back to being more competitive. However, for now, the monopolies are winning, and it will be hard to break their hold on politicians and regulators. But this is something we’ve seen before. At the turn of the twentieth century, big corporations had a stranglehold on the US. Monopolies inevitably abuse their market power, and eventually there is enough public backlash to push the government to re-regulate industries.

CenturyLink Ready to Launch Gigabit Broadband in Springfield, MO

It’s rare to be surprised by events in the telecom world. The announcement last summer that CenturyLink would become an ISP on a city-owned fiber network in Springfield, MO was one of the most surprising things I’ve heard since the announcement years ago that Google was going to become a gigabit ISP. The joint venture has been progressing, and CenturyLink says it should be adding customers in the community this spring.

The partnership between the city and CenturyLink is interesting:

  • CenturyLink has agreed to lease the network for 15 years at a payment that made the city comfortable enough to build the network. The city says it won’t have to raise electric rates since the lease revenue stream justifies the cost of the new $120 million fiber expansion.
  • The city is providing dark fiber and CenturyLink will provide all of the electronics. There have been no public announcements saying which party pays for the fiber drops. Since this is being touted as a smart-grid expansion, it would make sense for the city to own the drops.
  • The arrangement is described as non-exclusive, meaning that other ISPs are free to serve on the network. The announcements don’t say if CenturyLink gets a head-start over other ISPs through some period of exclusivity before open access kicks in. That’s been the case in similar arrangements.
  • CenturyLink is offering $65 gigabit broadband ‘for life’ with a guarantee that the price will never be increased. Speeds are advertised as ‘up to 940 Mbps’. In other CenturyLink markets the gigabit product requires paperless billing and prepayment with a credit card or bank debit. CenturyLink charges $5 for an optional WiFi modem.

There are a few other well-known arrangements like this in the industry. This is similar to the Google Fiber arrangement in Huntsville, Alabama, and to the Ting arrangements in Westminster, Maryland and Charlottesville, Virginia. What’s unusual and surprising about this deal is that it involves one of the big incumbent telcos. However, CenturyLink is not the incumbent in Springfield and enters the market purely as an outside ISP. CenturyLink will be competing side-by-side with AT&T – the first instance of two large incumbent telcos competing in a residential market. The other competitor, and the incumbent cable provider in Springfield, is Mediacom.

There are some in the industry touting this as a new paradigm for bringing gigabit fiber – but I’m not sure that’s so. As with any business model, all of the facts and numbers must line up for a market to be a good target for overbuilding with fiber. It’s possible that Springfield has unique characteristics that make this model hard to replicate in most other places.

Springfield owns a municipal electric utility and the utility decided years ago to build fiber to serve its own needs and to bring fiber to businesses in the city. The city started this new venture already owning 700 miles of fiber – much of which will likely be the backbone for building the last-mile for this venture. Springfield is also touting this as a smart grid initiative, meaning the electric utility is likely picking up a piece of the cost of the new fiber construction. There is a good chance that the math would not look nearly so favorable for a city without an electric utility – because in that case the venture would be starting with no existing fiber and the new fiber venture would have to absorb 100% of the cost of the new construction. I’ve looked at this lease model for cities that don’t own existing fiber or an electric utility and the math is often not pretty.

Don’t read those last statements as a criticism of the fiber lease model, but rather just as a recognition that all of the financial factors must align just right for this kind of venture to work. Any city owning an electric utility ought to do the math and consider this model. Cities with low construction costs for fiber might also be good candidates.

The surprising part of this arrangement is that this is being done by CenturyLink. This is an incumbent telco that is well known throughout rural America for operating lousy copper networks. The company has been ignoring the customers in rural markets, and CenturyLink customers living in rural Missouri can’t be thrilled to hear that the company will be offering gigabit fiber in a new market while continuing to ignore their broadband plight. CenturyLink is not going to sink a lot of capital in Springfield, but it’s paying for the cost of electronics and installation.

I have to give CenturyLink credit for tackling this venture. The company was building fiber-to-the-home networks before Jeff Storey, the new CEO, put the kibosh on spending capital on projects earning ‘infrastructure returns’. The FTTP business is an economy-of-scale business, and CenturyLink can take advantage of the staff and platforms it already has in place to operate efficiently in Springfield. Since this is dark fiber, the company can still do everything the CenturyLink way – an important factor for a big telco. We’ll have to wait to see if this is a new business line for CenturyLink or if Springfield is a unique case.

Sharing Grant-funded Fiber

The FCC misses no opportunity to talk about how much it supports rural broadband, so hopefully it will take advantage of an opportunity to open up a lot of new fiber in rural America. Later this year the FCC will award $9 billion through the 5G Fund, which is intended to bring better cell phone coverage to rural areas. That funding will go to cellular carriers.

A lot of the 5G Fund is going to be used to build fiber to rural cell towers and the FCC should make any such middle-mile fiber available to others at affordable rates. One of the biggest impediments to building last-mile networks in remote areas is still the absence of fiber backhaul. If the FCC is going to pay to run fiber to rural areas, then it only makes sense they would make such fiber available to last-mile ISPs.

The big cellular carriers will say that this is a burden they don’t want to bear, but that is bosh. Big companies like Verizon and AT&T are already among the largest sellers of fiber transport in the country, so they have everything needed to sell transport on these new fiber routes. The cellular companies will already be obligated to maintain the new fiber routes, so carrying additional traffic on the fibers doesn’t increase ongoing costs. Since the fiber will be free to the cellular carriers, the transport rates ought to be set low – any revenue derived from these fibers would still be pure gravy for the cellular companies.

There will be smaller cellular carriers in the auction, and I would expect most of them to already be planning to sell transport on any new fiber routes. But not all of the smaller carriers will do so, so the FCC should make this mandatory – as it should for any middle-mile fiber route funded from federal coffers.

States should also adopt this same policy. I’ve seen state grants go towards middle-mile fiber that was not made available to other carriers at affordable rates. Middle-mile fiber subsidized by the government should always be made available to others, and at subsidized rates that recognize the government contribution towards paying for the fiber.

I don’t think the same thing should be true for last-mile fiber. Most grant funding today is being used to build last-mile fiber in areas of low density. Even with grant funding, many of these last-mile projects barely pay for themselves. It would make no sense to allow competitors into last-mile fiber, because doing so might bankrupt the ISP that won the grant to build to a remote area.

The FCC mandated the sharing of middle-mile fiber built with the stimulus grants a decade ago. Many of those middle-mile networks have been leveraged to enable last-mile broadband projects that might otherwise never have materialized. But there are middle-mile projects from that program that didn’t follow the rules, like the middle-mile network in West Virginia that was basically handed to Frontier to use and charge for as they wish.

The big carriers have a poor record of sharing fiber with competitors. The Telecommunications Act of 1996 mandated that the big telcos make excess dark fiber available to others with rates set at incremental cost. While some persistent ISPs have been able to lease dark fiber under those rules, the big telcos have worked hard to make it too difficult for somebody to buy. The telcos have also convinced the FCC over the years to change the rules to make it harder to buy dark fiber.

If this new batch of fiber is made available to others there must be rules. Without guidelines, the big telcos will declare that they need all of the fiber strands being built, even if they only use two fibers out of a 24-fiber bundle. The FCC rules should include guidelines for setting a reasonable number of spare and reserve fibers.

The rules for the 5G fund have not yet been finalized, and hopefully, the FCC will do the right thing. These new fiber routes are going to some of the most remote places in the country and not all middle-mile routes will be of any use to others. Even if only one out of ten of the fiber routes built with the 5G Fund is used to create last-mile networks, the 5G Fund will have accomplished more than just improving rural cellular coverage.

FCC Reports on Poor Rural 4G Coverage

The FCC released a report in January that shows that the cellular networks of the major carriers underperform in rural America. This is no news to anybody who lives and works in a rural county. The tests allowed the FCC to conclude that the national coverage maps for 4G LTE are largely fiction in rural America.

The FCC conducted 25,000 tests in twelve states to verify the coverage maps of Verizon, T-Mobile, and US Cellular. The majority of tests were done in Arizona, New Mexico, Oklahoma, Vermont, Alabama and Montana. Speeds were tested from both stationary locations and in a moving vehicle. AT&T and Sprint weren’t tested because the maps they provided to the FCC showed only the combined upload and download speeds – something that is meaningless to test. The other three carriers reported what they claimed were actual upload and download speeds, shown separately.

The FCC undertook the testing in response to numerous complaints filed in the FCC’s docket for the Mobility Fund Phase II grants. The intention of this fund was to improve 4G coverage in rural areas with little or no cellular coverage. Smaller cellular carriers and the public complained to the FCC that the cellular data coverage claimed by the large cellular carriers was overstated. Small cellular carriers worried that the overstatements would stop them from asking for funding for areas that need upgrading. Local governments were worried that the overstated coverage meant that their areas wouldn’t see upgrades and they’d be doomed for another decade with poor cellular coverage.

The tests were conducted in areas where the carrier maps showed cellular data coverage. The results of the testing were rather bleak. 16% of all calls tried on Verizon were unable to make a data connection. The failures to connect were 23% on T-Mobile and 38% on US Cellular.

Overall, the three carriers met the FCC’s minimum requirement of 5 Mbps download for 4G only 62% of the time. That was 64% on Verizon, 63% on T-Mobile and only 45% for US Cellular. However, even within those reported results, the testers said that they experienced intermittent dropped calls on all three networks.

The FCC responded to these tests by revamping the future reporting of cellular data speeds, asking for far more granular speed data by location. The FCC also convened a group of experts to recommend how to better test cellular speeds. Finally, the FCC issued an Enforcement Advisory on the accuracy of the cellular data on Form 477. That’s a step short of issuing fines and will likely have little impact on the carriers. It doesn’t appear that any of them have pared back their national coverage maps, which still claim coverage across most of rural America.

There are significant real-life implications of overstated cellular coverage maps. Just like with the RDOF grant program that will rely on faulty maps of landline broadband, poor maps of cellular coverage mean that many areas with overstated cellular coverage won’t be eligible for federal grants to help fix the problem.

The big downside is that many rural households have no 4G LTE coverage, or at best have slow and intermittent 4G data available. These are often the same areas where landline broadband is slow or non-existent. As hard as it is to live without good cellular coverage or good landline broadband, homes without either are cut off from the rest of the world. To make matters worse, there is still 3G coverage in a lot of rural America, and all of the carriers have plans to shut it down over the next few years.

The FCC has revamped the Mobility Fund II grant program by doubling the funding to $9 billion and renaming it the 5G Fund. That’s a silly name because the goal of the program is to bring at least minimal 4G coverage to rural areas, not 5G. Remember that the grant program was originally aimed only at areas that showed no coverage by the carriers. Ideally the FCC would also direct funding to the many areas where the carriers were lying about their coverage – but it’s doubtful that they have any meaningful maps of real 4G coverage.

How FCC Policies Hurt Communities

I was recently looking at one of the counties where the winner of the CAF II reverse auction was Viasat, a satellite broadband provider. There are many other rural counties with an identical outcome. As I thought about these counties, I came to realize that a series of FCC policies and decisions have hurt these counties in their search for better broadband. There is no single FCC action that hurt them, but a cascading series of individual decisions have made it harder for them to find a broadband solution.

The first FCC decision that created the current situation came when the current FCC declined to consider increasing the definition of broadband beyond 25/3 Mbps. That definition was set in 2015, and there is ample evidence on file in FCC proceedings that 25/3 Mbps is already an obsolete definition of broadband.

The most recent evidence comes from OpenVault. The company just released its Broadband Industry Report for 4Q 2019 that shows the average subscribed speed in the US grew from 103 Mbps in 2018 to 128 Mbps in 2019. That result is largely being driven by the cable companies and the fiber providers that serve more than 2/3 of all of the broadband customers in the country. The FCC is stubbornly sticking to the 25/3 Mbps definition of broadband even as a large majority of households in the country are being given speeds greater than 100 Mbps.

The decision to stick to the outdated 25/3 Mbps then created a second problem for rural America when the outdated FCC speed definition is used to award federal grants. The FCC decided in the CAF II reverse auction grants that any technology that met the 25/3 Mbps speed was acceptable. The FCC boxed themselves in since they couldn’t set a higher speed threshold for grants without admitting that the 25/3 Mbps threshold is inadequate. That auction awarded funding for technologies that can’t deliver much more than 25 Mbps. What’s worse is that the winners don’t have to finish building new networks until 2025. When the FCC blessed the use of the 25/3 threshold in the reverse auction they also blessed that 25/3 Mbps broadband will still be adequate in 2025.

The next FCC decision that is hurting these specific counties is when the FCC decided to allow satellite broadband companies to bid for scarce federal broadband grant monies. The FCC probably thought they had no choice since the satellite providers can meet the 25/3 Mbps speed threshold. This was a dreadful decision. Satellite broadband is already available everywhere in the US, and a grant given to satellite broadband brings no new broadband option to a rural area and only pads the bottom line of the satellite companies – it doesn’t push rural broadband coverage forward by a millimeter.

Finally, the FCC recently rubbed salt in the wound by saying that areas that received a previous state or federal broadband grant won’t be eligible for additional federal grants out of the upcoming $20.4 billion RDOF grant program. This means that a county where a broadband grant was given to a satellite provider is ineligible for grant money to find a real broadband solution.

Such counties are possibly doomed to be stuck without a broadband solution due to this chain of decisions by the FCC. I’m sure that the FCC didn’t set out to hurt these rural counties – but their accumulated actions are doing just that. Each of the FCC decisions I described was made at different times, in reaction to different issues facing the FCC. Each new decision built on prior FCC decisions, but that culminated in counties with a real dilemma. Through no fault of their own, these counties are now saddled with satellite broadband and a prohibition against getting additional grant monies to fund an actual broadband solution.

A lot of this is due to the FCC not having a coherent rural broadband policy. Decisions are made ad hoc without enough deliberation to understand the consequences of decisions. At the heart of the problem is regulatory cowardice where the FCC is refusing to acknowledge that the country has moved far past the 25/3 Mbps broadband threshold. When 2/3 of the country can buy speeds in excess of 100 Mbps it’s inexcusable to award new grant monies for technologies that deliver speeds slower than that.

It’s obvious why the FCC won’t recognize a faster definition of broadband, say 100 Mbps. Such a decision would instantly classify millions of homes as not having adequate broadband. There is virtually no chance that the current FCC will do the right thing – and so counties that fell through the regulatory cracks will have to find a broadband solution that doesn’t rely on the FCC.

Bad Customer Service as a Profit Center

There was a December article in Fast Company that spelled out what I’ve long suspected – that many big companies have lousy customer service on purpose – they want to make it hard for customers to get refunds or to drop service. The article was written by Anthony Dukes of USC and Yi Zhu of the University of Minnesota. The article is worth reading if you have the time to click through all of the links, which elaborate numerous ways that big companies abuse their customers.

This certainly rings true for the big ISPs. I harken back to the days of AOL, which was famous for making it a challenge to drop their service. Comcast has always had a reputation of making it hard for customers to break a bundle or leave the company for another ISP.

The article cites some interesting statistics, including a 2013 study showing that the average home spent 13 hours per year disputing charges with customer service. That’s nearly two workdays of time, and it’s little wonder that people hate to call customer service.

Customer service at the big telcos and cable companies was never great, but in my time in the industry it’s gotten worse – the big ISPs are now rated at the bottom for customer satisfaction among all corporations. I think the big change came in the last few decades when the big ISPs got enamored with win-back programs – offering customers incentives to stop them from dropping service. Unfortunately, the ISPs tied employee compensation to the percentage of win-backs, and there have been numerous articles published about ISP employees who would not let somebody drop service and who would keep a customer on the phone for an hour to convince them not to leave.

ISP customer service also took a downward spin when every call with a customer turned into a sales call trying to sell more services. Unfortunately, these sales efforts seem to result in new revenues, but it’s irksome to customers to have to listen to several sales pitches to accomplish some simple customer service task.

Dukes and Zhu claim that a lot of customer service centers are structured to dissuade customers from dropping service. They say that long hold times are on purpose to get customers to give up. They cite some customer service centers where the people answering the first call from customers have no authority to change a customer’s billing – only customers willing to fight through to talk to a supervisor have a chance at fixing a billing problem. They claim that chatbots are often set up in the same way – they can sound helpful, but they often can’t make any changes.

They also believe that companies are getting sophisticated and use different tactics for different customers. Studies have shown that women get annoyed faster than men in dealing with poor customer service. Research has shown that some demographics, like the elderly, are easier to dissuade from getting a refund.

Smaller ISPs understand the poor customer service from the big ISPs and most of them strive to do better. However, I know of smaller ISPs with aggressive win-back programs or who use every call as a marketing opportunity, and such ISPs have to be careful to not fall into the same bad habits as the big ISPs.

I find it amusing that one of the many reasons cited for breaking up the Bell System was to improve customer service. Regulators thought that smaller regional companies would be nimbler and do a better job of interacting with customers. This turned out not to be true. In fact, I consider my interactions with monopolies to be the easiest. I can’t recall a call I’ve ever had with an electric or water utility that wasn’t completed quickly and efficiently. Perhaps ISPs ought to strive to be more like them.

Challenging the FCC Broadband Maps

I’ve written many times about the absurdity of using the FCC mapping data for identifying areas with or without broadband. I’ve lately been looking at the FCC mapping data in West Virginia and New Mexico – two of the states with the worst broadband coverage in the country – and the FCC maps are atrocious. I see counties where the claimed broadband coverage in the FCC maps is wrong for more than half of the geographic area.

Unfortunately, the FCC is about to award $20.4 billion in RDOF grants later this year based solely on these dreadful maps. Luckily, there are other grant programs that allow grant applicants to challenge the FCC data. This includes the USDA ReConnect grants and many of the state grant programs.

One of the few ways to challenge the FCC maps is with speed tests. Anybody undertaking such a challenge needs to be aware that the incumbent telcos might challenge your speed test results, and unfortunately, some of their criticisms will be right. This means that anybody challenging the FCC maps has to take some steps to maximize the effectiveness of speed tests. Here are a few aspects of administering speed tests that should be considered.

  • A speed test needs to distinguish between cellular and landline connections. Rural folks with no broadband connection or those using cellular for home broadband are going to take the test with their cellphone. While such results are interesting, cellular speed tests can’t be mixed into a challenge of landline broadband coverage.
  • Everybody needs to use the identical speed test because each speed test measures speed using a different algorithm. Never use a speed test from the incumbents – it might be tuned to show overly favorable results.
  • A challenge can be most effective if it can get feedback from folks with no broadband available at their home. You need to somehow solicit and include results from folks that can’t take the speed tests.
  • You also should be aware a speed test sometimes doesn’t work for somebody with really slow broadband or high latency. We recently sat on the phone with somebody using satellite broadband and they couldn’t get the speed test to complete, even after many attempts.
  • The biggest challenge is in mapping the results. If you map the results so precisely that they can be overlaid on individual homes in Google Earth, then you have given an incumbent ISP the chance to challenge the test results. They can likely identify homes where they aren’t the ISP, or homes that have broadband that meets the FCC speed thresholds, meaning that slow speed test results might be due to poor WiFi or some other reason. Ultra-precise mapping might also violate the privacy of the people taking the speed test. This is an issue that many state speed test programs have wrestled with – some of them take such care to mask the precise location of the data that their final product can’t be used to challenge the FCC maps. On the other hand, if speed test results are summarized by Census block, then the results incorporate the same kinds of problems as the FCC maps. Probably the best approach is to embed the final results in a pdf with low enough resolution not to identify individual homes.
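One way to strike the privacy balance described in the last bullet is to aggregate individual results into coarse geographic cells before anything is published. Here is a minimal sketch with made-up coordinates and speeds – the grid size, data layout, and numbers are all illustrative, not any state program’s actual method:

```python
from collections import defaultdict
from statistics import median

# Hypothetical speed test results: (latitude, longitude, download Mbps).
# A real program would collect these from a common speed test platform.
results = [
    (37.2101, -93.2923, 4.1),
    (37.2133, -93.2917, 5.8),
    (37.2812, -93.3405, 112.0),
    (37.2815, -93.3412, 96.5),
]

CELL = 0.01  # grid size in degrees, roughly 1 km; tune as needed

def cell_key(lat, lon, cell=CELL):
    """Snap a coordinate to the southwest corner of its grid cell."""
    return (round(lat // cell * cell, 4), round(lon // cell * cell, 4))

cells = defaultdict(list)
for lat, lon, mbps in results:
    cells[cell_key(lat, lon)].append(mbps)

# Publish only per-cell counts and medians, never raw coordinates.
for key in sorted(cells):
    speeds = cells[key]
    print(key, f"tests={len(speeds)}", f"median={median(speeds)} Mbps")
```

The published output shows which areas have slow service without letting anyone tie a result back to an individual home.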

There is one other way to map broadband coverage. An experienced field technician or engineer can drive around an area and identify every broadband asset in the field. They can precisely identify where the cable TV networks end, down to the house. They can identify field DSLAMs that generate DSL signals out of rural cabinets – and they can often precisely identify the flavor of DSL and know the maximum speed capability of a given unit. They can identify the location and height of wireless transmitters and map out the likely coverage areas. This kind of effort is most effective at identifying where there is no broadband. A good technician can make a decent map of the likely maximum broadband speeds available in a given area – something rarely achieved on most rural networks. This kind of challenge could be expensive and time-consuming, and I’ve never seen a challenge done this way. But I know engineers and technicians capable of making highly accurate maps.

Communities can tackle speed tests – they can get households to take the same speed test, such as the speed test from Ookla, and then match and map the results using GIS data. This can be a lot of work. Mapping can also be provided by many telecom engineering companies. One of the lowest-cost solutions is a speed test from Neo Partners that administers the test and overlays the results automatically on Google Maps.

Even if you aren’t challenging a grant, communities ought to consider speed tests to better understand the broadband in their community. As an example, I worked for a city where the speed tests showed that one neighborhood had far slower speeds than the rest of the city – something the city hadn’t known before the speed test. We’ve done speed tests that showed that the incumbent was delivering more than the advertised speed – again, something worth knowing.

Broadband Consumption Continues Explosive Growth

OpenVault just released its Broadband Industry Report for 4Q 2019, which tracks the way that the US consumes data. The results are as eye-opening as OpenVault’s reports of the last few years. OpenVault has been collecting broadband usage data for more than ten years.

As usual, the OpenVault statistics are a wake-up call for the industry. The most important finding is that the average monthly data consumed by households grew by 27% from 2018 to 2019 – in the fourth quarter of 2019 the average home used 344 gigabytes of data, up from 275 gigabytes a year earlier. Note that consumption is a combination of download and upload usage, with most of it being downloads.

For the first time, the company compared homes with unlimited data plans to those that have plans with data caps. They reported that homes with no data caps used 353 gigabytes per month while homes with data caps used 337 gigabytes per month. That statistic would suggest that homes with data caps try to curb their usage to avoid overage charges.

Interestingly, the median usage was significantly different than the average. The median is the midpoint – half of US homes used more than the median of 191 gigabytes per month and half used less. In looking at the numbers, I have to suppose the median is so much lower than the average because of the many homes on slow DSL that can’t consume a lot of broadband.

The report also looks at power users – homes that consume a lot of broadband. OpenVault reports that nearly 1% of homes now use 2 terabytes per month and 7.7% use over 1 terabyte per month. (A terabyte is 1,000 gigabytes.) The percentage of homes using over 1 terabyte climbed from 4% a year earlier. This statistic is important because it shows a quickly growing number of homes that will be hitting the 1 terabyte data caps of ISPs like Comcast, AT&T, Cox, and CenturyLink. I clearly remember Comcast saying just a few years ago that almost no homes had an issue with its data caps – that can no longer be true.

Homes are starting to buy 1 gigabit broadband when it’s available and affordable. 2.8% of homes in the country now subscribe to gigabit speeds, up 86% from the 1.5% of homes that bought gigabit in 2018.

54% of homes now purchase broadband plans with speeds of 100 Mbps or faster, and another 23.6% of homes subscribe to broadband between 50 Mbps and 75 Mbps. That means nearly 78% of homes subscribe to data plans of 50 Mbps or faster. The average subscribed speed grew significantly since 2018, from 103 Mbps to 128 Mbps. These subscriber statistics should shame the FCC for deciding to stick with the 25/3 Mbps definition of broadband – the agency is clearly setting a target speed for rural America that is far behind the reality of the marketplace.

OpenVault made one comparison to Europe that shows we consume a lot more broadband here – average consumption in 4Q 2019 was 344 gigabytes in the US versus 196 gigabytes in Europe.

As OpenVault’s statistics have shown in the past, the demand for broadband is not abating – it continues to explode. An annual 27% increase in consumption means that broadband demand continues to double roughly every three years. If that growth rate is sustained, our networks will need to carry 8.6 times more data than today within a decade. That’s enough to keep network engineers up at night.
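The compound-growth arithmetic behind those claims is easy to verify – the 27% growth rate is the only input, and both the roughly three-year doubling time and the 8.6 multiplier fall out of the math:

```python
import math

annual_growth = 0.27  # 27% annual increase in household consumption

# Doubling time: solve (1 + g)^t = 2 for t.
doubling_years = math.log(2) / math.log(1 + annual_growth)
print(f"doubling time: {doubling_years:.1f} years")  # about 2.9 years

# Compounding 27% growth over nine years yields the 8.6x multiplier.
multiplier = (1 + annual_growth) ** 9
print(f"multiplier after nine years: {multiplier:.1f}x")
```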

Broadband in China

For years I’ve been hearing how we are losing the broadband battle with China, so I decided to take a look at the current state of broadband in the country. The China Internet Network Information Center (CNNIC) publishes statistics about the state of broadband in the country, and I used the Statistical Report on Internet Development in China from August 2019 in writing this blog.

Here are some of the more interesting statistics about the state of broadband in the country:

  • China is a lot larger than the US, with a current population just below 1.4 billion compared to an estimated US population of around 327 million.
  • As of June 2019, China had 854 million people connected to the web in some manner, for an overall Internet penetration based on population of 61.2%. It’s not easy to compare that statistic to the US since we track Internet usage using subscriptions to households.
  • China is still rapidly adding people to the internet. In the first six months of 2019, the country added 26 million new Internet users.
  • Chinese users interface with the internet through a variety of devices, with the following statistics for June 2019 (users and the percentage of all Internet users):

Cellphone   847 million   99%
Desktop     394 million   46%
Laptop      308 million   36%
TV          283 million   33%
Tablet      242 million   28%

  • As of June 2019, China had 396 million users on fiber-to-the-home. China is adding fiber faster than the US and there were over 67 million customers added for the year ending in June 2019.
  • Chinese speeds for landline connections averaged 31.3 Mbps in June 2019, up 25% since 2018. Mobile speeds in 2019 averaged 23 Mbps, up 7% from 2018.
  • Like the US, China has a rural digital divide. In 2018 the country had 225 million rural Internet users representing a 39% penetration. Urban Internet users were 630 million, a 77% penetration. There are 347 million rural Chinese without access to the Internet, almost 25% of all citizens in the country. It’s hard to compare that statistic to the US since the FCC does such a lousy job of counting households with broadband.
  • China is working to close the rural digital divide and added 3 million rural Chinese to the Internet in the first half of 2019. However, much like here, that rate of growth is glacial – at that pace it will take 36 years for rural penetration to catch up to today’s urban penetration.
  • The Chinese are heavy users of instant messaging with 96.5% of Internet users using messaging in 2018.
  • It’s important to remember that Chinese web users are monitored closely and live behind what the west calls the Great Firewall of China. The government tracks how people use broadband in detail that isn’t gathered directly in the US, and reports the following usage among Internet users:

Watch online video     88.8%
Use online news        80.3%
Shop online            74.8%
Online bill payment    74.1%
Order meals online     49.3%
Car hailing services   39.4%

  • China’s mobile data traffic is growing even faster than in the US. In the first half of 2018, Chinese mobile networks carried 266 petabytes of traffic; by the first half of 2019 that traffic had more than doubled to 554 petabytes. China’s cellular data usage doubled in one year, while here it’s taking two years to double. The numbers are huge – a petabyte equals one million gigabytes.
  • The average Chinese broadband user spent 27.9 hours per week online in 2019.
  • The CNNIC tracks why people don’t use the internet. 45% don’t have access to broadband; 37% lack the skills to use broadband; 15% don’t have computers; 11% say they have no need. The interesting thing about the list in China is that nobody said they couldn’t afford Internet access.
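The 36-year figure for the rural digital divide can be reproduced from the CNNIC numbers above. This short calculation assumes the 3 million adds per half-year continue and that the target stays fixed at today’s 77% urban penetration:

```python
rural_users = 225_000_000        # rural Internet users (2018)
rural_penetration = 0.39         # current rural penetration
urban_penetration = 0.77         # urban penetration to match
added_per_half_year = 3_000_000  # rural users added in the first half of 2019

rural_population = rural_users / rural_penetration   # about 577 million
target_users = rural_population * urban_penetration  # about 444 million
additions_per_year = added_per_half_year * 2         # 6 million per year

years_to_parity = (target_users - rural_users) / additions_per_year
print(f"{years_to_parity:.1f} years to reach today's urban penetration")
```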

There was one interesting thing missing from the Chinese report: there was no mention of 5G. At least to the government agency that tracks broadband usage in China, there is no 5G race. It’s obvious that the Chinese need 5G, probably more urgently than we do, since the volume of data on their mobile networks is doubling annually. But the topic wasn’t worth a mention in the annual report on the status of broadband in the country.

There is No Artificial Intelligence

It seems like most new technology today comes with a lot of hype. Just a few years ago, the press was full of predictions that we’d be awash in Internet of Things sensors that would transform the way we live. We’ve heard similar claims for technologies like virtual reality, blockchain, and self-driving cars. I’ve written a lot about the massive hype surrounding 5G – in my way of measuring things, there isn’t any 5G in the world yet, but the cellular carriers are loudly proclaiming it’s everywhere.

The other technology whose hype nearly equals 5G is artificial intelligence. I see articles every day talking about the ways that artificial intelligence is already changing our world, with predictions of the big changes on the horizon due to AI. A majority of large corporations claim to be using AI. Unfortunately, this is all hype – there is no artificial intelligence today, just as there is not yet any 5G.

It’s easy to understand what real 5G will be like – it will include the many innovations embedded in the 5G specifications, like network slicing and dynamic spectrum sharing. We’ll finally have 5G when a half dozen of the new 5G technologies are working on my phone. Defining artificial intelligence is harder because there is no specification for AI. Artificial intelligence will be here when a computer can solve problems in much the way that humans do. Our brains evaluate the data on hand to see if we know enough to solve a problem, and if not, we seek the additional data we need. Our brains can consider data from disparate and unrelated sources to solve problems. There is no computer today that is within a light-year of that ability – no computer can yet ask for the specific additional data needed to solve a problem. An AI computer doesn’t need to be self-aware – it just has to be able to ask the right questions and seek the right data needed to solve a given problem.

We use computer tools today that get labeled as artificial intelligence such as complex algorithms, machine learning, and deep learning. We’ve paired these techniques with faster and larger computers (such as in data centers) to quickly process vast amounts of data.

One of the techniques we think of as artificial intelligence is nothing more than using brute force to process large amounts of data. This is how IBM’s Deep Blue works. It can produce impressive results and shocked the world in 1997 when it beat Garry Kasparov, the reigning world chess champion. Since then, the IBM Watson system has beaten the best Jeopardy players and is being used to diagnose illnesses. These computers achieve their results by processing vast amounts of data quickly. A chess computer can consider huge numbers of possible moves and put a value on the ones with the best outcomes. The Jeopardy computer had massive databases of human knowledge available, like Wikipedia and Google search – it looks up the answer to a question faster than a human mind can pull it out of memory.
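For readers curious what brute-force game search looks like in code, here is a toy version using the simple game of Nim (players alternate taking one to three stones; whoever takes the last stone wins) instead of chess. Chess engines apply the same idea – exhaustively score every line of play and keep the best – to a vastly larger move tree:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def best_score(stones):
    """Return +1 if the player to move can force a win, -1 otherwise."""
    if stones == 0:
        return -1  # the previous player took the last stone and won
    # Brute force: try every legal move and take the best outcome for us.
    return max(-best_score(stones - take)
               for take in (1, 2, 3) if take <= stones)

def best_move(stones):
    """Pick the move whose resulting position scores best, engine-style."""
    return max((take for take in (1, 2, 3) if take <= stones),
               key=lambda take: -best_score(stones - take))

print(best_score(5))  # +1: five stones is a forced win for the player to move
print(best_move(5))   # 1: take one stone, leaving the opponent a losing four
```

Positions that are multiples of four are known losses for the player to move, and the search rediscovers that result purely by enumeration – just as Deep Blue valued chess positions by searching rather than by understanding.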

Much of what is thought of as AI today uses machine learning. Perhaps the easiest way to describe machine learning is with an example. Machine learning uses complex algorithms to analyze and rank data. Netflix uses machine learning to suggest shows that it thinks a given customer will like. Netflix knows what a viewer has already watched. Netflix also knows what millions of others who watch the same shows seem to like, and it looks at what those millions of others watched to make a recommendation. The algorithm is far from perfect because the data set of what any individual viewer has watched is small. I know in my case, I look at the shows recommended for my wife and see all sorts of shows that interest me, but which I am not offered. This highlights one of the problems of machine learning – it can easily be biased and draw wrong conclusions instead of right ones. Netflix’s suggestion algorithm can become a self-fulfilling prophecy unless a viewer makes the effort to look outside of the recommended shows – the more a viewer watches what is suggested, the more they are pigeonholed into a specific type of content.
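Netflix’s actual recommender is proprietary and far more elaborate, but the core idea the paragraph describes – score the shows a viewer hasn’t seen by how similar their history is to the viewers who have seen them – can be sketched in a few lines. The viewers, shows, and similarity measure here are all made up for illustration:

```python
# Hypothetical watch histories.
watched = {
    "alice": {"Dark", "Ozark", "Mindhunter"},
    "bob":   {"Dark", "Ozark", "Narcos"},
    "carol": {"Bridgerton", "The Crown"},
    "dave":  {"Dark", "Mindhunter", "Narcos"},
}

def similarity(a, b):
    """Jaccard similarity: overlap of two watch sets relative to their union."""
    return len(a & b) / len(a | b)

def recommend(viewer, history=watched):
    """Rank unseen shows by the summed similarity of viewers who watched them."""
    mine = history[viewer]
    scores = {}
    for other, theirs in history.items():
        if other == viewer:
            continue
        weight = similarity(mine, theirs)
        for show in theirs - mine:
            scores[show] = scores.get(show, 0.0) + weight
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice")[0])  # Narcos - both similar viewers watched it
```

The bias the paragraph describes is visible even in this toy: shows watched only by dissimilar viewers score near zero, so the algorithm keeps recommending more of what similar viewers already watch.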

Deep learning is a form of machine learning that can produce better results by passing data through multiple layered algorithms. For example, there are numerous dialects of English spoken around the world. A customer service bot can begin each conversation in standard English, then use layered algorithms to analyze the speaker’s dialect and shift its language to match a given speaker more closely.

I’m not implying that today’s techniques are not worthwhile. They are being used to create numerous automated applications that could not be done otherwise. However, almost every algorithm-based technique in use today will become instantly obsolete when a real AI is created.

I’ve read several experts who predict that we are only a few years away from an AI desert – meaning that we will have milked about all that can be had out of machine learning and deep learning. Developments in those techniques are not leading toward a breakthrough to real AI – machine learning is not part of the evolutionary path to AI. At least for today, both AI and 5G are largely nonexistent, and the things being passed off as these two technologies are pale versions of the real thing.