Cable Company Cellular Growing

Cable companies are starting to quietly build a significant cellular business to bundle with broadband and other products. Consider the most recent customer count from the eight largest U.S. cellular carriers:

Verizon 143.0 M
T-Mobile 110.2 M
AT&T 101.6 M
Dish 8.5 M
US Cellular 4.9 M
Comcast 4.6 M
Charter 4.3 M
C-Spire 1.2 M

It’s worth noting that AT&T has over 200 million cellular customers worldwide, making it the eleventh largest cellular carrier in the world; China Mobile is first with over 851 million customers.

Comcast’s Xfinity Mobile added 317,000 customers in the second quarter of this year, bringing the company to a total of 4.6 million customers. Comcast mostly uses the Verizon network to complete calls. However, Comcast demonstrates the major benefit of a cable company being in the cellular business: it can offload a large portion of its outgoing mobile traffic onto its own WiFi network. Comcast has also been experimenting with using 600 MHz spectrum to carry some of its cellular traffic. The company purchased $1.7 billion of spectrum in the 2017 incentive auction that freed up spectrum formerly used by television channels, and it purchased $458 million of CBRS spectrum in 2020. The company says it may selectively offload traffic onto licensed spectrum in places where that is cheaper than buying wholesale minutes.

Charter’s Spectrum Mobile added 344,000 mobile customers in the second quarter of the year, bringing the company to 4.3 million customers. Spectrum also uses the Verizon network. Charter purchased $464 million of PAL licenses in the CBRS spectrum in 2020 and says it intends to place its own radios in high-traffic areas where that will save money. Charter said a few months ago that it saw $700 million in new revenues from cellular over the past twelve months.

Altice has been selling mobile services branded as Optimum Mobile for several years and added 33,000 customers in the second quarter, bringing the company to 231,000 total mobile customers. Altice uses the T-Mobile network.

On August 29, Cox announced a mobile pilot program, launching Cox Mobile in Hampton Roads, Virginia; Omaha, Nebraska; and Las Vegas.

All of these companies have a huge potential upside. For example, the mobile customer penetration rate for both Comcast and Charter is under 10%, and both companies believe they can become major mobile players in their markets.

The cable companies face an unusual marketing challenge since each cable company is only in selected urban markets, meaning that a lot of nationwide advertising goes to waste.

The primary reason that Comcast first entered the mobile market was to develop another product that would create a stickier bundle. Comcast figured it would be hard for a customer to leave if that meant finding a new cellular carrier along with a new ISP. Cable companies are still only selling to their own broadband customers, which is a good indication that bundling is still a key reason for doing this. It’s also less costly to sell cellular to households that can offload cellular traffic to the cable company broadband network.

The big three cellular carriers have continued to grow in recent years, but the cable companies have definitely made a dent in the market with almost ten million retail mobile customers. The real test for the cellular industry is going to come when Dish finally gets its act together and offers low-cost mobile service in most markets. That’s going to put price pressure on everybody else. If Dish starts a price war, as promised, we’re going to see a real shake-up.


Starlink and RDOF

In August, the FCC denied the SpaceX (Starlink) bid to receive $885 million over ten years through the RDOF subsidy – an award Starlink won in a reverse auction in December 2020.

In the press release for the rejection, FCC Chairman Jessica Rosenworcel was quoted as saying, “After careful legal, technical, and policy review, we are rejecting these applications. Consumers deserve reliable and affordable high-speed broadband. We must put scarce universal service dollars to their best possible use as we move into a digital future that demands ever more powerful and faster networks. We cannot afford to subsidize ventures that are not delivering the promised speeds or are not likely to meet program requirements.”

The FCC went on to say in the order that there were several technical reasons for the Starlink rejection. The first was that Starlink is a “nascent” technology, and the FCC doubted the company’s ability to deliver broadband to 642,925 locations in the RDOF areas while also serving non-RDOF areas. The FCC also cited Ookla speed tests showing that Starlink speeds decreased from 2021 into 2022.

Not surprisingly, Starlink appealed the FCC ruling this month. In the Starlink appeal, the company argued, “This decision is so broken that it is hard not to see it as an improper attempt to undo the commission’s earlier decision, made under the previous administration, to permit satellite broadband service providers to participate in the RDOF program. It appears to have been rendered in service to a clear bias towards fiber, rather than a merits-based decision to actually connect unserved Americans.”

Rather than focus on the facts in dispute in the appeal, today’s blog looks at the implications for the broadband industry during the appeal process. Current federal grant rules don’t allow federal subsidies to be given to any area that is already slated to get another federal broadband subsidy. This has meant that the RDOF areas have been off-limits to other federal grants since the end of 2020, including NTIA grants, USDA ReConnect grants, and others. Federal grant applicants for the last few years have had to carefully avoid the RDOF areas for Starlink and any other unresolved RDOF award areas.

As a reminder, the RDOF areas were assigned by Census block and not in large contiguous areas. The RDOF award areas have often been described as Swiss cheese, meaning that Census blocks that were eligible for RDOF were often mixed with nearby ineligible Census blocks. A lot of the Swiss cheese pattern was caused by faulty FCC maps that excluded many rural Census blocks that should have been eligible for RDOF but for which a telco or somebody else was probably falsely claiming speeds of at least 25/3 Mbps.

ISPs that have been contemplating grant applications in the unresolved RDOF areas were relieved when Starlink and other ISPs like LTD Broadband were rejected by the FCC. It’s difficult enough to justify building rural broadband, but it’s even harder when the area to be built is not a neat contiguous study area.

The big question now is what happens with the Starlink areas during an appeal. It seems likely that these areas will go back into the holding tank and remain off-limits to other federal grants. We’re likely going to need a definitive ruling on this from grant agencies like the USDA to verify, but logic would say that these areas still need to be on hold in case Starlink wins the appeal.

Unfortunately, there is no defined timeline for the appeal process. I don’t understand the full range of possibilities of such an appeal. If Starlink loses this appeal at the FCC, can the company take the appeal on to a court? Perhaps an FCC-savvy lawyer can weigh in on this question in the blog comments. But there is little doubt that an appeal can take some time. And during that time, ISPs operating near the widespread Starlink grant areas are probably still on hold in terms of creating plans for future grants.

Lobbying the BEAD Rules

Thirteen Republican Senators sent a letter to the NTIA asking the agency to change its approach to administering some of the provisions of the $42.5 billion BEAD grants. This is just the first of what I think will be many attempts to influence how the grant funding is awarded. We can’t ignore that there will be politics involved in determining who gets grant awards. That became inevitable for a grant program of $42.5 billion that also involves the States.

The letter specifically asked for changes related to rate regulation, technology preference, provider preference, workforce requirements, middle mile deployments, and the application review process.

Rate Regulation. The Senators point out that the legislation specifically prohibits the BEAD program from suggesting or requiring broadband rates. The letter argues that the NOFO for the program nonetheless suggests several requirements that will set or restrict rates, such as a suggestion that there should be a low-cost option established at $30, along with a still-undefined middle-class affordability plan.

Technology Neutrality. The Senators take exception to the NTIA’s clear preference for fiber and want to make sure that fixed wireless and cable technologies can be considered for grants.

Preferences for Grant Recipients. The Senators are concerned that the NOFO for the program insists that there is an equitable and nondiscriminatory focus for choosing grant winners. They fear that this is going to push state grant offices to favor non-traditional broadband providers instead of existing proven ISPs.

BEAD and Digital Equity Participation. The Senators want to make sure that there is no automatic link between a State participating in the BEAD program and participating in the Digital Equity program. This is the first time I’ve heard of this issue, and it means there are States considering not accepting the funding that will be used for getting computers into homes and offering digital literacy training.

Workforce Preference. The Senators believe that the BEAD rules favor ISPs that use a ‘directly employed workforce’ as opposed to contractors and subcontractors. That observation was a new one for me and will send me back to reading the NOFO more carefully. The Senators are also worried about the requirement that projects greater than $35 million must enter into a project labor agreement – something they say will be challenging in a market with a skilled labor shortage.

Middle-Mile Deployment. The Senators don’t like the requirement that any project that includes middle-mile routes must allow for interconnection with other carriers that want to use the fiber routes.

Unnecessary Burdens. The Senators say there are requirements that add burdens on grant applicants that were not included in the legislation. This includes issues such as climate resiliency and system hardening for the useful life of fiber. They say such requirements add unnecessary costs and will delay the deployment of networks.

It’s an interesting list of objections. A few of the objections are on everybody’s hate list of the grant rules. Grant applicants do not want to figure out a climate resiliency plan and will be fearful that if they do it poorly, they might not win a grant.

A few of the requests are clearly in favor of incumbent ISPs, such as the objection to any requirement that might push a State broadband office to consider non-traditional ISPs like cities.

And a few requests are things that concern all ISPs, such as the NTIA requiring broadband rates that are too low to make a business plan work.

Just as interesting are the items not included on the list. Small ISPs are worried about the requirement to have a certified letter of credit – something that doesn’t concern large ISPs. Not having this on the list makes me think the Senators are being prompted by big ISPs.

This blog is not meant as a criticism of the Senators’ suggestions. Every constituency in the country is going to have its own wish list of things the BEAD grants should emphasize or deemphasize. I’m hoping to collect these as I see them – it will be interesting when the dust clears to see who had the most influence on the BEAD rules.

Faster Speeds for Comcast

Comcast issued a press release on September 8 announcing the introduction of a 2-gigabit download broadband product. The product is already available in Colorado Springs, CO; Augusta, GA; Panama City Beach, FL; and the Comcast headquarters market of Philadelphia. I can’t find any mention yet of the price.

Along with the announcement of faster download speeds, the company is claiming new upload speeds of as much as 200 Mbps – at least for the 2 Gbps plan. The press release made it sound like all upload speeds would be increased by five to ten times the existing speeds, and today’s blog looks at what it would take for a cable company to increase upload speeds across the board.

Interestingly, the same press announcement said that Comcast would be introducing DOCSIS 4.0 in 2023, at least for some business customers. That’s an announcement that has me scratching my head. Comcast just announced a successful test for DOCSIS 4.0 in January of this year. To be able to go from a lab prototype to production units in less than two years would be extraordinary. The normal time to market for a major new technology is five or six years. I’m skeptical about the announcement and wonder if this is aimed at Wall Street more than any actual technology plan. The company has been asked non-stop about DOCSIS 4.0 for several years, and maybe this announcement is taking advantage of that hype. Comcast could hold a field trial of the new technology next year and still meet this promise.

But cable companies have another option to get faster upload speeds. A cable network is essentially a captive radio network inside of the coaxial cable. Cable networks don’t all have the same total bandwidth, and most of the big cable company networks have total bandwidth of either 1 GHz or 1.2 GHz. The total bandwidth has to be shared between video channels and broadband.

Most existing cable companies have allocated bandwidth between download and upload using something called the sub-split. This assigns a relatively small amount of frequency, between 5 MHz and 42 MHz, for upload. On top of being a small swath of spectrum, this is also the part of the spectrum that suffers from external interference. This combination results in relatively slow upload speeds and also variable speeds due to interference – something most cable customers are aware of.

There are two additional configurations for allocating upload spectrum. A mid-split configuration uses the spectrum between 5 MHz and 85 MHz for upstream. In a high-split, the upload is enhanced by using the spectrum up to 204 MHz. DOCSIS 4.0 provides multiple options for upload bandwidth, with possible splits at 300 MHz, 396 MHz, 492 MHz, and 684 MHz.
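
To make the tradeoff concrete, here is a minimal sketch comparing the upstream spectrum available under each split named above. The throughput figure uses an assumed spectral efficiency of 8 bits/s/Hz, which is a made-up illustrative number, not a DOCSIS specification value:

```python
# Upstream spectrum under each split option named in the text.
SPLITS_MHZ = {
    "sub-split":        (5, 42),    # legacy allocation, interference-prone
    "mid-split":        (5, 85),    # DOCSIS 3.1 option
    "high-split":       (5, 204),   # DOCSIS 3.1 option
    "DOCSIS 4.0 (max)": (5, 684),   # largest DOCSIS 4.0 split
}

ASSUMED_BITS_PER_HZ = 8  # hypothetical efficiency, NOT a spec value

for name, (low, high) in SPLITS_MHZ.items():
    width_mhz = high - low  # MHz of spectrum dedicated to upstream
    est_gbps = width_mhz * 1e6 * ASSUMED_BITS_PER_HZ / 1e9
    print(f"{name:18} {width_mhz:4} MHz upstream ~= {est_gbps:.2f} Gbps (illustrative)")
```

Even this rough arithmetic shows why moving from the sub-split’s 37 MHz to a high-split or a DOCSIS 4.0 allocation is such a big deal for upload speeds.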

If Comcast is going to improve upload bandwidth in the near future, it will have to implement one of the larger DOCSIS 3.1 splits. There is a cost for moving to a different split. There must first be enough room available for video channels and download bandwidth. It can be expensive if the entire bandwidth of the network must be increased, which can mean replacing amplifiers and other outside electronics, and even some coax. In most cases, the existing customer modems would also need to be replaced unless they are already configured to accept the different split.

At the recent SCTE Cable-Tec Expo, CommScope, Vecima, and CableLabs said there are plans for a different upgrade path for the DOCSIS 3.1 higher splits. They are claiming new ‘turbocharged’ modems that will add more effective upload bandwidth capability. I’ve not heard of any field trials of the new modems, and perhaps this is what Comcast has in mind by the end of 2023.

Cable companies are sensitive about the marketing advantage that faster upload speeds give to fiber and even to slower technologies like FWA cellular wireless. It’s hard to know if the Comcast announcement foreshadows big improvements next year or was just a way to signal to Wall Street that cable companies are working towards improved bandwidth. It’s inevitable that faster upload bandwidth is coming – the big questions are when and how much faster.

What’s The Trend for Broadband Prices?

For years, cable companies have been raising broadband prices every year. These annual rate increases meant a huge boost to the earnings of the largest cable companies like Comcast and Charter. Most of the annual price increases of $3 to $5 went straight to the bottom line. While price increases don’t hit every customer immediately because some customers are on term contracts, every price increase reaches every customer eventually.

It’s going to be really interesting to see if Comcast, Charter, and the other big cable companies raise prices later this year. The industry has changed, and it doesn’t seem as obvious as in the past that cable companies can raise rates and that customers will just begrudgingly go along with it.

First, the cable companies have stopped growing, and in the second quarter of this year, both Comcast and Charter experienced a tiny loss of customers. This seems to be for a variety of reasons. The biggest is that the FWA fixed cellular carriers are thriving. In the second quarter of this year, T-Mobile and Verizon added 816,000 new FWA broadband customers using 5G frequencies. The product is not as robust as cable broadband, with download speeds of roughly 100 Mbps, but FWA has faster upload speeds than cable. What’s making FWA attractive is the price of $50 – $60 for unlimited broadband – far below the prices charged by cable companies.

The cable companies have to be feeling some sting also from the large telcos and others who are building and selling fiber in cable company markets. There must be a few million customers moving to fiber annually at this point – a number that is going to grow.

The big question is if cable companies will keep raising rates in the face of customer stagnation. This can’t be an easy decision for cable companies. New revenues from raising rates go straight to the bottom line, and it is the annual rate increases that have sustained the earnings growth and stock prices for cable companies. Comcast has over 32 million customers, and Charter has over 30 million, so forgoing a rate increase would mean forgoing a lot of new cash and earnings.

The strategic question is if the cable companies are willing to accelerate customer losses for the extra earnings from higher rates. Households getting a rate increase notice are going to be prompted to look around for alternatives, and many of them will find one. The days when cable companies held a monopoly in many cities are coming to an end.

The rest of the industry is going to watch this issue closely because it’s going to be easier to compete against the cable companies if they continue to raise rates. Higher cable broadband prices let other ISPs creep up rates and still stay competitive.

It’s interesting that almost no ISP has raised rates during this year when inflation is a major topic of conversation. One thing this shows is that there are big margins on broadband, and there is real cash pressure to raise rates to stay whole. But this also means that the big ISPs are absorbing higher labor, materials, and operating costs without charging more – and without increasing revenues through customer growth.

The biggest cable companies have other sources of revenue. Comcast and Charter both have a growing cellular business, but many analysts are speculating that it’s not generating a big profit. However, as the cable companies start utilizing licensed spectrum it might become quite profitable.

This is a really interesting time for the industry. The biggest cable companies have been the kings of the hill for a decade and could do almost anything they wanted. They’ve been converting DSL customers by the millions annually while also raising rates – getting doubly more profitable. Comcast and Charter are so large that they are not going to stop being the largest ISPs for a long time to come – but they are starting to show some market vulnerability, and there are plenty of ISPs willing to pounce on their markets.

Broadband Deserts

Perhaps it’s because the death of Queen Elizabeth is everywhere in the news, but somebody sent me a 2008 article from the BBC in which then-Prince Charles warned that the lack of rural broadband in the UK would eventually result in broadband deserts.

The now King Charles III was quoted as saying that the lack of broadband puts too much pressure on the people who live without it, and that if a solution to bring rural broadband wasn’t found, rural areas would turn into ghost towns as people abandoned them. He was particularly worried about farmers, who, even in 2008, needed broadband to thrive. His fear was that the rural UK would turn into nothing more than a place for city residents to have a second home.

He was right, of course, and we’re already starting to see problems crop up in rural areas of the U.S. that don’t have broadband. Counties with poor broadband are seeing people move away to get better jobs or to get broadband for their kids. Farmers without broadband are at a serious disadvantage compared to peers using the latest broadband-enabled technology. Real estate agents report that it’s extremely difficult to sell a home that has no broadband option. Several studies have shown that students who grow up without home broadband don’t perform nearly as well as students with broadband.

There are hundreds of rural counties working hard to get fiber broadband with the hope of stemming population loss. Many hope that attracting people who work from home is the best way to do that and to stimulate the local economy. They are banking on the notion that people will want to live in a beautiful and low-cost place while working from home.

There is a lot of hope that the various grant programs will solve a huge amount of the rural digital divide. There is grant money being used from ReConnect, ARPA, and the upcoming giant $42.5 billion BEAD grants that will bring broadband to a lot of rural counties. I’m already working with some counties that feel certain that they will be getting wall-to-wall fiber.

But I’m also working with counties that are not so sure they will get better broadband. They live in parts of the country where there are no small telcos or electric cooperatives interested in serving them. They live in places where the cost of building broadband is going to push them into the high-cost category, where a 75% BEAD grant is not going to be enough to entice an ISP.

As I wrote in a recent blog, there is also no guarantee that the current grants are going to reach everywhere. I think that there is a legitimate concern that communities that don’t get funding from current grants might have a long wait to ever see gigantic broadband grants again.

The world has changed a lot since King Charles warned about broadband deserts. In 2008, the Internet was already important to some folks, but over time it has become vital to a majority of households. In the community surveys I’ve been conducting this year, I am seeing that at least 30% of homes include somebody who works remotely from home – and in some counties, that’s a lot higher. These same surveys routinely show that many homes don’t have the broadband needed to support homework for students. I routinely hear from rural businesses that are struggling due to the lack of broadband.

The UK also has a program to build rural fiber. Project Gigabit is currently using £5 billion to bring broadband to 567,000 remote homes. Most of these projects start construction in 2023 and are expected to be done by the end of 2024.

To some degree, promoting rural broadband is a demographic experiment on a large scale. Congress is betting that broadband infrastructure will revitalize many rural communities and give them the basis for a sustainable economy. I have no doubt that this won’t happen everywhere because faster broadband by itself is not a cure-all for social woes. But communities that make a commitment to foster the best benefits of better broadband increase their chances of surviving and thriving.

The Data Divide

A report from the Center for Data Innovation warns of a new broadband gap it calls the data divide – when some parts of society don’t share in the big societal advantages that come from using and understanding the huge amounts of data being generated today.

The report includes examples of the data divide that make the concept easier to understand.

  • Patients who aren’t part of the system that is creating lifetime medical records won’t get the same quality of healthcare as people who have a fully documented medical history. People who don’t use medical monitoring devices won’t be informed early of health problems. People who aren’t included in genetic databases won’t be alerted to lifesaving techniques specific to their genome.
  • People who don’t participate heavily in the transactions that are used to calculate a credit score are often discriminated against even though they may be creditworthy.
  • Poor and rural school districts often don’t collect the same kind of data on students and student achievement, putting those students at a disadvantage for higher education opportunities.

These are just a few of the many examples that are going to become apparent as we move into a world that is data-centric. The report warns that the data gap doesn’t just affect individuals but also whole communities that are not as connected. The report points out the obvious – that access to the best data tends to be a luxury item available to wealthier individuals and communities.

We are now awash in data. Practically everything we do in public is recorded. The report warns that data can be put to good use to benefit society or used alternatively for the detriment of individuals and society.

Having access to mountains of data is a relatively new phenomenon that has been created by the combination of large data centers, the nascent AI industry, quantum computing, and companies that specialize in mining data. The report says that we are just entering a new world where data will play an important role in everybody’s life and that now is the time to start thinking, as a country and a society, about how we are going to control and use the mountains of data.

The report suggests that we develop national standards and goals for public data gathering so that decision-makers have the right data at their fingertips. Some of the ideas fostered by the report include:

  • Improve the quality of government-generated data.
  • Create standards to enhance the quality and usability of non-government data.
  • Support standard ways to incorporate crowd-sourced data and private-sourced data into government datasets.
  • Direct government agencies to develop strategies to make sure that data is available to all communities.

The report warns that the data gap will become severe unless we make sure that accurate data is available to everybody. Without a strategy and data policies, we’re on a path where most data will be used to market selectively to people or to lobby people with political ideas based upon a personal profile. Those uses of data seem inevitable, but we ought to be guaranteeing the upsides of using the data gathered about us rather than the downsides.

I’ll be the first to admit that I don’t understand enough about large datasets to know what is floating around in the world. But I know that both the government and private companies are gathering huge amounts of data about us, and it makes sense to create rules that make it hard to misuse the data and that make useful data available to everybody.

Overestimating Inflation

I’ve worked through a number of full economic cycles in the industry, including several recessions:

  • The double-dip recession of 1980–1982, worsened by the Iranian revolution and the oil crisis.
  • The 1990 recession, caused by accumulated consumer debt and another oil price spike.
  • The 2001 recession, driven by the dot-com (and telecom) crash and worsened by September 11.
  • The Great Recession of 2007–2009, the result of the subprime mortgage crisis and the meltdown on Wall Street.
  • The short 2020 recession, caused by the COVID pandemic.

Most of these recessions were followed by periods of economic turmoil, which saw fluctuating interest rates and periods of inflation – like we are experiencing now as the delayed impact of the pandemic.

There are two interesting economic phenomena that recur in times of economic turmoil. The first is that people and businesses don’t believe that interest rates will rise. The current period of rising interest rates is a great example. Anybody who’s been paying attention has been hearing from the Federal Reserve for nearly a year that it would likely have to increase interest rates. With that much warning, people and businesses with adjustable-rate mortgages and loans had plenty of time to refinance the debt to a fixed rate – and yet many did not do so. Mortgage rates recently hit 6% and are likely to increase further, and we’re soon going to see stories of families and businesses defaulting because of the increased debt payments. For whatever reason, most people and businesses don’t refinance, even with persistent warnings that higher interest rates are on the way. This seems to be a recurring theme every time we enter a period of increasing interest rates.

The other phenomenon that I’ve seen repeated over and over is that businesses overreact to inflation. Once they start seeing cost increases, they begin acting like inflation will be permanent and will continue to climb steadily. It has never done that in the U.S. economy. Costs spike, but the rate of inflation invariably slows down and goes back to normal. We’re already seeing inflation slowing due to falling fuel prices – which makes sense, since escalating gasoline prices were one of the drivers of inflation.

What do I mean when I say that businesses overreact to inflation? One place I see it is in helping ISPs build business plans. I have clients that want me to build perpetual inflation into future forecasts. Building never-ending inflation into a forecast can quickly make a good idea look terrible. It is extremely unlikely that inflation will continue unabated – it never has in the U.S. economy. There are parts of the world where hyperinflation is normal, such as Nigeria, which has seen inflation rates of 15% – 20% annually for many years. The country adjusts for this with currency manipulation, and businesses give big pay raises every year – and the real-life impact of inflation is barely noticed. It’s just how the economy works. But the monetary policies in the U.S. and most of the top economies are able to quash continuous hyperinflation.
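
As a simple illustration of why this matters – with made-up rates, not a forecast – compare ten years of cost escalation when 8% inflation is assumed to be permanent versus when it reverts to 2% after two years:

```python
# Illustrative only: the rates and horizon are made-up numbers.
YEARS = 10

def cost_multiplier(annual_rates):
    """Compound a starting cost of 1.0 through a list of annual inflation rates."""
    multiplier = 1.0
    for rate in annual_rates:
        multiplier *= 1 + rate
    return multiplier

perpetual = [0.08] * YEARS                       # inflation never cools off
reverting = [0.08, 0.08] + [0.02] * (YEARS - 2)  # reverts to 2% after two years

print(f"Perpetual 8% inflation: costs x{cost_multiplier(perpetual):.2f} by year {YEARS}")
print(f"Reverting inflation:    costs x{cost_multiplier(reverting):.2f} by year {YEARS}")
```

The perpetual assumption more than doubles year-ten costs, while the reverting path grows them by about a third – a gap easily big enough to flip a business plan from attractive to terrible on paper.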

The other overreaction comes from vendors who raise prices in anticipation of inflation instead of in reaction to actual cost increases. Economists will tell you that anticipation of inflation is one of the major drivers of inflation. It becomes a self-fulfilling prophecy when enough of the players in an industry raise prices in anticipation of future higher costs, which then becomes a driver of continued inflation. There has been a lot of recent talk about price gouging and excess profits, and at least some of that came from businesses and industries that raised prices in anticipation that underlying costs would increase. There is a fine line between raising prices to be safe and price gouging. This is a phenomenon that happens in every recession, and the good news is that it self-corrects over time as people flock to sellers that didn’t raise prices, and market prices eventually stabilize and stop climbing.

One of the most interesting things about the current period of inflation is that I’m not seeing ISPs increasing prices to keep up with inflation. I’m sure some ISPs have raised rates, but most broadband prices have not climbed in the past six months, as might be expected from looking at prices in other industries.

There are good reasons for this that are not related to inflation. The biggest cable companies have suddenly stopped growing, and they are probably afraid that raising rates will push folks to competitive alternatives – we’ll find out for sure at the end of the year. And most smaller ISPs take direction from the behavior of their large competitors – nobody wants to raise rates if they are competing against a bigger company that isn’t raising rates.

The good news for the industry is that everything will return to normal in a year or two – it always does. Interest rates will reach a peak and then slowly drop when the Federal Reserve no longer needs to tamp down borrowing. Inflation will slow and return to normal trends, and everybody will forget about it until the next crisis. This doesn’t mean that there won’t be casualties. There will be ISPs that get into trouble if they hold a lot of variable-rate debt – and some may fold. There will be some new projects that get derailed when costs climb higher than expected. But overall, the broadband sector will hunker down and wait out the current trends.


The Latest Stats on Broadband Usage

OpenVault released its Broadband Insights Report for 2Q22, which contains statistics about nationwide broadband usage at the end of the quarter. As usual, the OpenVault report is an invaluable peek into the national state of broadband.

The report shows that the average home used 491 gigabytes of data per month as of the end of the second quarter. This is up by 13% over the 433 gigabytes used at the end of the second quarter of 2021. Second-quarter usage, measured as of June 30, is typically the lowest of the year due to schools being out of session and folks on vacation.

Upload bandwidth usage continues to grow and averaged 31.2 gigabytes per household, up from 28 gigabytes in 2021 and 13.6 gigabytes in 2018. Average upload speeds increased from 17 Mbps in 2021 to 23 Mbps in 2022.

The percentage of households that use more than 1 terabyte (1,000 gigabytes) of data each month continues to grow rapidly. OpenVault calls these power users. These are homes that will trigger data caps if they have an ISP that enforces them. At the end of the second quarter of 2022, 13.7% of homes were using more than a terabyte, compared to 10.8% in 2021, an increase of over 26%. The percentage of homes using more than 2 terabytes increased from 1.5% in 2021 to 2.2% in 2022, an increase of 47%.
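
For clarity, those growth figures are relative changes in the share of homes, not percentage-point changes. A quick arithmetic check:

```python
# Verify the report's growth figures: relative growth in the share of
# homes, not percentage-point differences.

def relative_growth(old_share, new_share):
    """Percent growth from old_share to new_share (both in percent)."""
    return (new_share - old_share) / old_share * 100

print(f"1 TB+ homes: {relative_growth(10.8, 13.7):.0f}% growth")  # ~27%
print(f"2 TB+ homes: {relative_growth(1.5, 2.2):.0f}% growth")    # ~47%
```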

There has been a huge migration of folks subscribing to faster tiers of broadband. The most extraordinary statistic is that 14.2% of American homes now subscribe to gigabit service, up from 4.6% just two years ago.

Much of this shift, shown in the table below, is coming from cable companies that unilaterally increased speeds for customers, but many millions of customers have also upgraded to more expensive broadband tiers to get faster speeds. The table shows a remarkable trend since June 2020:

Subscriptions     June 2020   June 2021   June 2022
Under 50 Mbps       18.4%       10.5%        5.7%
50–99 Mbps          20.4%        9.6%        8.5%
100–199 Mbps        37.8%       47.5%       10.1%
200–499 Mbps        13.5%       17.2%       55.4%
500–999 Mbps         5.0%        4.7%        6.0%
1 Gbps               4.9%       10.5%       14.2%

This table, better than anything else I’ve seen, lays to rest the idea that the national definition of broadband ought to be 100 Mbps download. As of June of this year, 76% of U.S. households subscribe to broadband plans of 200 Mbps or faster. The most popular tier is 200–499 Mbps, which makes sense since the big cable companies have migrated most broadband customers to speeds of 200 Mbps or 300 Mbps.

Interestingly, OpenVault says that ACP customers use more broadband, at 654 gigabytes per month, than the average home. The share of power users consuming more than a terabyte is also 52% higher among ACP subscribers than in the overall population. OpenVault doesn’t speculate on why this is so, but I would guess that part of the difference might be that homes getting broadband for the first time have a lot of streaming video from recent years to catch up on. I remember how much video my household watched when we first got Netflix – but over time, we went back to other activities.

The FCC Mapping Fabric

You’re going to hear a lot in the next few months about the FCC’s mapping fabric. Today’s blog is going to describe what that is and describe the challenges of getting a good mapping fabric.

The FCC hired CostQuest to create the new system for reporting broadband coverage. The FCC took a lot of criticism about the old mapping system, which assumed that an entire Census block was able to buy the fastest broadband speed available anywhere in the Census block. This means that even if only one home is connected to a cable company, the current FCC map shows that everybody in the Census block can buy broadband from the cable company.

To fix this issue, the FCC decided that the new broadband reporting system would have each ISP draw polygons around the areas where it already provides service or could provide service within ten days of a customer request. If done correctly, the new method will precisely define the edges of cable and fiber networks.

The creation of the polygons creates a new challenge for the FCC – how to count the passings inside of any polygon an ISP draws. A passing is any home or business that is a potential broadband customer. CostQuest tried to solve this problem by creating a mapping fabric. A simplistic explanation is that they placed a dot on the map for every known residential and business passing. CostQuest has written software that allows them to count the dots of the mapping fabric inside of any possible polygon.
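
As a minimal sketch of that counting idea – using the open-source shapely library and invented coordinates, not CostQuest’s actual software or data – the core operation looks something like this:

```python
# Sketch of counting fabric "dots" inside an ISP-drawn polygon using
# the shapely library. All coordinates are invented for illustration.
from shapely.geometry import Point, Polygon

# An ISP-drawn coverage polygon ((longitude, latitude) pairs).
coverage = Polygon([(-80.0, 35.0), (-79.9, 35.0), (-79.9, 35.1), (-80.0, 35.1)])

# The fabric: one point per known home or business passing.
fabric = [Point(-79.95, 35.05), Point(-79.92, 35.08), Point(-79.85, 35.02)]

# Count the passings that fall inside the polygon.
passings = sum(1 for dot in fabric if coverage.contains(dot))
print(f"{passings} of {len(fabric)} fabric locations fall inside this polygon")
```

The geometry is the easy part; as described next, the hard part is making sure the dots themselves are right.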

That sounds straightforward, but the big challenge is placing dots that correspond to actual passings. My consulting firm has been helping communities count passings for years as part of developing broadband business plans, and it is never easy. Communities differ in the raw data available to identify passings. Many counties have GIS mapping data that shows the location of every building in the community, but the accuracy and detail of the GIS data differ drastically by county. We have often tried to validate GIS data against other sources like utility records. We’ve also validated against 911 databases that show each registered address. Even for communities that have these detailed records, it can be a challenge to identify passings. We’ve heard that CostQuest used aerial maps to count rooftops as part of creating the FCC mapping fabric.

Why is creating a fabric so hard? Consider residential passings. The challenge becomes apparent as soon as you start thinking about the complexities of the different living arrangements in the world. Even if you have great GIS data and aerial rooftop data, it’s hard to account for some of the details that matter.

  • How do you account for abandoned homes? Permanently abandoned homes are not candidates for broadband. How do you make the distinction between truly abandoned homes and homes where the owners are looking for a tenant?
  • How do you account for extra buildings on a lot? I know somebody who has four buildings on a large lot with only a single 911 address. The lot has a primary residence and a second residence built for a family member. There is also a large garage and a large workshop building – both of which would look like homes from an aerial perspective. This lot has two potential broadband customers, and it’s likely that somebody using GIS data, 911 data, or aerial rooftops won’t get this one property right. Multiply that by a million other complicated properties, and you start to understand the challenge.
  • Farms are even harder to count. It wouldn’t be unusual for a farm to have a dozen or more buildings. I was told recently by somebody in a state broadband office that the CostQuest mapping fabric appears to be counting every building on farms – at least in the sample that was examined. If this is true, then states with a lot of farms are going to get a higher percentage of the BEAD grants than states that don’t have a lot of compound properties with many buildings.
  • What’s the right way to account for vacation homes, cabins, hunting lodges, etc.? It’s really hard with any of the normal data sources to know which ones are occupied full time, which are occupied only a few times per year, which have electricity, and which haven’t been used in many years. In some counties, these kinds of buildings are a giant percentage of buildings.
  • Apartment buildings are really tough. I know from working with local governments that they often don’t have a good inventory of the number of apartment units in each building. How is the FCC mapping data going to get this right?
  • I have no idea how any mapping fabric can account for homes that include an extra living space like an in-law or basement apartment. Such homes might easily represent two passings unless the two tenants decide to share one broadband connection.
  • And then there is the unusual stuff. I remember being in Marin County, California and seeing that almost every moored boat has a full-time occupant who wants a standalone broadband connection. The real world is full of unique ways that people live.

Counting businesses is even harder, and I’m not going to make the list of the complexities of defining business passings – but I think you can imagine it’s not easy.

I’m hearing from folks who are digging into the FCC mapping fabric that there are a lot of problems. ISPs say they can’t locate existing customers. They tell me there are a lot of mystery passings shown that they don’t think exist.

We can’t blame CostQuest if they didn’t get this right the first time – Americans are hard to count. I’m not sure this is ever going to be done right. I’m sitting here scratching my head and wondering why the FCC took this approach. I think a call to the U.S. Census Bureau would have gotten the advice that this is an impossible goal. The Census spends a fortune every ten years trying to identify where people live, and the FCC has given itself the task of creating a 100% census of residences and businesses and updating it every six months.

The first set of broadband map challenges will be about the fabric, and I’m not sure the FCC is ready for the deluge of complaints they are likely to get from every corner of the country. I also have no idea how the FCC will determine if a suggestion to change the fabric is correct because I also don’t think communities can count passings perfectly.

This is not the only challenge. There are also going to be challenges to the coverage areas claimed by ISPs. The big challenge, if the FCC allows it, will be about the claimed broadband speeds – and if the FCC doesn’t allow that, it is going to get buried in complaints. I think the NTIA was right to let the dust settle on the challenges before using the new maps.