The Big Ethernet Carrier Market

I haven’t talked about the big Ethernet carriers for a while. These are the giant companies that serve many of the largest businesses in the country and that also haul broadband between cities. The U.S. Carrier Ethernet Leaderboard tracks and ranks these carriers.

In the latest ranking from June 2022, the Leaderboard says that the largest six Ethernet carriers are Lumen, AT&T, Spectrum Enterprise, Verizon, Comcast Business, and Cox Business. These six carriers each have at least a 4% share of the U.S. Ethernet market. The next tier on the Leaderboard includes Altice USA, Cogent, Frontier, GTT, Windstream, and Zayo. These six carriers have a national market share between 1% and 4%.

The rankings are based on the number of billable Ethernet retail customer ports installed. In past years we used to track lit buildings, but billable ports reflect that some customers buy more than one major type of Ethernet connection. There are six categories of Ethernet service that are counted as ports, including:

  • Ethernet DIA (Dedicated Internet Access). This is a relatively new service that connects customers directly to the Internet without passing through any intermediate carriers.
  • E-Access to IP/MPLS VPN. This is the most commonly sold big Ethernet product at 36% of the U.S. market and is more commonly called a business-class virtual private network. MPLS VPNs are used to switch multiple kinds of broadband traffic across the same broadband connection.
  • Ethernet Private Lines. Private lines connect two locations with no switching in between. As an example, a bank might buy a private line between each bank branch in a city, and no carrier touches the traffic between branches.
  • Ethernet Virtual Private Lines. This is similar to a dedicated private line in that traffic is encrypted and not visible to carriers between the two endpoints.
  • Metro LAN. This uses Ethernet to connect multiple locations within a metropolitan network.
  • WAN VPLS. This extends Metro LAN service across the country or the world.

The next lower tier of large carriers includes companies that have less than a 1% share of the national Ethernet market. Some of the better-known names include ACD, AireSpring, Alaska Communications, Alta Fiber, American Telesis, Arelion, Armstrong Business Solutions, Astound Business, Breezeline, BT Global Services, Centracom, Consolidated Communications, Conterra, Crown Castle, Douglas Fast Net, DQE Communications, ExteNet Systems, Fatbeam, FiberLight, First Digital, FirstLight, Flo Networks, Fusion Connect, Global Cloud Xchange, Great Plains Communications, Hunter Communications, Intelsat, Logix Fiber Networks, LS Networks, MetTel, Midco, Momentum Telecom, NTT, Orange Business, Pilot Fiber, PS Lightwave, Ritter Communications, Segra, Shentel Business, Silver Star Telecom, Sparklight Business, Syringa, T-Mobile, Tata, TDS Telecom, TPx, Unite Private Networks, Uniti, US Signal, WOW!Business, Ziply Fiber and other companies selling retail Ethernet services in the U.S. market.

The names at the top of the Leaderboard are familiar since those are also most of the largest retail ISPs in the country.

What many people don’t realize is that most cities of any size include connections from some of these carriers. For example, most national chain stores, hotels, and other large national businesses have a single carrier that coordinates and connects all of their locations. This allows big businesses to efficiently and reliably connect locations to headquarters or the cloud. Anybody who has crawled through the FCC’s 477 data will see a number of these carriers listed as fiber providers in most cities.

In most cases, these carriers use somebody else’s fiber to connect to customers. Some large carriers like AT&T, Lumen, or Comcast will build fiber to business districts and then sell wholesale arrangements to carriers that need to reach specific businesses. It’s not unusual for a local outlet of a national business to not even know who the underlying carrier is – that’s something arranged by a corporate office and done behind the scenes. But any ISP salesperson knocking on the doors of chain stores is familiar with hearing that the broadband connections are arranged by corporate.

I know a few fiber overbuilders who have cracked into a sliver of this market and have convinced some of the carriers on the list to buy from them rather than from one of the national carriers. It’s not easy to get onto the radar of these carriers, but it can be done with persistence.

The 12 GHz Battle

A big piece of what the FCC does is to weigh competing claims to use spectrum. It seems like there have been non-stop industry fights over the last decade over who gets to use various bands of spectrum. One of the latest fights, a continuation of a battle that has been going on since 2018, is over the use of the 12 GHz spectrum.

The big wrestling match pits Starlink, which wants to use the spectrum to communicate with its low-orbit satellites, against cellular carriers and WISPs that want to use the spectrum for rural broadband. Starlink uses this spectrum to connect its ground-based terminals to satellites. Wireless carriers argue that the spectrum should also be shared to enhance rural broadband networks.

The 12 GHz band is attractive to Starlink because it contains 500 MHz of contiguous spectrum with 100 MHz channels – a big data pipe for reaching between satellites and earth. The spectrum is attractive to wireless ISPs for the same reasons, along with other characteristics. The 12 GHz spectrum will carry twice as far as other spectrum used in point-to-multipoint broadband networks, meaning it can cover four times the area from a given tower. The spectrum is also clear of any federal or military encumbrance – something that restricts other spectrum like CBRS. The spectrum is also being used for cellular purposes internationally, which makes for an easy path to sourcing the radios and receivers needed to use it.
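The coverage claim follows from simple geometry: a tower’s footprint is roughly a circle, so doubling the reach quadruples the area. A minimal sketch, where the 5-mile radius is just an illustrative number, not an engineering figure:

```python
# Why doubling propagation distance quadruples coverage area.
import math

def coverage_area_sq_miles(radius_miles: float) -> float:
    """Area of the circular footprint around a single tower."""
    return math.pi * radius_miles ** 2

base = coverage_area_sq_miles(5)      # e.g., a 5-mile reach on other spectrum
doubled = coverage_area_sq_miles(10)  # 12 GHz reaching twice as far

print(round(doubled / base, 6))  # → 4.0, four times the area per tower
```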

In the current fight, Starlink wants exclusive use of the spectrum, while wireless carriers say that both sides can share the spectrum without much interference. These are always the hardest fights for the FCC to figure out because most of the facts presented by both sides are largely theoretical. The only true way to find out about interference is in real-world situations – something that is hard to simulate any other way.

A few wireless ISPs are already using the 12 GHz spectrum. One is Starry, which has recently joined the 12 GHz Coalition, the group lobbying for terrestrial use of the spectrum. The coalition also includes other members like Dish Network, various WISPs, and the consumer group Public Knowledge. Starry is one of the few wireless ISPs currently using millimeter-wave spectrum for broadband. The company added almost 10,000 customers to its wireless networks in the second quarter and is poised to grow a lot faster. If the FCC opens the 12 GHz spectrum to all terrestrial uses, it seems likely that the spectrum would quickly be put to use in many rural areas.

As seems usual these days, both sides in the spectrum fight say that the other side is wrong about everything they are saying to the FCC. This must drive the engineers at the FCC crazy since they have to wade through the claims made by both sides to get to the truth. The 12 GHz Coalition has engineering studies that show that the spectrum could coexist with satellite usage with a 99.85% assurance of no interference. Starlink, of course, says that engineering study is flawed and that there will be significant interference. Starlink wants no terrestrial use of the spectrum.

On the flip side, the terrestrial ISPs say that the spectrum in dispute is only 3% of the spectrum portfolio available to Starlink, and the company has plenty of bandwidth and is being greedy.

I expect that the real story is somewhere in between the stories told by both sides. It’s these arguments that make me appreciate the FCC technical staff. It seems every spectrum fight has two totally different stories defending why each side should be the one to win use of spectrum.

Are BEAD Grants Large Enough?

One of the biggest questions associated with the $42.5 billion BEAD grant program is whether that is enough money to solve the national rural digital divide. The funding will be allocated to states in a three-step process. First, each state will get an automatic $100 million. Next, $4.2 billion will be directly allocated to states using the relative percentage of locations in each state defined as unserved and high-cost. This will rely on the new FCC maps, and the NTIA may still refine the definition of high-cost areas. The remaining $38.1 billion will also be allocated to states using the new FCC maps, based on the relative number of unserved locations in each state.
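The three-step allocation described above can be sketched as simple arithmetic. This is a simplification of the statutory formula, and the state shares in the example are made-up numbers for illustration only:

```python
# Simplified sketch of the three-step BEAD allocation described above.
# The unserved/high-cost shares below are hypothetical, not real data.

BASE_GRANT = 100e6          # automatic $100 million per state
HIGH_COST_POOL = 4.2e9      # allocated by share of high-cost unserved locations
REMAINING_POOL = 38.1e9     # allocated by share of unserved locations

def bead_allocation(unserved_share: float, high_cost_share: float) -> float:
    """Rough estimate of one state's total BEAD allocation, in dollars."""
    return (BASE_GRANT
            + HIGH_COST_POOL * high_cost_share
            + REMAINING_POOL * unserved_share)

# A hypothetical state with 2% of unserved and 1.5% of high-cost locations:
print(f"${bead_allocation(0.02, 0.015) / 1e6:.0f} million")  # → $925 million
```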

The funding works out to an average of around $850 million per state, but allocations will vary significantly. Preliminary estimates have a number of states getting only the minimum $100 million – Connecticut, Delaware, the District of Columbia, Hawaii, Maine, New Hampshire, North Dakota, Rhode Island, and Vermont. The largest allocations are estimated to go to Texas at $4.2 billion and California at $2.8 billion.

States have been doing the math to see if they think the BEAD grant funding will be enough to reach every rural household with good broadband. I’ve only been able to find one article that cites an estimate of the effectiveness of the BEAD grants, but this one example raises some good questions.

The State of Minnesota is estimated to receive about $650 million in BEAD grant funding. In March of this year, the State Legislature approved $110 million for the existing Border-to-Border grant program, with most of the funding coming from federal ARPA funding given to the state. At that time, the State broadband office estimated that the state will need around $1.3 billion in total grant funding to reach everybody in the state. If that is a good estimate, then even after BEAD grants and the $110 million State grants, the state will be $540 million short.
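The shortfall arithmetic from the Minnesota estimates works out as follows:

```python
# The Minnesota shortfall arithmetic from the paragraph above, in dollars.
needed = 1.3e9           # state broadband office estimate of total grant need
bead_estimate = 650e6    # estimated BEAD allocation
state_grants = 110e6     # Border-to-Border funding approved in March

shortfall = needed - bead_estimate - state_grants
print(f"${shortfall / 1e6:.0f} million short")  # → $540 million short
```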

This raises a lot of questions. First, inflation has hit the broadband industry hard, and I’ve seen a lot of estimates that the cost to build broadband networks is between 15% and 25% higher than just two years ago. That means that the $42.5 billion in BEAD funding is not going to stretch nearly as far as was estimated when Congress established the BEAD grants. This also raises the question of how much inflation will further increase costs over the years it’s going to take to build BEAD-funded networks. It’s not hard to imagine BEAD networks still being constructed in 2026 and beyond.

I’ve also seen estimates that the rules established by Congress and the NTIA for the BEAD grants could add as much as another 15% to the cost of building broadband networks compared to somebody not using grant funding. These extra costs come from a variety of factors, including the requirement to pay prevailing wages, expensive environmental studies that are not undertaken for non-grant projects, the requirement of getting a certified letter of credit, etc. The extra grant-related costs and the general inflation in the industry might mean that BEAD projects could cost 30% or more above what the same networks would have cost to build two years ago without grant funding.
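The 30%-plus figure follows from compounding the two cost pressures. A quick sketch, using the midpoint of the quoted inflation range (both percentages are the estimates cited above, not measurements):

```python
# Rough compounding of the two cost pressures described above.
construction_inflation = 0.20   # midpoint of the 15%-25% industry estimates
grant_rule_overhead = 0.15      # estimated extra cost of BEAD grant compliance

# The two factors multiply rather than add, since the compliance overhead
# applies to the already-inflated construction cost.
total_increase = (1 + construction_inflation) * (1 + grant_rule_overhead) - 1
print(f"{total_increase:.0%}")  # → 38%
```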

This also raises an interesting question about how states allocated ARPA funding to broadband. Minnesota’s allocation of $110 million to broadband from ARPA is smaller than what many other states have done. As an example, my state of North Carolina allocated nearly $1 billion of the state’s ARPA money to broadband, and there are many states that have allocated $300 million or more to broadband. Part of the blame for a state like Minnesota not having enough money to reach everybody could be placed on the Legislature for not allocating much ARPA funding for broadband.

Another interesting question to be addressed is how State broadband offices will deal with areas where a 75% grant is not enough for an ISP to make a business case. From the feasibility work I’ve been doing this year, I think there are a lot more areas that fit the high-cost category than might be expected. The NTIA says that it might allow exceptions for grants up to 100% of the cost of assets – but asking for extra funding will probably open up the possibility for a State to instead fund less costly technologies. Finding solutions for the many high-cost areas might turn out to be the unpredictable wild card in the BEAD grant process.

Finally, there are going to be areas where a State doesn’t make a BEAD grant award. It’s not hard to imagine a situation where only one ISP asks to serve an area, and a State broadband office decides that the ISP is unqualified to receive funding.

If the Minnesota estimate is even roughly accurate, it’s likely that Minnesota won’t be the only state that doesn’t receive enough BEAD money to get broadband to everybody. We’re not going to know this for sure until ISPs start applying for grants, but it won’t be a surprise if the BEAD grants are not large enough.

Congressional Push for a National Broadband Strategy

In August, a bill to align the federal government’s efforts related to broadband was advanced through the Senate Committee on Commerce, Science, and Transportation. The bill was co-sponsored by Senators Roger Wicker, R-Mississippi, and Ben Ray Luján, D-New Mexico, and Representatives Tim Walberg, R-Michigan, and Peter Welch, D-Vermont.

The bill, S.4767, is titled the Proper Leadership to Align Networks (PLAN) for Broadband Act. The legislation is based upon a report earlier this year from the Government Accountability Office that determined that federal broadband efforts are fragmented and overlapping. The bill proposes that the President develop a national broadband strategy to better align the federal broadband effort.

There is no question that national broadband policy is fragmented. We have an FCC that is ostensibly in charge of broadband policy but that essentially washed its hands of broadband regulation under past Chairman Ajit Pai. The FCC has been in charge for years of tracking the state of broadband in the country and completely botched that task through an inadequate mapping process that allowed ISPs to report whatever they wanted about broadband coverage. For much of the last few decades, the feeling in DC has been that the FCC is in the pocket of the giant ISPs the agency is supposed to be regulating.

Congress gave responsibility for the giant BEAD grant program to the NTIA, largely because Congress didn’t trust the FCC to administer the grant program. But the NTIA doesn’t have a lot of authority outside of the grant program. When the BEAD grants are behind us, the NTIA will fade into obscurity again in terms of national broadband policy.

The latecomer to the game is the FTC. The FCC handed some authority to the FTC when it abandoned broadband regulation. But the FTC mostly only prosecutes individual ISPs for bad behavior and has no authority to impose any regulation on all ISPs.

This bill is asking the Executive branch to take a shot at fixing federal broadband dysfunction through the creation of a broadband plan. I guess this plan would be aimed at discussing how to put broadband regulation back together again to have a cohesive federal policy. If you’ve read my blog for years, you know how I feel about broadband plans. They are only as good as any follow-through on the recommendations made. The decade-old national broadband plan was as near as you could get to a total bust – not because it didn’t include good recommendations, but because it was put on the shelf and quickly forgotten.

It’s hard to think that a new broadband plan, even one coming from this legislation, would fare any better than the last one. It will likely be a document with a few good ideas – but ideas that are softened to appease the many parties with input to the plan. It’s hard to imagine a new federal broadband plan going anywhere but on the shelf, as in the past.

I find it almost humorous that Congress would ask the White House to come up with the plan on how to fix the national broadband administration and regulation. The White House has almost zero power to implement any ideas the plan might suggest.

The one government entity that can create a coherent broadband plan is Congress. Congress writes the rules that direct how the FCC operates and could change the direction of the FCC overnight. Congress is also the body that gave the NTIA its strong current role in setting national broadband policy through the grant process and could expand that role if desired.

If Congress wants a coherent broadband policy, it needs to do nothing more than go into a room and write it. This Act is a way for Congress to pretend to be addressing broadband without actually doing so. If nothing happens after the creation of a newly written broadband plan, Congress can blame the White House.

The reality is that there are not enough votes in Congress to pass a new Telecommunications Act, which is what is needed to put national telecom policy back on track. There have obviously not been enough votes over the last decade to make any drastic changes to telecom policy. The large ISPs have bought enough influence in both parties to sidetrack any attempt by the federal government to try to regain the reins of broadband policy.

There is no telling if this particular legislation has enough legs to get to a floor vote – but it’s the kind of legislation that could garner enough votes from both parties to pass since the outcome threatens nobody.

Averting a Mapping Disaster?

Alan Davidson, the head of the National Telecommunications and Information Administration, recently announced that the agency is canceling plans to use the first iteration of the new FCC maps that the FCC says will be available by early November. Davidson says that he feels obligated to let the FCC’s challenge process play out before using the mapping data. I’m sure this wasn’t an easy decision, but it signals that it’s better to hold out for a more accurate map than to settle for the first iterations of the new FCC maps.

This decision will clearly add more time and delay to the $42.5 billion BEAD grant program. But the decision to wait recognizes that using incorrect maps would almost inevitably mean lawsuits that could delay the grant program even longer.

The timing of the new maps became unfortunate when Congress mandated that the FCC maps must be used to allocate over $38 billion in grant funding to states. The FCC has been stating all summer that it hopes that the new maps will be relatively accurate and will fix many of the obvious problems in the current broadband maps. If it weren’t for the pressure of the BEAD grant program, the FCC would have had several cycles of the new maps to smooth out kinks and errors in the reporting before it had to bless the new maps as solid. The NTIA decision to delay relieves the pressure to have the first set of maps be error-free – which nobody believes will happen. I have a hard time recalling any cutover of a major government software system that was right the first time, and the FCC’s assurances all summer have felt more like bravado than anything else.

Over the last few weeks, I’ve been talking to the engineers and other folks who are helping ISPs with the new maps. I didn’t talk to anybody who thinks the new maps will be solid or accurate. Engineers are, by definition, somewhat cautious folks, but I expected to find at least a few folks who thought the new maps would be okay.

I’ve been saying for six months that the likelihood of the new maps being accurate is low, and I was thinking about not writing anything more about mapping until we see what the new maps produce. However, I was prompted to write about mapping again when I saw a headline in FierceTelecom that quoted Jonathan Chambers of Conexon saying that the new maps will be a train wreck. Conexon is working with electric cooperatives all across the country to build broadband networks, which gives the company an interesting perspective on rural issues.

Jonathan Chambers cites two reasons for pessimism. One is the reason I already mentioned, which is that it’s irrational to use the outputs of a new federal mapping system to allocate billions of dollars between states. He says that there are simpler alternatives that would take all of the pressure off the new mapping system. He’s right, but unfortunately, Congress specifically required in the IIJA legislation that the FCC maps be used. It would take an act of Congress to change that requirement.

Chambers is also pessimistic about the challenge process that is being allowed for the new maps. He expects the challenges to be major and ongoing. It seems unlikely that the FCC is prepared to investigate the huge number of protests that could come from every corner of the country claiming that the new maps got the facts wrong.

My discussions with engineers raised other questions not mentioned by Chambers. Some engineers told me that the underlying mapping fabric has a lot of mistakes. This is where CostQuest, the firm that created the new mapping system, laid out the location nationwide of every possible broadband customer. This was a nearly impossible task in the short time the company had to create the maps. I’ve been working for years with local governments that use GIS data to define potential broadband locations, and it’s always a challenge to identify only those buildings where somebody might buy broadband and exclude buildings used for some other purpose.

My biggest concern is that ISPs are still allowed to report marketing speeds instead of actual speeds, and I fear that ISPs will be motivated to overstate broadband speeds in the new maps (like many have done in the old ones). Any areas designated by the maps as already having broadband available at 100/20 Mbps will be declared ineligible for the BEAD grants, and any ISP that wants to protect against being overbuilt has a high motivation to claim that speed – it seems likely that many of them will do so. I don’t know if this is true, but my interpretation of the FCC map challenge is that the FCC won’t entertain challenges based on speed, but only on the coverage area. If that is true, there will be a huge uproar from states and communities that are disadvantaged by deceptive reporting from ISPs.
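The eligibility rule described above reduces to a simple test. A minimal sketch, where the function name and the hypothetical locations are my own illustration of the 100/20 Mbps cutoff, not an official NTIA formula:

```python
# Sketch of the BEAD eligibility rule: a location already reported as served
# at 100/20 Mbps or better is ineligible for BEAD grant funding.
def bead_eligible(download_mbps: float, upload_mbps: float) -> bool:
    """True if the reported speeds leave a location eligible for BEAD."""
    return download_mbps < 100 or upload_mbps < 20

print(bead_eligible(25, 3))     # → True: underserved, still grant-eligible
print(bead_eligible(100, 20))   # → False: reported as served, ineligible
```

This is exactly why an ISP motivated to block overbuilding would claim 100/20 Mbps marketing speeds: one overstated report flips a location from eligible to ineligible.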

I’ve also heard from ISPs in the last week who were unable to navigate the new mapping system by the deadline. These are relatively small ISPs, but many of them have built fiber, and it’s not good to have them excluded from the maps. I’ve heard from multiple sources that the new mapping system is not easy to use. I’ve heard from ISPs who didn’t have an engineer who was able to certify the maps and just gave up.

I guess we’ll find out in a few months how the first draft of the maps turns out. The FCC says it will release the results by early November. I expect there are a whole lot of folks who are poised to compare the new maps to their local knowledge of actual broadband usage – and then the challenges will begin.

Satellite Cell Service

T-Mobile and Starlink recently made a joint announcement about an arrangement in which Starlink will enable voice and texting capabilities for T-Mobile cellphones by the end of 2023. This is a service that would work with existing cell phones and would supposedly kick in when a phone can’t find a signal from a cell tower. Starlink said the technology would be enabled by new satellites that have significantly larger antennas than the current satellites in the constellation. In the press release, Elon Musk touted this as being able to reach people lost in the wilderness, but the much bigger use will be to fill in cellular coverage in rural areas for T-Mobile.

While the two companies made a big splashy announcement about the arrangement, they are late to the game as other industry players already have similar plans underway.

AST SpaceMobile has been working on deploying satellites aimed specifically at the cellular market. The company plans to launch its first five satellites in 2024. The company’s business plan is to launch fairly large satellites weighing over 3,300 pounds to create a constellation dedicated to cellular coverage. The company has already created partnerships with more than 25 mobile operators around the world, including the giant cellular company Vodafone.

Lynk is taking a different approach and will launch small satellites around the size of a pizza box. The company has one test satellite in orbit, with another scheduled for this December. The company plans to have 50 satellites in orbit by the end of 2023. Lynk already has 14 commercial agreements in place and will support large corporations and governments as well as mobile providers.

Just yesterday, Apple announced that it will offer a texting service for those lost in the wilderness in a partnership with Globalstar. This service is going to be text only and is going to be exceedingly slow, but it will supposedly work for folks who have the latest iPhone and who also are able to point the phone directly at the satellite. There will be an app that will tell a user where the satellite can be found.

All of these plans raise a lot of questions that we won’t get answered until somebody has a working satellite product. For example, could somebody inside a vehicle connect to a satellite? I have no problem connecting to the Sirius XM satellite service, so this might not be a problem. Will these connections somehow roam and connect back to cellular carriers when the user is in reach of a cell tower? That would be really complicated, and my guess is that this won’t work. Mike Sievert, the CEO of T-Mobile, said this project is like putting a cell site in the sky, but much harder – and I believe him. I’ve been trying to picture how the satellites will pick out the right calls, because filtering through the many billions of cellphone calls to find the right ones sounds like a huge data processing challenge.

The service would certainly be a boon to somebody lost in the woods, but it is also a much-needed service for a lot of people. My consulting firm does surveys, and it’s not unusual to find rural counties today where 30% or more of homes say they have no cellular coverage at their homes. The national coverage maps of the big cellular companies are a joke in many rural places.

T-Mobile and Starlink said that these connections would be only for voice calls and texting at first but that using cellular data might be on the horizon. That would be a significant accomplishment since a receiver many times larger than a cell phone is needed today to communicate with a satellite.

The real potential for this product is not in the U.S. and Europe where a large percentage of folks can connect today to cellular networks. The real market is the many parts of the world where modern cellular towers are a rarity. Most Americans probably don’t understand or appreciate that there is still a lot of the world where folks are not connected, or perhaps only connected through one universal connection that is shared by a whole community.

More WiFi Spectrum

There is more WiFi spectrum on the way after the U.S. Court of Appeals for the District of Columbia rejected a legal challenge from the Intelligent Transportation Society of America and the American Association of State Highway and Transportation Officials, which had asked the court to vacate the FCC’s 2020 order repurposing some of the spectrum that had been reserved for smart cars.

The spectrum is called the 5.9 GHz band and sits between 5.85 GHz and 5.925 GHz. The FCC had decided to allocate the lowest 45 MHz of spectrum to WiFi while allowing the upper 30 MHz to remain with the auto industry.
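The band math is easy to check. A quick sketch with frequencies in MHz, where the 5,895 MHz dividing line follows from allocating the lowest 45 MHz of the band to WiFi:

```python
# Sanity check on the 5.9 GHz band split described above (all values in MHz).
BAND_START, BAND_END = 5850, 5925   # the 5.9 GHz band edges
WIFI_SLICE = 45                     # lower portion reallocated to WiFi
AUTO_SLICE = 30                     # upper portion kept for the auto industry

# The two slices exactly tile the 75 MHz band.
assert BAND_END - BAND_START == WIFI_SLICE + AUTO_SLICE

wifi_band = (BAND_START, BAND_START + WIFI_SLICE)  # 5850-5895 MHz
auto_band = (BAND_START + WIFI_SLICE, BAND_END)    # 5895-5925 MHz
print(wifi_band, auto_band)  # → (5850, 5895) (5895, 5925)
```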

The process will now begin to make the transition to WiFi. The FCC had originally given the auto industry a year to vacate the lower 45 MHz of spectrum. The FCC is likely going to have to set a new timeline to mandate the transition. The FCC also needs to rule on a waiver from the auto industry to redeploy technology using the Cellular Vehicle-to-Everything (C-V2X) technology from the lower to the higher frequency band. This is the technology that most of the industry is using for testing and deploying self-driving vehicles.

The lower 45 MHz of the new spectrum sits adjacent to the existing WiFi 5.8 GHz spectrum. Combining the new spectrum with the existing band is a boon to WISPs, which now get a larger uninterrupted swath of spectrum for point-to-multipoint broadband deployment. During the early stage of the pandemic, the FCC gave multiple WISPs the ability to use the 5.9 GHz spectrum on a trial basis for 60 days, and many of them have been regularly renewing those temporary licenses since then.

When the FCC announced the resolution of the lawsuit, the agency issued a press release discussing the benefits touted by WISPs using the new spectrum. Some of them claimed to see a 40% to 75% increase in throughput bandwidth. This was mostly due to less congestion on this spectrum, which is rarely used – there was little or no interference during the last year. The spectrum also provided a clear path for wireless backhaul between towers. Of course, once this is made available to all WISPs, it’s likely that much of this benefit will disappear as everybody starts vying to use the new spectrum. But it is an increase in bandwidth potential, and that has to mean higher-quality wireless signals.

This spectrum will also be available for home WiFi. However, it takes a lot longer for the home WiFi industry to respond to new spectrum. It means upgrading home WiFi routers but also adding the capability to use the spectrum to the many devices in our homes and offices that use WiFi. Everything I’m reading says that we are still years away from seeing widespread use of the 6 GHz WiFi spectrum, and this new bandwidth will likely be rolled out at the same time.

This was an interesting lawsuit for several reasons. First, the entities filing the court suit challenged the FCC’s ability to change the use of spectrum in this manner. The court decision made it clear that the FCC is fully in the driver’s seat in terms of spectrum allocation.

This was also a battle between two large industries. The FCC originally assigned this spectrum to the auto industry twenty years ago. But the industry was slow to adopt any real-world uses of the spectrum, and it largely sat idle, except for experimental test beds. There is finally some movement toward deploying self-driving cars and trucks in ways that use the spectrum. But even now, there is still a lot of disagreement about the best technology to use for self-driving vehicles. Some favor the smart road that uses spectrum to communicate with vehicles, while the majority opinion seems to favor standalone smart-driving technology in each vehicle.

Between this order and the 6 GHz spectrum, the FCC has come down solidly in favor of having sufficient WiFi spectrum going into the future. It’s clear that the existing bands of WiFi are already heavily overloaded in some settings, and the WiFi industry has been successful in getting WiFi included in huge numbers of new devices. I have an idea that we’ll look back twenty years from now and say that these new WiFi spectrum bands are not enough and that we’ll need even more. But this is a good down payment to make sure that WiFi remains vigorous.

Streaming Video Continues to Grow

I saw recent headlines claiming that the time people spend watching streaming content is now greater than all of the time spent watching content from cable companies. A deeper look at the underlying statistics shows that this isn’t entirely true, but it makes for a great headline. Still, it’s news that the percentage of viewing done through streaming continues to grow while the number of traditional cable customers continues to plunge.

Let’s start with the July 2022 numbers. These statistics come from Nielsen, which has been tracking viewership of content for decades and publishes monthly reports on the video industry. The best-known statistics coming from Nielsen are the ranking and popularity of various TV shows. Nielsen says that July streaming was up significantly, partially due to a surge in the viewing of the newly released Stranger Things 4.

              July 2021    July 2022
Broadcast        23.8%        21.6%
Cable            37.7%        34.4%
Streaming        28.3%        34.8%
Other            10.1%         9.2%

The statistics show that the total time spent watching streaming content was greater than the time spent watching all content through traditional cable TV connections. The reason this isn’t completely accurate is that the “Other” category includes some viewing that also comes from cable companies, such as streaming through cable set-top boxes.

Nielsen counts streaming through a cable set-top box as “Other” because the company has no visibility into the identity of such content. An example would be somebody reaching a streaming service like Netflix through a Comcast set-top box. Such content is hard to classify because it all comes from a traditional cable company, but much of it is also streaming.

But the nuances of the numbers aren’t what matters as much as the trend. It’s clear over time that the percentage of time watching streaming content is growing while the use of traditional cable TV viewing is waning.

One explanation for this is the shrinking number of traditional cable customers. According to Leichtman Research Group, the number of customers of the largest cable companies dropped from 70.7 million in June 2021 to 65.0 million in June 2022. That’s an 8% drop in total cable customers, which aligns well with the Nielsen numbers above, which show a 9% drop in cable’s share of viewing.
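As a quick sanity check that those two declines really do line up, here is a minimal Python sketch using only the figures quoted above:

```python
# Leichtman: traditional cable customers of the largest companies (millions)
cable_2021, cable_2022 = 70.7, 65.0
subscriber_drop = (cable_2021 - cable_2022) / cable_2021

# Nielsen: cable's share of total viewing time (percent of all viewing)
share_2021, share_2022 = 37.7, 34.4
viewing_share_drop = (share_2021 - share_2022) / share_2021

print(f"Subscriber drop: {subscriber_drop:.1%}")      # 8.1%
print(f"Viewing-share drop: {viewing_share_drop:.1%}")  # 8.8%
```

The two rates of decline differ by less than a percentage point, which is what makes the subscriber-count explanation plausible.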

All of this means there is still a lot of video content that will eventually migrate to the Internet. Traditional cable still carried 34.4% of all viewing in July 2022, and another 21.6% of viewing was through broadcast (rabbit ears). Together, those still represent well over half of all video viewing.

The Nielsen numbers and trends suggest that broadband networks will continue to see increased demand from video that will continue to shift to the web. While video viewing is still the largest single use of the web, many other uses of broadband are still growing rapidly. In recent years we’ve moved a lot of the software we use daily to the cloud. Web-connected security cameras are becoming ubiquitous. Practically every device we buy for the home now talks to the Internet. I rarely had video calls before the pandemic, and it’s now a routine part of my workday.

Another RDOF Auction?

There was a recent interview in FierceTelecom with FCC Commissioner Brendan Carr that covered a number of topics, including the possibility of a second round of RDOF. Commissioner Carr suggested that improvements would need to be made to RDOF before making any future awards, such as more vetting of participants upfront or weighting technologies differently.

The FCC is building up a large potential pool of broadband funding. The original RDOF was set at $20 billion, with $4.4 billion set aside for a second reverse auction, along with whatever was left over from the first auction. The participants in the first RDOF auction claimed only $9.2 billion of the $16 billion available, leaving $6.8 billion. When the FCC recently decided not to fund LTD Broadband and Starlink, the leftover funding grew by another $2 billion. Altogether, that means over $13 billion left in funds that were intended for RDOF.
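The leftover-funding arithmetic can be sketched from the figures above (note that the $2 billion for the LTD Broadband and Starlink denials is an approximation, not an exact figure):

```python
# All figures in $ billions, as quoted in the paragraph above
phase_one_budget  = 16.0   # original RDOF Phase I reverse auction budget
phase_one_claimed = 9.2    # winning bids actually claimed in Phase I
phase_two_reserve = 4.4    # set aside for a second reverse auction
defaulted_awards  = 2.0    # approximate LTD Broadband + Starlink denials

unclaimed = phase_one_budget - phase_one_claimed   # 6.8
leftover = unclaimed + phase_two_reserve + defaulted_awards
print(f"Unspent RDOF-intended funding: ~${leftover:.1f} billion")  # ~$13.2 billion
```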

We also can’t forget that around the same time as RDOF, the FCC had planned a 5G Fund to enhance rural cellular coverage. Due to poor mapping and poor data from the cellular carriers, that auction never occurred. That puts the pool of unused funding at the FCC at over $20 billion, plus whatever new funding might have accrued during the pandemic. That’s a huge pool of money, equal to roughly half of the giant BEAD grants.

The biggest question that must be asked before considering another RDOF reverse auction is how the country will be covered by the BEAD grants. It would be massively disruptive for the FCC to try to inject more broadband funding until that grant process plays out.

Commissioner Carr said that some of the FCC’s funding could go to enhance rural cellular coverage. Interestingly, once BEAD grant projects are built, doing so will cost a lot less than originally estimated. A lot of the money in the proposed 5G Fund would have been used to build fiber backhaul to reach rural cell sites, and I think the BEAD last-mile networks will probably reach most of those places without additional funding. However, there is probably still a good case to be made for funding more rural cell towers.

But there are larger questions involved in having another reverse auction. The big problem with the RDOF reverse auction was not just that the FCC didn’t screen applicants first, as Carr and others have been suggesting. The fact is that a reverse auction is a dreadful mechanism for awarding broadband grant money. A reverse auction is always going to favor lower-cost technologies like fixed wireless over fiber – it’s almost impossible to weight different technologies for an auction in a neutral way. It doesn’t seem like a smart policy to give federal subsidies to technologies with a 10-year life versus funding infrastructure that might last a century.

Reverse auctions also take state and local governments out of the picture. The upcoming BEAD funding has stirred hundreds of communities to get involved in the process of seeking faster broadband. I think it’s clear that communities care about which ISP will become the new monopoly broadband provider in rural areas. If the FCC has a strict screening process up front, then future RDOF funding will only go to ISPs blessed by the FCC – and that probably means the big ISPs. I would guess that the only folks lobbying for a new round of RDOF are companies like Charter and the big telcos.

The mechanism of awarding grants by Census block created a disaster in numerous counties where RDOF was awarded in what are best described as Swiss cheese serving areas. The helter-skelter nature of the RDOF coverage areas makes it harder for anybody else to put together a coherent business plan to serve the rest of the surrounding rural areas. In contrast, states have been doing broadband grants the right way by awarding money to coherent, contiguous serving areas that make sense for ISPs instead of the absolute mess created by the FCC.

A reverse auction also relies on having completely accurate broadband maps – and until the FCC makes ISPs report real speeds instead of marketing speeds, the maps are going to continue to be fantasy in a lot of places.

Finally, the reverse auction is a lazy technique that allows the FCC to hand out money without having to put in the hard effort to make sure that each award makes sense. Doing grants the right way requires people and processes that the FCC doesn’t have. But we now have a broadband office and staff in every state thanks to the BEAD funding. If the FCC is going to give out more rural broadband funding, it ought to run the money through the same state broadband offices that are handling the BEAD grants. These folks know local conditions and know the local ISPs. The FCC could set overall rules about how the funds can be used, but it should let the states pick grant winners based upon demonstrated need and a viable business plan.

Of course, the simplest solution of all would be for the FCC to cut the USF rate and stop collecting Universal Service Fund revenues from the public. The FCC does not have the staff or skills needed to do broadband grants the right way. Unfortunately, that might not stop the FCC from tackling something like another RDOF auction so it can claim credit for solving the rural digital divide. If the FCC plans another RDOF auction, I hope Congress stops it from being foolhardy again.

Traditional Cable in Less than Half of Households

Leichtman Research Group recently released the customer counts for the largest providers of traditional cable service at the end of the second quarter of 2022. LRG compiles most of these numbers from the statistics reported to stockholders, except for Cox, which is privately held, so its numbers are estimated. Leichtman says this group of companies represents 96% of all traditional U.S. cable customers.

The traditional cable providers continue to lose customers at a torrid pace, losing over 1.65 million customers in the second quarter, up from 1.4 million the previous quarter. Overall, the traditional cable providers lost more than 18,000 customers every day during the quarter.

The big news for the quarter is that traditional cable providers are now in less than half of homes and have collectively dropped to a 49% market penetration. The industry has lost almost seventeen million customers since the end of 2017, when traditional cable was in over 73% of homes.

                        2Q 2022       Change    % Change
Comcast              17,144,000    (520,000)       -2.9%
Charter              15,495,000    (226,000)       -1.4%
DirecTV              13,900,000    (400,000)       -2.8%
Dish Network          7,791,000    (202,000)       -2.5%
Verizon               3,479,000     (87,000)       -2.4%
Cox                   3,230,000     (80,000)       -2.4%
Altice                2,574,200     (84,500)       -3.2%
Mediacom                540,000     (15,000)       -2.7%
Frontier                343,000     (20,000)       -5.5%
Breezeline              332,312      (6,709)       -2.0%
Cable ONE               221,000     (17,000)       -7.1%
   Total             65,049,512  (1,658,209)       -2.5%
Hulu Live             4,000,000    (100,000)       -2.4%
Sling TV              2,197,000     (55,000)       -2.4%
FuboTV                  946,735    (109,510)      -10.4%
Total Cable          39,536,512    (949,209)       -2.3%
Total Telco/Satellite 25,513,000   (709,000)       -2.7%
Total vMvPD           7,143,735    (264,510)       -3.6%
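The headline numbers above can be cross-checked with a few lines of Python. This is just a sketch of the arithmetic; the 91-day quarter length (April through June) is my own calculation, not a figure from Leichtman:

```python
# Q2 2022 net subscriber losses by traditional provider (from the table above)
traditional_losses = {
    "Comcast": 520_000, "Charter": 226_000, "DirecTV": 400_000,
    "Dish Network": 202_000, "Verizon": 87_000, "Cox": 80_000,
    "Altice": 84_500, "Mediacom": 15_000, "Frontier": 20_000,
    "Breezeline": 6_709, "Cable ONE": 17_000,
}

total_loss = sum(traditional_losses.values())
days_in_q2 = 30 + 31 + 30  # April + May + June = 91 days

print(f"Total Q2 loss: {total_loss:,}")                     # 1,658,209
print(f"Average daily loss: {total_loss // days_in_q2:,}")  # 18,222
```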

It doesn’t look like people are replacing traditional cable with an online alternative like Hulu Live, Sling TV, or FuboTV – those services collectively lost almost 265,000 customers in the quarter. A few major online alternatives like YouTube TV aren’t on the list, but the loss in traditional cable far surpasses any possible net gain for the online cable alternatives.

Charter is still losing customers at a slower rate than everybody else in the industry, as it has for the past several years – although Charter’s losses are starting to climb. Charter CEO Tom Rutledge says that Charter actively points out to customers that the online alternatives cost more. The rest of the industry seems resigned to letting cable customers go.

The biggest percentage losers continue to be Frontier and Cable ONE.