Why No New FCC Commissioner?

I’ve put off writing a blog on this question since early summer. For some perplexing reason, the Biden administration has not yet named a nominee to fill the vacant fifth seat on the FCC and has not named a new FCC Chairman. It’s perplexing because the President called for expanded broadband both before and after his election.

To those who don’t follow the FCC closely, a vacant seat matters because many FCC issues are decided along party lines, and it’s been traditional for every president to appoint a fifth commissioner from his own party. What makes the inaction even more perplexing is that the term of current Acting Chair Jessica Rosenworcel expired this summer, and she will be forced to leave at the end of the year unless the White House acts to renominate her. If she leaves, the agency would have only three commissioners, with a Republican majority.

One of the reasons that new presidents often act quickly to name a new Chairman is the slow, grinding process at the FCC that must be navigated to make any change to policy. President Trump named the past Chairman, Ajit Pai, in January 2017, and it took until December of that year for the FCC to repeal net neutrality. That’s almost a world speed record in terms of the FCC enacting a new policy.

By not filling the fifth FCC seat, the agency has spent the year since the election with two Democrats and two Republicans. The FCC still had a busy year, mostly due to pandemic issues, tackling robocalling, and the ongoing work to revise the use of spectrum. But no new policy questions have been raised. By losing a year, it will now take until the end of the second year of the administration to see any changes in broadband policy. As you might imagine, this lack of action is driving policy wonks crazy.

Washington D.C. is rife with rumors on who might eventually be named to the FCC. I’m not naming any names, because what I’ve seen in the past is that almost every nomination to the FCC has been a surprise. Nobody saw Tom Wheeler being named as FCC Chairman under President Obama. As might be expected, there are names being suggested by different factions of the Democratic Party.

Part of me is not unhappy with a balanced FCC that sticks to the basics. For example, a balanced FCC is not as likely to give out billions in subsidies to the big ISPs. But there are policy issues that really need to be tackled.

Chief among them is the question of whether broadband should be regulated. This is an industry that cries out for regulation. Broadband is one of the key drivers of the U.S. economy. The industry is ruled by huge carriers with immense market power, with the top four ISPs serving over three-quarters of all broadband customers in the country. This is the textbook example of an oligopoly that must be regulated. For some reason, we keep talking about how Ajit Pai got rid of net neutrality when what he really did was eliminate all but a few vestiges of broadband regulation. If we aren’t going to bother regulating broadband, then perhaps we don’t need the FCC at all.

The broadband industry is the only major industry that has no obligation to disclose the details of the products sold to customers. Big ISPs are free to make any claims they want about speeds, latency, and quality of broadband. Big ISPs are free to overbill customers and refuse to correct mistakes. There is no regulatory body able to scold them, much less demand more disclosure and better behavior.

I’ve held off asking this question all year, assuming that an announcement must be imminent. As these things always work out, an announcement will probably hit on the afternoon that I post this blog. It can’t be soon enough.

T-Mobile Home Broadband

T-Mobile has started to aggressively roll out its 5G Home Internet Service that provides home broadband from the company’s cellular towers. The 5G name is a misnomer because this still uses the company’s 4G LTE network – but all of the big cellular companies are mislabeling cellular broadband as 5G.

The product stems from an agreement with the FCC made when it allowed T-Mobile to purchase Sprint in November 2019. T-Mobile agreed at that time to make fixed broadband available to 97% of the households in the country by November 2022 and 99% of households by November 2025. The pandemic put T-Mobile behind schedule, but in a recent investor call the company said that the product is now available to 30 million households. The company says it still intends to meet the 97% coverage goal by the end of 2022.

This month, T-Mobile dropped the price to $50 per month. T-Mobile is advertising the product at speeds that average around 100 Mbps download. T-Mobile doesn’t make any claims about upload speeds, but in searching reviews from customers, upload speeds seem to range between 10 and 15 Mbps.

Bandwidth is unlimited, but the product has some unusual caveats. The company says, “Home Internet is not intended for unattended use, automatic data feeds, automated machine-to-machine connections, or uses that automatically consume unreasonable amounts of available network capacity.” This would seemingly block broadband usage for functions like backing up data in the cloud or automatically uploading files at night to an office server. The terms of service don’t say if this is prohibited or just throttled.

The product is not for the sports fan with a giant 4K television. T-Mobile automatically throttles all video to a lower resolution. This also might not be attractive to gamers, with average latency between 30 and 40 ms – a little more lag than most cable company connections.

The T-Mobile modem is placed inside the home, so the speed will be whatever the signal can deliver after penetrating the outside walls. Customers report that coverage is best with the gateway placed close to a window facing a cellular tower. As with all cellular coverage, this won’t work for every home. Cities tend to have a lot of small cellular dead zones caused by hilly streets or neighboring homes that block the signal.

The broadband product uses the same network as T-Mobile cellphones, so data speeds will likely decrease in times of busy cellular usage – the terms of service state that cellphone usage gets priority over the fixed product.

It’s an interesting product and could be market disruptive. T-Mobile believes it will have 500,000 customers by the end of this year, which would make the company the 16th largest ISP. A year ago, T-Mobile said its goal was to have 7 million customers by the end of 2025, which would make it the fifth-largest ISP, just below Verizon.

In cities, this is the product that should displace DSL, which is increasingly being used by households that care most about saving money. Homes with multiple users and big bandwidth demands are probably not going to like the restrictions – or the upload speeds, which are about the same as with a cable modem connection.

The FCC forced this commitment on T-Mobile in hopes of improving rural broadband. It’s going to do that, and rural homes should start paying attention to see when the product comes to their local cell site. But the big issue in rural areas is going to be the distance that people live from a cell site. The broadband speeds will start dropping at about a mile from a cell site, will be weak after two miles, and will be non-existent at three miles. Even after T-Mobile puts this product on every cell site, it’s not going to reach 97% of homes. The FCC cellular coverage maps are even more of a joke than the landline broadband maps. I’ve worked in a lot of rural counties in recent years where cellular voice coverage is weak or non-existent, and the coverage for the broadband product will be even smaller. But even a home two miles from a cell site might love this product if it can deliver 25 Mbps and replace a DSL connection delivering a fraction of that.
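The falloff with distance follows from basic radio physics. As a rough sketch only – assuming T-Mobile uses its 600 MHz low-band spectrum, and ignoring the terrain, foliage, and building losses that make rural reality much worse – the standard free-space path loss formula shows how quickly the signal weakens:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Standard free-space path loss formula (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

# Assumed 600 MHz low-band spectrum; real-world loss is higher than free space.
for miles in (1, 2, 3):
    km = miles * 1.609344
    loss = free_space_path_loss_db(km, 600)
    print(f"{miles} mile(s) from the tower: {loss:.1f} dB of path loss")
```

Every doubling of distance costs roughly another 6 dB even before terrain gets in the way, which is why speeds fade so quickly past the first mile.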

There are similar products being deployed by AT&T and Verizon. For now, T-Mobile is the most aggressive in deployment, probably due to its agreement with the FCC. But cellular data is poised to enter our markets as a significant competitor.

The Future of Satellite Broadband

People ask me a lot about what Starlink means for somebody building a rural broadband network. That set me to contemplate the long-term prospects for LEO satellite broadband.

Today, the broadband provided by Starlink is a boon to rural subscribers who have had no alternatives. Hundreds of thousands of prospective customers have gotten onto the Starlink waiting list. It’s not hard to understand why when the rural broadband alternatives are extraordinarily slow rural DSL, high orbit satellite broadband, or cellular hotspots. A rural resident who gets Starlink broadband delivering 50 Mbps to 150 Mbps feels like they won the broadband lottery.

Assuming that Starlink and the other satellite providers can deliver the same decent broadband to larger numbers of users, the companies should do extremely well for the rest of this decade. That’s still a big if because there are hurdles for Starlink to overcome. It must find enough backhaul to fully support the satellite constellation – we know from other broadband technologies what poor backhaul means. There are spectrum battles at the FCC that might limit Starlink’s backhaul. Starlink also still has to launch many thousands of satellites and, according to Elon Musk, still needs $30 billion to finish the build.

But let’s assume that Starlink raises the money, launches the needed satellites, and gets the backhaul bandwidth needed to keep broadband speeds up. Looking out ten years, I foresee other issues that all of the satellite companies will face.

Landline Competition. Assuming that Congress gets its act together and passes a rural broadband bill, many of the natural rural markets for Starlink will be getting fiber. It’s hard to see satellite broadband keeping much market share if fiber can bring more broadband for a lower price. Fiber will likely never be everywhere in the country, but it could end up in most places. One only has to look at the Dakotas to see that there is a sustainable business plan for fiber-to-the-home if the initial fiber construction is subsidized. As the richer markets like the U.S., Canada, Australia, Europe, and the Far East get fiber, the potential market for selling satellite broadband will be much diminished. We’re not going to see rural fiber everywhere immediately, but in a decade, we might be seeing a lot of it.

Prices. I also wonder about Starlink’s ability to sell broadband outside of wealthy economies. I’ve read articles by several analysts who say that Starlink needs the $99 price everywhere to be viable. If so, then Starlink broadband is not a viable product in much of the world – there are not a lot of people in the third world who can pony up $99 per month for broadband.

Starlink also has to worry about competition from Jeff Bezos. One has to assume that Amazon will offer irresistible bundles of broadband tied to delivery and video services. If Starlink really needs a high price to succeed, it will be vulnerable to a ruthless competitor.

Growth in Broadband Demand. The issue that nobody talks about is the ever-growing demand for broadband. Since 2018, the national average monthly household use of broadband has grown from 215 gigabytes to 433 gigabytes. The growth has been on the same curve since the early 1980s. Even if the Starlink constellation can find enough backhaul to provide adequate speeds today, how will the technology keep up when demand in ten years will be ten times higher, or in twenty years will be a hundred times higher?
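A quick sketch of the arithmetic behind those multiples, using the household-usage figures above (215 GB per month in 2018 growing to 433 GB per month in 2021):

```python
# Household broadband usage grew from 215 GB/month (2018) to 433 GB/month (2021).
start_gb, end_gb, years = 215, 433, 3

# Implied compound annual growth rate: roughly 26% per year.
annual_growth = (end_gb / start_gb) ** (1 / years)

for horizon in (10, 20):
    multiple = annual_growth ** horizon
    print(f"In {horizon} years: roughly {multiple:.0f}x today's demand")
```

At that rate demand roughly doubles every three years, which is where the ten-times and hundred-times figures come from.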

Viasat, which uses geosynchronous satellites at 22,000 miles above the earth, handles the demand issue by imposing severe data caps. On Viasat, a plan with just 100 gigabytes of monthly usage costs $170, with a 2-year contract required. A household that is limited to 100 gigabytes per month is not participating in the modern broadband world. If low-orbit satellite providers have to implement data caps to remain viable, their days will be numbered.

Constant Need to Replace Satellites. One of the most unusual aspects of low-orbit satellites is that the satellites are expected to be replaced every five years or so. That means that a satellite company will face the giant cost of paying for rocket launches forever, just to stay in place. Even if Starlink grows to maturity, it will still have to constantly replace satellites, even though doing so brings in no new revenue. Any broadband technology suddenly feels a lot less attractive if the owner has to replace 20% of the network every year in perpetuity.

Summary. Assuming that the satellite companies actually raise the money, launch the needed thousands of satellites into orbit, and keep broadband speeds at a decent level, it’s not hard to picture the companies getting many millions of customers and even being profitable by the end of this decade. But the bigger long-term question is how a satellite company can sustain a business in a world where broadband demand is likely to outstrip the ability of satellites to deliver. The inability to keep up with demand is what ultimately killed DSL and is putting huge pressure today on cable companies to improve upload speeds. I think keeping up with demand will be the long-term Achilles heel for LEO satellites.

Rural Cellular Coverage

2021 is going to go into history as the year when the whole country finally started talking about rural broadband. The pandemic made it clear that many millions of households and even entire communities don’t have adequate broadband. Congress and the White House responded by funding billions of dollars for improved broadband in the ARPA legislation. We are perhaps edging closer to an infrastructure bill that will allocate tens of billions of additional dollars to fix a lot of the rural broadband divide.

An issue that seems to have fallen off the radar is rural cellular coverage. In 2020, the FCC halted its plans for a reverse auction to award a $9 billion 5G Fund intended to improve rural cellular coverage. The program was halted after it became apparent that Verizon and others were reporting inaccurate cellular coverage data to the FCC – much like the problem we have with FCC broadband maps. In July of this year, the FCC started another round of data collection, so the plans for the 5G Fund must still be in play.

The topic has come to my attention recently as my consulting firm was doing several surveys in rural counties. In these surveys, between 30% and 50% of survey respondents said that they had little or no cellular coverage at their homes. In all of these cases, the FCC data from cellular carriers showed ubiquitous coverage throughout each county.

To be fair, the FCC reporting for cellular coverage is outdated. In its annual report to Congress, the FCC reports the percentage of homes and businesses in every county that can receive 5/1 Mbps LTE coverage. Ookla recently reported that millions of speed tests show that the average national cellular download speed in early 2021 was 76.6 Mbps for AT&T, 82.4 Mbps for T-Mobile, and 67.2 Mbps for Verizon. The FCC is still reporting on a cellular speed that is far slower than what a majority of cellular customers can receive – much in the manner that it keeps focusing on 25/3 Mbps as the definition of broadband.

But it’s really troublesome when the FCC reports to Congress that whole counties can get 5/1 Mbps cellular broadband when residents tell us they have almost no cellular coverage. It’s hard to understand why members of Congress who live in some of these rural counties don’t push back on the FCC.

I’ve heard the same kinds of stories about rural cellular coverage that I’ve heard about rural broadband. People say that coverage peters out during their commute home, and they have no coverage in their neighborhood. I heard the other day from somebody who told me they only get weak cellular coverage in one tiny part of their backyard – anybody of my age remembers running around the corners of airports trying to find that magic spot.

I would venture to say that almost everybody reading this blog knows of cellular dead spots. I live in the center of a city, and there is no Verizon coverage at my house – yet there is good Verizon coverage at the other end of my block. I have relatives in a nearby older suburb who have weak and intermittent cellular coverage for the entire neighborhood from both AT&T and Verizon. Every Uber driver in this area can paint a pretty accurate picture of the location of cellular dead zones – and there are a lot of them.

Cellular dead zones are largely a matter of geography and topography. I don’t know if it’s realistic to think that Verizon should be required to show the dead zone at my house on its cellular coverage maps. But I wonder how many people buy homes in hilly cities like mine, only to find that cellphones don’t work. I decided to not make an offer to buy a home here when my cellphone didn’t work at the house.

The problems in rural areas are much more of a concern than my half-block of bad coverage. In rural counties, poor coverage comes from the combination of two issues – distance from a tower and terrain. In cities, there are cell towers within a mile or two of everybody, and with the introduction of small cells, even closer in many cases. In rural counties, there might only be a few cell sites, and many residents live more than two miles from the nearest tower.

Local is Better

I am lucky enough to live in a place that is ideal for growing vegetables. I’m up in the mountains, where temperatures are moderate, and we have an average of three inches of rain every month all year. For the past several years, I have bought a big box of fresh produce every week from two local farms. I have never eaten so well, and I am comforted by knowing exactly where my vegetables were raised. I consider myself lucky to even have this option.

Most people in the country buy broadband from a handful of giant ISPs – the four largest ISPs serve more than three-fourths of all broadband customers in the country. But some people are lucky enough to benefit from great local ISPs in the same way that I benefit from being near to local vegetable farms.

There is a wide variety of local ISPs that includes independent telephone companies, telephone cooperatives, municipal ISPs, electric cooperatives, independent fiber overbuilders, and WISPs. Not all local ISPs are great – but from the stories I hear around the country, most people served by local ISPs feel lucky to have the local option.

There was a time 75 years ago when the federal government acknowledged that local is better. When it became clear that the big commercial electric companies were never going to build and serve in rural America, somebody in the federal government came up with the brilliant idea of letting communities build and operate their own electric companies through cooperatives. This didn’t cost the government much since the federal electrification plan provided long-term loans, and the cooperatives repaid the government for funding rural electric networks.

We’re in the process of handing out billions in federal grants to build rural broadband networks, and there isn’t one shred of localism in the new federal programs. Instead, the government is handing out huge grants that are often lucrative enough to attract the biggest ISPs and speculative investors to build rural broadband.

Does anybody really think that AT&T or Charter wants to provide broadband in rural America? AT&T spent the last forty years milking the last vestiges of profits out of rural networks while making zero investments. Charter and the other big ISPs are not going to like the much higher operating costs in rural areas that come from long truck rolls and the increased maintenance needed to keep up with many miles of rural plant.

The big federal grants are providing an incentive for big ISPs or speculative investors to build fiber networks because the grants make it possible to earn high returns. It’s impossible to believe that a big ISP like AT&T is going to provide the same level of customer service, repairs, and network maintenance as would be done by a local ISP. I don’t need a crystal ball to know that there will be a huge difference in twenty years between a fiber network built today by a giant ISP and one built by a rural cooperative. Without future grants, AT&T won’t invest in future electronics upgrades, and the company won’t do the routine maintenance needed to keep the network in good working order. Cooperative fiber networks will be humming along like brand new while the big ISP networks will already be showing signs of age.

The electrification program didn’t include grants, and the newly formed cooperatives eventually repaid the federal government for lending them the money to build power grids. I can’t imagine that the federal government has ever made a better investment or gotten a better return than it did from the electrification loans – the only things that come even close are the Interstate highways and the G.I. Bill that sent millions to college after WW II.

In a real slap-down against localism, current federal broadband grant programs are actually stacked against small local ISPs. Federal grants expect a grant recipient to prove upfront that it has the matching funds to pay for the portion of the project not funded from a grant. Small ISPs typically don’t have the kind of balance sheets that traditional banks are looking for, and I know small ISPs that are unable to get grants due to the inability to raise the matching funds. And forget about starting a new cooperative – the grants and banks are not interested in helping start-ups.

It’s a shame that we forgot about the most successful program imaginable for building rural networks. Forty years from now, we are going to see that many of the areas where big ISPs get grant money today will have gone to hell and will need federal funding relief again. But rural markets operated by good local ISPs will still be humming along nicely forty or even a hundred years from now. Broadband is one of the areas where local really is better. We all know this, but the little guys don’t get a say in writing grant rules.

Viasat and the CAF II Reverse Auction

The FCC has done some very odd things over the years to try to solve the rural broadband problem. One of the oddest was the CAF II reverse auction that awarded $122.5 million to Viasat. Things move so fast in this industry that this award at times feels like ancient history, but the reverse auction just ended in August 2018.

There is a lot of grant money currently raining down on the rural broadband industry, but when this award was made, that wasn’t happening. It’s hard to fathom that only three short years ago the FCC deemed that giving $122.5 million to a geosynchronous satellite company was good policy.

This blog is not intended as a criticism of Viasat. If you live in a place where DSL is not available, then satellite broadband is your likely only alternative. Viasat satellite broadband has gotten better over time. The broadband on the ViaSat-1 satellite launched in 2011 was dreadfully slow. The company markets broadband as fast as 100 Mbps download on the ViaSat-2 satellite launched in 2017. The company plans three new ViaSat-3 satellites with even higher capacity, with the first to launch sometime in 2022.

My consulting firm does detailed market research in rural counties, and we’ve heard from satellite customers across the country. We’ve never met anybody who fully loves the product. The most common complaints are high prices, small data caps, and high latency. Prices are high compared to other forms of broadband, with the latest pricing from Viasat as follows:

Plan                 Price      Speed      Data Cap
Unlimited Bronze     $84.99     12 Mbps    40 GB
Unlimited Silver     $119.99    25 Mbps    60 GB
Unlimited Gold       $169.99    100 Mbps   100 GB
Unlimited Platinum   $249.99    100 Mbps   150 GB

There is a $12.99 per month additional fee for equipment on top of these prices. A customer must sign a 2-year contract to get these prices, with a fee of $15 per remaining month if a customer breaks a contract.

We’ve been told by customers that download speed slows to a crawl after a customer exceeds the monthly data allotment. To put these data caps into perspective, OpenVault says that at the end of the first quarter of this year, the average U.S. home used 462 gigabytes of data. It’s not easy for a modern home to curtail usage down to 60 or 100 GB.
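To make the mismatch concrete, here is a rough sketch of how long each Viasat cap would last a household using the OpenVault average cited above, assuming usage is spread evenly across the month:

```python
# Days until an average household (462 GB/month per OpenVault) would
# exhaust each Viasat data cap, assuming usage is spread evenly.
AVG_GB_PER_MONTH = 462
DAYS_PER_MONTH = 30

for cap_gb in (40, 60, 100, 150):
    days = cap_gb / AVG_GB_PER_MONTH * DAYS_PER_MONTH
    print(f"A {cap_gb} GB cap lasts about {days:.1f} days")
```

Even the most expensive plan’s cap would be gone in well under two weeks for an average household.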

The biggest performance problem is probably the latency, which can be 10 to 15 times higher than with terrestrial broadband. The lag is due to the time required for signals to travel to and from satellites parked over 22,000 miles above the earth, which adds time to every round-trip connection to the web. Most real-time web connections, such as using voice-over-IP or connecting to a school or corporate WAN, work best with a latency of less than 100 ms (milliseconds). We see speed tests on satellites with reported latency between 600 ms and 900 ms.
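The physics alone explains most of that lag. A back-of-envelope sketch – assuming the satellite sits directly overhead at the geostationary altitude; real slant paths are longer, and terrestrial routing adds more:

```python
# Minimum round-trip latency imposed by the distance to a geosynchronous
# satellite, before any routing or processing delay.
ALTITUDE_KM = 35_786            # ~22,236 miles above the equator
SPEED_OF_LIGHT_KM_S = 299_792

# A web request crosses the up/down hop four times:
# user -> satellite -> gateway, then gateway -> satellite -> user.
one_way_ms = ALTITUDE_KM / SPEED_OF_LIGHT_KM_S * 1000
round_trip_ms = 4 * one_way_ms
print(f"Best-case round trip: about {round_trip_ms:.0f} ms")
```

That works out to a floor of nearly half a second before a single router or server has touched the traffic, which is consistent with the 600 ms to 900 ms we see on speed tests.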

I wonder about the long-term repercussions of this reverse auction grant award. Most federal programs prohibit providing a government subsidy to an area that is already receiving a federal broadband subsidy. Viasat is going to be collecting $12.25 million per year from the FCC through 2027. Will this mean that people unlucky enough to live where Viasat won the reverse auction can’t get faster broadband out of the wave of new grant funding? If so, these homes might be doomed to not get a landline broadband solution for decades.

The FCC should never have allowed geosynchronous satellite broadband into the reverse auction. Perhaps we can’t fully blame the FCC for not foreseeing the pandemic, with rural people screaming for better broadband. But it didn’t take much of a crystal ball in 2018 to understand that something better would come along sooner than the 10-year subsidy window the FCC is providing in the Viasat award areas.

The FCC is likely to repeat this same mistake if it awards nearly $1 billion to Starlink in the RDOF reverse auction. If the federal infrastructure funding becomes available, there will be ISPs willing to build fiber to areas where Starlink will be getting a 10-year RDOF subsidy. It’s not too late for the FCC to change course and not make the RDOF award. Otherwise, the agency might be dooming a lot more people to not getting a permanent broadband solution.

5G for Cars – An Idea that Won’t Die

An industry group calling itself 5G Americas has published a whitepaper that touts the advantages of a smart auto grid powered by 5G and the C-V2X technology. This technology is the car connectivity standard that much of the industry has gelled around, replacing the older DSRC standard.

Over a decade ago, the FCC became so enamored with the idea of self-driving cars that the agency dedicated the 5.9 GHz spectrum band for the sole use of smart cars. The picture painted to the FCC at the time was the creation of a 5G network along roadways that would communicate with self-driving cars. As engineers experimented with smart cars, they quickly came to understand that the time lag involved in making real-time driving decisions in the 5G cloud was never going to be fast enough for the split-second decisions we constantly make while driving. Last year, the FCC halved the amount of bandwidth available for smart cars but didn’t totally kill the spectrum.

This whitepaper still envisions the concept of a ubiquitous wireless network supporting smart cars. It’s not entirely surprising when looking at the companies that make up 5G Americas – AT&T, Ciena, Cisco, Crown Castle, Ericsson, Intel, Liberty Latin America, Mavenir, Nokia, Qualcomm, Samsung, Shaw Communications, T-Mobile, Telefónica, VMware and WOM. These companies would stand to make a lot of money on the idea if they could talk the government into funding the needed wireless network along roads.

There are still some interesting ideas suggested by the whitepaper. There are a lot of benefits to car-to-car communications. A car can be alerted when a neighboring car wants to change lanes or wants to pass. Drivers could peek into a camera of the car in front of them before trying to pass. Drivers can be alerted about a host of hazards, such as a car running a red light or patches of ice on the road ahead.

Most cars today already include a lot of safety features that weren’t imagined twenty years ago, and the benefits envisioned by the C-V2X technology sound like the next generation of safety features that car manufacturers are likely to embrace.

But this whitepaper doesn’t give up on a wireless network positioned along roads to communicate with vehicles. It refers to this as an intelligent transportation system (ITS), which would consist of a system of sensors and communications devices along roads specifically designed to communicate with vehicles. The paper touts additional benefits from a wireless network such as communications between cars and traffic lights and smart parking systems in cities.

Much of this whitepaper could have been written over a decade ago and probably was. The benefits are the same ones that have been discussed for years, although there has been some progress in developing the chips and the technology that could enable smart vehicles.

But the one thing that is largely skipped over in the paper is who pays for the infrastructure to support this. The paper suggests a collaboration between roadbuilders (federal, state, and local governments) and the cellular carriers. There is also an allusion to offering such amazing new features that car owners will pony up for a subscription to use the technology. My guess is that the real purpose of this whitepaper is to lobby Congress for grant funding for roadside networks. The paper largely suggests that government should pay for the 5G infrastructure along roads while the cellular carriers collect any subscription revenues.

The benefits touted by the paper all sound worthwhile. It would be nice to feel safe when passing another vehicle. It would be nice if your car could automatically be directed to the nearest parking place to your planned destination. But it’s hard to think those benefits are enough to entice governments to pony up for the needed infrastructure. Most of the roads in America are funded by local and county governments, and most of the roads outside of major cities are lightly traveled. I imagine most counties would laugh at the idea of funding this when many of these same counties don’t yet have broadband to homes.

If enough cars are equipped with the chips to enable this technology, there might be a few major metropolitan areas that might consider the idea. But therein lies the chicken and the egg question – will a city consider an investment in the technology before most cars have the chips, and will carmakers spend the money to install the chips before there are real-world places where this will work?

I hope that the car industry pursues the car-to-car communications ideas. That technology could enable most of the safety aspects touted by this whitepaper without investing in an external cellular network. The chipmakers can still make a lot of money by improving car safety. The idea of a ubiquitous 5G network along roads is never going to be practical, but it’s an idea that seemingly will never go away.

Aging Coaxial Copper Networks

We’re not talking enough about the aging coaxial copper networks that provide broadband to the majority of customers in the country. Comcast and Charter alone serve more than half of all broadband customers.

These copper networks are getting old. Most coaxial networks were constructed in the 1970s and had an expected life of perhaps forty years. We seem to be quietly ignoring that these networks will soon be fifty years old.

There is an industry-wide consensus that telephone copper is past the end of its economic life, and most telephone networks that are still working are barely limping along. It was no surprise last October when AT&T announced that it would no longer connect new DSL customers – if the company had its way, it would completely walk away from all copper, other than as a convenient place to overlash fiber.

To some degree, coaxial networks are more susceptible to aging than telephone copper networks. The copper wire inside a coax cable is much thicker than telephone copper wires, and that thickness is what keeps the networks chugging along. However, coaxial networks are highly susceptible to outside interference. A coaxial network uses a technology that creates a captive RF radio network inside the wires. This technology uses the full range of radio spectrum between 5 MHz and 1,000 MHz inside the wires, with the signals arranged in channels, just as is done with wireless networks. A coaxial copper network is susceptible to outside interference anywhere along the wide range of frequencies being carried.

Decades of sun, cold, water, and ice accumulate to create slow deterioration of the coaxial copper and the sheath around the wire. It’s vital for a coaxial sheath to remain intact since it acts as the barrier to interference. As the sheath gets older, it develops breaks and cracks and is not as effective in shielding the network. The sheath also accumulates breaks due to repairs over the decades from storms and other damage. Over forty or fifty years, the small dings and dents to the network add up. The long coaxial copper wires hanging on poles act as a giant antenna, and any break in the cable sheath is a source for interference to enter the network.

Just like telcos never talk publicly about underperforming DSL, you never hear a cable company admit to poor performance in its networks. But I’ve found that the performance of coaxial networks varies within almost every community of any size. I’ve worked in several cities in recent years where we gathered speed tests by address, and there are invariably a few neighborhoods that have broadband speeds far slower than the rest of the network. The most likely explanation for poorly performing neighborhoods is deterioration of the physical coaxial wires.

Cable companies could revitalize neighborhoods by replacing the coaxial cable – but they rarely do so. Anybody who has tried to buy house wiring knows that copper wiring has gotten extremely expensive. I haven’t done the math recently, but I wouldn’t be surprised if it costs much more to hang coaxial copper than fiber. You can see by the following chart how copper prices have peaked in recent years.

Big cable companies deliver decent bandwidth to a lot of people, but there are subsets of customers in most markets who have lousy service due to local network issues. I talk all the time to cities that are tired of fielding complaints from the parts of town where networks underperform. City governments want to know when the cable companies will finally bite the bullet and upgrade to fiber. A lot of industry analysts seem to think the cable companies will put off upgrades for as long as possible, and that can’t be comforting to folks living in pockets of cable networks that already have degraded service. And as the networks continue to age, the problems experienced with coaxial networks will get worse every year.

Explaining Growth in Broadband Demand

I haven’t talked about the growth of broadband usage for a while. I was explaining the exponential growth of broadband usage to somebody recently, and I suddenly realized an easy way to put broadband growth into context.

The amount of data used by the average broadband user has been doubling roughly every three years since the advent of the Internet. This exponential growth has been chugging along since the earliest dial-up days, and we’re still seeing it today. Consider the following numbers from OpenVault showing the average monthly U.S. household broadband usage:

1st Quarter 2018          215 Gigabytes

1st Quarter 2019          274 Gigabytes

1st Quarter 2020          403 Gigabytes

1st Quarter 2021          462 Gigabytes

Average household usage more than doubled in the three years from 2018 to 2021, a compounded growth rate of 29% annually. That’s a little faster than the historical trend, probably due to the pandemic; in the decade before the pandemic, the compounded annual growth rate was around 26%.
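For anybody who wants to check the arithmetic, here is a quick back-of-the-envelope calculation using the OpenVault figures quoted above:

```python
import math

# Sanity check on the growth numbers above, using the OpenVault
# averages for monthly household usage (in gigabytes).
usage = {2018: 215, 2019: 274, 2020: 403, 2021: 462}

years = 2021 - 2018
cagr = (usage[2021] / usage[2018]) ** (1 / years) - 1
print(f"2018-2021 compounded growth: {cagr:.0%}")  # prints 29%

# At that rate, usage doubles in just under three years.
doubling_years = math.log(2) / math.log(1 + cagr)
print(f"Doubling time: {doubling_years:.1f} years")  # prints 2.7 years
```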

What does this kind of growth mean? One way to think about broadband growth is to contemplate what it might mean in your own neighborhood. Suppose you are served by DSL or by a cable company using HFC technology. If your ISP has the same number of customers in your neighborhood now as in 2018, the local network is now twice as busy, carrying twice as much traffic as just three years earlier. If your ISP hasn’t made any upgrades in that time, the chances are that you can already see some signs of a stressed network. Perhaps you notice a slowdown during the evening prime-time hours when most of the neighborhood is using broadband. You’ve probably run into times when it was a challenge making or maintaining a Zoom call.

To the average person, this kind of broadband traffic growth might not seem like a big deal, because they probably assume that ISPs are doing magic in data centers to keep things working. But any network engineer will tell you that a doubling of traffic is a big deal. That kind of growth exposes the bottlenecks in a network where things get bogged down at the busiest times.

The most interesting way to put broadband growth into perspective is to look into the future. Let’s say that the historical 26% growth rate continues into the future. There is no reason to think it won’t because we are finding more ways every year to use broadband. If broadband keeps growing at the historical rate, then in ten years your neighborhood network will be carrying ten times more traffic than today. In twenty years it will be carrying one hundred times more traffic than today.
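The ten-times and hundred-times numbers follow directly from compounding. A short sketch, assuming the historical 26% annual growth rate holds:

```python
# Project neighborhood traffic forward at the historical 26% annual
# growth rate, expressed as a multiple of today's traffic (today = 1.0).
growth_rate = 0.26

ten_year_multiple = (1 + growth_rate) ** 10     # roughly 10x today
twenty_year_multiple = (1 + growth_rate) ** 20  # roughly 100x today

print(f"In 10 years: {ten_year_multiple:.0f}x today's traffic")
print(f"In 20 years: {twenty_year_multiple:.0f}x today's traffic")
```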

When you think of growth in this manner, it’s a whole lot easier to understand why we shouldn’t be using grant money today to fund technologies that won’t be able to keep up with the normal expected growth in broadband traffic. Looking at growth from this perspective explains why AT&T made the decision last year to stop selling DSL. Understanding the normal growth rate makes it clear that it was idiotic to give CAF II funding to Viasat. Expected growth might be the best reason to not give RDOF subsidies to Starlink.

I have nothing against Starlink. If I still lived in a rural area, I would have been one of the first people on the list for the beta test. But there are already engineers I respect who believe that the Starlink network will struggle if the company sells to too many customers. If that’s even just a little bit true today, then how will Starlink perform in ten short years when the traffic will be ten times higher? And forget twenty years – Starlink is at heart a wireless network, and there are no tweaks to a wireless network that will ever handle a hundred-fold increase in traffic. If Starlink is still viable in twenty years, it will be because it took the same path as Viasat and imposed severe data caps to restrict usage or else raised rates to restrict the number of customers on the network.

I take flak every time I say this, but if I were in charge of grant programs, I wouldn’t fund anything other than fiber. I can’t think of any reason why we would fund any technology that doesn’t have a reasonable chance of still being viable in ten short years, when broadband usage will likely be ten times higher than today. I would hope that a government-funded network will still be viable in twenty years, when traffic volumes are likely to be one hundred times greater than today. If we don’t get this right, then we’re going to be talking about ways to build rural fiber a decade from now when other technologies crash and burn.

A Fiber Land Grab?

I was surprised to see AT&T announce a public-private partnership with Vanderburgh County, Indiana to build fiber to 20,000 rural locations. The public announcement of the partnership says that the County will provide a $9.9 million grant, and AT&T will pick up the remaining $29.7 million investment.

The primary reason this surprised me is that it is a major reversal in direction for AT&T. The company spent the last thirty years working its way out of rural America, capped by an announcement in October 2020 that the company will no longer connect new DSL customers. AT&T has publicly complained for years about the high cost of serving rural locations and has steadily cut its costs in rural America by slashing business offices and rural technicians. It’s almost shocking to see the company dive back in as a last-mile ISP in a situation that means long truck rolls and higher operating costs.

I’m sure it was the County grant that made AT&T consider this, but even that is surprising since the County is only contributing 25% of the funding. I’ve created hundreds of rural business plans, and most rural builds need grants of 40% or even much more to make financial sense. I assume that there is something unique about this county that makes that math work. AT&T and other telcos have one major advantage for building fiber that might have come into play – they can overlash fiber onto existing copper wires at a fraction of the cost of any other fiber builder, so perhaps AT&T’s real costs won’t be $29.7 million. Obviously, the math works for AT&T, and another county will be getting a rural fiber solution.

AT&T is not alone in chasing rural funding. We saw Charter make a major rural play in last year’s RDOF reverse auction, which also attracted Frontier and Windstream; both of those companies have made it clear that pursuing fiber expansion opportunities and pursuing grants are a key part of their future strategic plans.

My instincts are telling me that we are about to see a fiber land grab. The big ISPs other than Verizon had shunned building fiber for decades. When Verizon built its FiOS network, every other big ISP said they thought fiber was a bad strategic mistake by Verizon. But we’ve finally reached the time when the whole country wants fiber.

This AT&T announcement foreshadows that grant funding might be a big component of a big ISP land grab. The big ISPs have never been shy about taking huge federal funding. I wouldn’t be surprised if the big ISPs are collectively planning in their boardrooms to grab a majority of any big federal broadband grant program.

I think there is another factor that has awoken the big ISPs, which is also related to a land grab. Consider Charter. If they look out a decade or two into the future, they can see that rural fiber will surround their current footprint if they do nothing. All big ISPs are under tremendous pressure from Wall Street to keep growing. Charter has thrived for the last decade with a simple business plan of taking DSL customers from the telcos. It doesn’t require a Ouija board to foresee the time in a few years when there won’t be any more DSL customers to capture.

I’m betting that part of Charter’s thinking for getting into the RDOF auction was the need to grab more geographic markets before somebody else does. Federal grant money makes this a lot easier to do, but without geographic expansion, Charter will eventually be landlocked and will stop growing at a rate that satisfies Wall Street.

Charter must also be worried about the growing momentum to build fiber in cities. I think Charter is grabbing rural markets where it can have a guaranteed monopoly for the coming decades to hedge against losing urban customers to competition from fiber and from wireless ISPs like Starry.

My guess is that the AT&T announcement is just the tip of the iceberg. If Congress releases $42 billion in broadband grants, the big companies are all going to have their hands out to get a big piece of the money. And that is going to transform the rural landscape in a way that I would never have imagined. I would have taken a bet from anybody, even a few years ago, that AT&T would never build rural fiber – and it looks like I was wrong.