Latest Broadband Statistics 3Q 2022

The latest Broadband Insights Report from OpenVault shows broadband statistics for the end of the third quarter of 2022.

The average household used 495.5 gigabytes of broadband per month in the quarter. That is the combination of 474.2 gigabytes of download and 32.3 gigabytes of upload.

The number of what OpenVault calls power users continues to climb: 13.7% of homes used more than 1 terabyte per month, and 2.1% of all households used more than 2 terabytes.

Household broadband subscriptions continue to migrate to faster speeds. A lot of this is ISPs arbitrarily giving consumers faster speeds, but there are also a lot of folks opting to buy faster speeds. Only 13.1% of homes are now subscribed to speeds under 100 Mbps, while 15.4% of homes are subscribing to gigabit or faster broadband speeds.

Subscribers       3Q 2020   3Q 2021   3Q 2022
Under 50 Mbps       18.8%      9.8%      4.7%
50 – 99 Mbps        19.9%      8.0%      8.4%
100 – 199 Mbps      36.4%     38.4%      9.9%
200 – 499 Mbps      14.1%     27.4%     54.8%
500 – 999 Mbps       5.2%      5.1%      6.7%
1 Gbps+              5.6%     11.4%     15.4%
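The migration in the table can be tallied with a short script. The tier shares come straight from the table; the grouping into "under 100 Mbps" and "200 Mbps and up" buckets is just for illustration:

```python
# Subscription shares by speed tier from the table above (percent of subscribers).
tiers = {
    "Under 50 Mbps":  {"3Q 2020": 18.8, "3Q 2021": 9.8,  "3Q 2022": 4.7},
    "50 - 99 Mbps":   {"3Q 2020": 19.9, "3Q 2021": 8.0,  "3Q 2022": 8.4},
    "100 - 199 Mbps": {"3Q 2020": 36.4, "3Q 2021": 38.4, "3Q 2022": 9.9},
    "200 - 499 Mbps": {"3Q 2020": 14.1, "3Q 2021": 27.4, "3Q 2022": 54.8},
    "500 - 999 Mbps": {"3Q 2020": 5.2,  "3Q 2021": 5.1,  "3Q 2022": 6.7},
    "1 Gbps+":        {"3Q 2020": 5.6,  "3Q 2021": 11.4, "3Q 2022": 15.4},
}

def share(quarter, tier_names):
    """Total percentage of subscribers in the named tiers for a quarter."""
    return round(sum(tiers[t][quarter] for t in tier_names), 1)

slow = ["Under 50 Mbps", "50 - 99 Mbps"]
fast = ["200 - 499 Mbps", "500 - 999 Mbps", "1 Gbps+"]

for q in ("3Q 2020", "3Q 2021", "3Q 2022"):
    print(q, "| under 100 Mbps:", share(q, slow), "| 200 Mbps and up:", share(q, fast))
```

The under-100 Mbps bucket for 3Q 2022 works out to the 13.1% cited above.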

I always wonder when I see one of these quarterly reports where the current quarter fits into broadband trends. The following chart shows the average household usage by quarter reported by OpenVault since the beginning of 2019.

This chart shows a clear pattern: broadband usage is strongest in the fourth quarter of each year and then dips a bit over the next several quarters. This trend was confounded by the pandemic, when first-quarter 2020 usage spiked over the end of 2019. But from that point forward, the expected seasonal pattern continued.

But the overall trend is clear, and usage is growing over time. Home broadband usage spiked during the pandemic when 2020 usage was more than 40% higher than in 2019. Usage then grew by 11% from 2020 to 2021 and grew by 14% from 2021 until 2022. Household broadband usage has grown 80% from the third quarter of 2019 until 3Q of this year.
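Those annual growth rates compound, and a quick check shows how they stack up against the cumulative figure (the 40% rate is the floor implied by "more than 40% higher"):

```python
# Year-over-year growth rates in household usage cited above.
growth = {"2019 to 2020": 0.40, "2020 to 2021": 0.11, "2021 to 2022": 0.14}

cumulative = 1.0
for period, rate in growth.items():
    cumulative *= 1 + rate

# 1.40 * 1.11 * 1.14 ~= 1.77 -- roughly 77%, in the same ballpark as the
# ~80% cited, which is measured 3Q 2019 to 3Q 2022 rather than built from
# rounded annual averages.
print(f"Compound growth since 2019: {cumulative - 1:.0%}")
```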


A Slowdown in Cellular Expansion?

Mike Dano had a series of articles recently in Light Reading about how the big cellular carriers plan to significantly cut back on 5G spending in 2023. Dano cited one analyst, Tom Nolle of CIMI Corp., who said that the cellular carriers are having a hard time making the business case for expanding 5G. The cellular companies are not seeing an uptick in new incremental revenues as a result of 5G investments. He says the cellular companies are clueless and don't see a path to increased revenues next year.

This feeling of falling 5G expectations was bolstered by a somber outlook from Crown Castle. The biggest owner of cell sites said that it doesn’t see the big cellular carriers spending heavily in 2023 for cell towers or small cell sites.

As might be expected in complicated economic times, not all analysts agree. Dano cites analysts from Raymond James that say that 2023 will mark the year when the cellular companies start spending at a slow steady pace over multiple years to put in the promised 5G expansions.

As with most topics, I ask what this might mean for rural broadband. T-Mobile and Verizon have made a big recent splash in the industry with the rollout of the fixed wireless access (FWA) cellular broadband product. In the second quarter of 2022, Verizon and T-Mobile added 816,000 FWA customers. For the same quarter, the seven largest cable companies collectively lost 60,000 customers, and the six largest telephone companies lost 88,000 customers. Before the first quarter of 2022, we heard almost nothing about FWA.

I have to wonder what a slowdown in cell site expansion means for rural broadband. For customers lucky enough to be able to buy it, the FWA product has been a huge improvement over other kinds of rural broadband. I talked to one farmer who lived adjacent to a cell site and was seeing speeds of 200 Mbps. For this farmer, the faster FWA speeds meant finally being able to utilize his smart farming applications. But his neighbors, only two miles away, weren't seeing speeds over 50 Mbps.

I’ve always wondered why a cellular company would make the FWA upgrade or even the 5G upgrade at a rural cell site. For a cell site located in a farming area there probably aren’t more than a handful of potential customers within a few miles of a tower. It doesn’t seem like an investment that is ever going to see a return. Voice is a little different because a voice signal can carry many more miles from an upgraded cell site – but most upgrades are leaving voice traffic on 4G.

Both T-Mobile and Verizon said that they were seeing many of the new FWA customers in cities and suburbs and not in rural areas. This makes sense. First, a lot more people are candidates for the product in more densely populated areas. The FWA product is also priced attractively, and I've been thinking of it more as a DSL replacement than a direct competitor to cable broadband. The FWA speeds are not as fast as cable broadband, and the signal strength will vary as it does with any wireless product. Just look at how the cellular bars vary at your house and ask if you want that kind of variance in a home broadband connection. If your only existing choice is lousy rural broadband, you'll gladly take FWA as an upgrade. But it seems like a harder sell to folks who have faster alternatives.

I can’t do any more than speculate because even the analysts don’t agree on the trajectory of the cellular industry, although the poor outlook from Crown Castle seems fairly persuasive. We are now sitting at an odd economic time where inflation and interest rates affect everybody, including the big companies. I suspect we’re going to get mixed signals about the near-term future from others, and not just the cellular companies. 2023 is going to be an interesting year to follow the big ISPs.

AT&T in the News

AT&T has not been in the headlines a lot this year, but recently I’ve seen the company’s name everywhere.

In the recently released financial results for the third quarter, AT&T noted that it now has more fiber broadband customers than non-fiber customers. At the end of the quarter, AT&T had 6.93 million fiber customers compared to 6.86 million remaining non-fiber customers. Non-fiber customers are predominantly U-verse customers served by two pairs of telephone copper. The company also still has 340,000 DSL customers served by a single copper pair, along with some rural fixed-wireless customers.

In the third quarter, AT&T added 338,000 fiber customers and lost 367,000 non-fiber customers – although counting them as lost is probably a misnomer since many were likely upgraded to fiber.

Upgrading to fiber is good for the company’s bottom line. For the quarter, the average revenue per user (ARPU) was $62.62 for fiber customers compared to only $54.60 for non-fiber customers. AT&T has also been saying for years that the cost of maintenance for copper is a lot higher, so the company is likely shedding costs as it sheds customers served on copper.

We also got a peek at AT&T's market penetration. AT&T says it passes 18.5 million potential customers with fiber, meaning the company has achieved an overall 37% market penetration on fiber. In the third quarter, the company added fiber to pass 500,000 new locations.

I saw another interesting news blurb about AT&T. Bloomberg reported that AT&T is looking for an equity partner to invest in a major expansion of fiber. That would be a big departure from the past since AT&T has always funded its own capital expenditures and networks.

But it’s not hard to see from the third quarter results why AT&T might be seeking additional funding. In the third quarter, the company generated $9.87 billion of cash. It invested $4.71 billion in new infrastructure and paid $3.75 billion in dividends – leaving only $1.41 billion in free cash.

I would conjecture that AT&T wants to invest more heavily in fiber immediately since it’s clear that there is a mad rush nationwide to build fiber in cities. Fiber overbuilders hope that if they are the first to a market with fiber that it might dissuade other fiber overbuilders – so we are currently seeing a fiber land grab. In the long run, sharing fiber profits with an investor will decrease future AT&T earnings. The calculus that the company is betting on is that the market share gained by building first to markets outweighs the cost of sharing profits.

AT&T is also currently debt-heavy and doesn't have a recent track record of making good investment decisions. It's been reported that AT&T lost as much as $50 billion from its purchase of DirecTV, and in almost the same time frame, the company lost as much as $42 billion from its purchase and sale of WarnerMedia. The company might not be able to easily borrow the money, particularly at current interest rates.

The final news is that AT&T was fined $23 million to resolve a federal investigation that found the company had "unlawfully influenced" the former Illinois Speaker of the House, Michael J. Madigan. AT&T admits that it paid Madigan, through an ally, to promote legislation that would eliminate the carrier-of-last-resort obligation in the state – the requirement that the company serve anybody who asks for a telephone line. That obligation also comes with legacy regulatory requirements that AT&T wanted to ditch.

What always dismays me, but never surprises me, is that nobody at a big company like AT&T got in trouble for breaking the law – in this case, bribing a government official. The size of the fine might be appropriate for the magnitude of the crime, but I've always thought that the folks at big companies would be more likely to hesitate to be unethical if they saw others going to jail for breaking the law. The only real consequence for AT&T, in this case, is that it got caught, and the fine will just be viewed as the cost of doing business.

Is it Time to Say Farewell to GPON?

GPON is a great technology. GPON stands for gigabit passive optical network, and it is the predominant technology delivering last-mile fiber broadband. The GPON standard was first ratified in 2003, but like most new technologies, it took a few years to hit the market.

GPON quickly became popular because it allowed the provisioning of a gigabit service to customers. A GPON link delivers 2.4 gigabits downstream and 1.2 gigabits upstream to serve up to 64 customers, although most networks I’ve seen don’t deliver to more than 32 customers.

There is still some disagreement among ISPs about the best last-mile fiber technology, and some ISPs still favor active Ethernet networks. The biggest long-term advantage of GPON is that the technology serves more customers than active Ethernet, and most of the R&D for last-mile fiber over the past decade has gone to PON technology.

There are a few interesting benefits of GPON versus active Ethernet. One of the most important is the ability to serve multiple customers on a single feeder fiber. PON has one laser at a hub talking to 32 or more customers, which means a lot less fiber is needed in the network. The other advantage of PON that ISPs like is that there are no active electronics in the outside plant – electronics sit only at hubs and at the customer. That's a lot fewer components to go bad and fewer repairs to make in the field.

We’re now seeing most new fiber designs using XGS-PON. This technology increases bandwidth and delivers a symmetrical 10-gigabit path to a neighborhood (for purists, it’s actually 9.953 gigabits). The technology can serve up to 256 customers on a fiber, although most ISPs will serve fewer than that.
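For a sense of the split-ratio math, here is a quick sketch comparing the worst-case guaranteed bandwidth per subscriber on GPON and XGS-PON at the splits mentioned above. In practice delivery is statistical – subscribers almost never peak simultaneously – so actual speeds run far higher:

```python
# Downstream capacity per PON port, in Mbps, per the figures above.
GPON_DOWN = 2_400      # 2.4 Gbps
XGSPON_DOWN = 9_953    # the "10-gigabit" symmetrical rate (9.953 Gbps)

def per_subscriber(capacity_mbps, split):
    """Worst-case guaranteed bandwidth if every subscriber peaks at once."""
    return capacity_mbps / split

for split in (32, 64, 128):
    print(f"1:{split} split  GPON: {per_subscriber(GPON_DOWN, split):7.1f} Mbps"
          f"   XGS-PON: {per_subscriber(XGSPON_DOWN, split):7.1f} Mbps")
```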

The biggest advantage of XGS-PON is that the electronics vendors have all gotten smarter, and XGS-PON is being designed as an overlay onto GPON networks. An ISP can slip an XGS-PON card into an existing GPON chassis and instantly provision customers with faster broadband. The faster speeds just require an upgraded ONT – the electronics at the customer location.

The vendors did this because they took a lot of grief from the industry when they converted from the earlier BPON or APON to GPON. The GPON electronics were incompatible with older PON, and it required a forklift upgrade, meaning a replacement of all electronics from the core to the customer for the upgrade. I helped a few clients through the BPON to GPON upgrade, and it was a nightmare, with staff working late nights since neighborhood networks had to be taken out of service one at a time to make the upgrade.

The other interesting aspect of XGS-PON is that the technology is also forward-looking. The vendors are already field-testing 25-gigabit cards and are working on 40-gigabit cards in the lab. A fiber network provisioned with XGS-PON has an unbelievable capacity, and with new cards added is going to make networks ready for the big bandwidth needs of the future. Any talk of having online virtual reality and telepresence can’t happen until ISPs can provision multi-gigabit connections to multiple homes in a neighborhood – something that would stress even a 10-gigabit XGS-PON connection.

XGS-PON is going to quickly open up a new level of speed competition. I have one new ISP client using XGS-PON that has three broadband products with download speeds of 1, 2, and 5 gigabits, all with an upload speed of 1 gigabit. The cable companies publicly say they are not worried about fiber competition, but they are a long way away from competing with those kinds of speeds.

I’m sure GPON will be around for years to come. But as happens with all technology upgrades, there will probably come a day when the vendors stop supporting old GPON cards and ONTs. The good news for ISPs is that I have a lot of clients that have GPON connections that have worked for over a decade without a hiccup, and there is no rush to replace something that is working great.

Broadband Pricing Disparities in L.A.

Now that digital equity has become a hot topic, I'm starting to see studies from around the country looking at the inequities in the way that large ISPs treat customers.

One of the latest studies comes from the California Community Foundation, which looked at rates being offered to new customers in different parts of Los Angeles. Los Angeles is an odd broadband market in that Charter is a monopoly in much of the market. Charter claims to provide service in almost 96% of Census blocks, while AT&T and Frontier each only serve about a fifth of the market. Fourth is Cox, with a tiny market share. This means that a majority of customers in Los Angeles can only buy broadband from Charter, with no other landline option.

The study concentrated on Charter since they are the ubiquitous ISP, but there are findings about the other two ISPs as well. The study was done by looking at broadband products and rates that are advertised to homes scattered across the 88 separate communities in the LA area. ISPs today make offers online to customers looking to connect to broadband, and the study looked at specific offers made in different communities.

The study instantly found that the products and prices offered to residents vary widely by neighborhood. You might think that the products available online from a big ISP like Charter would be the same for the whole market or even the whole country, but there is a dramatic difference in some cases with the products and prices that are offered online.

For example, the base broadband product offered by Charter online seems to be Internet Ultra, which provides a download speed of 500 Mbps. This is the only product that was offered at every address in the study. About three-quarters of addresses were offered the 300 Mbps download product. Only about one-fourth of homes were offered the 100 Mbps broadband product.

The biggest finding from the study is that Charter offers better pricing, along with better terms and conditions, to wealthier neighborhoods. That is counterintuitive – basic economics says businesses should be expected to get the highest prices from the customers who can most afford to pay.

The examples listed in the report are devastating. In one case, Charter offered an address in Willowbrook (where the poverty rate is 8%) a 2-year special rate of $30 per month for a new subscriber to the Internet Ultra product. A home just two miles away in Watts, where the poverty rate is 31%, was offered the same product for a 1-year deal at $70 per month. In both cases, the product reverts to the $95 list price at the end of the term. This is a gigantic difference. The home in Willowbrook was offered 500 Mbps for a two-year cost of $720, while the home in Watts was offered a package that would cost $1,980 over two years.
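The two-year arithmetic behind those figures, using only the prices in the report's example:

```python
def two_year_cost(promo_rate, promo_months, list_rate=95, horizon_months=24):
    """Total cost over the horizon: promo price during the term, list price after."""
    return promo_rate * promo_months + list_rate * (horizon_months - promo_months)

willowbrook = two_year_cost(promo_rate=30, promo_months=24)  # 2-year special
watts = two_year_cost(promo_rate=70, promo_months=12)        # 1-year deal

print(f"Willowbrook: ${willowbrook}, Watts: ${watts}, "
      f"difference: ${watts - willowbrook}")
```

The same 500 Mbps product costs the Watts household $1,260 more over two years.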

Charter called the report misleading and said that promotional rates change all of the time. But the study was done across the city at the same time, meaning there was no big timing difference where promos had changed. Charter’s defense is that everybody eventually pays the full price.

There is no easy way for Charter to defend this. It’s obvious that somebody at the company is uploading different specials into the online portal by address or neighborhood. This can’t be random, and that means that somebody in the Charter marketing department (or, more likely, some piece of software) is making these determinations based on what others are willing to pay in each neighborhood. This feels like broadband pricing set by a sophisticated pricing algorithm like what is used for airline seats.

Charter has broken no laws, but this is still a black eye for the big ISP. The big cable companies might wonder why a fiber overbuilder does so well in new neighborhoods – but they need to look no further than the findings from this study to know why customers don’t like or trust them.

Being Stingy with Broadband Speeds

As I work in various parts of the country, I help new ISPs choose the speeds and the prices to offer on fiber networks. Part of that research begins with looking at what other ISPs charge in the region. I should probably stop being surprised, but I’m still taken aback when I see fiber-based ISPs offering what can best be described as stingy speeds. Just the other day, I ran across an ISP that is offering a range of speeds between 25/3 Mbps and 100/20 Mbps on fiber. Earlier this year, I ran across an ISP that has fiber products as tiny as symmetrical 10 Mbps.

This frankly mystifies me, and I always wonder why somebody with fiber would offer broadband products no better than their competitors'. I figure that part of the reason is what I would call old thinking. Somebody offering those kinds of speeds is likely a small telco that used to offer DSL or a small rural cable company that never had fast speeds. DSL products were set at a range of speeds up to 25/3 Mbps because that's what the technology would allow.

I can’t imagine the thought process that says the slow speeds are adequate. According to OpenVault, 75% of U.S. Households are currently subscribing to download speeds of 200 Mbps or faster. That includes over 14% of homes nationwide that are subscribing to a gigabit product. It’s clear that people want faster broadband.

I think another part of the reason an ISP would set low speeds is a fundamental belief that customers who buy faster speeds will somehow cost the ISP a lot more money. But after having seen the impact on hundreds of ISPs that have upgraded to faster speeds – I know this is not true. There is a one-time increase in broadband usage when you unblock a community that has had restricted broadband. The people in such communities start using broadband like everybody else, which looks like a big one-time jump in usage – but people are just catching up to the way most of the rest of the country uses broadband. After that short burst to catch up, usage grows like everybody else's.

Another reason behind offering slow speeds probably goes back to the day when buying Internet backbone connections was extremely expensive, and operators feared that a burst in usage would cost a lot. That’s also not true anymore in most places. Wholesale broadband prices have tumbled over the last decade. I know ISPs that are buying eight or ten times more bandwidth than a decade ago, at basically the same cost. I know that there are still some small ISPs located deep in rural areas that are paying far too much for broadband from the local telco.

There was a time when most of the industry tried to throttle customer usage. I remember quotes from the CEOs of the big cable companies and telcos saying that people didn’t need faster speeds. However, folks like Verizon FiOS and a handful of early fiber overbuilders exploded that concept and the cable companies did a 180 and now routinely increase customer speeds as a way to keep folks satisfied.

This same thinking also manifests in pricing. An ISP that offers 25 Mbps on fiber might also offer a gigabit product – but at a price that nobody can afford. I still run across gigabit broadband on small fiber ISPs priced at $175 per month or higher. These prices are set to make sure that only a few people buy the faster broadband. This thinking comes from the underlying belief that faster speeds are a luxury. But that’s really odd thinking for somebody that operates a network that can easily provide symmetrical gigabit broadband at an affordable price.

And that's what gets me the most – these ISPs are losing revenues by being stingy. If they offer a slow broadband product at $50, they would likely have a lot of customers willing to pay $70 or $80 per month for gigabit broadband. I can tell by looking at the offerings that most ISPs with slow speeds are earning less than their peers.
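A rough revenue sketch shows what is being left on the table. The 1,000-customer base and the take rates below are hypothetical assumptions for illustration, not data from any real ISP:

```python
# Hypothetical 1,000-customer ISP. All take rates are illustrative assumptions.
customers = 1_000

# Scenario A: a single stingy product at $50.
revenue_stingy = customers * 50

# Scenario B: tiered pricing; assume 60% stay on the $50 base product and
# 40% opt into a $75 gigabit tier (a guess, not a measured take rate).
revenue_tiered = customers * (0.60 * 50 + 0.40 * 75)

print(f"Flat $50: ${revenue_stingy:,}/month")
print(f"Tiered:   ${revenue_tiered:,.0f}/month "
      f"(+{revenue_tiered / revenue_stingy - 1:.0%})")
```

Under these assumed take rates, the tiered ISP earns 20% more every month on the same network.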

I understood these speeds and prices somewhat a decade ago when fiber networks were new and buying backbone Internet was expensive. But I can’t understand ISPs that have these stingy pricing plans when their peers a town away have normal broadband pricing.

ISPs and Customer Data

The FCC recently made a data request to cellular carriers asking how long the companies retain geolocation data on customers. For those not sure what that means, it means that the carriers record your location from your smartphone as you move around during the day. Your cellular company keeps data that can retrace everywhere you’ve been during the day. This rightfully makes most people nervous that somebody is watching and recording every place they visit.

Geolocation data is only one small piece of the data that cellular carriers collect on people. Cellular companies obviously know everybody you’ve called and texted, including the content of every text. They know every app you’ve used, websites you’ve visited, and the topic of every Google search you’ve made from your phone. Cellular companies also know the content of every email you send, assuming the email is not encrypted.

The FCC sent the data request in response to pressure from the public and politicians to put some commonsense caps on the collection and use of customer data. Here is a link to the responses from the fifteen largest cellular carriers.

Here are a few of the most common responses to the data request:

  • Cellular carriers said they retained records of customer activity to be able to respond to requests from law enforcement. This is a big turnaround from twenty years ago when telephone companies only tracked customer telephone usage after getting a valid subpoena to do so. It now seems that the carriers claim to record everything done by all customers to be able to respond to subpoenas involving only a minuscule percentage of people. The carriers cite law enforcement and FCC rules that force them to track customers.
  • Ten of the carriers said that customers have no options for opting out of having their locations tracked.
  • The amount of time that carriers retain data varies from two months to five years.

This is an issue that has been investigated at the FCC before. Several years ago, the FCC considered large fines against some of the largest cellular carriers for misusing customer location data, such as selling data to bail bondsmen.

This FCC investigation centered only on geolocation data, but people are concerned about how ISPs and wireless carriers use all collected customer data. Company privacy practices vary widely, as does the way that carriers explain data collection practices. Consider what various carriers tell customers in the terms of service.

T-Mobile explicitly tells customers that it uses their data to consider marketing to them. Further, its privacy policy not only says that customer data is collected directly but that the company might buy or get personal data from third parties like social media platforms, analytic providers, and consumer data resellers.

AT&T says that it might share customer data with third parties, such as device information, advertisements you view, and demographic information like your age, gender, and ZIP code.

Verizon says it may collect demographic and interest data and look at how customers use the Verizon website and apps. That may not sound like a lot, but it includes “information about browsing, searching and buying activities; IP address, mobile phone number, device numbers and identifiers, web addresses of the sites you come from and go to next, screen recordings, browser and operating system information, platform type, connection speed, and other attributes.” Verizon also sells data to third parties.

At the other extreme are carriers like Comcast, which says that it doesn’t track or record the apps people use, or the websites visited.

But interestingly, most of the carriers that say they don’t use customer data have resisted any FCC or FTC attempts to restrict the data that might be collected.

The FCC inquiry only dips a toe into the fringe of data collection practices. It’s always been assumed that the carriers make a lot of money selling and using customer data, but none of them ever identify or quantify the financial benefits. I know I am probably like a lot of the public and would like to see more restrictions and disclosure requirements for carriers and ISPs. I know this view is shared by most small ISPs that don’t record or share data – I think they need to remind folks about this more often.

Update on Satellite Broadband

It’s been a busy few weeks with announcements from the satellite broadband industry. The industry keeps moving us closer to a time when almost anybody in the world will potentially have access to broadband.

The first announcement came from OneWeb. The company successfully launched 36 new satellites with rockets supplied by NewSpace India Limited. This rocket company was formed in 2019 as a public sector undertaking sponsored by the Indian government and an arm of the Indian Space Research Organisation. This launch is a reminder that many parts of the world are now interested in the space business.

These new satellites bring the OneWeb fleet of satellites up to 462. The company says it will ultimately launch 648 satellites. OneWeb intends to soon open up the constellation to global coverage. OneWeb’s business plan is to reach the remotest places in the world. The company has also been hinting at using the satellites to bring broadband to remote cell towers and to remote outposts for governments and militaries around the world.

Project Kuiper, owned by Amazon and Jeff Bezos, is finally ready to hit the skies and plans to launch its first two prototype satellites in early 2023. The company has an ultimate goal of launching a total of 3,236 satellites. The first launch will use rockets from the United Launch Alliance using the new Vulcan Centaur rockets. Project Kuiper has already secured 38 additional launches on the Vulcan Centaur rockets, but the majority of its satellites will be deployed using the ULA Atlas V rockets. The company is rumored to have secured as many as 92 rocket launches.

One of the most interesting pieces of news comes from subscribers of Starlink. The company recently added new language to the terms of service for both residential and business customers that introduces the idea of a data cap. The new terms of service say that customers will get a monthly limit of ‘priority access’, and once that limit is reached, the customer will no longer be prioritized over traffic generated by other customers.

This is interesting from several perspectives. First, Starlink said in the early days of the business that it would never put a cap on usage. And with this announcement, it still hasn’t done that since customers will be free to continue to use broadband for the remainder of the billing cycle.

This feels eerily reminiscent of plans offered by the high-orbit satellite companies where usage slows down after customers reach a monthly usage limit.

Numerous engineers have speculated that any satellite constellation will have a finite capacity to move data, and this announcement hints that that limit is already foreseeable for Starlink. Of course, the company can continue to launch more satellites and has plans on the drawing board for as many as 30,000 satellites in its constellation. But for now, with a little over 2,300 satellites, this announcement says that the constellation is probably already getting over-busy at times. The ability to slow down customers is a classic way to serve more customers than the capacity of a network. The technique has been used for years by cellular carriers, and the supposedly unlimited cellular data plans are not really unlimited because user speeds get significantly slowed when a customer reaches the subscribed data limit.

Satellite providers face the same dilemma as all ISPs in that the average broadband data consumption by consumers continues to grow at a torrid pace. According to OpenVault, the average monthly broadband usage in the US has grown from 215 gigabytes per month in early 2018 to 481 gigabytes in June of this year. This growth puts a strain on all networks, but it has to be more of a problem for a satellite constellation, which has more backhaul restrictions than a landline network fed by fiber.
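Those two data points imply a steep annual growth rate. Treating "early 2018" as roughly 4.5 years before June 2022 is my assumption, since the report doesn't pin the exact month:

```python
start_gb, end_gb = 215, 481   # average monthly usage figures cited above
years = 4.5                   # "early 2018" to June 2022 -- an approximation

# Compound annual growth rate implied by the two endpoints.
cagr = (end_gb / start_gb) ** (1 / years) - 1
print(f"Implied annual growth in usage: {cagr:.1%}")
```

Roughly 20% compound growth per year, which is why capacity planning never stops.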

The RDOF Fixed Wireless Dilemma

I’m working with a number of rural counties that are trying to come to grips with the long-term implications of RDOF awards in their counties going to ISPs that plan to deliver broadband using fixed wireless technology. Most of them are not sure what to make of the situation for a number of reasons.

First, many of these counties are pleased about the wireless RDOF winners if that means bringing a broadband solution sooner. The folks in their counties are crying out for a broadband solution. But the big worry about the RDOF award winners is that the FCC gave RDOF winners a relaxed construction obligation compared to most other grants.

An RDOF recipient has six years to build the full broadband solution – starting with the year after the award. A recipient of a 2022 RDOF award must build 40% of the network by the end of 2025, 60% of the deployment by the end of 2026, 80% of the network by the end of 2027, and 100% of the network by the end of 2028. At the end of 2028, the FCC will publish a final list of locations in the RDOF area, and the ISPs have until the end of 2030 to reach any locations that were not already covered. Counties are rightfully worried that RDOF recipients will use the full timeline, meaning some folks won’t see a solution until 2027 or 2028.
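The milestone schedule described above is easy to lay out programmatically; the percentages and year offsets below simply encode the deadlines as summarized in this paragraph:

```python
def rdof_milestones(award_year):
    """Buildout deadlines for an RDOF award: the clock starts the year after
    the award, with 40% due at the end of year 3, then 20% more per year."""
    return {award_year + offset: pct
            for offset, pct in ((3, 40), (4, 60), (5, 80), (6, 100))}

for year, pct in rdof_milestones(2022).items():
    print(f"End of {year}: {pct}% of locations built")
# Any stragglers identified on the FCC's final list at the end of 2028
# must then be reached by the end of 2030.
```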

There is also a concern that the FCC has a poor history of follow-through with subsidy awards, such as the many locations that were slated to get CAF II upgrades but don't seem to have been upgraded – with no apparent reaction or consequences from the FCC. The fear is that RDOF winners will cherry-pick the easiest areas and not bother with the rest, and that some folks will never get served. The worst part is that a county won't know for sure until 2028 whether those folks will be served.

Another concern I'm hearing is that, in many cases, the RDOF awards were given in counties where there are one or more local ISPs willing to build fiber with grant assistance – an electric cooperative or a small telco that would willingly have brought fiber to the RDOF areas. These counties feel cheated by the FCC, particularly by the RDOF awards that were made after the announcement and funding of the $42.5 billion in BEAD grants. These counties feel that the FCC snatched away a fiber solution instead of putting the RDOF awards on hold.

The concern several of them have expressed is the sustainability of fixed wireless. They understand that a fiber network is probably going to still be in place and working at the end of this century, with perhaps three or four electronics upgrades during that time. But they’ve all heard that wireless technology has a shelf life of perhaps seven years, and they worry whether the RDOF winners are going to be willing and able to pay for upgrades ten or eleven times during the rest of the century.

Additionally, the ISPs in these counties are dismayed at what can best be described as the checkerboard way the RDOF was awarded. The RDOF award areas are rarely nice contiguous service areas but are scattered pockets of Census blocks. ISPs can see that it is going to be extra challenging to find other grant funding to bring a solution to the remaining areas. In many cases, they’ll have to spend their own money to build across RDOF areas in order to create a coherent fiber network.

Finally, some counties are concerned that the RDOF winners have not reached out to them to discuss these concerns and to convey their plans for bringing the promised faster broadband. I know that many of these awards were just made this summer, but there has been sufficient time for the RDOF winners to have met with local officials to convey their plans.

To be fair, some of these same counties have a similar list of concerns if grants go to the giant ISPs instead of somebody local. The folks in most rural areas know that the current round of grant funding is probably the only chance to get the broadband solution done right, and none of them want to be the poster child as a place where the giant grants and subsidies failed.

Regional Differences in Broadband Costs

ISPs and communities are always asking me for metrics to help them estimate how much grant funding they might need to subsidize building a new broadband network. Unfortunately, there is no such metric because broadband costs are always unique to a given community.

One of the statistics they want to give me is the population density – the number of homes and businesses per square mile. They are often surprised when I tell them that number alone tells me almost nothing about broadband costs. It’s far more important to know where homes are located in relation to roads.

Let me give two easily contrasting examples. One is a rural area made up of rather large family farms engaged in growing row crops. Generally, the farmhouse and a few outbuildings sit somewhere near the center of the farm, and there may be a few more homes added over the years along the roads. From the perspective of building fiber, this is about the worst situation you can find. The cost of fiber per customer is extremely high when neighbors can’t see each other. This situation gets particularly expensive in a county with a square grid road system where there are at least a few homes on every road – that adds even more cost.

Contrast this with places that look on paper to have a lower population density. For example, some of the rural counties in New Mexico are sparsely populated compared to a farming community in the Midwest when looking at people per square mile. But in the rugged terrain of New Mexico, the houses and businesses tend to be relatively close to the handful of roads that traverse a county. While there aren’t a lot of homes, there also aren’t a lot of miles of roads. The number of homes per mile of fiber construction might be much higher in New Mexico than in a Midwest farm community, even with far fewer people.
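The point can be illustrated with a quick back-of-the-envelope calculation. The figures below are entirely hypothetical – not from any real county – but they show how a county with fewer people can still have more homes per mile of fiber route.

```python
# Hypothetical illustration of homes passed per mile of fiber built along
# roads. All numbers are invented for the example, not real county data.

def homes_per_route_mile(homes: int, road_miles: float) -> float:
    """Average number of potential passings per mile of roadside fiber."""
    return homes / road_miles

# A Midwest farm county: more homes overall, but a dense grid of roads,
# nearly all of which must be built to reach everyone.
midwest = homes_per_route_mile(homes=4000, road_miles=1200)

# A rugged New Mexico county: far fewer homes, but they cluster along
# the handful of roads that traverse the county.
new_mexico = homes_per_route_mile(homes=1500, road_miles=250)

print(f"Midwest farm county: {midwest:.1f} homes per road mile")
print(f"New Mexico county:   {new_mexico:.1f} homes per road mile")
```

Under these assumed numbers, the sparsely populated New Mexico county comes out ahead on homes per route mile, which is the number that actually drives the cost of fiber per customer.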

But there are still nuances that matter. Another issue is how far the homes are from the roads. In some of the mountain valleys of Pennsylvania, homes tend to be close to the country roads since the hills rise up immediately behind the homes. I looked at the cost of building fiber in a county in West Virginia and another in Minnesota where a large percentage of homes sit far back along long lanes off the handful of main roads. The amount of fiber per customer needed in the Pennsylvania example is far less than in the other two cases.

While the relationship of homes to roads is a major factor, it’s not the only one. There are some counties where the cost to build is higher than expected for other reasons. I was looking at a county in Colorado where the existing utility poles were in dreadful shape, with more than half needing replacement. In the same county, there is hard rock right up to the surface of the soil. In this particular county, there is no affordable way to build fiber. When people talk about places that need an alternative to fiber, they are talking about situations like this. This contrasts significantly with places with deep topsoil, like the farmlands in Minnesota and Iowa or the farming valleys of California. I’ve always joked that in some of these places, you can bury fiber with a tablespoon.

Most places in the country are not as extreme as the above examples. A more typical circumstance is a county with a mix of different construction situations. For example, there might be a lot of farmland, but also many homes located along rivers where it’s rocky and much more expensive to build. It’s not unusual to find pockets of good and bad poles in most counties, meaning that any estimate of cost requires looking everywhere first to understand the places that will be more expensive to build. I worked in one county that seemingly had water everywhere, with numerous small streams, lakes and ponds, and wetlands. Building a mile of fiber at any one place didn’t seem too expensive, but the cost of crossing the bridges and avoiding the wetlands made this whole county a challenge for building an extensive fiber network.

I know folks want quick and easy answers, but I refuse to give back-of-the-envelope construction cost estimates until I know an area well. I’ve seen too many places where a quick estimate would have been way off from the actual cost. There are consultants around who have generic models that will provide a quick cost estimate – but until an engineer has put eyes on the local situation, such quick estimates might not be worth the paper they are written on.