FCC Touts 6G

The FCC has seemingly joined forces with the marketing arm of the cellular industry in declaring that the spectrum between 7 and 16 GHz is now considered to be 6G. Chairwoman Jessica Rosenworcel recently announced that the agency would soon begin looking at uses of this spectrum for mobile broadband. Specifically, the agency will be looking at 550 MHz of spectrum between 12.7 and 13.25 GHz for what Rosenworcel characterized as airwaves for the 6G era.

This 7-16 GHz spectrum is already used for a wide range of purposes, including fixed point-to-point microwave links, radio astronomy, communications with airplanes, and various military uses. Probably the biggest current use of the spectrum is for communicating with satellites. Rosenworcel said the agency would consider ways to share some of the spectrum between satellite and terrestrial uses.

The use of the 6G label for this spectrum is a big departure from the recent past. It was just in 2019 that Verizon defined 5G to include millimeter-wave spectrum as high as 28-39 GHz. I’m sure most of you remember the never-ending TV commercials showing cellphones receiving 1-gigabit speeds. Verizon and a few other cellular carriers had deployed millimeter-wave spectrum in the downtown areas of a few major cities as a gimmick to show how fast 5G could be. Verizon labeled this as Ultra Wideband to distinguish it from the 4G LTE spectrum that Verizon and others were starting to label as 5G.

It has to be confusing to be a cellular customer, because I try to follow this stuff and I still can’t keep up with the cellular marketers. When Verizon used millimeter-wave spectrum and labeled it as Ultra Wideband, the company flashed a 5G UW icon to users to denote having access to the superfast speeds. But I’m hearing that people are now getting the 5G UW icon when connecting to Verizon’s C-Band spectrum, which is mid-range spectrum between 3.7 and 4.2 GHz.

The funny thing about everything that cellular marketers are doing is that 5G has nothing to do with any specific frequency range. 5G is a set of specifications that defines how cell sites work, and the specifications can be used with any spectrum. The 5G specification can work in the mid-range spectrum, in the band that the FCC just labeled as 6G, and in the higher millimeter-wave spectrum.

I’m mystified that the FCC would suddenly label the spectrum between 7 and 16 GHz as 6G. There will be no 6G specification for this band – anything we do in this spectrum will still use either the 4G LTE or 5G specifications. Wireless scientists around the world have started experimenting with what they are calling 6G using terahertz spectrum that ranges between 100 GHz and 1 THz. These high frequencies sit just below the frequencies of light and can be harnessed to transmit huge amounts of data over short distances, such as inside superfast computer chips. Scientists expect to develop the new 6G specifications within the next decade.

Scientists had understood that the 5G specifications would cover all spectrum up to 100 GHz. But apparently, we’re now going to carve up spectrum into tiny slices and label each tiny slice as a new generation of G. I’ve always joked that we’re going to get to 10G before we know it – and it turns out that was no joke at all, and in fact extremely conservative.

Behind all of the confusion over mislabeling things as 5G and 6G is the fact that we will eventually need new cellular spectrum. Cellular networks seem robust today, but the demand for mobile data keeps growing. There are already a lot of complaints that the new spectrum labeled as 5G is overcrowded. The FCC knows it takes many years after designating a new cellular band until that spectrum shows up in the market, so this is the time to look at new spectrum bands to put into use a decade from now. This is not going to be easy, because satellite companies will be screaming loudly that cellular companies are trying to steal their spectrum. They aren’t completely wrong about this, and I don’t envy the FCC the job of refereeing between the competing uses of spectrum. Just recently, the FCC made it easier for satellite providers to share in existing spectrum bands. But when the FCC labeled this spectrum as 6G, I think we already know the agency ultimately favors the cellular companies.

The Rural Cellular Crisis

Over the last few years, I have helped dozens of counties get ready for the upcoming giant broadband grants. We’ve been very successful in helping counties identify the places that don’t have broadband today – which is often drastically different from what is shown by the FCC maps. We then help county governments reach out to the ISPs in the region and open a dialog with the goal of making sure that all rural locations get better broadband. This takes a lot of work – but it’s satisfying to see counties that are on the way to finding a total broadband solution.

In working with these counties, one thing has become clear to me. Some of these counties have a bigger cellular coverage problem than a broadband problem. There is often a much larger number of homes in a county that don’t have adequate cellular coverage than homes that can’t buy broadband.

The counties I’ve helped have reached out to me – either directly or through an RFP looking for a consultant. Only a tiny number of the counties identified their cellular problem up front when they hired me. Yet, when I talk to residents and businesses in a county, I hear more horror stories about poor cellular coverage than about poor broadband coverage.

I always knew that the cellular coverage maps published by the big cellular carriers were overstated. You might recall back before cellular advertising was all about 5G that the cellular carriers would all claim to have the best cellular coverage. They would proudly show their coverage map in the background on ads and on their websites to show how they covered most of the country.

I’ve come to learn that those maps were pure garbage. They weren’t just an exaggeration; when you drilled down to look at specific counties, they were outright fabrications. I’ve worked recently with two counties that are home to major universities and one that holds a state capital. In all three of these counties, cellular coverage dies soon after people leave the biggest urban center.

If anything, I think that cellular coverage has gotten worse with the introduction of the spectrum that the carriers are all claiming as 5G. These are new frequency bands that have been introduced in the last few years to relieve the pressure on the 4G LTE networks. It makes sense that coverage would shrink with the higher frequencies, because one of the first rules of wireless technology is that higher frequencies don’t propagate as far as lower frequencies. When I hear the complaints in these counties, I have to think that the 5G spectrum is not carrying as far into the rural areas.
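The physics behind this is straightforward: free-space path loss grows with the square of the frequency, so mid-band spectrum loses meaningfully more signal strength than low-band spectrum over the same rural distance. The sketch below is my own back-of-the-envelope illustration of that rule (the 5 km distance and the specific frequencies are chosen for demonstration, not taken from any carrier's network):

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Compare a legacy low-band LTE frequency with a mid-band "5G" frequency
# over the same 5 km rural path.
loss_low = free_space_path_loss_db(5_000, 700e6)    # 700 MHz low-band
loss_mid = free_space_path_loss_db(5_000, 3.7e9)    # 3.7 GHz C-Band
print(f"700 MHz: {loss_low:.1f} dB")
print(f"3.7 GHz: {loss_mid:.1f} dB")
print(f"extra loss at 3.7 GHz: {loss_mid - loss_low:.1f} dB")
```

Roughly 14 dB of extra loss means the mid-band signal arrives with only a few percent of the power of the low-band signal, before counting terrain and foliage, which hit higher frequencies harder still.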

This is a problem that is well-known to everybody in the industry, including the FCC. Back before the pandemic, the FCC came up with a plan to spend $9 billion from the Universal Service Fund to build and equip new rural cellular towers – using a reverse auction method much like RDOF. This process derailed quickly when the biggest cellular companies produced bogus maps that showed decent coverage in rural areas that were close to some of the smaller cellular carriers. The FCC was so disgusted by the lousy maps that it tabled the subsidy plan.

The FCC finally reconsidered this idea in 2021. Now the cellular carriers are required to produce maps every six months at the same time as ISPs report broadband coverage. If you haven’t noticed, you can see claimed cellular coverage on the same dashboard that shows the broadband map results. I haven’t spent much time digesting the new cellular maps since all of my clients are so focused on broadband. But I checked the maps in the region around where I live, and they still seem to exaggerate coverage. This is supposed to get better once wireless carriers are required to file heat maps of the coverage around each transmitter – we’ll have to see what that does to the claimed coverage. It’s going to get harder for a wireless carrier to claim to cover large swaths of a county when it’s only on a tiny handful of towers.

There is a supposed way for folks to help fix the cellular maps. The FCC has a challenge process that requires taking a speed test using the FCC cellular speed test app. Unfortunately, this app requires a lot of speed tests in a given neighborhood before the FCC will even consider the results. I’m doubtful that most rural folks know of this app or are motivated enough to stop along the side of the road and repeatedly take the speed tests. And frankly, who knows if it will make any real difference even if they do?

The big cellular companies have clearly not invested in many new rural cell towers over the last decade because they’d rather have the FCC fork out the funding. I haven’t the slightest idea if $9 billion is enough money to solve the problem or even put a dent in it. No doubt, the FCC will saddle the program with rules that will add to the cost and result in fewer towers being built. But whatever is going to happen needs to start happening soon. We are now a mobile society, and it’s outrageous that a lot of people can’t make a call to 911, let alone use all of the features that are now controlled by our cell phones.

The FCC and USF

The FCC quietly won two court cases over the last month that most folks have not heard about. A group of complainants brought a suit against the FCC, saying that the agency didn’t have explicit direction from Congress for the creation of the Universal Service Fund (USF) or the authority to delegate the operation of the USF to a third party. Years ago, the FCC prompted the creation of the non-profit firm Universal Service Administrative Company (USAC) to administer the day-to-day operations of the $10 billion fund.

The plaintiffs pleaded that the FCC didn’t have the constitutional authority to create the Universal Service Fund since that was not specifically spelled out by Congress. Specifically, plaintiffs argued that the FCC was violating the nondelegation doctrine, a legal principle that says that Congress cannot delegate its legislative powers to other entities.

The first ruling was issued by the Fifth Circuit Court of Appeals and came down squarely on the side of the FCC. The Court said that Section 254 of the Telecommunications Act of 1996 had given the FCC explicit authority to advance and preserve universal telecommunications service and that the agency’s decision to create the USF falls under that authority given to the FCC by Congress. A similar decision was recently reached by the Sixth Circuit Court of Appeals.

The Universal Service Fund has always been controversial, and this is not the first challenge to its authority. There are a lot of people who don’t think the FCC should effectively have the power to levy a tax on telecommunications services, the primary tool for funding the USF. The FCC is careful to call this a fee, but to folks who pay it, the distinction between a fee and a tax is hard to see.

The Courts also upheld the FCC’s right to delegate the administration of the Universal Service Fund to USAC. The courts noted that USAC is purely administrative and doesn’t have any authority to create rules. The rulings found that USAC makes proposals to the FCC on ideas for using the fund – ideas the FCC is free to ignore.

If the FCC had lost these cases, it would have been devastating to some highly popular programs. The most popular is probably the Schools and Libraries (E-Rate) program, where the FCC subsidizes fast Internet connections for schools that have a high percentage of low-income students. The USF also funds subsidies to get broadband to rural healthcare facilities and the Lifeline program that provides a discount on broadband bills.

Probably the most controversial use of the USF is the Connect America Fund, which provides subsidies to support rural broadband. The fund was used for the CAF II program that was supposed to improve rural DSL for the largest telcos – at a time when DSL was already obsolete and copper wire maintenance was nonexistent. The fund was also used for the often-criticized RDOF program, which held a reverse auction for money to support rural broadband. The FCC has been studying the use of the fund to build more rural cell towers.

The FCC is not entirely out of the woods, as there is one more similar challenge to its authority pending in the Eleventh Circuit Court of Appeals. Historically, strong rulings like the first two would limit the chance of a different ruling in another court. However, it seems lately that courts are increasingly making decisions independent of prior rulings.

It would be an interesting scenario if the FCC’s authority to operate the USF is ended. All current broadband subsidies would likely come to a screeching halt. It’s likely that at least a few states would leap in and fill such a void, but that would mean a plethora of subsidy programs by states – which also could be challenged. But it would also likely mean that many states would do nothing.

Rural Cellular Coverage

When working in rural areas, I invariably find that any county that has poor rural broadband also has poor cellular coverage. If you plot a 2 or 3-mile circle around the existing cell towers in many counties, it quickly becomes obvious that cell coverage is non-existent in many places. The real cellular coverage in rural areas is drastically different from the national coverage maps that cellular carriers have been advertising for years.

The FCC announced a process to address this issue in October 2020 with the creation of a 5G Fund for Rural America. This is a $9 billion fund, drawn from the Universal Service Fund, that will provide subsidies for wireless carriers to build and equip new rural cell towers. The fund will work through a reverse auction in the same manner as RDOF, with the only bidders being licensed cellular carriers. The first reverse auction will be for $8 billion, with the rest specifically set aside for tribal areas.
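For readers unfamiliar with the mechanism, a reverse auction awards the subsidy for each area to the carrier willing to accept the least money to serve it. The sketch below shows only that lowest-ask principle in deliberately simplified form (RDOF actually ran multi-round descending-clock bidding with performance-tier weighting); all carrier names and dollar figures here are hypothetical:

```python
def run_reverse_auction(
    bids: dict[str, list[tuple[str, float]]],
) -> dict[str, tuple[str, float]]:
    """Map each area to its winning (carrier, subsidy) pair.

    bids maps an area name to a list of (carrier, subsidy_asked) offers;
    the offer asking for the smallest subsidy wins that area.
    """
    return {area: min(offers, key=lambda o: o[1]) for area, offers in bids.items()}

# Hypothetical bids: two carriers compete for County A, one bids on County B.
bids = {
    "County A": [("Carrier X", 4.2e6), ("Carrier Y", 3.1e6)],
    "County B": [("Carrier X", 7.5e6)],
}
winners = run_reverse_auction(bids)
print(winners["County A"])  # Carrier Y wins County A with the lower ask
```

The appeal of the format for the FCC is that competition among bidders pushes the subsidy down; the weakness, as RDOF showed, is that nothing in the auction itself verifies that the lowest bidder can actually deliver.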

The FCC tried this a few years earlier and abandoned the process when it became obvious that the cellular coverage maps created by the big cellular companies had little to do with reality. As part of that effort the FCC required cellular carriers to submit maps of cellular coverage as a prelude to launching this fund. The smaller cellular companies all complained that the big cellular company maps were wrong and were aimed at locking them out of the reverse auction. The FCC agreed and canceled plans for the fund until the 2020 announcement.

I haven’t been following this issue closely enough at the FCC to understand why it’s taking so long to launch the endeavor, but I have to think that mapping is still a primary issue. The FCC has now included cellular coverage in the same BDC mapping process used for broadband. When the new maps were released, there were a lot of public complaints that the new FCC cellular maps still overstate rural coverage.

There is a map challenge process for the public to provide feedback to try to fix the cellular maps by taking speed tests from rural locations – but the process is cumbersome, and it’s likely that few people know about it or are providing the speed tests in the specified way. The speed tests must be logged through an FCC app.

There is no question that something like this funding is badly needed. It’s hard to justify building rural cell towers and installing radios at a tower that will only see a handful of homes. Remote rural cell sites can’t possibly generate enough money to justify the cost of the radios and backhaul, let alone the towers. One of the issues the FCC is going to have to face is that any subsidy for this purpose might need to be permanent if the goal is to keep cell towers operating where few people live.

Poor cell coverage is devastating to an area. There are huge swaths of the country where folks can’t reach 911 by cellphone. We can’t get serious about smart agriculture without the bare minimum network to provide connectivity. No cell coverage makes it hard to do tasks that the rest of us take for granted.

One of the interesting things about the timing of this effort is how the rural cellular industry will benefit from the BEAD grants. There is no fiber near many of the best spots for rural towers, and the BEAD grants will fund the construction of a lot of fiber in rural areas that could be used to provide backhaul to new cell sites.

Interestingly, one of the things that was missed in creating the BEAD rules was any requirement for BEAD grant winners to provide fiber connectivity to rural cell towers at a fair price. That would have been a good opportunity for these different federal programs to mesh together for the benefit of both wireless and wireline rural broadband. One of the legitimate complaints made by cellular companies is that they are often quoted extremely high prices for broadband connectivity at cell towers – a lot of ISPs look at cell towers as a chance to make a lot of money.

Communities with poor cellular coverage need to keep an eye on this FCC program to make sure that some cellular carrier seeks the funding for building in their county. Just like with the BEAD grants, I have no idea if $9 billion is enough to get cellular coverage everywhere – but it is a good start.

The Need for Mid-Band Spectrum

5G Americas recently released its annual white paper discussing the lack of activity at the FCC in making more mid-band spectrum available for cellular broadband. The group is made up of large cellular carriers and various vendors and other companies associated with the cellular business.

Mid-band spectrum is an industry-defined term for the spectrum between 1 GHz and 7 GHz. This is the sweet spot for cellular broadband because these bands can cover the distances needed for cellular service while still carrying a decent amount of bandwidth.

The paper laments that there are no current actions at the FCC to consider any new bands of spectrum in this range for cellular data. This is a concern for the cellular industry because it takes many years to open up a frequency for a new use. All parts of the mid-band spectrum are currently in use, so any plan to free up existing spectrum for cellular use means either relocating the current users to a different frequency or finding a way for them to coexist with cellular carriers.

The report does a great job of looking at the status of each mid-band spectrum block. Reading through the uses, it quickly becomes apparent that a lot of these spectrum bands are reserved for the U.S. government and include uses like air traffic control, commercial and military radar, airplane altimeters, and numerous military applications.

I’m always instantly leery of any statistics, but the paper cites a report by Ericsson that the worldwide demand for cellular data is growing at 40% annually. Even if that number is true, I have to imagine that most of the increased demand comes in developing countries where cellular is the predominant way to use the Internet and where the technology in many networks is still far behind what we have here. This statistic feels like a scare tactic, because 40% growth per year would mean a doubling of network demand roughly every two years. If that growth were real in the U.S., we’d have heard a lot more about it outside of this whitepaper.
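The doubling math behind that claim is easy to check: under compound growth, demand doubles every ln(2) / ln(1 + rate) years. A quick sketch:

```python
import math

def doubling_time_years(annual_growth_rate: float) -> float:
    """Years for demand to double under compound annual growth."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# The 40% annual growth figure cited from the Ericsson report
print(f"{doubling_time_years(0.40):.2f} years")  # ~2.06 years
```

At 40% per year, demand would double roughly every two years, which is exactly the kind of number that would be impossible to miss in U.S. network statistics if it were happening here.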

But I don’t know anybody who doesn’t think that we’ll eventually need more spectrum for mobile services. All uses of broadband keep growing, and it’s not hard to look out ten and twenty years and see a much larger demand for using wireless spectrum.

The report includes one statistic that I hadn’t seen anywhere else. It says that at the end of 2022, North America had 108 million connections on the spectrum carriers have labeled as 5G and 506 million connections still using 4G LTE. The initial goal for using the new mid-band 5G spectrum was to de-load 4G networks – the goal was never to move everybody to 5G. I would have expected more users of the 5G spectrum bands, but there are still a lot of cell sites that have not been upgraded to the 5G spectrum.

I think cellular carriers are going to have a challenge making their case to the public. The carriers have done a magnificent job, at least in cities, of increasing cellular speeds. According to the latest report from Ookla, the median nationwide download speed in March 2023 was over 81 Mbps, with speeds in cities over 100 Mbps.

It’s going to be more of a challenge since cellular carriers have lost some credibility with the public and politicians. They badly needed additional spectrum five years ago, but rather than openly plead that case, the carriers invented an imaginary 5G war with China and convinced the public that giving them more spectrum would transform the world. The dilemma for cellular companies now is that it’s clear that most of the public isn’t willing to spend more to get faster cellular speeds. There is no public outcry supporting more spectrum for cellular companies.

But the public has a short memory. Five years ago, a lot of markets were having huge cellular problems. It was so bad in some places that it was getting hard to make and hold mobile voice calls. The new spectrum bands that we’re labeling as 5G had a big role in solving that problem. As this whitepaper argues, we don’t want to wait until the networks degrade to have the conversation again.

More Mapping Drama

As if the federal mapping process needed more drama, Senators Jacky Rosen (D-Nevada) and John Thune (R-South Dakota) have introduced a bill, S.1162, that would “ensure that broadband maps are accurate before funds are allocated under the Broadband Equity, Access, and Deployment Program based on those maps”.

If this law is enacted, the distribution of most of the BEAD grant funds to States would be delayed by at least six months, probably longer. The NTIA has already said that it intends to announce the allocation of the $42.5 billion in grants to the states on June 30. The funds are supposed to be allocated using the best count of unserved and underserved locations in each state on that date. Unserved locations are those that can’t buy broadband of at least 25/3 Mbps. Underserved locations are those unable to buy broadband with speeds of at least 100/20 Mbps.
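Those two speed thresholds amount to a simple decision rule. The sketch below is just an illustration of the statutory tiers described above (the function name and sample speeds are mine, not from any official tool):

```python
def classify_location(down_mbps: float, up_mbps: float) -> str:
    """Classify a location under the BEAD speed tiers."""
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"      # can't buy at least 25/3 Mbps
    if down_mbps < 100 or up_mbps < 20:
        return "underserved"   # at least 25/3 but below 100/20 Mbps
    return "served"            # 100/20 Mbps or better

print(classify_location(10, 1))    # unserved
print(classify_location(50, 10))   # underserved
print(classify_location(300, 30))  # served
```

Note that a location fails a tier if either direction falls short – a 100 Mbps download paired with a 2 Mbps upload is still unserved.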

To add to the story, FCC Chairwoman Jessica Rosenworcel recently announced that the FCC has largely completed the broadband map updates. That announcement surprised folks in the industry who have been working with the map data, since everybody I talk to is still seeing a lot of inaccuracies in the maps.

To the FCC’s credit, its vendor CostQuest has been processing thousands of individual challenges to the maps daily and has addressed 600 bulk challenges that have been filed by States, counties, and other local government entities. In making the announcement, Rosenworcel said that the new map has added over one million new locations to the broadband map – homes and businesses that were missed in the creation of the first version of the map last fall.

But the FCC map has two important components that must be correct for the overall maps to be correct. The first is the mapping fabric that is supposed to identify every location in the country that is a potential broadband customer. I view this as a nearly impossible task. The US Census Bureau spends billions every ten years to identify the addresses of residents and businesses in the country. CostQuest tried to do the same thing on a much smaller budget and under the time pressure of the maps being used to allocate these grants. It’s challenging to count potential broadband customers. I wrote a blog last year that outlined a few of the dozens of issues that must be addressed to get an accurate map. It’s hard to think that CostQuest somehow figured out all of these complicated questions in the last six months.

Even if the fabric is much improved, the more important issue is that the accuracy of the broadband map relies on two things reported by ISPs – the coverage area where an ISP should be able to connect a new customer within ten days of a request, and the broadband speeds that are available at each location.

ISPs are pretty much free to claim whatever they want. While there has been a lot of work done to challenge the fabric and the location of possible customers – it’s a lot harder to challenge the coverage claims of specific ISPs. A true challenge would require many millions of individual challenges about the broadband that is available at each home.

Just consider my own home. The national broadband map says there are ten ISPs available at my address. Several I’ve never heard of, and I’m willing to bet that at least a few of them can’t serve me – but since I’m already buying broadband from an ISP, I can’t think of any reason that would lead me to challenge the claims of the ISPs I’m not using. The FCC thinks that the challenge process will somehow fix the coverage issue – I can’t imagine that more than a tiny fraction of folks are ever going to care enough to go through the FCC map challenge process – or even know that the broadband map exists.

The FCC mapping process has also not yet come to grips with broadband coverage claimed by wireless ISPs. It’s not hard, looking through the FCC data, to find numerous WISPs that claim large coverage areas. In real life, the availability of a wireless connection is complicated. The FCC is in the process of requiring wireless carriers to report using a heat map that shows the strength of the wireless signal at various distances from each individual radio. But even these heat maps won’t tell the full story. WISPs are sometimes able to find ways to serve customers that are not within easy reach of a tower. And just like with cellphone coverage, there are usually plenty of dead zones around a radio that can’t be reached but that will still be claimed on a heat map – heat maps are nothing more than a rough approximation of actual coverage. It’s hard to imagine that wireless coverage areas will ever be fully accurate.

DSL coverage over telephone copper is equally impossible to map correctly, and there are still places where DSL is claimed but which can’t be served.

Broadband speeds are even harder to challenge. Under the FCC mapping rules, ISPs are allowed to claim marketing speeds. If an ISP markets broadband as capable of 100/20 Mbps, it can claim that speed on the broadband map. It doesn’t matter if the actual broadband delivered is only a fraction of that speed. There are so many factors that affect broadband speeds that the maps will never accurately depict the speeds folks can really buy. It’s amazingly disingenuous for the FCC to say the maps are accurate. The best we could ever hope for is that the maps will get better if, and only if, ISPs scrupulously follow the reporting rules – but nobody thinks that is going to happen.

I understand the frustration of the Senators who are suggesting this legislation. But I also think that we’ll never get an accurate set of maps. Don’t forget that Congress created the requirement to use the maps to allocate the BEAD grant dollars. Grant funding could have been done in other ways that didn’t rely on the maps. I don’t think it’s going to make much difference if we delay six months, a year, or four years – the maps are going to remain consistently inconsistent.

Is Broadband Regulation Dead?

I ask this question after Gigi Sohn recently withdrew her name from consideration as an FCC Commissioner. It’s been obvious for a long time that the Senate was never going to approve her nomination. Some Senators tried to blame their reluctance to approve on Sohn’s history as an advocate for the public over big corporations.

But the objections to Sohn were all the kinds of smokescreens that politicians use to avoid admitting the real reason they opposed the nomination. Gigi Sohn is not going to be the next Commissioner because she is in favor of regulating broadband and the public airwaves. The big ISPs and the large broadcasting companies (some of which are both) have been lobbying hard against the Sohn nomination since it was first announced. These giant corporations don’t want a third Democratic Commissioner who is pro-regulation.

In the past, the party that held the White House was able to nominate regulators to the FCC and other regulatory agencies that reflected the philosophies of their political party. That’s been a given in Washington DC, and agencies like the FCC have bounced back and forth between different concepts of what it means to regulate according to which party controlled the White House.

But I think the failure to approve Sohn breaks the historical convention that lets the political party in power decide who to add as regulators. I predict this will not end with this failed nomination. Unless the Senate gets a larger majority for one of the parties, I have a hard time seeing any Senate that is going to approve a fifth FCC Commissioner. If Republicans win the next presidential race, their nominee for the fifth Commissioner slot will also likely have no chance of getting approved.

The primary reason for this is that votes for an FCC Commissioner are no longer purely along party lines. The large ISPs and broadcasters make huge contributions to Senators for the very purpose of influencing this kind of issue. That’s not to say that there will never be a fifth Commissioner, but rejecting this nomination means it’s going to be a lot harder in the future to seat FCC Commissioners who embrace the position of the political party in power, as was done by Ajit Pai and likely would have been done by Gigi Sohn.

I think we’re now seeing the textbook example of regulatory capture. That’s an economic principle that describes a situation where regulatory agencies are dominated by the industries they are supposed to be regulating. Economic theory says that it’s necessary to regulate any industry where a handful of large players control the market. Good regulation is not opposed to the large corporations being regulated but should strike a balance between what’s good for the industry and what’s good for the public. In a perfectly regulated industry, both the industry and the public should be miffed at regulators for not fully supporting their issues.

The concept of regulatory capture was proposed in the 1970s by George Stigler, a Nobel prize-winning economist. He outlined characteristics of regulatory capture that describe the broadband industry to a tee.

  • Regulated industries devote a large budget to influence regulators at the federal, state, and local levels. It’s typical that citizens don’t have the wherewithal to effectively lobby the public’s side of issues.
  • Regulators tend to come from the regulated industry, and they tend to take advantage of the revolving door to return to industry at the end of their stint as a regulator.
  • In the extreme cases of regulatory capture, the incumbents are relieved of onerous regulations while new market entrants must jump through high hoops.

The FCC is a textbook example of a captured regulator. The FCC under Ajit Pai went so far as to deregulate broadband and to wash the FCC’s hands of broadband as much as possible by theoretically passing the little remaining regulation to the FTC. It’s hard to imagine an FCC more under the sway of the broadband industry than the last one.

There is no real fix for regulatory capture other than a loud public outcry to bring back strong regulation. But that’s never going to happen when regulatory capture is so complete that it’s impossible to even seat a fifth Commissioner.

No More Underbuilding

Jonathan Chambers wrote another great blog this past week on Conexon where he addresses the issue of federal grants having waste, fraud, and abuse – the reasons given for holding hearings in the House about the upcoming BEAD broadband grants. His blog goes on to say that the real waste, fraud, and abuse came in the past when the FCC awarded federal grants and subsidies to the large telcos to build networks that were obsolete by the time they were constructed. He uses the term underbuilding to describe funding networks that are not forward-looking. This is a phrase that has been around for many years. I remember hearing it years ago from Chris Mitchell, and sure enough, a Google search showed he had a podcast on this issue in 2015.

The term underbuilding stands in direct contrast to overbuilding, the term the large cable and telephone companies constantly use to mean they don’t want any grant funding to be used to build any place where they have existing customers. The big ISPs have been pounding the FCC and politicians on the overbuilding issue for well over a decade, and it’s been quite successful for them. For example, the big telcos convinced the FCC to provide them with billions of dollars in the CAF II program to make minor tweaks to rural DSL to supposedly bring speeds up to 25/3 Mbps. I’ve written extensively on the failures of that program, where it looks like the telcos often took the money and made minimal or no upgrades.

As bad as that was – and that is the best example I know of waste, fraud, and abuse – the real issue with the CAF II subsidy is that it funded underbuilding. Rural DSL networks were already dying when CAF II was awarded, mostly due to total neglect by the same big telcos that got the CAF II funding. Those billions could have instead gone to build fiber networks, and a whole lot of rural America would have gotten state-of-the-art technology years ago instead of a tweak to DSL networks that were barely crawling along after years of neglect.

The FCC has been guilty of funding underbuilding over and over again. The CAF II reverse auction gave money to Viasat, gave more money for upgrades to DSL, and funded building 25/3 Mbps fixed wireless networks. The classic example of underbuilding came with RDOF, where the areas that were just finishing the CAF II subsidy were immediately rolled into a new subsidy program to provide ten more years of subsidy. Many of the areas in RDOF are going to be upgraded to fiber, but a lot of the money will go into underperforming fixed wireless networks. And, until the FCC finally came to its senses, the RDOF was going to give a billion dollars to Starlink for satellite broadband.

The blame for funding underbuilding lies directly with the FCC and any other federal grant program that funded too-slow technologies. For example, when the CAF II funding was awarded to upgrade rural DSL, cable companies were already delivering broadband speeds of at least 100 Mbps to 80% of the folks in the country. By the time RDOF was awarded, broadband capabilities in cities had been upgraded to gigabit speeds. The policy clearly was that rural folks didn’t need the same quality of broadband that most of America already had.

But the blame doesn’t just lie with the FCC – it lies with all of the broadband advocates in the country. When the ISPs started to talk non-stop about not allowing overbuilding, we should have been lobbying pro-broadband politicians to say that the FCC should never fund underbuilding. We’ve collectively let the big ISPs frame the discussion in a way that gives politicians and regulators a convenient way to support the big ISPs. At both the federal and state levels, the broadband discussion has often devolved into talking about why overbuilding is bad – why the government shouldn’t give money to overbuild existing ISPs.

Not allowing overbuilding is a ludicrous argument if the national goal is to get good broadband to everybody. Every broadband network that is constructed is overbuilding somebody, except in those exceptionally rare cases where folks have zero broadband options. If we accept the argument that overbuilding is a bad policy, then it’s easy to justify giving the money to incumbents to do better – something that has failed over and over again.

It’s time that we call out the overbuilding argument for what it is – pure protectionism. This is monopolies flexing political power to keep the status quo, however poorly that is working. The big ISPs would gladly roll from one subsidy program to another forever without investing any of their own capital to upgrade rural networks.

Every time a regulator or politician says that we should not be using federal money to overbuild existing networks, we need to prod pro-broadband politicians to counter that argument by saying we should not be spending any more money on underbuilding. Broadband is infrastructure, just like roads and bridges, and we should be investing any grant money into the most forward-looking technology possible. If the national goal is to make sure that everybody has good broadband, then we should be ready to overbuild anywhere the incumbents have underperformed, be that in rural areas or inner cities. It’s time we shift the conversation away from protectionism to instead prioritizing bringing broadband that will still be good a decade or two after the grant award. Let’s not spend another penny of grant money on underbuilding networks by investing in slow technologies that are inadequate and obsolete even before they are completed.

Will the FCC Maps Get Better?

It is unfortunate timing that the new FCC maps were issued in the middle of the process of trying to determine the BEAD grant funding. Congress said that the amount of funding that will go to each state must be based upon the FCC maps – and the first draft of the FCC maps is clearly flawed. The FCC maps whiffed in many cases at counting the locations of homes and businesses, and too many ISPs have clearly exaggerated both the coverage and the broadband speeds that are available to customers. This really bollixes the BEAD grant allocations, but I don’t know anybody who thought the first version of the new maps would have any merit.

Assuming that the grant funding question gets resolved somehow, there remains the bigger issue of whether the new FCC maps will ever accurately portray broadband availability. Is there any hope for these maps to get better? Getting better maps requires improving the three basic flaws of the new FCC maps – the mapping fabric that defines the location of possible customers, the claimed coverage that defines where broadband is available, and the broadband speeds available to customers.

The mapping fabric will get better over time if state and local governments decide this is something that is important to fix. Local folks understand the location of homes and businesses far better than CostQuest. But there are two reasons why the fabric might never be fixed. First, many rural counties do not have the staff or resources to tackle fixing the mapping fabric. There are still a lot of counties that don’t have a GIS mapping system that shows the details of every home, business, land plot, etc. But even counties with GIS systems are not easily able to count broadband passings. Questions like how to count cabins or farm buildings are always going to be vexing. One of the flaws of asking local governments to fix the maps is that local governments don’t spy on citizens to see which homes are occupied or how many months a year somebody uses a cabin. My bet is that once the BEAD funding has been allocated, state and local governments will quickly lose interest in the FCC mapping fabric. I expect a lot of counties will refuse to spend the time and money needed to fix a federal database.

The FCC has held out hope that the coverage areas claimed by ISPs will become more accurate over time. One of the new aspects of the FCC maps is an individual challenge by any homeowner who disputes that a given ISP can deliver broadband to their home. If Comcast incorrectly claims a home can get broadband, the homeowner can challenge this in the FCC map – and if the homeowner is correct, Comcast must fix its mapping claim. But I have to wonder how many homeowners will ever bother to tackle a broadband challenge. The real kicker is that there is no big benefit to a homeowner to make the challenge. Using this example, Comcast would fix the map, but that doesn’t mean that Comcast is likely to offer broadband to the homeowner who challenged the map – it just means the map gets fixed. Once folks realize that a challenge doesn’t change anything, I’m not sure how many people other than the broadband diehards will care much.

The coverage data is only going to get better if ISPs report honestly. Using this same example, there would not be much improvement in the FCC map if Comcast were to fix a false coverage claim for a specific homeowner challenge unless Comcast also fixed the maps for neighboring homes – something that a challenge does not require.

The issue that most people care about is broadband speeds. Unfortunately, the new maps are as badly flawed on this issue as the old ones – maybe worse. ISPs are still allowed to claim marketing speeds instead of some approximation of actual speeds – and an ISP gets to define what it means by marketing speeds. For example, it’s hard to dispute a marketing speed if it’s something the ISP displays on its website.

Other than the challenge process, there is another possible remedy for fixing mapping problems. The Broadband Deployment Accuracy and Technological Availability (DATA) Act that created the new maps gives the FCC the ability to levy fines against ISPs that knowingly or recklessly submit inaccurate mapping data. But does anybody really think that the FCC is going to fine some small local WISP that exaggerates broadband speeds? I have a hard time thinking that the FCC will ever wade into the issue of disputing claims of marketing speeds versus actual speeds. Doing so would just highlight the fact that reporting marketing speeds is acceptable under the FCC rules.

The State of Vermont reacted quickly to the new FCC maps and showed the extent of the problems. The State sent a challenge letter to the FCC saying that 11% of the locations in the FCC mapping fabric don’t exist. Worse, Vermont says that 22% of locations are missing from the FCC map. Vermont also said the speeds portrayed in the new maps don’t align with its own local mapping effort. The new FCC map shows that over 95% of Vermont homes have access to broadband of at least 100/20 Mbps. The State’s broadband maps show that only 71% of homes in the state can receive broadband at 100 Mbps or faster at the end of 2021.

I really hate to say this, but I doubt that the new maps will ever be significantly better than the old ones. I don’t enjoy being pessimistic, and I should probably let the various challenge processes run their course before complaining too loudly. But I think that after the flurry associated with allocating the BEAD grant funding ends, most people and local governments will quickly lose interest in the map challenge process. I can’t think of any reason why ISPs won’t continue to misreport broadband speed and coverage if they think it somehow benefits them. And I’m doubtful that the FCC will take any meaningful steps to make the maps better.

Challenging Cellular Data Speeds

There has been a lot of recent press about the new ability for households to challenge the broadband coverage claimed at their homes by ISPs. The new FCC National Broadband Map also allows folks to challenge the coverage claimed by cellular carriers. Anybody who lives in a rural area knows that the coverage maps of the big national cellular carriers have always been badly overstated.

The new FCC maps require each cellular carrier to separately declare where it provides 3G, 4G, and 5G coverage. You can easily see the claimed cellular broadband coverage at your house by toggling between Fixed Broadband and Mobile Broadband on the map. The FCC has plotted cellular coverage by neighborhood hexagons on the map.

There are two ways to challenge the claimed cellular coverage – by individuals or by local governments. The process of challenging the maps is not as easy as challenging the landline broadband map. The challenge process for individuals is as follows:

  • First, a challenger must download the FCC Speed Test App, which is available on the Google Play Store for Android or the Apple App Store for iOS devices. This app has been around since 2013. The app is set to not use more than 1 gigabyte of data in a month without permission. Folks probably don’t realize that repeated speed tests can use a lot of data.
  • Tests should only be taken between 6:00 AM and 10:00 PM.
  • Users will have to make sure to disconnect from a WiFi network since the goal is to test the cellular connection. Many people don’t realize that cell phones use your home broadband connection for moving data if set on WiFi.
  • The FCC provides only two options for taking the test – either outdoors and stationary, or in a moving car. You’ll have to verify that you are not taking the test indoors.
  • You can take the test anonymously. But if you want the FCC to consider the test results, you’ll have to provide your contact information and verify that you are the authorized user of the cellphone.
  • Individual speed tests are not automatically sent to the carriers until there are enough results in a given local area to create what the FCC is calling a crowdsourced data event.

There are some major flaws in testing rural cellular coverage. If you are in an area where a carrier doesn’t provide service, you obviously can’t take the speed test since you can’t make a cellular connection. You can also only challenge the carrier you subscribe to – you can’t claim that another carrier doesn’t have the coverage claimed in the FCC map. On the plus side, you can take the speed test from anywhere, not just your home, and I picture folks taking the test just to help document cellular coverage.

The other flaw is the low thresholds that constitute a successful test. The tests are based on the FCC’s massively outdated definition of acceptable cellular broadband speeds. The test for acceptable 4G coverage is a paltry 5/1 Mbps. The FCC has two thresholds for 5G at 7/1 Mbps and 35/3 Mbps. These speed definitions are out of touch with actual cellular performance. According to Ookla’s nationwide speed tests, the national average cellular speed at the end of the third quarter of 2022 was 148 Mbps download and 16 Mbps upload. The national median speed (meaning half of users are faster and half are slower) was 75 Mbps download and 9 Mbps upload. This is another outdated definition that probably won’t be updated unless the FCC gets the much-needed fifth Commissioner.
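As an aside, the gap between Ookla’s average and median numbers is exactly what you’d expect when a minority of very fast connections skews the distribution. A quick sketch using Python’s standard statistics module makes the point – the speed values here are hypothetical, not real Ookla data:

```python
from statistics import mean, median

# Hypothetical download speeds in Mbps for ten speed tests.
# A few very fast connections drag the mean upward, while the
# median still reflects the typical user in the middle.
speeds = [10, 20, 40, 60, 75, 90, 120, 300, 450, 600]

print(f"mean:   {mean(speeds)} Mbps")    # 176.5
print(f"median: {median(speeds)} Mbps")  # 82.5
```

This is why the median is usually the more honest single number for describing what a typical customer actually experiences.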

I don’t know how useful it is to find out that a carrier can deliver 5/1 Mbps to my home. That’s what is claimed at my home by AT&T for 4G (the company is not yet claiming any 5G). A recent speed test from inside my house showed 173/10 Mbps. How can the FCC adopt any policies for cellular broadband if it is only asking carriers to certify that they meet an absurdly low threshold?

Local governments can also initiate challenges. This can be done by coordinating multiple people to take the tests at various locations to paint a picture of the cellular coverage across a city or county. Local governments can also use engineering-quality devices to take the test, which provide more reliable results than a cell phone. Local governments have the ability to document areas with no cellular coverage – something that would be hard to document without a huge number of individual speed tests.

The next time you’re driving in a place where the cellular coverage is lousy, stop by the side of the road, get out of your car, and take the speed test. It’s going to take all of us to document the real rural cellular coverage map. Also, let’s collectively push the FCC to increase the definition of acceptable broadband speeds. We talk about landline broadband speeds all of the time, but cellular coverage in rural areas is equally important, if not more so.