Starlink and RDOF

In August, the FCC denied the SpaceX (Starlink) application to receive $885 million over ten years through the RDOF subsidy – funding that Starlink won in a reverse auction in December 2020.

In the press release for the rejection, FCC Chairman Jessica Rosenworcel was quoted as saying, “After careful legal, technical, and policy review, we are rejecting these applications. Consumers deserve reliable and affordable high-speed broadband. We must put scarce universal service dollars to their best possible use as we move into a digital future that demands ever more powerful and faster networks. We cannot afford to subsidize ventures that are not delivering the promised speeds or are not likely to meet program requirements.”

The FCC went on to say in the order that there were several technical reasons for the Starlink rejection. First, Starlink is a “nascent” technology, and the FCC doubted the company’s ability to deliver broadband to 642,925 locations in the RDOF areas while also serving non-RDOF areas. The FCC also cited Ookla speed tests showing that Starlink speeds decreased from 2021 into 2022.

Not surprisingly, Starlink appealed the FCC ruling this month. In the appeal, the company argued, “This decision is so broken that it is hard not to see it as an improper attempt to undo the commission’s earlier decision, made under the previous administration, to permit satellite broadband service providers to participate in the RDOF program. It appears to have been rendered in service to a clear bias towards fiber, rather than a merits-based decision to actually connect unserved Americans.”

Rather than focus on the facts in dispute in the appeal, today’s blog looks at the implications for the broadband industry during the appeal process. Current federal grant rules don’t allow a federal subsidy to be given to any area that is already slated to get another federal broadband subsidy. This has meant that the RDOF areas have been off-limits to other federal grants since the end of 2020, including NTIA grants, USDA ReConnect grants, and others. Federal grant applicants over the last few years have had to carefully avoid the RDOF areas for Starlink and any other unresolved RDOF award areas.

As a reminder, the RDOF areas were assigned by Census block and not in large, coherent, contiguous areas. The RDOF award areas have often been described as Swiss cheese, meaning that Census blocks eligible for RDOF were often intermixed with nearby ineligible Census blocks. A lot of the Swiss cheese pattern was caused by faulty FCC maps that excluded many rural Census blocks that should have been eligible for RDOF, but for which a telco or somebody else was probably falsely claiming speeds of at least 25/3 Mbps.

ISPs that have been contemplating grant applications in the unresolved RDOF areas were relieved when Starlink and other ISPs like LTD Broadband were rejected by the FCC. It’s difficult enough to justify building rural broadband, but it’s even harder when the area to be built is not a neat, contiguous study area.

The big question now is what happens with the Starlink areas during an appeal. It seems likely that these areas will go back into the holding tank and remain off-limits to other federal grants. We’ll probably need a definitive ruling on this from grant agencies like the USDA, but logic says that these areas still need to be on hold in case Starlink wins the appeal.

Unfortunately, there is no defined timeline for the appeal process. I don’t understand the full range of possibilities of such an appeal. If Starlink loses this appeal at the FCC, can the company then take the appeal to a court? Perhaps an FCC-savvy lawyer can weigh in on this question in the blog comments. But there is little doubt that an appeal can take some time. And during that time, ISPs operating near the widespread Starlink grant areas are probably still on hold in terms of creating plans for future grants.

The FCC Mapping Fabric

You’re going to hear a lot in the next few months about the FCC’s mapping fabric. Today’s blog describes what the fabric is and the challenges of getting it right.

The FCC hired CostQuest to create the new system for reporting broadband availability. The FCC took a lot of criticism about the old mapping system, which assumed that an entire Census block could buy the fastest broadband speed available anywhere in that Census block. This meant that even if only one home was connected to a cable company, the FCC map showed that everybody in the Census block could buy broadband from the cable company.

To fix this issue, the FCC decided that the new broadband reporting system would eliminate this problem by having an ISP draw polygons around areas where it already serves or could provide service within ten days after a customer request. If done correctly, the new method will precisely define the edge of cable and fiber networks.

The creation of the polygons creates a new challenge for the FCC – how to count the passings inside of any polygon an ISP draws. A passing is any home or business that is a potential broadband customer. CostQuest tried to solve this problem by creating a mapping fabric. A simplistic explanation is that they placed a dot on the map for every known residential and business passing. CostQuest has written software that allows them to count the dots of the mapping fabric inside of any possible polygon.
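
Conceptually, the counting step is just point-in-polygon geometry. Here’s a minimal sketch, using the shapely library and made-up coordinates – CostQuest’s actual software and data model are not public, so this only illustrates the geometry involved:

```python
# A minimal sketch of counting fabric points inside an ISP coverage polygon.
# The coordinates are invented; this only illustrates the geometry involved.
from shapely.geometry import Point, Polygon

# Hypothetical fabric: one dot per known residential or business passing.
fabric = [Point(-92.10, 32.70), Point(-92.12, 32.71), Point(-92.30, 32.80)]

# A hypothetical polygon an ISP might draw around its service area.
coverage = Polygon([(-92.15, 32.65), (-92.05, 32.65),
                    (-92.05, 32.75), (-92.15, 32.75)])

# Count the fabric dots that fall inside the polygon.
passings = sum(coverage.contains(dot) for dot in fabric)
print(passings)  # -> 2
```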

That sounds straightforward, but the big challenge was making the dots match actual passings. My consulting firm has been helping communities count passings for years as part of developing broadband business plans, and it is never easy. Communities differ in the raw data available to identify passings. Many counties have GIS mapping data that shows the location of every building in the community, but the accuracy and detail of GIS data differ drastically by county. We have often tried to validate GIS data against other sources, such as utility records. We’ve also validated against 911 databases that show each registered address. Even for communities that have these detailed records, it can be a challenge to identify passings. We’ve heard that CostQuest used aerial maps to count rooftops as part of creating the FCC mapping fabric.
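
As one illustration of the kind of cross-validation involved, here’s a sketch that compares a GIS address list against a 911 database. The address formats and the crude normalization rule are invented for the example:

```python
# A sketch of cross-validating two address sources (hypothetical formats).
def normalize(addr: str) -> str:
    """Crude normalization so '123 Main St.' and '123 MAIN ST' compare equal."""
    return " ".join(addr.upper().replace(".", "").split())

gis_addresses = {"123 Main St.", "125 Main St", "10 Oak Rd"}
e911_addresses = {"123 MAIN ST", "125 MAIN ST", "42 RIVER LN"}

gis = {normalize(a) for a in gis_addresses}
e911 = {normalize(a) for a in e911_addresses}

print("In 911 but missing from GIS:", e911 - gis)       # candidates to add
print("In GIS but not registered in 911:", gis - e911)  # candidates to verify
```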

Why is creating a fabric so hard? Consider residential passings. The challenge becomes apparent as soon as you start thinking about the complexities of the different living arrangements in the world. Even with great GIS data and aerial rooftop data, it’s hard to account for the details that matter, as the list below shows (a small filtering sketch follows the list).

  • How do you account for abandoned homes? Permanently abandoned homes are not candidates for broadband. How do you distinguish between truly abandoned homes and homes where owners are looking for a tenant?
  • How do you account for extra buildings on a lot? I know somebody who has four buildings on a large lot with only a single 911 address. The lot has a primary residence and a second residence built for a family member. There is a large garage and a large workshop building – both of which would look like homes from an aerial perspective. This lot has two potential broadband customers, and it’s likely that somebody using GIS data, 911 data, or aerial rooftops won’t get this one property right. Multiply that by a million other complicated properties, and you start to understand the challenge.
  • Farms are even harder to count. It wouldn’t be unusual for a farm to have a dozen or more buildings. I was told recently by somebody in a state broadband office that the CostQuest mapping fabric appears to be counting every building on farms – at least in the sample that was examined. If that is true, then states with a lot of farms are going to get a higher percentage of the BEAD grants than states that don’t have a lot of compound properties with many buildings.
  • What’s the right way to account for vacation homes, cabins, hunting lodges, etc.? It’s really hard with any of the normal data sources to know which ones are occupied full-time, which are occupied only a few times per year, which have electricity, and which haven’t been used in many years. In some counties, these kinds of buildings are a huge percentage of all buildings.
  • Apartment buildings are really tough. I know from working with local governments that they often don’t have a good inventory of the number of apartment units in each building. How is the FCC mapping data going to get this right?
  • I have no idea how any mapping fabric can account for homes that include an extra living space like an in-law or basement apartment. Such a home might easily represent two passings unless the two households decide to share one broadband connection.
  • And then there is the unusual stuff. I remember being in Marin County, California and seeing that almost every moored boat has a full-time occupant who wants a standalone broadband connection. The real world is full of unique ways that people live.
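
Here is a hypothetical rule-of-thumb filter of the sort a fabric-builder might apply to building footprints. Every field name and threshold is an invented assumption, and the bullets above show how often rules like these would get real properties wrong:

```python
# A hypothetical filter for turning building footprints into passings.
# Every field name and threshold is invented; the edge cases listed above
# show how often rules like these get it wrong.
def looks_like_a_passing(building: dict) -> bool:
    if building.get("type") in {"garage", "barn", "shed", "workshop"}:
        return False                    # wrong for the lot with two residences
    if building.get("footprint_sqft", 0) < 400:
        return False                    # wrong for small cabins and boats
    if building.get("vacant_years", 0) > 5:
        return False                    # can't tell abandoned from for-rent
    return True

buildings = [
    {"type": "house", "footprint_sqft": 1800, "vacant_years": 0},
    {"type": "workshop", "footprint_sqft": 1200, "vacant_years": 0},  # looks like a home from the air
    {"type": "cabin", "footprint_sqft": 350, "vacant_years": 0},      # seasonal, but still a passing
]
print(sum(looks_like_a_passing(b) for b in buildings))  # -> 1
```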

Counting businesses is even harder, and I’m not going to list all the complexities of defining business passings – but I think you can imagine it’s not easy.

I’m hearing from folks who are digging into the FCC mapping fabric that there are a lot of problems. ISPs say they can’t locate existing customers in the fabric, and they tell me it shows a lot of mystery passings that they don’t think exist.

We can’t blame CostQuest if they didn’t get this right the first time – Americans are hard to count. I’m not sure this is ever going to be done right, and I’m sitting here scratching my head and wondering why the FCC took this approach. I think a call to the U.S. Census Bureau would have gotten the advice that this is an impossible goal. The Census Bureau spends a fortune every ten years trying to identify where people live. The FCC has given itself the task of creating a 100% census of residences and businesses and updating it every six months.

The first set of broadband map challenges will be about the fabric, and I’m not sure the FCC is ready for the deluge of complaints it is likely to get from every corner of the country. I also have no idea how the FCC will determine whether a suggested change to the fabric is correct, because I don’t think communities can count passings perfectly either.

This is not the only challenge. There are also going to be challenges to the coverage areas claimed by ISPs. The biggest challenge, if the FCC allows it, will be about the claimed broadband speeds – and if the FCC doesn’t allow speed challenges, it is going to get buried in complaints. I think the NTIA was right to let the dust settle on challenges before using the new maps.

The 12 GHz Battle

A big piece of what the FCC does is weigh competing claims to use spectrum. It seems like there have been non-stop industry fights over the last decade about who gets to use various bands of spectrum. One of the latest fights, the continuation of a battle that has been going on since 2018, is over the use of the 12 GHz spectrum.

The big wrestling match is between Starlink, which wants the spectrum to communicate with its low-orbit satellites, and the cellular carriers and WISPs that want to use the spectrum for rural broadband. Starlink uses this spectrum to connect its ground-based terminals to satellites. The wireless carriers argue that the spectrum should be shared to enhance rural broadband networks.

The 12 GHz band is attractive to Starlink because it contains 500 MHz of contiguous spectrum with 100 MHz channels – a big data pipe between satellites and earth. The spectrum is attractive to wireless ISPs for the same reasons, along with other characteristics. The 12 GHz spectrum will carry twice as far as the spectrum commonly used today in point-to-multipoint broadband networks, meaning it can cover four times the area from a given tower. The spectrum is also clear of any federal or military encumbrance – something that restricts other spectrum like CBRS. The spectrum is also being used for cellular purposes internationally, which makes for an easy path to finding the radios and receivers to use it.
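
The coverage claim is simple circle geometry – doubling the reach of a tower quadruples the area covered. A quick sketch with made-up reach numbers:

```python
import math

# Doubling the reach of a tower quadruples its coverage, since the area of a
# circle grows with the square of the radius. The radii are invented examples.
r_other = 5.0    # km, hypothetical reach with commonly used spectrum
r_12ghz = 10.0   # km, twice the reach claimed for 12 GHz

ratio = (math.pi * r_12ghz**2) / (math.pi * r_other**2)
print(ratio)  # -> 4.0
```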

In the current fight, Starlink wants exclusive use of the spectrum, while the wireless carriers say that both sides can share the spectrum without much interference. These are always the hardest fights for the FCC to sort out because most of the facts presented by both sides are largely theoretical. The only true way to find out about interference is in real-world situations – something that is hard to simulate any other way.

A few wireless ISPs are already using the 12 GHz spectrum. One is Starry, which recently joined the 12 GHz Coalition, the group lobbying for terrestrial use of the spectrum. The coalition also includes Dish Network, various WISPs, and the consumer group Public Knowledge. Starry is one of the few wireless ISPs currently using millimeter-wave spectrum for broadband. The company added almost 10,000 customers to its wireless networks in the second quarter and is poised to grow a lot faster. If the FCC opens the 12 GHz spectrum to terrestrial uses, it seems likely that the spectrum would quickly be put to use in many rural areas.

As seems usual these days, both sides in the spectrum fight say that the other side is wrong about everything they are saying to the FCC. This must drive the engineers at the FCC crazy since they have to wade through the claims made by both sides to get to the truth. The 12 GHz Coalition has engineering studies that show that the spectrum could coexist with satellite usage with a 99.85% assurance of no interference. Starlink, of course, says that engineering study is flawed and that there will be significant interference. Starlink wants no terrestrial use of the spectrum.

On the flip side, the terrestrial ISPs say that the spectrum in dispute is only 3% of the spectrum portfolio available to Starlink, and the company has plenty of bandwidth and is being greedy.

I expect that the real story is somewhere in between the stories told by both sides. It’s these arguments that make me appreciate the FCC technical staff. It seems every spectrum fight has two totally different stories defending why each side should be the one to win use of spectrum.

Are BEAD Grants Large Enough?

One of the biggest questions associated with the $42.5 billion BEAD grant program is whether that is enough money to solve the national rural digital divide. The funding will be allocated to states in a three-step process. First, each State will get an automatic $100 million. Next, $4.2 billion will be allocated to States based on the relative percentage of locations in each state defined as unserved and high-cost. This will rely on the new FCC maps, and the NTIA may still refine the definition of high-cost areas. The remaining $38.1 billion will also be allocated to States using the new FCC maps, based on the relative number of unserved locations in each State.
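
For a rough feel of the allocation math, here’s a simplified sketch with invented location counts. The actual NTIA formula has more moving parts (territories, the precise high-cost definition), so treat this as illustrative only:

```python
# A simplified sketch of the three-step BEAD allocation described above.
# The location counts are invented; the real NTIA formula has more detail.
def allocate_bead(unserved: dict, high_cost: dict,
                  total=42.5e9, minimum=100e6, high_cost_pool=4.2e9) -> dict:
    awards = {state: minimum for state in unserved}    # step 1: $100M each
    hc_total = sum(high_cost.values())
    uns_total = sum(unserved.values())
    remainder = total - minimum * len(unserved) - high_cost_pool
    for state in unserved:
        awards[state] += high_cost_pool * high_cost[state] / hc_total  # step 2
        awards[state] += remainder * unserved[state] / uns_total       # step 3
    return awards

# Hypothetical unserved and high-cost location counts for three states:
unserved = {"A": 400_000, "B": 150_000, "C": 50_000}
high_cost = {"A": 120_000, "B": 30_000, "C": 10_000}
for state, dollars in allocate_bead(unserved, high_cost).items():
    print(state, f"${dollars / 1e9:.2f}B")
```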

The funding works out to an average of around $850 million per state, but the amounts will vary significantly. Preliminary estimates have a number of states getting only the $100 million minimum – Connecticut, Delaware, District of Columbia, Hawaii, Maine, New Hampshire, North Dakota, Rhode Island, and Vermont. The largest allocations are estimated to go to Texas at $4.2 billion and California at $2.8 billion.

States have been doing the math to see if they think the BEAD grant funding will be enough to reach every rural household with good broadband. I’ve only been able to find one article that cites an estimate of the effectiveness of the BEAD grants, but this one example raises some good questions.

The State of Minnesota is estimated to receive about $650 million in BEAD grant funding. In March of this year, the State Legislature approved $110 million for the existing Border-to-Border grant program, with most of that money coming from the federal ARPA funding given to the state. At that time, the State broadband office estimated that the state will need around $1.3 billion in total grant funding to reach everybody in the state. If that is a good estimate, then even after the BEAD grants and the $110 million in State grants, the state will be $540 million short.

This raises a lot of questions. First, inflation has hit the broadband industry hard, and I’ve seen a lot of estimates that the cost to build broadband networks is 15% to 25% higher than just two years ago. That means the $42.5 billion in BEAD funding is not going to stretch nearly as far as was estimated when Congress established the BEAD grants. It also raises the question of how much inflation will further increase costs over the years it’s going to take to build BEAD-funded networks. It’s not hard to imagine BEAD networks still being constructed in 2026 and beyond.

I’ve also seen estimates that the rules established by Congress and the NTIA for the BEAD grants could add as much as another 15% to the cost of building broadband networks compared to somebody not using grant funding. These extra costs come from a variety of factors, including the requirement to pay prevailing wages, expensive environmental studies that are not undertaken for non-grant projects, the requirement to obtain a certified letter of credit, etc. Between the grant-related costs and the general inflation in the industry, BEAD projects could cost 30% or more above what the same networks would have cost two years ago without grant funding.
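
Note that the two cost pressures compound rather than simply add. A quick back-of-the-envelope using the midpoint of the estimates above:

```python
# Back-of-the-envelope compounding of the two cost pressures cited above.
construction_inflation = 0.20  # midpoint of the 15%-25% industry estimates
grant_rule_overhead = 0.15     # prevailing wage, environmental studies, etc.

total_increase = (1 + construction_inflation) * (1 + grant_rule_overhead) - 1
print(f"{total_increase:.0%}")  # -> 38%, consistent with "30% or more"
```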

This also raises an interesting question about how states allocated ARPA funding to broadband. Minnesota’s allocation of $110 million to broadband from ARPA is smaller than what many other states have done. As an example, my state of North Carolina allocated nearly $1 billion of the state’s ARPA money to broadband, and there are many states that have allocated $300 million or more to broadband. Part of the blame for a state like Minnesota not having enough money to reach everybody could be placed on the Legislature for not allocating much ARPA funding for broadband.

Another interesting question is how State broadband offices will deal with areas where a 75% grant is not enough for an ISP to make a business case. From the feasibility work I’ve been doing this year, I think there are a lot more areas that fit the high-cost category than might be expected. The NTIA says that it might allow exceptions for grants up to 100% of the cost of assets – but asking for extra funding will probably open up the possibility for a State to instead fund less costly technologies. Finding solutions for the many high-cost areas might turn out to be the unpredictable wild card in the BEAD grant process.

Finally, there are going to be areas where a State doesn’t make a BEAD grant award. It’s not hard to imagine a situation where only one ISP asks to serve an area, and a State broadband office decides that the ISP is unqualified to receive funding.

If the Minnesota estimate is even roughly accurate, it’s likely that Minnesota won’t be the only state that doesn’t receive enough BEAD money to get broadband to everybody. We’re not going to know this for sure until ISPs start applying for grants, but it won’t be a surprise if the BEAD grants are not large enough.

Congressional Push for a National Broadband Strategy

In August, a bill intended to align the federal government’s efforts related to broadband advanced through the Senate Committee on Commerce, Science, and Transportation. The bill was co-sponsored by Senators Roger Wicker, R-Mississippi, and Ben Ray Luján, D-New Mexico, and Representatives Tim Walberg, R-Michigan, and Peter Welch, D-Vermont.

The bill, S.4767, is titled the Proper Leadership to Align Networks (PLAN) for Broadband Act. The legislation is based on a report earlier this year from the Government Accountability Office that determined that federal broadband efforts are fragmented and overlapping. The bill proposes that the President develop a national broadband strategy to better align the federal broadband effort.

There is no question that national broadband policy is fragmented. We have an FCC that is ostensibly in charge of broadband policy but essentially washed its hands of broadband regulation under former Chairman Ajit Pai. The FCC has been in charge of tracking the state of broadband in the country for years and completely botched that task through an inadequate mapping process that allowed ISPs to report whatever they wanted about broadband coverage. And for much of the last few decades, the feeling in DC has been that the FCC is in the pocket of the giant ISPs the agency is supposed to be regulating.

Congress gave responsibility for the giant BEAD grant program to the NTIA, largely because Congress didn’t trust the FCC to administer it. But the NTIA doesn’t have much authority outside of the grant program. When the BEAD grants are behind us, the NTIA will fade into obscurity again in terms of national broadband policy.

The latecomer to the game is the FTC. The FCC handed some authority to the FTC when it abandoned broadband regulation. But the FTC mostly prosecutes individual ISPs for bad behavior after the fact and has no authority to impose regulations on all ISPs.

This bill asks the Executive branch to take a shot at fixing federal broadband dysfunction through the creation of a broadband plan. I assume this plan would be aimed at putting broadband regulation back together into a cohesive federal policy. If you’ve read my blog for years, you know how I feel about broadband plans. They are only as good as the follow-through on their recommendations. The decade-old national broadband plan was as near as you can get to a total bust – not because it didn’t include good recommendations, but because it was put on the shelf and quickly forgotten.

It’s hard to think that a new broadband plan, even one coming from this legislation, would fare any better than the last one. It will likely be a document with a few good ideas – but ideas that are softened to appease the many parties with input to the plan. It’s hard to imagine a new federal broadband plan going anywhere but on the shelf, as in the past.

I find it almost humorous that Congress would ask the White House to come up with the plan on how to fix the national broadband administration and regulation. The White House has almost zero power to implement any ideas the plan might suggest.

The one government entity that can create a coherent broadband plan is Congress. Congress writes the rules that direct how the FCC operates and could change the direction of the FCC overnight. It was Congress that gave the NTIA its strong current role in setting national broadband policy through the grant process, and Congress could expand that role if desired.

If Congress wants a coherent broadband policy, it needs to do nothing more than go into a room and write it. This Act is a way for Congress to pretend to be addressing broadband without actually doing so. If nothing happens after the creation of a newly written broadband plan, Congress can blame the White House.

The reality is that there are not enough votes in Congress to pass a new Telecommunications Act, which is what is needed to put national telecom policy back on track. There have obviously not been enough votes over the last decade to make any drastic changes to telecom policy. The large ISPs have bought enough influence in both parties to sidetrack any attempt by the federal government to regain the reins of broadband policy.

There is no telling if this particular legislation has enough legs to get to a floor vote – but it’s the kind of legislation that could garner enough votes from both parties to pass since the outcome threatens nobody.

Averting a Mapping Disaster?

Alan Davidson, the head of the National Telecommunications and Information Administration, recently announced that the agency is canceling plans to use the first iteration of the new FCC maps, which the FCC says will be available by early November. Davidson says that he feels obligated to let the FCC’s challenge process play out before using the mapping data. I’m sure this wasn’t an easy decision, but it signals that it’s better to hold out for a more accurate map than to settle for the first iteration of the new FCC maps.

This decision will clearly add more time and delay to the $42.5 billion BEAD grant program. But the decision to wait recognizes that using incorrect maps would almost inevitably mean lawsuits that could delay the grant program even longer.

The timing of the new maps became unfortunate when Congress mandated that the FCC maps must be used to allocate over $38 billion in grant funding to states. The FCC has been saying all summer that it hopes the new maps will be relatively accurate and will fix many of the obvious problems in the current broadband maps. If it weren’t for the pressure of the BEAD grant program, the FCC would have had several cycles of the new maps to smooth out kinks and errors in the reporting before it had to bless the new maps as solid. The NTIA decision to delay relieves the pressure to have the first set of maps be error-free – which nobody believes will happen. I have a hard time recalling any cutover of a major government software system that was right the first time, and the FCC’s assurances all summer have felt more like bravado than anything else.

Over the last few weeks, I’ve been talking to the engineers and other folks who are helping ISPs with the new maps. I didn’t talk to anybody who thinks the new maps will be solid or accurate. Engineers are, by nature, cautious folks, but I expected to find at least a few who thought the new maps would be okay.

I’ve been saying for six months that the likelihood of the new maps being accurate is low, and I was thinking about not writing anything more about mapping until we see what the new maps produce. However, I was prompted to write about mapping again when I saw a headline in FierceTelecom that quoted Jonathan Chambers of Conexon saying that the new maps will be a train wreck. Conexon is working with electric cooperatives all across the country to build broadband networks, which gives the company an interesting perspective on rural issues.

Jonathan Chambers cites two reasons for pessimism. One is the reason I already mentioned – it’s irrational to use the outputs of a brand-new federal mapping system to allocate billions of dollars between states. He says that there are simpler alternatives that would take all of the pressure off the new mapping system. He’s right, but unfortunately, Congress specifically required in the IIJA legislation that the FCC maps be used. It would take an act of Congress to change that.

Chambers is also pessimistic about the challenge process that is being allowed for the new maps. He expects the challenges to be major and ongoing. It seems unlikely that the FCC is prepared to investigate the huge number of protests that could come from every corner of the country claiming that the new maps got the facts wrong.

My discussions with engineers raised other questions not mentioned by Chambers. Some engineers told me that the underlying mapping fabric has a lot of mistakes. The fabric is where CostQuest, the firm that created the new mapping system, laid out the nationwide location of every possible broadband customer. This was a nearly impossible task in the short time the company had to create the maps. I’ve been working for years with local governments that use GIS data to define potential broadband locations, and it’s always a challenge to identify only those buildings where somebody might buy broadband and exclude buildings used for some other purpose.

My biggest concern is that ISPs are still allowed to report marketing speeds instead of actual speeds, and I fear that ISPs will be motivated to overstate broadband speeds in the new maps (as many have done in the old ones). Any area the maps show as already having broadband available at 100/20 Mbps will be declared ineligible for the BEAD grants, so any ISP that wants to protect against being overbuilt has a high motivation to claim that speed – and it seems likely that many of them will do so. I don’t know if this is true, but my interpretation of the FCC map challenge process is that the FCC won’t entertain challenges based on speed, only on coverage area. If that is true, there will be a huge uproar from states and communities that are disadvantaged by deceptive reporting from ISPs.

I’ve also heard from ISPs in the last week that were unable to navigate the new mapping system by the deadline. These are relatively small ISPs, but many of them have built fiber, and it’s not good to have them excluded from the maps. I’ve heard from multiple sources that the new mapping system is not easy to use, and I’ve heard of ISPs that didn’t have an engineer able to certify the maps and simply gave up.

I guess we’ll find out in a few months how the first draft of the maps turns out. The FCC says it will release the results by early November. I expect there are a whole lot of folks who are poised to compare the new maps to their local knowledge of actual broadband usage – and then the challenges will begin.

More WiFi Spectrum

There is more WiFi spectrum on the way thanks to the US Court of Appeals for the District of Columbia, which rejected a legal challenge from the Intelligent Transportation Society of America and the American Association of State Highway and Transportation Officials asking to vacate the FCC’s 2020 order to repurpose some of the spectrum that had been reserved for smart cars.

The spectrum is called the 5.9 GHz band and sits between 5.850 GHz and 5.925 GHz. The FCC had decided to allocate the lowest 45 MHz of the band to WiFi while allowing the upper 30 MHz to remain with the auto industry.
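
The arithmetic of the split is simple. A small sketch, assuming the WiFi allocation comes off the bottom of the band as the order describes:

```python
# The 5.9 GHz band split described above, with frequencies in GHz.
band_low, band_high = 5.850, 5.925   # the full 75 MHz band
wifi_mhz, auto_mhz = 45, 30          # lower 45 MHz to WiFi, upper 30 MHz to autos

wifi_top = band_low + wifi_mhz / 1000                # 5.895 GHz boundary
print(f"WiFi:  {band_low:.3f}-{wifi_top:.3f} GHz")   # WiFi:  5.850-5.895 GHz
print(f"Autos: {wifi_top:.3f}-{band_high:.3f} GHz")  # Autos: 5.895-5.925 GHz
assert round((band_high - band_low) * 1000) == wifi_mhz + auto_mhz
```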

The transition to WiFi will now begin. The FCC had originally given the auto industry a year to vacate the lower 45 MHz of spectrum and will likely have to set a new timeline to mandate the transition. The FCC also needs to rule on a waiver from the auto industry to redeploy Cellular Vehicle-to-Everything (C-V2X) technology from the lower to the upper portion of the band. This is the technology that most of the industry is using for testing and deploying self-driving vehicles.

The lower 45 MHz of the new spectrum sits adjacent to the existing 5.8 GHz WiFi spectrum. Combining the new spectrum with the existing band is a boon to WISPs, which now get a larger uninterrupted swath of spectrum for point-to-multipoint broadband deployment. During the early stage of the pandemic, the FCC gave multiple WISPs the ability to use the 5.9 GHz spectrum on a trial basis for 60 days, and many of them have been regularly renewing those temporary licenses since then.

When the FCC announced the resolution of the lawsuit, the agency issued a press release discussing the benefits touted by WISPs using the new spectrum. Some claimed to see between a 40% and 75% increase in throughput, mostly due to less congestion, since the spectrum is rarely used and there was little or no interference during the last year. The spectrum also provided a clear path for wireless backhaul between towers. Of course, once the spectrum is made available to all WISPs, much of this benefit will likely disappear as everybody starts vying to use it. But it is an increase in bandwidth potential, and that has to mean higher-quality wireless signals.

This spectrum will also be available for home WiFi. However, it takes a lot longer for the home WiFi industry to respond to new spectrum. It means upgrading home WiFi routers and also adding the capability to use the spectrum to the many devices in our homes and offices that use WiFi. Everything I’m reading says that we are still years away from seeing widespread use of the 6 GHz WiFi spectrum, and this new bandwidth will likely be rolled out on the same timeline.

This was an interesting lawsuit for several reasons. First, the entities filing the court suit challenged the FCC’s ability to change the use of spectrum in this manner. The court decision made it clear that the FCC is fully in the driver’s seat in terms of spectrum allocation.

This was also a battle between two large industries. The FCC originally assigned this spectrum to the auto industry twenty years ago. But the industry was slow to adopt any real-world uses of the spectrum, and it largely sat idle, except for experimental test beds. There is finally some movement toward deploying self-driving cars and trucks in ways that use the spectrum. But even now, there is still a lot of disagreement about the best technology for self-driving vehicles. Some favor smart roads that use the spectrum to communicate with vehicles, while the majority opinion seems to favor standalone smart-driving technology in each vehicle.

Between this order and the 6 GHz order, the FCC has come down solidly in favor of having sufficient WiFi spectrum going into the future. It’s clear that the existing WiFi bands are already heavily overloaded in some settings, and the WiFi industry has been successful in getting WiFi included in huge numbers of new devices. I suspect we’ll look back twenty years from now and say that these new WiFi bands are not enough and that we’ll need even more. But this is a good down payment to make sure that WiFi remains vigorous.

Another RDOF Auction?

There was a recent interview in FierceTelecom with FCC Commissioner Brendan Carr that covered a number of topics, including the possibility of a second round of RDOF. Commissioner Carr suggested that improvements would need to be made to RDOF before making any future awards, such as more vetting of participants upfront or weighting technologies differently.

The FCC is building up a large potential pool of broadband funding. The original RDOF was set at $20 billion, with $4.4 billion set aside for a second reverse auction, along with whatever was left over from the first auction. The participants in the first RDOF auction claimed only $9.2 billion of the $16 billion available, leaving $6.8 billion unclaimed. When the FCC recently decided not to fund LTD Broadband and Starlink, the leftover funding grew by another $2 billion. Altogether, that means over $11 billion in funds intended for RDOF remains unawarded.

We also can’t forget that around the same time as RDOF, the FCC had planned a 5G fund to enhance rural cellular coverage. Due to poor mapping and poor data from the cellular carriers, that auction never occurred. That puts the pool of unused funding at the FCC at around $20 billion, plus whatever new money might have accrued during the pandemic. That’s a huge pool of money, equal to half of the giant BEAD grants.

The biggest question that must be asked before considering another RDOF reverse auction is how much of the country will be covered by the BEAD grants. It would be massively disruptive for the FCC to inject more broadband funding before that grant process plays out.

Commissioner Carr said that some of the FCC’s funding could go to enhance rural cellular coverage. Interestingly, once the BEAD grant projects are built, that’s going to cost a lot less than originally estimated. A lot of the money in the proposed 5G fund would have been used to build fiber backhaul to reach rural cell sites, and I think the BEAD last-mile networks will probably reach most of those places without additional funding. However, there is probably still a good case to be made for funding more rural cell towers.

But there are larger questions involved in having another reverse auction. The big problem with the RDOF reverse auction was not just that the FCC didn’t screen applicants first, as Carr and others have been suggesting. The fact is that a reverse auction is a dreadful mechanism for awarding broadband grant money. A reverse auction is always going to favor lower-cost technologies like fixed wireless over fiber – it’s almost impossible to weight different technologies for an auction in a neutral way. It doesn’t seem like a smart policy to give federal subsidies to technologies with a 10-year life versus funding infrastructure that might last a century.

Reverse auctions also take state and local governments out of the picture. The upcoming BEAD funding has stirred hundreds of communities to get involved in the process of seeking faster broadband, and I think it’s clear that communities care about which ISP will become the new monopoly broadband provider in rural areas. If the FCC has a strict screening process up front, then future RDOF funding will only go to ISPs blessed by the FCC – and that probably means the big ISPs. I would guess that the only folks lobbying for a new round of RDOF are companies like Charter and the big telcos.

The mechanism of awarding grants by Census block created a disaster in numerous counties where RDOF was awarded in what is best described as Swiss cheese serving areas. The helter-skelter nature of the RDOF coverage areas makes it harder for anybody else to put together a coherent business plan to serve the rest of the surrounding rural areas. In contrast, states have been doing broadband grants the right way by awarding money for coherent, contiguous serving areas that make sense for ISPs instead of the absolute mess created by the FCC.

A reverse auction also relies on having completely accurate broadband maps – and until the FCC makes ISPs report real speeds instead of marketing speeds, the maps are going to continue to be fantasy in a lot of places.

Finally, the reverse auction is a lazy technique that allows the FCC to hand out money without having to put in the hard effort to make sure that each award makes sense. Doing grants the right way requires people and processes that the FCC doesn’t have. But we now have a broadband office and staff in every state thanks to the BEAD funding. If the FCC is going to give out more rural broadband funding, it ought to run the money through the same state broadband offices that are handling the BEAD grants. These folks know local conditions and know the local ISPs. The FCC could set overall rules about how the funds can be used, but it should let the states pick grant winners based upon demonstrated need and a viable business plan.

Of course, the simplest solution of all would be for the FCC to cut the USF rate and stop collecting Universal Service Fund revenues from the public. The FCC does not have the staff or skills needed to do broadband grants the right way. Unfortunately, that might not stop the FCC from tackling something like another RDOF auction so it can claim credit for solving the rural digital divide. If the FCC plans another RDOF auction, I hope Congress stops it from being foolhardy again.

Net Neutrality Legislation

In late July, Senators Edward Markey and Ron Wyden, along with Representative Doris Matsui, introduced a short bill titled the Net Neutrality and Broadband Justice Act that would classify broadband as a telecommunications service under Title II of the FCC rules.

It’s an interesting concept because this bill would stop the see-saw battle between Democrats and Republicans over regulating broadband. The Tom Wheeler FCC implemented net neutrality and related broadband regulation using Title II authority in 2015, and the Ajit Pai FCC completely killed Title II regulation in 2017. It’s clear that the current FCC under Jessica Rosenworcel intends to reinstate Title II authority. If Congress were to enact this law, it would make it impossible for future FCCs to flip-flop on the issue.

What is almost comical about the issue is that both parties make this appear to be a fight over net neutrality, which it is not. All of the public discussions of the issue have been couched as a discussion of whether we need federal net neutrality rules. However, the real fight is about whether broadband should be regulated. When the Ajit Pai FCC stripped away Title II authority for broadband, most of the FCC’s ability to regulate broadband in any meaningful way disappeared. It seems crazy not to have a national policy to regulate an industry where the two biggest ISPs control over 55% of the national market, where the four largest ISPs control over 75% of the market, and where fifteen ISPs control 95% of the market. Beyond the market power of a handful of ISPs, most consumers will say they have only one choice of fast broadband.

Net neutrality has never been the real issue. The big ISPs have only violated the principles of net neutrality in a serious way a few times, such as when the biggest ISPs restricted Netflix traffic in 2013 and 2014 to get the company to pay more for using the Internet. Soon after the Ajit Pai FCC killed net neutrality, the State of California enacted nearly identical rules, which have subsequently been affirmed by the courts. The biggest ISPs are largely following net neutrality everywhere, since complying only in California would be nearly impossible to manage.

The real fear the big ISPs have of Title II authority is that the FCC could theoretically implement rate regulation. This is the underlying issue for the continuing fight. The big ISPs also understand that the FCC will enact other restrictions if the agency has the authority to do so. But it is the fear of putting any restrictions on rates that draws heavy lobbying from the industry. The big ISPs have been using the term light-touch regulation to describe the current state of affairs – which in real life translates to practically no regulations at all.

I can’t imagine a time when the FCC would try to put a cap on ISP rates, but the agency could still restrict how ISPs charge. For example, it’s not hard to imagine the FCC putting curbs on data caps, where ISPs charge customers a lot extra for using too much broadband in a month. Everybody who knows how ISPs operate understands that there is almost no extra cost to an ISP for serving a heavy broadband user – data cap fees verge on fraud.

It doesn’t look likely that this bill has any chance of making it through the current Congress. The bill is unlikely to draw any Republican votes and may not even get a positive vote from all of the Democrats. The only way to pass it would be to somehow find a way to do so with a simple majority rather than the 60 votes normally needed.

It’s a shame because there should be regulatory oversight of such a vital industry that is operated by oligopolies. While a few cities seem to be finding ways to bring in multiple competing ISPs, most of the country still has only one or two ISPs that offer fast broadband. Because of the huge barrier to market entry created by the cost of building a new network, most of the country is not likely to see price competition for broadband. At a bare minimum, we ought to have the FCC fulfilling one of its prime regulatory responsibilities, which is to make sure that ISPs don’t overreach too badly with the public.

Beware the Grant Challenges

One of the hurdles faced by communities pursuing broadband grants is that many grant programs allow incumbent broadband providers to challenge the validity of a grant. The most common challenge is for an incumbent to claim that a grant incorrectly includes homes and businesses that already have fast broadband. Today’s blog includes a few examples of recent grant challenges and warns that communities need to be ready for challenges as part of seeking better broadband. It appears that the purpose of many challenges is to delay the process, with the ultimate hope of derailing or canceling grant awards.

The first challenge story comes from East Carroll Parish in northeastern Louisiana, involving the state grants that have been dubbed GUMBO grants. In this case, the grant was to go to Conexon to bring fiber to the rural parts of the Parish. A challenge was filed by Sparklight, the incumbent cable company (which rebranded from Cable One). Sparklight claimed that it serves 2,856 homes in East Carroll Parish with 960/50 Mbps broadband – a dubious claim since the entire Parish has only 2,792 households. I talked to several residents of the parish who say that Sparklight does not serve rural residents and that most of the parish has little or no broadband options.

This challenge is unusual in that it came after the grant was awarded. The grant process had included several months to file protests, and Sparklight had said nothing during that period. I’m not sure I have the facts entirely straight, but it seems like the Legislature allowed for a second 7-day challenge period that was not part of the original rules. In this case, the challenge came on the same day that the Parish had planned to kick off the process of asking residents to join a sign-up list for service, and after the Governor had come to the Parish to announce the grant award. There were a lot of other challenges around the state as part of the GUMBO grant process.

Another well-publicized grant challenge involved an NTIA grant sought to bring broadband to Grafton County, New Hampshire. In this case, the incumbents challenged 3,000 of the 4,000 Census blocks covered by the grant. It’s difficult for any grant applicant to defend a challenge of this magnitude, even if the grant areas legitimately qualify for the funding. It turns out that most of the challenges were erroneous.

An interesting grant story comes from Washington State, where I ran across an article about a challenge to a grant application filed by the Grays Harbor Public Utility District, an electric and water utility serving the county. The PUD operates an open-access network where it builds fiber and lets multiple ISPs compete to serve residents and businesses, and the grant would have expanded that open-access network.

Grays Harbor PUD had asked for a grant to serve 922 homes in an area where both broadband and cellular coverage are almost nonexistent. The PUD would have used the grant money to bring fiber to this pocket of rural homes. In Washington State, grants and challenges are handled by the Public Works Board, a group of 14 volunteers appointed by the Governor. Comcast objected to the grant and said that 249 of the homes near one of the towns could already buy broadband from Comcast.

There was no investigation of the challenge claim, and the Public Works Board rejected the grant outright, along with eight other grants that had received similar objections. The PUD believes that almost all of the homes being challenged cannot buy broadband from Comcast, but the PUD was given no opportunity to dispute the objection. A better solution would have been to investigate the challenge and trim out the homes that can already buy broadband. The Public Works Board thought it was obligated to toss out a grant that violated the grant rules, but it has since reexamined its processes because it appears that State law would have allowed the Board to make a partial grant covering the homes that don’t have broadband.

These are just a few of the many hundreds of stories of challenges that have been filed against grants over the last year. These stories are important because they presage what might happen with the upcoming $42.5 billion BEAD grants that include a challenge process. The fact that challenges are allowed puts the burden on communities to do the homework to make sure that grant areas fit the grant rules. This means gathering speed tests and also getting testimonials from residents explaining the lack of broadband choices. Grant offices can get overwhelmed if huge numbers of challenges are filed – communities that can prove their story are going to fare the best.
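
As one example of that homework, a community can summarize collected speed tests to show how many locations fall below the served thresholds. Here’s a sketch with an invented record format and invented test results:

```python
# A sketch of summarizing community speed tests for a grant challenge.
# The record format and the tests themselves are invented examples.
tests = [
    {"address": "123 Main St", "down_mbps": 4.1, "up_mbps": 0.8},
    {"address": "10 Oak Rd", "down_mbps": 22.0, "up_mbps": 2.5},
    {"address": "42 River Ln", "down_mbps": 110.0, "up_mbps": 22.0},
]

# 25/3 Mbps is the FCC's long-standing definition of served; 100/20 Mbps is
# the threshold that matters for BEAD eligibility.
below_25_3 = [t for t in tests if t["down_mbps"] < 25 or t["up_mbps"] < 3]
below_100_20 = [t for t in tests if t["down_mbps"] < 100 or t["up_mbps"] < 20]

print(f"{len(below_25_3)} of {len(tests)} tests below 25/3 Mbps")
print(f"{len(below_100_20)} of {len(tests)} tests below 100/20 Mbps")
```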