RDOF in Trouble?

In June, three Senators – Roger Wicker, Cindy Hyde-Smith, and J.D. Vance – sent a letter to FCC Chairperson Jessica Rosenworcel asking for relief for RDOF award winners. The Senators said that the RDOF subsidies are no longer adequate due to massive increases in construction costs to build and operate the promised RDOF networks. The Senators asked that the FCC particularly consider relief for RDOF winners that have fewer than 250,000 broadband customers.

More recently, a coalition of RDOF winners sent a similar letter pleading for relief from the same issues. The RDOF winners cited much higher costs due to both the pandemic and to actions taken by the federal government through funding programs that were not in place at the time of the RDOF awards. The coalition offered various possible solutions the FCC might consider to help the winners meet their RDOF obligations:

  • Extra funding to RDOF winners that have “affirmatively requested such funding.”
  • A short amnesty window to let RDOF winners withdraw from RDOF if the FCC is not going to provide supplemental funding.
  • Earlier payments for the RDOF funds due from years 7-10.
  • Add an additional eleventh year of RDOF subsidy.
  • Provide relief from all or some of the letter of credit requirements.

As a reminder to readers, the RDOF program provided a 10-year subsidy to cover some of the costs of deploying broadband in unserved areas. The money was awarded in a reverse auction that ended in November 2020, where the lowest bidder in any given Census block won the subsidy. The FCC originally considered awarding as much as $16 billion in the auction. However, many ISPs bid prices lower than expected, and only $9 billion was claimed at the close of the auction. Some ISPs withdrew after the auction, and the FCC disqualified some large bidders like Starlink and LTD Broadband. Ultimately, only $6 billion of subsidies are now in play.

Chairwoman Rosenworcel quickly responded to the three Senators and largely closed the door on the Senators’ requests. She pointed out that the FCC reserves the funding through the Universal Service Fund each quarter to pay the $6 billion subsidy and that there is no additional funding to increase RDOF payments. She also reminded the Senators that the rules and penalties for withdrawing from RDOF were clearly known by bidders before they bid in the auction and that the penalties are in place to ensure that RDOF winners fulfill their obligations.

The entire RDOF process has been badly flawed since the beginning. Some auction winners bid prices far lower than expected. The areas that were available in many places are scattered and don’t create a reasonable footprint for building a broadband solution. There clearly has been an unprecedented amount of inflation since the awards were made. But to be fair, the RDOF awards were made after the pandemic was in full force, and winners could reasonably have anticipated that there would be economic consequences of a major pandemic. Even without the last year of high inflation, it would be hard not to expect some kind of economic turmoil during a 10-year subsidy plan.

I have no doubt that many RDOF winners are now looking at a broken financial model for fulfilling their promise. They are stuck with a terrible dilemma – build the promised networks and have a losing business or pay a substantial penalty to withdraw from RDOF.

It’s disturbing that both the Senators and some RDOF winners are asking for a soft landing for anybody who wants to change their mind and withdraw from RDOF. The RDOF footprints have already been off-limits for other federal grant programs that could have brought faster broadband to these areas. It’s fully expected that the BEAD grants will start being awarded next year, and it would be a true disaster if ISPs default on RDOF after those grants have been awarded. That could strand large numbers of folks with no broadband solution.

This is a dilemma for the FCC. No matter what the agency does, there will likely be additional negative outcomes if RDOF winners are unable to fulfill the pledge to build and operate the promised networks. I’ve always expected the program to have eventual troubles since many of the winning auction bids were lower than what seemed to be needed to create a sustainable business. But I never thought that we’d be seeing requests for a major rework of the program less than three years after the end of the auction.

Too Little Too Late

On July 25, Chairwoman Jessica Rosenworcel shared with the other FCC Commissioners a draft Notice of Inquiry that would begin the process of raising the federal definition of broadband from 25/3 Mbps to 100/20 Mbps. In order for that to become the new definition, the FCC must work through the NOI process and eventually vote to adopt the higher speed definition.

This raises a question about the purpose of having a definition of broadband. The requirement comes from Section 706 of the Telecommunications Act of 1996, which directs the FCC to make sure that broadband is deployed on a reasonable and timely basis to everybody in the country. The FCC interpreted that to mean that it couldn’t measure broadband deployment unless it created a definition of broadband. The FCC uses its definition of broadband to count the number of homes that have or don’t have broadband.

The FCC is required by the Act to report the status of broadband deployment to Congress every year. During the last week of Ajit Pai’s time as FCC Chairman, he issued both the 2020 and 2021 broadband reports to Congress. Those reports painted a rosy picture of U.S. broadband, partially because progress was measured using the 25/3 Mbps definition of broadband and partially because the FCC broadband maps were rife with overstated speeds. The FCC has not issued a report since then, and I can only suppose there aren’t the votes in an evenly split FCC to approve a new report.

To give credit, Chairwoman Rosenworcel tried to get the FCC to increase the definition of broadband to 100/20 Mbps four years ago, but the idea went nowhere in the Ajit Pai FCC. At that time, 100/20 Mbps seemed like a reasonable increase in the definition of broadband. Most cable companies were delivering 100 Mbps download as the basic product, and a definition set at 100/20 Mbps would have amounted to a federal statement that the speed most folks buy in cities is a reasonable definition of broadband for everybody else.

Chairwoman Rosenworcel is now ready to try again to raise the definition. Perhaps the possible addition of a fifth Commissioner means this has a chance of passing.

But this is now too little too late. 100/20 Mbps is no longer a reasonable definition of broadband. In the four years since Chairwoman Rosenworcel introduced that idea, the big cable companies have almost universally increased the starting speed for broadband to 300 Mbps download. According to OpenVault, almost 90% of all broadband customers now subscribe to broadband packages of 100 Mbps or faster. 75% of all broadband customers subscribe to speeds of at least 200 Mbps. 38% of households now subscribe to speeds of 500 Mbps or faster.

I have to think that the definition of broadband needs to reflect the broadband that most people in the country are really using. One of the secondary uses of the FCC broadband definition is that it establishes a goal for bringing rural areas into parity with urban broadband. If 75% of all broadband subscribers in the country have already moved to something faster than 200 Mbps, then 100 Mbps feels like a speed that is already in the rearview mirror and is rapidly receding.

When the 25/3 definition of broadband was adopted in 2015, I thought it was a reasonable definition at the time. Interestingly, when I first read that FCC order, I happened to be sitting in a restaurant that was lucky enough to be able to buy gigabit speeds and was sharing it with customers. I knew from that experience that the 25/3 Mbps definition was going to become quickly obsolete because it was obvious that we were on the verge of seeing technology increases that were going to bring much faster speeds.

I think the FCC should issue two broadband definitions – one for measuring broadband adoption today and a second definition as a target speed for a decade from now. That future broadband target speed should be the minimum speed required for projects funded by federal grants. It seems incredibly shortsighted to be funding any technology that only meets today’s speed definition instead of the speeds that will be needed when the new network will be fully subscribed. Otherwise, we are building networks that are too slow before construction is even finished.

Another idea for the FCC to consider could take politics out of the speed definition. Let’s index the definition of broadband using something like the OpenVault speed statistics, or perhaps the composite statistics of several firms that gather such data. Indexing speeds would mean automatic periodic increases to the definition of broadband. If we stick to the current way of defining broadband, we might see the federal definition increase to 100/20 Mbps at the end of this year and then not see another increase for another eight years.
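The indexing idea can be sketched in a few lines. This is purely a hypothetical illustration, not a proposal for actual FCC mechanics – the function name, the percentile choice, and the tier shares below are all made up for the example; a real index would feed in published OpenVault-style data.

```python
# Hypothetical sketch: tie the broadband definition to what subscribers
# actually buy. All numbers below are illustrative, not real data.

def indexed_definition(tier_shares, percentile=0.25):
    """Pick the fastest tier such that at least `percentile` of subscribers
    are at that speed or slower -- i.e., the definition tracks what the
    bottom quartile of the market actually buys.

    tier_shares: list of (download_mbps, share_of_subscribers), sorted by
    speed ascending, with shares summing to 1.0.
    """
    cumulative = 0.0
    for speed, share in tier_shares:
        cumulative += share
        if cumulative >= percentile:
            return speed
    return tier_shares[-1][0]

# Illustrative distribution loosely shaped like the statistics cited above.
shares = [(100, 0.10), (200, 0.15), (300, 0.37), (500, 0.38)]
print(indexed_definition(shares))        # 200 with these made-up numbers
print(indexed_definition(shares, 0.05))  # 100
```

Recomputing the index each year against fresh subscription data would give the automatic periodic increases described above, with the only policy decision being the percentile.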

Unintended Consequences of Satellite Constellations

Astronomy & Astrophysics published a research paper recently that looked at “Unintended Electromagnetic Radiation from Starlink Satellites”. The study was done in conjunction with the Low Frequency Array (LOFAR) telescope in the Netherlands.

The LOFAR telescope is a network of over forty radio antenna stations spread across the Netherlands, Germany, and the rest of Europe. This array can detect extremely long radio waves from objects in space. The antennas are purposefully located in remote areas to reduce interference from other radio sources.

The study documents that about fifty of the 4,000 current Starlink satellites are emitting frequencies in the range between 150.05 and 153 MHz, a band that has been set aside worldwide for radio astronomy by the International Telecommunication Union. The emitted radiation from the satellites is not intentional, and the guess is that these are stray frequencies generated by some of the electronic components. This is a common phenomenon for electronics of all sorts, but in this case, the stray frequencies are interfering with the LOFAR network.
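The interference test at the heart of this finding is just band overlap. Here is a trivial sketch – the band limits come from the paragraph above, but the function and the sample frequencies are made up for illustration:

```python
# Sketch: does an emission overlap the ITU radio-astronomy allocation
# at 150.05-153 MHz cited above? All frequencies in MHz.

PROTECTED_BAND = (150.05, 153.0)

def overlaps_protected(f_low, f_high, band=PROTECTED_BAND):
    """True if the emission range [f_low, f_high] intersects the band."""
    return f_low <= band[1] and f_high >= band[0]

print(overlaps_protected(149.0, 151.0))  # True  -- straddles the band edge
print(overlaps_protected(154.0, 156.0))  # False -- entirely above the band
```

The hard part for the LOFAR researchers was not this test but detecting emissions that the satellites were never designed to make in the first place.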

This interference adds to the larger ongoing concern about the unintended impact of large satellite constellations on various branches of science. We already can see that satellites mar photographs of deep space as they pass in front of cameras. The intended radiation from the satellite constellations can accumulate and interfere with other kinds of radio telescopes. There is a fear that this current radiation will interfere with the Square Kilometer Array Observatory that is being built in Australia and South Africa. This new project is being built in remote locations away from cellphones, terrestrial TV signals, and other radios. But satellite arrays will still pass within the range of these highly sensitive radio sites.

The fear of scientists is that interference will grow as the number of satellites increases. Starlink’s current plans are to grow from the current 4,000 satellites to over 12,000 satellites – and the company has approval from the FCC to launch up to 30,000 satellites. There are numerous other satellite companies around the world with plans for constellations – and space is going to get very busy over the next decade.

One of the issues that concern scientists is that there is nowhere to go for relief from these kinds of problems. There are agreements reached at the International Telecommunication Union for setting aside various bands of spectrum for scientific research. But there is no international policeman with the authority to force satellite companies into compliance.

In this case, Starlink is working with the scientists to identify and isolate the issue to hopefully eliminate the stray radiation from future satellites. If the problem gets too bad, the FCC could intercede with Starlink. But who would intercede with satellites launched by governments that don’t care about these issues?

I don’t know how many of you are stargazers. When I was a kid in the early 60s, it was a big deal to see a satellite crossing the sky. A few satellites, like Telstar, were large bright objects crossing the sky. Most of the new satellites are much smaller, but it still doesn’t take very long watching the sky to see a satellite crossing. The sky is going to be busy when there are tens of thousands of satellites passing overhead. It’s hard to think that won’t have unexpected consequences.

Outlook for FWA Cellular Wireless

Mike Dano at LightReading published a recent article looking at the future of FWA (cellular fixed wireless). For those not familiar with the technology, this is broadband delivered to homes and businesses by cellular companies using the new spectrum bands that have been labeled as 5G. This is a new product that has only been around for a little over a year and has already taken the broadband market by storm. At the end of the first quarter of this year, T-Mobile had almost 3.2 million customers and Verizon had almost 1.9 million. It’s likely that UScellular will be entering the market in a big way along with DISH. AT&T is still somewhat on the sidelines – it has an FWA product but is still making fiber a priority.

Dano talked to analysts at Wells Fargo who track the broadband industry. They are predicting that FWA will capture 10% of the residential market by 2025. To put that into perspective, there are currently around 118 million homes with broadband, and FWA has quickly captured over 4% of those households. Wells Fargo analysts are predicting an additional 6.8 million FWA customers by 2025.
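As a sanity check, the Wells Fargo projection can be reconstructed from the article's round numbers. This is just arithmetic on the figures cited above, not new data:

```python
# Back-of-the-envelope check of the Wells Fargo FWA projection,
# using the round numbers cited in the article.

homes_with_broadband = 118_000_000
fwa_today = 3_200_000 + 1_900_000          # T-Mobile + Verizon, Q1

share_today = fwa_today / homes_with_broadband
target_2025 = 0.10 * homes_with_broadband  # 10% of the residential market
additional_needed = target_2025 - fwa_today

print(round(share_today * 100, 1))         # 4.3 -- the "over 4%" cited
print(round(additional_needed / 1e6, 1))   # 6.7 -- close to the 6.8M projection
```

The small gap between 6.7 and 6.8 million presumably reflects the analysts counting UScellular, DISH, and AT&T subscribers that the two big carriers' totals omit.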

Interestingly, these same analysts predict that the cable company share of the residential market will drop from 67% today to 62% by 2025, a drop of 5.9 million customers. I’m not sure how the explosion of fiber construction plays into that math.

These analysts and others foresee FWA hitting a natural plateau as the technology reaches a saturation point in neighborhoods. The FWA technology is not able to serve all homes in an area due to several issues. First, while this product is nice for the bottom line of big cellular companies, their bread-and-butter product is serving cell phones. Since FWA shares the same spectrum, there is a natural limit on how many FWA customers they are willing to serve in any neighborhood. Additionally, both T-Mobile and Verizon tell FWA customers in the fine print that they will throttle the bandwidth anytime cellphone usage gets too busy. When that starts happening, I predict a lot of households will lose interest in the FWA product.

We got a deeper glimpse into the plans for FWA when CEO Mike Sievert of T-Mobile talked about the product at the J.P. Morgan Global Technology, Media, and Communications Conference. He says that T-Mobile’s overall market penetration in small and rural markets is now at 16%, and the company’s target is to reach 20% by 2025. He says in prime small markets the company is targeting penetration rates in the mid-30s.

I have my own speculations about FWA. FWA is currently seeing big success because it is filling several market niches. In rural areas, the product delivers speeds from 50 Mbps to 200 Mbps depending on how far a customer lives from a tower. In markets where the alternatives are slower technologies like satellite, DSL, or WISP broadband, customers are happy to have relatively fast broadband for the first time. FWA is also the product for the price-conscious consumer, priced between $50 and $65 when most other broadband technologies cost more. In towns and cities, this product delivers a faster alternative to DSL.

But I have a hard time seeing FWA dominating any market in the long run. Many of the rural markets where it will have gained significant market share will eventually get fiber from the many rural broadband grant programs. Will households stick with FWA when there is a much faster product?

I’ve already been reading online reviews that talk about the unpredictable bandwidth, which is inherent in a network that shares bandwidth with cellphone customers. Cellular bandwidth already varies throughout the day for a wide variety of reasons – something that anybody who watches the bars on their cell phone understands. FWA is not going to deliver the same guaranteed speed performance as a wired technology – quality will vary according to local conditions.

Finally, within a decade, a 100 Mbps connection is going to feel as obsolete as 25/3 Mbps broadband feels today. At the end of the first quarter of this year, OpenVault said that only 9.5% of all broadband households are still subscribed to a broadband product of 100 Mbps or less. The public has already abandoned 100 Mbps broadband, and the vast majority of households already have something faster. My prediction is that FWA will have a spectacular market share for the next five years, but a decade from now, the only households still using it will be the same ones that stick with DSL today – homes for whom price is far more important than performance.

Cable Companies and BEAD Grants

The BEAD grant program establishes a clear definition of areas that are eligible for grants: any place where broadband speeds are less than 100/20 Mbps. During the lobbying over the IIJA legislation that created the BEAD grants, the cable companies and WISPs lobbied hard to get that speed definition, while pro-broadband members of Congress wanted to set the definition at 100/100 Mbps.

I’ve been wondering lately how State Broadband offices are going to deal with cable companies that don’t meet that speed today. In working around the country, I keep encountering cable networks that don’t meet that definition.

Some of the areas served by cable companies are clearly grant eligible. While the big cable companies have all said that they’ve upgraded all of their networks to be fast, there are still tiny systems owned by the big cable companies that are using older technologies. We were recently working in a county in New Mexico where Comcast was still reporting 50 Mbps speeds for a cable network in the county seat. This system was still operating at DOCSIS 2.0, but to Comcast’s credit, it is finally upgrading the technology to the latest DOCSIS 3.1. If Comcast didn’t make this upgrade, this network would be eligible for BEAD funding.

But the more common issue is cable networks that don’t meet the 20 Mbps upload speed test. I’m seeing a lot of networks, particularly in the county seats of smaller counties, where the upload speeds can’t reach 20 Mbps. Should an area where the cable network is only delivering 10 Mbps or 15 Mbps upload be eligible for BEAD grants?

In most cases, the cable companies report the upload speed as exactly 20 Mbps in the FCC maps. But speed tests often show that few households meet that speed. I’m working with a few communities that are up in arms over this because many of the homes and businesses struggle with the restrictions that come with slow upload speeds.

Businesses have embraced functions that need upload speeds. They might want to allow multiple employees to conduct separate Zoom calls at the same time. Businesses have embraced software that operates in the cloud that requires constant upload connections. Businesses have converted to VoIP and need upload bandwidth to maintain telephone calls. Businesses use security systems that upload videos from security cameras to the cloud. I’ve interviewed dozens of businesses that say that they feel restricted by problems with upload speeds – particularly on older networks with a lot of jitter and latency.

Communities are confused about how to deal with this issue. I’ve not heard of any communities served by a cable company that got reclassified as underserved in the flurry of map challenges over the last six months – if readers know of any such reclassifications, I’d love to hear about it. Many communities are hoping that State Grant offices will be open to allowing for BEAD grants in these communities – but that feels like a big uphill battle.

The reality is that when an ISP declares in the FCC broadband maps that it is exactly meeting one of the speed thresholds of either 100 Mbps download or 20 Mbps upload, there should be an opportunity for a community to challenge the FCC maps.

Clearly, a local cable network delivering 10 or 12 Mbps upload speeds does not meet the definition of being served. But I have to wonder about the practical chances of somebody getting a BEAD grant to overbuild such a community.
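The speed-test evidence behind such a challenge can be summarized very simply. Here is a hypothetical sketch of how a community might compare test results against the 100/20 Mbps threshold – the function, the use of the median, and the sample data are illustrative assumptions; a real map challenge would follow the FCC's and the state's evidentiary rules:

```python
# Hypothetical sketch: judge an area against the 100/20 Mbps served
# threshold using the median of community speed tests. The data and
# the choice of median as the summary statistic are assumptions.

from statistics import median

SERVED_DOWN, SERVED_UP = 100.0, 20.0  # Mbps thresholds from the text

def is_served(download_tests, upload_tests):
    """True only if median download AND median upload meet the thresholds."""
    return (median(download_tests) >= SERVED_DOWN
            and median(upload_tests) >= SERVED_UP)

# Made-up results for a cable system with strong download but weak upload:
downloads = [240, 310, 180, 290, 260]
uploads = [12, 15, 9, 14, 11]
print(is_served(downloads, uploads))  # False -- median upload is only 12 Mbps
```

This is exactly the pattern described above: a network that sails past the download test while the upload side falls well short of 20 Mbps.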

Another Nuance of FCC Broadband Maps

There is one nuance of the FCC maps that doesn’t seem to be talked about. ISPs are only supposed to show coverage on the FCC maps for locations where they are able to serve within ten business days of a customer’s request for service. Any ISP that is claiming areas it won’t serve that quickly is exaggerating its coverage on the FCC maps. That can have real-life consequences.

This issue can apply to ISPs using many different technologies and situations:

  • I moved to a home in Florida and was told by Comcast that I couldn’t get broadband without a site survey because they didn’t know if the network was available, although all of my neighbors had Comcast, and there was a pedestal in my yard. It took me 45 days to convince Comcast to do a site survey and then several more weeks to schedule the drop.
  • Many fiber builders routinely take 30 days or more to schedule drop construction – following the mapping rules would imply only showing existing customers on the FCC map. Many fiber overbuilders won’t build a drop over some set length unless the customer pays the extra cost. Can an ISP claim an area as served when a customer will have to agree to pay $1,000 to get connected?
  • In looking at detailed mapping data, I find apartment buildings claimed within the coverage of large cable companies that don’t have cable broadband. It’s almost inconceivable that any ISP could bring service to a large apartment building within ten days of a request. There are dozens of issues involved with bringing broadband to MDUs.
  • WISPs routinely tell prospective customers that they don’t know if they can serve them until they conduct a site survey. It often turns out that they can’t serve some homes without taking extra steps like constructing a tall pole to receive the signal or cutting down trees – but often, some locations can’t be served. There are some neighborhoods where a WISP can’t reach most homes. I’ve talked to several WISPs who are clueless about how to accurately report to the FCC.
  • Starlink shows widespread coverage in the FCC maps, although there are now many neighborhoods that are considered saturated, where the company won’t take any new customers.
  • Many of my readers know Chris Mitchell. The FCC map for his home deep inside St. Paul, Minnesota showed the availability of gigabit wireless from a WISP. Upon inquiry, the WISP said it was willing to serve him but would have to build fiber first to be able to deploy the needed radios. It’s fairly clear that this particular WISP is using the FCC maps as advertising to let folks know it is in the area, and it has greatly exaggerated its coverage area by ignoring the 10-day rule.

There are real-life consequences for areas that are misclassified on the FCC maps. Consider the pockets of unserved areas inside cities. We worked with an urban area recently where we identified nearly 200 such unserved pockets. If those pockets were identified correctly on the FCC maps, then an ISP could ask for BEAD or other grant funding to extend networks into these small areas. But if they are claimed as served, then it would be an uphill battle to get grant funding.

In rural areas, any ISP that offers speeds greater than 100/20 Mbps is effectively locking down areas that it claims to serve – and in doing so, is stopping grant money from funding unserved areas. I can’t imagine any easy way to estimate the overall impact of areas that are overclaimed because of ISPs ignoring the 10-day rule – but it’s not hard to imagine that this could represent an additional 5% – 10% of unserved places in rural areas that are incorrectly identified as served. It’s hard to even imagine the extent of the problem in urban areas.

Telehealth Successes

The American Medical Association fully supports telehealth. The group has now gathered enough evidence of the effectiveness of telehealth and touts its advantages to member physicians. There are still a few ongoing issues involving compensation, privacy, liability, and of course, rural patients without broadband access.

The AMA has published a few case studies to help member physicians understand the range of benefits of telehealth. Following are a few of the many case studies highlighted by the AMA.

Telepsychiatry at Virginia Commonwealth University Health. The University serves a wide range of patients over a large footprint, including cities, suburbs, and rural areas. VCU Health conducts 3,000 – 5,000 psychiatric sessions per week. It had launched an online trial of psychiatric visits before the pandemic and was conducting up to 20% of psychiatry sessions online when the pandemic hit in March 2020. The numbers flipped with the pandemic, and by April 2020, VCU Health conducted 90% of sessions online, with the rest in person in hospitals.

Over time, it was discovered that patients liked having these visits from the comfort of their homes, and 80% of visits are still conducted online. The biggest overall benefit of the change to telehealth was that no-shows dropped from 11% pre-pandemic to 6% with more telehealth visits. This was partially credited to an online system that reminded patients of upcoming sessions.

Hypertension Program, Ochsner Health Systems. Ochsner is a non-profit academic health system that operates over 90 clinics and 20 hospitals in and around New Orleans. Ochsner launched a virtual hypertension program in 2015, long before the pandemic. The program is aimed at helping patients control high blood pressure. A lot of the patients are elderly and low-income. Each patient is assigned a dedicated team consisting of a clinician, pharmacist, and health coach. Patients are given blood pressure monitors that report results back to Ochsner. The average patient checks blood pressure 42 times per week. Patients who aren’t submitting pressure readings get reminders.

The program has been extremely successful. 79% of the virtual patients maintain the desired blood pressure compared to only 26% using traditional office visits. Medication adherence improved by 14% for the online patients. The best result is that the out-of-pocket cost for online patients went down an average of $77 per month due to not needing live doctor visits.

Teleneurology and Telestroke – Massachusetts General Hospital. Mass General is a large teaching hospital located in Boston. The hospital ran its first telestroke trial in 2000 with a small remote hospital on Martha’s Vineyard. The program provides specialists at Mass General to quickly help doctors at smaller hospitals understand if a patient is undergoing an acute ischemic stroke. If so, it’s vital to administer tPA, a lifesaving drug, within 60 minutes. The small hospital transmits brain scans and other data to Mass General to help in the diagnosis.

The trial was successful, and Mass General has extended the service to 34 community hospitals. This is a cost-savings program since otherwise, each small hospital is required to have a physician with acute stroke experience available 24/7. The Mass General team is able to respond to requests from smaller hospitals in an average of 5 minutes.


Broadband Grants and Affordable Rates

One of the things that I don’t hear discussed enough is that some of the ISPs chasing rural broadband grants have high broadband rates. I’m curious how much emphasis State Broadband offices will put on the retail rates of grant applicants when evaluating grant winners.

The two most easily identified ISPs with high broadband rates are Charter and Comcast. Charter rates for standalone basic broadband are now over $90 in many markets, and Comcast is nearing $100 per month. Neither ISP gives any indication that it is going to slow down with annual rate increases. In fact, now that broadband customer growth has slowed, rate increases are the best path for these companies to satisfy Wall Street expectations.

But these two companies aren’t the only expensive ISPs that are winning grants. The other big cable companies that will be pursuing grant funding have rates similar to Charter and Comcast. Many of the RDOF winners have high rates. Nextlink has affordable slower speeds but charges $100 for 300 Mbps – the starting speed for cable companies. Resound Networks currently charges $99 for 100 Mbps. I’ve seen several smaller RDOF winners with base rates starting around $100.

We already know that the high broadband rates of the cable companies in cities are a major factor in the growth of broadband deserts where many households can’t afford broadband. Numerous studies have shown a direct correlation between household income and broadband adoption – high rates make it harder to afford broadband.

To be fair to the big cable companies, most have special low rates for low-income residents, but that also comes with slower speeds. But I have to wonder if cable companies will be as willing to connect low-income homes on a newly built fiber network where it can easily cost over $1,000 to add a new subscriber. It’s relatively inexpensive in cities to add a customer to a coaxial cable network, but will the cable companies be willing to make a significant investment for homes that will have low rates that will take many years to break even?

A lot of ISPs participate in the FCC’s ACP plan that gives low-income subscribers a $30 monthly discount. But the funding for that program will be gone around the end of the first quarter of 2024, and it’s anybody’s guess if a divided Congress will approve the continuation of a low-income program.

What is not being discussed enough is that most of the ISPs that participate in ACP or have their own low-income plan don’t aggressively push savings to low-income households. It’s easy for public relations purposes to have these programs but not let customers know the discounts exist. The ISP that brags the loudest about serving low-income households is Comcast. The company’s website says that it has brought its low-income product to 10 million people since 2011. That’s impressive and probably equates to something like 4 million homes. It’s less impressive when you realize that Comcast passes over 61 million homes. ACP is available to homes with household incomes up to 200% of the federal poverty level. My quickie estimate is that perhaps 13 million homes in the Comcast footprint are eligible for the ACP discount (I’d appreciate other estimates).
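My quickie estimate can be reconstructed roughly as follows. The household size and the eligibility share in this sketch are my assumptions for the sake of the arithmetic, not figures from Comcast or the Census:

```python
# Rough reconstruction of the estimate above. The eligibility share is
# an ASSUMED figure (roughly the share of households under 200% of the
# federal poverty level), as is the average household size.

homes_passed = 61_000_000      # Comcast homes passed, from the text
people_served = 10_000_000     # low-income product users since 2011
eligible_share = 0.21          # assumption; refine with real census data
people_per_home = 2.5          # assumed average household size

eligible_homes = homes_passed * eligible_share
homes_served = people_served / people_per_home

print(round(eligible_homes / 1e6, 1))  # 12.8 -- near the 13 million estimate
print(round(homes_served / 1e6, 1))    # 4.0  -- the "something like 4 million homes"
```

Even under these generous assumptions, Comcast has reached under a third of the homes in its footprint that likely qualify for a discount.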

But back to high rates. There is a significant level of poverty in many rural areas. In the states I’ve been working in recently, the level of poverty in rural counties is generally higher than the statewide average. How much good are we doing for rural counties when we fund broadband networks where the rates will be over $90? Even with the ACP discount, the cost of broadband will be over $60.

I contrast this to many cooperatives and even the larger telco overbuilders. Most of them have broadband rates in the $60 – $70 range. If I’m a rural customer and some giant ISP is going to bring fiber using grant money, I’d prefer the rates from AT&T or Frontier over the big cable companies. But the surveys I’ve done show that folks prefer local ISPs over big ISPs – they are hoping that grants will go to cooperatives, small telcos, and other local ISPs.

I expect the BEAD grants will have the most complicated scoring of any broadband grant program ever. There are so many requirements for qualifying for a BEAD grant that it’s hard to see how broadband rates can play a significant role in determining who wins the grants. That’s unfortunate because, in the long run, rates might be the most important factor for an ISP that comes to a rural area.

Using 42 GHz Spectrum for Broadband

The FCC circulated draft rules to govern the lower 42 GHz spectrum (between 42 and 42.5 GHz). This is within the range of spectrum referred to as millimeter-wave spectrum.

This is one of the more unusual FCC spectrum deliberations because this spectrum is totally empty – there is nobody currently authorized by the FCC to use the spectrum band. The FCC is starting this deliberation with a clean slate.

FCC Chairperson Jessica Rosenworcel says that this gives the FCC an opportunity to come up with a spectrum-sharing model that will be easy for wireless carriers to use while maximizing the benefits for the public.

The early draft rules ask the industry to comment on three different approaches to the use of the spectrum.

  • A nationwide non-exclusive licensing approach, in which licensees would apply for a license and then register and coordinate specific deployment sites with a third-party database administrator.
  • A site-based licensing approach, in which licensees would directly apply to the FCC for each deployment site.
  • A technology-based sensing approach, in which operators would employ certain technologies to avoid harmful interference from other users of the spectrum – all coordinated locally without the use of a database administrator.

Since this spectrum is being made available for the first time, the FCC is also asking about the general rules that ought to apply to the spectrum, such as buildout requirements for carriers getting a license, technical rules like power levels, and any synergies with the spectrum-sharing approaches already being considered for the lower 37 GHz spectrum. The FCC specifically asks how this spectrum can be used in a way that doesn’t interfere with radio astronomy sites that use spectrum from 42.5 GHz to 43.5 GHz.

The FCC is proposing to subdivide this 500 MHz of spectrum into five 100-MHz channels.

Millimeter-wave spectrum is interesting. Folks might recall that Verizon and T-Mobile used similar spectrum in the early days of the 5G marketing craze to deliver gigabit speeds to specially equipped cell phones. There were TV and print ads everywhere showing the gigabit+ speeds that were being delivered. Of course, those ads didn’t mention that this was not a serious attempt at a new technology. The public trials were done in the centers of large cities and needed small cell sites every 500 feet or so. Millimeter-wave spectrum has no ability to penetrate anything, and it was quickly discovered by the users of the technology that the signal could be blocked if the user positioned their body between the cell site and the phone. The penetration of the spectrum was so poor that it had trouble making it through panes of glass.

But the new proposal is for big channels that can carry an immense amount of bandwidth. It’s not hard to imagine multi-gigabit point-to-point connections between nearby buildings – bring fiber to one building and use this spectrum to reach buildings in the vicinity. One of the most intriguing uses of millimeter-wave spectrum is indoors in factories or offices, using ceiling-mounted transmitters that can beam immense bandwidth within a confined space.
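A rough Shannon-capacity calculation shows why big channels matter. This is a theoretical ceiling, not a claim about real equipment, and the 20 dB signal-to-noise ratio is an assumed figure for illustration; real links will vary with distance, antennas, and weather.

```python
# Theoretical (Shannon) capacity of a channel: C = B * log2(1 + SNR).
# Illustrates why 100 MHz channels - and the full 500 MHz band - can
# support multi-gigabit point-to-point links. SNR figures are assumptions.
import math

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    """Return the theoretical channel capacity in Gbps."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear ratio
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e9

assumed_snr_db = 20  # assumed link quality for illustration
for bw_mhz in (100, 500):  # one channel vs. the full proposed band
    capacity = shannon_capacity_gbps(bw_mhz * 1e6, assumed_snr_db)
    print(f"{bw_mhz} MHz at {assumed_snr_db} dB SNR -> ~{capacity:.2f} Gbps ceiling")
```

Even at a modest assumed SNR, a single 100 MHz channel tops out well under a gigabit, but bonding the full 500 MHz band pushes the ceiling past 3 Gbps – which is why the channelization plan matters to anyone eyeing building-to-building links.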

It will be interesting to see where this investigation leads. CTIA, the trade association for the largest cellular carriers, generally favors exclusive licenses, which is the opposite of what is being proposed by the FCC. As I wrote in a recent blog, the docket ought to attract the full range of wireless constituencies, each with a different idea about how to make this work.

Another Red Flag – the BEAD Labor Requirements

The BEAD grant rules established by the NTIA are going to be a difficult hurdle for many ISPs to cross. I think most ISPs reading the NTIA’s Notice of Funding Opportunity (NOFO) will find things on the list of requirements that will be hard to meet. If you are thinking of applying for BEAD, you should read these rules carefully after reading this blog. The rules start on page 56 of the NOFO.

Without trying to sound too critical, the labor requirements sound like something written by bureaucrats designing a hypothetically perfect labor system instead of by folks who have ever built a broadband network and dealt with broadband contractors. Let’s run through some of the requirements to make this point:

Those seeking grants must demonstrate that they intend to comply with federal labor and employment laws. I think every grant I’ve ever worked with has this same requirement, which is usually satisfied by having an officer of the applicant attest that they will follow the law. However, the NOFO goes much further than that. A grant office must obtain an applicant’s record of compliance with federal laws, as well as the records of any entities that will participate in the project, including contractors and subcontractors.

This will require an ISP to specifically identify contractors and subcontractors before filing for a grant. All of these entities must prove their past compliance with federal labor laws. This is not how the industry functions. The entire industry works on a system of primary construction contractors and a host of smaller subcontractor crews. Big ISPs like Charter and Frontier can easily identify their primary contractor because they will have them under contract to handle whatever future work comes along. Smaller ISPs typically find a primary contractor after they know they have a project – like after they win a grant.

I wrote a recent blog that talked about the problems that small ISPs are having in getting projects constructed. I gave an example of a financially stable ISP that couldn’t find a contractor in today’s market to build a few small projects funded by ARPA grants. This difficulty came after the ISP already had the projects and funding in hand. I can’t imagine rural contractors that will be willing to sign on to a grant project at the application stage when they don’t even know if the ISP will win the grant. This requirement shows a total lack of understanding of how small construction contractors function. Their number one goal is to always keep crews working. They choose projects based on the timing of the work, the level of payment, and the location.

It’s inconceivable to me that the typical contractor will agree to sign onto a grant project even before the grant application is filed – that forces contractors to pick ISPs they think will win grants. This NTIA rule seems to want to make sure that all work is done by quality contractors by making applicants and contractors pair off even before a grant is won. I can think of a dozen ways this can backfire on a contractor that agrees to work on a given grant project when it can’t possibly know if and when that grant will be awarded and when construction will start.

This requirement also shows a lack of understanding about the makeup of the construction companies that build broadband infrastructure. Underneath the prime contractors are normally a host of smaller subcontractors – even for projects built by the giant ISPs. Subcontractors are often single crews who hire on to projects. These small crews come and go. I’ve never heard of any sizable broadband project that could identify the small subcontractors that would eventually work on the project. Crews regularly leave and get replaced as needed during most projects. There is no way that these small 6-technician crews will sign on to theoretical grant projects two years before the start of construction. Only in a fantasy world can a contractor promise the make-up of the subcontractor workforce over the life of a multi-year construction project.

The NOFO offers a way around this requirement, which it knows is hard: have ISPs directly hire the labor force. I laughed out loud at that idea in an environment where ISPs are having trouble keeping existing staff or hiring new staff. Trying to build a grant project with employees might be the riskiest strategy of all. Most ISPs I know have an ethical problem hiring crews that will be let go in three or four years at the end of grant construction – and it’s hard to envision that an ISP can attract technicians who understand that the work will be temporary.

Building networks with employees will also require buying expensive construction equipment that would have no use past the term of the grant. This idea is impractical since there is still a multi-year backlog in the supply chain for specialized fiber construction equipment. Plus, do we really want to require that an ISP must buy a million dollars of boring equipment just to win a grant? Can the NTIA please invent more ways to make it even more expensive to take the BEAD funding?

The NOFO also has a strong preference for using unionized contractors and getting a labor agreement specific to the grant project. The NOFO even suggests labor peace accords where workers agree to not strike or disrupt work during the life of the grant. It seems like a big stretch to get unions to make such agreements for theoretical grant projects that may not be built for many years into the future.

The NOFO also places a huge emphasis on having an “appropriately skilled and credentialed workforce (including by the subgrantee and each of its contractors and subcontractors)”. This means using a workforce where all members of the project workforce will have appropriate credentials, e.g., appropriate and relevant pre-existing occupational training, certification, and licensure.

For projects that don’t use union labor, the NTIA wants to see that every worker, including contractors and subcontractors, meets safety training, certification, and/or licensure requirements (e.g., OSHA 10, OSHA 30, confined space, traffic control, or other training as relevant to the title and work). Applicants must also describe whether there is a robust in-house training program with established requirements tied to certifications and titles, along with information on the professional certifications and/or in-house training in place to ensure that deployment is done to a high standard.

Whoever wrote the NOFO has no understanding of the construction crews who build networks. There have been only a handful of certification programs around the industry for decades, and only a small percentage of the technicians who build networks have any formal certification. I think every ISP will agree with me that they want a crew made up of construction veterans with a decade or two of experience rather than a crew of technicians with newly minted certifications.

It’s hard to know if this is intentional, but like many of the BEAD requirements suggested by NTIA, these labor requirements greatly favor large ISPs over small ones. I think most smaller ISPs will be unable to identify contractors and subcontractors ahead of time, convince contractors to provide their history of adherence to federal law, ensure every worker is certified, and jump through a mountain of paperwork. If I were a contractor, I wouldn’t touch a BEAD grant project with a 10-foot pole – there is plenty of other work available.

I hope that State Broadband Offices push back hard on these requirements to make them realistic. That won’t be easy because some of these rules seem mandatory – but not all. I strongly urge State Broadband Offices to sit and talk with local ISPs and construction contractors about the hurdles created by these rules – because these requirements will stop quality ISPs from pursuing the BEAD grants.