My Thoughts on the BEAD Grants

I’ve had some time to think about the $42.5 billion BEAD grants that will infuse a huge amount of money into building broadband networks. I summarized the most important rules in an earlier blog, and today I follow up with some observations and predictions about how these grants will probably work.

Not the Same Everywhere. These grants will be awarded through the states. The NTIA will set the overall guidelines, but it’s inevitable that states will have a huge say in who wins the grants. If a state is determined to give these grants to giant ISPs, that state will be able to maneuver within the rules to do so – as will states that don’t want to fund big ISPs. States will definitely put their own stamp on who gets the funding.

Mostly for Fiber. WISPA and other trade associations lobbied hard to set the speed requirement for new grant-funded technology at 100/20 Mbps. This makes fixed wireless and cable company HFC networks eligible for grant funding. This might have been a hollow victory, though, because I believe that most states are going to give a huge preference to building fiber and will be hesitant to award funding to any other technology. Undoubtedly, some states will fund other technologies, but my prediction is that most states will give most of the money to fiber projects.

Defining Served / Unserved Areas Will be a Mess. The grants attempt to improve broadband in areas with existing speeds under 25/3 Mbps. This insistence on sticking with measured speeds will create a huge mess. Communities know that rural speeds are slower than this, but if the broadband maps remain wrong, they will have to somehow prove it. It would have been so much simpler to make the grants eligible to overbuild DSL without requiring any speed test. I’m sure these requirements came from lobbying by the big telcos, and we also don’t seem able to break away from the dreadful FCC broadband map databases.

A smart state might base grant awards upon state-generated broadband maps, but even that is going to be controversial since incumbent telcos will have a chance to challenge any grant request. Huge parts of the country have been wrongfully locked out of federal grants in the past due to the FCC database, and this is the one big chance to put that behind us. Unfortunately, there will still be communities that get left behind by these grants.

Many States are Not Ready for This Funding. A lot of the states only recently started to form state broadband offices, and the size of these grants and the sheer volume of paperwork will overwhelm the people who award grants. There is also a disturbing trend right now of the existing employees of broadband offices bailing to take jobs in the industry. Handling these grants properly is going to require grant reviewers with a lot of expertise to wade through the many grant requests. In this over-busy industry, I don’t know where states will find the experienced people needed to do this right.

Overlapping Grant Requests. The dollar amount of the grant pool is so huge that the states are going to get multiple grant requests that ask to serve the same areas. I’m predicting states will face an almost unsolvable puzzle trying to figure out who to fund in these situations. Just to give an example, I live in North Carolina, and I won’t be surprised if Charter files a grant request to serve most of the state. In doing so, Charter will conflict with most other grant requests – many of which will also overlap with each other.

Big ISPs Want to Be Major Players. Many big ISPs have been recently signaling that they will be seeking huge funding from these grants. AT&T alone said it hopes to use these grants to pass five million new homes. Big ISPs have some major advantages in the grant process. They will have no problem guaranteeing matching funds. They will likely ask for grants that cover large areas, which is going to be tempting for grant offices trying to award the funds. The push by big ISPs creates a dilemma for states since citizens clearly prefer local ISPs run by local people over the corporate indifference of giant ISPs.

Mediacom and West Des Moines

In 2020, the City of West Des Moines, Iowa announced it was building a conduit network to bring fiber past all 36,000 residences and businesses in the city. It was a unique business model that can best be described as open-access conduit. What is unique about this arrangement is that conduit will be built along streets and into yards and parking lots to reach every home and business. The City is spending the money up front to cross the last hundred feet.

The City’s announcement also said that the conduit network is open access and is available to all ISPs. Google Fiber was announced as the first ISP tenant and agreed to serve everybody in the city. This means that Google Fiber will have to pay to pull fiber through the conduit system to reach customers.

Mediacom, the incumbent cable company in the city, sued West Des Moines, arguing that the City had issued municipal bonds for the benefit of Google Fiber. The suit also alleges that the City secretly negotiated a deal with Google Fiber to the detriment of other ISPs, and claims Google Fiber had an advantage since one of the City Commissioners was also the primary Google Fiber lobbyist in the state.

As is usual with such suits, outsiders have no idea of the facts, and I’m not taking sides with either party. A recent article said the two sides are nearing a settlement, and if so, we might never learn the facts. I find the lawsuit interesting because it raises several thorny issues.

A lot of cities are considering open-access networks. Politicians and the public like the idea of having a choice between multiple ISPs. But this suit raises an interesting dilemma that cities face. If a city launches an open-access network with only one ISP, like in this case, that ISP gets a huge marketing advantage over any later ISPs. On an open-access network, no ISP has a technological advantage – every ISP that might come to West Des Moines will be providing fiber broadband.

If Google Fiber is first to market, it has an opportunity to sign everybody in the city who prefers fiber broadband over cable broadband. In the case of West Des Moines, each future ISP would also have to pay to pull fiber through the network, and a second ISP might have a hard time justifying this investment if Google Fiber already has a large market share.

From my understanding of the West Des Moines business model, the City needs additional ISPs to recover the cost of building the network – the City clearly intends to bring the benefits of open-access to its citizens. It’s hard to believe the City would intentionally give an unfair advantage to Google Fiber. But did it inadvertently do so by giving Google Fiber the chance to gain a lock-down market share by being first?

Another interesting question this suit raises is whether Mediacom considered moving onto the fiber network. When somebody overbuilds a market with fiber, the cable company must be prepared to compete against a fiber ISP. But in West Des Moines and a few other open-access networks like Springfield, Missouri, the cable company has a unique option – it could also jump onto the fiber network.

It would be interesting to know if Mediacom ever considered moving to fiber. The company already has most of the customers in the market, and one would think it could maintain a decent market share if it went toe-to-toe with Google Fiber or another ISP by also competing with fiber. It would be a huge decision for a cable company to make this leap because it would be an admission that fiber is better than coaxial networks – and this switch probably wouldn’t play well in other Mediacom markets. I also think that cable companies share a characteristic with the big telcos – it’s probably challenging for a cable company to swap to a different technology in only a single market. Every back-office and operational system of the cable company is geared towards coaxial networks, and it might be too hard for a cable company to make this kind of transition. I’m always reminded that when Verizon decided to launch its FiOS business on fiber, the company decided that the only way to do this was to start a whole new division that didn’t share resources with the copper business.

Finally, this suit makes me wonder what motivates ISPs to join an open-access network in today’s market. I understand why small ISPs might do this – they get access to many customers without making a huge capital investment. But there is a flip side: there can be a huge financial penalty for an ISP that pursues open access rather than building a network. In the last few years, we’ve seen a huge leap in the valuation multiple applied to facility-based fiber ISPs. When it comes time for an ISP to sell a market, or even to leverage an existing market to borrow money, a customer on a fiber network owned by the ISP might easily be worth ten times more than that same customer on a network owned by somebody else.

That is such a stark difference in value that it makes me wonder why any big ISP would join an open-access network. Open-access is an interesting financial model for an ISP because it can start generating positive cashflow with only a few customers. But is the lure of easy cash flow a good enough enticement for an ISP to forego the future terminal value created by owning the network? This obviously works for some ISPs like Google Fiber, which seems to only want to operate on networks owned by others. But consider a small rural telco that might be located outside of West Des Moines. The telco could generate a much higher value by building to a few thousand customers in a market outside West Des Moines than by adding a few thousand customers on the open-access network.

The giant difference in terminal value might explain why open-access networks have such a hard time luring ISPs. It probably also answers the question of why a cable company like Mediacom is not jumping to join somebody else’s network. It’s an interesting financial debate that I’m sure many ISPs have had – is it better to go for the quick and easy cash flow from open access, or to take more risk and hope for the much bigger valuation from building and owning the network and the customers?

The Fight Over 12 GHz Spectrum

For an agency that has tried to wash its hands of regulating broadband, the FCC finds itself again trying to decide an issue that is all about broadband. There is a heavyweight battle going on at the FCC over how to use 12 GHz spectrum, and while this may seem like a spectrum issue, it’s all about broadband.

12 GHz spectrum is key to several broadband technologies. First, this is the spectrum that is best suited for transmitting data between the earth and satellite constellations. The only way Starlink is going to be able to grow to serve millions of remote customers in the U.S. is by having enough backhaul to fuel the huge amounts of data that will be passed to serve that many customers. Lack of backhaul bandwidth will significantly limit the total number of customers that can be served and is an obvious major concern of the satellite companies.

It turns out that 12 GHz is also the best spectrum for transmitting large amounts of data with 5G. The carriers have been dabbling with the higher millimeter-wave spectrum, but it’s turning out that there are squirrelly aspects of millimeter-wave spectrum that make it less than ideal in real-world wireless deployments. The 12 GHz spectrum might be the best hope for carriers to be able to deliver gigabit+ wireless drops to homes. Verizon has been deploying fiber-to-the-curb technology using mid-range spectrum and seeing speeds in the range of 300 Mbps. Using the 12 GHz spectrum could provide a reliable path to multi-gigabit wireless drops.

The big question facing the FCC is if 12 GHz can somehow be used to satisfy both needs, pitting the 5G carriers against the satellite carriers. As an aside, before talking more about the issue, I must observe that the satellite companies bring a new tone into FCC proceedings. Their FCC filings do everything except call the other side a bunch of dirty scoundrels. Probably only those who read a lot of FCC documents would notice this, but it’s something new and refreshing.

The current argument before the FCC comes from filings between Starlink and RS Access, which is associated with Michael Dell, who owns a lot of the spectrum in question. But this is part of a larger ongoing battle, and there have been skirmishes that also involved Dish Network, the largest owner of this spectrum.

The FCC will have to somehow untie the Gordian knot on a tough issue. As is to be expected with any use of spectrum, interference is always a major concern. The usefulness of any band of spectrum can be negated by interference, so carriers only want to deploy wireless technologies that have minimal and controllable interference issues. Both sides in the 12 GHz fight have trotted out wireless engineers who support their positions. RS Access says that spectrum can be shared between satellite and terrestrial usage, supporting the idea of not giving more spectrum solely to Starlink. Starlink says the RS Access engineers are lying and wants dedicated spectrum for satellite backhaul. I don’t know how the FCC can sort this out because the only way to really know if spectrum can be shared is to try it.

What I find most unusual about the fight is that the FCC is being dragged into a broadband issue. The last FCC Chairman, Ajit Pai, did his best to wash broadband out of the vocabulary at the FCC. But in today’s world, almost everything the FCC does, other than perhaps chasing robocallers, is ultimately about broadband. While this current 12 GHz fight might look like a spectrum battle to an outsider, it’s all about broadband.

Broadband Labels

There is one quiet provision of the Infrastructure Investment and Jobs Act that slipped under the radar. Congress is requiring that the FCC revamp broadband labels that describe the broadband product to customers, similar to the labels for food.

The Act gives the FCC one year to create regulations to require the display of a broadband label similar to the ones created by the FCC in Docket DA 16-357 in 2016. A copy of the FCC’s suggested broadband label from 2016 is at the bottom of this blog. The original FCC docket included a similar label for cellular carriers.

ISPs are going to hate this. It requires full disclosure of prices, including any special or gimmick pricing that will expire. ISPs will have to disclose data caps and also any hidden charges.

As you can see by the label below, it includes other information that big ISPs are not going to want to put into writing, such as the typical download and upload speeds for a broadband product as well as the expected latency and jitter.

To show how much the big ISPs don’t want to disclose this information, I invite you to search the web for the broadband products and prices of the biggest ISPs. What you will mostly find is advertising for special promotions and very little on actual prices and speeds. Even when this information is disclosed, it’s in small print buried somewhere deep in an ISP’s website. And nobody talks about latency and jitter.

What is even harder for ISPs is that they often don’t know the speeds. How does a telco describe DSL speeds when the speed varies by the distance from the hub and the condition of the copper wire on each street? I’ve seen side-by-side houses with different DSL speeds. Cable companies can have a similar dilemma since there seem to be neighborhoods in every city where the network underperforms – most likely due to degradation or damage to the network over time.

The sample label asks for the typical speed. Are ISPs going to take the deceptive path and list marketing speeds, even if they can’t be achieved? If an ISP tells the truth on the labels, shouldn’t it be required to submit the same answers to the FCC on the Form 477 data-gathering process?

I’m sure that big ISPs are already scrambling trying to find some way out of this new requirement, but that’s going to be hard to do since the directive comes from Congress. It’s going to get interesting a year from now, and I can’t wait to see the labels published by the biggest ISPs.

Big Internet Outages

Last year I wrote about big disruptive outages on the T-Mobile and the CenturyLink networks. Those outages demonstrate how a single circuit failure on a transport route or a single software error in a data center can spread quickly and cause big outages. I join a lot of the industry in blaming the spread of these outages on the concentration and centralization of networks where the nationwide routing of big networks is now controlled by only a handful of technicians in a few locations.

In early October, we saw the granddaddy of all network outages when Facebook, WhatsApp, and Instagram all crashed for much of a day. This was a colossal crash because the Facebook apps have billions of users worldwide. It’s easy to think of Facebook as just a social media company, but the suite of apps is far more than that. Much of the third world uses WhatsApp instead of text messaging to communicate. Small businesses all over the world communicate with customers through Facebook and WhatsApp. The Facebook crash also affected many other apps. Anybody who automatically logs into other apps using Facebook login credentials was also locked out since Facebook couldn’t verify their credentials.

Facebook blamed the outage on what it called routine software maintenance. I had to laugh the second I saw that announcement and the word ‘routine’. Facebook would have been well advised to have hired a few grizzled telecom technicians when it set up its data centers. We learned in the telecom industry many decades ago that there is no such thing as a routine software upgrade.

The telecom industry has long been at the mercy of vendors that rush hardware and software into the real world without fully testing it. An ISP expects glitches when it is taking part in a technology beta test. But during the heyday of the telecom industry throughout the 80s and 90s, practically every system small telcos operated was in beta-test mode. Technology was changing quickly, and vendors rushed new and improved features onto the market without first testing them in real-life networks. The telcos and their end-user customers were the guinea pigs for vendor testing.

I feel bad for the Facebook technician who introduced the software problem that crashed the network. But I can’t blame him for making a mistake – I blame Facebook for not having basic protocols in place that would have made it impossible for the technician to crash the network.

I bet that Facebook has world-class physical security in its data centers. I’m sure the company has redundant fiber transport, layers of physical security to keep out intruders, and fire suppression systems to limit the damage if something goes wrong. But Facebook didn’t learn the basic Telecom 101 lesson that any general manager of a small telco or cable company could have told them. The biggest danger to your network is not from physical damage – that happens only rarely. The biggest danger is from software upgrades.

We learned in the telecom industry to never trust vendor software upgrades. Instead, we implemented protocols where we created a test lab to test each software upgrade on a tiny piece of the network before inflicting a faulty upgrade on the whole customer base. (The even better lesson most of us learned was to let the telcos with the smartest technicians in the state tackle the upgrade first before the rest of us considered it).

Shame on Facebook for having a network where a technician can implement a software change directly without first testing it and verifying it a dozen times. It was inevitable that a network without a prudent upgrade and testing process would eventually suffer the big crash we saw. It’s not too late for Facebook – there are still a few telco old-timers around who could teach the company to do this right.

The Fixation on Speed

I was recently asked an interesting question. Is a 100/20 Mbps broadband connection really better than a 50/50 Mbps one? The question was referring to the new ReConnect grant rules that say that companies can seek grants to overbuild existing networks that are not performing at 100/20 Mbps. I have several different reactions to that question.

My first reaction is to ask an additional question. Do you think the RUS would provide a grant to overbuild a 50/50 Mbps WISP with 100/20 Mbps technology? In this example, the WISP doesn’t meet the RUS download speed threshold but is faster than the upload threshold. The answer is not clear to me. The definition of speed used for the grant has a download and upload component – what if an existing product only fails one of the two tests? I’ve never been a fan of using speeds as the definition of what is eligible for grants, and this kind of dilemma is one of many reasons why picking an arbitrary speed is likely to create controversies.
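The ambiguity is easy to see if you write out the two possible readings of the threshold. This is just a sketch of the logic – the function names and the interpretation are mine, not anything in the grant rules, which is exactly the problem:

```python
# Two hypothetical readings of a 100/20 Mbps "served" threshold.
# The grant rules don't say which one applies to a product that
# fails only one of the two tests.

DOWN_THRESHOLD = 100  # Mbps
UP_THRESHOLD = 20     # Mbps

def meets_both(down, up):
    """Strict reading: a product counts as served only if it passes both tests."""
    return down >= DOWN_THRESHOLD and up >= UP_THRESHOLD

def meets_either(down, up):
    """Loose reading: passing either test is enough to count as served."""
    return down >= DOWN_THRESHOLD or up >= UP_THRESHOLD

# The 50/50 Mbps WISP from the example above:
print(meets_both(50, 50))    # False -> area is unserved, eligible to overbuild
print(meets_either(50, 50))  # True  -> area is served, not eligible
```

The same 50/50 Mbps product is grant-eligible under one reading and ineligible under the other, which is why an arbitrary two-part speed definition invites disputes.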

I next asked myself how a household would feel about a 50/50 Mbps versus a 100/20 Mbps broadband product. I know some households that struggled with upload speeds during the pandemic even though they had 15–20 Mbps upload speed provided by a cable company. A home that struggled with a cable company upload connection likely had multiple students and adults trying to upload at the same time. But the reason such a household struggled is more complex than just the speed. The upload path in cable company networks uses the worst spectrum inside the cable transmission path. The upload path often has a lot of noise and jitter – so even though a connection might show 20 Mbps on a speed test, the quality of the transmission can be highly compromised. If this theoretical home were fully informed, it would choose a product with a faster upload link and should favor the 50/50 Mbps product. But there are plenty of homes with gamers and other heavy bandwidth users who care more about the download speed.

We also can’t forget about latency. A 50/50 Mbps connection on fiber might ‘feel’ faster to a customer than a 100/20 Mbps on a fixed wireless network with higher latency. The human eye is amazing at perceiving a slight difference in latency – fiber connections have an immediacy for the eye that is perceptible. I’ve always thought it would be interesting to set up a lab test where you could give people feeds with different speeds and latencies to measure the perceived differences from a customer perspective. I’ve always guessed that people care about latency a lot more than we think they do – but few customers understand this. I know that when customers first get fiber, they believe that it’s faster, even if they converted from a 100 Mbps download cable connection to a 100 Mbps fiber connection.
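A back-of-the-envelope model shows why latency can matter more than headline speed for everyday browsing. The numbers here are my own illustrative assumptions (a simplistic four-round-trip setup cost and made-up latencies), not measurements – but they capture the idea that fetching a page involves several round trips before any payload moves, so latency dominates small transfers:

```python
# Toy model (illustrative assumptions, not measured data): page fetch time
# is a few protocol round trips plus the raw transfer time. On small pages,
# the round trips dominate, so a low-latency link "feels" faster.

def fetch_seconds(size_mb, down_mbps, rtt_ms, round_trips=4):
    """Rough page-fetch time: handshake round trips plus transfer time."""
    handshake = round_trips * rtt_ms / 1000.0  # seconds spent on round trips
    transfer = size_mb * 8 / down_mbps         # seconds moving the payload
    return handshake + transfer

# A 1 MB page: 50 Mbps fiber at 5 ms RTT vs 100 Mbps wireless at 40 ms RTT
fiber = fetch_seconds(1, 50, 5)       # 0.02 + 0.16 = 0.18 seconds
wireless = fetch_seconds(1, 100, 40)  # 0.16 + 0.08 = 0.24 seconds
print(fiber < wireless)               # True
```

In this sketch the slower-but-lower-latency fiber connection delivers the page sooner than the faster wireless one – which is consistent with customers swearing fiber is faster even at identical speed-test numbers.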

But back to the grant question – I don’t want to see federal grants used to build 100/20 Mbps broadband. If we are going to spend once-in-a-lifetime federal grant money we should be building networks that will be adequate a decade from now. If we don’t build for the future, then in ten years, we start the cycle all over again of talking about how to improve rural networks. There are plenty of arguments to be made why 100/20 Mbps is a good broadband connection today. But I defy anybody to say that it will be adequate a decade from now. A decade is a long time in the broadband world. A decade ago, the cable companies offered 30/3 Mbps speeds, and most people were happy with it. A decade from now a 100/20 Mbps connection will feel as inadequate as a 30/3 Mbps connection today.

Choosing between 100/20 Mbps and 50/50 Mbps today is an interesting thought exercise. There are some homes that should prefer each choice, so what matters most are the needs of the household buying the broadband. I also think the answer is about more than just speed – we must consider speed, jitter, and latency to fully compare two broadband options. But perhaps the most important consideration is that a household that buys a 100/20 Mbps connection today will likely be unhappy with that same product in a decade. It doesn’t take a crystal ball to understand this.

Being an ISP

Over time, this blog has talked about everything broadband, but I don’t think I’ve ever talked about being an ISP. In the simplest terms, an ISP is somebody that connects to a home or business and routes broadband traffic to and from the Internet. ISPs do a lot more these days. For example, they protect customers against hacking and bad behavior on the web.

We all know the big ISPs like Comcast, Charter, AT&T, and Verizon since these four ISPs serve over 75% of all broadband customers in the country. All of the other ISPs you hear about collectively serve the other one-fourth of the U.S. market.

The heyday of the ISP industry, in terms of the total number of ISPs, was probably in the late 1990s when anybody could be a dial-up ISP by buying a modem bank, some telephone lines, and a connection to the Internet. It seems like every small town and even many neighborhoods had one or more ISPs who competed with the few big nationwide players like AOL and CompuServe.

ISPs come in every shape and size. The ones we most think about as ISPs own networks that reach people’s homes, either through wires or wirelessly. Satellite companies like Viasat and Starlink are ISPs. But there are other kinds of ISPs. For example, some ISPs lease fiber connections from a city or somebody else that owns a network. There are still some ISPs delivering broadband over leased telco copper wires. A lot of people don’t think of cellular carriers as ISPs, but most people today have smartphones and connect to the Internet using apps. In many of the surveys we conduct, we see that as many as 10% of households only connect to the Internet over a cellular connection.

ISPs are somewhat regulated, but it gets complicated. The FCC under Ajit Pai largely deregulated broadband by wiping out the FCC’s Title II authority to regulate ISPs except for a handful of regulations specifically required by Congress. In doing so, Chairman Pai constantly referred to his deregulation as light-touch regulation, but the FCC eliminated 90% of the ways that the agency might theoretically be able to regulate ISPs. Consequently, the current FCC has very little regulatory authority over ISPs.

This doesn’t mean that ISPs are fully unregulated. ISPs are supposed to comply with a few regulations. For example, they are supposed to register with the FBI and describe the steps needed if the FBI wants to surveil a customer on an ISP network. An ISP has to officially register with the FCC if it wants to participate in receiving any funding from the Universal Service Fund. Many states expect ISPs to register as carriers – mostly, so the state knows who they are.

The FCC requires ISPs to use the Form 477 process to report the location of customers by Census block, along with a description of the technology being used and the speeds delivered. But broadband regulation is taken so lightly that a lot of ISPs ignore this completely. For example, in almost every county I’ve ever worked in, there is at least one ISP that doesn’t report customers to the FCC. There doesn’t seem to be any penalty for not reporting – at least none that I’ve ever seen. Some of the ISPs that skirt regulation are sizable and sell fiber connections to large businesses in multiple markets.

ISPs are also theoretically regulated by the Federal Trade Commission. But that is truly light regulation because the FTC can’t easily establish rules or policies that affect all ISPs. Instead, the agency occasionally punishes a specific ISP for bad behavior, mostly centered on mistreating customers in some manner.

There are a lot of entities that don’t even realize they are ISPs. Governments often build fiber networks to connect various government buildings into a local network. But when cities then connect all of the government locations to the Internet, they have become an ISP. Cities also often branch out and provide a fiber connection to a few large businesses in a community – often without realizing this makes them an ISP like any other.

The big ISP industry believes that broadband regulation will be coming back when the FCC finally gets a fifth Commissioner. Companies with monopoly powers in all industries would love to be unregulated, and so far, the only two groups of companies that have largely been able to pull this off are ISPs and the giant web content companies. The need for some regulatory oversight is obvious. For example, the FCC is currently investigating the response efforts of big ISPs after a major storm. But without explicit regulatory authority, I’m not sure the agency has any authority to compel ISPs to do more to be ready for disaster recovery.

Being Serious about Local Resiliency

The FCC recently voted unanimously to adopt a Notice of Proposed Rulemaking to investigate the disaster resiliency plans of major telecom providers. The FCC noted that it has been taking too long after recent natural disasters to restore cellular and broadband services and wants to know if there are any steps the agency can take to improve recovery times after network outages.

As part of the investigation, the FCC wants to hear more about carriers’ existing recovery plans, and hopefully, the FCC will compare the actual behavior after recent disasters with the formal recovery plans. The FCC also wants to examine policies such as requiring more backup power.

One of the things I’ve noticed through the years in the industry is that small ISPs plan for disasters better than large ones. When I visit a small telco that sits in a known flood plain, I expect to find huts built on stilts to survive floods. I expect to find huts and buildings in hurricane zones built to withstand hurricane winds. I expect to find a robust system of backup generators, including many portable ones that can be deployed quickly when needed. If I go down the road a few miles to an area served by a big telco, I don’t expect to find any of these things.

Disaster recovery preparedness is not cheap. It costs more to build huts on stilts or to build a fortress for a hut to withstand storms – but in the long run, the prudent solution is also the lowest cost one. It costs far more to try to restore service after a big disaster. Big carriers will always say that resiliency is important, but you’ll have to dig hard to find examples of where they spent the extra money up-front to build things right.

Small companies are also more prepared in other ways. For example, small telcos and electric coops have both formed pacts for aiding each other after disasters. When a bad storm hits a small company, there is generally a swarm of technicians from around the country who converge to begin making repairs within hours after the end of the storm. Big companies bring in outside help as well, but not with the immediacy and vigor that I’ve witnessed many times with smaller companies.

Disaster recovery also involves a lot of subtler ways to be ready for trouble. Something as simple as having sufficient spare circuit cards, a yard full of spare fiber and poles, or plenty of fuel for vehicles can make a huge difference in restoring service quickly.

Resiliency means a lot more than disaster recovery. Resiliency means planning ahead so that disasters don’t knock out service. Being prepared might mean a lot of little things like placing metal poles at key network intersections. It means designing fiber rings with automatic rollover that don’t lose service from a single fiber cut. It means burying fiber in places where damage can be expected, even if it costs more. Resiliency means adhering to a tree-trimming program.

If the FCC investigates the issue in the normal way, it will hear from a string of witnesses from big carriers that all swear their companies take resiliency seriously and who vow that it is a priority. Each big carrier will have a four-volume disaster recovery manual that looks impressive. But what the big carriers don’t have is a track record of recovering from disasters quickly – which is why the FCC created this docket.

I have a suggestion for the FCC if they want to do this right. Bring in frontline workers from smaller telcos, cellular companies, and electric cooperatives and have them examine the disaster and resiliency plans from the big carriers. Let these frontline workers visit the huts and pole yards of the big carriers. I’m sure they would find dozens of holes in the big company plans.

Unfortunately, I don’t think the FCC can force big companies to do the right thing. The FCC might be able to mandate a few things, like more backup generators. But the FCC can’t write rules detailed enough to ensure that the big carriers get all of the subtle little things right. That only comes from companies that care about their customers as much as they care about the bottom line.

Broadband Adoption Grants

The recently enacted Infrastructure Investment and Jobs Act (IIJA) created two new grant programs to address digital equity and inclusion. This section of the IIJA recognizes that providing broadband access alone will not close the digital divide. There are millions of homes that lack computers and the digital skills needed to use broadband. The grant programs take two different approaches to try to close the digital divide.

The State Digital Equity Capacity Grant Program will give money to states, which will then distribute it through grants. The stated goal of this grant program is to promote the achievement of digital equity, support digital inclusion activities, and build capacity for efforts by states relating to the adoption of broadband. I haven’t heard an acronym for this grant program – it’s likely that each state will come up with a name for its program.

The Act allocates $1.5 billion to the States for this program – that’s $300 million per year from 2022 through 2026. Before getting any funding, each state must submit a plan to the NTIA on how it plans to use the funding. States will have to name the entity that will operate the program, and interestingly, it doesn’t have to be a branch of government. States could assign the role to a non-profit or other entity.

The amount of funding that will go to each state is formulaic. 50% will be awarded based upon the population of each state according to the 2020 Census. 25% will be awarded based upon the number of homes that have household incomes that are less than 150% of the poverty level, as defined by the U.S. Census. The final 25% will come from the comparative lack of broadband adoption as measured by the FCC 477 process, the American Community Survey conducted by the U.S. Census, and the NTIA Internet Use Survey.
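The weighted formula above can be sketched in a few lines of code. This is only an illustrative reading of the allocation rules – the function name is mine, and the share figures in the example are hypothetical, not actual state data.

```python
def state_capacity_allocation(total_funding: float,
                              population_share: float,
                              poverty_share: float,
                              adoption_gap_share: float) -> float:
    """Illustrative sketch of the State Digital Equity Capacity formula:
    50% weighted by population, 25% by households under 150% of the
    poverty level, and 25% by comparative lack of broadband adoption.
    Each *_share argument is the state's fraction of the national total
    for that measure."""
    return total_funding * (0.50 * population_share
                            + 0.25 * poverty_share
                            + 0.25 * adoption_gap_share)

# Hypothetical state: 4% of U.S. population, 5% of qualifying low-income
# households, 3% of the measured adoption gap, against one $300M year.
print(state_capacity_allocation(300_000_000, 0.04, 0.05, 0.03))  # → 12000000.0
```

The takeaway from the weighting is that population dominates, but a state with widespread poverty or low adoption rates will pull in noticeably more than its population alone would justify.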

The second new grant program is called the Digital Equity Competitive Grant Program. These are grants that will be administered by the NTIA and awarded directly to grant recipients. The budget for this grant program is $1.25 billion, with $250 million per year to be awarded in 2022 through 2026.

These grants can be awarded to a wide range of entities, including government entities, Indian Tribes, non-profit foundations and corporations, community anchor institutions, education agencies, entities that engage in workforce development, or a partnership between any of the above entities.

This will be a competitive grant program, with the rules to be developed by the NTIA. While the broadband infrastructure grants in the Act include a long list of prescribed rules, Congress is largely leaving it up to the NTIA to determine how to structure this grant program.

That’s going to make for some interesting choices for entities involved in digital inclusion. They can go after funding through the state or compete for nationwide grants. I doubt that anybody can make that decision until we see the specific grant rules coming out of each program.

I’ve been hearing about digital inclusion at every conference I’ve attended for the last fifteen years. For many years we talked about this as finding ways to solve the digital divide. We’ve known for all these years that there are homes that don’t have broadband because they can’t afford a broadband connection. We’ve known that homes can’t afford computers or other devices. And we’ve known for a long time that a lot of people don’t have the digital skills needed to use broadband.

There have been efforts over the years to address the issues, mostly done at the local level and mostly through non-profits. This is the first time that real funding is being aimed at solving these issues. It’s going to be interesting to see what comes out of this funding. I’m sure there will be some dazzlingly successful programs as well as some that will fizzle – but these grants will provide the grand experiment to find out what works the best. I like that these grants make new awards each year for five years – and I hope Congress pays attention because some of the best programs that get this funding will deserve to be funded when these grants are over.

We will only thrive as a nation when everybody comes along for the ride, and this is the first set of grants that takes a serious shot at bringing broadband to those who are not benefitting from the technology.

New Middle-Mile Grants

One of the new programs established by the Infrastructure Investment and Jobs Act (IIJA) is $1 billion in grants to build middle-mile fiber. The grant program will be administered directly by the NTIA. The grant program defines middle-mile as any broadband infrastructure that does not directly connect to an end-user location. Projects that might be considered for the grants include building fiber, leasing dark fiber, undersea cables, transport to data centers, carrier-neutral internet exchanges, and wireless microwave backhaul.

The amount of funding is disappointingly small. For comparison, the California legislature created a $3.25 billion middle-mile fund just within the state. Rural America is woefully underserved by middle-mile infrastructure. Rural communities can all attest to the pain of losing broadband, cellular coverage, and public safety systems anytime the backbone fiber into a rural area goes out of service.

The stated purpose of the grants is to reduce the cost of connecting unserved and underserved areas to the Internet and to promote resiliency – which is the phrase currently being used as a surrogate for redundancy. Resiliency means bringing a second transport route into an area so that a single fiber or microwave failure won’t strand a community without broadband.

The federal grants will provide up to 70% of the cost of constructing middle-mile connectivity. Priority will be given to projects that:

  • Leverage existing rights-of-way to minimize regulatory and permitting challenges.
  • Enable the connection of unserved anchor institutions.
  • Facilitate the creation of carrier-neutral interconnection facilities (places where multiple carriers can meet and exchange traffic).
  • Improve the redundancy of existing middle-mile infrastructure.

Grant applicants must demonstrate financial, technical, and operational capability to be eligible. Grant applicants must also satisfy at least two of the following conditions (and more than two would be better):

  • Has a fiscally sustainable middle-mile strategy.
  • Will offer non-discriminatory access to other carriers and entities.
  • Identifies specific last-mile networks that will benefit.
  • Has identified investments or support that will accelerate the construction and completion of the project.
  • Will benefit national security interests.

The NTIA is directed to prioritize grant applications that:

  • Connect middle-mile infrastructure to last-mile networks that plan to serve unserved areas.
  • Connect non-contiguous trust lands.
  • Offer wholesale broadband service to other carriers and entities.
  • Can complete the buildout in a timely manner.

The grant program likely won’t be taking applications until late 2022. While not gigantic compared to other parts of the infrastructure bill, this grant would still translate into 20,000 miles of middle-mile fiber at $50,000 per mile. These grants will most easily be awarded to financially solid recipients that have assembled a consortium of entities pledging to use the new middle-mile routes. I strongly suggest that regional groups start talking now so they are ready when these grants are announced. At only $1 billion, it seems likely that there will be only one grant cycle.
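The mileage estimate above is worth spelling out, because the 70% matching cap stretches the budget further than a straight division suggests. This is a back-of-the-envelope sketch; the $50,000-per-mile construction cost is my rough planning figure, not an official number.

```python
# Back-of-the-envelope estimate of how far the middle-mile budget goes.
budget = 1_000_000_000      # total middle-mile grant funding, in dollars
cost_per_mile = 50_000      # assumed fiber construction cost, $/mile
federal_share = 0.70        # grants cover up to 70% of project cost

# If grants covered 100% of costs, the budget builds:
miles_if_fully_funded = budget / cost_per_mile
print(f"{miles_if_fully_funded:,.0f} miles")

# Because grants cover at most 70%, matching funds could push the
# total build to roughly:
total_project_miles = budget / (federal_share * cost_per_mile)
print(f"{total_project_miles:,.0f} miles")
```

The point of the second number is that the 30% local match effectively leverages the federal dollars into more route miles than the headline budget implies.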