Broadband Deserts

Perhaps it’s because the death of Queen Elizabeth is everywhere in the news, but somebody sent me a 2008 BBC article in which then-Prince Charles warned that the lack of rural broadband in the UK would eventually result in broadband deserts.

The now King Charles III was quoted as saying that the lack of broadband puts too much pressure on the people who live without it, and that if a solution wasn’t found, rural areas would turn into ghost towns as people abandoned them. He was particularly worried about farmers, who, even in 2008, needed broadband to thrive. His fear was that the rural UK would become nothing more than a place for city residents to keep second homes.

He was right, of course, and we’re already starting to see problems crop up in rural areas in the U.S. that don’t have broadband. Counties with poor broadband are seeing people move away to get better jobs or to get broadband for their kids. Farmers without broadband are at a serious disadvantage compared to peers using the latest broadband-enabled technology. Real estate agents report that it’s extremely difficult to sell a home that has no broadband option. Several studies have shown that students who grow up without home broadband don’t perform nearly as well as students with broadband.

There are hundreds of rural counties working hard to get fiber broadband with the hope of stemming population loss. Many are hoping to attract people who work from home as the best way to reverse the decline and stimulate the local economy. They are banking on the notion that people will want to live in a beautiful and low-cost place while working from home.

There is a lot of hope that the various grant programs will solve a huge amount of the rural digital divide. Grant money is already flowing from ReConnect and ARPA, and the upcoming giant $42.5 billion BEAD grants will bring broadband to a lot of rural counties. I’m already working with some counties that feel certain that they will be getting wall-to-wall fiber.

But I’m also working with counties that are not so sure they will get better broadband. These counties are in parts of the country where there are no small telcos or electric cooperatives interested in serving them. They are in places where the cost of building broadband is going to push them into the high-cost category, where a 75% BEAD grant is not going to be enough to entice an ISP.

As I wrote in a recent blog, there is also no guarantee that the current grants are going to reach everywhere. I think that there is a legitimate concern that communities that don’t get funding from current grants might have a long wait to ever see gigantic broadband grants again.

The world has changed a lot since King Charles warned about broadband deserts. In 2008, the Internet was already important to some folks, but over time it has become vital to a majority of households. In the community surveys I’ve been conducting this year, I am seeing that at least 30% of homes include somebody who works remotely from home – and in some counties, the percentage is a lot higher. These same surveys routinely show that many homes don’t have the broadband needed to support homework for students. I routinely hear from rural businesses that are struggling due to the lack of broadband.

The UK also has a program to build rural fiber. Project Gigabit is currently using £5 billion to bring broadband to 567,000 remote homes. Most of these projects start construction in 2023 and are expected to be done by the end of 2024.

To some degree, promoting rural broadband is a demographic experiment on a large scale. Congress is betting that broadband infrastructure will revitalize many rural communities and give them the basis for a sustainable economy. I have no doubt that this isn’t going to happen everywhere because faster broadband by itself is not a cure-all for social woes. But communities that make a commitment to foster the best benefits of better broadband increase their chances of surviving and thriving.

The Data Divide

A report from the Center for Data Innovation warns of a new broadband gap it calls the data divide, which occurs when some parts of society do not share in the big societal advantages that come from using and understanding the huge amounts of data being generated today.

The report includes examples of the data divide that make the concept easier to understand.

  • Patients who aren’t part of the system that is creating lifetime medical records won’t get the same quality of healthcare as people who have a fully documented medical history. People who don’t use medical monitoring devices won’t be informed early of health problems. People who aren’t included in genetic databases won’t be alerted to lifesaving techniques specific to their genome.
  • People who don’t participate heavily in the transactions that are used to calculate a credit score are often discriminated against even though they may be creditworthy.
  • Poor and rural school districts often don’t collect the same kind of data on students and student achievement, putting those students at a disadvantage for higher education opportunities.

These are just a few of the many examples that are going to become apparent as we move into a world that is data-centric. The report warns that the data gap doesn’t just affect individuals but also whole communities that are not as connected. The report points out the obvious – that access to the best data tends to be a luxury item available to wealthier individuals and communities.

We are now awash in data. Practically everything we do in public is recorded. The report warns that data can be put to good use to benefit society or instead be used to the detriment of individuals and society.

Having access to mountains of data is a relatively new phenomenon that has been created by the combination of large data centers, the nascent AI industry, quantum computing, and companies that specialize in mining data. The report says that we are just entering a new world where data will play an important role in everybody’s life and that now is the time to start thinking, as a country and a society, about how we are going to control and use the mountains of data.

The report suggests that we develop national standards and goals for public data gathering so that decision-makers have the right data at their fingertips. Some of the ideas fostered by the report include:

  • Improve the quality of government-generated data.
  • Create standards to enhance the quality and usability of non-government data.
  • Support standard ways to incorporate crowd-sourced data and private-sourced data into government datasets.
  • Direct government agencies to develop strategies to make sure that data is available to all communities.

The report warns that the data gap will become severe unless we make sure that accurate data is available to everybody. Without a strategy and data policies, we’re on a path where most data will be used to market selectively to people or to lobby people with political ideas based upon a personal profile. Those uses of data seem inevitable, but we ought to be guaranteeing the upsides of using the data gathered about us rather than the downsides.

I’ll be the first to admit that I don’t understand enough about large datasets to know what is floating around in the world. But I know that both the government and private companies are gathering huge amounts of data about us, and it makes sense to create rules that make it hard to misuse the data and that make useful data available to everybody.

Overestimating Inflation

I’ve worked through a number of full economic cycles in the industry, including several recessions:

  • The double-dip recession of 1980–1982 that was worsened by the Iranian revolution and the oil crisis.
  • A recession in 1990 was caused by accumulated consumer debt and another oil price spike.
  • The recession in 2001 was driven by the dot-com (and telecom) crash and worsened by September 11.
  • The Great Recession of 2007–2009 was the result of the subprime mortgage crisis and the meltdown of Wall Street.
  • A short recession in 2020 was caused by the COVID pandemic.

Most of these recessions were followed by periods of economic turmoil, which saw fluctuating interest rates and periods of inflation – like we are experiencing now as the delayed impact of the pandemic.

There are two interesting economic phenomena that recur in times of economic turmoil. The first is that people and businesses don’t believe that interest rates will rise. This current period of rising interest rates is a great example. Anybody who’s been paying attention has been hearing from the Federal Reserve for nearly a year that it would likely have to increase interest rates. With that much warning, people and businesses with adjustable-rate mortgages and loans had plenty of time to refinance the debt to a fixed rate – and yet many did not do so. Mortgage rates recently hit 6% and are likely to increase further, and we’re soon going to be seeing stories of families and businesses defaulting because of the increased debt payments. For whatever reason, most people and businesses don’t refinance, even with persistent warnings that higher interest rates are on the way. This seems to be a recurring theme every time we enter a period of increasing interest rates.

The other phenomenon that I’ve seen repeated over and over is that businesses overreact to inflation. Once they start seeing cost increases, they begin acting like inflation will be permanent and will continue to climb steadily. It has never done that in the U.S. economy. Costs spike, but the rate of inflation invariably slows down and goes back to normal. We’re already seeing inflation slowing due to falling fuel prices – which makes sense since escalating gasoline prices were one of the drivers of inflation.

What do I mean when I say that businesses overreact to inflation? One way that I see it is in helping ISPs build business plans. I have clients that want me to build perpetual inflation into future forecasts. Building never-ending inflation into a forecast can quickly make a good idea look terrible. It is extremely unlikely that inflation will continue unabated – it never has in the U.S. economy. There are parts of the world where high inflation is normal, such as Nigeria, which has seen inflation rates of 15% – 20% annually for many years. The country adjusts for this with currency manipulation, and businesses give big pay raises every year – and the real-life impact of inflation is barely noticed. It’s just how the economy works. But we have monetary policies in the U.S. and most of the top economies that are able to quash continuous high inflation.
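To illustrate why a perpetual-inflation assumption wrecks a forecast, here is a minimal sketch; the 8% rate, $1 million starting cost, and ten-year horizon are hypothetical numbers of my own, not from any client model.

  # Minimal sketch: how a never-ending inflation assumption balloons a cost forecast.
  # The 8% rate, $1 million starting cost, and 10-year horizon are hypothetical.
  annual_cost = 1_000_000        # year-one cost in dollars
  inflation_rate = 0.08          # assumed (unrealistically) to continue forever

  for year in range(1, 11):
      print(f"Year {year}: ${annual_cost:,.0f}")
      annual_cost *= 1 + inflation_rate
  # By year 10 the same cost line has roughly doubled, which is enough to make
  # an otherwise sound business case look terrible.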

The other overreaction to inflation comes from vendors who raise prices in anticipation of inflation instead of reacting to actual cost increases. Economists will tell you that the anticipation of inflation is one of the major drivers of inflation. It becomes a self-fulfilling prophecy when enough of the players in an industry raise prices in anticipation of future higher costs, which then becomes a driver of continued inflation. There has been a lot of recent talk about price gouging and excess profits, and at least some of that came from businesses and industries that raised prices in anticipation that underlying costs would increase. There is a fine line between raising prices to be safe and price gouging. This is a phenomenon that happens in every recession, and the good news is that this also self-corrects over time when people flock to sellers that didn’t raise prices, and market prices eventually stabilize and stop climbing.

One of the most interesting things about the current period of inflation is that I’m not seeing ISPs increasing prices to keep up with inflation. I’m sure some ISPs have raised rates, but most broadband prices have not climbed in the past six months, as might be expected from looking at prices in other industries.

There are good reasons for this that are not related to inflation. The biggest cable companies have suddenly stopped growing, and they are probably afraid that raising rates will push folks to competitive alternatives – we’ll find out for sure at the end of the year. And most smaller ISPs take direction from the behavior of their large competitors – nobody wants to raise rates if they are competing against a bigger company that isn’t raising rates.

The good news for the industry is that everything will return to normal in a year or two – it always does. Interest rates will reach a peak and then slowly drop when the Federal Reserve no longer needs to be tweaking borrowing. Inflation will slow and will return to normal trends, and everybody will forget about it until the next crisis. This doesn’t mean that there won’t be casualties. There will be ISPs that will get into trouble if they hold a lot of variable rate debt – and some may fold. There will be some new projects that get derailed when costs climb higher than expected. But overall, the broadband sector will hunker down and wait out the current trends.


The Latest Stats on Broadband Usage

OpenVault released its Broadband Insights Report for 2Q22, which contains statistics about nationwide broadband usage at the end of the quarter. As usual, the OpenVault report is an invaluable peek into the national state of broadband.

The report shows that the average home used 491 gigabytes of data at the end of the second quarter. This is up 13% from the 433 gigabytes used at the end of the second quarter of 2021. Second-quarter usage, measured as of June 30, is typically the lowest of the year due to schools being out of session and folks being on vacation.

Upload bandwidth usage continues to grow and averaged 31.2 gigabytes per household, up from 28 gigabytes in 2021 and 13.6 gigabytes in 2018. Average upload speeds increased from 17 Mbps in 2021 to 23 Mbps in 2022.

The percentage of households that use more than 1 terabyte (1,000 gigabytes) of data each month continues to grow rapidly. OpenVault calls these power users. These are homes that will trigger data caps if they have an ISP that enforces them. At the end of the second quarter of 2022, 13.7% of homes were using more than a terabyte, compared to 10.8% in 2021 – an increase of over 26%. The percentage of homes using more than 2 terabytes increased from 1.5% in 2021 to 2.2% in 2022, an increase of 47%.

There has been a huge migration of folks subscribing to faster tiers of broadband. The most extraordinary statistic is that 14.2% of American homes now subscribe to gigabit service, up from 4.6% just two years ago.

Much of this shift, shown in the table below, is coming from cable companies that unilaterally increased speeds for customers, but many millions of customers have also upgraded to more expensive broadband tiers to get faster broadband. The following chart shows a remarkable trend since June 2020:

Subscriptions       June 2020   June 2021   June 2022
Under 50 Mbps          18.4%       10.5%        5.7%
50 – 99 Mbps           20.4%        9.6%        8.5%
100 – 199 Mbps         37.8%       47.5%       10.1%
200 – 499 Mbps         13.5%       17.2%       55.4%
500 – 999 Mbps          5.0%        4.7%        6.0%
1 Gbps                  4.9%       10.5%       14.2%

This chart, better than anything else I’ve seen, lays to rest the idea that the national definition of broadband ought to be 100 Mbps download. As of June of this year, nearly 76% of U.S. households subscribe to broadband plans of 200 Mbps or faster. The most popular tier is 200 – 499 Mbps, which makes sense since the big cable companies have migrated most broadband customers to speeds of 200 Mbps or 300 Mbps.
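As a quick arithmetic check on that claim, here is a short sketch that sums the June 2022 tiers from the table above; the percentages are taken directly from the table, and the rounding is mine.

  # Quick check: share of households subscribing to 200 Mbps or faster,
  # using the June 2022 column from the table above.
  june_2022 = {
      "Under 50 Mbps": 5.7,
      "50 - 99 Mbps": 8.5,
      "100 - 199 Mbps": 10.1,
      "200 - 499 Mbps": 55.4,
      "500 - 999 Mbps": 6.0,
      "1 Gbps": 14.2,
  }

  fast_tiers = ["200 - 499 Mbps", "500 - 999 Mbps", "1 Gbps"]
  share = sum(june_2022[tier] for tier in fast_tiers)
  print(f"{share:.1f}% of households subscribe to 200 Mbps or faster")
  # Prints 75.6%, which rounds to the roughly 76% cited above.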

Interestingly, OpenVault says that ACP customers are using more broadband, at 654 gigabytes, than the average home. The percentage of power users – homes using more than a terabyte of data – is also 52% higher among ACP subscribers than among the overall population. OpenVault doesn’t speculate why this is so, but I would guess that part of the difference might be that homes getting broadband for the first time have a lot of streaming video from recent years to catch up on. I remember how much video my household watched when we first got Netflix – but over time, we went back to other activities.

The FCC Mapping Fabric

You’re going to hear a lot in the next few months about the FCC’s mapping fabric. Today’s blog describes what that is and the challenges of getting a good mapping fabric.

The FCC hired CostQuest to create the new system for reporting broadband coverage. The FCC took a lot of criticism about the old mapping system, which assumed that an entire Census block was able to buy the fastest broadband speed available anywhere in that Census block. This means that even if only one home is connected to a cable company, the current FCC map shows that everybody in the Census block can buy broadband from the cable company.

To fix this issue, the FCC decided that the new broadband reporting system would eliminate this problem by having an ISP draw polygons around areas where it already serves or could provide service within ten days after a customer request. If done correctly, the new method will precisely define the edge of cable and fiber networks.

The creation of the polygons creates a new challenge for the FCC – how to count the passings inside of any polygon an ISP draws. A passing is any home or business that is a potential broadband customer. CostQuest tried to solve this problem by creating a mapping fabric. A simplistic explanation is that they placed a dot on the map for every known residential and business passing. CostQuest has written software that allows them to count the dots of the mapping fabric inside of any possible polygon.
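To make the counting step concrete, here is a minimal sketch of the point-in-polygon idea using the shapely library. The polygon and passing coordinates are invented for illustration – this shows only the general concept, not CostQuest’s actual software.

  # Minimal sketch of counting mapping-fabric points inside an ISP's polygon.
  # The polygon and passing coordinates below are made up for illustration.
  from shapely.geometry import Point, Polygon

  # A claimed service area drawn by an ISP (longitude, latitude pairs).
  service_area = Polygon([
      (-80.10, 35.20), (-80.00, 35.20), (-80.00, 35.30), (-80.10, 35.30),
  ])

  # The mapping fabric: one point per known home or business passing.
  fabric = [
      Point(-80.05, 35.25),   # inside the claimed area
      Point(-80.02, 35.28),   # inside the claimed area
      Point(-80.20, 35.25),   # outside the claimed area
  ]

  passings_covered = sum(1 for p in fabric if service_area.contains(p))
  print(f"Passings inside the claimed service area: {passings_covered}")   # prints 2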

That sounds straightforward, but the big challenge was creating dots that match the actual passings. My consulting firm has been helping communities count passings for years as part of developing broadband business plans, and it is never easy. Communities differ in the raw data available to identify passings. Many counties have GIS mapping data that shows the location of every building in the community. But the accuracy and detail of the GIS mapping data differ drastically by county. We have often tried to validate GIS data against other sources of data like utility records. We’ve also validated against 911 databases that show each registered address. Even for communities that have these detailed records, it can be a challenge to identify passings. We’ve heard that CostQuest used aerial maps to count rooftops as part of creating the FCC mapping fabric.

Why is creating a fabric so hard? Consider residential passings. The challenge becomes apparent as soon as you start thinking about the complexities of the different living arrangements in the world. Even if you have great GIS data and aerial rooftop data, it’s hard to account for some of the details that matter.

  • How do you account for abandoned homes? Permanently abandoned homes are not candidates for broadband. How do you make the distinction between truly abandoned homes and homes where owners are looking for a tenant?
  • How do you account for extra buildings on a lot? I know somebody who has four buildings on a large lot that has only a single 911 address. The lot has a primary residence and a second residence built for a family member. There is a large garage and a large workshop building – both of which would look like homes from an aerial perspective. This lot has two potential broadband customers, and it’s likely that somebody using GIS data, 911 data, or aerial rooftops won’t get this one property right. Multiply that by a million other complicated properties, and you start to understand the challenge.
  • Farms are even harder to count. It wouldn’t be unusual for a farm to have a dozen or more buildings. I was told recently by somebody in a state broadband office that it looks like the CostQuest mapping fabric is counting every building on farms – at least in the sample that was examined. If this is true, then states with a lot of farms are going to get a higher percentage of the BEAD grants than states that don’t have a lot of compound properties with many buildings.
  • What’s the right way to account for vacation homes, cabins, hunting lodges, etc.? It’s really hard with any of the normal data sources to know which ones are occupied full time, which are occupied only a few times per year, which have electricity, and which haven’t been used in many years. In some counties, these kinds of buildings are a giant percentage of buildings.
  • Apartment buildings are really tough. I know from working with local governments that they often don’t have a good inventory of the number of apartment units in each building. How is the FCC mapping data going to get this right?
  • I have no idea how any mapping fabric can account for homes that include an extra living space like an in-law or basement apartment. Such homes might easily represent two passings unless the two tenants decide to share one broadband connection.
  • And then there is the unusual stuff. I remember being in Marin County, California and seeing that almost every moored boat has a full-time occupant who wants a standalone broadband connection. The real world is full of unique ways that people live.

Counting businesses is even harder, and I’m not going to make the list of the complexities of defining business passings – but I think you can imagine it’s not easy.

I’m hearing from folks who are digging into the FCC mapping fabric that there are a lot of problems. ISPs say they can’t locate existing customers. They tell me there are a lot of mystery passings shown that they don’t think exist.

We can’t blame CostQuest if they didn’t get this right the first time – Americans are hard to count. I’m not sure this is ever going to be done right. I’m sitting here scratching my head and wondering why the FCC took this approach. I think a call to the U.S. Census Bureau would have gotten the advice that this is an impossible goal. The Census spends a fortune every ten years trying to identify where people live. The FCC has given itself the task of creating a 100% census of residences and businesses and updating it every six months.

The first set of broadband map challenges will be about the fabric, and I’m not sure the FCC is ready for the deluge of complaints they are likely to get from every corner of the country. I also have no idea how the FCC will determine if a suggestion to change the fabric is correct because I also don’t think communities can count passings perfectly.

This is not the only challenge. There are going to be challenges of the coverage areas claimed by ISPs. The big challenge, if the FCC allows it, will be about the claimed broadband speeds. If the FCC doesn’t allow speed challenges, it is going to get buried in complaints. I think the NTIA was right to let the dust settle on challenges before using the new maps.

The Big Ethernet Carrier Market

I haven’t talked about the big Ethernet carriers for a while. These are the giant companies that serve many of the largest businesses in the country and that also haul broadband between cities. The U.S. Carrier Ethernet Leaderboard tracks and ranks these carriers.

In the latest ranking from June 2022, the Leaderboard says that the largest six Ethernet carriers are Lumen, AT&T, Spectrum Enterprise, Verizon, Comcast Business, and Cox Business. These six carriers each have at least a 4% share of the U.S. Ethernet market. The next tier on the Leaderboard includes Altice USA, Cogent, Frontier, GTT, Windstream, and Zayo. These six carriers have a national market share between 1% and 4%.

The rankings are based on the number of billable Ethernet retail customer ports installed. In past years, the rankings tracked lit buildings, but billable ports reflect that some customers buy more than one major type of Ethernet connection. There are six categories of Ethernet service that are counted as ports, including:

  • Ethernet DIA (dedicated Internet access). This is a relatively new service and connects customers directly to the Internet without passing through any intermediate carriers.
  • E-Access to IP/MPLS VPN. This is the most commonly sold big Ethernet product at 36% of the U.S. market and is more commonly called business-class virtual private network. MPLS VPNs are used to switch multiple kinds of broadband traffic across the same broadband connection.
  • Ethernet Private Lines. Private lines connect two locations with no switching in between. As an example, a bank might buy a private line between each bank branch in a city, and no carrier touches the traffic between branches.
  • Ethernet Virtual Private Lines. This is similar to dedicated private lines in that traffic is encrypted and not visible to carriers between the two end points.
  • Metro LAN. This uses Ethernet to connect multiple locations within a metropolitan network.
  • WAN VPLS. This extends Metro LAN service across the country or the world.

The next lower tier of large carriers includes companies that have less than a 1% share of the national Ethernet market. Some of the better-known names include ACD, AireSpring, Alaska Communications, Alta Fiber, American Telesis, Arelion, Armstrong Business Solutions, Astound Business, Breezeline, BT Global Services, Centracom, Consolidated Communications, Conterra, Crown Castle, Douglas Fast Net, DQE Communications, ExteNet Systems, Fatbeam, FiberLight, First Digital, FirstLight, Flo Networks, Fusion Connect, Global Cloud Xchange, Great Plains Communications, Hunter Communications, Intelsat, Logix Fiber Networks, LS Networks, MetTel, Midco, Momentum Telecom, NTT, Orange Business, Pilot Fiber, PS Lightwave, Ritter Communications, Segra, Shentel Business, Silver Star Telecom, Sparklight Business, Syringa, T-Mobile, Tata, TDS Telecom, TPx, Unite Private Networks, Uniti, US Signal, WOW!Business, Ziply Fiber and other companies selling retail Ethernet services in the U.S. market.

The names at the top of the Leaderboard are familiar since those are also most of the largest retail ISPs in the country.

What many people don’t realize is that most cities of any size include connections from some of these carriers. For example, most national chain stores, hotels, or other large national businesses have a single carrier that coordinates and connects all of its locations. This allows big businesses to efficiently and reliably connect locations to headquarters or the cloud. Anybody that has crawled through the FCC’s 477 data will see a number of these carriers listed as fiber providers in most cities.

In most cases, these carriers use somebody else’s fiber to connect to customers. Some large carriers like AT&T, Lumen, or Comcast will build fiber to business districts and then sell wholesale arrangements to carriers that need to reach specific businesses. It’s not unusual for a local outlet of a national business to not even know who the underlying carrier is – that’s something arranged by a corporate office and done behind the scenes. But any ISP salesperson knocking on the doors of chain locations is familiar with the story that the ISP connections are arranged by corporate.

I know a few fiber overbuilders who have cracked open a small piece of this market and have convinced some of the carriers on the list to buy from them instead of from one of the national carriers. It’s not easy to get onto the radar of these carriers, but it can be done with persistence.

The 12 GHz Battle

A big piece of what the FCC does is to weigh competing claims to use spectrum. It seems like there have been non-stop industry fights over the last decade on who gets to use various bands of spectrum. One of the latest fights, which is the continuation of a fight going on since 2018, is for the use of the 12 GHz spectrum.

The big wrestling match is between Starlink, which wants to use the spectrum to communicate with its low-orbit satellites, and cellular carriers and WISPs, which want to use the spectrum for rural broadband. Starlink uses this spectrum to connect its ground-based terminals to satellites. Wireless carriers argue that the spectrum should also be shared to enhance rural broadband networks.

The 12 GHz band is attractive to Starlink because it contains 500 MHz of contiguous spectrum with 100 MHz channels – a big data pipe for reaching between satellites and earth. The spectrum is attractive to wireless ISPs for these same reasons, along with other characteristics. The 12 GHz spectrum will carry twice as far as the other spectrum used in point-to-multipoint broadband networks, meaning it can cover four times the area from a given tower. The spectrum is also clear of any federal or military encumbrance – something that restricts other spectrum like CBRS. The spectrum is also being used for cellular purposes internationally, which makes for an easy path to find the radios and receivers to use it.
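The four-times-the-area claim is just circle geometry – coverage area grows with the square of the reach – as this small sketch shows; the tower ranges are arbitrary example values.

  # Why doubling the reach from a tower quadruples the coverage area.
  # The example ranges below are arbitrary illustrative values.
  import math

  base_range_miles = 3          # example reach with other spectrum
  extended_range_miles = 6      # example reach at 12 GHz (twice as far)

  base_area = math.pi * base_range_miles ** 2
  extended_area = math.pi * extended_range_miles ** 2
  print(f"Coverage ratio: {extended_area / base_area:.0f}x")   # prints 4x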

In the current fight, Starlink wants exclusive use of the spectrum, while wireless carriers say that both sides can share the spectrum without much interference. These are always the hardest fights for the FCC to figure out because most of the facts presented by both sides are largely theoretical. The only true way to find out about interference is in real-world situations – something that is hard to simulate any other way.

A few wireless ISPs are already using the 12 GHz spectrum. One is Starry, which has recently joined the 12 GHz Coalition, the group lobbying for terrestrial use of the spectrum. This coalition also includes other members like Dish Networks, various WISPs, and the consumer group Public Knowledge. Starry is one of the few wireless ISPs currently using millimeter-wave spectrum for broadband. The company added almost 10,000 customers to its wireless networks in the second quarter and is poised to grow a lot faster. If the FCC opens the 12 GHz spectrum to all terrestrial uses, it seems likely that the spectrum would quickly be put to use in many rural areas.

As seems usual these days, both sides in the spectrum fight say that the other side is wrong about everything they are saying to the FCC. This must drive the engineers at the FCC crazy since they have to wade through the claims made by both sides to get to the truth. The 12 GHz Coalition has engineering studies that show that the spectrum could coexist with satellite usage with a 99.85% assurance of no interference. Starlink, of course, says that engineering study is flawed and that there will be significant interference. Starlink wants no terrestrial use of the spectrum.

On the flip side, the terrestrial ISPs say that the spectrum in dispute is only 3% of the spectrum portfolio available to Starlink, and the company has plenty of bandwidth and is being greedy.

I expect that the real story is somewhere in between the stories told by both sides. It’s these arguments that make me appreciate the FCC technical staff. It seems every spectrum fight has two totally different stories defending why each side should be the one to win use of spectrum.

Are BEAD Grants Large Enough?

One of the biggest questions associated with the $42.5 billion BEAD grant program is whether that is enough money to solve the national rural digital divide. The funding will be allocated to states in a three-step process. First, each State will get an automatic $100 million. Next, $4.2 billion will be directly allocated to States using the relative percentage of locations in each state defined as unserved and high-cost. This will rely on the new FCC maps, and the NTIA may still refine the definition of high-cost areas. The remaining $38.1 billion will also be allocated to States using the new FCC maps, based on the relative number of unserved locations in each State.
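Here is a minimal sketch of that allocation logic; the unserved-location counts are invented for illustration, and the real allocations will depend on the final FCC maps, the NTIA’s high-cost definitions, and the exact statutory formula.

  # Minimal sketch of the BEAD allocation logic described above.
  # The unserved-location counts are invented; real numbers come from the FCC maps.
  BASE_PER_STATE = 100_000_000        # automatic $100 million per state
  REMAINING_POOL = 38_100_000_000     # split by each state's share of unserved locations

  unserved_locations = {              # hypothetical counts
      "State A": 400_000,
      "State B": 150_000,
      "State C": 50_000,
  }

  total_unserved = sum(unserved_locations.values())
  for state, unserved in unserved_locations.items():
      allocation = BASE_PER_STATE + (unserved / total_unserved) * REMAINING_POOL
      print(f"{state}: ${allocation / 1e9:.2f} billion")
  # The separate $4.2 billion high-cost pool would be layered on top with a
  # similar proportional calculation once high-cost areas are defined.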

The funding works out to be around $850 million per state on average, but the amount will vary significantly by state. Preliminary estimates have a number of states only getting $100 million – Connecticut, Delaware, the District of Columbia, Hawaii, Maine, New Hampshire, North Dakota, Rhode Island, and Vermont. The largest allocations are estimated to go to Texas at $4.2 billion and California at $2.8 billion.

States have been doing the math to see if they think the BEAD grant funding will be enough to reach every rural household with good broadband. I’ve only been able to find one article that cites an estimate of the effectiveness of the BEAD grants, but this one example raises some good questions.

The State of Minnesota is estimated to receive about $650 million in BEAD grant funding. In March of this year, the State Legislature approved $110 million for the existing Border-to-Border grant program, with most of the funding coming from federal ARPA funding given to the state. At that time, the State broadband office estimated that the state would need around $1.3 billion in total grant funding to reach everybody in the state. If that is a good estimate, then even after the BEAD grants and the $110 million in State grants, the state will be $540 million short.

This raises a lot of questions. First, inflation has hit the broadband industry hard, and I’ve seen a lot of estimates that the cost to build broadband networks is between 15% and 25% higher than just two years ago. That means that the $42.5 billion in BEAD funding is not going to stretch nearly as far as was estimated when Congress established the BEAD grants. This also raises the question of how much inflation will further increase costs over the years it’s going to take to build BEAD-funded networks. It’s not hard to imagine BEAD networks still being constructed in 2026 and beyond.

I’ve also seen estimates that the rules established by Congress and the NTIA for the BEAD grants could add as much as another 15% to the cost of building broadband networks compared to somebody not using grant funding. These extra costs come from a variety of factors, including the requirement to pay prevailing wages, expensive environmental studies that are not undertaken for non-grant projects, the requirement of getting a certified letter of credit, etc. The extra grant-related costs and the general inflation in the industry might mean that BEAD projects could cost 30% or more above what it would have cost to build the same networks two years ago without grant funding.
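As a rough check on that 30% figure, compounding the two effects gets there quickly; the 20% inflation and 15% grant-overhead figures below are hypothetical midpoints of the ranges mentioned above, not measured values.

  # Rough check: compounding industry cost inflation with grant-compliance overhead.
  # The 20% and 15% figures are hypothetical midpoints, not measured values.
  inflation_increase = 0.20     # construction cost inflation over the last two years
  grant_overhead = 0.15         # prevailing wages, environmental studies, letter of credit, etc.

  combined = (1 + inflation_increase) * (1 + grant_overhead) - 1
  print(f"Combined cost increase: {combined:.0%}")   # prints 38%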

This also raises an interesting question about how states allocated ARPA funding to broadband. Minnesota’s allocation of $110 million to broadband from ARPA is smaller than what many other states have done. As an example, my state of North Carolina allocated nearly $1 billion of the state’s ARPA money to broadband, and there are many states that have allocated $300 million or more to broadband. Part of the blame for a state like Minnesota not having enough money to reach everybody could be placed on the Legislature for not allocating much ARPA funding for broadband.

Another interesting question to be addressed is how State broadband offices will deal with areas where a 75% grant is not enough for an ISP to make a business case. From the feasibility work I’ve been doing this year, I think there are a lot more areas that fit the high-cost category than might be expected. The NTIA says that it might allow exceptions for grants up to 100% of the cost of assets – but asking for extra funding will probably open up the possibility for a State to instead fund less costly technologies. It might turn out that finding solutions for the many high-cost areas is the unpredictable wild card in the BEAD grant process.

Finally, there are going to be areas where a State doesn’t make a BEAD grant award. It’s not hard to imagine a situation where only one ISP asks to serve an area, and a State broadband office decides that the ISP is unqualified to receive funding.

If the Minnesota estimate is even roughly accurate, it’s likely that Minnesota won’t be the only state that doesn’t receive enough BEAD money to get broadband to everybody. We’re not going to know this for sure until ISPs start applying for grants, but it won’t be a surprise if the BEAD grants are not large enough.

Congressional Push for a National Broadband Strategy

In August, a bill was sent to the Senate Committee on Commerce, Science, and Transportation to align the federal government’s efforts related to broadband. The bill was co-sponsored by Senators Roger Wicker, R-Mississippi, and Ben Ray Luján, D-New Mexico, and Representatives Tim Walberg, R-Michigan, and Peter Welch, D-Vermont.

The bill, S. 4767, is titled the Proper Leadership to Align Networks (PLAN) for Broadband Act. The legislation is based upon a report earlier this year from the Government Accountability Office that determined that federal broadband efforts are fragmented and overlapping. The bill proposes that the President develop a national broadband strategy to better align the federal broadband effort.

There is no question that national broadband policy is fragmented. We have an FCC that is ostensibly in charge of broadband policy but that essentially washed its hands of broadband regulation under past Chairman Ajit Pai. The FCC has been in charge for years of tracking the state of broadband in the country and completely botched that task through an inadequate mapping process that allowed ISPs to report whatever they wanted about broadband coverage. For much of the last few decades, the feeling in DC has been that the FCC is in the pocket of the giant ISPs the agency is supposed to be regulating.

Congress gave responsibility for the giant BEAD grant program to the NTIA, largely because Congress didn’t trust the FCC to administer the grant program. But the NTIA doesn’t have a lot of authority outside of the grant program. When the BEAD grants are behind us, the NTIA will fade into obscurity again in terms of national broadband policy.

The latecomer to the game is the FTC. The FCC handed some authority to the FTC when it abandoned broadband regulation. But the FTC mostly only prosecutes individual ISPs for bad behavior and has no authority to impose any regulation on all ISPs.

This bill is asking the Executive branch to take a shot at fixing federal broadband dysfunction through the creation of a broadband plan. I guess this plan would be aimed at discussing how to put broadband regulation back together again to have a cohesive federal policy. If you’ve read my blog for years, you know how I feel about broadband plans. They are only as good as any follow-through on the recommendations made. The decade-old national broadband plan was as near as you could get to a total bust – not because it didn’t include good recommendations, but because it was put on the shelf and quickly forgotten.

It’s hard to think that a new broadband plan, even one coming from this legislation, would fare any better than the last one. It will likely be a document with a few good ideas – but ideas that are softened to appease the many parties with input to the plan. It’s hard to imagine a new federal broadband plan going anywhere but on the shelf, as in the past.

I find it almost humorous that Congress would ask the White House to come up with the plan on how to fix the national broadband administration and regulation. The White House has almost zero power to implement any ideas the plan might suggest.

The one government entity that can create a coherent broadband plan is Congress. Congress writes the rules that direct how the FCC operates, and it could change the direction of the FCC overnight. Congress is the one that gave the NTIA its current strong role in setting national broadband policy through the grant process and could expand that role if desired.

If Congress wants a coherent broadband policy, it needs to do nothing more than go into a room and write it. This Act is a way for Congress to pretend to be addressing broadband without actually doing so. If nothing happens after the creation of a newly written broadband plan, Congress can blame the White House.

The reality is that there are not enough votes in Congress to pass a new Telecommunications Act, which is what is needed to put national telecom policy back on track. There have obviously not been enough votes over the last decade to make any drastic changes to telecom policy. The large ISPs have bought enough influence in both parties to sidetrack any attempt by the federal government to regain the reins of broadband policy.

There is no telling if this particular legislation has enough legs to get to a floor vote – but it’s the kind of legislation that could garner enough votes from both parties to pass since the outcome threatens nobody.

Averting a Mapping Disaster?

Alan Davidson, the head of the National Telecommunications and Information Administration, recently announced that the agency is canceling plans to use the first iteration of the new FCC maps that the FCC says will be available by early November. Davidson says that he feels obligated to let the FCC’s challenge process play out before using the mapping data. I’m sure this wasn’t an easy decision, but it signals that the agency believes it’s better to hold out for a more accurate map than to settle for the first iteration of the new FCC maps.

This decision will clearly add more time and delay to the $42.5 billion BEAD grant program. But the decision to wait recognizes that using incorrect maps would almost inevitably mean lawsuits that could delay the grant program even longer.

The timing of the new maps became unfortunate when Congress mandated that the FCC maps must be used to allocate over $38 billion in grant funding to states. The FCC has been stating all summer that it hopes that the new maps will be relatively accurate and will fix many of the obvious problems in the current broadband maps. If it wasn’t for the pressure of the BEAD grant program, the FCC would have had several cycles of the new maps to smooth out kinks and errors in the reporting before they had to bless the new maps as solid. The NTIA decision to delay relieves the pressure to have the first set of maps be error-free – which nobody believes will happen. I have a hard time recalling any cutover of a major government software system that was right the first time, and the FCC’s assurances all summer have felt more like bravado than anything else.

Over the last few weeks, I’ve been talking to the engineers and other folks who are helping ISPs with the new maps. I didn’t talk to anybody who thinks the new maps will be solid or accurate. Engineers are, by definition, somewhat cautious folks, but I expected to find at least a few folks who thought the new maps would be okay.

I’ve been saying for six months that the likelihood of the new maps being accurate is low, and I was thinking about not writing anything more about mapping until we see what the new maps produce. However, I was prompted to write about mapping again when I saw a headline in FierceTelecom that quoted Jonathan Chambers of Conexon saying that the new maps will be a train wreck. Conexon is working with electric cooperatives all across the country to build broadband networks, which gives the company an interesting perspective on rural issues.

Jonathan Chambers cites two reasons for pessimism. One is the reason I already mentioned, which is that it’s irrational to use the outputs of a new federal mapping system to allocate billions of dollars between states. He says that there are simpler alternatives that would take all of the pressure off the new mapping system. He’s right, but unfortunately, Congress specifically required in the IIJA legislation that the FCC maps be used. It would take an act of Congress to change that ruling.

Chambers is also pessimistic about the challenge process that is being allowed for the new maps. He expects the challenges to be major and ongoing. It seems unlikely that the FCC is prepared to investigate the huge number of protests that could come from every corner of the country claiming that the new maps got the facts wrong.

My discussions with engineers raised other questions not mentioned by Chambers. Some engineers told me that the underlying mapping fabric has a lot of mistakes. This is where CostQuest, the firm that created the new mapping system, laid out the location nationwide of every possible broadband customer. This was a nearly impossible task in the short time the company had to create the maps. I’ve been working for years with local governments that use GIS data to define potential broadband locations, and it’s always a challenge to identify only those buildings where somebody might buy broadband and exclude buildings used for some other purpose.

My biggest concern is that ISPs are still allowed to report marketing speeds instead of actual speeds, and I fear that ISPs will be motivated to overstate broadband speeds in the new maps (like many have done in the old ones). Any areas designated by the maps as already having broadband available at 100/20 Mbps will be declared ineligible for the BEAD grants, and any ISP that wants to protect against being overbuilt has a high motivation to claim that speed – and it seems likely that many of them will do so. I don’t know if this is true, but my interpretation of the FCC map challenge process is that the FCC won’t entertain challenges based on speed, only on the coverage area. If that is true, there will be a huge uproar from states and communities that get disadvantaged by deceptive reporting by ISPs.

I’ve also heard from ISPs in the last week that were unable to navigate the new mapping system by the deadline. These are relatively small ISPs, but many of them have built fiber and it’s not good to have them excluded from the maps. I’ve heard from multiple sources that the new mapping system is not easy to use. I’ve heard from ISPs who didn’t have an engineer who was able to certify the maps and just gave up.

I guess we’ll find out in a few months how the first draft of the maps turns out. The FCC says it will release the results by early November. I expect there are a whole lot of folks who are poised to compare the new maps to their local knowledge of actual broadband usage – and then the challenges will begin.