The New Speed Battle

I’ve been thinking about the implications of having a new definition of broadband at 100/20 Mbps. That’s the threshold set in several giant federal grant programs that allow funding for areas with broadband slower than 100/20 Mbps. This is also the number that has been bandied about in the industry as the likely new definition of broadband once the FCC seats a fifth Commissioner.

The best thing about a higher definition of broadband is that it finally puts the DSL controversy to bed. A definition of broadband of 100/20 Mbps clearly says that DSL is no longer considered to be broadband. A 100/20 Mbps definition of broadband means we can completely ignore whatever nonsense the big telcos report to the FCC mapping process.

Unfortunately, by killing the DSL controversy we start a whole new set of speed battles with cable companies and WISPs that will be similar to the controversy we’ve had for years with DSL. Telcos have claimed 25/3 Mbps broadband coverage over huge parts of rural America in an attempt to deflect broadband grants. In reality, there is almost no such thing as a rural customer who can get 25/3 Mbps DSL unless they sit next to a DSLAM. But the telcos have been taking advantage of the theoretical capacity of DSL, and the lax rules in the FCC mapping process allowed them to claim broadband speeds that don’t exist. I hate to admit it, but overstating DSL speeds has been a spectacularly successful strategy for the big telcos.

We’re going to see the same thing all over again, but the new players will be cable companies and WISPs. The controversy this time will be more interesting because both technologies theoretically can deliver speeds greater than 100/20 Mbps. But as with DSL, the market reality is that there are a whole lot of places where cable companies and WISPs are not delivering 100/20 Mbps speeds and would not be considered broadband under a 100/20 Mbps yardstick. You can take it to the bank that cable companies and WISPs will claim 100/20 Mbps capability if it helps to block other competitors or if it helps them win grants.

The issue for cable companies is the upload speed. One only has to look at the mountains of speed tests gathered around the country to see that cable upload speeds are rarely even close to 20 Mbps. We’ve helped cities collect speed tests where maybe 5% of customers are reporting speeds over 20 Mbps, while the vast majority of cable upload speeds are measured at between 10 Mbps and 15 Mbps. Usually, the only cable customers with upload speeds over 20 Mbps are ones who have ponied up to buy an expensive 400 Mbps or faster download product – and even many of them don’t see upload speeds over 20 Mbps.
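To illustrate what that kind of analysis looks like (the numbers below are made up for illustration, not drawn from any real city survey), a few lines of Python can tally the share of upload tests that clear the 20 Mbps bar:

```python
# Hypothetical upload-speed test results (Mbps) collected in one market.
upload_tests = [11.2, 14.8, 9.7, 12.3, 21.5, 13.1, 10.4, 15.0, 12.9, 22.8,
                14.2, 11.8, 13.6, 10.9, 12.5, 14.1, 16.0, 11.3, 13.9, 12.0]

threshold = 20  # the upload half of a 100/20 Mbps definition of broadband

# Count the tests that meet or exceed the upload threshold.
meeting = [s for s in upload_tests if s >= threshold]
share = len(meeting) / len(upload_tests)

print(f"{share:.0%} of tests meet the {threshold} Mbps upload threshold")
```

With this made-up sample, only 2 of 20 tests clear the threshold, which mirrors the roughly 5% figure we see in real city data.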

This raises the question of what a definition of broadband means. If 95% of the customers in a market can’t achieve the defined upload speeds, is a cable company delivering broadband under a 100/20 Mbps definition? We know how the telcos answered this question in the past with DSL, and it’s not hard to guess how the cable companies are going to answer it.

It’s not a coincidence that this new controversy has materialized. The first draft of several of the big grant programs included a definition of broadband of 100/100 Mbps – a speed that would have shut the door on cable companies. But cable company lobbying began immediately, and the final rules from Congress included the slimmed-down 100/20 Mbps broadband definition.

WISPs have a more interesting challenge because the vast majority of existing WISP connections are nowhere close to meeting either the upload or download speed of 100/20 Mbps. But fixed wireless technology is capable of meeting those speeds. A WISP deploying a new state-of-the-art system can achieve those speeds today for some reasonable number of miles from a tower in an area with good lines of sight. But most existing WISPs are deploying older technology that can’t come close to a 100/20 Mbps test. Even WISPs with new technology will often serve customers who are too far from a tower to get the full speeds. Just like with cable companies, the 100/20 Mbps definition of broadband will allow WISPs to stay in the game to pursue grants even when customers are not receiving the 100/20 Mbps speeds. So brace yourself, because the fights over speeds are far from over.

A Peek at ISP Stock Prices

Every few years, I take a look at the stock prices of the big ISPs. The big publicly traded companies seem to care more about stock prices than almost anything else. I was curious about how the stock prices of the big ISPs compared to their performance measured in terms of net new broadband customers added through the first three quarters of 2021. The following table shows the stock prices at the beginning of 2021 and 2022, along with the change in broadband customers.

Company      2022 Price   2021 Price   % Change   Customer Adds   % Change
Comcast         $50.33       $50.51      -0.4%       1,114,000       3.6%
Charter        $651.97      $647.03       0.8%       1,020,000       3.5%
Altice          $16.18       $36.71     -55.9%          28,900       0.7%
Cable One    $1,763.45    $2,152.22     -18.1%         173,000      20.2%
AT&T            $24.60       $29.44     -16.4%         126,000       0.8%
Verizon         $51.96       $58.85     -11.7%         208,000       2.9%
Lumen           $12.55        $9.72      29.1%       (178,000)      -3.7%
Frontier        $29.49       $26.95       9.4%       (280,000)      -9.1%
TDS             $20.15       $18.84       7.0%          29,500       6.0%

I am as far away from being a stock analyst as can be imagined. I didn’t do any serious research into these company stocks, and so this blog is mostly idle speculation.

2021 was not a great year for the big cable company stocks. Comcast and Charter each grew by over one million broadband customers in three quarters, and yet their stock prices stagnated for the year. In addition to growing customers, both companies had healthy rate increases last year that had to have boosted earnings. It makes me wonder what these companies would have to do to get their stock prices to rise. It’s hard to think that these companies can continue to add a million new broadband customers each year.

Altice stock took a big bath, losing more than half its value during the year. It certainly can’t help that the company has stagnated while other cable companies have added customers – in the most recent quarter, the company lost net customers. But a drop that large must be due to factors other than just poor customer growth.

Cable One’s big customer growth seemingly comes from the acquisition of properties from WOW!. I don’t know how the company is growing outside of that acquisition.

The two big telephone companies fared even worse than the biggest cable companies, with substantial drops in stock prices. This was despite AT&T and Verizon seeing net gains in broadband customers for the first time in several years. But these companies are both valued far more as cellular companies than as broadband companies – and the stock price drops likely have a lot more to do with the increasing competitiveness of the cellular market.

Interestingly, Lumen and Frontier both had increases in stock prices despite a continued net drop in broadband customers. Lumen might have gotten a boost from the announced sale of mostly copper customers in twenty states. Frontier’s stock price boost is likely due mostly to coming out of bankruptcy with a healthy balance sheet. Both of these companies are now aggressively adding fiber customers and it will be interesting to see what that does to stock prices over the next year.

TDS is the only company on the list where the stock price change matches broadband customer growth. I’m sure that’s more a coincidence than anything else.

I worked for several publicly traded companies during my career, and I know that stock prices are always at the forefront of internal discussions about company policies and goals. The 2021 stock price performance must be creating some interesting conversations in boardrooms around the industry.

Deploying 10-Gigabit PON

From a cost perspective, we’re not seeing any practical difference between the price of XGS-PON that offers a 10-gigabit data path and traditional GPON. I have a number of clients now installing XGS-PON, and we now recommend it for new fiber projects. I’ve been curious about how ISPs are going to deploy the technology in residential and small-business neighborhoods.

GPON has been the technology of choice for well over a decade. GPON delivers a download path of 2.4 gigabits of bandwidth to each neighborhood PON. Most of my clients have deployed GPON in groups of up to 32 customers in a neighborhood PON. In practical deployment, most of them pack a few fewer than 32 onto the typical GPON card.

I’m curious about how ISPs will deploy XGS-PON. From a pure math perspective, an XGS-PON network delivers four times as much bandwidth to each neighborhood as GPON. An ISP could maintain the same level of service as GPON by packing 128 customers onto each XGS-PON card. But network engineering is never that nicely linear, and there are a number of factors to consider when designing a new network.

All ISPs rely on oversubscription when deciding the amount of bandwidth needed for a given portion of a network. Oversubscription is shorthand for taking advantage of the phenomenon that customers in a given neighborhood rarely all use the bandwidth they’ve been assigned, and never all use it at the same time. Oversubscription allows an ISP to feel safe in selling gigabit broadband to 32 customers in a GPON network and knowing that collectively they will not ask to use more than 2.4 gigabits at the same time. For a more detailed description of oversubscription, see this earlier blog. There are ISPs today that put 64 customers or more on a GPON card – the current capacity is up to 128 customers. ISPs understand that putting too many customers on a PON card will start to emulate the poor behavior we see in cable company networks that sometimes bog down at busy times.
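A rough sketch of that oversubscription arithmetic, using nominal PON capacities and assuming a gigabit product sold to every customer for illustration:

```python
def oversubscription_ratio(customers, sold_mbps, pon_capacity_mbps):
    """Ratio of total bandwidth sold to the shared capacity of one PON."""
    return (customers * sold_mbps) / pon_capacity_mbps

# 32 gigabit customers sharing a nominal 2.4 Gbps GPON port.
gpon_ratio = oversubscription_ratio(32, 1000, 2400)

# 128 gigabit customers sharing a nominal 10 Gbps XGS-PON port gives
# roughly the same level of oversubscription - the "four times" math.
xgs_ratio = oversubscription_ratio(128, 1000, 10000)

print(f"GPON, 32 customers: {gpon_ratio:.1f}:1 oversubscribed")
print(f"XGS-PON, 128 customers: {xgs_ratio:.1f}:1 oversubscribed")
```

Both configurations come out a little over 13:1 and a little under 13:1 respectively, which is why the two are considered roughly equivalent levels of service.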

Most GPON networks today are not overstressed. Most of my clients tell me that they can comfortably fit 32 customers onto a GPON card and only rarely see a neighborhood maxed out in bandwidth. But ISPs do sometimes see a PON that gets overstretched if there are more than a few heavy users in the same PON. The easiest solution to that issue today is to reduce the number of customers in a busy PON – such as splitting into two 16-customer PONs. This isn’t an expensive issue because over-busy PONs are still a rarity.

ISPs understand that, year after year, customers use more bandwidth and engage in more data-intensive tasks. Certainly, a PON with half a dozen people now working from home is a lot busier than it was before the pandemic. It might be years before a lot of neighborhood PONs get overstressed, but eventually, the growth in bandwidth demand will catch up to the GPON capacity. As a reminder, the PON engineering decision is based on the amount of demand at the busiest times of the day. That busy-hour level of traffic is not growing as quickly as the overall bandwidth used by homes – which more than doubled in just the last three years.
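To put numbers on that, overall usage doubling over three years implies a compound annual growth rate of about 26%, while a slower busy-hour growth rate (the 15% here is purely an assumption for illustration) stretches the doubling time out to about five years:

```python
import math

# If overall household usage doubled over three years, the implied
# compound annual growth rate is 2^(1/3) - 1, or about 26% per year.
overall_annual_growth = 2 ** (1 / 3) - 1

# Busy-hour traffic grows more slowly. At a hypothetical 15% per year,
# the doubling time is log(2) / log(1.15), roughly five years.
busy_hour_growth = 0.15  # assumption, not a measured figure
years_to_double = math.log(2) / math.log(1 + busy_hour_growth)

print(f"Implied overall growth: {overall_annual_growth:.1%} per year")
print(f"Busy hour doubles in about {years_to_double:.1f} years at 15%/year")
```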

There are other considerations in designing an XGS-PON network. Today, the worst that can happen with a card failure is for 32 customers to lose broadband. It feels riskier from a business perspective to have 128 customers sharing a PON card – that’s a much more significant network outage.

There is no magic metric for an ISP to use. You can’t fully trust vendors, since they will sell more PON cards if an ISP is extremely conservative and puts only 32 customers on a 10-gigabit PON. But ISP owners might not feel comfortable leaping to 128 or more customers on a PON. There are worse decisions to have to make, because almost any configuration of PON oversubscription will work on a 10-gigabit network. The right solution will balance making sure that customers get the bandwidth they request without being so conservative that the PON cards are massively underutilized. Over time, ISPs will develop internal metrics that fit their service philosophy and the demands of their customer base.

The Battle for IoT

There is an interesting battle going on to be the technology that monetizes the control of Internet of Things devices. Like a lot of tech hype, IoT has developed a lot more slowly than originally predicted – but it’s now finally becoming a big business. I think back to a decade ago when tech prognosticators said we’d soon be living in a virtual cloud of small sensors that would monitor everything in our lives. According to those early predictions, our farm fields should already be fully automated, and we should all be living in the smart home envisioned on The Jetsons. Those predictions probably say more about the tech press that hypes new technologies than about IoT.

I’ve been noticing an increasing number of press releases and articles talking about different approaches to monetizing IoT traffic. The one we’ve all heard the most about is 5G. The cellular companies told Wall Street five years ago that the monitoring of IoT devices was going to fuel the 5G business plan. The wireless companies envisioned households all buying a second cellular subscription to monitor devices.

Except for a few minor examples, this business plan never materialized. I was reminded of it this week when I saw AT&T partnering with Smart Meter to provide patient monitoring for chronic conditions like diabetes and high blood pressure. The monitoring devices worn by patients include a SIM card, and patients can be monitored anywhere within range of a cellular signal. It’s a great way for AT&T to monetize IoT subscriptions – in this case, with monthly fees likely covered by health insurance. It sounds like an awesome product.

Another player in the IoT world is LEO satellites. In August of last year, Starlink made a rare acquisition by buying Swarm. Swarm envisions using satellites to monitor outdoor IoT devices anywhere in the world. The Swarm satellites weigh less than a pound each, and the Swarm website says the goal is to have three of these small satellites in range of every point on earth by the end of 2022. That timeline slowed due to the purchase by Starlink, but this could be a huge additional revenue stream for the company. Swarm envisions putting small receivers in places like fields. As with Starlink, customers must buy the receivers, and there is an IoT data monitoring plan that allows the collection of 750 data packets per month for a price of $60 per year.

Also still active in pursuing the market are a number of companies promoting LoRaWAN technology. This technology uses tall towers or blimps and CBRS or some other low-power spectrum to communicate with IoT monitors over a large geographic area. The companies developing this technology can be found at the LoRa Alliance.

Of course, the current king of IoT is WiFi. Charter recently said it is connected to 5 billion devices on its WiFi network. WiFi has the advantage of a free IoT connection for the price of buying a broadband connection.

Each of these technologies has a natural market niche. The AT&T health monitoring system only makes sense on a cellular network since patients need to be monitored everywhere they go during the day. Cellular should be the go-to technology for mobile monitoring. The battle between LoRaWAN and satellites will be interesting and will likely eventually come down to price. Both technologies can be used to reach farm fields where cellular coverage is likely to never be ubiquitous. WiFi is likely to carry the signals from the devices in our homes – the AT&T vision of everybody buying an IoT cellular data plan sounds extremely unlikely since we all can have the same thing for the cost of a WiFi router.

Add Affordability to the Definition of Broadband

We spend an inordinate amount of time in the industry fixating on whether broadband connections meet certain speeds. There are rural communities all over the country asking residents to take speed tests right now hoping to prove that their broadband is inadequate. What I find sad is that every community that is doing this already knows the answer before they start gathering speed tests – they know their rural broadband is inadequate for the way that people want to use it.

I was glad to be reminded recently by a Leichtman Research Group survey that 45% of homes have no idea what broadband speed they are supposed to be getting – or what they are actually getting. I’ve seen the same thing in the numerous surveys we’ve done – people don’t fixate on Internet speeds, and most just care whether their home Internet works.

The surveys we coordinate for communities show one other issue that people care about, which is the price of broadband. In the hundreds of communities where we’ve done surveys, we almost always see that at least half of respondents think broadband is too expensive. In some communities, more than 90% of residents tell us that broadband is too expensive. When we ask people why they don’t have home broadband, the primary response in every survey is the cost of broadband.

I’ve had a half-baked idea floating around my brain for some time that might help address this problem. I’ve been wondering if prices shouldn’t be part of the definition of broadband. There is a lot of discussion about setting a new definition of broadband speed at 100/20 Mbps. But I’ve learned in working with dozens of communities that there is a huge difference between a 100/20 Mbps connection that costs $55 and one that costs $85. As far as the public is concerned, these are not the same product – but we pretend that they are.

We know that as broadband prices increase, households are forced to drop broadband or downgrade to a slow-speed but cheaper alternative like DSL. One thing I’ve learned through surveys is that people in cities keep DSL because it costs less. People are not happy with the inferior performance of DSL, but they choose to spend $50 for a slow DSL connection because they can’t afford a 100/20 Mbps connection from Comcast or Charter.

Some of the new federal grants pay lip service to affordable rates, but none of these rules have any teeth. In fact, the giant BEAD grant legislation specifically prohibits the NTIA from suggesting broadband rates. (Any bet that this language came from the cable companies?) Some grants reward applicants for having a low-income broadband product, but they don’t insist that ISPs put any energy into marketing a low-income plan.

I don’t have specific metrics in mind, but there must be a way to weave affordability into the definition of broadband. Perhaps it’s something like the following. If 100/20 Mbps is the speed definition of broadband, then maybe connections under $60 would be considered served, connections priced between $60 and $80 would be underserved, and prices over $80 would be considered unserved. The price tiers I’ve chosen are arbitrary and open for debate, but the concept is that grants and subsidies ought to favor ISPs with affordable rates.
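As a sketch of how such a price-based classification might work – using my arbitrary tiers, purely for illustration:

```python
def price_tier(monthly_price):
    """Classify a 100/20 Mbps connection by affordability, using the
    illustrative (and admittedly arbitrary) tiers from the text."""
    if monthly_price < 60:
        return "served"
    elif monthly_price <= 80:
        return "underserved"
    else:
        return "unserved"

# A $55 and an $85 connection would be treated as different products.
for price in (55, 70, 85):
    print(f"${price}/month -> {price_tier(price)}")
```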

I know that real life is a lot more complex than my simple example. The big cable companies disguise broadband prices by hiding them in bundles. ISPs have jacked up the cost of the modem so that the basic broadband price sounds cheaper. In those markets lucky enough to have competition, the big ISPs offer special low rates for those willing to ask, while still billing the full rates to everybody else.

There is nothing that scares the big cable companies more than talking about regulating broadband prices. This was the main motivation for deregulating broadband at the FCC – cable companies don’t want regulators looking too closely at broadband prices. But prices matter at least as much as speeds, and for millions of homes, price is everything.

I know the big ISPs will say that the new Affordable Connectivity Program (ACP) will take care of this issue – but it doesn’t. Not all ISPs are going to take part in the program, and many of the ones that do will not market it to customers. Besides, how good a deal is it to get $30 off the basic Comcast broadband product that costs $90? The price after the ACP subsidy is still out of reach for many homes and is still more expensive than urban DSL – and many of the homes without broadband today can’t even afford DSL. And there is no guarantee that the ACP won’t die in a few years when the funding runs dry.

I have no easy answer for this issue, but I hope this blog might plant a seed with somebody who can figure this out. I know that we have to stop ignoring the fact that prices matter. I’ve said for years that the big cable companies are on a path towards $100 broadband, and they are getting closer every year. Let’s stop pretending that $90 or $100 broadband is the same product as $50 or $60 broadband – even if the speeds are the same.

Mapping and Broadband Grants

Hopefully by now, most communities with poor broadband will have heard about the gigantic federal grants on the way to provide broadband solutions. The largest is the $42.5 billion BEAD (Broadband Equity, Access, and Deployment) grant program that will be administered by states, with the funding and the rules established by the NTIA.

There is one provision in the enabling legislation that established these grants that makes me nervous and should concern everybody. The federal grants give priority to locations that are unserved (broadband speeds under 25/3 Mbps) and can also be used to fund underserved locations (speeds between 25/3 Mbps and 100/20 Mbps). The troubling provision is that Section 60102 of the legislation makes it clear that the determination of eligible locations will rely upon the FCC maps.
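The speed tiers themselves are simple to express. Here is a quick sketch treating the tiers exactly as described above – the hard part, as the rest of this post argues, is trusting the reported speeds that get fed in:

```python
def grant_eligibility(down_mbps, up_mbps):
    """Classify a location under the BEAD speed tiers described above."""
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"       # below 25/3 Mbps - priority funding
    elif down_mbps < 100 or up_mbps < 20:
        return "underserved"    # between 25/3 and 100/20 Mbps
    else:
        return "served"         # at or above 100/20 Mbps - not eligible

print(grant_eligibility(10, 1))     # unserved
print(grant_eligibility(50, 10))    # underserved
print(grant_eligibility(300, 30))   # served
```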

Commerce Secretary Gina Raimondo, whose agency oversees the NTIA and the grant program, acknowledges that this is a problem. In an interview with CNBC, the Secretary admitted that grants might not be awarded until sometime in 2023, after the FCC maps have been updated.

I think it’s a huge problem if we need corrected FCC maps before we can decide which parts of the country are eligible for these grants. I fully expect the first version of the new FCC maps to be a disaster. ISPs will struggle with changing from reporting simple tallies by Census block to drawing complicated polygons around customers who are served or who can be served within ten days after a request. There will be a lot of honest mistakes made in the first few iterations of the new mapping as ISPs adapt to the new reporting methodology. It might take a few rounds of reporting until ISPs get the new maps right.

But the real problem will come from the big telcos who distort the current broadband maps by over-reporting broadband speed capabilities. Several of the big telcos have notoriously been reporting speeds of 25/3 Mbps or greater to shield monopoly areas from grants. ISPs today are largely free to claim any speeds they want. The current FCC rules say that ISPs can report marketing speeds – something the ISP can determine. There are huge parts of the country where speeds of 25/3 Mbps are claimed in the FCC maps when actual speeds might be a few Mbps.

The new FCC maps will not stop this practice. The big telcos can still claim fast speeds that don’t exist. In fact, if it stops the award of the BEAD grants, the telcos are likely to report even more areas as having 25/3 Mbps capability. Recall that just before the RDOF reverse auction, Frontier and CenturyLink tried to change tens of thousands of Census blocks to speeds of 25/3 Mbps, which would have kept those areas out of the RDOF auction. The FCC rejected most of these claims, but the attempt demonstrates the lengths the big ISPs are willing to go to keep customers and revenues.

There is currently no penalty at the FCC for overreporting speeds, and some of the big telcos have found this to be a convenient tool for maintaining monopoly service areas. The BEAD grants might be the last big grant program for a long time, so I can’t see any reason why the big telcos would suddenly become honest and create accurate maps.

We’ve seen in the challenges to the current NTIA grants that the big telcos have no shame. The telcos have been challenging grant eligibility in huge numbers of Census blocks where they know that speeds are poor – these grant challenges are all about keeping out competition.

If there is any glimmer of hope, it’s that the BEAD grant funds will funnel through the states. Many of the states that already have grant programs have become tired of the games played by telcos and don’t pay them much heed. There are some states that have created their own broadband maps that they believe to be more accurate than the current FCC maps. We’ll have to see how much leeway the NTIA will allow for states to use the better mapping data. Unfortunately, the grant language in the BEAD legislation is fairly clear that the mapping that matters for these grants is the FCC mapping. The grant legislation says that broadband data maps are the maps created under section 802(c)(1) of the Communications Act of 1934 (47 U.S.C. 642(c)(1)).

It’s already a shame that the mapping issue will delay the BEAD grants from being awarded this year when millions of homes are waiting for better broadband. This mapping issue will easily add six months to a year before grants are awarded – and that means a longer time until a broadband solution is deployed. An even worse travesty will come if parts of the country continue to be denied grant funding due to dishonest maps. I think we’re only going to have one chance to get this right – and I’m not optimistic about this first basic step of defining who is eligible for the grants. I hope somebody proves me wrong.

Top Broadband Stories of 2021

Every year I write a blog talking about the trends that I think we’re likely to see in the coming year. But 2021 was such an unusual year for all of us that I thought it would also be useful to talk about what we accomplished in the industry over the last year while fending off a pandemic. All in all, it was quite a year.

Broadband Funding. This was the year when the federal government finally reacted to the poor broadband in many parts of the country and pulled the trigger on huge broadband grants. The year started on the heels of the CARES Act funding from 2020. The American Rescue Plan Act sent money to every community that can be used for broadband. That legislation also funded grant programs at the RUS and the NTIA. The big grant announcement was the $42.5 billion in broadband infrastructure funding from the Infrastructure Investment and Jobs Act. States across the country have chipped in with state broadband grants funded by legislatures. 2021 was the year when the funding spigots were opened wide.

Technology Getting Faster. 2021 was the year when XGS-PON became price competitive with GPON, and we’re starting to routinely talk about new FTTP networks as 10-gigabit capable. Fixed wireless technology has been improving, but the jury is still out on claims made in 2021 for being able to deliver rural gigabit wireless. Several cable companies did field trials of early DOCSIS 4.0 technology last year – the technology that will bring gigabit upload speeds to coaxial networks. Starlink showed us last year that satellite broadband doesn’t have to suck.

Supply Chain Becomes an Issue. I don’t recall ever hearing the term supply chain in the broadband industry before 2021 – and now it’s on everybody’s lips. Supply chain issues became real in 2021. ISPs ran into sudden long waits for basic electronics like switches and servers. During the year, the delivery times for fiber grew longer. And as always happens in times of shortages, by the end of the year, it became obvious that the biggest ISPs are still getting what they need while new ISPs at the bottom of the supply chain are told to wait a year to buy fiber.

BIG ISPs Interested in Rural America. I find it dismaying and somewhat ironic to see the big telcos and even a few of the big cable companies taking a sudden serious interest in serving rural America. The telcos started to ignore rural copper networks as far back as the 1980s, and their collective neglect led directly to the current dreadful state of rural broadband that we are now attempting to fix. The new interest in rural America is clearly due to the gigantic grant programs. Since the big grants are going to be funded through states, I guess we’ll find out if anybody wants to trust these companies yet another time.

Expansion of WiFi. When we look back twenty years from now, the expansion of WiFi spectrum might be seen as the most important development of 2021. The FCC originally voted to add 6 GHz spectrum to WiFi in April 2020, but the order was appealed by cellular carriers that wanted the spectrum for 5G. At the very end of 2021, the courts sided with the FCC, finally allowing the use of 6 GHz WiFi to move ahead. WiFi is the wireless solution the world needs. You buy a box and can transmit wireless broadband around the home or office – the alternative is to buy subscriptions from cellular carriers. WiFi 6 and the 6 GHz spectrum are going to take the technology to a new level.

Private Equity Finds Broadband. 2021 saw private equity money pouring into the broadband market. The big flashy announcement was when Apollo Global Management bought the copper assets of CenturyLink in twenty states. But more quietly, there is private equity money being used to buy smaller ISPs and to launch new fiber networks. It’s an interesting phenomenon when you consider that none of the fundamental aspects of the market have changed. Broadband networks are never likely to earn more than infrastructure returns, and the sudden interest in investing in these slow-return businesses is baffling.

Regulation Went Nowhere. There was big anticipation at the end of 2020 that a change of administration meant a new regulatory direction for the broadband industry. But inexplicably, almost all of 2021 went by without the White House naming a fifth FCC Commissioner or a new head of NTIA. 2021 was a year of regulatory status quo where the FCC concentrated on issues that have bipartisan support like awarding new spectrum and trying to fix robocalling.

ARPA is Not Just for Rural Broadband

FCC Commissioner Brendan Carr released an extraordinary statement the other day that is worth reading. Carr takes exception to the final rules from the Treasury Department concerning how communities can use the $350 billion in funding from the American Rescue Plan Act (ARPA). Carr is asking states to somehow intervene in the way that cities, counties, and towns elect to use these funds.

As a reminder, the $350 billion he is talking about is funding that is being given directly to states, cities, counties, and townships. The money is not just for broadband and is intended to help local governments combat issues related to the pandemic.

Broadband is listed as an acceptable use of these funds since most communities had broadband-related problems during the pandemic as many millions were sent to work and school from home. But the money can also be used for many other purposes such as supporting the public health response to the pandemic, addressing negative economic impacts, replacing lost local government tax revenues that came as a result of the pandemic, covering premium pay for essential workers, and making investments in water and sewer infrastructure. The large majority of this funding is going to go to needs other than broadband.

Commissioner Carr starts with the statement that “the Administration’s rules green-light spending to overbuild existing, high-speed networks in communities that already have fast Internet service, rather than directing those dollars to the rural and other communities that lack access to any broadband service today.”

I take exception to this sentence for several reasons. First, I think the final Treasury rules are following the intent of Congress that wrote the enabling legislation. Congress included broadband as a possible use for the funds. If Congress had intended this funding to be used only for rural broadband, the legislation would have said so. But broadband is listed as an acceptable use for every community, including cities. I’m not sure how Commissioner Carr thinks that ARPA money given to Detroit, Baltimore, or New York City could be used to support rural broadband.

A lot of the funding is going to rural communities, and I know many communities are aiming this funding at areas with poor broadband. But I think cities contemplating using this funding also believe they are helping to solve the digital divide. In every city, there are places where cable companies never built broadband, and there are many millions more homes that can't afford broadband. Most of the urban initiatives I've seen for using ARPA funding are aimed at building infrastructure to serve public housing or at bringing broadband to students who don't have home broadband. Commissioner Carr says those kinds of projects deviate from the intent of ARPA, and I have to disagree.

Commissioner Carr also doesn't think this money should be used for overbuilding. I always get my hackles up when I hear that word, because the big ISPs have been using the word overbuilding as a pejorative for many years. Looking back to the days when there were federal grants earmarked to bring better broadband to areas with broadband speeds under 10/1 Mbps, the big ISPs fretted that the money would be used to overbuild existing rural ISPs. The big ISPs don't think any federal funding should ever be used to overbuild any existing ISP – the big ISPs are in favor of maintaining monopolies. Whenever I see the word overbuild coming from a big ISP, I just substitute the correct word – competition. When Congress added broadband as an acceptable use for the ARPA funding, it obviously intended that the money could be used to compete (overbuild) against ISPs that weren't delivering the broadband households needed during the pandemic.

I must admit that I got a good laugh out of Commissioner Carr's warning that "the Treasury rules allow these billions of dollars to be spent based on bad data." The final Treasury rules do the exact opposite by allowing communities to ignore the FCC's notoriously bad broadband data when determining where to spend the money.

I opened the blog by calling this an extraordinary statement because I'm not sure why he wrote it. Commissioner Carr's plea to the states doesn't mean much since local communities are free to use the ARPA funds without any approval from the states. It's just a guess, but perhaps Commissioner Carr is upset that the FCC has no role in this spending. This funding was created by Congress and given to the Treasury Department and to communities directly in what looks like a deliberate snub of the FCC. The FCC got snubbed again more recently when Congress decided to send the $42.5 billion in BEAD grants to the states to spend.

When Will We See Real 5G?

The wireless industry's non-stop claims that we've moved from 4G to 5G finally slowed to the point that I stopped paying attention during the last year. There is an interesting article in PC Magazine that explains why 5G has dropped off the front burner.

The article cites interviews with Art Pouttu of Finland’s University of Oulu about the current state and the future of 5G. That university has been at the forefront of the development of 5G technology and is already looking at 6G technology.

Pouttu reminds us that a new 'G' generation of wireless technology arrives about every ten years, but that it takes twenty years for the market to fully embrace all of the benefits of a new generation of wireless technology.

We are just now entering the heyday of 4G. The term 4G has been bandied about by wireless marketing folks for so long that it's hard to believe we didn't see a fully functional 4G cell site until late in 2018. Since then, the cellular companies have beefed up 4G in two ways. First, the technology is now spread through cell sites everywhere. But more importantly, 4G systems have been bolstered by the addition of new bands of cellular spectrum. The marketing folks have gleefully been labeling this new spectrum as 5G, but the new spectrum is doing nothing more than supporting the 4G network.

I venture to guess that almost nobody thinks their life has been drastically improved because 4G cellphone speeds have climbed in cities over the last few years from 30 Mbps to over 100 Mbps. I can see that faster speed on my cellphone if I take a speed test, but I haven’t really noticed much difference between the performance of my phone today compared to four years ago.

There are two major benefits from the beefed-up 4G. The first benefits everybody but has gone unnoticed. The traditional spectrum bands used for 4G were getting badly overloaded, particularly in metropolitan areas. The new bands of spectrum have relieved the pressure on cell sites and are supporting the continued growth in cellular data use. Without the new spectrum, our 4G experience would be deteriorating.

The new spectrum has also enabled the cellular carriers to all launch rural fixed cellular broadband products. Before the new spectrum, there was not enough bandwidth on rural cell sites to support both cellphones and fixed cellular customers. The many rural homes that can finally buy cellular broadband that is faster than rural DSL are the biggest winners.

But those improvements have nothing to do with 5G. The article points out what has always been the case. The promise of 5G has never been about better cellphone performance. It’s always been about applications like using wireless spectrum in complex settings like factories where feedback from huge numbers of sensors needs to be coordinated in real-time.

The cellular industry marketing machine did a real number on all of us – but perhaps most of all on the politicians. We’ve had the White House, Congress, and State politicians all talking about how the U.S. needed to win the 5G war with China – and there is still some of that talk going around today. This hype was pure rubbish. What the cellular carriers needed was more spectrum from the FCC to stave off the collapse of the cellular networks. But no cellular company wanted to crawl to Congress begging for more spectrum, because doing so would have meant the collapse of cellular company stock prices. Instead, we were fed a steady diet of false rhetoric about how 5G was going to transform the world.

The message from the University of Oulu is that most 5G features are probably still five or six years away. But even when they finally get here, 5G is not going to bring much benefit or change to our daily cellphone usage. It was never intended to do that. We already have 100 Mbps cellular data speeds with no idea how to use the extra speed on our cellphones.

Perhaps all we've learned from this experience is that the big cellular companies have a huge amount of political and social clout and were able to pull the wool over everybody's eyes. They told us that the sky was falling and could only be fixed with 5G. I guess we'll find out in a few years if we learned any lesson from this because we can't be far off from hearing the hype about 6G. This time it will be 100% hype because 6G deals with the use of extremely high frequencies that will never be used in outdoor cellular networks. But I have a feeling that we'll find ourselves in a 6G war with China before we know it.

Challenges to Broadband Grants

One of the most annoying aspects of the current federal broadband grants is the ability of incumbent ISPs to challenge the validity of grant requests. In the typical challenge, the incumbents claim that they are offering fast broadband and that an applicant should not be able to overbuild them.

This is another issue that can be laid squarely at the feet of the lousy FCC broadband maps. ISPs are largely free to claim any broadband speeds they want, and grant challenges give them leverage. The challenges put a burden on anybody filing for a grant, since applicants must somehow prove that incumbent broadband speeds are slower than 25/3 Mbps.

This is not new behavior by the incumbents. You might recall that before the RDOF auction in 2020, Frontier and CenturyLink together tried to claim they were delivering speeds of at least 25/3 Mbps to tens of thousands of additional Census blocks. The goal was to eliminate these locations from the RDOF auction so that the telcos could preserve their broadband monopoly. The FCC largely rejected the last-minute changes by the telcos. But telco speed capabilities had already been overstated across huge areas before RDOF, with the result that a large number of Census blocks were incorrectly kept out of the auction.

A recent article by Karl Bode for Community Networks highlights some specific examples of challenges bogging down the current round of NTIA broadband grants. He cites the example of a grant application made in Grafton County, New Hampshire, where the incumbents challenged the speeds for 3,000 Census blocks in a grant covering 4,000 blocks.

Grafton County had collected speed tests that showed that existing broadband speeds are mostly far below the 25/3 Mbps threshold. But this still puts the burden on the grant applicant to somehow document broadband speeds for each of the many Census blocks. The incumbents are using the challenges to weaponize the lousy data included in the FCC’s broadband maps.

This is often a ludicrous situation. Applicants like Grafton County are seeking to build fiber broadband because they have already heard repeatedly from residents about the poor broadband in the area.

There are easy and obvious fixes to this. One simple fix would be to exempt from challenges any grant that proposes to build fiber over existing DSL. There is no place in rural America where DSL is delivering adequate broadband.

Another easy fix would be to stop treating 25/3 Mbps as a meaningful definition of broadband. If these grants only allowed challenges against claims of 100/20 Mbps, then all of the challenges from telcos would be neutered. But there would still be battles like the one in Grafton County, where cable companies are delivering slow speeds and challenging the grants. Setting the definition of broadband to a faster speed, even if only for the purposes of these grants, would eliminate much of the wasted energy in handing out grant funding. The folks taking the brunt of these challenges are the staff in the various broadband grant offices. The shame of the challenge process is that there probably are some legitimate challenges being made, but they get lost in the huge volume of harassment challenges.

Unfortunately, these challenges are in place for a reason that surprises nobody. When the legislation enabling grants moves through Congress, the incumbents get to sneak innocuous-sounding language into the grant rules that later manifests as chaos during the grant process. The upcoming BEAD grant rules also include a challenge process, so we're going to see this repeated. If there were a huge number of challenges in the $288 million NTIA grant program, it's hard to imagine what we're going to see with the $42.5 billion BEAD grant program, which is granting roughly 150 times more in funding.
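For readers who want to check the scale difference for themselves, here is a quick back-of-the-envelope calculation using only the dollar figures mentioned above:

```python
# Back-of-the-envelope: how much larger is BEAD than the NTIA grant program?
ntia_program = 288e6    # $288 million NTIA broadband grant program
bead_program = 42.5e9   # $42.5 billion BEAD grant program

ratio = bead_program / ntia_program
print(f"BEAD is roughly {ratio:.0f}x the size of the NTIA program")  # roughly 148x
```

The exact ratio works out to just under 148, which the blog rounds to "roughly 150 times" the funding.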