The Growing Risk of Space Debris

Low Earth orbit is growing increasingly crowded. Starlink has over 7,100 satellites in orbit and plans to grow to 30,000. Project Kuiper plans a constellation of 3,232 satellites. OneWeb's first-generation constellation has 648 satellites, with plans to grow to over 6,300. The Thousand Sails (Qianfan) constellation is planning for up to 15,000 satellites. Numerous countries and businesses are launching or planning smaller constellations for a wide variety of purposes.

The European Space Agency issued a recent report that looks at the debris issue. Low-orbit space is already full of debris, and space scientists worry that collisions between satellites will create much more. NASA scientist Don Kessler warned about this decades ago, and the Kessler syndrome describes a scenario where a major satellite collision triggers a chain reaction, with each collision creating debris that causes more collisions until an orbital altitude becomes unusable.

NASA released a comprehensive financial analysis of space debris in 2023. The report says there may already be 170 million pieces of space debris, yet we're only able to track around 55,000 objects today, a count that includes live and dead satellites, dead rocket boosters, and relatively large pieces of debris. It's already becoming routine for operators to move satellites to avoid collisions. In one scary incident, the Starlink 44 satellite was on a collision course with the European Space Agency (ESA) Aeolus satellite. ESA was able to make a last-minute course correction, but reported that it was unable to make contact with Starlink during the incident.

The only real solution is to have a comprehensive effort to remove debris. There are now worldwide conferences of space scientists who are concentrating on the debris issue.

There are a number of possible solutions for removing space debris. The most important step is to make sure that every object that is launched has the ability to be intentionally deorbited. Starlink has done a good job of this, but not every satellite has this ability. Deorbiting has its own risks to the planet, and scientists warn that aluminum oxide from deorbiting satellites can damage the ozone layer.

Another tactic being explored is to repurpose existing satellites by visiting them to refuel and to swap components to modernize as needed. This would save on expensive new launches and the problems from deorbiting materials. Scientists are envisioning robotic satellites that service other satellites.

Because of the huge volume of existing debris, there also needs to be some method of scrubbing the sky of debris. Active debris removal would involve satellites that snag debris using robotic arms, tethers, nets, or space tugs for large debris. Scientists think it’s possible to use space-based lasers to zap and evaporate small debris. There is also the possibility of using ground-based large lasers to nudge larger debris into deorbiting.

But as with most issues, the question is who is going to pay? The only worldwide agreement on the issue is the Outer Space Treaty, signed by multiple parties in 1967. That treaty says the country responsible for creating debris is responsible for removing it. The treaty is far out of date, since the large majority of space objects are now being launched by commercial companies rather than governments.

Somebody will have to start the effort to pay for this before it’s too late and big swaths of orbits become unusable. Governments could chip in to do this. There could somehow be requirements for commercial operators to pay for this – increasing the cost of operating satellite constellations. Scientists believe that satellite companies have an incentive to see this done, but we’ve seen too many other industries that have pawned off their environmental costs to others.

What’s Going on With Boost Mobile?

Boost Mobile added 90,000 net new cellular customers in the first quarter of 2025, increasing its customer count to 7.1 million. That's a big turnaround from recent quarters of customer losses. Reaching positive net additions might mean the company is finally turning the corner toward becoming successful.

Boost Mobile was acquired by Dish Network as a result of the merger of T-Mobile and Sprint. One of the FCC requirements was that Sprint be replaced in the market with a new nationwide carrier, and the FCC took steps to enable Dish to become that carrier.

When Dish acquired Boost Mobile for $1.4 billion in 2020, the company had over 9 million customers, but customers have slowly leaked away since then. During those years, Dish has been deploying a new 5G SA (standalone) network and now claims to be able to cover 80% of the people in the country from its own cell sites. Boost is deploying with open RAN technology, meaning the company isn't locked into the specific hardware and software from the big cellular vendors.

Boost met its first buildout requirement by reaching 70% of the U.S. population by June 2023. However, Boost asked for and received a delay for deploying four spectrum blocks (AWS-4, lower 700 MHz E Block, 600 MHz, and AWS H Block) until December 14, 2026 instead of June 14, 2025.

Boost Mobile still has a way to go to activate traffic on its newly built network. Most of its customers are still roaming on AT&T and T-Mobile. In the 4Q 2024 earnings call for parent Echostar, executives from Boost Mobile admitted that only 1 million customers were riding the Boost network.

A recent article from Ookla documents that the Boost Mobile networks are getting faster but still don't match the average speeds of T-Mobile, Verizon, and AT&T. However, speeds are improving, and Boost says it will have the fastest network in some major markets this year. Near the end of last year, Boost was named the fastest cellular provider in New York City. I have to wonder how much of that speed is due to having a largely empty network.

Boost Mobile has a long way to go to be relevant. At the end of the first quarter of this year, Verizon claimed 146 million customers, T-Mobile 131 million, and AT&T 118 million. Boost is also behind the two big cable companies, with Charter having 10.4 million cellular customers and Comcast 8.2 million.

It's interesting that customers have not moved to Boost Mobile. The company is offering competitive prices, and one would have to think that its network is relatively empty and nearly pristine. Dish made a public relations blunder when it opened cell sites a few years ago before the open RAN technology was working well. If Boost is now on solid technical footing, there is opportunity for growth. There has been a lot of press and speculation over the last year that T-Mobile and Verizon might be overstressing their networks due to the proliferation of FWA home cellular broadband.

To add to the drama, EchoStar, the parent of Boost, announced last week that it is electing to miss a $326 million interest payment on its debt maturing in 2029. If the company doesn't make the payment by the end of June, it will be forced into Chapter 11 bankruptcy. EchoStar may be playing chicken with the FCC and is blaming the default on the FCC's failure to resolve some of the company's open spectrum issues.

One thing is for sure. Assuming it survives, Boost Mobile has a long way to go to be a serious nationwide carrier. The company may never reach the size of Sprint, which it is supposedly replacing. It will be interesting to watch whether the company can reach solvency and justify the big investment made in the new nationwide network.

Closing Copper Networks

In a precursor to the headlines we'll be seeing in the U.S., Telefónica announced on May 27 that it has shut off its copper telephone network in Spain. Telefónica today has $41 billion in revenues across its broadband, voice, and cellular businesses. The company was the legacy monopoly telephone company in Spain, founded in 1924, and still serves a major share of the telecom market.

The company began replacing copper with fiber in 2009 in reaction to a European Union order to unbundle copper networks to allow competition – an order that did not apply to fiber. Closing the copper networks also means the wholesale business has ended for companies that were using unbundled copper access.

The company was able to decommission the first two of its 8,500 exchanges in 2014, and the pace has accelerated since then, with 4,300 exchanges shut in 2024. Telefónica claims to be the first telephone company in Europe to have completed the transition from copper to fiber.

Unlike here, the European Union has strong rules that insist that every dropped copper customer has access to another source of broadband and voice – mostly fiber, but also wireless. The EU allows remote locations to be transitioned to satellite broadband. In the U.S., telcos are supposed to find an alternative for customers but often just wink at the requirement.

AT&T made a formal announcement in January of this year that it intends to get rid of copper everywhere except California by the end of 2029. The company had been quietly decommissioning copper long before that formal announcement. AT&T will likely have a prolonged battle with the California Public Utilities Commission before it can tear down copper there.

AT&T said in its announcement that it will offer an alternate technology to customers – either fiber or wireless. AT&T announced plans at the end of last year to build 45 million additional fiber passings on the same 2029 deadline as killing copper. Cities are anxiously waiting to see if AT&T will really build fiber in the poorest neighborhoods and in places where fiber construction is very expensive. It seems more likely that the company will get 80% fiber coverage in some cities and call it good.

Rural areas are another matter. In rural areas, AT&T will offer FWA cellular broadband as an alternative to copper. But FWA technology has two major shortcomings in rural areas. First, there is zero cellular coverage in huge parts of rural America, and even less coverage when you account for FWA customers needing to be within a few miles of a cell tower. Second, even close to a cell tower, there is always the question of whether a given tower has the capacity to accept a lot of new FWA customers. There are already stories in the press of rural customers losing copper coverage with no wireless alternative.

The FCC recently changed rules to make it easier for legacy telcos to walk away from copper networks. These changes were adopted by the FCC’s Wireline Competition Bureau, meaning they didn’t come to the full Commission for a vote.

One rule change allows a telco to turn off copper wires without having to conduct a test to first see if a replacement technology can take over the functions that were being performed by copper. This rule clarification says a telco can justify tearing down the old network if the “totality of the circumstances” proves that the change is needed. That seems to provide justification for tearing down copper as long as some adequate number of homes in an area will have a replacement.

Another rule change puts a two-year moratorium on the requirement that telcos disclose and seek public comment before closing copper networks.

Of course, to speed things along even faster, there is a good chance that all rules pertaining to copper networks will be scrapped during the Delete, Delete, Delete process.

Shut Down the FCC?

Thomas Lenard of the Technology Policy Institute wrote an op-ed for the Wall Street Journal suggesting that it's time to shut down the FCC. The Technology Policy Institute is a well-regarded think tank that concentrates on advancing knowledge to inform policymakers. The FCC recently initiated the Delete, Delete, Delete effort, which asked the industry and the public whether there are unneeded FCC regulations that should be taken off the books. The response was so overwhelming that if every suggestion were implemented, there would be little left of the agency.

Mr. Lenard thinks shutting the FCC is in keeping with the Administration’s effort to streamline government by shutting down unneeded agencies. He suggests that the agency has met its original role, which was to regulate monopoly telecommunications services and to promote telecom competition.

He has a good point. The FCC was created by the Communications Act of 1934 specifically to oversee the Bell System monopoly along with radio and telegraph service. The agency watched the sunset of telegraph service and the diminishment of radio. It expanded its reach to regulate cable TV and cellular service. All of the industries overseen by the FCC are now highly competitive. Traditional cable TV is seemingly headed on the same death spiral as telegraph service and paging. At least in cities and suburbs, people have a wide array of competitive choices for telecom services. Satellite communications is rapidly growing to fill in the gaps in rural broadband, texting, and mobile voice. The FCC hasn't overseen any major new policy since implementing the Telecommunications Act of 1996.

There are precedents of federal agencies that put themselves out of business. The Civil Aeronautics Board and the Interstate Commerce Commission were closed when their original regulatory mission was no longer needed, with any remaining functions moved to the Department of Transportation. The push towards government efficiency probably means a lot more agencies will disappear.

If the FCC disappeared tomorrow, it would create a void in a few areas. Mr. Lenard suggests the useful functions could be moved to other agencies. It would make sense to move spectrum management and the Universal Service Fund to the Commerce Department under NTIA. The other big useful push from the FCC in recent years is the attempt to clamp down on robocalls and spam, and that function could be moved to the FTC.

Mr. Lenard argues that, for the most part, the FCC is now an agency looking for a reason to exist. The agency is suddenly concentrating its effort on asserting authority over content on the public airwaves, something that is far removed from the agency's stated purpose. Commissioner Simington recently suggested the FCC should regulate streaming video. When a regulatory agency begins looking for new things to regulate, it's probably at the end of its original mission.

There are those, including me, who think the FCC's remaining mission is to protect the public from telecom monopolies. But in all honesty, the agency hasn't made much effort to help the public in decades, other than perhaps with robocalling. The FCC has been a textbook example of regulatory capture, where the industries being regulated have all of the sway and influence.

The Administration proposes to shut a long list of agencies, including the Department of Education, the US Agency for Global Media, the Federal Mediation and Conciliation Service, the Woodrow Wilson International Center for Scholars, the Institute of Museum and Library Services, the US Interagency Council on Homelessness, and the Community Development Financial Institutions Fund. Is it time to add the FCC to that list?

Indoor Cellular Coverage

Ookla wrote a recent article that highlights an increasing problem of poor indoor cellular coverage. The article notes that this is a growing problem since the public increasingly relies on cell phone apps.

Indoor cell coverage is growing poorer for several reasons. First, 5G carriers are migrating to higher mid-band frequencies, which don’t penetrate buildings as well as the lower frequencies used in the past. Years ago, cellular networks widely used 700 MHz and 900 MHz frequencies, which had the wonderful property of penetrating almost anything. I remember being amazed a decade ago when I didn’t lose a cell call in an interior elevator of a building. In recent years, I’ve noticed that my cell phone doesn’t work at the back of my neighborhood grocery store – but it did a decade ago. The problem is only going to get worse as cellular carriers migrate more 5G traffic to higher mid-band spectrum bands (3 GHz and higher).

There are also some changes in buildings that make it harder for wireless signals to penetrate. Ookla cites the increasing use of low-E glass, an energy-efficient glass with a microscopic coating that reflects heat and light – and also cellular signals. Ookla says that modern insulation materials, in general, are less friendly to cell signals.

Ookla also lays some of the blame on regulations that focus entirely on outdoor cell coverage and have never acknowledged that 80% of cellular traffic originates indoors (Ookla cites Ericsson for that statistic). The FCC adopted a minimum standard for outdoor cellular speeds of 25/3 Mbps in October 2020 as part of the 5G Fund for Rural America order. Interestingly, at that time, that matched the definition of landline broadband, which was still stuck at 25/3 Mbps. A few countries like Germany and Ireland require decent indoor cellular speeds for structures like hospitals, busy business districts, and tourist attractions.

There are some solutions to the problem. One would be for regulators to require minimum indoor cellular speeds that match how people actually use their phones. That may sound like an easy fix, but it's not.

The best way to improve cellular speeds is to use small cell sites that are closer to homes and businesses. A signal from a cell site in a neighborhood will penetrate nearby buildings a lot better than a signal from a tall tower a mile or more away. One of the limiting factors of cellular signal strength that doesn't get mentioned very often is that the power from a cell site is restricted. This is done to stop neighboring tall towers from interfering with each other. Stronger signals would penetrate buildings better but would wreak havoc with existing cellular networks.

Some businesses have tackled the problem on their own. Many hotels, hospitals, and business high-rises have invested in a rooftop cellular repeater (which is really a small cell site) that beams a signal down through the building. That strengthens the signal inside a building but nowhere else.

Ookla recommends an interesting solution, which is to embrace a neutral host model of telecommunications. This would have third-party companies build, own, and operate cellular infrastructure, which would be leased to multiple service providers on a wholesale basis. Think of this as the open-access version of infrastructure. A neutral host company would build cell sites where they are most needed by the public and lease capacity to all cellular carriers. Unfortunately, that model has never been embraced in the U.S. In fact, Crown Castle, which was the predominant company chasing the neutral host model, announced in March that it is selling its small cell business and related fiber networks to Zayo and EQT.

The FCC’s 2024 Broadband Report

The FCC just released a graph and statistics-laden broadband report that summarizes the data collected by ISPs for June 2024. Of course, the data in this report is only as good as what ISPs report to the FCC, so a lot of this information, while interesting, is to be taken with a grain of salt. Let’s look at some specifics.

Figure 1 shows the trend from June 2020 until June 2024 of total fixed (broadband) and mobile (cellphone) connections. Fixed connections grew from 118.5 million in June 2020 to 132.6 million in June 2024. That's 2.4% growth from 2023 to 2024, with an average 3% annual growth over the four years. Mobile connections grew from 361.7 million in June 2020 to 416.1 million in June 2024. That's 2.5% growth from 2023 to 2024 and an average of 3.8% annual growth over the four years. The problem with the overall numbers is that we don't know for sure what ISPs are reporting. For example, if an ISP sells a gigabit connection to the landlord of an MDU, is that one broadband connection for the building, a broadband connection to every unit, or both? The FCC numbers are interesting, but the number of broadband connections is not the same as the number of broadband customers. To be fair, the FCC isn't misrepresenting the numbers, but I promise that users of this report will.
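As a sanity check on those growth rates (my own back-of-the-envelope arithmetic, not something from the FCC report), the percentages above match a simple average of the four-year growth, while compound annual growth comes out slightly lower:

```python
# Back-of-the-envelope check of the Figure 1 growth rates
# (connection counts in millions, as cited above).
def simple_avg_growth(start, end, years):
    """Total growth divided evenly across the years."""
    return (end - start) / start / years

def cagr(start, end, years):
    """Compound annual growth rate between two totals."""
    return (end / start) ** (1 / years) - 1

fixed_2020, fixed_2024 = 118.5, 132.6
mobile_2020, mobile_2024 = 361.7, 416.1

print(f"Fixed:  {simple_avg_growth(fixed_2020, fixed_2024, 4):.1%} simple, "
      f"{cagr(fixed_2020, fixed_2024, 4):.1%} compound")   # 3.0% simple, 2.9% compound
print(f"Mobile: {simple_avg_growth(mobile_2020, mobile_2024, 4):.1%} simple, "
      f"{cagr(mobile_2020, mobile_2024, 4):.1%} compound") # 3.8% simple, 3.6% compound
```

Either way you compute it, both markets are growing at roughly 3% per year.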

Figure 2(a) gets off the rails because it describes the percentage of connections at various download speeds. According to this chart, there are 7.5 million connections with speeds under 25 Mbps and 16.7 million between 25 Mbps and 100 Mbps. This is a very different story than BEAD, which is currently targeting around 13.8 million locations – 6 million under 25 Mbps and 7.8 million between 25 Mbps and 100 Mbps. There are expected differences in the counts due to grants that have been awarded but not yet constructed. But the real issue with the FCC data is that it represents the speeds that ISPs report to the FCC – not actual speeds. I, and others, have written a lot about ISPs that overreport speeds when you compare the FCC speeds in this graph with Ookla speed tests. There are a huge number of ISPs reporting exactly 100 Mbps download and delivering a lot less. Again, the FCC is not misrepresenting what these numbers are – but anybody who has dug into ISP reporting or paid any attention to state BEAD map challenges knows these numbers are not close to accurate.

Figure 2(b) does the same for upload speeds. ISPs report that 27 million locations have upload speeds under 20 Mbps, which should make them eligible for BEAD. I think the real number is a lot higher.

With all of that said, the FCC report is useful for seeing the trends in what ISPs report over time. For example, Figure 3 shows how reported gigabit connections (actually 940 Mbps) grew from 8.6 million in June 2020 to 34.4 million in June 2024.

After Figure 3 the data becomes muddled. Figure 4 shows the number of broadband connections in service by technology. It shows that 59% of customers are still served by cable technology and that 24.5% are served by fiber. The number that mystifies me is fixed wireless. The figure says that 6.8% of broadband customers are served by the technology, which means, with 132.6 million total connections, there are about 9 million fixed wireless connections. As of June 30, 2024, AT&T, T-Mobile, and Verizon together claimed 9.7 million FWA customers, and millions of WISP customers would need to be added on top of that. The chart has me scratching my head about how the FCC compiled these numbers.
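The mismatch is easy to see with quick arithmetic (my own cross-check, using the carrier counts cited in the paragraph above):

```python
# Cross-check of the Figure 4 fixed wireless share against carrier-claimed
# FWA customer counts. All inputs are figures cited in this blog, in millions.
total_connections = 132.6   # total fixed connections, June 2024
fwa_share = 0.068           # fixed wireless share per FCC Figure 4

fcc_implied_fwa = total_connections * fwa_share
big_three_fwa = 9.7         # AT&T + T-Mobile + Verizon claimed FWA customers

print(f"FCC-implied FWA connections: {fcc_implied_fwa:.1f} million")  # 9.0 million
print(f"Big three carriers alone:    {big_three_fwa} million")
# The FCC total is already below the big three's own claims,
# before counting any WISP customers at all.
```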

After this, the report goes deeper into detail, but the more detailed reports all come from the same questionable data that builds the basic graphs and charts. Even knowing that there are questionable parts of the data, I found myself fascinated when looking through the report – only to occasionally pinch myself as a reminder not to rely on the specific data from most of the many charts.

Increasing Broadband Price Competition

Competition has been creeping into broadband pricing for the last several years as cable companies have been using low introductory rates to try to win new customers and offering similarly low prices to try to keep them. Anybody who competes against the big cable companies will tell you that cable companies have been competing for years by offering two-year promotional prices to keep customers.

However, competition might have shifted into a new gear recently when Comcast began offering low rates with a five-year price guarantee. The 5-year guaranteed rates were introduced soon after Verizon offered a 3-year price guarantee for FWA home broadband.

In a blog post dated April 15, Comcast announced a 5-year guaranteed rate plan for new customers: 400 Mbps broadband for $55 per month. The product comes with the company's WiFi Gateway, and no contract is required. The plan also includes a free Comcast cell phone plan with a 30 GB data cap for one year. This is a substantial discount – the list price for 400 Mbps is $86, the normal charge for the WiFi Gateway is $15, and the cell phone plan normally costs $30 per month. The 5-year rate is available through June 23, but Comcast has already told some news outlets that the special rate offer will probably be extended.

On the announcement date, several news outlets like PC Magazine listed the 5-year deal packages as 400 Mbps ($55), 600 Mbps ($70), 1.1 Gbps ($85), and 2.1 Gbps ($105). The outlets also reported that these rates only come with an auto debit to a bank account. Comcast will charge $8 more to bill to a credit card and $10 more for a paper bill.
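Using the list prices quoted above, the value of the 400 Mbps promotion works out roughly as follows (my own arithmetic; actual terms will vary by market and by what a given customer pays today):

```python
# Rough value of Comcast's 5-year 400 Mbps promotion, using the
# list prices cited above (all figures in dollars per month).
promo_rate = 55    # promotional 400 Mbps rate, gateway included
list_400 = 86      # list price for 400 Mbps
gateway = 15       # normal WiFi Gateway charge
phone_plan = 30    # normal cell plan cost (free for the first year)

first_year_savings = 12 * (list_400 + gateway + phone_plan - promo_rate)
later_monthly_savings = list_400 + gateway - promo_rate

print(f"First-year savings vs. list prices: ${first_year_savings}")  # $912
print(f"Monthly savings after year one:     ${later_monthly_savings}")  # $46
```

At nearly half off the list price for five years, it's easy to see why the promotion got so much press.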

The low prices were likely also prompted by the recent announcement that Comcast lost 199,000 broadband customers in the first quarter. In this same quarter, the FWA products from AT&T, T-Mobile, and Verizon gained 913,000 customers.

Comcast’s competition isn’t sitting still. Verizon recently announced a 3-year lock for FWA broadband prices at $35 per month for customers who accept autopay and who also buy a Verizon cell plan. Verizon includes up to a $250 Amazon gift card. Not to be outdone, T-Mobile now offers a $35 price for FWA broadband with a 5-year guarantee for customers who have a T-Mobile cellular plan. The Verizon and T-Mobile plans seem to be more focused on reducing cellular churn than gaining new broadband customers.

Comcast is clearly trying to stop the loss of customers. I have to wonder about the overall impact of such widely advertised special rates. How will these low rates play with the millions of customers who are paying a lot more, including the many paying $15 per month for a WiFi gateway?

Will this lead to Comcast finally lowering its list prices? The company has raised rates annually for over a decade. Can the company maintain high rates in noncompetitive markets while widely advertising severely discounted prices elsewhere?

I’ve been saying for years that broadband will cost $100 per month. When considering the WiFi gateway, Comcast’s list prices were already there. Comcast isn’t even the most expensive cable company, and a handful of cable companies like Cox, Breezeline, and Mediacom have even higher list prices.

This announcement by Comcast, and the constant advertisements from the FWA providers, could prove to be a watershed moment for prices in the industry. Just imagine the glee that USTelecom will have next year if they can announce that prices for broadband are actually decreasing.

Growing Urban/Rural Broadband Gap

Ookla recently published a report that looks at statistics related to the digital divide. Ookla is in a unique position to understand U.S. broadband since the company is the most popular speed test company that gathers huge numbers of speed tests from all over the country.

Here are some of the key findings of the report:

  • 32 states saw an increase in the digital divide between urban and rural households in the second half of 2024. Ookla measured this by looking at the median broadband speeds for urban versus rural parts of each state, and in these states, the gulf between urban and rural increased.
  • Overall speeds are up, and 17 states saw an increase in the percentage of speed tests faster than 100/20 Mbps, with New Mexico, Colorado, and Minnesota having the most improvement.
  • The number of states where at least 60% of users realized speeds of 100/20 Mbps or faster increased from 9 in the first half of 2024 to 22 in the second half of the year.
  • The states with the highest percentage of users seeing fast speeds are New Jersey, Connecticut, Delaware, North Dakota, and Maryland. 19 states and the District of Columbia had at least 60% of speed tests faster than 100/20 Mbps.
  • Alaska and Montana had the worst broadband performance with less than 40% of users receiving speeds faster than 100/20 Mbps.
  • South Carolina is the only state that saw improvements in broadband performance in both urban and rural parts of the state.

Some states showed dramatic improvements in the percentage of speed tests above 100/20 Mbps. New Mexico climbed from 31.85% in the second half of 2023 to 52.37% in the second half of 2024, an improvement of 20.5 percentage points. Other big increases were Colorado at 19.1 points, Pennsylvania at 18.5, Minnesota at 17.4, and Washington at 17.2.
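Those gains are differences in percentage points between the two half-year shares, not percent growth – the distinction matters, as a quick check of the New Mexico numbers shows (my arithmetic on the Ookla figures above):

```python
# New Mexico's gain is the percentage-point difference between the two
# periods' shares of speed tests above 100/20 Mbps (Ookla figures).
share_h2_2023 = 31.85   # % of tests above 100/20 Mbps, 2nd half 2023
share_h2_2024 = 52.37   # % of tests above 100/20 Mbps, 2nd half 2024

improvement_points = share_h2_2024 - share_h2_2023
relative_growth = (share_h2_2024 - share_h2_2023) / share_h2_2023

print(f"Improvement: {improvement_points:.1f} percentage points")  # 20.5
print(f"Relative growth in the share: {relative_growth:.0%}")      # 64%
```

Stated as relative growth, New Mexico's share of fast tests grew by nearly two-thirds in a single year.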

Ookla cited reports from the Fiber Broadband Association showing that 56.5% of homes were passed by fiber at the end of 2024, an increase of 10.3 million new fiber passings during the year. Ookla credits the overall increase in speeds, and in the percentage of users seeing speeds faster than 100/20 Mbps, to these investments.

Ookla came to the overall conclusion that urban broadband is improving at a faster pace than rural broadband. I was surprised by this finding. While it's true that BEAD grants have continued to move slowly, there was a huge amount of rural broadband upgrading last year. Many billions of dollars were invested in fiber from grant programs like the Capital Projects Fund, ReConnect, and RDOF. A lot of rural communities have also started to see much faster radios from WISPs and cellular carriers. But even with those investments, urban speeds are improving faster than rural ones.

Broadband Influencers

Since today is a holiday, I thought I'd publish a more lighthearted blog. Today I ask the question – are there any broadband influencers? Influencers are an interesting recent phenomenon – folks who make a living by attracting enough followers on social media to earn money from the volume of followers or from product endorsements.

A friend of mine is a tax preparer, and he pointed out a number of people trying to make a living as influencers in his field. These folks create short videos talking about tax issues that are posted on YouTube, TikTok, Instagram, and other platforms to attract followers. While some of the tax influencers also do tax returns, some don't, but all are trying to make a modest income as influencers. What my friend finds troubling is that some of the advice from influencers is incorrect, and there are no standards or rules against giving poor advice online.

This made me wonder if there are any broadband influencers. I've never heard of any, but I don't spend time on the social media platforms where influencers operate. I'm curious whether there is anybody who is making a living by just talking about broadband issues online.

Folks inside the industry think broadband discussions are interesting, and a lot of people read my blog and articles published every day by various industry magazines and news sites. There are folks in our industry with podcasts and webcasts. But it’s hard to think that any of them have enough podcast viewers to create any real revenue from it.

Decades ago, there was a person in the industry who probably would be categorized as an influencer. Esme Vos created a portal called Muniwireless.com in 2003. Esme wasn't an engineer or an ISP, but a technology lawyer. Muni wireless was a hot topic at the time, and many cities were considering ways to bring broadband to citizens. A few cities, like Philadelphia, made big investments in wireless technology. Esme attracted a lot of attention and created well-attended conferences and a magazine to discuss and support the issue. At the time, most people in the industry knew about Esme – which is probably one of the definitions of an influencer. When it turned out that early mesh radios didn't operate as promised, the idea lost its appeal.

Since Esme, the tech industry has become flush with influencers that promote every new technology that comes along, like blockchain, cryptocurrency, or AI. Influencers who are not experts have been able to carve out a living by becoming the focal point for folks who want to learn more about the various new technologies – and that seems to be the definition of a technology influencer.

I think I know the answer to my question, but I wouldn’t mind being surprised by finding an influencer. I’d love to know about anybody who is making a living just by talking about broadband. That would be fascinating.

Battle for CBRS Spectrum

There is a huge battle brewing at the FCC over the use of CBRS spectrum. The pieces are starting to fall into place to possibly auction the spectrum for use by cellular carriers.

The idea of putting the spectrum up for auction has been discussed for several years, but the topic went into high gear last year when AT&T asked the FCC to open a formal docket to explore the idea. AT&T’s request was opposed by a diverse set of industry stakeholders, and the FCC didn’t take any action on the AT&T request.

One of the stumbling blocks to the AT&T request is that this spectrum is used by the military, primarily the Navy. The Department of Defense just removed that hurdle and is circulating a spectrum plan in which it would relocate the functions now handled with CBRS spectrum to the 3.1-3.4 GHz band.

The other new change is that Congress now seems gung-ho to reauthorize FCC spectrum auctions as a way to meet budget goals. The newly passed House version of the budget assumes that new spectrum auctions will be able to raise $88 billion. That claim seems high, since the FCC has raised a total of only $233 billion from spectrum auctions since the first auction in 1994.

For a new spectrum auction to raise a lot of money, a lot of spectrum has to be made available. The DoD proposal has the military freeing up 640 MHz of spectrum, including the 1300-1350 MHz, 1780-1850 MHz, 5850-5925 MHz, and 7125-7250 MHz bands. DoD also proposes that the FCC clear 220 MHz in the upper C-band for auction. The FCC has already been considering auctioning off the AWS-3 spectrum bands that include 1695-1710 MHz, 1755-1780 MHz, and 2155-2180 MHz.

If the FCC goes along with AT&T’s and DoD’s proposed plans, it will be a huge windfall for cellular carriers over other spectrum users. CBRS spectrum is used today for a wide variety of functions, including rural broadband, manufacturing, industrial and enterprise private networks, transportation and logistics connectivity, and school and library access. I wrote a recent blog about how John Deere was using the spectrum to create a private network for its factories in Illinois and Iowa.

A spectrum auction would require the auction winner to fund existing users to relocate to another spectrum band, but doing so is disruptive, and in many cases would not result in a one-for-one functional swap.

The FCC would also be setting a new precedent by relocating CBRS users who won the use of the spectrum only a few years ago. For the FCC to change its mind about the spectrum should make any auction winner nervous that the FCC won't defend existing spectrum licenses.

Anybody who has been following the industry has noticed a big uptick in discussions about how the U.S. is again losing the 5G battle to the Chinese. I've never found anybody able to tell me what that means, and the last time this language was used, it was part of a ploy by carriers to put pressure on the FCC to hold more spectrum auctions. It looks like history is repeating itself.

The most interesting thing about these spectrum battles is that the cellular carriers mostly want more spectrum to be able to compete for home broadband service. We should be having a policy discussion about whether that is in the long-run national interest. It makes sense to deploy unused spectrum in rural areas to provide broadband in places where nobody will invest in fiber networks. But spectrum is a limited resource, and it's a valid question to ask whether we should be using valuable spectrum to bring a third competitor to urban markets that already have fiber and cable ISPs. I went back and reviewed the original goals for 5G, and competing for home broadband was never mentioned as a goal.