Arbitrage Rarely Lasts

I’ve always been fascinated by telecom and tech business plans built around arbitrage. I define arbitrage as any telecom or technology business plan that relies on a platform that is not controlled by the entity selling the retail products. What I find most intriguing about arbitrage is that the business opportunity inevitably comes to an end, and yet the people selling arbitrage services almost universally seem shocked when their business crumbles.

The earliest arbitrage business I remember is resold paging. Paging became the rage in the 1980s when a belt-clipped pager was a status symbol for millions of users. I recall meeting salespeople at big telecom shows in the 80s who were offering branded paging services to telcos and corporations. I could have the name wrong, but I recall the underlying network was PageNet. I remember joking with a coworker that it was easy to find the pager salespeople because they were the only ones at the conventions in sharkskin suits. Like other arbitrage opportunities, the companies that resold paging folded when BlackBerrys and other early cellular devices killed the paging industry.

The first really giant arbitrage business was reselling long-distance. After the AT&T divestiture in 1984, it became possible for companies to buy large volumes of long-distance minutes from AT&T and other telcos. The arbitrage opportunity came when the bulk buyers of minutes would rebrand long-distance for anybody willing to peddle minutes. You may remember the standing joke from the late 1980s when half of the calls that came to your home were from somebody trying to get you to buy cheaper long-distance. Related to selling long-distance was the arbitrage opportunity to resell prepaid long-distance calling cards. It seemed like everybody making a call in an airport whipped out a prepaid card first.

The next big opportunity came from the Telecommunications Act of 1996, which opened local telephone service to competition. A lot of companies purchased unbundled network elements and collocated equipment in telco central offices. That wasn’t entirely arbitrage since the seller had to make a significant capital investment. The real arbitrage opportunity came from reselling local telephone lines from the big telcos. By 1997, state commissions had mandated big discounts for reselling local service, and resellers sprang up instantly. I recall the biggest reseller was Talk America, with over 1 million resold telephone lines. Like all arbitrage opportunities, local line resale died as quickly as it had started when telcos were allowed to raise wholesale rates.

The other big arbitrage opportunity has been reselling cellular minutes, which created the MVNO industry. Some really large MVNOs were created and T-Mobile and other cell companies bought back the largest resellers. But there were large numbers of small MVNOs created by companies and organizations who thought they could sell a lot of cellphones. A few MVNO arrangements have lasted many years, but the industry is littered with smaller MVNOs that didn’t make it because the underlying cell carrier either raised rates or changed the terms and made resale unprofitable.

The primary characteristic of arbitrage opportunities is that the business opportunity eventually evaporates, either from changes in the overall market or, more often, because the underlying network owner decides to charge more or end the business. The percentage of telco arbitrage businesses that eventually folded has to be in the 99% range, with only a handful that somehow made it.

There were smaller, yet still significant arbitrage opportunities. For example, most of the companies that sold cellular plans under the ACP (Affordable Connectivity Program) were arbitrage companies that sprang up because of the federal subsidy and quickly died when ACP was cancelled.

I’ve been watching a new arbitrage situation. This isn’t in telecom, but in AI. It seems that 100,000 or more businesses have sprung up by building wrappers – an app or software that relies on the systems of one of the handful of big AI providers. If the history of arbitrage tells us anything, it’s that most of these businesses won’t make it. I already see the inklings of a big shakeout, since AI firms are being pressed by shareholders to raise rates to curtail large monthly losses. If AI providers start charging the real cost of using their data centers, it’s likely that most of the wrapper companies will quickly die. I hope the companies engaged in these businesses are realistic about the risk, because over the years many resellers in arbitrage businesses seemed shocked when the opportunity died.

A Converged Carrier Market?

T-Mobile made financial news recently when a KeyBanc Capital Markets analyst downgraded the long-term outlook for T-Mobile stock to “underweight”. Press coverage quoted the analyst saying, “We think [T-Mobile] is fiber deficient in a converged/bundled world”.

We’ve been heading toward an industry dominated by a handful of converged telecom providers, and the comments from this analyst show that day is probably here. The analyst’s comments come from comparing T-Mobile with the other giant converged companies that offer broadband and wireless, specifically AT&T, Verizon, Comcast, and Charter/Cox.

It’s curious why the analyst dinged T-Mobile because the company is profitable and successful. In the latest financial report for the second quarter of 2025, the company reported $17.4 billion in customer revenues, up 6% year-over-year. Net income was $3.2 billion, the highest-ever for the company and up 10% year-over-year. Net cash from operations was $7 billion, up 27% year-over-year. Adjusted free cash flow was $4.6 billion, up 4% year-over-year.

T-Mobile was criticized because the analyst believes that the most successful big companies will be those that lock up customers with a bundle of broadband and wireless. That seems to mean that the companies with the most gigabit passings will be the ultimate winners in the market. T-Mobile is expected to have about 15 million fiber passings by 2030. That pales behind the 50 million passings expected by Verizon or the 60 million planned by AT&T by 2030. Charter passes 57 million homes today and will be adding 7 million homes when it closes on the merger with Cox. Comcast says it will have 62.5 million passings by 2030. T-Mobile will clearly have the smallest fiber footprint.

How are the other big four converged companies doing with bundling? Comcast had 8.5 million cellular customers at the end of 2Q 2025 compared to 31.4 million broadband households. Charter had 10.9 million cellular customers compared to 29.9 million broadband households. AT&T reported for 2Q 2025 that 40% of its fiber customers are buying cellular. I can’t find where Verizon highlights the percentage of homes that buy cellular and broadband.

The stock market doesn’t seem to be valuing the converged carriers evenly this year. As I write this blog, T-Mobile stock is up 19% for the year. Comcast stock is down 11% for the year and Charter is down 22%. Verizon stock is up 6% and AT&T is up 20%. There is a story behind all of the stock price changes, and it mostly involves changes in customers and earnings, not the percentage of convergence.

One thing is clear. These five companies dominate the telecommunications space. The five companies have most of the cellular customers in the country, and T-Mobile will be adding customers from the USCellular purchase. The five companies had over 98 million broadband customers at the end of the second quarter of 2025, and Charter will be adding 6-7 million more customers if the merger with Cox is approved. The five companies account for almost all of the national net growth of broadband customers.

The KeyBanc analyst was looking at the long-term trajectory of T-Mobile compared to the other giant companies. The analyst seems to assume that FWA growth will eventually top out and then decline under competition from the other big carriers. But for now, in the second quarter, T-Mobile had the biggest growth in both cellular and broadband customers. It’s obvious that T-Mobile has something today that customers value. My crystal ball is not clear enough to predict that T-Mobile is going to stop growing any time soon, and it seems too early to predict that T-Mobile won’t be in the same category as the other four converged companies.

The Human Touch

Recently, Verizon Consumer CEO Sowmyanarayan Sampath wrote to customers saying that Verizon customer service has “taken a different path” and the company is raising the bar on the customer service experience. This sounds a lot like communications with customers I’ve seen over the years from all of the big ISPs – I can think of dozens of company messages telling customers that a big ISP cares about customer service.

What’s different about Mr. Sampath’s email is that he also included an email address where customers can contact him directly if they are having a problem that is not getting resolved. I have to assume this will use a different email address from the one he uses for normal emails, because it seems likely that his inbox will quickly fill with customer complaints.

This reminded me of an experience I had back in the early 1980s when I worked at Southwestern Bell. The company had an executive telephone hotline that was supposedly a direct line to the President for customers who knew the special number. Calls to this number were recorded and landed on the desk of somebody who happened to sit close to me. I would often overhear some of the complaints that came to the executive line, and they were the normal things you would expect – overbilling, botched installations, etc. Employees around the company responded quickly to every referral from the executive hotline.

I have to think that Mr. Sampath is doing something similar and has recreated the executive hotline using an email address. If Verizon customer service is indeed getting better, I assume anybody who makes a valid claim to that email will get some attention from elsewhere in Verizon. If that doesn’t happen, this will quickly be chalked up as another big company public relations ploy rather than an actual aid for frustrated customers.

I have to wonder how well this idea will work at such a gigantic company with coast-to-coast customers. I know that at Southwestern Bell, no employee wanted to get the internal message from the executive suite that they had messed up. Will that work for a much bigger company?

People who run smaller ISPs, or other small businesses that deal with the public, will laugh at this article, because fielding customer issues is a daily part of every executive’s work day. It’s something that nobody loves doing, but it comes with operating a business. Every ISP hopes that employees can satisfy every customer so that the top guys never hear about problems. But the folks at Southwestern Bell many years ago figured out that there had to be a way for customers who aren’t satisfied with the routine solution to have an outlet to be heard.

This story has me thinking about how important the human touch is with customers – having a real person to talk to who can solve a problem. That question was prompted for me when I noticed that Verizon is touting that it has incorporated AI into the customer service process. I have to wonder if AI will be used to tackle problems sent to Mr. Sampath’s email.

While big companies can pretend otherwise, we have not yet reached the time when an AI can provide the same quality response as a real person. My gut tells me that it will be a huge mistake for the big ISPs and carriers to take the human touch out of customer interactions. If so, that’s good news for the smaller companies that compete with big ISPs. I foresee that small ISP advertising will emphasize that customers can always talk to a real person.

 

The Impact of Broadband Slowdowns

Catchpoint recently issued The Internet Resilience Report 2025, its second report that looks at the impact of Internet outages and slow Internet performance on large corporations. Catchpoint sells software that looks in detail at Internet performance with the goal of identifying network problems early and fixing them before they become big problems.

The conclusions of the report will be familiar to anybody who works from home. Catchpoint highlights that broadband outages are costly and disruptive. But it also concludes that “slow is the new down”, meaning that broadband slowdowns are as damaging as outages. Web outages for specific platforms seem to be occurring with increasing frequency. I was recently working with a client who needed to interpret mapping data, and we found that ArcGIS was down nationwide. We had no alternative except to wait half a day until the application was up and running again.

The Catchpoint survey solicited feedback from 475 IT managers, directors, and executives at major companies. The key finding of the study is that 51% of the respondents said that Internet performance problems in the last year led to losses of over $1 million per month in revenues, up from 43% in 2024. A third of those companies lost more than $10 million.

The most interesting finding is that 42% of respondents said slow Internet performance has the same negative impact as an outage. The report uses the phrase “slow apps are dead apps” to describe the financial impact of applications that are not working as expected. The report also cites a recent study by Forrester that surveyed online retailers and reached the same conclusion – slow Internet might as well be an outage in terms of the bottom-line consequences.

The report concludes that large corporations should concentrate on application performance as much as they concentrate on Internet downtime. Big companies have largely accepted the need for broadband redundancy, and most buy a broadband connection from multiple ISPs. But that doesn’t protect them against regional and national outages of the companies and applications that control the Internet or the major applications used by businesses.

I expect that most readers of this blog are not from the giant corporations surveyed by Catchpoint. But I expect that everybody reading this has at least a few stories of how poor Internet performance impacted them in the last year. It seems like every month something goes down for a while.

I’m lucky to live in a city, and when my broadband goes down, I am able to change quickly to using my cell phone for broadband. People living in rural areas are not so fortunate. At least around where I live, the cell coverage just a few miles outside the city isn’t good enough to support working from home.

The country is in the process of supposedly getting at least one broadband connection to everybody in rural America. But rural folks may still not feel secure to work from home if they don’t have a second broadband alternative. I hear from folks regularly who tell me about rural broadband outages that last for days, which contrasts with urban outages that rarely last more than a few hours. I’ve never thought about this before, but true parity between urban and rural broadband might mean having an alternative when primary broadband fails.

FCC Considers Changing Broadband Goals

FCC Chairman Brendan Carr has proposed changes to the way the FCC sets broadband goals and tracks broadband coverage. The proposed changes are included in the Nineteenth Section 706 Notice of Inquiry, which is scheduled to come up for a vote at the FCC’s August meeting.

One of the areas being explored in the Notice is how the FCC determines the speed of broadband.

  • The FCC asks if 100/20 Mbps should remain the benchmark for defining fixed broadband. This is a question that almost every annual Inquiry has asked, and the FCC will again be asking for input. For example, should the FCC consider the speeds consumers buy when given an option of multiple speed tiers? (Spoiler alert: ISPs report that a significant percentage of consumers buy speeds of 500 Mbps or faster when they have the option.)
  • The FCC is also asking the annual question of how to judge adequate cellular speeds. In recent years, the speed goal for 5G outdoor coverage has been 35/3 Mbps. The FCC is asking if that should remain the goal and if it should be extended to include speeds inside moving vehicles.
  • The FCC is recommending keeping the current benchmark for school broadband of 1 Gbps per 1,000 students and is asking for comments on this benchmark. I’ve talked to numerous school officials who all say the current metric is obsolete and that they need 3-5 simultaneous Mbps per student, which would mean 3-5 Gbps for a school with 1,000 students.
  • One proposal in the Inquiry that is going to be controversial is a recommendation to drop the future goal of eventually achieving 1000/500 Mbps broadband speeds. The FCC says that having a future goal is not necessary since it’s not required by statute. The FCC, under Chairwoman Jessica Rosenworcel, adopted the future goal as a way to show the FCC’s support for building fiber. This preference for fiber was heavily baked into the original rules adopted by Congress for BEAD, but that preference has recently been greatly watered down.
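The school bandwidth arithmetic described in the bullets above is simple enough to sketch. This is a minimal illustration using the figures quoted from school officials; the helper function is mine, not anything from the FCC:

```python
# Compare the current FCC school broadband benchmark with what school
# officials say they actually need. Figures come from the blog text.

def required_gbps(students: int, mbps_per_student: float) -> float:
    """Total capacity needed in Gbps for a school of a given size."""
    return students * mbps_per_student / 1000

# Current FCC benchmark: 1 Gbps per 1,000 students (i.e., 1 Mbps/student)
fcc_benchmark = required_gbps(1000, 1.0)

# School officials' estimate: 3-5 simultaneous Mbps per student
needed_low = required_gbps(1000, 3.0)
needed_high = required_gbps(1000, 5.0)

print(f"FCC benchmark: {fcc_benchmark:.0f} Gbps")          # 1 Gbps
print(f"Officials say: {needed_low:.0f}-{needed_high:.0f} Gbps")  # 3-5 Gbps
```

The gap is stark: by the officials' own numbers, the current benchmark covers only a fifth to a third of what a 1,000-student school needs.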

Scrapping the gigabit goal for future broadband and sticking with 100/20 Mbps as the definition of broadband is out of sync with the market. OpenVault reported last year that 32% of U.S. homes subscribe to gigabit broadband from fiber ISPs or cable companies.

The Inquiry also asks how the FCC should track broadband deployment – who has broadband. The FCC wants to know if it makes sense to note homes in the FCC maps that are covered by a grant program that promises to bring faster speeds in the future.

I’m in favor of counting broadband coverage in two ways. One is a pure tabulation of the FCC mapping data that shows the number of homes not covered by broadband as of the latest FCC reporting date. It also makes sense to report how many of those homes have a coming broadband commitment. But homes with a commitment should not be counted as served until the new broadband is built. There have been plenty of defaults in the RDOF program, and there is no reason to think that ISPs won’t default on grants awarded by the many other state and federal grant programs.

The FCC also asks about the challenge of counting homes served by satellite broadband. It seems likely that a lot of states are going to award BEAD funds to Starlink and Kuiper. Is there any sensible way for the FCC to show areas covered by satellite for BEAD as having broadband if the same designation isn’t extended to neighboring areas not covered by BEAD? Recognizing all the places claimed by satellite probably means that almost the whole country would be counted as served, in which case you might as well toss out the broadband map. I’m not sure how the FCC can open the door and count some locations as served by satellite but not others.

The Future of ReConnect

I have to wonder if there is any practical future for USDA’s ReConnect grants. I raise this question after noting that the Senate Appropriations Committee recently approved the fiscal year 2026 budget for the Department of Agriculture. Buried within that budget is $100 million for new ReConnect loans or grants. It’s still early in the federal budget process, and the $100 million slated for next year is a preliminary number, but it’s already lower than previous annual allocations to the program.

ReConnect has been a popular program, particularly with cooperatives and small telcos. The program was launched by Congress in December 2018 with an initial budget of $600 million. Additional funds have been allocated since, including $550 million in 2020 and $1.15 billion in 2021. USDA is still sitting on $980 million of remaining appropriated funds, but it is also sitting on $3 billion in funding requests.

ReConnect has always been an interesting program. USDA can use the funding for grants, loans, or a combination of the two. The program is intended to bring broadband to unserved rural locations, and the ReConnect process gives extra consideration to locations that are not close to any towns or cities.

I ask if ReConnect will still be relevant in upcoming years for several reasons. First, if you believe the hype about BEAD grants, every location in the country will soon be slated to get broadband of at least 100/20 Mbps. According to the NTIA, every location that has been excluded from BEAD is already served by at least one ISP claiming 100/20 Mbps. That can be for any technology, including fiber, cable, DSL, fixed wireless, FWA cellular wireless, or satellite.

But ignoring that promise from BEAD, there will still be remaining unserved locations around the country. For example, there have been some recent defaults on RDOF subsidies that came too late for those areas to be included in BEAD, and there will be more. There will likely be defaults on funding commitments from other state and federal grant programs, including some from the BEAD program. It’s also possible that ISPs could go out of business and leave rural customers with no option at 100/20 Mbps. This is certainly possible for WISPs if the FCC meddles with the CBRS and 6 GHz spectrum.

I’m also positive that there are a lot of locations where ISPs claim 100/20 Mbps or faster in the FCC maps but are delivering something slower. Perhaps future ReConnect grants will allow ISPs to ask for funding in areas where they can prove the FCC map is wrong.

Another issue with ReConnect is that the grant rules in the past have insisted on contiguous grant areas of unserved locations. Because of the odd rules of many of the existing grant and subsidy programs, particularly RDOF, there will probably be no big contiguous unserved areas after BEAD grants have been awarded. Any future ReConnect grant is going to require cobbling together scattered locations into a single grant request, and that will require changes in the ReConnect rules.

But I think the fundamental challenge for ReConnect is that the FCC is likely to declare soon that the rural broadband gap has been solved and that every rural home in the country is able to buy adequate broadband. I’m not sure the USDA will be able to overcome that presumption.

A Peek at the New BEAD

The State of Tennessee released a side-by-side comparison of the new Benefit of the Bargain round of BEAD applications compared to its initial round of BEAD applications conducted before the revised BEAD rules.

The side-by-side comparison is interesting and shows some big differences between the two grant rounds:

  • Tennessee received 541 applications in the new Benefit of the Bargain round compared to 298 applications in the original round of BEAD.
  • The low-orbit satellite companies Starlink and Kuiper bid throughout the state. Starlink didn’t submit any applications in the first round but bid almost everywhere in the new BEAD round. Kuiper bid for most of the state in both BEAD rounds. Satellite is clearly going to win a significant amount of grant funding since 68 of the 173 serving areas got proposals only from one or both satellite providers and no other technology. The satellite companies surprisingly don’t seem to be fazed by bidding in Appalachia.
  • There were surprisingly few proposals for fixed wireless technology, with proposals only made in 12 of the 173 study areas included in the new round of BEAD. Part of the reason for this might be the mountainous and hilly nature of much of Tennessee, but there are plenty of areas in the central and western parts of the state where wireless will work well.
  • Comcast switched technology from the first to the second round. In the first round, the company proposed to build fiber, and in the new round it mostly changed to traditional hybrid fiber-coaxial networks – apparently to be able to bid at a lower cost. This makes me wonder if it’s really cheaper to build coaxial cable than fiber or if Comcast is just willing to take less funding.
  • There has always been a big question of whether the big ISPs would show up for BEAD. Three big companies are in the new round of BEAD – AT&T, Comcast, and Windstream. The industry had particularly wondered if AT&T would join BEAD.
  • There are a number of smaller ISPs asking for funding to build fiber that includes cooperatives and municipalities.
  • There are four service areas that had no proposals. The state will have to talk an ISP into serving these areas before it can close out its BEAD process.

It’s impossible to make any definitive cost comparisons between applicants because the new BEAD rules allow ISPs to request to serve areas smaller than the serving areas suggested by the state. There are also roughly 7,000 fewer passings on the newest BEAD map than were included in the initial BEAD grants. But in general, the comparison shows:

  • Most companies proposing to build fiber bid less the second time, but some of this could be due to fewer eligible passings and not just to a sharpening of the pencil.
  • Fiber ISPs across the country are wondering how much lower other technologies will bid in BEAD. There is only a single company asking to build wireless in the state, and their proposed grant awards are roughly one-third the cost of those asking for fiber in the same study areas. But without knowing more details, that ratio might not mean anything for other states.
  • However, satellite bids are incredibly low, most at 10% or less of the proposals to build fiber. There is a map showing the eligible passings by study area, and I eyeball the satellite bids to be in the range of $400 to $600 per passing. Kuiper is generally significantly lower than Starlink. These low bids are going to worry ISPs everywhere.

 

Continuing RDOF Defaults

CenturyLink told the FCC recently that it is defaulting on 41,000 RDOF locations spread across eight states and 153 Census block groups. That’s a big portion of the 77,000 locations that the company won in the RDOF reverse auction. CenturyLink originally was awarded $262.3 million in subsidies, spread over ten years.

There are a number of consequences of this default. First, it comes after states have finalized their BEAD maps and allocations. That makes it likely that nobody will be bringing improved broadband to the default areas. If the defaults had happened earlier, these areas could have been rolled into the BEAD process.

CenturyLink should expect a significant fine. In 2024, the FCC fined two companies that defaulted on RDOF. Etheric Communications was fined $732,000 for defaulting on 244 locations. GigFire (LTD Broadband) was fined $21.7 million for defaulting on 7,238 locations. Mercury Broadband was fined $14.2 million in a separate FCC decision and is also expected to return all RDOF funding for the defaulted areas.

If CenturyLink is fined at the same level as the recent defaults, around $3,000 per location, the fine will be roughly $123 million. Additionally, roughly half of the RDOF funding has already flowed to auction winners, meaning CenturyLink would also have to return approximately $65 million of RDOF subsidy to the FCC.
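The math behind these estimates can be sketched as a back-of-envelope calculation using the figures above. The $3,000-per-location fine level is inferred from the earlier FCC decisions, and the simple pro-rata refund below is my simplification; an actual clawback would follow the FCC's disbursement schedule:

```python
# Rough estimate of the potential fine and subsidy clawback for the
# CenturyLink RDOF default, using the figures cited in the text.

defaulted_locations = 41_000
total_locations = 77_000
total_award = 262.3e6            # ten-year RDOF award in dollars

# Fine: ~$3,000/location, in line with the earlier FCC default fines
fine_per_location = 3_000
estimated_fine = defaulted_locations * fine_per_location
print(f"Estimated fine: ${estimated_fine / 1e6:.0f} million")   # $123 million

# Refund: roughly half the ten-year subsidy has flowed so far; a simple
# pro-rata share for the defaulted locations gives the right order of
# magnitude, close to the ~$65 million figure above.
disbursed = total_award / 2
refund = disbursed * defaulted_locations / total_locations
print(f"Estimated refund: ${refund / 1e6:.0f} million")
```

The simple pro-rata share lands around $70 million, in the same range as the $65 million estimate; the difference would come from how the subsidy was actually spread across locations and years.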

The CenturyLink default defies the usual explanation of RDOF defaults. Many other defaults have been blamed on the FCC’s auction rules that didn’t pre-qualify companies before entering the auction. That resulted in companies winning RDOF that had weak balance sheets or insufficient financial backing.

But any pre-qualifying process would have easily allowed CenturyLink to enter the RDOF auction. CenturyLink is now obviously in financial distress and has decided that fines are less expensive than completing the required construction. The company has already sold off much of its copper networks in twenty states and has been looking for a buyer for the remaining states. The company recently announced the sale of most of its fiber last-mile customers to AT&T, so it’s clear that CenturyLink is exiting the residential ISP business.

This is not likely the end of RDOF defaults. According to a Telecompetitor article earlier this year, eight companies reported to the FCC at the end of 2024 that they were behind schedule in meeting their RDOF construction commitments. RDOF winners were required to have covered 40% of their locations in each state where they won an award by the end of 2024.

I said when RDOF was first announced that it was a badly flawed program. The reality has turned out to be far worse than any predictions. While RDOF was used successfully by a number of electric cooperatives and a few others to build future-looking networks, a huge share of the original awards fell on the floor through defaults or the FCC tossing out winners it didn’t like. Possibly the worst thing about RDOF is that the awards created helter-skelter coverage that left rural areas looking like Swiss cheese, making it hard today to do anything with the mess that RDOF left behind. I keep thinking we’ve heard the last bad news from RDOF, but the announcements keep coming.

The Space Cannon

As if low-orbit space isn’t already getting overcrowded, there is a startup that may send huge numbers of additional satellites into orbit. The California company is SpinLaunch.

SpinLaunch plans to shoot microsatellites into orbit using what it calls a centrifugal cannon. The cannon spins and accelerates a small rocket that will hold multiple satellites. The cannon accelerates the rocket using spinning arms inside a vacuum chamber, reaching a force of 10,000 G and a speed of 5,000 miles per hour – fast enough to reach suborbital altitude. From there, the rocket engines fire to finish the trip to space. The company has done ten test launches that successfully reached suborbital heights.
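As a sanity check on those figures, the standard centripetal relation a = v²/r implies how long the spin arm has to be to reach 5,000 mph at 10,000 G. This is a back-of-envelope sketch; the speed and G-force come from the text, and the rest is basic physics:

```python
# Implied spin-arm radius from the quoted launch figures:
# centripetal acceleration a = v^2 / r  =>  r = v^2 / a

MPH_TO_MS = 0.44704   # miles per hour to meters per second
G = 9.80665           # standard gravity in m/s^2

v = 5_000 * MPH_TO_MS   # tip speed, ~2,235 m/s
a = 10_000 * G          # centripetal acceleration, ~98,000 m/s^2

radius = v ** 2 / a     # implied arm radius in meters
print(f"Implied spin-arm radius: {radius:.0f} m")   # ~51 m
```

A radius of roughly 51 meters implies a vacuum chamber on the order of 100 meters across, which gives a sense of the sheer scale of the machine.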

The launches will be done from Adak Island, near the western end of the Aleutian Islands off Alaska. SpinLaunch’s partner is the Aleut Corporation, an Alaska Native business. They chose the Aleutians because the location provides a clear launch path over the Pacific Ocean with minimal disruption to commercial flights. The area also has steady winds, which allow for the use of cheap renewable wind power. The site also takes advantage of an abandoned U.S. Navy base on the island.

The company plans to start shooting satellites into orbit in 2026. The satellites, shaped like disks, are 7.5 feet across and weigh about 154 pounds. This is significantly lighter than a Starlink satellite, with the current V2 satellites weighing in at 1,760 pounds. The plan is to place 250 satellites into orbit in a single launch, which would be the most ever. The current record is the 143 satellites that SpaceX placed into orbit on a single rocket in 2021.

The satellite fleet will be owned by a sister company, Meridian Space, which currently holds an FCC license to launch 1,190 satellites. SpinLaunch has raised $150 million, including a recent infusion of $12 million from Kongsberg Defense and Aerospace, which will manufacture the satellites. Meridian Space plans to compete head-to-head with Starlink and Kuiper in selling broadband.

SpinLaunch thinks it has a number of advantages over other launch technologies. The system requires 70% less fuel to put a satellite into orbit, meaning a much lower cost of deployment, and the launch cannon should be reusable for many years of future launches.

On the flip side, placing even more satellites into space increases the problems that have been identified with the proliferation of low-orbit satellites. That includes an increased risk of space collisions and the resulting debris that could make low-orbit space into a dead zone. It means more light pollution and more interference with astronomy. It also means more satellites falling back to earth, which can degrade the ozone layer.

But like it or not, the satellite age is upon us, and is going to accelerate as companies find clever ways to launch more satellites into low-orbit space.

The Growth of Backhaul Data

Zayo recently released a report that talks about the boom in backbone data usage in the country. For any readers who don’t know Zayo, it provides fiber connections for large data users and is one of the major companies that carry data between cities and across the country.

One component of backbone data is the accumulated usage of residential and small business broadband customers. We learned recently from OpenVault that residential and small business data usage grew 18% from 2023 to 2024. The average total usage per customer per month grew from 606 gigabytes at the end of the first quarter of 2023 to 663 gigabytes at the end of the first quarter of 2024. Zayo reports that data usage on fiber backbones grew far faster than the 18% that came from individual broadband users.

Zayo cited the following statistics:

  • Long-haul dark fiber sales were up 52% from 2023 to 2024. Carriers buy dark fiber when they want to send a large amount of data.
  • Wavelength capacity grew by 280% from 2020 to 2024. A wavelength represents the full bandwidth available from a band of light on a fiber. Zayo says that sales of 400 Gbps wavelengths accounted for more bandwidth than all sales of 10 Gbps and 100 Gbps connections in 2024.
  • Large buyers are buying the majority of new bandwidth. Zayo says that hyperscalers and carriers purchased 91% of dark fiber sales and 67% of all wavelength sales since 2020. Hyperscalers are large cloud users like Amazon AWS, Google Cloud, and Microsoft Azure.
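To put the wavelength statistic above in perspective: reading "grew by 280%" as capacity ending at 3.8 times its 2020 level (my interpretation of the figure), the implied compound annual growth rate works out to roughly 40% per year:

```python
# Implied compound annual growth rate (CAGR) from Zayo's wavelength
# statistic: 280% growth from 2020 to 2024 means 3.8x over four years.

growth_multiple = 1 + 2.80   # 280% growth => ending at 3.8x the 2020 level
years = 4

cagr = growth_multiple ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")   # ~40% per year
```

That rate far outpaces the roughly 18% annual growth in end-user data usage cited above, which is the report's core point about backbone demand.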

Zayo says that AI traffic accounts for a lot of the new backbone data usage. The report discusses how AI data centers are driving usage in new markets. For example, data sales in Memphis increased by 4,300% in 2024, and data sales in Salt Lake City grew 348%, both due to new AI data centers.

Zayo describes an interesting history of backhaul bandwidth. The report says that backbone bandwidth started growing faster than the bandwidth used by homes around 2007 due to the migration of data to the cloud. Large companies began storing data in data centers instead of locally as a way to protect data. Over time, a lot of the functions used by homes and businesses also migrated to the cloud as the common applications we all use moved to data centers instead of being stored only on people’s computers.

Zayo uses the term “distributed era” to describe the period that began in 2020 when the pandemic suddenly forced companies to expand networks to include people working from home. The decentralized workforce forced employers to find solutions to safely distribute access to their network to myriad locations.

Zayo coined the term “the intelligence era” for the period starting in 2024, which is characterized by modifying networks to handle the massive increases in data created by AI data centers. The changes are not just about bigger bandwidth; they also include lowering latency and increasing real-time responsiveness for end-users.

The report digs a lot deeper than this blog and includes a lot of interesting graphs detailing bandwidth growth since 2020.