Are Cable Companies “Permanently Impaired”?

KeyBanc Capital Markets analyst Brandon Nispel recently said in an industry report that “There are reasons to believe that cable is permanently impaired.” By that, he means that cable companies will continue to lose broadband customers as they compete with fiber and FWA cellular wireless.

The problem that cable companies are experiencing stems largely from the time when they enjoyed a near-monopoly status in broadband markets across the country, when their only real competition was DSL provided over copper wires. For well over a decade, cable company broadband customers grew by huge numbers each quarter as people abandoned DSL. The reason for the cable company decline today is that the monopoly is now over and cable companies suddenly have to compete with alternatives like fiber and FWA cellular.

Using the term ‘permanently impaired’ makes it sound like cable companies have inferior broadband. From a technology perspective, fiber is clearly superior to cable broadband. Fiber has lower latency and less jitter for a more reliable signal, and fiber can provide very fast, even symmetrical, upload speeds for customers who care about upload. But a technology comparison would give the nod to cable over FWA wireless. Cable speeds are faster, and wireless networks generally have more variability of signal over time.

But most customers don’t buy broadband based on the performance specifications. Households that don’t need a lot of upload are perfectly happy with cable company download speeds, with tiers available from 300 Mbps to over a gigabit. Surveys show that a lot of cable company customers are happy with the broadband speed and performance.

The cable companies have been investing in increasing upload speeds, which will satisfy a lot of their broadband customers. Whether they goose upload speeds to 200 Mbps with a mid-split upgrade or invest in symmetrical speeds with a DOCSIS 4.0 upgrade, the increased upload speeds will be enough to satisfy the large majority of households.

I don’t think that most of the households leaving cable companies are doing so because of the technical differences in the technologies, other than perhaps heavy gamers and others who care about the difference in latency and jitter. The cable companies are seeing customers leave because of the way they treated customers over the last decade.

A lot of customers soured on cable companies over the years because of cavalier customer service, with long wait times on the phone and technicians who routinely showed up late for appointments. It’s a running joke how dreadful it is to be stuck in a Comcast call queue. Cable companies didn’t create loyal customers when they imposed a big rate increase every year for more than a decade, with base rates now approaching $100. Customers grew frustrated when new customers got low prices while long-term customers continued to pay the full list price. I think it’s the millions of customers with a sour taste in their mouth for the cable companies who are bailing now that they finally have a reasonable alternative that is not DSL.

I’m starting to get public feedback that the big fiber companies like AT&T are headed down the same path as the cable companies. I’ve been contacted in recent months by several AT&T fiber customers who are unhappy with their fiber service. One told me about an outage that lasted for nearly a week before AT&T finally fixed the problem – and then offered them a $3 discount off the bill for their inconvenience. Another customer told me about regular short outages on AT&T fiber – and this customer had originally left the cable company for AT&T over the same issue. AT&T fiber won a lot of customers when it entered markets because it was cheaper than the big cable companies, but the company has now raised broadband rates by $5 per month two years in a row, at a time when it is bragging about record profits.

Nispel is right that cable companies will continue to lose customers. That’s a natural consequence of the end of a near-monopoly. But urban markets will eventually reach an equilibrium, and cable will settle in at a lower penetration rate. We already know what that looks like after seeing how Verizon FiOS and cable companies reached an equilibrium in the Northeast.

The story is not that cable companies are losing customers and are doomed. The real story is that the ISPs displacing them are repeating the same mistakes made by the cable companies, and the public isn’t going to like them any more than the cable companies. A colleague recently observed that competition in urban areas is largely illusory and we’re largely seeing competition between equally inept ISPs. I’m starting to think he’s right.

 

Fiber and Public Safety

The Fiber Broadband Association released a paper Fiber for Public Safety: Fighting Fire with Fiber – How Broadband Infrastructure Protects Communities Before, During, and After Disasters. The paper provides some case studies that show how fiber infrastructure is supporting communities.

The report includes five case studies of real-world examples of how fiber broadband has proven to make a difference for public safety during disasters.

  • In California, Siskiyou Telephone restored communications to a fire camp within an hour after satellite broadband collapsed under the heavy demand from first responders.
  • In Hawaii, the hardened and buried fiber networks of Hawaiian Telcom kept working during the 2023 Maui wildfires while the rest of the island’s telecommunications networks went down.
  • In Oregon, Douglas Fast Net pre-positioned fiber at fire camp sites to make sure that broadband is available when emergencies hit.
  • In Tennessee, United Communications provides free fiber broadband to every fire and police station it serves and makes public safety a central mission of its fiber network.
  • In Georgia, PeachNet connects first responders and emergency agencies directly to fiber to strengthen readiness while also supporting community services.

Most folks in the industry can tell stories about the importance and resilience of fiber networks in disasters. Here are a few of my own:

  • In the aftermath of a bad summer thunderstorm near Laurel, Maryland, I saw where the storm had knocked down a tree that broke the electric, telephone copper, and cable company wires. Amazingly, the Verizon FiOS fiber did not break and was holding up the full weight of the tree. The broadband was still working in the neighborhoods fed by that fiber. I wish I’d had a smartphone camera in those days.
  • The electric utility in Lafayette, Louisiana, built a fiber backbone around the City and connected fiber to electric substations, the University, and public anchor institutions. After Hurricane Katrina in 2005, the LUS Fiber was the only functioning communications network in the City.
  • When Hurricane Helene hit Asheville and surrounding counties, all communications went down as entire roads were wiped off the map, which crippled networks by destroying the fiber backbones that supported broadband, cellular, and telephone networks. However, the fiber network from Duke Power kept operating since the company had recently built a redundant fiber path into the area along a route that didn’t get destroyed. Having this operating fiber backbone sped up the restoration of the power grid by weeks since it allowed Duke repair crews to communicate with the outside world.

The whitepaper points out that fiber networks can play a huge role in public safety when built correctly. Burying fiber, like in Maui, made a huge difference. Building redundant routes can guarantee connectivity when other networks fail.

Grants Should Look Forward

State Broadband Offices had to go through a process this year of deciding if various technologies qualify for grant purposes as priority projects. A priority technology must meet the following requirement: Provide broadband service that meets speed, latency, reliability, consistency in quality of service, and related criteria as the Assistant Secretary shall determine; and ensure that the network built by the project can easily scale speeds over time to meet the evolving connectivity needs of households and businesses and support the deployment of 5G, successor wireless technologies, and other advanced services.

NTIA chose a speed of 100/20 Mbps as the metric for meeting the current test of a priority technology. This is convenient, since this was the declared speed that the legislation said a BEAD-funded technology must be able to deliver. Today’s blog asks if that definition is adequate.

One way to consider what the definition of broadband speed should be today is to look at historical trends. For many years, Cisco issued reports showing that the demand for speed was growing at roughly 21% per year for residential broadband, and a little faster for business broadband. Cisco and others noted that the demand for broadband speeds had followed a relatively straight line back to the early 1980s.

It’s not hard to test the Cisco long-term growth rate. The following table applies a 21% growth rate to the 25/3 Mbps definition of broadband established by the FCC in 2015. This is somewhat arbitrary since it assumes that broadband demand in 2015 was exactly 25 Mbps – but there was widespread praise of the new definition at the time, other than from ISPs who wanted to stick with the 4/1 Mbps definition. This simple exercise accurately predicted that we would be talking about increasing the definition of broadband to 100 Mbps download around 2022, which is exactly what happened: in 2022 the FCC wanted to raise the definition to 100 Mbps download – right at a 21% compounded annual growth rate from the 2015 definition – though the agency lacked a fifth Commissioner at the time and wasn’t able to make the change until March 2024.

I can’t think of any fundamental industry changes that would alter the historical growth rate in the near future, and we’ve certainly seen big demand for faster broadband products. The following table starts with the assumption that 100 Mbps was the right definition of broadband in 2022 and grows that number by the same 21% per year. What does this suggest for BEAD and other grants? Consider the evaluation of Starlink, the technology that comes closest to either meeting or missing the required speed. Ookla released a report for the first quarter of 2025 showing that the median speed on Starlink was 104.71 Mbps download and 14.84 Mbps upload, and that only 17% of Starlink customers in the quarter fully met the 100/20 Mbps speed threshold.

The table above suggests that the current definition of broadband in 2025 should be something like 177/35 Mbps. It’s debatable if Starlink meets the 100/20 Mbps test today, but it clearly doesn’t meet a test based on the speed demand in 2025.

The BEAD forward-looking test is challenging because nobody defined what forward-looking means. I can think of two definitions that might make sense. One is to judge what speeds should be delivered when the grant project has been constructed, which for most BEAD projects will be at the end of 2029. The growth chart suggests that the speed defining broadband in 2029 will be around 380/76 Mbps.

I think a better forward-looking test for a government-sponsored grant should be that a grant-funded network should still be relevant a decade after a grant is awarded. The chart suggests the desired speed should be 1191/238 Mbps in 2035.

Naysayers will argue that the 21% growth in speed demand can’t be sustained. Consider taking a more conservative approach that cuts the historical growth rate in half. That conservative approach would say that a target speed for a grant-funded project would be 195/30 Mbps in 2029 and 345/69 Mbps in 2035. I have nothing to go on except my gut, which tells me that 345/69 Mbps will feel inadequate in 2035.
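The projections in this section are simple compound growth. Here is a minimal sketch of the arithmetic, assuming the 25/3 Mbps (2015) and 100/20 Mbps (2022) baselines discussed above; differences of a megabit or so from the figures in the text come down to rounding:

```python
def projected_speed(base_mbps: float, base_year: int, target_year: int,
                    annual_growth: float = 0.21) -> float:
    """Compound a baseline speed forward at a fixed annual growth rate."""
    return base_mbps * (1 + annual_growth) ** (target_year - base_year)

# The 2015 baseline of 25 Mbps download compounds to roughly 95 Mbps by
# 2022 - right around the FCC's eventual 100 Mbps definition.
print(round(projected_speed(25, 2015, 2022)))        # 95

# The 2022 baseline of 100/20 Mbps yields the download/upload targets
# discussed above for 2025, 2029, and 2035.
for year in (2025, 2029, 2035):
    down = round(projected_speed(100, 2022, year))
    up = round(projected_speed(20, 2022, year))
    print(f"{year}: {down}/{up} Mbps")
```

The more conservative scenario described above corresponds to passing roughly half the growth rate (an `annual_growth` of about 0.105) to the same function.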

Can the FCC Regulate Local Permitting?

ACA Connects, an industry group that represents midsize cable companies and fiber overbuilders, recently asked the FCC to issue regulations to streamline permitting and the acquisition of rights-of-way. For those who lose track of the various industry advocacy organizations, ACA Connects was previously known as the American Cable Association.

The ACA Connects comments were filed in response to the open Notice of Inquiry that asks for comments on ways to eliminate barriers to wireline deployments. The ACA Connects comments ask the FCC to investigate and regulate three issues. First is the timeline for local communities to respond to a request for rights-of-way or construction permits. ACA Connects members have related stories of communities that sit on requests for months with no response. ACA Connects advocates for a shot clock that requires communities to react within a specified time, similar to what the FCC has required for requests to place a new wireless tower.

Second, the group asks that fees charged for access to rights-of-way be cost-based and objectively reasonable, and that new ISPs receive the same treatment that was provided to incumbent providers. There are communities that want large up-front fees for rights-of-way and permits that go far beyond a reasonable value. I’ve often suspected that this is a result of cities losing franchise fees as the cable TV industry continues to lose customers.

ISPs also object to hidden fees and costs. The filing documents examples of unreasonable costs, such as having to bury conduit deeper than is required by industry standards. The filing cites an example where an ISP was asked to repave a full block after disturbing only a small portion of a sidewalk.

It’s worth noting that these are not universal problems, and many communities are welcoming fiber overbuilders with open arms and easing the process of bringing competition to their community. But any ISP understands how unexpected delays and costs for a routine function like obtaining rights-of-way and permits can delay, or even kill, plans to complete a new network project.

ACA Connects recognizes that this would be a big lift for the FCC. Communities are going to strongly resist any efforts to dictate rules for how cities manage and charge for rights-of-way. The FCC has made headway in managing the placement of towers and wireless facilities, but that’s partially because the process of siting a tower is unique. Rights-of-way rules apply to a much larger universe than just ISPs. Any change to the rules will also change the way that cities interact with electric utilities, cable companies, gas utilities, and even members of the general public who want to make changes like cutting a new driveway into a busy road.

Local communities view control of rights-of-way as one of the most important rights of a community and will resist any attempt by a federal agency to change the rules. I predict a huge legal battle if the FCC decides to tackle this, which means that implementing what ACA Connects recommends could take many years and many lawsuits.

The FCC’s ability to tackle something like this has been weakened by recent Supreme Court rulings. In Loper Bright Enterprises v. Raimondo, the Supreme Court largely ended Chevron deference and ruled that federal agencies are on shaky ground when they make decisions that are not explicitly directed by Congress. In the 2025 ruling in McLaughlin Chiropractic Associates, Inc. v. McKesson Corp., the Supreme Court made it easier for courts to disagree with rulings made by federal agencies, including orders issued by the FCC.

Like with many other regulatory issues, the reality of the court rulings means the right forum for fixing these issues is in Congress. But Congress has been conspicuously missing from regulation since the Loper Bright ruling. There is no question that the ISPs that prompted ACA Connects to file these comments are feeling pain in the market. But even if the FCC tackles what they are requesting, there won’t be any quick fixes.

‘Tis the Season (For Layoffs)

It’s going to be a rough holiday season for a lot of industry and tech workers, as communications and tech companies have announced layoffs. According to the hiring experts at Challenger, Gray & Christmas, the layoffs announced in October were the largest in years. Employers have announced over 1 million job cuts through the first ten months of this year, already 44% higher than the job cuts for all of 2024.

Technology has the largest number of job cuts for the year, already at over 141,000. There are a number of different reasons for job cuts this year. In October, cost-cutting was the top reason for job cuts (50,437). AI was cited as the reason for 41,039 layoffs. Market and economic conditions were cited as the reason for 21,104 job cuts. The closing of stores and plants accounted for 16,739 cuts, and restructuring was the reason given for 7,588 job cuts.

Here are some of the cuts in the industry as reported by FierceNetworks:

AT&T didn’t announce any formal layoffs, but it has still seen its headcount shrink by over 5,000 positions this year, to 135,700. Many of the cuts are likely due to the new company policy mandating that people return to the office five days per week.

Charter reduced staffing by 6,600 in 2024 to reach 94,500. The company recently announced it will be cutting 1,200 more jobs, plus it closed call centers in Ohio and Massachusetts.

Comcast seems poised to reduce staffing but hasn’t announced specific numbers. Rumors are that the company is getting ready to streamline operations.

T-Mobile originally said it was going to lay off the entire staff of 4,100 that came through the acquisition of UScellular. The company ultimately kept “more than half” of these employees.

Verizon actually increased staffing in 2025, adding 800 people this year after slashing staffing for many years. However, the company told investors when announcing third-quarter earnings that it intends to reduce its costs. As I was publishing this blog, the Wall Street Journal reported that the company plans to cut 15,000 people. That’s before any impact from the upcoming acquisition of Frontier.

There are a lot of layoffs coming in other parts of the tech industry. Amazon laid off 14,000 people in October and says it will be cutting as many as 30,000 additional corporate jobs. Microsoft eliminated 9,000 positions recently, bringing it to 15,000 for the year. UPS has had the largest cuts with 48,000 jobs eliminated in 2025.

Broadband Usage 3Q 2025

OpenVault recently published its Broadband Insights Report for the end of the third quarter of 2025. OpenVault is documenting the continued growth in broadband usage by U.S. households.

One of the most useful statistics from OpenVault is the average monthly broadband usage per household in gigabytes. Below is the trend in average monthly U.S. download and upload volumes since the third quarter of 2021. These averages include broadband used by residential and small business customers. The average U.S. broadband customer used 43 more downloaded gigabytes and 7 more uploaded gigabytes per month than a year earlier. This growth means continued pressure on broadband networks because, assuming roughly 120 million broadband subscribers nationwide, it translates to over 6 billion more gigabytes of data each month than a year earlier.

One of the most interesting things about the third quarter is that overall average broadband usage was lower than in the second quarter, something that hasn’t happened since 2019.

As can be seen in the table above, upload usage has been growing at a faster pace than download usage. In a recent quarterly report, OpenVault credited the growth of upload usage to the increasing usage of video calls, cloud backup, IoT uplinks, and similar uses. To put the 7-gigabyte increase in average upload into context, it’s the equivalent of every household uploading an additional 5 standard definition movie files or 2 high definition movie files every month compared to a year earlier. I think the average household would be surprised by the volume of data they are uploading each month.
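The arithmetic behind these figures is easy to verify. In the quick sketch below, the subscriber count and per-household growth come from the report figures above, while the per-movie file sizes (roughly 1.4 GB for a standard definition movie and 3.5 GB for a high definition movie) are my own rough assumptions:

```python
subscribers = 120_000_000   # rough U.S. broadband subscriber count
extra_down_gb = 43          # added download GB per household per month, year over year
extra_up_gb = 7             # added upload GB per household per month, year over year

# Total added traffic per month across all subscribers.
added_gb = subscribers * (extra_down_gb + extra_up_gb)
print(f"{added_gb / 1e9:.0f} billion GB per month")   # 6 billion GB per month

# Movie-file equivalents of the 7 GB upload increase
# (assumed sizes: ~1.4 GB per SD movie, ~3.5 GB per HD movie).
print(round(extra_up_gb / 1.4), "SD movies")   # 5 SD movies
print(round(extra_up_gb / 3.5), "HD movies")   # 2 HD movies
```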

Another interesting statistic is the percentage of U.S. subscribers at different speed tiers. For the last several years, there has been a steady migration of subscribers from slower speed tiers to the fastest tiers. The current report documents a big jump in subscribers in the 200-499 Mbps tier and a decrease in the 500-999 Mbps and 1 Gbps+ tiers. I have to wonder if this is the impact of millions of homes migrating from cable broadband to FWA wireless – customers who seem willing to sacrifice speed to get a substantially lower price. We’ll have to see whether this trend continues, but it’s the first break in the steady upward movement into the fastest speed tiers.

OpenVault always includes other interesting statistics in its quarterly reports:

  • 39% of Gen Alpha teens spend over three hours a day gaming, while only 30% of this age group watches TV and only 28% listen to music or podcasts. Over time, this could mean a huge shift in the demand for traditional entertainment.
  • 89% of U.S. households now subscribe to at least one streaming service, which is very close to the percentage of homes that have broadband.
  • While average usage dropped a bit from the second to the third quarter, the median household usage increased from 431.4 GB to 438.9 GB.

North Carolina Provides Storm Relief

North Carolina announced a $50 million program to help ISPs that suffered damage a year ago from Hurricane Helene. The grants will be awarded through the North Carolina Department of Information Technology (NCDIT) Broadband Infrastructure Office. This is the same group that has been administering state broadband grants as well as BEAD.

The grants are available to ISPs who suffered damage in the 39 counties affected by the hurricane, plus the Eastern Band of Cherokee, who are in the far western part of the state. For anybody who has followed North Carolina and broadband, these awards are no surprise. North Carolina has been a national leader in broadband since the beginning. The state built the first state-funded middle-mile network that was designed to support broadband throughout the state and has been involved in promoting rural broadband for decades.

There is no question that networks in the region suffered big damage from the hurricane. In Buncombe County, where I live, 40% of all trees were damaged or destroyed. The hurricane devastated an estimated 822,000 acres of forests in the region. The flooding from the hurricane washed away entire towns, roads, and bridges. Mudslides wiped out a lot of homes and neighborhoods. Just within a few blocks from my home, dozens of wires were knocked off poles.

The area was aided after the hurricane by an amazing outpouring of help from across the country. Huge numbers of crews came in to replace poles and reattach wires. People who drove out of the area after the storm reported passing a non-stop caravan of utility trucks coming into town.

The quick fixes done by these crews were awesome and got the networks up and running. But anybody like me who always looks up at poles still notices a lot of work is needed to shore up the many quick fixes.

Network owners around the country might be wondering why this state grant funding is needed. In the past, FEMA has always stepped up to help utilities and telcos in disaster areas. Unfortunately, North Carolina has received only about 25% of the $59.6 billion in estimated damage costs from FEMA. FEMA doesn’t ever pay all damage costs, but it has always paid a lot more than what Western North Carolina has received this time.

To be fair, Congress allocated a lot of funding for storm relief, but the money is not flowing. This seems to fall in the same category as other billions of dollars of federal money that aren’t making it to states for a huge range of issues.

To put the storm damage into perspective, the annual budget for the State of North Carolina was $29.7 billion in 2024 and $30.8 billion in the current fiscal year. There is no way the state could ever pick up the cost of a major storm.

Perhaps the money will eventually flow, but there doesn’t seem to be much movement on the federal side. The federal reaction to this storm should be a wake-up call to every network owner in the country. Everybody has gotten used to thinking of FEMA as the backstop for covering catastrophic damage. What happens if this is no longer true?

One of the best things about electric and communications companies is that everybody is willing to send crews to help after storm damage anywhere. But those utilities have always expected to eventually get reimbursed for a significant portion of that cost. Will utilities be so quick to send crews elsewhere if they have to fully foot the cost? I would hope so, but realistically, a lack of federal payment of disaster funds could put a crimp in the amazing system of mutual aid that benefits every community when they need help. And eventually, it’s going to be your community that needs the help.

Misaligned Priorities

We have several sets of broadband priorities at odds with each other in the country. The federal government is on a big push to move all transactions with the government to digital. The example that got a lot of press was when FEMA said it would only communicate with disaster victims through emails and its online portal. But government agencies across the board are pushing folks online to communicate.

The government is also clearly supporting an AI revolution where AI is supposed to revolutionize the way we work and live. According to federal government rhetoric, we are a little bit ahead of the Chinese in terms of AI development, and politicians seem to support the idea of doing whatever is needed to make sure that the U.S. wins the AI race.

At the same time that we are prioritizing AI and moving everything online, we seem to be deprioritizing broadband. NTIA cut the BEAD program funding in half to save money, at the expense of building new networks that would provide solid infrastructure for the next fifty years. The Administration outright killed the Digital Equity Act, which had the goal of getting computers into people’s hands and training them how to use them.

These goals are clearly at odds with each other. Consider the Digital Equity funding. There is a huge lost opportunity cost to society when people aren’t given the tools to enter the digital world the government is pushing them into. Digital equity folks can rattle off tons of stories of folks who were given help with broadband and went on to work in a tech field, start a business, become teachers, or otherwise thrive and contribute to society.

The disparity between these policies makes no sense to me. It looks to me like the Digital Equity Act was killed for the simple reason that it had the name ‘equity’ in its title. But digital equity never had any of the connotations that politicians classify as DEI. Digital Equity has always been an effort to help people learn more about and master computer technology and broadband. It makes no sense not to have digital equity as a goal if we want everybody to be able to use AI or communicate with the government online.

The BEAD grants were trimmed back for one reason only – to save money. The new Administration sent folks into every nook and cranny of the government to find ways to save money. On the surface, this isn’t a bad thing, and I have to think that many of the cuts to government expenses are good in the long run. But BEAD was never about spending money. BEAD is an infrastructure bill. There are reams of economic studies that show that spending money on infrastructure always returns more to the economy than the cost.

Just in my part of North Carolina, there are a bunch of counties where all of the BEAD awards went to satellite broadband. Set aside that Western North Carolina is mountainous and heavily wooded, and there will be homes that won’t be able to get adequate broadband from the satellites. Set aside that many of these counties have low overall incomes and many folks won’t be able to afford the satellite broadband.

The bigger issue is that building fiber is about a lot more than just bringing broadband to homes. When counties get a fiber network, they can start to get creative to find ways to leverage a new network to improve the local economy. Satellite broadband is finally starting to deliver the broadband that the average home needs to join the modern world. But satellite broadband isn’t going to support schools. It’s not going to enable a county to attract a new factory. Satellite is not going to enable a county to seek ways to improve cellular coverage. Fiber is the infrastructure needed to help the overall community, while satellite broadband just helps customers who can afford it.

I know this is probably coming across as another rant, but I know I’m right. BEAD and the Digital Equity Act were tools that could have made a big difference in rural communities. I’m pretty sure that killing these broadband programs means AI will not be coming to the rural counties in Western North Carolina. Folks here are going to fall through the cracks because they will be unable to communicate with FEMA and other government agencies. It feels like the government is excluding Western North Carolina. I don’t think this is deliberate, but by pursuing misaligned priorities, that’s exactly what is happening. The current government is making far too many decisions in a vacuum without considering the bigger picture.

Why BEAD to Kuiper?

There is no question that this has turned into one of the oddest years for broadband during my career. We’ve seen Digital Equity grants killed. We’re seeing the spending for BEAD being cut in half. And maybe oddest of all, we’re seeing States make sizeable BEAD grant awards to Kuiper, although the company isn’t close to having its first broadband customer.

You might think we should have learned a lesson from when Starlink was a big winner in the RDOF reverse auction. The FCC eventually killed those awards after it determined that Starlink was not ready to fulfill a major commitment to serve large numbers of locations in specific geographies.

As of the date of this blog, Kuiper has 153 working satellites in orbit. It has scheduled launches of an additional 72 satellites before the end of the year. It’s worth noting that previous planned launches have all been seriously delayed.

Kuiper was granted permission in July 2020 to deploy a constellation of 3,236 satellites. The satellites will be deployed at three altitudes of 370 miles, 380 miles, and 390 miles. The company says it will begin beta testing when it reaches 578 satellites deployed at 390 miles. To put this into perspective, Starlink launched commercial service when it reached 1,260 satellites. Even with that number, early Starlink customers complained about short service lapses between satellites. Kuiper is under pressure from the FCC to have 1,618 satellites in orbit by mid-2026 to maintain its spectrum licenses.

For the last month, I’ve been perplexed by the magnitude of the BEAD awards being made to Kuiper. As of the date of this blog, Kuiper has tentative BEAD awards for 324,000 locations, second only to Starlink at 427,000. The next biggest award winner is Comcast with 233,000 locations. These are tentative awards, and NTIA is still reviewing and may reject some of the tentative awards to fiber, which would likely increase the awards to satellite.

Kuiper is not nearly as ready as Starlink was with RDOF. The RDOF reverse auction closed at the end of 2020, and Starlink invited selected customers from its waiting list to try the service in January 2021. Starlink started taking pre-orders nationwide in April 2021. Starlink finally reached every part of the lower 48 states in March 2022.

One of the oddest things about Kuiper is that nobody knows how fast the service will be until it is deployed. Early Starlink customers received speeds that were faster than advertised, but speeds went downhill quickly as customer additions outpaced satellite deployments.

Starlink is only now on the cusp of delivering consistent 100/20 Mbps broadband. According to a report from Ookla, Starlink speed tests in the second quarter of 2022 showed a median download speed of 53.95 Mbps, meaning half of customers had faster speeds and half had slower. Median upload speeds in that quarter were 7.5 Mbps. In the first quarter of 2025, Ookla reported that Starlink had climbed to a median of 104.71 Mbps download and 14.84 Mbps upload, nearly double the speeds in 2022. The Ookla report said that only 17.4% of Starlink customers fully met the FCC definition of broadband of 100/20 Mbps, with the limiting factor for many customers being slow upload speeds.
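The arithmetic behind those Ookla numbers is simple to check. Here’s a quick sketch using Python’s statistics module – the list of individual speed tests is made up for illustration (Ookla doesn’t publish raw samples), but the reported medians are the figures cited above:

```python
import statistics

# Illustrative sample of download speed tests in Mbps (made-up values,
# not Ookla's raw data) to show what a "median" speed means
samples = [88.2, 95.0, 101.3, 104.71, 108.9, 120.4, 76.5]

# The median is the midpoint: half the tests are faster, half slower
median_dl = statistics.median(samples)

# Ookla's reported Starlink medians, Q2 2022 vs. Q1 2025
dl_2022, dl_2025 = 53.95, 104.71
ul_2022, ul_2025 = 7.5, 14.84

print(f"sample median: {median_dl} Mbps")
print(f"download growth: {dl_2025 / dl_2022:.2f}x")  # ~1.94x, "nearly double"
print(f"upload growth: {ul_2025 / ul_2022:.2f}x")    # ~1.98x
```

Both the download and upload medians grew by a factor of roughly 1.9 over those three years, which is why “nearly double” is a fair characterization.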

I saw a recent quote from a State broadband manager who, when asked why he made an award to Kuiper, said it was because Kuiper bid the lowest cost per passing. That seems like a cynical response, and it makes me wonder if State Broadband Managers have thrown up their hands and are just following NTIA’s rules without question.

The chances are good that Kuiper will complete the constellation and will eventually deliver satellite broadband. But history has also shown that new technology companies are often late in meeting commitments and sometimes fail altogether. The BEAD grant process is taking a big chance that Kuiper will meet its obligations and that speeds will be reasonably fast – something that nobody can know until it happens.

Network Timing

One element that is key to all networks but rarely gets discussed is network timing. Network timing (or network clocks) involves hardware or processes that make sure all parts of a network are in sync.

Timing and synchronization are critical for network services that depend on precise, synchronized timing across network devices. Accurate and reliable synchronization helps maintain the security, availability, and efficiency of those devices. Timing is essential for the function of telephone, cellular, and broadband networks.

There are multiple kinds of timing in use.

Frequency Synchronization. This makes sure that all electronics inside a network operate using the same clock rate or frequency. Many kinds of network gear come with built-in clocks, and having different parts of a network using different clocks will result in data loss, corruption, or misinterpretation of bits. Frequency synchronization forces all of the clocks inside the network to operate in unison by matching the frequency of each clock to a source clock. There are different sources for frequency synchronization:

  • Synchronous Ethernet (SyncE) distributes the frequency of one chosen clock over the Ethernet physical layer and forces the other clocks to match.
  • Networks can be synchronized to external clocks such as a BITS (Building Integrated Timing Supply) clock or the GPS satellites. A BITS clock can choose any reliable external source.
  • Many networks use Precision Time Protocol (PTP), which distributes timing over the packet network itself and eliminates the danger of losing the connection to an external clock.
  • A network can use a free-running internal oscillator chip that holds an accurate clock.
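To see why frequency synchronization matters, consider what happens when two free-running clocks drift apart. The back-of-the-envelope sketch below uses illustrative numbers (a 1 Gbps link and a 50 ppm clock mismatch are assumptions, not values from any particular standard):

```python
# Back-of-the-envelope sketch of why frequency sync matters.
# The line rate and ppm offset are illustrative assumptions.

line_rate_bps = 1_000_000_000   # 1 Gbps link: one bit every 1 ns
ppm_offset = 50                 # 50 ppm mismatch between two free-running clocks

# Relative drift between the two clocks, in seconds per second
drift = ppm_offset / 1_000_000  # 5e-5

bit_period_s = 1 / line_rate_bps

# Time until the receiver's clock has drifted by one full bit period,
# at which point a bit is dropped or duplicated (a "bit slip")
time_to_slip_s = bit_period_s / drift

print(f"one bit slip every {time_to_slip_s * 1e6:.0f} microseconds")  # 20
```

Even a tiny 50 parts-per-million mismatch produces a slipped bit every 20 microseconds at gigabit rates, which is why unsynchronized clocks lead to the data loss and corruption described above.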

Many networks have used GPS for frequency synchronization. A GPS satellite carries a highly stable atomic clock that provides precise time signals, which a GPS receiver can convert into frequency references. While the atomic clock provides highly precise time and frequency information, GPS is less reliable when a receiver lacks a clear view of the sky, such as during severe weather.

Phase Synchronization makes sure that the phase of a network signal is consistent throughout the network. Phase refers to a specific point in time on a waveform cycle. Phase synchronization ensures that electronics agree on the timing of the start and end of each bit in a data stream. This is critical in applications where data from multiple sources have to be combined or compared, such as in a cellular network.

Time Synchronization, also called Time of Day (ToD), ensures that all electronics agree on the current time, which is critical in applications where events must be accurately timestamped. Networks differ in their need for precise time. Network Time Protocol (NTP) can provide millisecond accuracy, while PTP can provide nanosecond accuracy along with phase synchronization.
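NTP’s accuracy comes from a four-timestamp exchange between client and server, from which the client estimates both the round-trip delay and its own clock offset. The sketch below shows the standard arithmetic from the NTP specification; the timestamp values themselves are made up for illustration:

```python
# Sketch of the clock-offset arithmetic at the heart of NTP (RFC 5905).
# The four timestamps (in seconds, made-up values for illustration):
t1 = 100.000   # client sends request (client clock)
t2 = 100.120   # server receives request (server clock)
t3 = 100.121   # server sends reply (server clock)
t4 = 100.004   # client receives reply (client clock)

# Round-trip delay: total elapsed time minus the server's processing time
delay = (t4 - t1) - (t3 - t2)

# Estimated offset of the client clock relative to the server,
# assuming the network path is symmetric in each direction
offset = ((t2 - t1) + (t3 - t4)) / 2

print(f"round-trip delay: {delay * 1000:.1f} ms")   # 3.0 ms
print(f"clock offset:     {offset * 1000:.1f} ms")  # 118.5 ms
```

Note the symmetric-path assumption: if the request and reply take different amounts of time to cross the network, half of that asymmetry shows up as error in the offset estimate, which is one reason NTP tops out at millisecond accuracy while hardware-assisted PTP can do much better.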