Pushing the Speed Limit

Today’s blog is about several new fast broadband deployments. It seems that every year vendors develop new technologies that speed up our networks and broadband connections.

The first was an announcement from AT&T that the company completed a live test of a 1.6 terabit fiber connection on a route between Newark and Philadelphia. The connection was tested over AT&T’s long-haul network that was also running 100 Gbps and 400 Gbps links.

The fast link was created by combining two 800 Gbps links created by white box hardware operating with the Broadcom Jericho3 packet processor chip. The two links were combined using Ciena’s WaveLogic 6 Extreme coherent optical transponder. At the two ends of the link, the signal was processed by 800G DR8 pluggable transceivers from Coherent, which created the cross-connectivity to communicate with other packet and optical technologies.

This new link is four times faster than the 400 Gbps lasers that are being installed nationwide as the newest iteration of long-haul and middle-mile networks – replacing the 100 Gbps lasers that were the standard for the last decade.

Lumen announced that the company successfully created a 1.2 terabit connection on an 1,800-mile link. The link was accomplished on what Lumen calls its ultra-low-loss (ULL) network. The test used Ciena’s WaveLogic 6 Extreme technology and Ciena’s Waveserver platform along with Juniper 800 Gbps routers.

The AT&T news release of the test quoted Mike Satterlee, VP of Network Infrastructure and Services, as saying that these tests are vital for AT&T to keep up with future demand. He was quoted as saying that AT&T expects overall long-haul network traffic to double by 2028.

The other groundbreaking speed trial was conducted by T-Mobile. The company was able to achieve a 6.3 Gbps connection on a Samsung Galaxy S25 cell phone. The phone was using the Snapdragon X80 5G modem-RF system. The 6.3 Gbps test was achieved in the lab, and the speed achieved on a real-world 5G network was 4.3 Gbps. A second test achieved the same speeds using a non-commercial handset that used Qualcomm’s X85 5G Modem.

The network test was conducted using Nokia’s 5G radio access network (RAN). The speed was achieved by aggregating 2.5 GHz, AWS, and 600 MHz spectrum. The test was not as much about speed as it was about the ability to combine multiple frequencies to create a high-bandwidth path.

These trials are proof that carriers are constantly pushing vendors to develop the next generation of network gear that brings greater capacity. Middle-mile and long-haul routes are under strain from unexpected traffic from AI data centers. But long-haul network operators are also reporting a big uptick in requests for 100 gigabit data connections across markets that are unrelated to AI.

Fixing Urban Cellular Coverage

Anybody who lives in an urban or suburban area knows that cell coverage is not the same everywhere. There are neighborhoods with great cell coverage, neighborhoods with so-so coverage, and neighborhoods with little or no coverage. Nobody understands this better than first responders and city employees who work in all parts of a city.

This is all due to the physics of cell coverage. The FCC has purposefully restricted cell towers to low power levels in order to create discrete coverage areas or cells. This was done so that neighboring towers don’t interfere and cancel each other out. Coverage is also affected by the specific frequencies being used by cell carriers, with some of the higher frequencies used for 5G having shorter coverage distances. Another important factor that affects cell quality is the number of users in a neighborhood. Anybody who lives close to a busy road or a high school knows there are certain times of the day when coverage gets worse due to heavy cell usage.

The final factor that creates cellular deserts is the placement of cell sites. The big tall cell towers were located years ago largely to cover highways – not where people live. This was done due to a compensation system where carriers got wealthy from carrying vehicle roaming traffic for other carrier networks. Cell towers have also often been forced to locate on taller hills or away from residential neighborhoods that didn’t want a giant unsightly tower in their backyards. Unfortunately, cities are now largely stuck with the original cell tower configuration.

A lot of the poor coverage can be solved with the placement of additional small cell sites to fill in neighborhoods with poor coverage. You might recall five years ago when the carrier industry promised to build a million small cell sites. For various reasons that never happened. The primary reason came when carriers realized they weren’t going to make any incremental new revenues from 5G, and they lost interest in investing in cell site infrastructure.

The good news is there is a way for cities to tackle the cellular coverage issue. My consulting firm recently helped a city in a major urban area that knew it had poor cell coverage. Using various tools, we were able to fully map all of the important factors that measure cell phone call quality.

We were able to create separate coverage maps for AT&T, T-Mobile, and Verizon, which is important because every carrier has distinctly different coverage areas based on the specific cell sites and frequencies they are using. Probably the best result of this study was a map that showed the unfortunate neighborhoods where all three carriers have poor coverage. A map overlaying poor cell coverage and household incomes was also eye-opening.

There are a lot of consequences of poor cellular coverage. The national statistics show that about 11% of homes have no home broadband and must rely on cell phones as the only source of broadband. Folks who live in neighborhoods with weak cell coverage can’t use their cell phone indoors. First responders struggle in these communities. Delivery companies struggle to find addresses when they lose cell and GPS coverage. Folks who can’t afford home broadband and who live in cellular deserts have the worst of all worlds for connectivity and are stuck having to seek out public WiFi for connectivity.

We think cities will find a cellular mapping study to be invaluable. For the first time, they’ll be able to visualize cellular coverage in ways they can understand. Armed with coverage maps, cities can have conversations with carriers about addressing some of the worst coverage. A next logical step might be forming public-private partnerships or economic development initiatives to help fund improved cell coverage. But none of that can be contemplated until a city knows the facts.

Pew Report on Pole Attachments

Jake Varn of Pew authored a report based on a deep dive into the impact of pole attachments on broadband expansion. The report highlights how issues with pole attachments are adding extra costs to the many grant projects being funded by programs like BEAD and the Capital Projects Fund.

The report highlights what anybody who has been building networks has known for many years. The process of adding fiber to poles can be horrendously complex and time-consuming. The rules on how to get onto poles differ significantly by state, and also by pole owner. Twenty-three states and Washington D.C. manage the pole attachment process themselves, while the rest of the states follow rules established by the FCC.

Every step in the pole attachment process can be expensive and time-consuming. Applications to attach fiber to a pole often must be done on a pole-by-pole basis since neighboring poles can be significantly different. Many pole owners have no good inventory of poles and must send somebody out to look at each pole to compare to what’s being requested in an application. All of the costs of this process are ultimately borne by the ISP making the application.

The real issues arise when there is not enough room to add a new fiber that will meet national standards for clearance with other wires. In many cases, there’s no room because parties that added wires in the past did not follow standards when placing their wires, and the pole owners never made sure it was done right. The new attacher must pay to fix these old errors.

The most expensive pole attachments come when a pole is too old or too full of wires to add a new fiber, in which case the pole must be replaced. The applicant is saddled with the entire cost of replacing the pole, although a federal order last year tried to soften that impact. But the real damage from having to replace poles is the time required, which differs by state and pole owner.

As the report highlights, the process to do the make-ready work just to get ready for fiber construction can take a long time, which isn’t compatible with the construction deadlines required by the many grant programs. There are already examples of ISPs who have returned RDOF and other grant awards once they realized that the poles were in worse shape than they thought, or they learned the pole owner was not going to be cooperative.

The Pew report made some great recommendations, which are aimed at states and federal policymakers and pole owners:

  • The process would be sped up if there was a requirement to create an electronic inventory of the basic data about every pole – how old, how tall, how many existing attachments? Some pole owners have done this, and it really eases the process.
  • Pew recommends that states standardize the application and permitting process to eliminate the maze of different paperwork and processes needed. A single broadband project might include poles on city, county, and state roads, with each jurisdiction requiring a different paperwork process and forms.
  • Pew recommends that governments form rapid response teams to resolve disputes or delays in the pole attachment process.
  • Pew recommends that states consider pole replacement funds to replace bad poles. They also recommend that states take advantage of any disaster relief funding to replace poles damaged by storms or other disasters.

There is still time for state and local governments to act on these recommendations – but they need to act now to ease the process for ISPs trying to bring better broadband to rural areas.

In full disclosure, I provided some input to this report.

An Industry on Hold

I keep seeing articles or podcasts every week speculating on what the new administration and Congress might change in the $42.5 billion BEAD grant program. This all seems like speculation to me since only a few people really know what might happen, and I don’t think they are talking. I don’t think any of the pundits know any more about what will happen to BEAD than what I included in a tongue-in-cheek blog last year that included a BEAD bingo card.

There is one thing that definitely has occurred. A large chunk of the industry that was expecting to participate in BEAD is largely on hold.

That obviously includes the many ISPs that have filed or plan to file BEAD applications. There is a huge amount of speculation that any significant changes to BEAD will mean repeating the BEAD application processes in the three states that have already announced awards and the twenty-plus with open BEAD grant windows.

While State Broadband Offices are marching forward with the BEAD process, they are all spending a lot of energy speculating on what they might have to rework – and worrying that they’ll not have enough money to do this all a second time.

The group feeling the most pain is the vendors expecting to sell to BEAD grant winners. This group already had a letdown when many of them guessed at the beginning of 2024 that there would be BEAD grants made last year. They now see the process entering April 2025 with no idea of when grants will be made and when ISPs might start ordering equipment. The one thing they are now seeing is that the money might finally get released for a lot of states at the same time instead of BEAD awards dribbling out over a year.

Another group that is getting very concerned is elected officials in counties across the country. A lot of counties devoted significant resources to participating in the BEAD process. Many states gave counties some power in choosing BEAD winners by awarding a lot of grant points for local endorsement and local funding. A lot of counties have made broadband grants to ISPs that are contingent on them winning BEAD – and many of those grants are from ARPA funding that has a ticking clock and expiration date. There are also a lot of rumors flying around that the federal government might claw back unspent ARPA funds.

Everybody is on hold for the big decision of how much BEAD funding will go to satellite. Will it be 5%, 10%, 20%, 50%, or 80%? I’ve heard industry pundits making all of these guesses. That split is vitally important to both the ISPs and the vendors. The group most worried about this is local elected officials, who almost universally want fiber built in their counties.

The other big question that has everybody in knots is how much of the process can be changed by NTIA versus what needs to come from Congress. There is legislation still in the early stages in the House that addresses the issues, with other lawmakers drafting alternate ideas. While Congress could act quickly on this if they want to, they have a lot of other big issues on their plate right now.

Interestingly, the first Congressional bill on the issue is called the SPEED Act, but none of this is feeling very speedy. But who knows? Edicts could come down quickly and State Broadband Offices could issue grants quickly if there isn’t a lot of paperwork involved in reshuffling the rules. Meanwhile, my original BEAD bingo card is still intact.

Regulatory Shorts March 2025

Regulatory announcements seem to be hot and heavy these days, mostly related to a new administration.

USF Fee Increase. The FCC recently voted to increase the end-user fee used to fund the Universal Service Fund. The new fee for the second quarter of 2025 is now calculated at 36.6% of interstate telecommunications revenues. Everybody in the industry recognizes this fee as unsustainable, and there has been a lot of discussion, but no action taken to expand the base on which the fee is calculated. This may come to a head later this year since the Supreme Court is currently reviewing a case that challenges whether the Universal Service Fund structure is constitutional.
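For readers who like to see the math, here’s a minimal sketch of how the contribution factor lands on a bill. The 36.6% factor comes from the FCC announcement; the bill amount below is a made-up example, not real data:

```python
# Sketch of how the USF contribution factor applies to a bill.
# The fee is assessed only on the interstate portion of telecom revenues.
def usf_fee(interstate_revenue, contribution_factor=0.366):
    """USF pass-through charge on the interstate portion of a telecom bill."""
    return round(interstate_revenue * contribution_factor, 2)

# A hypothetical bill with $20 of interstate long-distance charges:
fee = usf_fee(20.00)  # $7.32 added to the bill
```

Because the factor applies only to interstate telecom revenues, a shrinking base drives the percentage ever higher – which is exactly why the industry keeps talking about expanding the base.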

Net Neutrality Now a State Issue. Add Pennsylvania to the list of states that are considering a state version of a net neutrality law in a new bill introduced into the legislature. The current states that have adopted net neutrality rules are California, Colorado, Maine, Oregon, Vermont, and Washington. One of the interesting features of the Pennsylvania legislation is that it would classify ISPs as utilities, opening them to more state regulation.

Changing Regulators. FCC Commissioner Geoffrey Starks has announced he’ll be leaving the FCC some time this spring. Commissioner Starks is a Democrat, and the new administration is required by law to replace him with another Democrat. The Senate has not yet confirmed Olivia Trusty, a Republican who is slated to fill the current open spot.

Adam Cassady has been named as the acting administrator of NTIA. Adam was formerly the chief of staff for FCC Commissioner Nathan Simington. The acting administrator slot is open until the Senate votes to confirm Arielle Roth.

Administration Support for the USF. The Trump administration filed a brief with the Supreme Court on March 13 that defends the Universal Service Fund and says that the claim that the USF is unconstitutional is a strawman. There are legal scholars saying the Court might rule against the USF, even while the White House supports it, since this case has wider implications for other federal agencies.

Tower Dumps are Unconstitutional. A Mississippi federal judge has ruled that the law-enforcement technique known as a ‘tower dump’ is unconstitutional. Law enforcement agencies have often pulled large swaths of calling data in the area of crimes in the hope of identifying suspected criminals. In the case that triggered the order, the FBI had requested search warrants to pull cellular data from four carriers across nine cell towers in the hope of being able to sift through records to identify local gang members.

FCC Issues Shutdown Contingency Plan. Acknowledging this crazy year, the FCC issued a Plan for Orderly Shutdown Due to Lapse of Congressional Appropriation. The document surprised me because the FCC is self-funded through regulatory fees, and I assumed that meant they would not be subject to a general government shutdown. And perhaps that is the case since the FCC shutdown would not happen “If prior year funds are made available.” I guess that raises the possibility that the FCC’s own funds could somehow be frozen in a larger shutdown.

How Low Can They Go?

AT&T and Verizon continue to aggressively eliminate staff. You have to wonder where the bottom will be in staffing levels.

In September 2024, Verizon announced that it would cut 5,000 positions. As of January 1 of this year, the company had 99,600 employees, down 5,000 from the beginning of 2024. As of January 1 of this year, AT&T had 140,990 employees, down 8,910 people during 2024. At the beginning of 2000, the two companies employed over 475,000 people, and since that time have shed a little over half of their employees.

The following graph shows the employees of the two companies since 2000. Verizon has steadily cut full-time employees during this century. The graph doesn’t show any disruption from Verizon’s purchase of AOL in 2015 and Yahoo in 2017. The graph also doesn’t tell the whole story since Verizon has also outsourced positions during this time. I recall a controversy at the end of 2018 when the company outsourced 2,500 IT jobs to India.

AT&T employee counts are a lot more complicated since AT&T acquired a lot of companies this century, including BellSouth in 2006, Leap Wireless in 2013, DirecTV in 2015, and Time Warner in 2018. AT&T subsequently shed both DirecTV and Time Warner. Even with the turmoil caused by purchasing and ditching subsidiaries, AT&T has steadily been eliminating staff.

Both companies are currently actively striving to eliminate copper networks, with Verizon much further along in this effort than AT&T. However, Verizon is slated to merge with Frontier sometime this year, which will bring new employees and a return of a lot of the copper networks that Verizon offloaded to Frontier in the past.

Both companies also say they are considering how AI might streamline operations, which probably means even further cuts in staffing over the next few years.

This is all a far cry from the time when AT&T was the telephone monopoly and had over 1 million employees, making it the biggest employer outside the U.S. military. It’s anybody’s guess how much more these companies can slash staff and remain viable.

What’s the Future of Keyboards?

My consulting firm does surveys, and I want to highlight the results from a recent one. This was a random survey of a cross-section of the community with statistically valid results, so the results are reasonably believable.

We asked survey respondents the number of hours per day they use a cell phone and a computer or tablet. The following chart shows the responses by age. These results are not as accurate as studies that require people to keep a usage log, since the numbers tabulate the hours people think they use devices. Note that these statistics come from just one survey.

These results reinforce a few things I’ve been reading in various studies. Those over 65 are still using devices for fewer hours per day than younger age groups. Those 18 to 34 are using devices more than older folks, on average.

The response that I want to highlight is the big shift among those under 34 to using cell phones instead of computers. Just a few years ago, our surveys showed an even split of device use for this age group. Going back a few more years, usage would have been weighted toward computers.

Unfortunately, our surveys don’t reach those under 18, but everything I’ve been reading says that teens and younger kids have migrated to cell phones to a greater degree than shown by this survey for 18-34 year olds. Kids are not just using cell phones – they are talking to them and rarely use a phone’s keyboard.

That’s the phenomenon that makes me ask if we are seeing the beginning of the end for typing into computers. I’ve been reading science fiction my whole life, and a constant prediction of the future has always been communicating with computers by voice.

The recent advent of AI is likely to increase this trend away from typing. I’ve been promised a good computer assistant since I used Ask Jeeves in 1996. No good software has ever come along that isn’t more work than doing things myself, but with AI that is likely to change soon.

There is no denying that younger folks are already making the transition away from typing and now prefer the smartphone. Friends of mine with younger kids say they complain loudly about having to use a keyboard. I’m clearly old school. I spend four hours or more a day writing and a lot of time working on complicated spreadsheets. My brain is completely trained on using a keyboard for those functions, and I’m not sure I’d want to try to transition to talking. But I love voice-to-text on my phone and I see the appeal of using voice for other functions.

It’s not hard to envision a reasonably near future where people will transition more from keyboards to talking. The future choice will not be between computer and cell phone, but a choice of the best screen to use for various functions. Unless we finally get functional glasses or holograms that can display anywhere we go. Give me the whole package and maybe I’m ready to talk.

Making it Easier to Kill Copper

The FCC recently enacted four rule changes to make it easier for legacy telcos to walk away from copper networks. These changes were adopted by the FCC’s Wireline Competition Bureau, meaning they did not come to the full Commission for a vote. While there have been regulatory changes in the past ordered by the various Bureaus within the FCC, it’s unusual for changes of any real importance to be enacted without a full Commission vote.

One order allows a telco to turn off copper wires without having to conduct a test to first see if a replacement technology can take over the functions that were being performed by copper. The requirement for having such tests is not eliminated, but the order gives telcos ways to justify not performing the tests.

In rural areas, AT&T is largely replacing copper with FWA wireless. But as anybody who lives in rural America knows, there are huge areas where there are no cell towers and no cellular coverage. The rule being clarified is one that came from the FCC’s 2016 Technology Transition order that requires a telco to prove that a replacement technology can match or exceed the performance of the copper network. The clarification of this new order is that the telco can justify tearing down the old network by saying that the ‘totality of the circumstances’ proves that the change is needed and not conduct testing. We’ll have to see how that works in practice, but it seems like a way to remove copper without having a replacement as long as some adequate number of homes in an area will have a replacement.

Another new order makes it easier for telcos to grandfather copper services. Grandfathering is a term used when a telco agrees to continue selling a product to existing customers while not offering it to new customers. The new rule eliminates the FCC paperwork required to grandfather a product.

Another order provides a two-year moratorium on telcos having to disclose and seek public opinions on changes made to copper networks. This change was precipitated by the more than 1,100 such changes that were filed with the FCC since 2021, for which there were no objections or public feedback.

The final new order approved a petition filed by USTelecom on behalf of AT&T, Verizon, and Lumen. The petition asked the FCC to waive the rule requiring telcos to offer standalone voice to replace voice lost when a copper network is torn down. The telcos instead want to be able to offer customers a bundle of services that probably includes FWA wireless bundled with voice. The FCC granted the waiver for two years, with a provision that the waiver can be extended.

Taken altogether, these changes eliminate a lot of paperwork involved with tearing down copper networks and remove the paperwork delays in the process. All of the big telcos are actively killing copper networks, with the latest big plans coming from AT&T to kill all copper by the end of 2029.

FCC Chairman Brendan Carr said that these changes are only the beginning and that many more regulatory rules will be relaxed or eliminated as part of the FCC’s Delete, Delete, Delete effort.

Speed Isn’t Everything

The marketing arm of the broadband industry spends a lot of time convincing folks that the most important part of a broadband product is download speed. This makes sense if fiber or cable is competing in a market against slower technologies. But it seems like most advertising about speed is aimed at convincing existing customers to upgrade to faster speeds. While download speed matters, the industry doesn’t spend much time talking about the other important attributes of broadband.

Upload Speed. Households that make multiple simultaneous upload connections like video calls, gaming, or connecting to a work or school server quickly come to understand the importance of upload speeds if they don’t have enough. This was the primary problem that millions of households subscribed to cable companies encountered during the pandemic when they suddenly were using a lot of upload. Many homes still struggle with this today, and too many people upgrade to faster download speeds, hoping to solve the problem. ISPs using technologies other than fiber rarely mention upload speed.

Oversubscription. Home broadband connections are served by technologies that share bandwidth across multiple customers. Your ISP is very unlikely to tell you the number of people sharing your node or the amount of bandwidth feeding your node. The FCC’s broadband labels require ISPs to disclose their network practices, but nobody tells you statistics like this that would help you compare the ISPs competing for your business. The cable industry ran afoul of this issue fifteen years ago when large numbers of homes began streaming video, and many ran into it again during the pandemic. It still happens today any time a neighborhood has more demand than the bandwidth being supplied.
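To illustrate why oversubscription matters, here’s a back-of-the-envelope sketch. The node size, subscriber count, and peak-usage share below are all made-up numbers for illustration, not figures from any real ISP:

```python
# Rough oversubscription math: a node's backhaul feed is shared by
# everybody on the node, so per-home bandwidth depends on how many
# neighbors are active at the same time.
def worst_case_per_user_mbps(node_feed_mbps, subscribers, active_share):
    """Bandwidth per subscriber if active_share of them demand traffic at once."""
    active = max(1, round(subscribers * active_share))
    return node_feed_mbps / active

# A hypothetical 10 Gbps node feed shared by 400 homes, with 25% of
# homes active at the evening peak:
peak = worst_case_per_user_mbps(10_000, 400, 0.25)  # 100 Mbps per active home
```

The point of the sketch is that the same node that feels fine at 3:00 AM can feel congested at 8:00 PM – and none of these numbers appear on a broadband label.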

Latency. The simple description of latency is the delay in getting the packets to your home for something sent over the Internet. Latency increases any time that packets have to be resent and pile up. If enough packets get backlogged, latency can make it difficult or impossible to maintain a real-time connection. Latency issues are behind a lot of the problems that people have with Zoom or Teams calls – yet most folks assume the problem is not having fast enough speed.
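As a concrete example of why latency can matter more than advertised speed: a single TCP connection with a fixed receive window can never move data faster than the window size divided by the round-trip time. Here’s a quick sketch with illustrative numbers:

```python
# Back-of-the-envelope TCP ceiling: with a fixed receive window, a single
# flow's throughput is capped at window / round-trip time, regardless of
# the advertised speed of the connection.
def tcp_ceiling_mbps(window_bytes, rtt_ms):
    """Maximum throughput in Mbps for a single TCP flow."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

# An old-style 64 KB window over a 50 ms path:
ceiling = tcp_ceiling_mbps(64 * 1024, 50)  # roughly 10.5 Mbps
```

A 64 KB window over a 50 ms path tops out around 10 Mbps, even on a gigabit connection – the bottleneck is the latency, not the speed tier.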

Prioritization. A new problem for some broadband customers is prioritization. Customers buying FWA cellular wireless are told upfront that their usage might be slowed if there is too much cellular demand at a tower. Cellular carriers clearly (and rightfully) give priority to cell phone users over home broadband. Starlink customers who buy mobile broadband are given the same warning. Starlink will prioritize normal customers in an area over campers and hikers. Most ISPs say they don’t prioritize, but as AI is introduced into networks it will be a lot easier for them to do so. Over the last few months I’ve seen that several big ISPs are considering selling a priority (and more expensive) connection to gamers at the expense of everybody else.

Your Home Network. Everybody wants to blame the ISP when they have problems. However, a large percentage of broadband problems come from WiFi inside the home. People keep outdated and obsolete WiFi routers that are undersized for their bandwidth. Customers try to reach an entire home from a single WiFi device. Even when customers use WiFi extenders and mesh networks to reach more of the home, they often deploy the devices poorly. If you are having any broadband problems, give yourself a present and buy a new WiFi router.

Reliability. If operated properly, fiber networks tend to be the most reliable. But there are exceptions, and reliability comes down as much to the quality of your local ISP as it does to the technology. It’s hard to say that any factor is more important than reliability if your ISP regularly has network outages when you want to use broadband.

Technology Shorts March 2025

This blog takes a look at some of the newest technologies coming out of the lab that might eventually make a difference in broadband.

Terahertz Chips. One of the biggest hurdles to faster computing is the speed at which we can get data into and out of a chip. Scientists at Notre Dame, the Universite de Lille in France, and Nanyang Technological University in Singapore have collaborated to design a chip that uses multiple terahertz waves to vastly increase the I/O function in computers. Their findings were reported in Nature.

Terahertz waves are located between optical light and microwaves, ranging in frequencies between 0.1 and 10 terahertz. The challenge with using terahertz waves in electronics is finding a way to beam a signal to where it’s needed rather than broadcasting widely. The team is using topological photonics and a beamformer that can focus the beam in any direction within a chip. The remaining challenge is to find efficient power amplifiers and electronic oscillators that will work at terahertz speeds. The chips would be a huge breakthrough that could enable super-highspeed applications like real-life 3D holograms or self-driving cars capable of processing the huge amounts of information from multiple sensors.

New Material for Better Chips. Scientists at the EPFL’s Power and Wide-band-gap Electronics Research Laboratory (POWERlab) in Lausanne, Switzerland, found an interesting property of vanadium dioxide – it naturally changes from an insulator to a conductor at 155 degrees Fahrenheit. Further, after the material is cooled to become an insulator, it remembers what happened to it while it was a conductor. This makes it a great material for building chips because the use of a circuit through the material naturally heats it to the needed temperature to become a conductor. But turn off the power, even temporarily, and when the material cools it remembers the circuit path and data that was stored during use. These properties hold huge promise for using vanadium dioxide for long-term data storage. The switch between the two states also mimics the way that brain neurons operate, in that a circuit could be triggered only when needed, making this an interesting material for making advanced chips that mimic brain functioning.

Cooling Data Centers. Julia Carpenter, the cofounder of the new company Apheros, stumbled across a metal foam during research for her Ph.D. It turns out the foam is as much as 1,000X better as a heat sink than the standard metal used in the cooling plates that cool down electronic components, like those used in data centers. The term metal foam comes from looking at the material under a microscope, which shows a sponge-like appearance. The foam can be inserted into existing cooling plates and increase cooling capacity by 90%. The real promise of the metal foam is to use it in the primary design as a cooling element. Data centers create a huge amount of heat, and getting rid of that heat is one of the challenges of building a new data center.

End to Exploding Batteries. Researchers at Cornell have made a breakthrough that could end the problem of exploding lithium-ion batteries. The solution is to replace the normal liquid-based lithium with porous lithium crystals. The crystal structure has been tried before, but solid crystals encouraged the growth of dendrites, or crystal growths, that eventually slowed the flow of ions. The porous crystals are structured around a molecular cage of three macrocycles that allow ions to pass.