Millimeter Wave Broadband

For those who follow everything about broadband speeds, Ookla recently published an article about the deployment of millimeter wave spectrum in U.S. cellular networks. You might remember the big burst of marketing in 2019 when Verizon commercials bragged about gigabit speeds on cellphones. Those fast speeds were enabled by millimeter wave spectrum that had been deployed at the time in a handful of urban business districts. At the time, Verizon told investors that millimeter wave was going to be the future of cellular, and that cellular broadband was going to be able to compete head-on with cable and fiber networks. The company had plans on the drawing board to deploy the technology deep into neighborhoods.

As a reminder, millimeter wave spectrum uses much higher frequencies than are normally used for cellular service. Before the Verizon marketing blitz, the company had purchased a lot of 28 GHz spectrum from Straight Path and XO Communications. That’s a significantly higher frequency than the mid-range spectrum (1 – 4 GHz) used for most cellular service. It’s called millimeter wave spectrum because the radio waves are extremely short – at 28 GHz, only a little over a centimeter. AT&T also dipped its toe into millimeter wave spectrum with the acquisition of 39 GHz spectrum.
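The name is easy to verify with the standard wavelength formula (wavelength equals the speed of light divided by frequency). Here’s a minimal Python sketch, with an illustrative set of bands, showing that a 28 GHz wave is only about a centimeter long, while low-band cellular waves measure over a foot:

```python
# Wavelength = c / f. A quick check of why 28 GHz earns the
# "millimeter wave" label while low-band waves are far longer.
C = 299_792_458  # speed of light in meters per second

for label, freq_hz in [("700 MHz", 700e6), ("3.7 GHz (C-Band)", 3.7e9), ("28 GHz", 28e9)]:
    wavelength_mm = C / freq_hz * 1000
    print(f"{label}: {wavelength_mm:,.1f} mm")
```

This prints roughly 428 mm, 81 mm, and 10.7 mm respectively.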

The Ookla article points out that many of Verizon’s millimeter wave deployments are still in use, and the use of millimeter wave spectrum is growing. Ookla cites statistics compiled by its RootMetrics effort, where the company sends people to take random cellular speed tests in markets around the country. When those in-person tests are combined with the normal Ookla speed tests conducted by the public, the article shows that in the second half of 2025, 2.2% of Verizon cellular speed tests used millimeter wave technology, while 0.2% of AT&T tests used the higher spectrum. T-Mobile had virtually no millimeter wave usage.

The report demonstrates the issues with using millimeter wave spectrum. The technology can deliver gigabit speeds, but the effective distance from a transmitter is very short. RootMetrics found millimeter wave speed test connections mostly within 500 feet of a transmitter, even though the spectrum can theoretically carry for a half mile. That short distance limits the use of the spectrum to high-traffic areas where the extra spectrum can help relieve pressure on the other cellular spectrum bands. In case you’re wondering, most high-end cellphones manufactured since about 2021 include the ability to receive millimeter wave spectrum. Most of the rest of the world, other than South Korea, never activated millimeter wave spectrum in networks or cellphones.

Interestingly, this report also tells a similar story about C-Band spectrum (3.7 – 4.2 GHz). Most RootMetrics speed tests for connections using C-Band were found within a half mile of a tower, although the spectrum can theoretically carry for two miles. This is good proof that, while cellular speeds are improving, the fastest speeds are found relatively close to towers. The older spectrum bands used for cellular, like 700 MHz and 900 MHz, carry for many miles, but carry far less bandwidth.
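The physics behind this pattern is simple: at the same distance, radio path loss grows with frequency. Here is a minimal Python sketch using the standard free-space path loss formula. This is free space only, so it understates real-world losses from walls, foliage, and rain, but it shows the gap between the bands:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 299_792_458)

# Roughly 500 feet (150 m) and half a mile (800 m), the distances cited above
for label, f in [("700 MHz", 700e6), ("3.7 GHz C-Band", 3.7e9), ("28 GHz mmWave", 28e9)]:
    losses = ", ".join(f"{fspl_db(d, f):.0f} dB at {d} m" for d in (150, 800))
    print(f"{label}: {losses}")
```

At any given distance, 28 GHz loses about 32 dB more than 700 MHz (a factor of more than 1,000 in signal power), which is why the highest bands only work close to a transmitter.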

The Ookla report goes into detail about the coverage found in a few markets. For example, the report includes a map of millimeter wave coverage just south of downtown Denver that shows small pockets of good coverage next to areas with poor coverage, again demonstrating the distance limitations of the technology. The report is well worth reading.

We’re Drowning in Data

The analytics company IDC says the world will be generating 394 zettabytes of data annually by 2028 (a zettabyte is a trillion gigabytes). The majority of the energy used in data centers today is for storing some of this data in an accessible format. We don’t try to make all data available, and about 20% of the data we generate today is considered to be “hot data” that AI systems might want to draw on quickly. The remaining 80% of data is “cold data”, which we don’t put in data center storage, but which we also don’t discard, since it might still be of use in the future.
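As a sanity check on the scale, here is a quick back-of-the-envelope split in Python, using only the figures above:

```python
# A zettabyte is a trillion gigabytes (10^21 bytes).
annual_data_zb = 394      # IDC projection for 2028, from the article
hot_share = 0.20          # "hot data" kept quickly accessible
cold_share = 0.80         # "cold data" archived but not discarded

print(f"Hot data:  {annual_data_zb * hot_share:.0f} ZB per year")
print(f"Cold data: {annual_data_zb * cold_share:.0f} ZB per year")
```

Even the “cold” 80% works out to over 300 zettabytes per year that somebody has to archive somewhere.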

Today, hot data is largely stored on hard drives in data centers. Storing data for quick retrieval uses a lot of electricity to operate the hard drives, and additional energy is used to cool the data center to offset the heat generated by the electronics. There is a growing trend of storing cold data on magnetic tapes, which also require energy for heating and cooling, since tapes are best stored at temperatures between 61 and 77 degrees Fahrenheit. Tapes must also be replaced every 15-20 years by transferring the data – a labor-intensive effort.

The need to keep so much data at our fingertips to support AI means that we are literally drowning in data, and the problem is growing quickly every year. The solution to this is to find other ways to store massive amounts of data that don’t require a lot of electricity. There are several potential data storage methods on the horizon, and we’re going to need more.

One interesting possibility comes from Peter Kazansky, working at the University of Southampton in the UK. Back in 1999, while working with scientists at Kyoto University, Kazansky encountered a physical phenomenon that might provide the future for long-term data storage. The team at Kyoto found that when writing on glass with ultrafast femtosecond lasers (light pulses lasting a quadrillionth of a second), the light traveling through the glass scattered in a way they could not explain.

It turns out that the researchers had discovered hidden nanostructures within silica glass created by micro-explosions from the lasers. The lasers had created tiny holes 1,000 times smaller than a human hair throughout the glass. The eureka moment came when researchers realized they could take advantage of this phenomenon by using lasers to print complex patterns inside the glass. After many years of research, Kazansky found that he could etch patterns in the glass that could store data in five dimensions – the normal x, y, and z coordinates, plus two additional dimensions encoded in the voxels, the tiny structures that determine how the glass scatters light.

This allows for storing massive amounts of data on a piece of etched silica glass. A 5-inch glass platter (slightly larger than a music CD) can store up to 360 terabytes of data. Unlike tape or hard disk storage, this technology looks like it creates forever memory that can be stored for the future. While energy is needed to etch the glass and encode the stored data, the process of reading the data uses light and is not energy-intensive. Kazansky founded SPhotonics in 2024 to commercialize the new storage method. Currently, the data can be retrieved at a speed of 30 MB per second, but he sees a path to reach 500 MB per second, which is faster than retrieving data from tapes.
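Those two figures say a lot about how archival this medium is. A quick back-of-the-envelope in Python, using only the numbers above:

```python
platter_bytes = 360e12  # 360 TB per 5-inch platter

# Time to read one full platter at today's rate and at the target rate
for label, rate_bytes_per_sec in [("30 MB/s (today)", 30e6), ("500 MB/s (target)", 500e6)]:
    days = platter_bytes / rate_bytes_per_sec / 86_400
    print(f"{label}: {days:,.0f} days to read a full platter")
```

That works out to about 139 days at today’s speed and about 8 days at the target speed, which is fine for cold archives but nothing like hot storage.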

Of course, storing data on etched glass is not without peril. A disc can break, and a fire or other disaster at a storage facility could destroy massive amounts of data, so most data will have to be stored at multiple sites. But at least the raw materials for silica glass are cheap and readily available. Probably the bigger issue facing the world is deciding how and when to ditch data that is no longer useful. Data scientists are already tackling this question today, but they are generally cautious and side with storing rather than destroying data if there is even a slight chance that it might be useful later.

Top-to-Bottom Review of USF

FCC Chairman Brendan Carr has been promising a top-to-bottom review of the Universal Service Fund (USF), and on April 29, the FCC released a Notice of Proposed Rulemaking (NPRM) that looks specifically at the High-Cost fund mechanisms that provide ongoing subsidies to ISPs operating in very rural markets. The High-Cost fund is the USF program that most people in the country (and even the industry) don’t understand.

The High-Cost program was initially created to support telephone companies in the rural markets with the highest network costs per customer. Over time, the subsidy transitioned to provide support for rural broadband networks. Subsidies are paid today through several mechanisms: Connect America Fund Broadband Loop Support (CAF BLS), High Cost Loop Support (HCLS), and the sunsetting Alternative Connect America Cost Model (A-CAM) I, Revised A-CAM I, and A-CAM II mechanisms. Each of these plans is available to a different set of telcos or carriers, and the rules for participating in these plans are written in legalese that would probably confound most readers.

This new FCC NPRM asks a lot of questions about High-Cost support, with an eye towards possibly radically changing the program. The specific areas of questions asked by the FCC include:

Is Change Needed? The NPRM asks if these programs should be modified. It offers three options for respondents: 1) update the mechanisms to reflect the current rural broadband landscape; 2) create a new fixed-support mechanism to replace A-CAM and the other subsidies; or 3) do nothing and let the current A-CAM plans sunset over time and disappear.

Types of Support Needed. The NPRM asks about the type of support that might be appropriate in different circumstances, such as when a carrier is already providing service in an area, when a competitor appears in a rural market, or when new broadband infrastructure is being brought to a subsidy area by BEAD or other funding programs.

Impact of Satellite. The FCC asks whether the subsidy plans should give any recognition or consideration to low-orbit satellites.

Deployment Obligations. What should a high-cost subsidy recipient have to do in return for accepting the subsidies in terms of constructing infrastructure or maintaining networks?

Extension of the Current A-CAM? The payments for the current A-CAM programs will sunset between 2026 and 2028. The FCC is asking if all of the various remaining plans should be put on the same timeline.

IP Transition. The FCC wants to know if there is a role for the Universal Service Fund to be used to encourage telcos and carriers to complete the IP transition away from TDM technology.

The FCC is also looking at the other parts of the Universal Service Fund. At the end of April, the FCC voted to implement an online competitive portal where ISPs can bid to provide broadband service for schools and libraries that qualify to participate in the E-Rate program. The stated purpose for going to an online portal is to reduce waste, fraud, and abuse.

In April, the FCC issued an NPRM that asks for feedback about the Lifeline Fund. Among the FCC’s proposals in the NPRM is ending the Lifeline subsidy for telephone-only service. The agency also wants to strengthen the use of the National Verifier database that verifies eligibility. The agency was prompted into action when it was alerted that $5 million in Lifeline funds had been distributed to carriers to support service for people who had died.

Finally, in April, the FCC sought comments on the structure and operations of USAC, the non-profit agency that administers the USF.

A Quiet Success

I’m writing this blog in recognition of the 60th anniversary of Merit, a non-profit network operating in Michigan. There aren’t many network entities that can trace their beginning back to 1966 when Michigan State University, University of Michigan, and Wayne State University formed the Michigan Educational Research Information Triad (MERIT) with the goal of networking the universities together. Through a grant from the National Science Foundation that was matched by the Michigan Legislature, Merit successfully networked the mainframe computers at the three universities in 1971.

Merit played a role in the early development of the commercial Internet. In 1987, Merit partnered with IBM, MCI, and the Michigan Strategic Fund to manage NSFNet, a venture funded by the National Science Foundation to connect universities to a series of five supercomputer centers. The network was founded in 1986 and quickly bogged down at its original speed of 56 kilobits per second. By 1988, Merit had connected 13 nodes on the network to operate at the blazing speed of 1.5 Mbps. From 1987 to 1994, Merit organized a series of Regional-Tech meetings around the country to add 170 other universities and networks to NSFNet. In 1996, Merit joined numerous other universities to create the Internet2 fiber network.

Merit established an early goal to connect all educational entities in Michigan to an interconnected network. This work got a huge boost in 2010 with two federal grants that helped to build 2,287 miles of middle-mile fiber across Michigan. The Merit network today reaches throughout the state, including 70 new miles of fiber funded by a federal grant in 2022 to reach the Upper Peninsula.

This blog refers to Merit as a quiet success, which I think is an apt description. While well-known in the education and government sectors in Michigan, Merit, like other middle-mile networks, is not well-known to the public – partly because it doesn’t seek publicity, but mostly because it operates quietly behind the scenes. People recognize that their local school, library, or community college has fast gigabit broadband, but don’t know that Merit brought that broadband and manages the underlying network.

Currently, Merit is connected to 3,041 member locations in Michigan, which includes 863 higher ed locations, 633 K-12 locations, 526 government locations, 325 non-profits, 235 healthcare locations, and 212 libraries. Merit is also connected to 247 meetpoints with for-profit ISPs and carriers that use the middle-mile network to spread last-mile broadband throughout the state.

Merit’s reach is huge. The network now brings faster broadband to 1 million K-12 students and 250,000 college and university students. There is still room for growth and expansion. While Merit reaches 100% of public 4-year colleges and universities, it reaches 45% of private colleges and universities, 93% of community colleges, 99% of K-12 schools, and 8% of libraries.

Merit is not just a middle-mile fiber provider. A few years ago, Merit launched the Michigan Moonshot initiative, which aims to benefit communities through data and mapping analysis, infrastructure planning, and digital inclusion research and programs. Merit participates in Eduroam, the worldwide initiative to provide WiFi access by allowing visiting students and faculty to log on with their home-campus credentials from any other participating institution. Merit participates in the Tribal Broadband Bootcamp sponsored by the Sault Ste. Marie Chippewa Tribe, which brings hands-on experience working with fiber technology.

There are other non-profit networks around the country that connect schools, universities, and libraries. But none have been around as long as Merit, and few have the reach and impact in their state that Merit has achieved in Michigan.

Satellite Update April 2026

There is so much news and activity in the satellite sector that I find myself gathering a pile of news items each month. Here are some of the highlights from April.

Amazon Entering Direct-to-Device Market. Amazon announced it has signed an agreement to buy Globalstar for $10.8 billion. Globalstar is one of the early leaders in developing technology for providing direct-to-device services to smartphones and other devices. Globalstar currently has about two dozen satellites in orbit.

Jeff Bezos Enters the Space Data Center Race. Jeff Bezos’s rocket company Blue Origin has applied to the FCC to launch a data center in space. The application asks for approval to launch 51,600 satellites that would constitute a huge AI data center. The company argues that a data center in space will complement terrestrial data centers and will give the U.S. the edge in machine learning, autonomous systems, and predictive analytics. The satellites would be placed between 300 and 1,100 miles above Earth, with most of them higher than broadband satellites. This announcement follows a proposal from SpaceX and Elon Musk to put a million data center satellites in space.

Growing Feud Between SpaceX and Amazon Leo. We’re seeing a budding regulatory rivalry between the two American broadband satellite companies. It seems that both SpaceX and Amazon Leo file comments about anything filed by the other at the FCC. Earlier this month, SpaceX filed comments at the FCC complaining that Amazon Leo is violating the FCC’s orbital space debris mitigation plans. SpaceX claims that Amazon Leo placed several satellites 90 kilometers higher than authorized by the FCC. In a similar complaint, Amazon Leo accused SpaceX of placing satellites too low, into space authorized for Amazon Leo. Both companies have filed negative comments on the other’s plans to create a satellite-based AI data center in space.

Will Starlink Honor BEAD? A group of House Democrats sent a letter to NTIA Administrator Arielle Roth that raises concerns that SpaceX might not meet its BEAD obligations. The letter was prompted by letters SpaceX sent to various state broadband offices saying the company doesn’t want to comply with some BEAD reporting requirements. The legislators fear that Starlink will walk away from BEAD, leaving locations with no broadband alternative (although these customers can buy satellite broadband regardless of the BEAD grants).

Failed Satellite Launch. A Blue Origin rocket failed to place a satellite for AST Space Mobile into the proper orbit, and the satellite had to be de-orbited. It was expected that insurance would be used to recover the cost of the lost satellite.

Amazon Leo to Launch Service in Mid-2026? The company said earlier this month that it is still planning to begin offering broadband service by mid-2026. That seems like an extraordinary claim since the company still had around 240 satellites in orbit as of the date of this blog. By comparison, Starlink had almost 900 satellites in service when it began beta tests with customers. At the time, the beta test customers described noticeable gaps in coverage between satellites. What’s most interesting about the announcement is that Amazon has asked the FCC for a two-year delay in meeting the full deployment obligation for its first constellation of over 3,200 satellites.

Environmental Protesters. Residents who live close to SpaceX’s Starbase launch site recently protested during a meeting centered on SpaceX’s planned IPO. The residents complained about the repeated vibrations and pollution caused by regular rocket launches, along with concerns about fires sparked in the arid South Texas landscape.

Denied Spectrum Sharing. The FCC recently denied requests from multiple satellite companies that wanted to share in spectrum bands already being used by other entities. As an example of the rejection, SpaceX had asked to share in the 1.5 GHz, 1.6/2.4 GHz, and 2 GHz bands. Other satellite companies had asked to share other spectrum bands. The FCC rejection said these requests were premature and that the agency needs to revise the way it allocates spectrum to accommodate direct-to-device service.

EchoStar versus Tower Owners

One of the more interesting conflicts in the telecom industry right now is EchoStar’s fight with tower owners. The fight comes from EchoStar walking away from billions of dollars of long-term leases of cell towers to support its facility-based cellular business.

This story requires some background. It started when Dish purchased a significant amount of cellular spectrum along with the customers of Sprint’s prepaid brands, which included Boost Mobile. The sale to Dish was a requirement of the FCC agreeing to allow T-Mobile to buy the Sprint cellular business. The FCC wanted Dish to become the next facility-based cell carrier as a replacement for Sprint.

Dish borrowed billions of dollars to add to its existing debt to build a nationwide cellular network. As Dish’s satellite television business faded, the company agreed to merge with EchoStar. However, as the roll-out of the new cellular business bogged down, Dish was faced with looming problems over its $26 billion in debt and was in peril of default. In 2025, the FCC also put a lot of pressure on EchoStar and accused it of spectrum squatting, since much of its spectrum was sitting unused. EchoStar decided to walk away from the cellular business, and that involved defaulting on the many leases for cell tower space and backhaul. EchoStar realized a cash windfall when it was able to sell spectrum to AT&T and SpaceX for more than $42 billion.

A coalition of more than forty tower owners, led by the Wireless Infrastructure Association (WIA) and NATE: The Communications Infrastructure Contractors Association, has asked the FCC to force EchoStar to pay for the abandoned cell tower leases. These losses are estimated to be worth as much as $9 billion. The biggest companies in this group include American Tower, Crown Castle, SBA Communications, and Vertical Bridge.

WIA and the tower owners are asking the FCC to withhold the transfer of EchoStar spectrum to AT&T and SpaceX until the tower obligations have been satisfied.

EchoStar counters that the leases are a contractual dispute and that private contracts are outside of the FCC’s jurisdiction. It further argues that its cancellation of the leases was out of its control, making it a force majeure event, since the company was pressured by the FCC to abandon the cellular business.

This is an interesting dilemma for the FCC. There have regularly been defaults on leases of fiber, towers, and other infrastructure, and the FCC has never gotten involved in these kinds of disputes before. I have to think that if it wasn’t for the associated transfer of the spectrum, the FCC would never consider this issue.

It’s an interesting chess game. If the FCC sides with the tower owners, then the payments to EchoStar for the spectrum will be held up. Even though EchoStar would certainly take a negative FCC decision to court, the big delays in getting paid for the spectrum would likely drive the company to reach a compromise with the tower owners. If the FCC doesn’t rule for the tower owners, they will almost certainly take this to court, and may get an eventual settlement.

There was an interesting report written by the Brattle Group for WIA related to the case. The report says that the damages to the tower industry would be severe and would likely result in an increase of 5.7% to 10.7% in lease rates for others who lease towers. It also warns that this would ripple through the cellular industry and would put a crimp on future capital spending for towers and related infrastructure. The tower leases are not the only dispute related to EchoStar’s decision to abandon the cellular business, and other vendors are also seeking relief by going straight to court.


Technology Shorts April 2026

The following are some interesting technologies that might someday influence the broadband industry.

Chip-level Photonics. Researchers at the CUNY Graduate Center have developed a thin, flat chip that can convert infrared light into precise frequencies of usable light that can be focused into a narrow, precise beam. The surface of the chip is patterned with tiny structures smaller than the wavelength of light. When hit with an infrared laser, the tiny patterns convert the incoming light into a higher frequency, emitted as a narrow beam that can be steered by changing how the incoming light is polarized.

Scientists now envision a stack of different metasurfaces that could each generate a different wavelength of light to carry data inside a chip. Effectively, this could create multiple laser light beams generated inside the chip without the expensive apparatus needed to inject external laser signals into each chip. Having a range of locally generated light signals could solve the problem of trying to move massive amounts of data into and out of the chip core – which is currently the biggest bottleneck to fast computing.

Dirt-Powered Fuel Cells. A team at Northwestern University has developed a device that can generate electricity by harvesting the power created by microbes that naturally reside in the soil and break down organic matter. The fuel cell is about the size of a paperback book. The device has a disc-shaped anode that is buried in the soil, with a cathode poking out near the surface. The device is large enough to tap the natural moisture in the soil at its bottom. In testing, the device works across various soil conditions. The devices tested so far are creating 68 times more power than is needed to operate the fuel cell, meaning there is a lot of power available for other devices like agricultural sensors. The beauty of the technology is that it should work for many years without any need to replace batteries or other components, as is needed for other power sources that could be used for similar applications. A fuel cell should work as long as there is enough carbon and moisture to fuel the natural microbes.

New Laser Technology. Researchers at Tianjin University have created a new kind of optical device that can generate a light phenomenon called skyrmions. The research shows that two kinds of skyrmions can be created – donut-shaped light patterns that hold their shape – one that can be controlled by electric energy and the other by magnetic energy. The skyrmions are highly stable and resist interference, making them a good candidate for storing and transmitting data.

These new light sources could open up the use of terahertz wavelength lasers that could actively switch between the electric and magnetic modes, enabling a huge increase in the amount of data included in a laser transmission. This could provide the control of data flow needed to take full advantage of using a terahertz light source for data transmission. The key to making this work will be perfecting the constant flux between the electric and magnetic pulses.

Forever Batteries? Scientists at CSIRO, the Royal Melbourne Institute of Technology, and the University of Melbourne in Australia have developed a technology they are calling a quantum battery that can be recharged in a quadrillionth of a second and that can work across six orders of magnitude of stored energy, making it practical for real-life applications. Unlike traditional batteries that use a chemical reaction to store and release energy, a quantum battery transfers energy using quantum coherence and collective interactions rather than chemistry.

Normal batteries take a long time to recharge, and eventually the chemicals must be replaced, usually meaning replacing the battery. With traditional batteries, the larger the battery, the longer the time needed to recharge. Quantum batteries are the opposite, and the larger the battery, the faster it can be recharged. This is due to a phenomenon of collective effects between particles, which causes all of the storage units in a quantum battery to behave collectively.

The downside of quantum batteries is that the battery can also discharge all of its power quickly, and the challenge has been to find a way to control the output. The Australian team has demonstrated a full battery cycle from light absorption to storage to electrical power output at room temperature with steady-state operation, showing that quantum batteries have potential for real-life applications. The batteries can also be charged wirelessly, which opens up the possibility of remote charging.

The Push for Permitting Reform

There is currently a bill being considered in Congress that would mandate a new set of permitting requirements for wireless and wired infrastructure. The bill is H.R. 2289 – the American Broadband Deployment Act of 2025.

The bill started with the goal of making it easier to get permits for BEAD and other federally funded grant projects, but it has grown to encompass all local and state permitting for telecommunications infrastructure.

The key changes that would come from passage of the bill are as follows:

  • The legislation would create a shot clock of 60 to 150 days during which a state or local government must approve or deny a request for a permit to construct a wireless or wireline project. If the locality doesn’t respond in that time frame, the request is automatically “deemed granted” (see the sketch after this list).
  • The legislation would create similar shot clock rules for any carrier seeking a cable franchise.
  • The bill would eliminate the need for environmental impact studies or historic preservation reviews for any project not considered to be a major federal action under NEPA rules.
  • The law would make it easier to get permits on tribal lands.
  • The law would restrict permitting fees to recover only actual costs, instead of the traditional standard of reasonable costs.
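To make the shot clock mechanics concrete, here is a minimal, hypothetical Python sketch of the “deemed granted” logic described in the first bullet. The 90-day clock is purely illustrative; the bill sets clocks between 60 and 150 days:

```python
from datetime import date, timedelta

def permit_status(filed: date, today: date, locality_acted: bool,
                  clock_days: int = 90) -> str:
    """Illustrative shot-clock check; the 90-day default is a made-up example."""
    deadline = filed + timedelta(days=clock_days)
    if locality_acted:
        return "approved or denied by the locality"
    return "deemed granted" if today > deadline else f"pending until {deadline}"

# A locality that never responds loses the ability to deny the permit
print(permit_status(date(2026, 1, 5), date(2026, 5, 1), locality_acted=False))
# -> deemed granted
```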

Interestingly, the bill doesn’t address the biggest permitting issue in the rural areas where BEAD grants will be built, which is getting permits on federal lands. I’ve worked with a few ISPs where the process of crossing federal lands took almost two years. However, the law applies to state highways and state parklands, so it will be easier to build across a state park, but not a federal one.

The legislation also doesn’t address the other issues with getting permits in rural areas. While I am sure there are exceptions, most of the folks I know who are building rural fiber projects tell me that most county governments invite them in with open arms. The rural permitting problems that cause the most delays and headaches continue to be crossing railroads and bridges, which are not addressed by the legislation.

As you might imagine, this legislation is being vigorously opposed by local governments, who say the new permitting preempts local authority over public right-of-way, zoning, and permitting. They argue that the restriction of using actual costs means they can’t charge enough to pay for the longer-term monitoring and management of granted rights-of-way.

The biggest local objection to the law is that it would severely limit the ability of local officials to regulate the placement, construction, and modification of cell towers. It would effectively give cellular companies the ability to place towers in residential areas or near historic sites.

The legislation clearly reads like a wish list for the giant carriers and gives them the freedom to build what they want, where they want. The authors took a bill intended to make it easier to build grant-funded rural networks and applied it to the whole country. That feels like using a relatively small rural problem – speeding up grant construction – as a pretext to apply a sledgehammer to all construction. The majority of broadband construction happens in cities and suburbs, not in rural America, and some of those places have complicated situations that should not be lumped together with rules aimed at speeding up rural grants.

I’m sure there will be ISPs and carriers that read this who can tell horror stories of why this is needed. But I also know folks who have built a huge number of rural projects where permitting from local governments was not difficult or expensive.

It will be interesting to see if this passes in Congress. Several broadband-related bills passed the House last week, but this bill wasn’t among them. Every member of the House who votes for this is telling the local governments in their district that Congress knows more about permitting than the local folks who have been doing this forever.

Customer Reactions to Outages

On January 14th of this year, Verizon had a major cellular network outage that lasted up to ten hours and impacted more than 1.5 million wireless customers. Not all Verizon customers lost service, but the impact was felt across the country. Recon Analytics conducted a survey of 1,702 small, medium, and large Verizon business customers to understand their reaction to the outage. I can’t recall ever having seen a public survey of this type related to a single event. The results tell a sobering story for all ISPs and carriers.

The biggest impact was felt by the large businesses, and 44% of them said they were impacted by the outage. That makes sense, since they tend to operate in multiple locations across multiple states, with a lot of employees who can be impacted by a day-long outage. 33% of medium-sized businesses were impacted, as were 21% of small businesses. On the flip side, only 3% of large businesses were not aware of the outage, while 7% of midsize and 12% of small businesses hadn’t heard about it.

For all sizes of businesses, about two-thirds said the outage didn’t change their opinion of Verizon. That’s a scary statistic for anybody who sells to businesses, because it means one-third changed their opinion. About 5% to 6% of these customers said they were much more negative about Verizon, while 32% to 35% said they were somewhat more negative.

When asked if they were more likely after the outage to shop around for an alternative to Verizon, small businesses were the most forgiving, with 28% saying they were more likely to shop around, while 50% of midsize companies and 59% of large businesses said they were likely to look for alternatives. Anybody who has worked in sales and marketing knows that there is a big difference between somebody who will consider changing service and somebody who will definitely change service.

But still, the survey results have to be worrisome for Verizon. The Verizon business sales team had been pushing the story for years that the reason to choose Verizon is that the service is reliable. An outage that lasted throughout a workday is the opposite of reliable.

While this survey concerned a long cellular outage, it’s something ISPs should also be aware of. Not only business customers, but also residential customers now believe that cellphone and broadband coverage should always be on, and never off. My friend Travis, who owned US Broadband, tells the story of how he had a ten-minute outage in the middle of the night in Minneapolis, after years with no outages, and his negative Google reviews went through the roof.

It’s virtually impossible to never have a network outage, at least in some portion of a network. The Verizon outage of ten hours was particularly negative for the public, especially after the company eventually said the cause of the outage was some unspecified software issue. We don’t expect big companies to have software problems they can’t resolve quickly.

This outage raises the issue of what ISPs and carriers should do after an outage. The first step is probably having an internal definition of what constitutes an important outage – is it an outage that lasts ten minutes, an hour, or half a day? Should an ISP pretend short outages didn’t happen, or should it fully explain outages to every customer who was impacted?

Indoor Cellular Coverage

In an experience that is probably familiar to everybody, in the last few months I’ve found myself unable to get a cell signal in places I routinely visit. At the pharmacy, the only cell coverage I could find was directly next to the front windows. I went to my doctor and found I couldn’t get any reception while biding my time in a waiting room. There was no signal in the back half of the grocery store.

I know my experience is not unique, and I regularly see other people grumbling in these locations about the lack of cell signal. It seems extraordinary in today’s world, where people want nearly ubiquitous cellular coverage, to find so many places with poor or no cell coverage indoors.

My first reaction to this was surprise, since my cellphone data speeds are easily ten times faster than they were a decade ago. The fact that I can’t get indoor cellular coverage is a reminder that cell reception depends a lot more on the strength of the signal than on the speeds being delivered.

There are several reasons why indoor cell coverage is getting worse. One big reason is that cell carriers have been migrating to higher frequencies. Years ago, cellular networks widely used frequencies like 700 MHz and 900 MHz, which easily penetrate buildings. The higher frequencies used today do a much worse job of penetrating buildings. A second reason is that the building materials used in newer or upgraded buildings deflect a lot of the cell signal. Modern insulation materials are generally less friendly to cell signals, and Ookla recently documented that low-E glass, which is used in many new buildings to reflect heat, reflects cell signals along with the heat. The bottom line is that you aren’t imagining it if you notice that indoor cell coverage isn’t as good as it used to be.
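To illustrate the effect, here is a toy indoor link budget in Python. The penetration losses and the usability threshold below are hypothetical round numbers chosen for illustration, not measured values:

```python
# Hypothetical penetration losses (dB) for one exterior wall; illustrative only
ASSUMED_WALL_LOSS_DB = {"700 MHz": 5, "3.7 GHz C-Band": 15, "28 GHz mmWave": 35}

outdoor_signal_dbm = -85   # assumed signal strength just outside the building
usable_floor_dbm = -110    # assumed minimum for a workable connection

for band, loss_db in ASSUMED_WALL_LOSS_DB.items():
    indoor_dbm = outdoor_signal_dbm - loss_db
    verdict = "usable" if indoor_dbm >= usable_floor_dbm else "dead zone"
    print(f"{band}: {indoor_dbm} dBm indoors -> {verdict}")
```

With these assumptions, the low band survives the wall while the millimeter wave signal does not, which matches the experience of losing signal a few aisles into a store.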

There are several possible fixes for this, but they aren’t cheap and aren’t widely deployed. One is for businesses to invest in a cellular repeater on the roof, aimed downward to provide better cell coverage inside the building. Cell carriers have been pushing this technology for years, and many hotels, convention centers, and office buildings are willing to pay the capital costs and recurring fees to provide better indoor cell coverage. But grocery stores, hardware stores, doctors’ offices, and pharmacies aren’t willing to make this kind of investment.

Another alternative is to provide free public WiFi inside large buildings. Many businesses where customers spend significant time do this today. A large percentage of the restaurants I visit have WiFi for customers, but most require a customer to find the password and log in, something I’m rarely willing to do during a quick trip to the grocery or pharmacy. Very few stores offer WiFi that doesn’t require a password.

There has been talk for years of implementing Hotspot 2.0, a technology that allows a subscriber to automatically connect to any WiFi router that is part of a larger Hotspot 2.0 network. Every year we hear of a few smaller cities or ISPs that put together this kind of network. However, the idea has never gotten enough traction to bring it to larger markets. I’m sure the issue is figuring out a way to monetize the effort to cover the costs of implementing it.
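The core mechanism of Hotspot 2.0 is easy to sketch: a device automatically joins an access point that advertises a roaming consortium matching credentials already on the phone, with no password prompt. Here is a toy Python sketch of that matching logic; the identifiers and helper are made up for illustration:

```python
def can_auto_connect(device_consortium_ids: set[str], ap_advertised_ids: set[str]) -> bool:
    """True if the phone holds credentials for any consortium the AP advertises."""
    return bool(device_consortium_ids & ap_advertised_ids)

phone = {"AA-BB-CC"}                       # hypothetical ID from the phone's home network
coffee_shop_ap = {"AA-BB-CC", "11-22-33"}  # hypothetical IDs the AP advertises
print(can_auto_connect(phone, coffee_shop_ap))  # True -> joins without a login page
```

The hard part isn’t the matching; it’s getting enough networks and credential issuers to participate, which is the traction problem noted above.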

Another new concept for improving indoor coverage is to allow access to a neutral 5G host. This would involve a third-party infrastructure provider building, owning, and operating shared cellular infrastructure for buildings that could be used by any cellular carrier. The neutral 5G host would likely want some up-front money from building owners, but would also expect to charge the cell carriers for the extra reach provided to their networks.

One interesting technology solution is the use of small cellular repeaters that would work in conjunction with a neutral host. Ericsson markets a repeater called the Radio Dot, shown at the top of this blog. Repeaters can be distributed throughout a building to make sure the cell signal reaches all needed spaces, much as is done with WiFi extenders.