
Rural Cellular Coverage

2021 is going to go down in history as the year when the whole country finally started talking about rural broadband. The pandemic made it clear that many millions of households and even entire communities don’t have adequate broadband. Congress and the White House responded by funding billions of dollars for improved broadband in the ARPA legislation. We are perhaps edging closer to an infrastructure bill that would allocate tens of billions of additional dollars toward closing the rural broadband divide.

An issue that seems to have fallen off the radar is rural cellular coverage. In 2020, the FCC halted its plans to fund a reverse auction for a $9 billion 5G Fund that was intended to improve rural cellular coverage. The program was put on hold after it became apparent that Verizon and others were reporting inaccurate cellular coverage data to the FCC – much like the problem we have with FCC broadband maps. In July of this year, the FCC started another round of data collection, so the plans for the 5G Fund must still be in play.

The topic has come to my attention recently as my consulting firm was doing several surveys in rural counties. In these surveys, between 30% and 50% of survey respondents said that they had little or no cellular coverage at their homes. In all of these cases, the FCC data from cellular carriers showed ubiquitous coverage throughout each county.

To be fair, the FCC reporting for cellular coverage is outdated. In its annual report to Congress, the FCC reports the percentage of homes and businesses in every county that can receive 5/1 Mbps LTE coverage. Ookla recently reported that its millions of speed tests show that the average national cellular download speed in early 2021 was 76.6 Mbps for AT&T, 82.4 Mbps for T-Mobile, and 67.2 Mbps for Verizon. The FCC is still reporting on a cellular speed far slower than what the majority of cellular customers can receive – much in the manner that it keeps clinging to 25/3 Mbps as the definition of broadband.

But it’s really troublesome when the FCC reports to Congress that whole counties can get 5/1 Mbps cellular broadband when residents tell us they have almost no cellular coverage. It’s hard to understand why members of Congress who live in some of these rural counties don’t push back on the FCC.

I’ve heard the same kinds of stories about rural cellular coverage that I’ve heard about rural broadband. People say that coverage peters out during their commute home and that they have no coverage in their neighborhood. I heard the other day from somebody who gets a weak cellular signal in only one tiny corner of the backyard – anybody my age remembers running around the corners of airports trying to find that magic spot.

I would venture to say that almost everybody reading this blog knows of cellular dead spots. I live in the center of a city, and there is no Verizon coverage at my house – yet there is good Verizon coverage at the other end of my block. I have relatives in a nearby older suburb who get weak and intermittent cellular coverage throughout the neighborhood from both AT&T and Verizon. Every Uber driver in this area can paint a pretty accurate picture of the local cellular dead zones – and there are a lot of them.

Cellular dead zones are largely a matter of geography and topography. I don’t know if it’s realistic to think that Verizon should be required to show the dead zone at my house on its cellular coverage maps. But I wonder how many people buy homes in hilly cities like mine, only to find that their cellphones don’t work. I decided not to make an offer on one home here when my cellphone didn’t work at the house.

The cellular problems in rural areas are much more of a concern than my half-block of bad coverage. In rural counties, poor coverage comes from the combination of two issues – distance from a tower and terrain. In cities, there are cell towers within a mile or two of everybody, and with the introduction of small cells, often even closer. In rural counties, there might be only a few cell sites, and many residents live more than two miles from the nearest tower.


Local is Better

I am lucky enough to live in a place that is ideal for growing vegetables. I’m up in the mountains, where temperatures are moderate, and we have an average of three inches of rain every month all year. For the past several years, I have bought a big box of fresh produce every week from two local farms. I have never eaten so well, and I am comforted by knowing exactly where my vegetables were raised. I consider myself lucky to even have this option.

Most people in the country buy broadband from a handful of giant ISPs – the four largest serve more than three-fourths of all broadband customers in the country. But some people are lucky enough to benefit from great local ISPs in the same way that I benefit from being near local vegetable farms.

There is a wide variety of local ISPs that includes independent telephone companies, telephone cooperatives, municipal ISPs, electric cooperatives, independent fiber overbuilders, and WISPs. Not all local ISPs are great – but from the stories I hear around the country, most people served by local ISPs feel lucky to have the local option.

There was a time 75 years ago when the federal government acknowledged that local is better. When it became clear that the big commercial electric companies were never going to build and serve in rural America, somebody in the federal government came up with the brilliant idea of letting communities build and operate their own electric companies through cooperatives. This didn’t cost the government much since the federal electrification plan provided long-term loans through the Rural Utilities Service, and the cooperatives repaid the government for funding rural electric networks.

We’re in the process of handing out billions in federal grants to build rural broadband networks, and there isn’t one shred of localism in the new federal programs. Instead, the government is handing out huge grants that are lucrative enough to attract the biggest ISPs and speculative investors to build rural broadband.

Does anybody really think that AT&T or Charter wants to provide broadband in rural America? AT&T spent the last forty years milking the last vestiges of profits out of rural networks while making zero investments. Charter and the other big ISPs are not going to like the much higher operating costs in rural areas that come from long truck rolls and the increased cost of maintaining huge numbers of rural route miles.

The big federal grants are providing an incentive for big ISPs or speculative investors to build fiber networks because the grants make it possible to earn high returns. It’s impossible to believe that a big ISP like AT&T is going to provide the same level of customer service, repairs, and network maintenance as would be done by a local ISP. I don’t need a crystal ball to know that there will be a huge difference in twenty years between a fiber network built today by a giant ISP and one built by a rural cooperative. Without future grants, AT&T won’t invest in future electronics upgrades, and the company won’t do the routine maintenance needed to keep the network in good working order. Cooperative fiber networks will be humming along like brand new while the big ISP networks will already be showing signs of age.

The electrification program didn’t include grants, and the newly formed cooperatives eventually repaid the federal government the money it lent them to build power grids. I can’t imagine that the federal government has ever made a better investment or gotten a better return than it did from the electrification loans – the only things that come even close are the Interstate highways and the G.I. Bill that sent millions to college after WWII.

In a real slap-down against localism, the current federal broadband grant programs are actually stacked against small local ISPs. Federal grants expect a recipient to prove upfront that it has the matching funds to pay for the portion of the project not covered by the grant. Small ISPs typically don’t have the kind of balance sheets that traditional banks are looking for, and I know small ISPs that can’t pursue grants because they are unable to raise the matching funds. And forget about starting a new cooperative – the grant programs and the banks are not interested in helping start-ups.

It’s a shame that we forgot about the most successful program imaginable for building rural networks. Forty years from now, we are going to see that many of the areas where big ISPs get grant money today will have gone to hell and will need federal funding relief again. But rural markets operated by good local ISPs will still be humming along nicely forty or even a hundred years from now. Broadband is one of the areas where local really is better. We all know this, but the little guys don’t get a say in writing grant rules.


Viasat and the CAF II Reverse Auction

The FCC has done some very odd things over the years to try to solve the rural broadband problem. One of the oddest was the CAF II reverse auction that awarded $122.5 million to Viasat. Things move so fast in this industry that this award at times feels like ancient history, but the reverse auction just ended in August 2018.

There is a lot of grant money currently raining down on the rural broadband industry, but when this award was made, that wasn’t happening. It’s hard to fathom that only three short years ago the FCC deemed that giving $122.5 million to a geosynchronous satellite company was good policy.

This blog is not intended as a criticism of Viasat. If you live in a place where even DSL is not available, then satellite broadband is likely your only alternative. Viasat satellite broadband has gotten better over time. The broadband on the ViaSat-1 satellite launched in 2011 was dreadfully slow. The company markets broadband as fast as 100 Mbps download on the ViaSat-2 satellite launched in 2017. The company plans three new ViaSat-3 satellites with even higher capacity, with the first to launch sometime in 2022.

My consulting firm does detailed market research in rural counties, and we’ve heard from satellite customers across the country. We’ve never met anybody who fully loves the product. The most common complaints are high prices, small data caps, and high latency. Prices are high compared to other forms of broadband, with the latest pricing from Viasat as follows:

Plan                  Price      Speed      Data Cap
Unlimited Bronze      $84.99     12 Mbps    40 GB
Unlimited Silver      $119.99    25 Mbps    60 GB
Unlimited Gold        $169.99    100 Mbps   100 GB
Unlimited Platinum    $249.99    100 Mbps   150 GB

There is a $12.99 per month additional fee for equipment on top of these prices. A customer must sign a 2-year contract to get these prices, with a fee of $15 per remaining month if a customer breaks a contract.

We’ve been told by customers that download speeds slow to a crawl after a customer exceeds the monthly data allotment. To put these data caps into perspective, OpenVault says that at the end of the first quarter of this year, the average U.S. home used 462 gigabytes of data. It’s not easy for a modern home to curtail usage down to 60 or 100 GB.
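
To make the caps concrete, here’s a quick back-of-the-envelope sketch in Python using the OpenVault average and the cap sizes from the table above (it assumes usage is spread evenly across a 30-day month, which is a simplification):

```python
# How long the Viasat data caps would last for an average household.
AVG_MONTHLY_GB = 462        # OpenVault average U.S. household usage, Q1 2021
DAYS_PER_MONTH = 30

daily_gb = AVG_MONTHLY_GB / DAYS_PER_MONTH   # ~15.4 GB per day

for plan, cap_gb in [("Bronze", 40), ("Silver", 60), ("Gold", 100), ("Platinum", 150)]:
    print(f"{plan}: {cap_gb} GB cap lasts ~{cap_gb / daily_gb:.0f} days")
# Bronze ~3 days, Silver ~4 days, Gold ~6 days, Platinum ~10 days
```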

The biggest performance problem is probably the latency, which can be 10 to 15 times higher than with terrestrial broadband. The lag comes from the time required for signals to travel to and from satellites parked more than 22,000 miles above the earth, which adds time to every round trip to the web. Most real-time web connections, such as using voice-over-IP or connecting to a school or corporate WAN, work best with a latency of less than 100 ms (milliseconds). We see speed tests on satellite connections with reported latency between 600 ms and 900 ms.
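
The physics behind those numbers is easy to check. Here’s a minimal sketch of the speed-of-light floor on a geosynchronous round trip – a lower bound only, since real routing and processing add more delay:

```python
# Speed-of-light minimum latency for a geosynchronous satellite connection.
GEO_ALTITUDE_MILES = 22_236     # approximate geosynchronous orbit altitude
LIGHT_MILES_PER_SEC = 186_282   # speed of light in a vacuum

one_leg_ms = GEO_ALTITUDE_MILES / LIGHT_MILES_PER_SEC * 1000   # ~119 ms
# A request goes user -> satellite -> ground station, and the reply
# retraces the path: four legs in total.
print(f"Minimum round trip: ~{4 * one_leg_ms:.0f} ms")         # ~477 ms
# Routing, queuing, and processing push real-world results to 600-900 ms.
```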

I wonder about the long-term repercussions of this reverse auction grant award. Most federal programs prohibit providing a government subsidy to an area that is already receiving a federal broadband subsidy. Viasat is going to be collecting $12.25 million per year from the FCC through 2027. Will this mean that people unlucky enough to live where Viasat won the reverse auction can’t get faster broadband out of the wave of new grant funding? If so, these homes might be doomed to not get a landline broadband solution for decades.

The FCC should never have allowed geosynchronous satellite broadband into the reverse auction. Perhaps we can’t fully blame the FCC for not foreseeing the pandemic, with rural people screaming for better broadband. But it didn’t take much of a crystal ball in 2018 to understand that something better would come along well within the 10-year window of the Viasat award.

The FCC is likely to repeat this same mistake if it awards nearly $1 billion to Starlink in the RDOF reverse auction. If the federal infrastructure funding becomes available, there will be ISPs willing to build fiber to areas where Starlink will be getting a 10-year RDOF subsidy. It’s not too late for the FCC to change course and not make the RDOF award. Otherwise, the agency might be dooming a lot more people to not getting a permanent broadband solution.


5G for Cars – An Idea that Won’t Die

An industry group calling itself 5G Americas has published a whitepaper that touts the advantages of a smart auto grid powered by 5G and C-V2X technology. C-V2X is the car connectivity standard that much of the industry has gelled around, replacing the older DSRC standard.

Over a decade ago, the FCC became so enamored of the idea of self-driving cars that the agency dedicated the 5.9 GHz spectrum band to the sole use of smart cars. The picture painted for the FCC at the time was the creation of a 5G network along roadways that would communicate with self-driving cars. As engineers experimented with smart cars, they quickly came to understand that the lag involved in making real-time driving decisions in the 5G cloud was never going to be short enough for the split-second decisions we constantly make while driving. Last year, the FCC halved the amount of bandwidth set aside for smart cars but didn’t kill the allocation entirely.

This whitepaper still envisions a ubiquitous wireless network supporting smart cars. That’s not entirely surprising when looking at the companies that make up 5G Americas – AT&T, Ciena, Cisco, Crown Castle, Ericsson, Intel, Liberty Latin America, Mavenir, Nokia, Qualcomm, Samsung, Shaw Communications, T-Mobile, Telefónica, VMware, and WOM. These companies would stand to make a lot of money on the idea if they could talk the government into funding the needed wireless network along roads.

There are still some interesting ideas suggested by the whitepaper. There are a lot of benefits to car-to-car communications. A car can be alerted when a neighboring car wants to change lanes or wants to pass. Drivers could peek into a camera of the car in front of them before trying to pass. Drivers can be alerted about a host of hazards, such as a car running a red light or patches of ice on the road ahead.

Most cars today already include a lot of safety features that weren’t imagined twenty years ago, and the benefits envisioned for C-V2X technology sound like the next generation of safety features that car manufacturers are likely to embrace.

But this whitepaper doesn’t give up on a wireless network positioned along roads to communicate with vehicles. It refers to this as an intelligent transportation system (ITS), which would consist of a system of sensors and communications devices along roads specifically designed to communicate with vehicles. The paper touts additional benefits from a wireless network such as communications between cars and traffic lights and smart parking systems in cities.

Much of this whitepaper could have been written over a decade ago and probably was. The benefits are the same ones that have been discussed for years, although there has been some progress in developing the chips and the technology that could enable smart vehicles.

But the one thing that is largely skipped over in the paper is who pays for the infrastructure to support all of this. The paper suggests a collaboration between roadbuilders (federal, state, and local governments) and the cellular carriers. There is also an allusion to offering features so amazing that car owners will pony up for a subscription to use the technology. My guess is that the real purpose of this whitepaper is to lobby Congress for grant funding for roadside networks. The paper largely suggests that government should pay for the 5G infrastructure along roads while the cellular carriers collect any subscription revenues.

The benefits touted by the paper all sound worthwhile. It would be nice to feel safe when passing another vehicle. It would be nice if your car could automatically be directed to the nearest parking place to your planned destination. But it’s hard to think those benefits are enough to entice governments to pony up for the needed infrastructure. Most of the roads in America are funded by local and county governments, and most of the roads outside of major cities are lightly traveled. I imagine most counties would laugh at the idea of funding this when many of these same counties don’t yet have broadband to homes.

If enough cars are equipped with the chips to enable this technology, there might be a few major metropolitan areas that would consider the idea. But therein lies the chicken-and-egg question – will a city invest in the technology before most cars have the chips, and will carmakers spend the money to install the chips before there are real-world places where they will work?

I hope that the car industry keeps pursuing the car-to-car communications ideas. That technology could enable most of the safety features touted by this whitepaper without investing in an external cellular network. The chipmakers can still make a lot of money by improving car safety. A ubiquitous 5G network along roads is never going to be practical, but it’s an idea that seemingly will never die.


Aging Coaxial Copper Networks

We’re not talking enough about the aging coaxial copper networks that provide broadband to the majority of broadband customers in the country – Comcast and Charter alone serve more than half of all broadband customers.

These copper networks are getting old. Most coaxial networks were constructed in the 1970s and had an expected life of perhaps forty years. We seem to be quietly ignoring that these networks will soon be fifty years old.

There is an industry-wide consensus that telephone copper is past the end of its economic life, and most telephone networks that are still working are barely limping along. It was no surprise last October when AT&T announced that it would no longer connect new DSL customers – if the company had its way, it would completely walk away from all copper, other than as a convenient place to overlash fiber.

To some degree, coaxial networks are more susceptible to aging than telephone copper networks. The copper wire inside a coax cable is much thicker than telephone copper wires, and that is what keeps these networks chugging along. However, coaxial networks are highly susceptible to outside interference. A coaxial network uses a technology that creates a captive RF radio network inside the wires. The technology uses the full range of radio spectrum between 5 MHz and 1,000 MHz inside the wires, with the signals arranged into channels, just as is done in wireless networks. A coaxial network is susceptible to outside interference anywhere along that wide range of frequencies.
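
As a rough illustration, here’s what that captive RF plan looks like in numbers. These are the common North American values from the DOCSIS 3.0 era, used here as assumptions – the exact split varies by operator:

```python
# Sketch of the RF spectrum plan inside a typical coaxial plant.
UPSTREAM_MHZ = (5, 42)        # narrow return path shared by the whole node
DOWNSTREAM_MHZ = (54, 1002)   # downstream spectrum, carved into 6 MHz channels

CHANNEL_WIDTH_MHZ = 6
downstream_channels = (DOWNSTREAM_MHZ[1] - DOWNSTREAM_MHZ[0]) // CHANNEL_WIDTH_MHZ
print(f"~{downstream_channels} downstream channels")   # ~158

# Ingress through a cracked sheath anywhere between 5 MHz and 1,000 MHz
# lands as noise in whichever channels occupy those frequencies.
```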

Decades of sun, cold, water, and ice slowly deteriorate the coaxial copper and the sheath around the wire. It’s vital for the sheath to remain intact since it acts as the barrier against interference. As the sheath gets older, it develops breaks and cracks and becomes less effective at shielding the network. The sheathing also accumulates breaks from decades of repairs after storms and other damage. Over forty or fifty years, the small dings and dents to the network add up. The long coaxial wires hanging on poles act as a giant antenna, and any break in the cable sheathing is an entry point for interference.

Just as telcos never talk publicly about underperforming DSL, you never hear a cable company admit to poor performance in its networks. But I’ve found that the performance of coaxial networks varies within almost every community of any size. I’ve worked in several cities in recent years where we gathered speed tests by address, and there are invariably a few neighborhoods with broadband speeds far slower than the rest of the network. The most likely explanation for poorly performing neighborhoods is deterioration of the physical coaxial wires.

Cable companies could revitalize these neighborhoods by replacing the coaxial cable – but they rarely do so. Anybody who has tried to buy house wiring knows that copper wiring has gotten extremely expensive – copper prices have peaked in recent years. I haven’t done the math recently, but I wouldn’t be surprised if it now costs more to hang coaxial copper than fiber.

Big cable companies deliver decent bandwidth to a lot of people, but there are subsets of customers in most markets who have lousy service due to local network issues. I talk to cities all the time that are tired of fielding complaints from the parts of town where the networks underperform. City governments want to know when the cable companies will finally bite the bullet and upgrade to fiber. A lot of industry analysts seem to think the cable companies will put off upgrades for as long as possible, and that can’t be comforting to folks living in pockets of cable networks that already have degraded service. And as the networks continue to age, the problems will get worse every year.


Explaining Growth in Broadband Demand

I haven’t talked about the growth of broadband usage for a while. I was explaining the exponential growth of broadband usage to somebody recently, and I suddenly realized an easy way to put broadband growth into context.

The amount of data used by the average broadband user has been doubling roughly every three years since the advent of the Internet. This exponential growth has been chugging along since the earliest dial-up days, and we’re still seeing it today. Consider the following numbers from OpenVault showing the average monthly U.S. household broadband usage:

1st Quarter 2018          215 Gigabytes
1st Quarter 2019          274 Gigabytes
1st Quarter 2020          403 Gigabytes
1st Quarter 2021          462 Gigabytes

Average household usage more than doubled in the three years from 2018 to 2021 – a compounded growth rate of 29% annually. That’s a little faster than the pre-pandemic trend, probably due to the pandemic; in the decade before the pandemic, the compounded annual growth rate was around 26%.
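
The math behind that growth rate is simple compounding – a minimal sketch using the OpenVault numbers above:

```python
# Compounded annual growth rate implied by the OpenVault data.
usage_2018_gb = 215
usage_2021_gb = 462
years = 3

cagr = (usage_2021_gb / usage_2018_gb) ** (1 / years) - 1
print(f"2018-2021 compounded annual growth: {cagr:.0%}")   # ~29%
```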

What does this kind of growth mean? One way to think about it is to contemplate what growth might mean in your own neighborhood. Suppose you are served by DSL or by a cable company using HFC technology. If your ISP has the same number of customers in your neighborhood now as in 2018, the local network is now twice as busy, carrying twice as much traffic as just three years earlier. If your ISP hasn’t made any upgrades in that time, the chances are that you can already see some signs of a stressed network. Perhaps you notice a slowdown during the evening prime-time hours when most of the neighborhood is using broadband. You’ve probably run into times when it was a challenge to make or maintain a Zoom call.

To the average person, this kind of broadband traffic growth might not seem like a big deal, because they probably assume that ISPs are doing magic in data centers to keep things working. But any network engineer will tell you that a doubling of traffic is a big deal. That kind of growth exposes the bottlenecks in a network where things get bogged down at the busiest times.

The most interesting way to put broadband growth into perspective is to look into the future. Let’s say that the historical 26% growth rate continues. There is no reason to think it won’t, because we find more ways to use broadband every year. If broadband keeps growing at the historical rate, then in ten years your neighborhood network will be carrying ten times more traffic than today. In twenty years, it will be carrying one hundred times more traffic than today.
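
Those multiples are just the 26% rate compounded forward:

```python
# Compounding 26% annual growth over ten and twenty years.
GROWTH = 1.26
for years in (10, 20):
    print(f"{years} years out: ~{GROWTH ** years:.0f}x today's traffic")
# 10 years out: ~10x; 20 years out: ~102x
```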

When you think of growth in this manner, it’s a whole lot easier to understand why we shouldn’t be using grant money today to fund any technology that won’t be able to keep up with the normal expected growth in broadband traffic. Looking at growth from this perspective explains why AT&T decided last year to stop selling DSL. Understanding the normal growth rate makes it clear that it was idiotic to give CAF II funding to Viasat. Expected growth might be the best reason not to give RDOF subsidies to Starlink.

I have nothing against Starlink. If I still lived in a rural area, I would have been one of the first people on the list for the beta test. But there are already engineers I respect who believe that the Starlink network will struggle if the company sells to too many customers. If that’s even just a little bit true today, then how will Starlink perform in ten short years when the traffic will be ten times higher? And forget twenty years – Starlink is at heart a wireless network, and there are no tweaks to a wireless network that will ever handle a hundred-fold increase in traffic. If Starlink is still viable in twenty years, it will be because it took the same path as Viasat and imposed severe data caps to restrict usage or else raised rates to restrict the number of customers on the network.

I take flak every time I say this, but if I were in charge of grant programs, I wouldn’t fund anything other than fiber. I can’t think of any reason why we would fund a technology that doesn’t have a reasonable chance of still being viable in ten short years, when broadband usage will likely be ten times higher than today. I would hope that a government-funded network will still be viable in twenty years, when traffic volumes are likely to be one hundred times greater than today. If we don’t get this right, then we’re going to be talking about ways to build rural fiber a decade from now when other technologies crash and burn.


A Fiber Land Grab?

I was surprised to see AT&T announce a public-private partnership with Vanderburgh County, Indiana to build fiber to 20,000 rural locations. The public announcement of the partnership says that the County will provide a $9.9 million grant, and AT&T will pick up the remaining $29.7 million investment.

The primary reason this surprised me is that it is a major reversal in direction for AT&T. The company spent the last thirty years working its way out of rural America, capped by an announcement in October 2020 that the company would no longer connect new DSL customers. AT&T has publicly complained for years about the high cost of serving rural locations and has steadily cut its costs in rural America by slashing business offices and rural technicians. It’s almost shocking to see the company dive back in as a last-mile ISP in a situation that means long truck rolls and higher operating costs.

I’m sure it was the County grant that made AT&T consider this, but even that is surprising since the County is only contributing 25% of the funding. I’ve created hundreds of rural business plans, and most rural builds need grants of 40% or much more to make financial sense. I assume that there is something unique about this county that makes the math work. AT&T and other telcos have one major cost advantage for building fiber that might have come into play – they can overlash fiber onto existing copper wires at a fraction of the cost of any other fiber builder, so perhaps AT&T’s real cost won’t be $29.7 million. Obviously, the math works for AT&T, and another county will be getting a rural fiber solution.
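
For what it’s worth, the announced numbers imply the following simple math – the cost per location is my own back-of-the-envelope figure, not anything AT&T has published:

```python
# The arithmetic behind the Vanderburgh County announcement.
county_grant = 9.9e6    # county contribution
att_funds = 29.7e6      # AT&T's share
locations = 20_000      # rural locations to be served

total = county_grant + att_funds
print(f"Grant share: {county_grant / total:.0%}")        # 25%
print(f"Cost per location: ${total / locations:,.0f}")   # ~$1,980
```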

AT&T is not alone in chasing rural funding. We saw Charter make a major rural play in last year’s RDOF reverse auction. The RDOF reverse auction also attracted Frontier and Windstream, and both of those companies have made it clear that pursuing fiber expansion opportunities and grants is a key part of their future strategic plans.

My instincts are telling me that we are about to see a fiber land grab. The big ISPs other than Verizon had shunned building fiber for decades. When Verizon built its FiOS network, every other big ISP said they thought fiber was a bad strategic mistake by Verizon. But we’ve finally reached the time when the whole country wants fiber.

This AT&T announcement foreshadows that grant funding might be a big component of a big ISP land grab. The big ISPs have never been shy about taking huge federal funding. I wouldn’t be surprised if the big ISPs are collectively planning in board rooms on grabbing a majority of any big federal broadband grant funding program.

I think there is another factor that has awoken the big ISPs, which is also related to a land grab. Consider Charter. If they look out a decade or two into the future, they can see that rural fiber will surround their current footprint if they do nothing. All big ISPs are under tremendous pressure from Wall Street to keep growing. Charter has thrived for the last decade with a simple business plan of taking DSL customers from the telcos. It doesn’t require an Ouija board to foresee the time in a few years when there won’t be any more DSL customers to capture.

I’m betting that part of Charter’s thinking for getting into the RDOF auction was the need to grab more geographic markets before somebody else does. Federal grant money makes this a lot easier to do, but without geographic expansion, Charter will eventually be landlocked and will eventually stop growing at a rate that will satisfy Wall Street.

Charter must also be worried about the growing momentum to build fiber in cities. I think Charter is grabbing rural markets where it can have a guaranteed monopoly for the coming decades to hedge against losing urban customers to competition from fiber and from wireless ISPs like Starry.

My guess is that the AT&T announcement is just the tip of the iceberg. If Congress releases $42 billion in broadband grants, the big companies are all going to have their hands out to get a big piece of the money. And that is going to transform the rural landscape in a way that I would never have imagined. I would have taken a bet from anybody, even a few years ago, that AT&T would never build rural fiber – and it looks like I was wrong.


Zayo Installs 800 Gbps Fiber

Zayo announced the installation of an 800 Gbps fiber route between New York and New Jersey. This is a big deal for a number of reasons. In my blog, I regularly talk about how home and business bandwidth demand has continued to grow, doubling roughly every three years. It’s easy to forget that traffic on the Internet backbone is experiencing the same growth. The routes between major cities like Washington DC and New York City are carrying 8-10 times more traffic than a decade ago.
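
That backbone estimate is the same doubling arithmetic that applies to households:

```python
# Traffic that doubles every three years, compounded over a decade.
decade_multiple = 2 ** (10 / 3)
print(f"10-year growth: ~{decade_multiple:.1f}x")   # ~10.1x, in line with 8-10x
```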

Ten years ago, we were already facing a backhaul crisis on some of the busiest fiber routes in the country. The fact that some routes continue to function is a testament to smart network engineers and technology upgrades like the one announced by Zayo.

There is not a lot of new fiber construction along major routes in places like the northeast since such construction is expensive. Over the last few years, a major new fiber route was installed along the Pennsylvania Turnpike as that road was rebuilt – but such major fiber construction efforts are somewhat rare. That means that we must somehow handle the growth of intercity traffic with existing fiber routes that are already fully subscribed.

You might think that we could increase capacity along major fiber routes by upgrading the bandwidth, as Zayo is doing on this one route. But that is not a realistic option in most cases. Backhaul fiber routes can best be described as a hodge-podge. Let’s suppose, as an example, that Verizon owns a fiber route between New York City and Washington DC. The company would use some of the fibers on that route for its own cellular and FiOS traffic. But over the years, Verizon will have leased lit or dark fibers to other carriers. It wouldn’t be surprising on a major intercity route to find dozens of such leasing arrangements, and each of those long-term arrangements comes with different contractual requirements. Lit routes might be contracted at specific bandwidths, and Verizon has no way of knowing what the carriers leasing dark fiber are carrying.

Trying to upgrade a major fiber route is a huge puzzle, largely confounded by the existing contractual arrangements. Many of the customers using lit fiber will have a five-9s uptime guarantee (99.999%), so it’s incredibly challenging to take such a customer out of service, even for a short time, as part of migrating to a different fiber or a different set of electronics.
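
Five 9s sounds abstract until you convert it into a downtime budget:

```python
# Downtime allowed under a five-9s (99.999%) uptime guarantee.
minutes_per_year = 365 * 24 * 60
allowed_minutes = minutes_per_year * (1 - 0.99999)
print(f"Allowed downtime: ~{allowed_minutes:.1f} minutes per year")   # ~5.3
```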

Some of the carriers on the major transport routes sell transport to smaller entities. This would be carriers like Zayo, Level 3, and XO (which is owned by Verizon). These wholesale carriers are where smaller carriers go to find transport on these existing busy routes. That’s why it’s a big deal when Zayo and similar carriers increase capacity.

I wrote about the first 400 Gbps fiber path in March 2020, implemented by AT&T between Dallas and Atlanta. Numerous carriers have started the upgrade to 400 Gbps transport, including Zayo, which has plans to have that capacity on 21 major routes by the end of 2022. The 800 Gbps route is unique in that Zayo is able to combine two 400-Gbps fiber signals into one fiber path using electronics from Ciena. Verizon had a trial of 800 Gbps last year using equipment from Infinera.

In most cases, the upgrades to 400 Gbps or 800 Gbps will replace routes lit at the older standard 100 Gbps transport. While that sounds like a big increase in capacity, in a world where network capacity is doubling every three years, these upgrades are not a whole lot more than band-aids.

At some point, we’re going to need a major upgrade to the intercity transport routes. Interestingly, all of the federal grant funding floating around is aimed at rural last-mile fiber – an obviously important need. Many federal funding sources can’t be used to build or upgrade middle-mile networks. But at some point, somebody is going to have to make the needed investments. It does no good to upgrade last-mile capacity if the routes between towns and the Internet can’t handle the broadband demand. This is probably not a role for the federal government because the big carriers make a lot of money on long-haul transport. At some point, the biggest carriers need to get into a room and agree to open up their purses – for the benefit of them all.


Forecasting Interest Rates and Inflation

This is a topic that I haven’t written about since I started my blog seven years ago because there hasn’t been a reason. We have just gone through a decade that benefitted from both low interest rates and low inflation – a rarity in historical economic terms.

Anybody building a broadband network can tell you they are seeing significant inflation in the prices of components needed to build a fiber network. There are some who shrug off current inflation as a temporary result of supply chain issues. To a large degree, they are right, but the inflation is real nonetheless. As someone who worked in the industry in past times of inflation, my experience is that prices never go back down to former levels. Even if all of the factors leading to current inflation are eventually solved, it’s unlikely that the companies that make conduits and handholes will ever go completely back to the old prices.

To some degree, the lack of inflation has spoiled us. As recently as a year ago, I knew that I could pull a business plan off the shelf from ten years ago, and it probably still made sense. All of the industry fundamentals from a decade ago were all roughly the same, and a business plan that worked then would still have worked.

I hate to say it, but those days of surety might be over for a while. The history is all-too-familiar to those of us who have been in the industry a long time. In the not-too-distant past, we saw periods of both high interest rates and high inflation. 1980 is not ancient history, and those of us who were in the industry at the time recall the jarring effect of both high interest rates and high inflation on telephone companies. And that wasn’t even the worst of it – in 1971, President Nixon ordered a nationwide freeze on wages and prices to try to stop runaway inflation. I remember seeing a talking-head economist on a business show a few years ago who said that we now know how to beat inflation and that high inflation and high interest rates were never coming back to the U.S. economy. I had a good laugh because I knew this guy was a total idiot.

We now live in a global economy, and the U.S. doesn’t have any magic pill that somehow keeps us out of worldwide economic upheaval. As one example, parts of Africa are currently suffering from punishing inflation – the current inflation rate in Nigeria is 16%, down from over 20%. The Congo is one of the primary sources of metals like cobalt and tantalum that are essential for making things like computer chips and cellphones. When the price of raw materials from the Congo skyrockets, the industries that use those resources have no choice but to raise prices to compensate.

We don’t have to go back to ancient history to remember when we worried about interest rates. I worked with cities that were floating municipal bonds in the 2000s, and I recall times when they delayed selling bonds hoping that rates would be more favorable in the weeks or months to follow. One fiber project I was working on was never launched because the interest cost on the bonds grew larger than the project could support.

Everybody who builds financial forecasts for broadband businesses is in a quandary. How do we reflect the rising costs of materials and labor? How can anybody forecast the cost to build fiber two, three, or five years from now? We look out over the next ten years and see an industry that wants to grow faster than the support structure for the industry can handle. Companies like Corning have difficult decisions to make. The company could likely sell twice as much fiber as in recent years if it had more factories. But does it dare build those factories? A factory is a fifty-year investment, and does the company want huge idle capacity a decade from now when the fiber craze naturally slows down? Every manufacturer in the industry is having a similar conversation, but nobody knows the calculus for figuring out the right answer. And that calculus will get much harder if we see the return of both inflation and higher interest rates.

Interest rates are going to have to increase at some point. Rates have been held below the natural market level as a monetary strategy to fuel the economy. But the Federal Reserve signaled a few weeks ago that it foresees six to seven interest rate increases over the next two years.

I don’t mean for this blog to be gloom and doom. For most of my career, I’ve dealt with both inflation and interest rates when making financial forecasts. The last decade spoiled me like it spoiled many of us, and we need to readjust the way we think about the future and figure out how to deal with an economic world that is returning to normal.


Is Defining Broadband by Speed a Good Policy?

I’ve lately been looking at the policies that have shaped broadband, and I don’t think there has been any more disastrous FCC policy than the one that defines broadband by speed. This one policy has led to a misallocation of the funding meant to get broadband to the communities that need it.

The FCC established the definition of broadband as 25/3 Mbps in 2015; before then, the definition of broadband was 4/1 Mbps, set a decade earlier. The FCC defines broadband to meet a legal requirement established by Congress and codified in Section 706 of the Telecommunications Act. The FCC must annually evaluate broadband availability in the country – and the agency must act if adequate broadband is not being deployed in a timely manner. The FCC chose broadband speed as the way to measure its success, and that decision has become embedded in policies both inside the FCC and elsewhere.

There are so many reasons why setting an arbitrary speed as the definition of broadband is a poor policy. One major reason is that if a regulatory agency is going to use a measurement index to define a key industry parameter, that numerical value should regularly be examined on a neutral basis and updated as needed. It’s ludicrous not to have updated the speed definition since 2015.

Cisco has reported for years that the demand for faster speeds has been growing at a rate of about 21% per year. Let’s assume that the 25/3 definition of broadband was adequate in 2015 – I remember at the time that I thought it was a fair definition. How could the FCC not have updated such a key metric since then? If you accept 25 Mbps download as an adequate definition of broadband in 2015, then applying the expected growth in demand for speed by 21% annually produces the following results.

Download Speeds in Megabits / Second

2015   2016   2017   2018   2019   2020   2021
  25     30     37     44     54     65     79

This is obviously a simplified way to look at broadband speeds, but a minimum broadband definition of 79 Mbps feels a lot more realistic today than 25 Mbps. Before arguing about whether that is a good number, consider the impact of extending this chart a few more years. This would put the definition of broadband at 96 Mbps in 2022 and 116 Mbps in 2023. Those higher speeds not only feel adequate – they feel justified. 80% of the homes in the country already have access to cable company broadband with speeds of at least 100 Mbps. Shouldn’t the definition of broadband reflect the reality of the marketplace?
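
The chart above is nothing more than 21% compounding applied to the 2015 definition. Here’s the same calculation extended through 2023; small rounding differences account for the slightly different figures quoted above:

```python
# Extending the 25 Mbps (2015) definition at ~21% annual growth in demand.
speed_mbps = 25.0
for year in range(2015, 2024):
    print(f"{year}: {speed_mbps:.0f} Mbps")
    speed_mbps *= 1.21
# Prints roughly 25, 30, 37, 44, 54, 65, 78, 95, 115 Mbps; small rounding
# differences account for the 79 / 96 / 116 figures quoted in the text.
```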

We know why the FCC stuck with the old definition – no FCC wanted to redefine broadband in a way that would suddenly classify millions of homes as not having broadband. But in a country where 80% of households can buy 100 Mbps or faster, it’s impossible for me to see how 100 Mbps isn’t the bare minimum definition of broadband.

There have been negative consequences of this speed-based policy. One of the big problems is that 25/3 Mbps is slow enough that DSL and fixed wireless providers can claim to be delivering broadband even when they are delivering something less. Most of the FCC mapping woes come from sticking with the 25/3 Mbps definition. If the definition of broadband today were 100 Mbps, DSL providers would not be able to stretch the truth, and we would not have misallocated grant funding in recent years. Stubbornly sticking with the 25/3 definition is what saw us giving federal broadband grants to companies like Viasat.

As long as we define broadband using speeds, we’ll continue to have political fights over the definition of broadband. Congress recently ran headlong into this same issue. The original draft of the Senate bill proposed a definition of broadband of 100/100 Mbps. An upload speed set at that level would have prohibited broadband grants for cable companies, WISPs, and Starlink. Sure enough, by the time the lobbyists made their calls, the definition of upload speed had been lowered to 20 Mbps in the final legislation. Congress clearly gave in to political pressure – but that’s the line of business they are in. We’ve also had an FCC unwilling to be honest about broadband speeds for political reasons – and that is totally unacceptable.