Bill Gates on the Pandemic

Bill Gates has spent much of his time since leaving Microsoft working to eliminate diseases around the world. In 2015 Gates gave a TED talk discussing how unready the world was for a major disease outbreak. That talk was prophetic, and much of what he predicted has come to pass.

This blog looks at the broadband industry, but we can’t have any real discussion about the direction of broadband in 2021 without acknowledging the continuing impact of the pandemic. Gates has started a new podcast, and his first episode looks at how the pandemic is going to change the way we live. Some of his predictions have direct relevance to broadband, and ISPs ought to heed them when thinking about the future of their businesses.

Here are a few of his predictions and my thoughts on how the pandemic will affect broadband.

The pandemic is going to last longer than we all hope. Gates says that even after a vaccine tamps down the numbers in the US, we’ll continue to have flare-ups until the whole world gets the pandemic under control. He said this even before there was news that the virus seems to be mutating and creating more contagious variants. This means that while things will certainly get better than what we are experiencing in January 2021, ISPs need to expect to continue practicing pandemic protocols for a long time. Companies that get sloppy and careless are likely to pay the price by having to isolate employees. It’s unlikely that the world will control the pandemic completely during 2021, so brace for this lasting even longer.

Working at home is here to stay. Gates believes that many jobs will never return to the office. People will continue to work at home, many permanently. This has a bunch of implications for ISPs.

First, ISPs that have concentrated on downtown businesses are going to see a downturn. Gates thinks it’s likely that many downtown buildings will sit empty, which means that ISPs that made such buildings the focus of their business will need to expand elsewhere. It’s a really interesting twist to see ISP business plans turned upside down. A good case in point is CenturyLink. Before the merger with Level 3, the company was aggressively building fiber in residential neighborhoods. The company then pulled a 180 and concentrated on adding buildings to fiber rather than neighborhoods. In retrospect, the original direction might have been the right one.

This also has big implications for the cable companies. It’s clear that many households are unhappy with the upload capabilities of cable ISPs. In surveys that CCG has been doing this year, we’re seeing 30% to 50% of homes telling us that home broadband connections in cities are inadequate for working from home. I think the cable companies have been hoping this problem will blow over, but if a lot of people stay home to work, unhappiness with cable broadband is going to grow. Cable companies are going to have to invest in expensive upgrades to get faster broadband or be more vulnerable to fiber overbuilders.

Virtual meetings are here to stay. Gates predicts that the platforms used for online meetings will improve significantly over the next few years and that video meetings will be a permanent alternative to travel. A huge number of people in the broadband business are road warriors, and live meetings are no longer going to be expected – or for many people, even acceptable. I know that I’ve been giving serious thought to largely eliminating work travel – a drastic change for a consultant. But I’ve just spent a year proving that live meetings are not needed. I’ve actually gotten to know clients better through a string of video meetings than through a few live visits.

People are choosing where to live. Millions of people are pouring out of big cities where real estate and rents are too expensive and moving to suburbs, small towns, and rural areas instead. ISPs need to join the rest of the work world and consider remote employees. Obviously, employees that visit customers or who take care of networks must be local, but every ISP has a few functions that don’t require a person to be in the office to be effective. It’s more important to find the most talented people than it is to find people within commuting distance from your office.

A more mobile workforce has a lot of implications for employers. Employees who work from home will have options and are going to look for employers who treat them well and who offer interesting work. That means a lot of turnover for companies that don’t value employees – in the new economy many of your current employees can find other jobs without relocating. A bigger challenge for companies with remote staff is going to be creating a sense of company identity and fostering teamwork.

The Legacy of the Pai FCC

As is normal with a change of administration, there are articles in the press discussing the likely legacy of the outgoing administration. Leading the pack in singing his own praises is former FCC Chairman Ajit Pai, who recently published this document cataloging a huge list of accomplishments of the FCC under his chairmanship. Maybe it’s just me, but it feels unseemly for a public servant to publish an official self-praise document. The list of accomplishments is so long, I honestly read it twice to make sure Chairman Pai wasn’t taking credit for inventing 5G!

More disturbing to me are industry articles like this one that lists the primary achievements of the Pai FCC to include “the repeal of Title II regulation of the internet, rural broadband development, increased spectrum for 5G, decreasing waste in universal service funding, and better controlling robocalls.” I see some of those as failures and not accomplishments.

I find it unconscionable that the regulatory agency that is in charge of arguably the most important industry in the country would deregulate that industry. The ISP industry is largely controlled by a handful of near-monopolies. It’s easy to understand why the big ISPs don’t want to be regulated – every monopoly in every industry would love to escape regulation. It’s the government’s and the FCC’s role to protect the public against the worst abuses of monopolies. Lack of regulation means that carriers in the industry can no longer ask the FCC to settle disputes. It means that consumers have no place to seek redress from monopoly abuses. We’re within sight of $100 basic broadband, while the FCC has washed its hands of any oversight of the industry. Killing Title II regulation comes pretty close in my mind to fiddling while Rome burns.

We saw the results of broadband deregulation at the start of the pandemic. Had the FCC not deregulated broadband, Chairman Pai could have directed ISPs on how they must treat the public during the pandemic. Instead, the FCC had to beg ISPs to voluntarily sign on to the ‘Keep America Connected Pledge’, which only lasted for a few months and which some of the big ISPs seemingly violated before the ink dried. During this broadband crisis, the FCC stood by powerless due to its own decision to deregulate broadband. This is downright shameful and not praiseworthy.

Everywhere I look this FCC is getting praise for tackling the digital divide, and admittedly the FCC did some good things. There were some good winners of the CAF II reverse auction that will help rural households – but that was offset by awarding some of that grant to Viasat. The FCC did some good by increasing the Lifeline subsidy for tribal areas. But on the downside, the FCC decided to award a seventh year of CAF II subsidy of $2.4 billion to the big telcos – with zero obligations to use the money to expand broadband. The FCC knows full well that the original CAF II was mostly a sham and yet took no action in the last four years to investigate the failed program. The Pai FCC closed out its term by largely botching the RDOF grants.

The area where the FCC did the most good for rural broadband was making more wireless spectrum available for rural broadband. This FCC missed a few chances early, but in the last few years it nailed the issue. The FCC might have made its best long-term impact with the rulings on 6 GHz spectrum. Spectrum decisions might be the biggest lasting legacy of this FCC.

But we’re never really going to know how this FCC did in narrowing the rural broadband gap, because this FCC has no idea how many homes don’t have broadband. The lousy FCC mapping was already a big issue when Chairman Pai took over the FCC. There was a lot of gnashing of teeth about the issue under Chairman Pai, but in four years nothing was done to fix the problem, and if anything, the maps have gotten worse. It might not be so disturbing if the bad mapping was nothing more than lousy data – but the bad data has been used to justify bad policy and even worse, has been used to determine where federal grants should be awarded.

To add salt to the wound, the FCC issues a mandated report to Congress every year on the state of broadband. The reports from the Pai FCC are so full of imaginary numbers that they are closer to fiction than fact. About the most the FCC under Chairman Pai can say is that the imaginary number of people without broadband grew smaller under his watch. On his last day as Chairman, the FCC released the latest report to Congress, which concludes incorrectly that broadband is being deployed to Americans “on a reasonable and timely basis”. This recent report also concludes yet again that 25/3 Mbps is still a reasonable definition of broadband – when homes at that speed were unable to function during the pandemic.

In looking back, it’s clear that this FCC tilted as far as possible in favor of the big ISPs. There is nothing wrong with regulators who work to strengthen the industry they regulate. But regulators also have a mandate to protect the public from monopoly abuses. The FCC seems to have forgotten that half of its mandate. If there is any one event that captures the essence of this FCC, it was the vote to allow Frontier to bill customers for an extra year for equipment that customers own. I didn’t see that accomplishment on Chairman Pai’s list.

Looking Back at 2020

I periodically take a look ahead at broadband trends. But as I was thinking about how unique 2020 was for everybody, I realized that there were some events during the year that we’re going to look back on a decade from now as important to the broadband industry. Interestingly, most of these events were not on anybody’s radar at the beginning of the year.

Upload Broadband Entered the Picture

For the first time, we all started caring about upload speeds due to the pandemic. Millions of homes that thought they had good broadband suddenly found that the home broadband connection wasn’t good enough for working or schooling. Millions of people reacted to this by upgrading to faster download broadband speeds, only to find in many cases that the upgrade still didn’t fix the upload speed problems.

It also appears that a lot of people will continue to work from home after the end of the pandemic, which means the demand for upload speeds is not going away. This is going to put a lot of pressure on cable companies in markets where there is a fiber competitor. Fiber ISPs need only advertise themselves as the work-from-home solution to snatch customers.

Charter Pursues Rural Broadband

Charter looks to be the only ISP out of the largest four that is adopting a strategy to expand to rural areas surrounding existing markets. Charter has been the fastest growing ISP over the last few years, and it looks like the company wants to continue that growth.

I think the rural telcos are going to look back in a decade and realize they made a big mistake. The telcos have had repeated opportunities to upgrade broadband and dominate the rural markets, where they could have been a permanent monopoly. Instead, Charter is going to sweep through many markets and take most of the customers. Charter is going to be aided in this expansion by the $1.22 billion they snagged out of the recent RDOF grant.

Windstream Decides to Chase Fiber

If you go by what they’re saying, Windstream is coming out of bankruptcy as a new company. The company has said recently that it intends to build fiber to cover at least half of its historic telephone serving areas. This will catch Windstream up to the smaller telcos that have largely migrated to fiber as the only chance for long term survival. Of course, this also means that half of Windstream’s markets are largely going to be abandoned. Windstream customers have to be wondering which half they live in.

Satellite Broadband Goes into Beta

After years of being somewhat theoretical, Starlink has beta-test customers who are loving download speeds between 50 Mbps and 150 Mbps. All of the satellite companies still have a long way to go in launching enough satellites to become viable competitors – but we now have proof of concept.

Rough Year for the Supply Chain

The telecom industry, like so many others, has largely taken the supply chain for granted without much thought about where network components are manufactured. 2020 started with price pressure on electronics due to tariffs and went into a tailspin when the pandemic hit Wuhan, China, where a majority of laser technology is made.

Electronics vendors have spent much of 2020 developing new sources of manufacturing. This means a downside for the Chinese economy, but an upside for many other places in the world. The new administration says it will fund an effort to move much of US chip manufacturing back to the US, and hopefully other electronic components will follow. The big advantage that the far east has had over US manufacturing has been cheap labor, but that might be largely overcome by modern and largely robotized factories. Hopefully, telecom vendors will take the needed steps to make sure we aren’t caught flat-footed again.

And Now . . . Really Fast Internet

It was inevitable that ISPs would eventually start offering residential broadband speeds faster than 1 gigabit. It’s a little hard to believe it was so long ago, but it was back at the end of 2012 when Google Fiber announced it was bringing gigabit fiber to Kansas City. I know of a few small ISPs, like the municipal ISPs in Lafayette, LA and Chattanooga, TN that offered gigabit service before then – but Google Fiber was the first to make gigabit the only product.

Gigabit data speeds were revolutionary in 2012. At that time, the speed of the basic product on most cable company networks was ‘up to 30 Mbps’. Google leapfrogged the market with more than a 30X increase in speed – the biggest jump since we went from dial-up to 1 Mbps DSL.

At the time, the cable companies all scoffed at Google Fiber as a gimmick – but in markets where they competed against gigabit fiber, the cable companies scrambled to roll out gigabit products using DOCSIS 3.1. And the biggest cable companies like Comcast and Charter stepped up the broadband game and will have moved the speed of basic broadband from 30 Mbps to 200 Mbps in 2021.

It took a long time for users to buy into the gigabit speed tier. This was often due to price, since gigabit products on cable company networks are priced over $100 per month before adding the modem fee. But Google Fiber has stayed with the same $70 price it announced in 2012, and numerous other fiber ISPs now have gigabit products under $100. That price no longer looks high when standalone Comcast broadband plus modem runs $90.

OpenVault tracks broadband usage and subscriptions and reports that residential gigabit subscriptions have climbed in the last year to 5.6% of all broadband subscriptions – a 124% increase from just a year earlier. Families sent home during the pandemic are deciding en masse that faster speeds are needed to support their new broadband needs.
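Those OpenVault figures hang together on a quick back-of-the-envelope check, assuming the 124% figure is year-over-year growth in the subscription share:

```python
# Back-of-the-envelope check of the OpenVault figures cited above.
# Assumption: the "124% increase" is year-over-year growth in the share
# of subscriptions at gigabit speeds.

current_share = 0.056   # gigabit subscriptions as a share of all subscriptions
growth = 1.24           # 124% year-over-year increase

prior_share = current_share / (1 + growth)
print(f"implied share a year earlier: {prior_share:.1%}")  # roughly 2.5%

# This also squares with the "one out of 18 homes" figure quoted elsewhere:
print(f"about 1 in {round(1 / current_share)} homes")
```

So roughly 2.5% of subscriptions were gigabit a year earlier, which is consistent with the growth curve the article describes.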

Google Fiber has announced the introduction of a 2-gigabit product. For now, the product is only being offered in Huntsville, AL and Nashville, TN. One has to think that the company will eventually offer 2-gigabit service in its other 30 markets. Google Fiber has priced the 2-gigabit tier at $100. This comes with a new modem that is capable of handling the 2-gigabit speeds as well as WiFi 6 to efficiently transport large bandwidth applications around the home without interference.

An even faster product is now being offered by MLCG of Enderlin, North Dakota. The company announced a 5-gigabit bandwidth product and a 2.5-gigabit product. The 5-gigabit product is priced at $199 per month and the 2.5-gigabit product at $150 per month. MLCG advertises the 5-gigabit product for “Multiple users simultaneously – Stream or edit 4K videos, upload/download LARGE files, gaming with NO worry about lag, multiple smart home devices, video chat, play online games, download files, streaming HD shows and movies, social media and web browsing”.

Skeptics will say that these new products are a gimmick and that nobody needs Internet access faster than gigabit speeds. That would have been a valid observation in 2012, when there was nothing that could be done over a residential internet connection that needed a gigabit of speed. But I know users who tell me they are stretching a gigabit product. I have a friend with several heavy gamers in the household who also backs up his office servers at home daily, and he tells me there are times when a gigabit feels a little slow. I have several clients who have told me that doctors are asking for something faster than a gigabit in order to view 3D medical scans in real-time at home. Already today, one out of 18 homes in the country has upgraded to a gigabit product, and it’s not hard to imagine that some of those homes want more than a gigabit.

I’m sure that the initial penetration rates on the products faster than a gigabit will be minuscule at first. But let’s look back at this in five years when a lot of ISPs offer multi-gigabit products. ISPs that can’t offer gigabit speeds never miss an opportunity to pooh-pooh fast broadband – but over time the penetration rate for these new faster products will climb, just like it has for gigabit broadband.

The Politicization of the FCC

Before the citizens of Georgia elected two new Democratic Senators, it looked like a Republican Senate was on a path to lock up the FCC by not approving any new Commissioners. This was threatened by Mitch McConnell and other Senators who didn’t want the FCC to pursue the reintroduction of net neutrality and broadband regulation.

The current FCC was already politicized when, late last year, the President declined to reappoint Mike O’Rielly as a Commissioner after O’Rielly voiced his opinion that the FCC didn’t have the authority to overturn Section 230 – the provision of the Communications Decency Act that shields web companies from being sued over content posted by the public. O’Rielly thought that only Congress has that authority, and from what I can tell, he is right. The politicization continued when the President appointed Nathan Simington as the new FCC Commissioner – somebody with virtually no telecom experience, but who is a vocal supporter of eliminating the Section 230 rules.

The FCC has always been a little political, in that each new administration appoints a new Chairman who supposedly follows the political inclination of the administration. But the FCC is an independent agency, and FCC Commissioners sometimes surprise the White House. For the most part, though, FCCs tend to follow the basic philosophy of the party in power. This is part of what I call the regulatory pendulum, where the FCC and other regulatory agencies tilt, due to politics, toward the corporations they regulate or toward the public they are supposed to protect.

But in the past, the shifts that came with changes of administration have been subtle, because the vast majority of what the FCC does is not political or controversial. Probably 90% or more of the topics that make it onto the FCC’s dockets are not political but have to do with overseeing the telecom industry. There is nothing political about FCC actions like approving new cellular handsets or trying to stop robocalling.

To some extent, the current politicization of the FCC can be attributed to Congress, which has been too divided and partisan to pass a new telecom act. The current primary telecom rules were passed in 1996 when broadband access meant AOL and CompuServe, and the rules governing broadband are badly out of date. It’s like we’re regulating self-driving cars with horse and buggy rules.

Without updated directions from Congress, the FCC is forced to somehow fit desired policy changes inside of existing rules. That was the primary reason for the convoluted process the current FCC undertook to eliminate broadband regulation. The problem with these ad hoc workarounds is that a subsequent FCC can undo every workaround, and the new FCC in 2021 is likely to reimpose broadband regulation and net neutrality.

None of this regulatory back and forth is healthy for the FCC or healthy for the country. When the FCC gets tainted by charges of political bias, then the public and the industry come to have no faith in the FCC or anything they order.

The courts ruled that the current FCC was within its regulatory powers to undo broadband regulation – and that same court ruling means that a new FCC has the power to undo anything the last FCC did. If you ask the executives of the largest ISPs what they most want out of regulation, they will tell you it’s consistency. The big ISPs were perfectly fine living with the net neutrality rules, and the CEO of every big ISP went on the record saying so. What they are not fine with is the FCC changing the rules on net neutrality, privacy, and other important issues every time there is a change of administration.

Unfortunately, the power to stop this policy yo-yo is in the hands of Congress and I don’t hold out any big hope that Congress can agree on important telecom issues to the extent needed to issue an updated Telecom Act. New telecom legislation would provide a clear set of policies that would apply to an FCC appointed by Democrats or Republicans. But maybe Congress will surprise us all and dig in on a bipartisan basis and figure this out. Broadband and related topics are too important to allow a big policy shift every time there is a change in the White House.

Powering the Future

For years there have been predictions that the world would be filled with small sensors that would revolutionize the way we live. Five years ago, there were numerous predictions that we’d be living in a cloud of sensors. The limitation on realizing that vision has been figuring out how to power sensors and the other electronics. Traditional batteries are too expensive and have a limited life. As you might expect, scientists from around the world have been working on better power technologies.

Self-Charging Batteries. The California company NDB has developed a self-charging battery that could remain viable for up to 28,000 years. Each battery contains a small piece of radioactive carbon-14 recovered from recycled nuclear fuel rods. As the isotope decays, the battery uses a heat sink of lab-created carbon-12 diamond, which captures the energetic particles of decay while acting as a tough physical barrier to contain the radiation.

The battery consists of multiple layers of radioactive material and diamond and can be fashioned into any standard battery size, like a AAA. The overall radiation level of the battery is low – less than the natural radiation emitted by the human body. Each battery is effectively a small power generator in the shape of a traditional battery that never needs to be recharged. One of the most promising aspects of the technology is that nuclear power plants pay NDB to take the radioactive material.

Printed Flexible Batteries. Scientists at the University of California San Diego have been researching batteries that use silver oxide-zinc chemistry. They’ve been able to create a flexible device that offers 10 times the energy density of lithium-ion batteries. The flexible material means that batteries can be shaped to fit devices instead of devices being designed to fit batteries.

Silver–zinc batteries have been around for many years, and the breakthrough is that the scientists found a way to screen print the battery material, meaning a battery can be placed onto almost any surface. The printing process, done in a vacuum, layers on the current collectors, the zinc anode, the cathode, and the separator to create a polymer film that is stable up to almost 400 degrees Fahrenheit. The net result is a battery with ten times the power output of a lithium-ion battery of the same size.

Anti-Lasers. Science teams from around the world have been working to create anti-lasers. A laser operates by beaming photons, while an anti-laser sucks up photons from the environment. An anti-laser can be used in a laptop or cellphone to collect photons and use them to power the battery in the device.

The scientific name for the method being used is coherent perfect absorption (CPA). In practice, this requires one device that transmits a beam of photons and devices with CPA technology that absorb the beam. In the laboratory, scientists have been able to capture as much as 99.996% of the transmitted power, making this more energy-efficient than plugging a device into electric power. There are numerous possible uses for the technology, starting with the obvious ability to charge devices that aren’t plugged into electricity. But CPA devices have other possible uses. For example, the devices are extremely sensitive to changes in photons in a room and could act as highly accurate motion sensors.

Battery-Free Sensors. In the most creative solution I’ve read about, MIT scientists started a new firm, Everactive, and have developed sensors that don’t require a battery or external power source. The key to the Everactive technology is the use of ultra-low power integrated circuits which are able to harvest energy from sources like low-light sources, background vibrations, or small temperature differentials.

Everactive is already deploying sensors in applications where it’s hard to service batteries, such as inside steam-generating equipment. The company also makes sensors that monitor rotating machinery and that are powered by the vibrations coming from the machinery. Everactive says its technology has a much lower lifetime cost than traditionally powered sensors when considering the equipment downtime and cost required to periodically replace batteries.

The Fiber Backlog

One of the issues facing new fiber projects in 2021 is the backlog and slow ordering time for fiber cable. I’ve heard recently from clients that have been told it will take them from four to nine months to get new fiber.

Part of this delay can be blamed on the pandemic, as factories and shippers everywhere got turned sideways during the pandemic. We saw a big slowdown after the first quarter of 2020 for electronics that was due to the pandemic. Some of this was because Wuhan, China is where a lot of optical electronics are manufactured – and Wuhan was also ground zero for the pandemic. Electronics production in Wuhan ground to a quick stop when the local government responded to the pandemic with a total and prolonged shutdown.

The local backlog in Wuhan eventually cleared, but the industry started looking for workarounds. Many of the vendors that were relying on factories in Wuhan moved part of their manufacturing to other countries as a hedge against having all manufacturing located in one concentrated area. This wasn’t easy to do during the pandemic. That’s a shift that has been due for years – eventually something was bound to happen in Wuhan to pinch the supply chain, be it this pandemic, major weather events, or politics. I think many vendors learned a lesson and are going to diversify their supply chains in the future. This is going to cost Wuhan a lot of business but will be better for the rest of the world. I’m hoping that at least some of this manufacturing finds its way back to the US – the fact that electronics are all made overseas feels like a national security issue to me.

But the backlog in fiber preceded the pandemic – there was already a backlog before the beginning of 2020. The pandemic added to the backlog, but it’s something that was already building. The backlog in fiber seems like more of a traditional supply and demand issue.

The world has been building fiber at an astonishing and accelerating pace. Just in this country, there are fiber projects everywhere. A few big companies like Verizon have been buying huge quantities of fiber. For example, Verizon announced in 2017 that it was going to buy over $1 billion in fiber from Corning over a few years – up to 12.4 million miles of cable. But seemingly everybody else is also building fiber. Until the pandemic curtailed my travel, it seemed like I saw fiber construction crews almost everywhere I went. Just a few years earlier, spotting a fiber crew was a rarity.

There is definitely a backlog in fiber, but the backlog is far more pronounced for smaller fiber buyers and worst of all for new fiber buyers. This is where normal supply and demand kicks in. A company like Corning is always going to put Verizon at the front of the delivery queue. The largest buyers like Verizon worry about not having enough fiber, and so they place large orders that eat up the capacity at factories. When there is word of supply chain problems and shortages, the big companies like Verizon order even more fiber to be safe.

This creates a shortage at the manufacturer which can’t pledge the extra fiber to anybody else. Over time, as the big companies don’t take delivery of all of the fiber, the excess enters the supply chain for everybody else. This creates a fluctuation in the supply that the manufacturer can’t predict. To some degree, much of the perceived shortage is artificial and is a result of fiber being allocated to the biggest buyers. The shortages start to look really long when these market fluctuations get layered on top of real shortages and slowdowns like happened during the early days of the pandemic.

The current shortage is probably not as bad as what buyers are being told by suppliers. Somebody being promised fiber in nine months will likely get it in six, and those being told six months will probably get it in four months. But those are still historically long waits for fiber.

There is not a whole lot that a new fiber buyer can do about the situation. Big carriers buy directly from the manufacturers, and it’s not likely that Verizon and other big buyers are waiting long for fiber. Everybody else in the industry buys fiber through wholesale supply houses, and these are the ones seeing the biggest impact from the yo-yoing supply. Just like the manufacturers take care of the huge buyers, a supply house takes care of its long-time buyers first, so small and new fiber buyers are at the end of the supply chain. In a true shortage, like the one that happened years ago when one of the major fiber factories burned down, the smallest buyers might not be able to get fiber at all.

This current shortage will eventually clear and the market will return to normal – it always does. But for 2021 and even beyond, a new fiber buyer needs to order early or face sitting around waiting on fiber.

Building Rural Coaxial Networks

Charter won $1.22 billion in the RDOF grant auction and promised on the short-form to build gigabit broadband. Charter won grant areas in 24 states, including being the largest winner in my state of North Carolina. I’ve had several people ask me if it’s possible to build rural coaxial networks, and the answer is yes, but with some caveats.

Charter and other cable companies use hybrid fiber-coaxial (HFC) technology to deliver service to customers. This technology builds fiber to neighborhood nodes and then delivers service from the nodes over coaxial copper cables. HFC networks follow a standard called DOCSIS (Data Over Cable Service Interface Specification) that was created by CableLabs. Charter currently uses the latest standard, DOCSIS 3.1, which easily allows for the delivery of gigabit download speeds, but something far slower for upload.

There are several distance limitations of an HFC network that come into play when deploying the technology in rural areas. First, there is a limitation of roughly 30 miles between the network core and a neighborhood node. The network core in an HFC system is called a CMTS (cable modem termination system). In urban markets, a cable company will usually have only one core, and there are not many urban markets where 30 miles is a limiting factor. But 30 miles becomes a limitation if Charter wants to serve the new rural areas from an existing CMTS hub, which would normally be located in a larger town or county seat. In glancing through the rural locations that Charter won, I see places that are likely going to force Charter to establish a new rural hub and CMTS. There is new technology available that allows a small CMTS to be deployed in the field, so perhaps Charter is looking at this technology. It’s not a technology that I’ve seen used in the US, and the leading manufacturers of small CMTS technology are the Chinese electronics companies that are banned from selling in the US. If Charter is going to reach rural neighborhoods, in many cases it will have to deploy a rural CMTS in some manner.

The more important distance limitation is in the last mile of the coaxial network. Transmissions over an HFC network can travel about 2.5 miles without needing an amplifier. That isn’t very far, and amplifiers are routinely deployed to boost the signals in urban HFC networks. Engineers tell me that the maximum number of amplifiers that can be deployed in a cascade is five; beyond that number, the broadband signal strength quickly dies. This limitation means that the longest run of coaxial cable to reach homes is about 12.5 miles. That’s 12.5 miles of cable, not 12.5 miles as the crow flies.
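That distance arithmetic can be sketched in a few lines. This is only a back-of-the-envelope illustration using the figures cited above (a 2.5-mile coax span and a five-amplifier budget) – a real amplifier cascade design depends on cable gauge, frequency plan, and signal quality targets.

```python
# Back-of-the-envelope HFC reach calculation using the figures cited
# above: ~2.5 miles of coax per amplifier span, and a practical limit
# of five amplifiers in cascade before signal quality degrades.

SPAN_MILES = 2.5       # coax distance before the signal needs a boost
MAX_AMPLIFIERS = 5     # practical cascade limit

def max_coax_run(span_miles=SPAN_MILES, max_amps=MAX_AMPLIFIERS):
    """Longest coax run from a node, in cable miles (not straight-line miles)."""
    return span_miles * max_amps

print(max_coax_run())  # 12.5
```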

To stay within the 12.5-mile limit, Charter will have to deploy a lot of fiber and create rural nodes that might serve only a few homes. This was the same dilemma faced by the big telcos when they were supposed to upgrade DSL with CAF II money – the telcos needed to build fiber deep into rural areas to make it work. The telcos punted on the idea, and we now know that a lot of the CAF II upgrades were never made.

Charter faces another interesting dilemma in building an HFC network. The price of copper has grown steadily over the last few decades, and copper now costs four times more than it did in 2000. This means that coaxial cable is relatively expensive to buy (a phenomenon familiar to anybody building a new house who hears the price of new electrical wiring). It might make sense in a rural area to build more fiber to reduce the miles of coaxial cable.

Building rural HFC makes for an interesting design. There were a number of rural cable systems built sixty years ago at the start of the cable industry, because these were the areas in places like Appalachia that had no over-the-air TV reception. But these early networks carried only a few channels of TV, meaning that the distance limitations were a lot less critical. But there have been few rural cable networks built in more recent times. Most cable companies have a metric where they won’t build coaxial cable plant anywhere with fewer than 20 homes per road mile. The RDOF grant areas are far below that metric, and one has to suppose that Charter thinks that the grants make the math work.

To answer the original question – it is possible to build rural coaxial networks that can deliver gigabit download speeds. But it’s also possible to take some shortcuts and overextend the amplifier budget and curtail the amount of bandwidth that can be delivered. I guess we’ll have to wait a few years to see what Charter and others will do with the RDOF funding.


$100 Broadband

Advocates for digital inclusion have shown that the primary reason that many homes don’t buy broadband is price – homes can’t afford the broadband from the big cable companies. The title of this blog is ‘$100 Broadband’ because we’re on a trajectory for that to become the normal price of broadband in just a few years.

Broadband is already expensive, and the cable companies are now in the mode of raising rates every year. Consider the prices already charged today by Comcast and Charter.

The Comcast basic ‘Performance’ broadband product is priced at $76 starting on January 1, an increase of $3. To go along with this, Comcast is now charging $14 per month for a modem, an increase of $1 per month. This means somebody who is not receiving special pricing or who is not in a bundle is now paying $90 per month for basic broadband. That rate doesn’t include the extra fees being levied on households that exceed the monthly 1.2 terabyte data cap. If Comcast continues to increase rates by $4 per year, then they’ll be at a $98 rate in 2023 and have a base rate of $102 in 2024.

Not all Comcast customers pay this full rate today, but many eventually will. New customers who have switched from DSL probably have special low introductory rates that revert to the list price after a one or two-year contract. A large percentage of Comcast customers pay less than the list price through bundling. Nobody with a bundle knows what they pay for broadband, but they quickly find out that they are expected to pay the list price if they dare to cut the cord and break the bundle.

Charter is not as expensive as Comcast. The company just raised the rate for its basic broadband product by $5 on December 1 to reach $74.99. In addition, Charter charges $5 for a modem, bringing the standalone price for broadband to $79.99. At a $5 annual rate increase, the company will reach $100 rates in four years. Charter has also petitioned the FCC to allow it to bill for data caps – something that will substantially increase the rates for homes where people are working or attending school remotely.
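The trajectory for both companies is simple arithmetic. Here is a minimal sketch projecting the standalone price (list price plus modem rental) forward using the 2021 figures cited above; the dollar amounts come from the text, while the assumption that the annual increases continue at the same pace is just that – an assumption.

```python
# Project standalone broadband prices (list price + modem rental)
# forward, assuming the annual increases described above continue.

def project_rate(start_rate, annual_increase, years):
    """Price after the given number of yearly increases."""
    return start_rate + annual_increase * years

comcast_2021 = 76 + 14       # $76 'Performance' tier + $14 modem = $90
charter_2021 = 74.99 + 5     # $74.99 broadband + $5 modem = $79.99

print(project_rate(comcast_2021, 4, 3))   # 102 -> $102 base rate by 2024
print(project_rate(charter_2021, 5, 4))   # just under $100 in four years
```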

I am certain that most consumers don’t know the full price of broadband. My consulting firm does residential surveys, and in a few recent surveys in Comcast markets, the average Comcast customer thinks broadband costs around $70. This speaks to the power of hidden fees – the average customer doesn’t think of the ridiculously high $14 modem fee as a broadband charge. Comcast and the other big ISPs have mastered the art of confusing customers with billing practices that make it hard to see the price of a given product.

Comcast also hides its rates from the general public. If you don’t believe me, search for Comcast rates on the web – all you’ll easily find are the rates being charged to customers that switch from DSL. You won’t find the company talking about its actual rates outside of small-print footnotes – and even the small print won’t mention the modem charge.

I predict that the cable companies are going to start quietly cutting back on special pricing and bundling discounts. Those discounts no longer make competitive sense in markets where the only other competitor is telco DSL. AT&T recently announced it will not be connecting new DSL customers, meaning that a cable company likely has no competition in markets where AT&T is the telco. But the cable companies have largely obliterated DSL in almost every market. It has to be dawning on cable companies that they have won the broadband war and they no longer have to give away deep discounts to get and keep customers. The cable companies are now de facto monopolies in most markets, and they will start acting like monopolies. And that means charging full price for services among other things.

Right now, the FCC has no authority over broadband prices since the agency wrote itself out of the broadband regulation business. But if the FCC never discourages cable companies from continually raising rates, we’re going to be looking at rates of $150 per household in a decade. Monopolies are going to keep raising rates until a regulator steps in and tells them to knock off the nonsense.

The Cost of Using Poles

The Georgia Public Service Commission (GPSC) passed a rule recently that reduces the cost of pole attachments to $1 per year per pole for anybody that builds broadband in areas of the state that the state considers to be unserved. They titled this the One Buck Deal. The state has created its own broadband map that undoes many of the errors in the FCC’s broadband maps and shows that over 500,000 rural homes don’t have broadband.

I really don’t mean to detract from any effort to make it easier to build rural fiber – but pole attachment fees are not what is stopping companies from building rural fiber. It’s easy to understand how regulators got this idea, because the big ISPs have been screaming about pole attachment fees for years. And at the national level, the biggest fiber builders have claimed that pole attachment fees are an impediment.

From an operating perspective, annual pole attachment fees are a relatively minor cost for most network owners. The biggest expenses for operating a new fiber project are labor and interest on debt. Other big expenses include the cost of the Internet backbone, billing, and marketing. Pole attachments fall far down the list, and for most projects I’ve worked with, the cost of pole attachments is rarely more than a percent or two of total operating expenses. While the GPSC gesture of reducing these fees would be welcome to a fiber overbuilder, avoiding 1% of operating costs isn’t going to move the needle on any business plan.
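To make that point concrete, here is a hypothetical operating budget for a small rural fiber project. Every dollar figure below is invented for illustration – only the conclusion, that pole attachment fees land at a percent or two of operating costs, reflects the projects described above.

```python
# Hypothetical annual operating expenses for a small rural fiber
# project. All figures are illustrative, not from a real business plan.

annual_opex = {
    "labor": 400_000,
    "interest_on_debt": 250_000,
    "internet_backbone": 80_000,
    "billing_and_marketing": 60_000,
    "pole_attachments": 15_000,   # e.g., 1,000 poles at ~$15 per pole per year
}

total = sum(annual_opex.values())
share = annual_opex["pole_attachments"] / total
print(f"Pole attachments: {share:.1%} of operating costs")  # 1.9%
```

Zeroing out the pole attachment line in a budget like this barely changes the total, which is why the One Buck Deal doesn’t move the needle.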

The biggest cost of deploying fiber is the construction cost of building the fiber network along each road in a service area. Poles play a major role in the cost equation, but it’s not the fees to rent the poles that are the problem. The biggest cost culprit in putting fiber on poles is something the industry calls make-ready. This is the cost of getting poles ready before fiber can be hung. There are national electrical standards that define the spacing between the wires of different utilities – rules designed to protect technicians who must work on poles, particularly when trying to fix storm damage.

Make-ready costs fall into three general categories. Some make-ready involves fixing existing problems with wires. The original utilities on the poles may not have followed the safety rules, and there are often cases where wires are already out of compliance. Cables may be installed too close to neighboring wires. Wires might have too much sag, making it hard for an additional attacher. Unfortunately, the make-ready rules say that the new fiber attacher must pay the full cost of fixing these existing problems.

The second category of make-ready involves situations where there is not enough room for a new attacher. In these cases, the pole must be replaced with a taller pole and each existing attacher must move wires to the new pole. Unfortunately, the new attacher must also pay for all of these costs.  The final category of cost in areas with a lot of trees is tree trimming. Electric utilities are supposed to keep trees trimmed out of the way of the wires on a pole – but if they are lax in this effort, then the new fiber attacher must also pick up these fees.

It’s not unusual for make-ready costs to range from $10,000 to $20,000 per mile, with some cases we know of as high as $50,000 per mile. The areas with the highest costs are those where pole owners (generally electric companies) have neglected pole maintenance for many years. A new fiber builder is often saddled with replacing poles that are rotted or leaning – something the utilities should have been routinely fixing over the years. I know of cases where practically every pole needs to be replaced – and this can generally be pinned on the absence of maintenance by the pole owner.

If the GPSC really wanted to make it easier to build rural fiber, it would have tackled the make-ready issue aggressively. It’s crazy that a new pole attacher must pay to fix the existing safety violations of the current utilities using a pole. It’s massively unfair that a new fiber attacher should pay the full cost to replace poles that are old, rotted, and already unsafe.

But fixing the make-ready issue means taking on the powerful lobbies of existing utilities. The telcos, cable companies, and electric utilities collectively have a huge presence in most state legislatures. They are perfectly happy with the status quo where the new guy pays to fix all past sins.

I hope the Georgia idea doesn’t catch on. Regulators and state politicians look for easy ways to say that they are doing something to fix the rural broadband problem. They will point to things like the One Buck Deal to prove they are taking action – when in fact, actions like this one don’t make it any easier to build rural fiber. If regulators want to fix rural pole issues, then they should be fixing the 99% cost problem of pole make-ready instead of the 1% cost issue of pole attachment fees.