Continued Bans Against Municipal Competition

There are still states around that don’t let municipalities participate in finding broadband solutions. In a world where it’s now clear that broadband is vital to homes, it’s hard to understand how such bans make any sense. I’m going to write today about my state of North Carolina as an example of how continued bans are harming the citizens of the state.

This is a state where the telecom lobby has been historically generous with state lawmakers and has been able to pass desired legislation for decades. The municipal ban in North Carolina was passed a decade ago when lawmakers reacted to a citywide fiber network constructed by the City of Wilson. The big ISPs at the time – Time Warner Cable (now part of Charter), AT&T, and CenturyLink – argued that municipal competition was unfair and that the private sector should be allowed to take care of the broadband needs of communities.

The pandemic has highlighted the fallacy of the big ISP argument. In the ten years since the law passed, the state has seen some improvements in broadband. When Google Fiber popped up in the Research Triangle, both AT&T and Time Warner reacted quickly and a few lucky households can buy gigabit broadband today from three providers. There are a few telephone cooperatives in the state that have built fiber in rural areas, including one that is extending into some neighboring counties. A few WISPs have built wireless broadband to pockets of rural customers. Charter upgraded a few years ago to DOCSIS 3.1 and cable broadband in the cities is decent – although upload speeds now look to be inadequate to handle the pandemic.

But the households that had no broadband, or poor broadband, a decade ago when the municipal ban was passed still had no broadband when the pandemic sent students and adults home to work. AT&T has gone so far as to announce on October 1 that it will no longer install new DSL customers – so it's ceding the towns and cities in the state to become monopoly markets controlled by cable companies. CenturyLink got billions of federal dollars to improve rural broadband to at least 10/1 Mbps, but nobody in North Carolina can find where these upgrades have been made.

The big ISPs are lobbying against a bogeyman that doesn’t exist. Municipal competition has been allowed in a lot of the country for twenty years and there are still only around 200 communities that have built fiber – most of them small and most of them already having a municipal electric utility.

Communities only look for broadband solutions when prompted to do so by their citizens. A big percentage of rural communities are now exploring better broadband because their citizens are screaming loudly about the inability to participate in normal daily life without good broadband. At the local level, broadband is a non-partisan issue and you won’t find many local politicians in rural areas who are not strong proponents of solving the broadband gaps.

But unfortunately, at the state level, politics as usual still controls the municipal ban on broadband. Almost annually in North Carolina, proponents for better broadband take a shot at overturning the municipal broadband ban. But year after year the lobbyists kill such efforts. In the last legislative session the story I heard is that AT&T lobbyists were able to change the wording of the proposed new law to the point of neutering it. This is the same AT&T that publicly announced a year ago that it was finished building residential fiber and that this year decided to bow out of the business of providing broadband using DSL on copper wires. This is a big ISP that is not spending any money to help North Carolina households but is still spending on lobbyists to kill any law that even hints at broadband competition.

The AT&T maneuver that killed a pro-broadband law earlier this year demonstrates the newest big ISP tactic. Lobbyists get language inserted into proposed bills that kills them – but the language is always subtle, and to a layperson never sounds bad. This gives cover to state politicians, who can tell their rural constituents that they are pro-broadband while still voting against pro-broadband laws.

What’s ironic about the municipal broadband ban is that there are only a few communities that are willing to become an ISP. The vast majority of communities that spend money looking for a broadband solution are trying to lure a new ISP to serve their community. But of course, the big ISPs don’t want that any more than they want municipal competition.

The pandemic may have changed the calculus of the legislative process enough to overturn the bans against municipal broadband. Rural residents are now up-in-arms about lack of broadband and are letting politicians know their unhappiness. But since the legislative process is done behind closed doors, it’s sadly likely that the lobbyists of the big ISPs will continue to hold enough sway to keep killing competition.


Why I am Thankful for 2020

Every year I write a blog at Thanksgiving talking about the things in our industry for which I am thankful. Most years it’s not hard to do this because there are always a lot of great things happening in the broadband industry. But 2020 has been hard on the broadband industry just like it’s been hard on all of us. I had to reach a little deeper this year to make a list. Please feel free to comment on this blog with things you are thankful for this year.

Response to the Pandemic. To me, the big story of the year is the way that local officials and local ISPs quickly responded to the pandemic. It was a shock to send kids home to do schoolwork when many didn't have computers or home broadband connections. I've talked to dozens of school districts that scrambled and found hot spots and computers so that within a short time kids had some options.

Unfortunately, this wasn’t always easy. For instance, there are a lot of rural places with poor cellular coverage where sending home a wireless hotspot wasn’t a viable solution. Communities and ISPs found ways to install public hot spots at schools, parked school buses, restaurants, fire stations – any place where people could park cars and where ISPs could get a broadband signal. I’m thankful for the thousands of people who mobilized quickly to make this happen.

Rural Broadband Problems Got Noticed. Politicians at every level heard from angry constituents who will no longer tolerate the sad state of rural broadband. All of a sudden, almost every politician is talking about solving the rural broadband problem. We'll have to see how this translates into action when the pandemic is over, but there is no mistaking that rural residents were finally heard loud and clear.

Rural Spectrum. Probably the brightest broadband news this year is that the FCC released a ton of new spectrum that can be used for rural broadband. Broadband purists want everybody in America to have fiber, but until we figure out how to pay for that, today’s wireless technology can deliver 50 Mbps to 100 Mbps broadband in rural areas and is a badly-needed solution. The new spectrum gives WISPs a chance to step up their game.

Better WiFi on the Way. The industry released the WiFi 6 standard and the FCC approved 6 GHz spectrum for WiFi use. These two innovations are going to revolutionize WiFi. A lot of the problems that homes cite with broadband performance can be blamed on our currently overloaded WiFi spectrum bands. Within a few years, most of these problems should melt away with new WiFi gear.

A New FCC Coming. While this FCC did some positive things, they have gone too far in the direction of catering to the big ISPs at the expense of the public good. The ideal FCC balances the needs of the industry and the needs of the public. I expect a new FCC is going to swing the regulatory pendulum away from a carrier emphasis back closer to where the FCC ought to be.

Cybersecurity Getting Better. Early news reports say there was no apparent tampering with voting machines in the recent elections. That's great news and is a reminder that cybersecurity has quietly gotten a lot better at protecting computer networks. There hasn't been a big hack of corporate or government networks announced for a while. The biggest threats to computer networks continue to come from disgruntled employees or employees who inadvertently let bad actors into networks.

Growth of Video Conferencing. I don’t know how others feel, but I like video conferencing. I find it refreshing to see who I’m talking to. As a lifetime road warrior, I really like not getting on an airplane to make a presentation. We’ve learned this year that people can communicate well from a distance. I don’t know about the rest of the world, but I won’t be flying across the country without a very good reason when the pandemic is finally over – and for that I’m thankful.

It’s Almost 2021. Perhaps the best thing about 2020 is that it’s almost over and we’ll soon get a new year, and hopefully a reset. May 2021 be better for you all.

Traditional Cable Losses Slow in 3Q 2020

The largest traditional cable providers collectively lost over 1.1 million customers in the third quarter of 2020 – an overall loss of 1.5% of customers. This is smaller than the second-quarter loss of 1.5 million net customers. To put the quarter's loss into perspective, the big cable providers lost about 12,500 cable customers per day throughout the quarter.

The numbers below come from Leichtman Research Group which compiles these numbers from reports made to investors, except for Cox which is estimated. The numbers reported are for the largest cable providers, and Leichtman estimates that these companies represent 95% of all cable customers in the country.

Following is a comparison of third-quarter subscriber numbers to those at the end of the second quarter of 2020:

3Q 2020 2Q 2020 Change % Change
Comcast 20,094,000 20,367,000 (273,000) -1.3%
Charter 16,235,000 16,168,000 67,000 0.4%
DirecTV 13,600,000 14,290,000 (690,000) -4.8%
Dish TV 8,965,000 9,052,000 (87,000) -1.0%
Verizon 4,000,000 4,062,000 (62,000) -1.5%
Cox 3,710,000 3,770,000 (60,000) -1.6%
AT&T TV 3,500,000 3,400,000 100,000 2.9%
Altice 3,035,100 3,121,500 (86,400) -2.8%
Mediacom 663,000 676,000 (13,000) -1.9%
Frontier 518,000 560,000 (42,000) -7.5%
Atlantic Broadband 317,787 311,845 5,942 1.9%
Cable One 277,000 290,000 (13,000) -4.5%
Total 74,914,887 76,068,345 (1,153,458) -1.5%
Total Cable 44,331,887 44,704,345 (372,458) -0.8%
Total Satellite 22,565,000 23,342,000 (777,000) -3.3%
Total Telco 8,018,000 8,022,000 (4,000) 0.0%
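For anyone who wants to double-check the math, the changes, percentages, and category subtotals follow directly from the per-company counts. Here's a quick Python sanity check, with the figures hard-coded from the table above:

```python
# Recompute each category's quarterly change and percentage change
# from the per-company subscriber rows in the table above.

companies = {
    # name: (3Q 2020, 2Q 2020, category)
    "Comcast":            (20_094_000, 20_367_000, "cable"),
    "Charter":            (16_235_000, 16_168_000, "cable"),
    "DirecTV":            (13_600_000, 14_290_000, "satellite"),
    "Dish TV":            ( 8_965_000,  9_052_000, "satellite"),
    "Verizon":            ( 4_000_000,  4_062_000, "telco"),
    "Cox":                ( 3_710_000,  3_770_000, "cable"),
    "AT&T TV":            ( 3_500_000,  3_400_000, "telco"),
    "Altice":             ( 3_035_100,  3_121_500, "cable"),
    "Mediacom":           (   663_000,    676_000, "cable"),
    "Frontier":           (   518_000,    560_000, "telco"),
    "Atlantic Broadband": (   317_787,    311_845, "cable"),
    "Cable One":          (   277_000,    290_000, "cable"),
}

def totals(category=None):
    """Sum one category of providers (or all of them when category is None)."""
    rows = [v for v in companies.values() if category in (None, v[2])]
    q3 = sum(r[0] for r in rows)
    q2 = sum(r[1] for r in rows)
    return q3, q2, q3 - q2, 100 * (q3 - q2) / q2

for cat in (None, "cable", "satellite", "telco"):
    q3, q2, change, pct = totals(cat)
    print(f"{cat or 'all':10s} {q3:>12,} {q2:>12,} {change:>+12,} {pct:+.1f}%")
```

Summing the rows this way reproduces the overall loss of 1,153,458 customers (-1.5%) for the quarter.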

Some observations about the numbers:

  • The big loser is AT&T, which lost a net of 590,000 traditional video customers between DirecTV and AT&T TV (relabeled from AT&T U-verse). It’s worth noting that AT&T added 100,000 telco cable customers for the quarter.
  • The big percentage loser continues to be Frontier which lost 7.5% of its cable customers in the quarter.
  • Charter has gained cable customers for two quarters in a row. The company credits the gains to offering a lower-price package and also to a marketing campaign that is giving two months free of broadband to new customers during the pandemic. Charter has been beating the industry as a whole every quarter since Q3 2018.

The loss of traditional cable customers continues to mount at dizzying levels for the industry. This is the seventh consecutive quarter where the industry lost over one million cable subscribers. It's especially worth noting that these losses happened during a quarter when the biggest ISPs gained over 1.5 million broadband customers.

One interesting thing to note is that people cutting the cord seem to be switching to online video sources that carry many of the same channels as traditional cable TV. In the third quarter, the combination of Hulu + Live TV, Sling TV, AT&T TV Now, and fuboTV collectively added over one million customers. This count doesn't include YouTube TV or Philo, which don't report customers quarterly. The online industry pins the increases on the return of live sports. It's worth noting that Hulu + Live TV would now rank as the fifth largest cable provider, ahead of Verizon.

Broadband Usage Stays Strong in 3Q 2020

OpenVault recently released its Broadband Insights Report for the third quarter of 2020. OpenVault supplies software used by the companies that run the Internet and is able to provide some interesting insights into the state of broadband.

Probably the biggest news in the report is that increased household usage due to the pandemic has not abated. The average US home in September used 384 gigabytes of data, up slightly from 380 gigabytes in June, but up 40% from September 2019. Perhaps the most interesting thing about that number is that schools returned to live classes in many parts of the country in September, and yet average Internet usage did not decline.

The 384 gigabytes represents total household bandwidth usage, upload and download combined. OpenVault reported average upload and download usage separately for the first time: the average home downloaded 359 gigabytes and uploaded 25 gigabytes of data. That upload number is the shocking one – just a year ago it would have been a lot smaller.

The number of power users remains high, with 8.8% of all US households now using more than 1 terabyte of data per month, including 1% of households now using over 2 terabytes per month. That is more than double the 4.2% of households that used a terabyte of monthly data in the third quarter of 2019. This has to be good news for ISPs with data caps – most are not enforcing data caps during the pandemic, but they will realize significant new revenue when they go back to billing for high broadband usage.

Subscriptions to faster broadband continue to climb as households upgrade to faster broadband tiers. Since the second quarter, nationwide subscribers to gigabit broadband increased from 4.9% to 5.6% (an increase of over 875,000 new gigabit subscribers). Subscribers to speeds between 500 Mbps and a gigabit grew from 5% to 5.24%, and subscribers to speeds between 200 Mbps and 500 Mbps grew from 13.5% to 14.1%.

OpenVault reports two numbers that rural America will find disheartening: the average nationwide download speed in September was 170 Mbps and the average upload speed was 13 Mbps. Those averages highlight better than any other statistic the sad state of rural broadband, where the FCC defines broadband as 25/3 Mbps but most rural homes fall far short of even that modest goal. It's worth noting that the average speeds are now heavily influenced by the households subscribing to gigabit speeds.

Remembering that OpenVault works for the largest ISPs, the report closes with a conclusion that the increased broadband usage means increased revenue opportunities for ISPs as customers migrate to faster broadband speeds and spend between $20 and $30 more per month for broadband.

The OpenVault statistics should be a reminder that broadband usage has been growing at a torrid rate for years, with residential broadband usage increasing by 21% annually for the last decade. The pandemic has accelerated that growth a bit, but to the extent that millions of workers remain working at home after the pandemic, this one-time burst likely represents a permanent reset of the growth curve. Broadband usage has stayed 40% to 50% above 2019 levels this year, and there is no reason to think it will ever recede to past levels. People are going to work from home more in the future. We have all incorporated video meetings into our routines. Millions of households upgraded to 4K TVs during the pandemic and are not going back to watching lower-resolution video. Higher broadband usage volumes are here to stay.
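That 21% annual growth figure is worth pausing on, because compound growth adds up fast. A quick back-of-the-envelope calculation (using the ~384 GB/month September average from the OpenVault numbers above):

```python
# Back-of-the-envelope: what sustained 21% annual growth does over a decade.
annual_growth = 0.21
years = 10
multiplier = (1 + annual_growth) ** years
print(f"Usage multiplier over {years} years: {multiplier:.1f}x")

# Working backward from the ~384 GB/month average in September 2020, that
# growth rate implies the average home used only ~57 GB/month a decade ago.
print(f"Implied average usage a decade ago: {384 / multiplier:.0f} GB/month")
```

A decade of 21% annual growth is nearly a sevenfold increase – which is exactly why network engineers worry about capacity even in years without a pandemic.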

Updating FCC Device Rules

The general public probably doesn’t realize that all telecom devices must be approved by the FCC before the devices can be announced, marketed, or sold to the public. These requirements were put in place many years ago at the FCC to make certain that devices were safe, that radio devices don’t exceed legal power limits, and that radio devices use only the radio frequencies that are authorized.

Once a manufacturer has devices of a finished quality that are ready to sell to customers, the devices are sent to the FCC testing labs for approval. It's rare for devices to fail the FCC approval process, but it does happen – and one has to suppose the manufacturer of a failed device was hoping for a miracle by sending it in for testing.

This testing is still a vitally needed step, particularly in the wireless world. Devices that go inside central offices, huts, and cellular sites must also pass inspection by Underwriters Laboratories to make sure the devices are safe. But wireless devices have two big concerns that must be addressed. The first is that devices stay within the spectrum bands they are supposed to use. Now that devices have software-defined antennas it would not be hard for cheap devices to stray out of the authorized band, which would cause havoc in the real world as devices interfered with licensed uses of spectrum – including uses by the military, satellites, and other key users of spectrum. Without testing it's not hard to imagine cheap foreign cellphones that would blast out signals outside the authorized band.

The other big issue with wireless devices is the power level. We care about power levels for several reasons. First is user safety – one of the reasons that cellphones have been declared safe over time is that they transmit at relatively low power levels. Power also defines how far a signal can propagate. If wireless devices were allowed to transmit at high power levels, the signal might carry to the horizon and interfere with other uses of the frequency. Limiting device power is one of the key tools that allows the FCC to define license areas for selling or awarding spectrum. The ability to limit power is probably the main reason the FCC has been allowing rural WISPs to use spectrum that otherwise sits idle in rural areas. If WISPs used too much power, they could interfere with urban use of the spectrum.
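The link between transmit power and geographic reach can be made concrete with the standard free-space path-loss formula. This is a simplified sketch – the function name is my own, the 3500 MHz example frequency is purely illustrative, and real rural propagation also depends on terrain, foliage, and antennas:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Standard free-space path loss in dB for a distance in km and a
    frequency in MHz. Real-world propagation adds terrain, foliage, and
    antenna effects on top of this idealized floor."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Every doubling of distance costs about 6 dB, meaning a transmitter must
# quadruple its power to reach twice as far. That is why capping transmit
# power effectively caps the geographic reach of a spectrum license.
for d_km in (1, 2, 4, 8):
    loss = free_space_path_loss_db(d_km, 3500)   # 3500 MHz as an example band
    print(f"{d_km:>2} km: {loss:.1f} dB of path loss")
```

The 6 dB-per-doubling relationship is the whole story of power limits in miniature: hold a WISP to a modest power level and its signal simply cannot reach the nearest city, no matter what antenna it uses.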

The FCC rules are rigid in the process that a device manufacturer must follow. One key aspect of the FCC rules is that manufacturers are prohibited from doing pre-sales or conditional sales of wireless devices – except at the wholesale level. Apple can pre-sell a new iPhone to Verizon, but neither Apple nor Verizon can take preorders from the public. That means that the marketing effort for a new device can't start until the device passes the FCC tests, and the devices can't be sent for FCC testing until the devices are retail-ready.

Manufacturers are also prohibited from sending display versions of their devices to retail outlets. People want to see and touch a new cellphone before they order it, but the devices can’t be displayed in a Verizon store until they are approved for retail sales.

Manufacturers have been asking the FCC to relax these rules so that they can market in the way that most things are marketed today. The testing delays may have made sense decades ago, but today they add significant time to bringing new cellphones and other devices to market.

Cellphones are huge business today and it's a major marketing event when Samsung or Apple announces its next-generation cellphones. I have a hard time seeing why Verizon and other wireless carriers couldn't take pre-orders for the latest phone months before the phones are ready. We already do this with big-ticket items like the Tesla Cybertruck – people are willing to get on waiting lists long before they can ever buy a new truck. We also now live in the world of Kickstarter, where cool new ideas of all kinds are pre-marketed to see if there is enough market demand to go to manufacturing.

The big manufacturers like Samsung and Apple are never going to send a phone for FCC testing that doesn't pass the tests – and they aren't going to deliver phones to customers until they pass the FCC tests. It's hard to think of any reason why the cellular carriers can't take preorders for the latest phone, or what harm would come from taking orders early when customers are fully aware that they have to wait until the new phone is released.

It no longer makes sense to treat FCC-approved devices differently than other electronics. Manufacturers have asked the FCC to allow for waivers from the rules. It’s probably not a good idea to let cheap foreign cellphones be marketed until they have passed FCC muster. But it’s hard to think of any reason why the FCC should delay commerce by not allowing presales of iPhones. It’s time for the FCC rules to catch up to the realities of the marketplace.

State versus Federal Regulation

One of the oldest tugs-of-war in the regulatory world is a battle between state and federal authority to impose regulations. This has been a constant battle in telecom regulation, but also extends across the regulation of many other industries.

The latest telecom clash between state and federal regulators comes from the attempt of states to implement net neutrality rules. The first state to pass a net neutrality law was California, but this was quickly followed by net neutrality rules in Vermont, Washington, Rhode Island, New York, Montana, and Hawaii.

The California net neutrality rules closely match those that were killed by the FCC in 2017. The California law was quickly challenged by the US Department of Justice along with some large ISPs. The federal courts upheld the FCC's authority to kill federal net neutrality but ruled that the FCC didn't have the jurisdiction to override state net neutrality rules.

This year several industry lobbying groups have banded together and have sued in the U.S. District Court for the Eastern District of California to stop the implementation of the California law. This includes the American Cable Association, CTIA — The Wireless Association, NCTA — The Internet & Television Association, and USTelecom — The Broadband Association. Other states have put implementation of state net neutrality rules on hold waiting for the outcome of this latest case.

The line between state and federal regulatory authority has always been a fuzzy one. The plaintiffs in this case argue that the California rules are unlawful because the state is trying to regulate interstate commerce – meaning communication between California residents and Internet servers sitting in data centers in other states. They argue that only the FCC can regulate this kind of traffic.

But the line between state and interstate traffic got blurred a long time ago as telcos and ISPs have implemented centralized technologies. For example, regulators always assumed that states have the authority to regulate telephone features like caller ID or voice mail since these are sold to accompany landlines, which have always been considered to be under state jurisdiction. However, a close look at the technology used by telcos would show that some functions supporting telephone features are handled in other states. A company like AT&T or Verizon might house the master servers that match calling numbers to names in a single data center that serves a huge swath of the country. If a computer dips into a data center in Chicago to get the name of a calling party, does that make caller ID an interstate function?

Unfortunately, technology has badly blurred the lines between telecom products that are interstate in nature versus products that are delivered locally or within a state. My guess is that there are very few telecom products left that are purely local.

This kind of jurisdictional argument also raises the specter of carriers manipulating telecom service delivery to avoid regulation. To use my caller ID example, there is nothing to stop a telco from changing the way that caller ID works if doing so can avoid regulation – put the server in another state, and voila – caller ID is an interstate service.

This raises a larger issue to consider. The blurring of state versus interstate products raises the issue of whether states even have a role in regulating telecom services. For example, there are likely almost no web products or connections that completely begin and end within a single state. If the state / interstate issue is defined in the way the plaintiffs are asking in this case, then states likely have no role left in regulating broadband products.

That may be where the courts end up on the question – but that somehow doesn’t feel right. One of the regulatory roles of states is to protect their citizens against abuses by monopolies and bad actors. In cases where the federal government fails to act, states are the last line in protecting consumer rights. I have no idea how the courts will rule in this case. But I have a hard time thinking that states can’t act to make sure that there is no discrimination in the routing of Internet traffic affecting their citizens.

Quantum Encryption

Verizon recently conducted a trial of quantum key distribution technology, which is the first generation of quantum encryption. Quantum cryptography is being developed as the next-generation encryption technique that should protect against hacking from quantum computers. Carriers like Verizon care about encryption because almost every transmission inside our communications paths is encrypted.

The majority of encryption today uses asymmetric encryption, which relies on a pair of keys – one public, one private. To use an example: if you want to send encrypted instructions to your bank (such as to pay your broadband bill), your computer uses the publicly available key issued by the bank to encode the message. The bank then uses a different private key, which only it has, to decipher the message.

Key-based encryption is safe because it takes immense amounts of computing power to guess the details of the private key. Encryption methods today mostly fight off hacking by using long encryption keys – the latest standard is a key consisting of at least 2048 bits.
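The public/private key relationship is easy to see with textbook RSA scaled down to toy numbers. This is purely illustrative – real keys are at least 2048 bits and use padding schemes, so nothing like this should ever be used in practice:

```python
# Textbook RSA with tiny primes -- for illustration only. Real deployments
# use >= 2048-bit keys and padding (e.g., OAEP); never roll your own crypto.

p, q = 61, 53                 # two secret primes (thousands of bits in practice)
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120, kept secret
e = 17                        # public exponent -> the public key is (e, n)
d = pow(e, -1, phi)           # private exponent -> the private key is (d, n)

message = 65                      # a message encoded as a number < n
ciphertext = pow(message, e, n)   # anyone can encrypt with the public key
decrypted = pow(ciphertext, d, n) # only the private-key holder can decrypt

print(ciphertext, decrypted)
assert decrypted == message
```

Recovering the private key here amounts to factoring n back into p and q – trivial at this size, but computationally infeasible for classical computers at 2048 bits. Shor's algorithm on a large enough quantum computer would make that factoring step fast, which is exactly the threat described below.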

Unfortunately, current encryption methods won't stay safe for much longer. It seems likely that quantum computers will soon have the capability of cracking today's encryption keys. This is possible since quantum computers can perform thousands of simultaneous calculations and could cut down the time needed to crack an encryption key from months or years down to hours. Once a quantum computer can do that, then no current encryption scheme is safe. The first targets for hackers with quantum computers will probably be big corporations and government agencies, but it probably won't take long to turn the technology to hack into bank accounts.

Today’s quantum computers are not yet capable of cracking today’s encryption keys, but computing experts say that it’s just a matter of time. This is what is prompting Verizon and other large ISPs to look for a form of encryption that can withstand hacks from quantum computers.

Quantum key distribution (QKD) uses a method of encryption that might be unhackable. Photons are sent one at a time through a fiber optic transmission to accompany an encrypted message. If anybody attempts to intercept or listen to the encrypted stream the polarization of the photons is impacted and the recipient of the encrypted message instantly knows the transmission is no longer safe. The theory is that this will stop hackers before they know enough to crack into and analyze a data stream.
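The detection idea is easiest to see in a toy simulation of BB84, the best-known QKD scheme. This is a purely illustrative model – the function name is mine, classical random numbers stand in for quantum measurements, and real QKD hardware is far more involved:

```python
import random

def bb84_sample(n_bits=2000, eavesdrop=False, seed=1):
    """Toy BB84 sketch: the sender encodes each bit in a random basis and the
    receiver measures in a random basis; only matching-basis bits are kept.
    An eavesdropper measuring in her own random basis disturbs the photons,
    which shows up as a ~25% error rate in the kept bits."""
    rng = random.Random(seed)
    errors = kept = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)                  # sender's raw key bit
        send_basis = rng.randint(0, 1)           # 0 = rectilinear, 1 = diagonal
        photon_basis, photon_bit = send_basis, bit
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != photon_basis:        # wrong-basis measurement
                photon_basis, photon_bit = eve_basis, rng.randint(0, 1)
        recv_basis = rng.randint(0, 1)
        measured = photon_bit if recv_basis == photon_basis else rng.randint(0, 1)
        if recv_basis == send_basis:             # sifting: keep matching bases
            kept += 1
            errors += measured != bit
    return errors / kept

print(f"clean channel error rate:  {bb84_sample():.1%}")
print(f"tapped channel error rate: {bb84_sample(eavesdrop=True):.1%}")
```

With no tap, the kept bits match perfectly; with a tap, roughly a quarter of them disagree – the statistical fingerprint that tells both endpoints someone was listening before any key material is trusted.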

The Verizon trial added a second layer of security using a quantum random number generator. This technique generates random numbers and constantly updates the decryption keys in a way that can’t be predicted.

Verizon and others have shown that these encryption techniques can be performed over existing fiber optic lines without modifying the fiber technology. There was a worry in early trials of the technology that new types of fiber transmission gear would be needed for the process.

For now, the technology required for quantum encryption is expensive, but as the price of quantum computer chips drops, this encryption technique ought to become affordable and be available to anybody that wants to encrypt a transmission.

What Does an Administration Change Mean for the FCC?

Just as the last change in administration changed the course of the FCC, so will the swing back to a Democratic administration. If you've been reading me for a few years, you know I am a big believer in the regulatory pendulum. When a regulatory agency like the FCC swings too far in any direction, it's inevitable that it will eventually swing back the other way.

If I had to characterize the current FCC, the biggest theme of the last four years has been rulings that were exceedingly in favor of the big carriers. In ruling after ruling they helped to fulfill the wish list of the big telcos and cable companies – with nothing bigger than the nearly complete deregulation of broadband. The term deregulation isn't even the right word, because the current FCC took itself out of the game as a regulator. Chairman Ajit Pai has characterized the treatment of broadband as light-touch regulation, but it went way beyond that – the FCC eliminated its own ability to regulate broadband.

A new FCC is almost surely going to try to re-regulate broadband, and they are likely to do so by pushing for the reintroduction of net neutrality. This is not going to be an easy task. Obviously a new FCC can undo things done by a former FCC, but by completely writing the agency out of the regulatory game, the current FCC has forced its successor to start over from scratch. They are going to have to go through the full cycle of steps required to reintroduce any semblance of broadband regulation. That means adopting a policy, seeking several rounds of public comments, and then finally voting to reintroduce net neutrality and other broadband regulation. Then will come the inevitable lawsuits that will tack more time onto the process. I'm betting we're three years into a new FCC before we see broadband regulation back on the books.

As part of the re-regulation process, a new FCC will likely restore the FCC complaint process. Most of America doesn't realize that the current FCC does nothing with customer complaints about ISPs – complaints are simply forwarded to the Federal Trade Commission.

Hopefully, a new FCC will continue with the process of setting aside unused spectrum for rural broadband. This is one of the few areas where the current FCC stood up to big carriers – but those carriers weren’t really bothered since they don’t use most licensed spectrum in rural markets.

I’m hoping the new FCC takes a hard look at the disaster of broadband reporting and mapping. The current FCC has delayed implementation of new mapping for several years – I’ve always believed that they don’t want to see an honest count of rural homes without broadband because the numbers will be double of what the FCC claims today.

I think a new FCC will update the national definition of broadband. It's a travesty to continue to define broadband as 25/3 Mbps when eighty percent of America buys broadband from cable or fiber companies. The main outcome of an updated definition will hopefully be an honest count of homes that have inferior broadband. An added bonus is that slow broadband technologies should stop being eligible for federal grant funding. A new definition of broadband needs to recognize the new crisis of slow upload speeds, which have made life so miserable for workers and students sent home during the pandemic.

I hope the new FCC gets off the 5G bandwagon. The current FCC blindly followed the administration in pushing the story that America is losing the mythical 5G war. Outside of 5G the current FCC has been in favor of letting the markets solve technology issues. 5G will be whatever it's going to be, and our national regulators should not be pushing 5G or hindering it – they just need to stay out of the way of market progress.

Will Cable Companies Tackle Faster Upload Speeds?

The number one complaint I’ve been hearing about broadband during the pandemic is that people found that they were unable to conduct multiple online sessions for working or doing schoolwork from home. I’ve talked to a lot of people who have been taking turns using broadband, which is an unsatisfactory solution for everybody involved. This phenomenon became instantly apparent for people with slow rural broadband connections, but a lot of people in towns using cable company broadband hit the same roadblock.

Cable companies have always been stingy with upload speeds because it hasn’t really mattered to the public. Only a handful of customers who wanted to upload large data files ever cared much about upload speeds. But connecting to a school or work server or talking on Zoom requires dedicated upload connections – and when those functions suddenly became a part of daily life, people suddenly cared a lot about upload broadband speeds.
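As a rough illustration of how quickly a household can exhaust a small upload allotment, here is a back-of-the-envelope budget. The per-stream rates and the 10 Mbps cap below are assumptions chosen for illustration, not measurements of any particular ISP or application:

```python
# Illustrative household upload budget during simultaneous video sessions.
# All rates below are assumed values for illustration only.

ZOOM_HD_UPLOAD_MBPS = 3.0      # assumed upload per HD video call
SCHOOL_SESSION_MBPS = 2.0      # assumed upload per remote-class session
CABLE_UPLOAD_CAP_MBPS = 10.0   # a common cable-tier upload limit (assumed)

# Two adults on work calls plus two students in remote classes:
demand = 3 * ZOOM_HD_UPLOAD_MBPS + 2 * SCHOOL_SESSION_MBPS

print(f"Demand: {demand:.0f} Mbps vs upload cap: {CABLE_UPLOAD_CAP_MBPS:.0f} Mbps")
print("Over capacity" if demand > CABLE_UPLOAD_CAP_MBPS else "Fits")
```

Under these assumed numbers, the household needs 13 Mbps of upload against a 10 Mbps cap – somebody has to take turns, which matches what people experienced during the pandemic.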

By now, most cable companies have upgraded their networks to DOCSIS 3.1. This allowed upgrades of download speeds from a maximum of perhaps 200 Mbps up to a gigabit. Unfortunately, as part of this upgrade, many cable providers did nothing to improve upload speed.

People may not realize that the signals inside of a cable company network use radio frequencies to transmit data, meaning a cable network is essentially a captive radio network kept inside of the copper coaxial wires. As such, the signals inside a coaxial system share the same characteristics as any wireless network. Higher frequencies carry more data bits than lower frequencies. All of the signals are subject to interference if external frequencies leak into the cable transmission path.

The DOCSIS specification for cable broadband sets aside the lowest frequencies in the system for upload bandwidth – the spectrum between 5 MHz and 42 MHz. This happens to be the noisiest part of the cable TV frequency range – it’s where outside sources like appliances or motors can cause interference with the signals inside the cable network.

The DOCSIS 3.0 specification, released in 2006, allowed other parts of the spectrum to be used for upload, but very few cable companies took advantage of the expanded upload capability, so it has lain dormant. The DOCSIS 3.0 standard allowed a mid-split option to increase the upload frequency range to 85 MHz, or a more aggressive high-split option to assign all of the bandwidth up to 204 MHz to upload. DOCSIS 4.0 will go even further, allowing as much as 684 MHz of spectrum to be assigned to upload.
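To get a feel for what each split means in raw capacity, here is a quick sketch that converts each upstream frequency range into an approximate shared throughput. The spectral-efficiency figure is an assumption for illustration only – real DOCSIS throughput depends on modulation profiles, channel plans, and plant noise:

```python
# Rough comparison of DOCSIS upstream splits.
# Assumes an illustrative average spectral efficiency of 7 bits/s/Hz;
# real-world numbers vary with modulation and plant conditions.

SPECTRAL_EFFICIENCY = 7  # bits per second per Hz (assumed for illustration)

splits = {
    "legacy sub-split (5-42 MHz)": (5, 42),
    "mid-split (5-85 MHz)": (5, 85),
    "high-split (5-204 MHz)": (5, 204),
    "DOCSIS 4.0 (5-684 MHz)": (5, 684),
}

for name, (low_mhz, high_mhz) in splits.items():
    width_mhz = high_mhz - low_mhz
    capacity_mbps = width_mhz * SPECTRAL_EFFICIENCY  # MHz x bits/s/Hz = Mbps
    print(f"{name}: {width_mhz} MHz -> ~{capacity_mbps} Mbps shared upstream")
```

Even with this crude math, the gap is striking: the legacy split yields only a few hundred megabits of shared upstream for an entire neighborhood, while the high-split option offers several times that.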

It’s been widely reported during the pandemic that millions of households have upgraded to faster broadband packages in hopes of solving the upload problem. But upgrading download speed from 100 Mbps to 200 Mbps won’t help a household if the upload path is the same with both products.

Cable companies are faced with a big cost dilemma. It’s costly to upgrade a cable network from today’s stingy upload speeds to the mid-split or high-split option. Rearranging how the bandwidth is used inside of a cable network means replacing many of the key components of the network, including neighborhood nodes, amplifiers, and power taps. It could also mean replacing all cable modems.

It’s hard to know what cable companies will do. They might be hoping that the issue blows over when people and students move back out of the home. And to some extent that could happen. We saw the average monthly download bandwidth used by homes drop this year from 404 gigabytes in March to 380 gigabytes in June after home-schooling ended for the spring school year. There is likely going to be some relief for upload bandwidth demand when the pandemic is finally over.

But there is a lot of evidence that the increased demand for upload bandwidth will never drop to pre-pandemic levels. It seems likely that millions of jobs are going to permanently migrate to the home. It seems likely that schools will more freely allow students with illnesses to keep up with schoolwork remotely. High school students are likely to see more options for advanced placement classes online. It looks like video conferencing is here to stay.

Will cable companies make a big investment just to improve upload speeds? Most of them don’t plan to upgrade to DOCSIS 4.0 until near the end of this decade and might decide to spend no other capital until then – since that future upgrade will mean replacing all of the components of the network again. The cable companies have the ability to greatly increase upload speeds today – but my bet is that almost none of them will do so.

The Speed of Thought

Verizon has created a 1-hour documentary on the potential for 5G called The Speed of Thought. It’s available on Amazon Prime, on Comcast’s Peacock, and on Verizon FiOS on demand. Here is the trailer for the film.

It’s an interesting video that looks a decade into the future through the eyes of 5G developers. The main thrust of the video is that the future of 5G is going to offer a lot more than just faster data speeds for cellphones. The documentary looks at some specific examples of how 5G might interface with other technologies in the future to provide solutions that are not possible today.

The documentary looks at the potential for marrying 5G and augmented reality to let firefighters better navigate inside buildings during a fire to find and save people. This would require the fire department to have building plans on file that firefighters could then use to navigate in the near-zero visibility of a fire. I have to admit that this is a pretty cool application that would save lives if it can ever be made to work. The application requires fast wireless broadband in order to communicate a 3D image of the inside of a building in real time.

The documentary also explores using 5G to assist in emergency medicine in remote places. In Western North Carolina where I live, this is a real issue in that residents of many western counties live hours away from a hospital that could save lives for heart attacks, strokes, and accidents. The example used in the film is a robot that assists with a heart procedure in San Francisco but is controlled from Boston. I have a hard time thinking that we’ll ever trust broadband-enabled surgery in major hospitals, since an unexpected broadband outage – something that happens far too often – could mean a loss of life. But the idea of being able to administer aid to remote heart attack and stroke victims has major potential as a lifesaver.

There is also a segment where students are taught about the civil rights movement in an interactive environment using augmented reality. I have to think this technology will be introduced first in schools, which in most of the country have largely been connected to gigabit fiber. However, the idea of tying augmented reality to places like a battlefield or an art museum sounds appealing. It seems like immersive learning – actually seeing and participating in events – would be a much more effective way to learn than reading books.

Finally, there is a segment on a test program in Sacramento that uses 5G to provide instant feedback on traffic conditions to drivers, pedestrians, and bicycle riders. This is obviously the first generation of using 5G to create smarter or self-driving vehicles while also benefitting pedestrians and others who enter traffic lanes. Verizon has been talking about using 5G for smart cars since the earliest days of talking about 5G. There is still a long way to go, and even when this gets here it’s likely to appear in places like Sacramento and not in rural America.

The documentary is well done and ought to be interesting to anybody in the industry. But it is still an advertising piece intended to convince people that 5G is a great thing. What I don’t see in all of these applications is a giant new revenue stream for Verizon. Using augmented reality for education is likely to evolve and use landline broadband long before it’s made mobile. Applications like the one that makes life easier for firefighters are intriguing, but it’s hard to envision that as a mover and shaker of Verizon’s bottom line. I think the one that Verizon is hoping for is smart vehicles and traffic control. The company hopes that every car of the future comes with a 5G subscription. Verizon also hopes that people in the future will wear augmented reality glasses in daily life. I really like the imagery and stories told in the documentary, but I remain leery about the predictions.