Traditional Cable Losses Slow in 3Q 2020

The largest traditional cable providers collectively lost over 1.1 million customers in the third quarter of 2020 – an overall loss of 1.5% of customers. This is smaller than the second-quarter loss of 1.5 million net customers. To put the quarter’s loss into perspective, the big cable providers lost 12,641 cable customers per day throughout the quarter.

The numbers below come from Leichtman Research Group which compiles these numbers from reports made to investors, except for Cox which is estimated. The numbers reported are for the largest cable providers, and Leichtman estimates that these companies represent 95% of all cable customers in the country.

Following is a comparison of subscriber numbers at the end of the third quarter of 2020 to the end of the second quarter of 2020:

                      3Q 2020      2Q 2020      Change   % Change
Comcast            20,094,000   20,367,000    (273,000)    -1.3%
Charter            16,235,000   16,168,000      67,000      0.4%
DirecTV            13,600,000   14,290,000    (690,000)    -4.8%
Dish TV             8,965,000    9,052,000     (87,000)    -1.0%
Verizon             4,000,000    4,062,000     (62,000)    -1.5%
Cox                 3,710,000    3,770,000     (60,000)    -1.6%
AT&T TV             3,500,000    3,400,000     100,000      2.9%
Altice              3,035,100    3,121,500     (86,400)    -2.8%
Mediacom              663,000      676,000     (13,000)    -1.9%
Frontier              518,000      560,000     (42,000)    -7.5%
Atlantic Broadband    317,787      311,845       5,942      1.9%
Cable One             277,000      290,000     (13,000)    -4.5%
Total              74,914,887   76,068,345  (1,153,458)    -1.5%
Total Cable        44,331,887   44,704,345    (372,458)    -0.8%
Total Satellite    22,565,000   23,342,000    (777,000)    -3.3%
Total Telco         8,018,000    8,022,000      (4,000)     0.0%

Some observations about the numbers:

  • The big loser is AT&T, which lost a net of 590,000 traditional video customers between DirecTV and AT&T TV (relabeled from AT&T U-verse). It’s worth noting that AT&T added 100,000 telco cable customers for the quarter.
  • The big percentage loser continues to be Frontier which lost 7.5% of its cable customers in the quarter.
  • Charter has gained cable customers for two quarters in a row. The company credits the gains to offering a lower-price package and also to a marketing campaign that is giving two months free of broadband to new customers during the pandemic. Charter has been beating the industry as a whole every quarter since Q3 2018.

The loss of traditional cable customers continues to mount at dizzying levels for the industry. This is the seventh consecutive quarter in which the industry lost over one million cable subscribers. It’s especially worth noting that these losses happened during a quarter when the biggest ISPs gained over 1.5 million broadband customers.

One interesting thing to note is that people cutting the cord seem to be switching to online video sources that carry many of the same channels as traditional cable TV. In the third quarter, the combination of Hulu + Live TV, Sling TV, AT&T TV Now, and fuboTV collectively added over one million customers. This count doesn’t include YouTube TV or Philo, which don’t report customers quarterly. The online industry pins the increases on the return of live sports. It’s worth noting that Hulu + Live TV would now rank as the fifth largest cable provider, ahead of Verizon.

Broadband Usage Stays Strong in 3Q 2020

OpenVault recently released its Broadband Insights Report for the third quarter of 2020. OpenVault supplies broadband usage-tracking software to ISPs and is able to provide some interesting insights into the state of broadband.

Probably the biggest news in the report is that increased household usage due to the pandemic has not abated. The average US home in September used 384 gigabytes of data, up slightly from 380 gigabytes in June, but up 40% from September 2019. Perhaps the most interesting thing about that number is that schools returned to live classes in many parts of the country in September, and yet average Internet usage did not decline.

The 384 gigabytes represent total household bandwidth usage, upload and download combined. OpenVault reported average upload and download usage separately for the first time: the average home downloaded 359 gigabytes and uploaded 25 gigabytes of data. That upload number is shocking – just a year ago the average upload usage was a lot smaller.

Power users of the Internet remain plentiful, with 8.8% of all US households now using more than 1 terabyte of data per month, including 1% of households now using over 2 terabytes per month. This is more than double the 4.2% of households that used a terabyte of monthly data in the third quarter of 2019. This has to be good news for ISPs with data caps – most are not billing for data cap overages during the pandemic, but they will realize significant new revenue when they go back to billing for high broadband usage.

Subscriptions to faster broadband continue to climb as households upgrade to faster broadband tiers. Since the second quarter, nationwide subscribers to gigabit broadband increased from 4.9% to 5.6% (an increase of over 875,000 new gigabit subscribers). Subscribers to speeds between 500 Mbps and a gigabit grew from 5% to 5.24%, and subscribers to speeds between 200 Mbps and 500 Mbps grew from 13.5% to 14.1%.
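A quick back-of-the-envelope check shows these figures are internally consistent. The household base below is inferred from the reported numbers, not stated by OpenVault – a sketch, not a figure from the report:

```python
# Back-of-the-envelope check on OpenVault's gigabit-tier numbers.
# The subscriber base is inferred from the reported figures, not stated
# anywhere in the report.
new_gigabit_subs = 875_000          # reported new gigabit subscribers
share_2q, share_3q = 0.049, 0.056   # gigabit share of subscribers, 2Q vs 3Q

implied_base = new_gigabit_subs / (share_3q - share_2q)
print(f"implied subscriber base: {implied_base:,.0f}")   # ~125 million
```

A 0.7 percentage-point gain producing 875,000 subscribers implies roughly 125 million tracked subscriptions, which is in the right ballpark for US broadband households.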

OpenVault reports two numbers that rural America will find disheartening. They report that the average nationwide download speed in September was 170 Mbps and the average upload speed was 13 Mbps. That average highlights better than any other statistic the sad state of rural broadband, where the FCC defines broadband as 25/3 Mbps but where most rural homes fall far short of even that modest goal. It’s worth noting that the average speeds are now being influenced heavily by the households subscribing to gigabit speeds.

Remembering that OpenVault works for the largest ISPs, the report closes with a conclusion that the increased broadband usage means increased revenue opportunities for ISPs as customers migrate to faster broadband speeds and spend between $20 and $30 more per month for broadband.

The OpenVault statistics should be a reminder that broadband usage has been growing at a torrid rate for years, with residential broadband usage increasing by 21% annually for the last decade. The pandemic has accelerated that growth a bit, and to the extent that millions of workers remain working at home after the pandemic, this one-time burst in usage likely represents a permanent reset of the growth curve. Broadband usage has remained at 40% to 50% above 2019 levels this year, and there is no reason to think it will ever recede to past levels. People are going to work from home more in the future. We have all incorporated video meetings into our routines. Millions of households upgraded to 4K TVs during the pandemic and are not going back to watching lower-resolution video. Higher broadband usage volumes are here to stay.
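A bit of arithmetic shows why that 21% annual growth rate matters so much – compounded over a decade, it multiplies household usage nearly sevenfold:

```python
# The compounding effect of 21% annual growth in household broadband usage.
annual_growth = 0.21
years = 10
multiplier = (1 + annual_growth) ** years
print(f"{years}-year multiplier at 21%/year: {multiplier:.1f}x")   # ~6.7x
```

At that pace, a network engineered for today’s traffic needs roughly seven times the capacity ten years out, before adding any pandemic-driven bump.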

Updating FCC Device Rules

The general public probably doesn’t realize that all telecom devices must be approved by the FCC before the devices can be announced, marketed, or sold to the public. These requirements were put in place many years ago at the FCC to make certain that devices were safe, that radio devices don’t exceed legal power limits, and that radio devices use only the radio frequencies that are authorized.

Once a manufacturer has devices of a finished quality ready to sell to customers, the devices are sent to the FCC testing labs for approval. It’s rare for devices to fail the FCC approval process, but it does happen – and one has to suppose the manufacturer of a failed device was hoping for a miracle by sending it in for testing.

This testing is still a vitally needed step, particularly in the wireless world. Devices that go inside central offices, huts, and cellular sites must also pass inspection by Underwriters Laboratories to make sure the devices are safe. But wireless devices have two big concerns that must be addressed. The first is that devices stay within the spectrum bands they are supposed to use. Now that devices have software-defined radios it would not be hard for cheap devices to stray out of the authorized band, which would cause havoc in the real world as devices interfered with licensed uses of spectrum – including uses by the military, satellites, and other key users of spectrum. Without testing it’s not hard to imagine cheap foreign cellphones blasting out signals outside the authorized band.

The other big issue with wireless devices is the power level. We care about power levels for a bunch of reasons. First is user safety – one of the reasons that cellphones have been declared safe over time is that they transmit at relatively low power levels. Power also defines how far a signal can propagate. If wireless devices were allowed to transmit at high power levels, the signal might carry to the horizon and interfere with other uses of the frequency. Limiting device power is one of the key tools that allows the FCC to define license areas for selling or awarding spectrum. The ability to limit power is probably the main reason that the FCC has been allowing rural WISPs to use some of the licensed spectrum that sits idle in rural areas. If WISPs used too much power they could interfere with urban use of the spectrum.
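The power-versus-range relationship can be sketched with a simple free-space link budget. All of the numbers below (frequency, transmit power, receiver sensitivity) are illustrative assumptions, not values from any FCC rule:

```python
import math

# Sketch of a simple free-space link budget: why capping transmit power
# caps range. All numbers here are illustrative assumptions, not values
# from FCC rules.

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def max_range_km(tx_power_dbm: float, sensitivity_dbm: float, freq_mhz: float) -> float:
    """Greatest free-space distance at which the receiver still hears the signal."""
    allowed_loss = tx_power_dbm - sensitivity_dbm
    # Invert the FSPL formula to solve for distance.
    return 10 ** ((allowed_loss - 32.44 - 20 * math.log10(freq_mhz)) / 20)

# A 6 dB cut in transmit power halves the maximum free-space range.
high = max_range_km(30, -90, 3500)   # 30 dBm (1 watt) transmitter
low = max_range_km(24, -90, 3500)    # 24 dBm transmitter
print(f"{high:.1f} km vs {low:.1f} km")
```

The takeaway matches the regulatory logic in the text: every 6 dB shaved off the power limit halves how far a signal can reach, which is what lets a rural WISP reuse spectrum without touching an urban license area.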

The FCC rules are rigid about the process that a device manufacturer must follow. One key aspect of the FCC rules is that manufacturers are prohibited from doing pre-sales or conditional sales of wireless devices – except at the wholesale level. Apple can presell a new iPhone to Verizon, but neither Apple nor Verizon can take preorders from the public. That means that the marketing effort for a new device can’t start until the device passes the FCC tests, and the devices can’t be sent for FCC testing until they are retail-ready.

Manufacturers are also prohibited from sending display versions of their devices to retail outlets. People want to see and touch a new cellphone before they order it, but the devices can’t be displayed in a Verizon store until they are approved for retail sales.

Manufacturers have been asking the FCC to relax these rules so that they can market in the way that we market most things today. The testing delays may have made sense decades ago, but today they add significant time to bringing new cellphones and other devices to market.

Cellphones are huge business today and it’s a major marketing event when Samsung or Apple announces the next generation of cellphones. I have a hard time thinking why Verizon and other wireless carriers couldn’t take pre-orders for the latest phone months before the phones are ready. We now do that with a lot of big-ticket items like the Tesla Cybertruck – people are willing to get on waiting lists long before they can ever buy a new truck. We also now live in the world of Kickstarter, where cool new ideas of all kinds are pre-marketed to see if there is enough demand to go to manufacturing.

The big manufacturers like Samsung and Apple are never going to send a phone for FCC testing that doesn’t pass the tests – and they aren’t going to deliver phones to customers until they pass the FCC tests. It’s hard to think of any reason why the cellular carriers can’t take preorders for the latest phone. It’s hard to see what harm would come through taking orders early when customers are fully aware that they have to wait until the new phone is released.

It no longer makes sense to treat FCC-approved devices differently than other electronics. Manufacturers have asked the FCC to allow for waivers from the rules. It’s probably not a good idea to let cheap foreign cellphones be marketed until they have passed FCC muster. But it’s hard to think of any reason why the FCC should delay commerce by not allowing presales of iPhones. It’s time for the FCC rules to catch up to the realities of the marketplace.

State versus Federal Regulation

One of the oldest tugs-of-war in the regulatory world is a battle between state and federal authority to impose regulations. This has been a constant battle in telecom regulation, but also extends across the regulation of many other industries.

The latest telecom clash between state and federal regulators comes from the attempt of states to implement net neutrality rules. The first state to pass a net neutrality law was California, but this was quickly followed by net neutrality rules in Vermont, Washington, Rhode Island, New York, Montana, and Hawaii.

The California net neutrality rules closely match those that were killed by the FCC in 2017. The California laws were quickly challenged by the US Department of Justice along with some large ISPs. The federal courts upheld the FCC’s authority to kill federal net neutrality but ruled that the FCC didn’t have the jurisdiction to override state net neutrality rules.

This year several industry lobbying groups have banded together and have sued in the U.S. District Court for the Eastern District of California to stop the implementation of the California law. This includes the American Cable Association, CTIA — The Wireless Association, NCTA — The Internet & Television Association, and USTelecom — The Broadband Association. Other states have put implementation of state net neutrality rules on hold waiting for the outcome of this latest case.

The line between state and federal regulatory authority has always been a fuzzy one. The plaintiffs in this case argue that the California rules are unlawful because the state is trying to regulate interstate commerce – meaning communication between California residents and Internet servers sitting in data centers in other states. They argue that only the FCC can regulate this kind of traffic.

But the line between state and interstate traffic got blurred a long time ago as telcos and ISPs implemented centralized technologies. For example, regulators always assumed that states have the authority to regulate telephone features like Caller ID or voice mail since these are sold to accompany landlines, which have always been considered to be under state jurisdiction. However, a close look at the technology used by telcos shows that some functions supporting telephone features are handled in other states. A company like AT&T or Verizon might house the master servers that identify calling numbers in a single data center that serves a huge swath of the country. If a computer dips into a data center in Chicago to get the name of a calling party, does that make caller ID an interstate function?

Unfortunately, technology has badly blurred the lines between telecom products that are interstate in nature versus products that are delivered locally or within a state. My guess is that there are very few telecom products left that are purely local.

This kind of jurisdictional argument also raises the specter of carriers manipulating telecom service delivery to avoid regulation. To use my caller ID example, there is nothing to stop a telco from changing the way that caller ID works if doing so can avoid regulation – put the server in another state, and voila – caller ID is an interstate service.

This raises a larger issue to consider. The blurring of state versus interstate products raises the issue of whether states even have a role in regulating telecom services. For example, there are likely almost no web products or connections that completely begin and end within a single state. If the state / interstate issue is defined in the way the plaintiffs are asking in this case, then states likely have no role left in regulating broadband products.

That may be where the courts end up on the question – but that somehow doesn’t feel right. One of the regulatory roles of states is to protect their citizens against abuses by monopolies and bad actors. In cases where the federal government fails to act, states are the last line in protecting consumer rights. I have no idea how the courts will rule in this case. But I have a hard time thinking that states can’t act to make sure that there is no discrimination in the routing of Internet traffic affecting their citizens.

Quantum Encryption

Verizon recently conducted a trial of quantum key distribution technology, which is the first generation of quantum encryption. Quantum cryptography is being developed as the next-generation encryption technique that should protect against hacking from quantum computers. Carriers like Verizon care about encryption because almost every transmission inside of our communications paths is encrypted.

The majority of encryption today uses asymmetric encryption, meaning the technique relies on a pair of keys. To use an example, if you want to send encrypted instructions to your bank (such as to pay your broadband bill), your computer uses the publicly available key issued by the bank to encode the message. The bank then uses a different private key, which only it holds, to decipher the message.

Key-based encryption is safe because it takes immense amounts of computing power to guess the details of the private key. Encryption methods today mostly fight off hacking by using long encryption keys – the latest standard is a key consisting of at least 2048 bits.
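The public/private key split can be made concrete with a toy sketch using textbook-tiny primes. The numbers here are purely illustrative – real keys use primes hundreds of digits long, which is what makes guessing the private key computationally infeasible:

```python
# Toy illustration of the public-key scheme described above, using
# textbook-tiny primes. Real keys use primes hundreds of digits long;
# every number here is purely illustrative.
p, q = 61, 53
n = p * q                         # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                            # public exponent: the bank's published key
d = pow(e, -1, phi)               # private exponent: known only to the bank

message = 65                      # a number standing in for "pay my bill"
ciphertext = pow(message, e, n)   # anyone can encrypt with the public key
recovered = pow(ciphertext, d, n) # only the private key decrypts it
print(message, ciphertext, recovered)   # recovered equals the original 65
```

With primes this small the private key can be found by brute force in microseconds; at 2048 bits, the same attack takes longer than the age of the universe on classical hardware, which is the entire security argument.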

Unfortunately, the current encryption methods won’t stay safe for much longer. It seems likely that quantum computers will soon have the capability of cracking today’s encryption keys. This is possible since quantum computers can perform thousands of simultaneous calculations – Shor’s algorithm, for example, is designed to efficiently factor the large numbers behind today’s keys – and could cut the time needed to crack an encryption key from months or years down to hours. Once a quantum computer can do that, no current encryption scheme is safe. The first targets for hackers with quantum computers will probably be big corporations and government agencies, but it probably won’t take long to turn the technology toward hacking bank accounts.

Today’s quantum computers are not yet capable of cracking today’s encryption keys, but computing experts say that it’s just a matter of time. This is what is prompting Verizon and other large ISPs to look for a form of encryption that can withstand hacks from quantum computers.

Quantum key distribution (QKD) uses a method of encryption that might be unhackable. Photons are sent one at a time through a fiber optic transmission path to accompany an encrypted message. If anybody attempts to intercept or listen to the encrypted stream, the polarization of the photons is disturbed and the recipient of the encrypted message instantly knows the transmission is no longer safe. The theory is that this will stop hackers before they know enough to crack into and analyze a data stream.
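The intercept-detection idea can be illustrated with a highly simplified simulation in the spirit of BB84, the best-known QKD protocol. Everything below is a classical toy model – real QKD measures actual photon polarizations – but it shows why an eavesdropper who must measure each photon unavoidably raises the error rate:

```python
import random

# Highly simplified, classical toy model in the spirit of the BB84 QKD
# protocol. Real QKD uses actual photon polarizations; this sketch only
# shows why interception unavoidably raises the observed error rate.
def run_qkd(n_bits: int, eavesdropper: bool, rng: random.Random) -> float:
    """Return the observed error rate on the sifted key."""
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    photons = list(zip(alice_bits, alice_bases))

    if eavesdropper:
        # Eve measures each photon in a random basis; measuring in the
        # wrong basis randomizes the bit she re-sends to Bob.
        resent = []
        for bit, basis in photons:
            eve_basis = rng.choice("+x")
            eve_bit = bit if eve_basis == basis else rng.randint(0, 1)
            resent.append((eve_bit, eve_basis))
        photons = resent

    bob_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bits = [bit if basis == bob_basis else rng.randint(0, 1)
                for (bit, basis), bob_basis in zip(photons, bob_bases)]

    # Sifting: keep only positions where Alice and Bob chose the same basis.
    kept = [i for i in range(n_bits) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return errors / len(kept) if kept else 0.0

rng = random.Random(1)
print(f"error rate without interception: {run_qkd(4000, False, rng):.2f}")
print(f"error rate with interception:    {run_qkd(4000, True, rng):.2f}")   # ~0.25
```

Without an eavesdropper the sifted key matches perfectly; with one, roughly a quarter of the sifted bits disagree, which is exactly the signal that tells the recipient the channel is compromised.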

The Verizon trial added a second layer of security using a quantum random number generator. This technique generates random numbers and constantly updates the decryption keys in a way that can’t be predicted.

Verizon and others have shown that these encryption techniques can be performed over existing fiber optics lines without modifying the fiber technology. There was a worry in early trials of the technology that new types of fiber transmission gear would be needed for the process.

For now, the technology required for quantum encryption is expensive, but as the price of quantum computer chips drops, this encryption technique ought to become affordable and be available to anybody that wants to encrypt a transmission.

What Does an Administration Change Mean for the FCC?

Just as the last change in administration changed the course of the FCC, so will the swing back to a Democratic administration. If you’ve been reading me for a few years you know I am a big believer in the regulatory pendulum. When a regulatory agency like the FCC swings too far in any direction, it’s inevitable that it will eventually swing back the other way.

If I had to characterize the current FCC, the biggest theme of the last four years has been a stance exceedingly in favor of the big carriers. In ruling after ruling the agency helped fulfill the wish list of the big telcos and cable companies – with nothing bigger than the nearly complete deregulation of broadband. Deregulation isn’t even the right word, because the current FCC took itself out of the game as a regulator. Chairman Ajit Pai has characterized the treatment of broadband as light-touch regulation, but it went way beyond that – the FCC eliminated its own ability to regulate broadband.

A new FCC is almost surely going to try to re-regulate broadband, and they are likely to do so by pushing for the introduction of net neutrality. This is not going to be an easy task. Obviously a new FCC can undo things done by a former FCC, but because the current FCC completely wrote the agency out of the regulatory game, a new FCC will have to start over from scratch. It will have to go through the full cycle of steps required to reintroduce any semblance of broadband regulation. That means adopting a policy, seeking several rounds of public comments, and then finally voting to reintroduce net neutrality and other broadband regulation. Then will come the inevitable lawsuits that will tack more time onto the process. I’m betting we’re three years into a new FCC before we see broadband regulation back on the books.

As part of the re-regulation process, a new FCC will likely restore the FCC complaint process. Most of America doesn’t realize that the current FCC does nothing with customer complaints about ISPs – complaints are simply forwarded to the Federal Trade Commission.

Hopefully, a new FCC will continue with the process of setting aside unused spectrum for rural broadband. This is one of the few areas where the current FCC stood up to big carriers – but those carriers weren’t really bothered since they don’t use most licensed spectrum in rural markets.

I’m hoping the new FCC takes a hard look at the disaster of broadband reporting and mapping. The current FCC has delayed implementation of new mapping for several years – I’ve always believed that they don’t want to see an honest count of rural homes without broadband, because the numbers will be double what the FCC claims today.

I think a new FCC will update the national definition of broadband. It’s a travesty to continue to define broadband at 25/3 Mbps when eighty percent of America buys broadband from cable or fiber companies. The main outcome of an updated definition will hopefully be an honest count of homes that have inferior broadband. An added bonus will be that slow broadband technologies should stop being eligible for federal grant funding. A new definition of broadband needs to recognize the new crisis of slow upload speeds, which have made life so miserable for workers and students sent home during the pandemic.

I hope the new FCC gets off the 5G bandwagon. The current FCC blindly followed the administration in pushing the story that America is losing the mythical 5G war. Outside of 5G, the current FCC has been in favor of letting the markets solve technology issues. 5G will be whatever it’s going to be, and our national regulators should not be pushing 5G or hindering it – they just need to stay out of the way of market progress.

Will Cable Companies Tackle Faster Upload Speeds?

The number one complaint I’ve been hearing about broadband during the pandemic is that people found that they were unable to conduct multiple online sessions for working or doing schoolwork from home. I’ve talked to a lot of people who have been taking turns using broadband, which is an unsatisfactory solution for everybody involved. This phenomenon became instantly apparent for people with slow rural broadband connections, but a lot of people in towns using cable company broadband hit the same roadblock.

Cable companies have always been stingy with upload speeds because it hasn’t really mattered to the public. Only a handful of customers who wanted to upload large data files ever cared much about upload speeds. But connecting to a school or work server or talking on Zoom requires dedicated upload connections – and when those functions suddenly became a part of daily life, people suddenly cared a lot about upload broadband speeds.

By now, most cable companies have upgraded their networks to DOCSIS 3.1. This allowed upgrades of download speeds from a maximum of perhaps 200 Mbps up to a gigabit. Unfortunately, as part of this upgrade, many cable providers did nothing to improve upload speed.

People may not realize that the signals inside of a cable company network use radio frequencies to transmit data, meaning a cable network is essentially a captive radio network kept inside of the copper coaxial wires. As such, the signals inside a coaxial system share the same characteristics as any wireless network. Higher frequencies carry more data bits than lower frequencies. All of the signals are subject to interference if external frequencies leak into the cable transmission path.

The DOCSIS specification for cable broadband sets aside the lowest frequencies in the system for upload bandwidth – the spectrum between 5 MHz and 42 MHz. This happens to be the noisiest part of the cable spectrum – it’s where outside sources like appliances or motors can cause interference with the signal inside the cable network.

The DOCSIS 3.0 specification, released in 2006, allowed other parts of the spectrum to be used for upload data, but very few cable companies took advantage of the expanded upload capability, so it has lain dormant. The standard allowed a mid-split option to increase the upload frequency range to 85 MHz, or a more aggressive high-split option to assign all of the bandwidth up to 204 MHz to data upload. DOCSIS 4.0 is going to offer an even wider upload range, as high as 684 MHz of spectrum.
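Back-of-the-envelope arithmetic shows why the split boundary matters so much. The spectral efficiency assumed below is an illustrative round number rather than a DOCSIS spec value, and it ignores the noise problems in the low band described above:

```python
# Rough upstream capacity under the DOCSIS upload splits described above.
# The spectral efficiency is an assumed illustrative figure, not a DOCSIS
# spec value, and in practice the noisy low band cannot run this hot.
BITS_PER_HZ = 8.6   # assumed average spectral efficiency (bits/sec/Hz)

splits_mhz = {
    "low-split (5-42 MHz)":   42 - 5,
    "mid-split (5-85 MHz)":   85 - 5,
    "high-split (5-204 MHz)": 204 - 5,
    "DOCSIS 4.0 (5-684 MHz)": 684 - 5,
}

for name, width_mhz in splits_mhz.items():
    # MHz of spectrum x bits/sec/Hz = Mbps of raw shared capacity
    print(f"{name:24s} ~{width_mhz * BITS_PER_HZ:,.0f} Mbps upstream (theoretical)")
```

Even under these generous assumptions, the legacy low-split leaves a node with only a few hundred megabits of shared upstream, while a high-split multiplies that by more than five – which is exactly the upgrade the cable companies have been reluctant to pay for.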

It’s been widely reported during the pandemic that millions of households have upgraded to faster broadband packages in hopes of solving the upload problem. But upgrading download speed from 100 Mbps to 200 Mbps won’t help a household if the upload path is the same with both products.

Cable companies are faced with a big cost dilemma. It’s costly to upgrade a cable network from today’s stingy upload speeds to the mid-split or high-split option. Rearranging how the bandwidth is used inside of a cable network means replacing many of the key components of the network, including neighborhood nodes, amplifiers, and power taps. It could also mean replacing all cable modems.

It’s hard to know what cable companies will do. They might be hoping that the issue blows over when people and students move back out of the home. And to some extent that could happen. We saw the average monthly download bandwidth used by homes drop this year from 404 gigabytes in March to 380 gigabytes in June after home-schooling ended for the spring school year. There is likely going to be some relief for upload bandwidth demand when the pandemic is finally over.

But there is a lot of evidence that the increased demand for upload bandwidth will never drop to pre-pandemic levels. It seems likely that millions of jobs are going to permanently migrate to the home. It seems likely that schools will more freely allow students with illnesses to keep up with schoolwork remotely.  High school students are likely to see more options for advanced placement classes online. It looks like video conferencing is here to stay.

Will cable companies make a big investment just to improve upload speeds? Most of them don’t plan to upgrade to DOCSIS 4.0 until near the end of this decade and might decide to spend no other capital until then – since that future upgrade will mean replacing all components of the network again. The cable companies have the ability to greatly increase upload speeds today – but my bet is that almost none of them will do so.

The Speed of Thought

Verizon has created a 1-hour documentary on the potential for 5G called The Speed of Thought. It’s available on Amazon Prime, on Comcast’s Peacock, and on Verizon FiOS on demand.

It’s an interesting video that looks a decade into the future through the eyes of 5G developers. The main thrust of the video is that the future of 5G is going to offer a lot more than just faster data speeds for cellphones. The documentary looks at some specific examples of how 5G might interface with other technologies in the future to provide solutions that are not possible today.

The documentary looks at the potential for marrying 5G and augmented reality to let firefighters better navigate inside buildings during a fire to find and save people. This will require having building plans on file with the fire department that could then be used by firefighters to navigate the near-zero visibility during a fire. I have to admit that this is a pretty cool application that would save lives if it can ever be made to work. The application requires fast wireless broadband in order to communicate a 3D image of the inside of a building in real time.

The documentary also explores using 5G to assist in emergency medicine in remote places. In Western North Carolina where I live this is a real issue, in that residents of many western counties live hours away from a hospital that could save lives after heart attacks, strokes, and accidents. The example used in the film is a robot that assists with a heart procedure in San Francisco while controlled from Boston. I have a hard time thinking that we’ll ever trust broadband-enabled surgery in major hospitals, since an unexpected broadband outage – something that happens far too often – means a loss of life. But the idea of being able to administer aid to remote heart attack and stroke victims has major potential as a lifesaver.

There is also a segment where students are taught about the civil rights movement in an interactive environment using augmented reality. I have to think this technology will be introduced first in schools, which in most of the country have been connected to gigabit fiber. However, the idea of tying augmented reality to places like a battlefield or an art museum sounds appealing. It seems like immersive learning – actually seeing and participating in events – would be a much more effective way to learn than reading books.

Finally, there is a segment on a test program in Sacramento that uses 5G to provide instant feedback on traffic conditions to drivers, pedestrians, and bicycle riders. This is obviously the first generation of using 5G to create smarter or self-driving vehicles while also benefitting pedestrians and others who enter traffic lanes. Verizon has been talking about using 5G for smart cars since the earliest days of talking about 5G. There is still a long way to go, and even when this gets here it’s likely to appear in places like Sacramento and not in rural America.

The documentary is well done and ought to be interesting to anybody in the industry. But it is still an advertising piece intended to convince people that 5G is a great thing. What I don’t see in all of these applications is a giant new revenue stream for Verizon. Using augmented reality for education is likely to evolve and use landline broadband long before it’s made mobile. Applications like the one that makes life easier for firefighters are intriguing, but it’s hard to envision that as a mover and shaker of Verizon’s bottom line. I think the one that Verizon is hoping for is smart vehicles and traffic control. The company hopes that every car of the future comes with a 5G subscription. Verizon also hopes that people in the future will wear augmented reality glasses in daily life. I really like the imagery and stories told in the documentary, but I remain leery about the predictions.

States Fight Back Against CAF II

Jon Brodkin of ArsTechnica wrote a recent article about the Mississippi Public Service Commission (PSC) notifying the FCC that AT&T had failed to meet its CAF II requirements in the state. AT&T had taken over $49 million per year for six years ending this December and was supposed to use that money to upgrade broadband to almost 144,000 residents in the state to at least 10/1 Mbps broadband.

The PSC notification informs the FCC that it doesn't believe the upgrades have been done or that many of those homes can now get faster broadband. AT&T has certified to the FCC that the CAF II work has been completed on schedule. However, AT&T has stonewalled the PSC on data requests to find out how many homes have successfully been able to access faster broadband.

The FCC is supposed to begin testing CAF II homes in 2021 and is supposed to fine the big telcos like AT&T if homes in the CAF II areas aren't getting the faster speeds. However, that testing program is badly flawed in that the telcos will have some say over which homes get tested, and they'll certainly funnel the testing into places that will pass the speed tests.

AT&T elected to use the CAF II funding to upgrade speeds by offering fixed cellular service to customers that formerly had slow DSL service. From what I can see, AT&T has not widely advertised the new wireless product and it’s unlikely that they have added many people to the cellular technology in Mississippi or anywhere else. The company is refusing to tell the state how many homes are on the new product.

Unfortunately, what AT&T is doing in Mississippi is not unusual. AT&T took $2.57 billion nationwide for CAF II, and it's likely it hasn't made many upgrades in other states as well. I've seen a lot of evidence that Frontier ($1.7 billion) and CenturyLink ($3.03 billion) have also failed to upgrade rural customers. Those two companies elected to mostly upgrade rural DSL to the faster speeds. We've recently had engineers in counties where Frontier and CenturyLink were supposed to make CAF II upgrades, and we could find no evidence of upgraded DSL anywhere in the rural parts of those counties. We've also helped counties solicit speed tests from citizens, and we've studied a number of counties where no rural DSL service tested even close to the 10/1 Mbps goal of CAF II.
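As a rough illustration of what this kind of speed-test analysis involves (the sample readings below are invented for illustration, not actual county data), a short script can tally how many tests fall short of the 10/1 Mbps CAF II target:

```python
# Sketch: tally crowdsourced speed tests against the CAF II 10/1 Mbps target.
# The sample data below is hypothetical, for illustration only.
CAF_DOWN_MBPS = 10.0
CAF_UP_MBPS = 1.0

speed_tests = [
    {"down": 1.8, "up": 0.4},   # typical old rural DSL reading
    {"down": 6.2, "up": 0.8},
    {"down": 11.5, "up": 1.2},  # the only test that meets the target
    {"down": 0.9, "up": 0.2},
]

# A test fails if either direction is below the CAF II target.
failing = [t for t in speed_tests
           if t["down"] < CAF_DOWN_MBPS or t["up"] < CAF_UP_MBPS]
pct = 100 * len(failing) / len(speed_tests)
print(f"{len(failing)} of {len(speed_tests)} tests ({pct:.0f}%) fail 10/1 Mbps")
```

In the counties we studied, the real-world result was starker than this toy sample: effectively every rural DSL test failed the threshold.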

To make matters even worse, the FCC recently decided to award these big telcos a seventh year of subsidy. That means AT&T will get $428 million in 2021, Frontier will get $283 million, and CenturyLink will get $506 million. The companies have no obligations tied to this additional funding and don't have to use it to improve rural broadband.

While 10/1 Mbps broadband isn’t great, it’s a lot better than the DSL that was in these rural areas in 2015 when the CAF II payments began. The CAF II areas are remote and most customers who could even get DSL saw speeds under 1 or 2 Mbps download.

The impact of AT&T’s failure to make the upgrades became apparent this year when millions of students were sent home during the pandemic. A student might be able to squeak out a school connection on a 10/1 Mbps broadband connection, but students cannot function on the slower DSL that is still in place due to the lack of upgrades. The actions of the FCC and the greed of the big telcos robbed millions of rural homes of better broadband.

The failure of CAF II rests entirely on the FCC. The last FCC under Chairman Wheeler awarded the funding to upgrade to 10/1 Mbps speeds, even though the definition of broadband at the time was 25/3 Mbps. The current FCC under Chairman Pai has turned a blind eye to the non-performance of the big telcos and is absurdly rewarding them with an additional year of CAF II funding. The overall CAF II program handed out over $10 billion in funding for improving rural broadband that might as well have been flushed down the drain. The FCC could instead have awarded this money as broadband grants that could have brought better broadband to the CAF II rural areas.

I hope the Mississippi PSC does more than just write a letter. I'd like to see it ask AT&T to refund the CAF II money to the state to use for broadband grants. And I'd love to see other states do the same and take back the billions of CAF II broadband funding that was wasted.

Broadband Interference

Jon Brodkin of ArsTechnica published an amusing story about how the DSL went out of service in a 400-resident village in Wales each morning at 7:00 am. It turns out that one of the residents turned on an ancient television that interfered with the DSL signal to the extent that the network collapsed. The ISP finally figured this out by looking around the village in the morning with a spectrum analyzer until they found the source of the interference.

It’s easy to think that the story points out another weakness of old DSL technology, but interference can be a problem for a lot of other technologies.

This same problem is common on cable company hybrid fiber-coaxial networks. The easiest way to understand this is to think back to the old days when we all watched analog TV. Anybody who watched programming on channels 2 through 5 remembers times when the channels got fuzzy or even became unwatchable. It turns out that a lot of different devices interfere with the frequencies used for these channels, including microwave ovens, certain motors like power tools and lawnmowers, and other devices like blenders. It was a common household occurrence for one of these channels to go fuzzy when somebody in the house, or even in a neighboring home, used one of these devices.

This same interference carries forward into cable TV networks. Cable companies originally used the same frequencies for TV channels inside the coaxial wires that were used over the air, and the low TV channels sat between 5 MHz and 42 MHz. It turns out that long stretches of coaxial wire on poles act as a great antenna, so cable systems pick up the same kinds of interference that happen in homes. It was pretty routine for channels 2 and 3, in particular, to be fuzzy in an analog cable network.

You'd think that this interference might have gone away when cable companies converted TV signals to digital. The TV transmissions for channels 2 through 5 got crystal clear because cable companies relocated the digital versions of these channels to better frequencies. But when broadband was added to cable systems, the cable companies continued to use the low frequencies – CableLabs elected to use them for the upload portion of broadband. There is still plenty of interference in cable networks today – probably even more than years ago, as coaxial networks have aged and have more points for interference to seep into the wires. Until the pandemic, we didn't care much about upload bandwidth, but it turns out that one of the major reasons that cable companies struggle to deliver reliable upload speeds is that they are using the noisiest spectrum for the upload function.

The DSL in the village suffered from the same issue, since telephone copper wires also act as a big outdoor antenna. In this village, the noise emanating from the old TV sat exactly in the frequencies used for DSL.

Another common kind of interference is seen in fixed wireless networks in situations where multiple ISPs use the same frequencies in a given rural footprint. I know of counties where there are as many as five or six different wireless ISPs, and most use the same frequencies since most WISPs rely on a handful of channels in the traditional WiFi bands at 2.4 GHz and 5 GHz. I've heard of situations where WiFi is so crowded that the performance of all of the WISPs suffers.

WiFi also suffers from local interference in the home. The WiFi standard says that all devices have an equal chance of using the frequencies. This means that a home WiFi router will cycle through all the signals from all devices trying to make a WiFi connection. When a WiFi router connects with an authorized device inside the home it allows for a burst of data, but then the router disconnects that signal and tries the next signal – cycling through all of the possible sources of WiFi.
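A toy calculation (the channel capacity and device counts here are invented for illustration) shows why this equal-share cycling cuts each device's effective throughput as more devices contend for the router:

```python
# Sketch: a WiFi router granting equal airtime to each contending device.
# Under idealized equal sharing, each device gets roughly the channel
# capacity divided by the number of devices. Real WiFi loses additional
# time to contention overhead, so actual throughput is lower still.
def per_device_throughput(channel_mbps: float, num_devices: int) -> float:
    return channel_mbps / num_devices

for n in (1, 4, 20):
    share = per_device_throughput(100.0, n)
    print(f"{n:>2} devices -> about {share:.1f} Mbps each")
```

The point of the sketch is simply that the router doesn't distinguish between your devices and interference-prone neighbors' signals it must also cycle through – every additional contender shrinks everyone's slice.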

This is the same issue that is seen by people using WiFi in a high-rise apartment building or a hotel where many users are trying to connect to WiFi at the same time. Luckily this problem ought to improve. The FCC has authorized the use of 6 GHz spectrum for home broadband which opens up numerous new channels. Interference will only occur between devices trying to share a channel, but that will be far fewer cases of interference than today.
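The benefit of having more channels can be seen with a simple back-of-the-envelope estimate (this is a rough sketch, not a model of real 6 GHz channelization): if each of N neighboring networks picks one of C channels at random, any given pair of networks collides with probability 1/C, so the expected number of colliding pairs shrinks as C grows.

```python
from math import comb

def expected_channel_collisions(networks: int, channels: int) -> float:
    # Each of the C(networks, 2) pairs shares a channel with
    # probability 1/channels under random channel selection.
    return comb(networks, 2) / channels

# 20 visible networks on the 3 non-overlapping 2.4 GHz channels,
# versus the same 20 networks spread over the many 20 MHz channels
# that 6 GHz spectrum adds (~59 channels, an assumed figure).
print(expected_channel_collisions(20, 3))
print(expected_channel_collisions(20, 59))
```

Under these assumptions the expected overlaps drop from dozens of colliding pairs to a handful – which is why opening the 6 GHz band should noticeably reduce neighbor-to-neighbor WiFi interference.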

The technology that has no such interference is fiber. Nothing interferes with the light signal between a fiber hub and a customer. However, once customers connect the broadband signal to their home WiFi network, the same interference issues arise. I looked recently and can see over twenty other home WiFi networks from my office – a setup ripe for interference. Before making too much fun of the folks in the Welsh village, there is a good chance that you are subject to significant interference in your home broadband today.