Beware the Taxman

As we’ve learned in the telecom industry, there is nothing that local tax authorities won’t tax if they can get away with it.

The latest attempt to create a new tax comes from Dallas, Texas. The City Council there authorized a lawsuit against online video providers like Netflix and Hulu to collect franchise fees. The city not only wants to collect a 5% franchise tax going forward but also wants to collect back taxes from these entities as far back as 2007.

The rationale behind the tax is that these providers deliver services through wires that are located at least partially in the public right-of-way. Dallas joins a few other cities like Kenner, Louisiana, and four cities in Indiana that have filed similar lawsuits.

It’s clear that the city is trying to replace lost tax revenues from traditional franchise taxes. These taxes date back to the early days of the cable TV industry as a way to charge cable companies a fee to hang wires in city rights-of-way. The taxes were pretty small in the 1970s because cable revenues were low. But communities benefitted tremendously when the cable industry exploded and began offering hundreds of channels of programming. Franchise taxes grew even faster when cable companies started raising cable rates an average of 9% annually for more than a decade as the companies passed on increases in programming fees.

Cities obviously have gotten used to this steady and growing revenue stream, but suddenly see franchise fees dropping like a rock as households cut the cord or downsize cable packages. The big cable providers have collectively shed 11.8 million customers over the last two years. Most communities have seen at least a 30% decrease in franchise fees and they know the tax revenue is going to continue to sink.

The argument that online video providers owe franchise fees is flimsy because online providers do not own any facilities in the community and do not directly use any right-of-way. But that doesn’t mean that courts won’t decide that online video revenues can be taxed anyway.

I’m writing about this tax because cities winning the right to tax online video would open a huge can of worms: every revenue-generating Internet service rides through wires in public rights-of-way in exactly the same manner as online video. The cities are counting on the quacks-like-a-duck argument that online video looks enough like cable service that it ought to be taxed the same way. But at the end of the day, online video is delivered as a big stream of 1s and 0s like every other bit of data on the Internet. If this concept passes legal muster, then everything ISPs deliver to customers could face similar taxes.

A similar taxation scheme has been hatched in Maryland to tax all Internet advertising. In February, a wide coalition of parties ranging from Amazon and Google to the U.S. Chamber of Commerce sued the state to kill the new tax.

The Maryland tax only applies to online advertising and doesn’t try to tax ads in newspapers, on TV, or on the radio. Opponents say this is an open attempt to tax out-of-state companies. They also argue that the tax violates the ban on taxing the Internet that has been passed and renewed by Congress.

It seems likely that states will continue to pursue more esoteric taxes over the next few years. The pandemic has hit states erratically; some states are flush with revenues this year while others are hurting badly. Expect the states with diminished tax revenues to turn over every rock and try every idea that might bring in new tax revenues.

Broadband and Water Systems

We always hear about smart grid technologies for electricity but rarely hear any mention of using smart technology to improve our water systems. Our water systems need help. The American Society of Civil Engineers has given the country’s water systems a grade of D- for the last decade. It’s been estimated that as much as 50% of water is lost to leaks in some systems, with even good systems losing as much as 20%. Major US cities like San Diego, Las Vegas, and San Antonio have had water scares in the recent past. In the telecom industry, we talk about outdated copper networks built in the 1970s, but there are major water arteries in many communities that go back to the Civil War. My city of Asheville, NC, recently undertook a major project to replace wooden water pipes used in the original water routes from the local reservoir.

The good news is that there are tools that can help pinpoint the biggest problems in water systems. Just as with an electric grid, the first step in developing a smart water system is placing numerous sensors throughout a city to gather information on water flow and pressure. At the simplest level, once engineers understand the normal water flow in a system, any sudden deviation or drop in water pressure is an immediate signal of a new leak.
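As a minimal sketch of that idea, the code below compares each sensor’s latest pressure reading against its historical baseline and flags any big drop. The sensor names, the baseline method, and the 10% threshold are all invented for illustration:

```python
from statistics import mean

def leak_alerts(readings: dict[str, list[float]], threshold: float = 0.10) -> list[str]:
    """Return IDs of sensors whose latest pressure reading sits more than
    `threshold` below their baseline (the mean of all earlier readings)."""
    alerts = []
    for sensor_id, history in readings.items():
        if len(history) < 2:
            continue  # not enough history to establish a baseline
        baseline = mean(history[:-1])
        if history[-1] < baseline * (1 - threshold):
            alerts.append(sensor_id)
    return alerts

# Hypothetical pressure histories in psi; "main-17" shows a sudden drop.
print(leak_alerts({"main-17": [64.0, 63.8, 64.1, 55.2],
                   "main-18": [61.0, 60.9, 61.2]}))  # -> ['main-17']
```

A real system would obviously use smarter statistics that account for time-of-day and seasonal demand, but the principle is the same: learn normal, then alarm on deviation.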

A second step is to upgrade to more accurate water meters. Engineers have estimated that as many as 40% of the meters used to serve high-volume commercial customers under-report the amount of water being used, and consequently underbill for water usage. Old and inaccurate meters can also disguise leaks.

Once more accurate water meters are installed, it becomes possible to start identifying long-standing leak problems. It’s possible to build computer models for small sections of a water system that compare the amount of water entering a neighborhood to the amount reaching meters. This allows a city to rank neighborhoods and identify the parts of the city with the most wasted water. After that comes the effort to identify the location of leaks. Sensors and computer modeling can be applied to smaller sections of a neighborhood to better pinpoint them.
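Here is a minimal sketch of that water-balance comparison, ranking hypothetical zones by the share of inflow that never reaches a meter. All of the figures are invented:

```python
def unaccounted_share(inflow: float, metered: float) -> float:
    """Fraction of the water entering a zone that never reached a customer meter."""
    return 1.0 - metered / inflow

# Hypothetical daily totals in gallons: (water entering the zone, water metered).
zones = {
    "riverside": (1_000_000, 920_000),
    "hilltop": (800_000, 510_000),
    "old_town": (600_000, 410_000),
}

# Rank zones by wasted-water share to prioritize leak hunting.
for name, (inflow, metered) in sorted(zones.items(),
                                      key=lambda z: unaccounted_share(*z[1]),
                                      reverse=True):
    print(f"{name}: {unaccounted_share(inflow, metered):.0%} unaccounted for")
```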

There are also new techniques being developed. Daniel Tartakovsky, a professor of energy resources at Stanford, recently published a paper suggesting new techniques to identify the specific location of leaks. Even after computer modeling has identified a street with a major leak, pinpointing a leak under city streets can be costly. Nobody wants to dig up whole city blocks looking for wet soil.

The proposed technique uses an older concept known as a water hammer. This has been used for years: water is turned off quickly, and sensors gather data about how the resulting shock wave and vibrations propagate through the pipes. Such vibrations tend to stop or be altered when they hit damaged pipe. The traditional water hammer technique involved time-consuming and costly calculations. The new technique proposed by Dr. Tartakovsky uses a computer model that runs on a laptop and interprets the results of a water hammer test quickly enough to pinpoint the location of a leak, often within ten meters.
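The time-of-flight idea behind this kind of localization can be sketched in a few lines. To be clear, this is not Dr. Tartakovsky’s model, which is far more sophisticated; it’s just the basic physics: a leak partially reflects the pressure wave, so half the round-trip delay times the wave speed gives the distance. The 1,200 m/s wave speed is an assumed typical value:

```python
# Assumed typical pressure-wave speed in a water main (m/s); the real value
# varies with pipe material and diameter.
WAVE_SPEED = 1_200.0

def leak_distance(reflection_delay_s: float, wave_speed: float = WAVE_SPEED) -> float:
    """One-way distance from the sensor to the reflecting anomaly, in meters.

    The wave travels to the leak and back, hence the division by two.
    """
    return wave_speed * reflection_delay_s / 2

# A reflection arriving 0.15 s after the valve closes puts the leak ~90 m away.
print(f"{leak_distance(0.15):.0f} m")
```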

All of this new technology relies on a city having ubiquitous broadband to closely and accurately monitor the numerous sensors needed for a smart water system. This is sorely needed. Cities everywhere worry about high water rates, which are partly driven by the cost of processing and delivering clean water that is then wasted through leaks.


The WiFi 6 Revolution

We’re edging closer every day to seeing WiFi 6 in our homes. WiFi 6 will be bolstered by the newly approved 6 GHz frequency, and the combination of WiFi 6 and 6 GHz spectrum is going to revolutionize home broadband.

I don’t think many people understand how many of our home broadband woes are caused by current WiFi technology. WiFi has been an awesome technology that freed our homes from stringing Category 5 cable everywhere, but it has a basic flaw that became apparent when homeowners started to buy hordes of WiFi-enabled devices: WiFi routers are lousy at handling multiple requests for simultaneous service. It’s not unusual for 25% or more of the bandwidth in a home to get eaten by WiFi interference issues.

The WiFi standard was designed to give every device an equal opportunity to use a broadband network. In practical use, that means a WiFi router stops and starts to give every device in range a chance to use the available spectrum. Most of us have numerous WiFi devices in our homes, including computers, tablets, TVs, cellphones, and a wide range of smart home devices and toys. Behind the scenes, your WiFi router pauses while you’re downloading a big file to see if your smart thermostat or smartphone wants to communicate. Each pause is quick and might seem imperceptible, but while the router is checking on your thermostat, it’s not processing your file download.
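To see why those pauses add up, here is a toy model – my own simplification, not the actual 802.11 contention mechanism – of a router that spends a fixed slice of airtime checking in with every associated device. The 2% overhead per device is an invented number purely for illustration:

```python
def effective_throughput(link_mbps: float, devices: int,
                         overhead_per_device: float = 0.02) -> float:
    """Bandwidth left for one big flow after per-device polling overhead."""
    usable = max(0.0, 1.0 - devices * overhead_per_device)
    return link_mbps * usable

for n in (5, 15, 30):
    print(f"{n} devices: {effective_throughput(400, n):.0f} Mbps left of a 400 Mbps link")
```

Even this crude model shows the pattern: the more gadgets a router juggles, the less bandwidth is left for any single flow.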

To make matters worse, your current WiFi router also pauses for all of your neighbors’ WiFi networks and devices. Assuming your network is password-protected, these nearby devices won’t use your broadband – but they still cause your WiFi router to pause to see if there is a demand for communications.

The major flaw in WiFi is not the specification that allows all devices to use the network, but the fact that we currently try to conduct all of our WiFi communications through only a few channels. The combination of WiFi 6 and 6 GHz is going to fix a lot of the problems. The FCC approved the 6 GHz band for WiFi use in April 2020. This quadruples the amount of bandwidth available for WiFi. More importantly, the new spectrum opens multiple new channels (adds fourteen 80 MHz channels and seven 160 MHz channels). This means homes can dedicate specific uses to a given channel – direct computers to one channel, smart TVs to another, and cellphones to yet another. You could load all small-bandwidth devices like thermostats and washing machines onto a single channel – it won’t matter if that channel is crowded for devices that use tiny amounts of bandwidth. Separating devices by channel will drastically reduce the interference and delays that come from multiple devices trying to use the same channel.
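As a sketch of what that channel segregation could look like, here is a hypothetical mapping of device classes to dedicated channels. The channel labels and device classes are invented; a real router would expose this through its own configuration interface:

```python
# Hypothetical channel plan; labels and classes are illustrative only.
CHANNEL_PLAN = {
    "computers": "6 GHz / 160 MHz channel A",
    "smart_tvs": "6 GHz / 160 MHz channel B",
    "cellphones": "6 GHz / 80 MHz channel C",
    "iot": "2.4 GHz legacy channel 1",  # tiny-bandwidth gear can share one channel
}

def channel_for(device_class: str) -> str:
    """Look up the dedicated channel for a device class, defaulting to the IoT channel."""
    return CHANNEL_PLAN.get(device_class, CHANNEL_PLAN["iot"])

print(channel_for("smart_tvs"))  # -> 6 GHz / 160 MHz channel B
```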

The introduction of WiFi 6 is going to require devices that support the WiFi 6 standard and that enable the 6 GHz spectrum. We’re just starting to see such devices in stores.

It looks like the first readily available use of the new technology is being marketed as WiFi 6E, aimed at wireless devices. Samsung has released WiFi 6E in the Galaxy S21 Ultra phone, and it’s been rumored that WiFi 6E will be in Apple’s iPhone 13 handsets. Any phone using Qualcomm’s FastConnect 6700 or 6900 chips will be able to use the 6 GHz spectrum. That’s likely to include laptop computers in addition to cellphones.

It’s going to take a while to bring the new technology into widespread use. You can buy routers today that handle WiFi 6E from Netgear and a few other vendors, meaning that you could use the new spectrum at home for smartphones and devices with a 6E chip. The advantage of doing so would be to move cellphones off of the spectrum being used for applications like gaming, where WiFi interference is a material issue. The new WiFi 6E chips will also handle bandwidth speeds greater than 1 Gbps, which might benefit a laptop but is largely lost on a smartphone. It’s going to be a while until WiFi 6 is available at work or in public places – but over a few years, it will be coming.

The home WiFi network of the future is going to look drastically different from today’s network. One of the downsides of the 6 GHz spectrum is that it doesn’t travel as well through walls as current WiFi, and most homes are going to have to migrate to mesh networks of multiple routers. Smart homeowners will assign various devices to specific channels, and I assume that router software will make this easy to do. Separating WiFi devices onto different channels is going to eliminate almost all of the WiFi interference we see today. Big channels of 6 GHz spectrum will mean that devices can grab the bandwidth needed for full performance (assuming the home has good broadband from an ISP).

Changing the Definition of Broadband

A group of US Senators recently sent a letter to the FCC asking to raise the definition of broadband to 100/100 Mbps. This speed has been discussed for several years as the logical step forward from the current 25/3 Mbps definition set by the FCC in 2015. It’s clear to everyone in the industry that homes are using a lot more broadband than they did in 2015 – with the biggest change being the simultaneous use of multiple broadband streams in the typical home.

I thought I’d discuss just what it would mean to change the definition of broadband. I think a new definition would trigger the following:

  • This would make it clear that DSL is an obsolete technology. We’d no longer have to worry about the big telcos stretching the truth by claiming 25/3 Mbps speeds on DSL to stave off federal grants to overbuild them. No DSL will meet the 100/100 Mbps test, so DSL would automatically be considered incapable of delivering modern broadband, and any area served by rural DSL should automatically be eligible for federal grants.
  • A higher speed definition also declares other technologies to be inadequate. This eliminates high-orbit satellites from consideration for grant funding – something that should never have been allowed, as it was in the CAF II reverse auction. The new definition would declare older versions of fixed wireless technology obsolete and would require WISPs to upgrade to new technology if they want to be considered for grant funding. It also kills the idea that WISP networks with multiple wireless backhaul hops are adequate – only fiber-fed radios can meet the needed speeds.
  • This would put cable companies on the hot seat because many cable systems are not capable of 100 Mbps upload speeds. Most big cable companies did not bother to upgrade the upload portion of the network when they upgraded to DOCSIS 3.1, and cable companies that stuck with the older DOCSIS 3.0 technology will also fail the new upload requirement. Expect cable companies to fight fiercely against increasing the definition of broadband. If the new speed is adopted, expect to see cable companies quietly completing the mid-split upgrades to improve upload speeds – something they all should have done as soon as it was clear that poor upload speeds were the primary culprit for homes struggling during the pandemic. A change in the definition of broadband could goad cable companies into doing what they should have done as good corporate citizens.
  • This might also be a problem for the low-orbit satellite companies. There are already some early beta test results from Starlink that could pass the speed test – but many early speed tests do not. The technology is still in beta testing, and if Elon Musk is being truthful that download speeds will soon be 300 Mbps, then 100/100 Mbps might be a passable hurdle. However, critics of Starlink say that speeds are going to bog down tremendously as more customers are added to the satellite networks. We will probably learn more about Starlink’s likely speeds if the company pushes back against a faster definition of broadband.
  • This definition would mean that most of rural America would rightfully be declared to not have broadband. Homes served by fiber pass the test. Homes served by WISPs with the latest technology, from towers fed by fiber, within 2-3 miles of the transmitter and with good line-of-sight, can pass the test. Everything else used to provide broadband in rural America would no longer be considered broadband.
  • This drastically changes the picture for federal grants. Huge swaths of rural America were denied RDOF grants because telcos lied about the speed capability of rural DSL. A higher definition of broadband would paint a whole new picture in which the vast majority of rural America would be eligible for broadband grant assistance.
  • This also drastically changes the reporting to Congress on the state of US broadband. Recall that this reporting was the original reason the FCC established a definition of broadband. Overnight, we’d go from tens of millions of homes without good broadband to potentially over a hundred million – if the cable companies truthfully report on upload speed capability. That would paint the picture that every broadband consultant, engineer, and policy person already knows: much of America is unhappy with its current broadband. The 100/100 Mbps definition of broadband would align the FCC with the public perception of what is acceptable broadband.
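To make the proposed test concrete, here is a trivial sketch of the 100/100 Mbps definition applied to a few technologies. The speeds are typical-range assumptions on my part, not measured values:

```python
def is_broadband(down_mbps: float, up_mbps: float,
                 def_down: float = 100.0, def_up: float = 100.0) -> bool:
    """A connection counts as broadband only if both directions clear the bar."""
    return down_mbps >= def_down and up_mbps >= def_up

# Assumed typical speeds for each technology (download, upload) in Mbps.
examples = {
    "rural DSL": (15, 1),
    "DOCSIS 3.0 cable": (200, 10),
    "fiber-fed fixed wireless": (100, 100),
    "fiber": (1000, 1000),
}

for tech, (down, up) in examples.items():
    print(f"{tech}: {'passes' if is_broadband(down, up) else 'fails'} 100/100 Mbps")
```

Note that the cable example fails only on the upload side – which is exactly why the cable companies can be expected to fight the change.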

Broadband Shorts for March 2021

Following are a few broadband topics that I found of interest but that are too short for individual blogs.

The End of Project Loon. The Google parent Alphabet has killed Project Loon, the attempt to use a fleet of balloons to bring broadband to remote places. The project was started 9 years ago and was spun off as a separate company two-and-a-half years ago.

It’s a little surprising because Loon had some successes. Loon raised $219 million of equity from SoftBank in 2019. It was able to bring some broadband and cellular coverage to Puerto Rico after the devastating hurricanes, and it was recently approved by the government of Kenya to bring broadband to remote areas. The company’s stated goal was to bring Internet access to a billion people.

There are likely a few contributing factors to the decision. One is the pending rise of satellite broadband. Google also faced fierce pushback in places like India that didn’t want broadband fostered by a big US company. It was also likely becoming clear that it’s hard to base a company on providing subsidized broadband – that means lining up a lot of governments to pay the subsidies.

Surprising Success of Telehealth. Parks Associates conducted a study that shows that 41% of all US households took part in a telehealth visit in 2020. Further, 29% of homes say they are likely to engage in telehealth in 2021. About half of all kids under 18 have a high degree of interest in permanently adopting telehealth.

The survey also showed overall high satisfaction with the technical performance of telehealth. This is somewhat surprising since the vast majority of medical professionals scrambled to institute telehealth last spring. A majority of medical practitioners also expressed satisfaction with telehealth – 65% of healthcare organizations rate the 2020 telehealth delivery as a success and 94% plan to continue offering telehealth services.

While this is only one survey, when added to everything else being published about telehealth, it looks like this is something that’s going to stick.

Popularity of Working from Home. Masergy, a supplier of managed SD-WAN software, undertook a survey showing that about two-thirds of knowledge workers report being happier working from home. While employees were forced to work at home due to the pandemic, a lot of those working from home expressed a strong preference to never return to the office environment. Reasons given included increased productivity and avoiding the commute. Many of those working at home are comfortable with the idea of a hybrid schedule that puts them in the office occasionally, but mostly working from home.

From the employer’s perspective, the biggest challenge of 2020 has been security. But ISPs and software firms have developed solutions that seem to be working for most companies.

This has a lot of implications for both broadband and for corporations that employ knowledge workers. For ISPs, this means continued demand for upload bandwidth – something that I think cable companies were hoping would fade away with the end of the pandemic. This also puts pressure on employers, because workers that prefer working from home are going to migrate to corporations that embrace the idea – while ones that don’t might have trouble finding the best talent.

Really? There is a seller on eBay apparently successfully marketing 5G-repellent cream. Without even mentioning the size of the jar, they are selling the lotion for $36 under the brand name ‘5 Guard’. I find this funny because the 5G radiation that most scares people is millimeter-wave spectrum, which can’t penetrate human skin more than perhaps a cell or two deep. I’m sure this isn’t the first 5G repellent – just the first one I ran across.

No Longer News – Big Telcos Ignore Copper

In news that falls under the category of ‘why did they even bother’, the California Public Utilities Commission released a report that details how the big telcos in the state have allowed their copper networks to deteriorate. The report was originally written in 2019, but as tends to happen in California, the telcos were able to quash the report for a few years by claiming it contained proprietary data.

The report is a great but sad read for anybody who wants to hear the story of the slow death of copper networks. My first reaction on reading the report was to ask why the CPUC is even pretending that it has any regulatory authority over copper networks, because AT&T and Frontier largely ignore the Commission on anything related to copper.

This is a report that could have been written in almost any state because copper is dying regardless of whether the telco is AT&T, Frontier, CenturyLink, Windstream, Consolidated, or others. The report is a good reminder that the neglect that is now blamed on Frontier in California was mostly due to Verizon, which owned the copper networks until 2016.

The CPUC documents situations that are familiar to customers on copper networks everywhere. AT&T and Frontier failed to meet the CPUC’s goal of fixing 90% of reported troubles within 24 hours. Amazingly, AT&T reports hitting that goal in only 2 of the 96 months covered by the CPUC report. Even that number hides the fairly common practice of telling customers that troubles can’t be fixed and cutting them off the network.

The companies have also raised rates on telephone service. AT&T raised rates for landline telephone from $10.69 in 2006 to $27 in 2018, allowing the company to maintain revenue on copper even as demand dropped. Measured rate service, which is supposed to be a lot cheaper for homes that don’t make many calls, increased from $5.70 in 2006 to $24.25 in 2018.
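As a quick sanity check on those numbers, the implied average annual increase is easy to compute – a simple compound-growth calculation, not anything from the CPUC report:

```python
def annual_growth(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start price and an end price."""
    return (end / start) ** (1 / years) - 1

print(f"flat-rate service: {annual_growth(10.69, 27.00, 12):.1%} per year")     # ~8.0%
print(f"measured-rate service: {annual_growth(5.70, 24.25, 12):.1%} per year")  # ~12.8%
```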

The CPUC report also hints that AT&T is likely redlining poorer neighborhoods in areas where it has upgraded to fiber. Areas with fiber have a 2010 median household income of $72,024, while areas still on copper have a median household income of $60,795. These comparisons are a bit hard to make because AT&T is not converting whole neighborhoods to fiber – just selected small pockets.

Since the report was written, the AT&T situation has gotten worse: the company decided in October 2020 to stop selling DSL to new customers. Customers with DSL are allowed to keep service for now, but this means that AT&T is 100% ceding markets where there is a cable company and is leaving rural households with few broadband options beyond satellite. This has to be a move by AT&T to start the process of walking completely away from copper. Frontier has promised to expand its fiber footprint as a condition of coming out of bankruptcy, but that is likely to happen in cities, towns, and suburbs, not in rural areas.

It’s time for state regulators to stop gnashing their teeth over copper networks. The service sucks and is going to get worse and worse until the networks die. If states had any real regulatory authority over copper networks, they’d confiscate them from the telcos and sell them to somebody else for a dollar. Any buyer would do a better job of keeping the lights on than the big telcos – and a buyer would likely use the remaining copper revenues to fund a conversion to fiber.

Is Fiber a Hundred Year Investment?

I think every client who is considering building a fiber network asks me how long the fiber is going to last. Their fear is having to spend the money at some future point to rebuild the network. Recently, my response has been that fiber is a hundred-year investment – and let me explain why I say that.

We’re now seeing fiber built in the 1980s becoming opaque or developing enough microscopic cracks that impede the flow of light. A fiber built just after 1980 is now forty years old, and the fact that some fiber routes are now showing signs of aging has people worried. But fiber cable is much improved over the last forty years and fiber purchased today is going to avoid many of the aging problems experienced by 1980s fiber. Newer glass is clearer and not likely to grow opaque. Newer glass is also a lot less susceptible to forming microcracks. The sheathing surrounding the fiber is vastly improved and helps to keep light transmissions on path.

We’ve also learned a lot about fiber construction since 1980. It turns out that a lot of the problems with older fiber are due to the stress imposed on the fiber during the construction process. Fiber used to be tugged and pulled too hard and the stress from construction created the places that are now cracking. Fiber construction methods have improved, and fiber enters service today with fewer stress points.

Unfortunately, the engineers at fiber manufacturers won’t cite a life for fiber. I imagine their lawyers are worried about future lawsuits. Manufacturers also understand that factors like poor construction methods or suffering constant fiber cuts can reduce the life of a given fiber. But off the record, I’ve had lab scientists at these companies conjecture that today’s fiber cable, if well handled, ought to be good for 75 years or more.

That still doesn’t necessarily get us to one hundred years. It’s important to understand that the cost of updating fiber is far less than the cost of building the initial fiber. The biggest cost of building fiber is labor. For buried fiber, the biggest cost is getting the conduit into the ground. There is no reason to think that conduit won’t last for far more than one hundred years. If a stretch of buried fiber goes bad, a network owner can pull a second fiber through the tube as a replacement – without having to pay again for the conduit.

For aerial fiber, the biggest cost is often the make-ready effort to prepare a route for construction, along with the cost of installing a messenger wire. To replace aerial fiber usually means using the existing messenger wire and no additional make-ready, so replacing aerial fiber is also far less expensive than building new fiber.

Economists define the economic life of any asset to be the number of years before an asset must be replaced, either due to obsolescence or due to loss of functionality. It’s easy to understand the economic life of a truck or a computer – there comes a time when it’s obvious that the asset must be replaced, and replacement means buying a new truck or computer.

But fiber is a bit of an unusual asset where the asset is not ripped out and replaced when it finally starts showing end-of-life symptoms. As described above, it’s much cheaper than the original construction costs to bring a replacement fiber to an aerial or buried fiber route. Upgrading fiber is more akin to upgrading a properly constructed building – with proper care buildings can last for a long time.

Many similar utility assets are not like this. My city is in the process today of upgrading a few major water mains that, unbelievably, were built with wooden pipes a century ago. Upgrading the water system means laying an entirely new pipe to replace the old one.

It may sound a bit like a mathematical trick, but the fact that replacing fiber doesn’t mean a 100% replacement cost means that the economic life is longer than with other assets. To use a simplified example, if fiber needs replacement every sixty years, and the replacement requires only half of the original cost, then the economic life of the fiber in this example is 120 years – it takes that long to spend as much as the original cost of the asset.
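That simplified example can be stated as a one-line formula – my framing of the arithmetic above, not a standard accounting definition:

```python
def economic_life_years(replacement_interval: float, cost_fraction: float) -> float:
    """Years until cumulative replacement spending equals the original build cost."""
    return replacement_interval / cost_fraction

# Replace every 60 years at half the original cost -> 120-year economic life.
print(economic_life_years(60, 0.5))  # -> 120.0
```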

I know that people who build fiber want to know how long it’s going to last, and we just don’t know. We know that if fiber is constructed properly, it’s going to last a lot longer than the 40 years we’re seeing from 1980s fiber. We also know that in most cases replacement doesn’t mean starting from scratch. Hopefully, those facts give comfort that the average economic life of fiber is something greater than 100 years – we just don’t know how much greater.

2,000 Blogs and Counting

Doug Dawson, 2017

I’m taking a short pause from broadband issues because today is blog number 2,000. I look at that number and I have no idea how it happened. I’ve published almost every business day since March 2013 – only missing a few days when I was sick and a few times due to technical snafus. If you told me in 2013 that I’d still be doing this every day in 2021 I would have laughed at such a crazy idea.

I started writing the blog at the urging of my wife Julie. Back in 2013, I told her that I was having trouble keeping up with everything that was going on in the industry. That’s something most people can sympathize with – there are multiple headlines in our industry every day. Julie knew I was an okay writer and suggested the blog as a way to force myself to keep up with industry events.

In the beginning, I was writing only for myself, and the blog was a place to store my interpretation of industry news for later use. But somehow, I started to pick up readers. I’ve never advertised the blog other than referring to it at the end of my emails, but month after month and year after year the daily readership has grown.

Once in a while, a blog goes viral. The most readers I got was over 40,000 in a day when I wrote an article wondering how well Starlink would perform. The blog ended up on Reddit and I got a huge number of comments from Elon Musk fans who made rude references about my lineage. The silly thing about the reader reactions was that most comments were based strictly on the headline and it was clear that few commenters had actually read the blog – I agreed with most of the comments except the parts about my lineage. I pulled that blog down to cut off the nonsense, but it gave me an appreciation of what it must be like to be an actual journalist. My second most popular blog talked about how squirrels and gophers chew through fiber – the comments were a lot nicer!

I still write the blog as a way to force myself to keep up. It’s a busy time to be a broadband consultant and I could easily get lost in work and ignore the industry around me. To some extent I’m still writing to myself, which is why you’ll sometimes see blogs full of statistics – these blogs help me store facts I know I’m going to want to use later.

I think my favorite aspect of the blog is that it has led to meeting some of the most interesting people in the industry. I seem to meet somebody new through the blog almost weekly.

Readers often make my life easier by sending me links to interesting things – small-town newspaper articles or whitepapers published overseas that would never otherwise come to my attention.

Another interesting aspect of the blog is that I have readers worldwide. Just yesterday I had readers in India, Canada, the UAE, the UK, South Africa, Oman, Australia, Kenya, the Philippines, South Korea, Germany, Ukraine, Portugal, Turkey, and Mauritius. My blogs on the nuances of US regulations or FCC actions are probably baffling to these readers, but a lot of my blogs talk about the problems suffered by lack of good broadband – a problem all around the globe.

I’m also read by a few college students who want to know more about the broadband industry. They often send me great questions, which I try to answer when I can.

Probably the biggest change in the blog is that over time I’ve found a voice. As interesting as the industry is, we have a lot of problems. The pandemic made it clear that there are still far too many people without good broadband. The biggest ISPs could do a lot better and often do more harm than good. Regulators often do puzzling things. I no longer shy away from giving my opinion on such topics. I don’t think for a second that I am moving the meter on any topic, but I hope that it’s valuable for readers to hear a perspective that’s not published in many other places. The things I say in the blog mirror the conversations I have with clients and peers. Mostly I hope that I am helping to inform people who live in places that need better broadband and encouraging them that there are solutions if they keep plugging away.

I don’t know how much longer I will keep writing every day. But I still enjoy the daily break to write a blog, and starting tomorrow I’m aiming at blog number 3,000!

Broadband and Bad Actors

I wrote a sentence in a blog the other day that stuck with me: “Fiber is not automatically a great technology – it can be a great technology when operated by a great ISP.” It’s becoming clearer to me over time that much of our broadband grief in this country comes from what I call bad actors. By this, I mean ISPs that put quarterly earnings above customers. Almost all of the trouble we see with rural broadband comes directly from the behavior of bad actors.

The original bad actors are the big telcos. These companies decided decades ago that they were going to stop maintaining rural networks – and they did so with a vengeance. They cut back drastically on the rural technician workforce. They closed business offices everywhere so that customers had to talk with people who had no idea where they lived. Year after year, the telcos reduced the amount of maintenance capital available to fix customer problems until the budgets were nearly zero.

Why do I call the big telcos bad actors? They purposefully let copper networks deteriorate. Each big telco made a decision in the boardroom to milk revenues out of remaining copper customers while spending the bare minimum amount of money needed to keep the lights on.

We know the big telcos could have done much better. Many of the small telcos across the country took a different path. They worked to keep copper in good operating condition and upgraded DSL technology several times to bring rural download speeds to between 10 Mbps and 30 Mbps. These companies also didn’t stop doing maintenance or cut back on customer service. We saw the same thing in places like Germany, where the big telcos have gotten the most possible out of old copper and DSL.

What really made the big telcos into bad actors was when they took huge amounts of funding to make DSL better and largely pocketed the money instead of providing upgrades. I’ve written about the disaster of the CAF II program many times. The FCC should never have given this money to the telcos, but when they did, the big telcos purposefully misused the funds – that’s being a bad actor.

The big telcos aren’t the only bad actors. The big cable companies are slipping into bad actor mode. The cable companies had to compete against DSL for the first decade or so after introducing broadband, but they’ve won that battle now and are becoming monopolies – in many markets where AT&T walked away from DSL recently, the cable company is now a monopoly.

There are a number of reasons why I now consider big cable companies to be bad actors. Their billing practices are deceptive, and a significant portion of every bill is now buried in hidden fees. The cable companies use data caps that punish homes for using the bandwidth they’ve purchased. The cable companies have notoriously dreadful customer service, the worst of any corporations in the US – the public likes the IRS and funeral homes more than they like Comcast.

The big cable companies are also headed down the path of neglecting networks. In every city I’ve studied, there are some neighborhoods that have sluggish broadband – and the cable companies don’t bother spending the money to fix the problems. The cable companies elected not to implement the upload speed upgrades that were part of DOCSIS 3.1 and are quietly ignoring the upload crisis right now. If the cable companies cared about customers more than the bottom line, they would profusely apologize for the upload performance that has crippled millions of homes during the pandemic and would be doing everything in their power to boost upload speeds – instead, I hear only crickets.

This all matters because the federal government continues to give money to the bad actors. The FCC is flowing an additional $1.5 billion to the big telcos this year from the CAF II program. Big telcos and cable companies have won billions in the recent RDOF auction. There is some chance that those companies won’t build everything they should, or will cut corners and build the absolute minimum needed to fulfill the grant requirements. But even if they build what they’ve promised, they are likely going to continue taking advantage of customers. Is there any reason to think that a big telco is going to take better long-term care of a rural fiber network than it has of copper? It’s not hard to predict that bad actors still won’t fund sufficient technicians or enough maintenance capital. Unfortunately, bad actors are unlikely to get better. If I had a magic wand, the bad actors would never see another penny of federal subsidy – they’ve shown repeatedly that they don’t deserve it.

California Net Neutrality Rules Go Into Effect

A federal judge in the Eastern District of California has allowed the California net neutrality law to go into effect. The law was passed in 2018, soon after the FCC killed federal net neutrality along with broadband regulation. Here is a copy of the California statute.

The California net neutrality law was immediately met with a suit from the US Justice Department, which argued that California didn’t have the authority to pass a law that impacts interstate commerce. The law was also challenged in court by America’s Communications Association, CTIA, the NCTA, and US Telecom on behalf of the biggest ISPs in the country. That lawsuit claimed the California law was a “classic example of unconstitutional state regulation”.

The court placed a stay on the implementation of the law until the lawsuits were resolved, and the case then bogged down in court with little progress toward resolving the issue. Recently, the US Department of Justice withdrew its objections to the California law, leaving only the big ISP suit, and Judge John A. Mendez decided to lift the injunction on the law.

Theoretically, the law will go into effect immediately. However, the four telecom associations that brought the original suit may still try to get another injunction against the law going into effect.

Judge Mendez scolded Congress for not dealing with the issue: “When you have to deal with legislation drafted in 1934 in 2021, I don’t think anyone is well served … That is Congress’ job. They have to keep up with what is going on in the real world.” This is a sentiment that almost anybody following broadband regulation will echo. The judge also said that lifting the injunction was done strictly on legal grounds and that nothing political should be read into his decision.

It’s going to be interesting to see how the big ISPs deal with this ruling. They can’t ignore it, since California would be the fifth-largest economy in the world if it were a standalone country. It’s going to be challenging for ISPs to act one way in California and a different way everywhere else. Even if they somehow try to do that, several other states have also passed net neutrality rules that have been on hold waiting for the resolution of the California case. It’s not hard to envision dozens of slightly different sets of net neutrality rules.

The California law largely mimics the original FCC net neutrality rules. It includes things like a ban on paid prioritization. It would mean the end of zero-rating, where a company imposes data caps while excluding its own content from the caps. One of my favorite aspects of the law is that it forces ISPs to tell customers the actual data speeds they are likely to receive – which is probably the part of the net neutrality order that carriers dislike the most.

This is the big ISPs’ nightmare scenario. Having different net neutrality rules in each state would be a regulatory mess, and likely unworkable if the various states adopt different versions of the rules. But ultimately, the fault for this can be laid at the feet of the big ISPs, who put tremendous effort into killing the federal rules. Ironically, the CEOs of all of the big ISPs have been on record saying that they could live with net neutrality – but federal net neutrality was killed as part of the effort to kill FCC regulation of broadband. It would be fitting if the big ISPs at some point have to lobby for federal net neutrality rules rather than face multiple state rules.