Why No Outcry about AT&T DSL?

I’ve been a little surprised that there hasn’t been any regulatory reaction to AT&T pulling out of the DSL market last October. The company stopped taking orders for DSL connections. I’ve heard of instances where the company won’t even connect a voice customer on copper. I’ve heard stories that once DSL is disconnected for any reason, the company won’t reconnect it. If somebody buys a house served only by AT&T, the new owner can’t get DSL, even if the old owner had it. If somebody gets disconnected for late payment, they aren’t being allowed to reconnect. I’ve also heard a few stories lately of customers who had technical trouble with DSL and were told that the company can’t and won’t fix it.

This is a huge change for customers. In towns where there is a cable competitor, AT&T has suddenly made the cable company into a de facto monopoly as the only broadband option for a new customer. In rural areas where there is no cable competitor, the change strands homes with no landline alternative. AT&T says publicly that rural DSL is being replaced by fixed cellular broadband – but it seems like the wireless product is far from universally available. Many homes are left with no alternative other than satellite broadband, assuming they have a decent view of the sky.

I really thought that at least a few states would react to the change. Just a year ago, the Public Service Commission in New Mexico took CenturyLink to task for doing a poor job for copper customers. I expected some states to be up in arms about AT&T. Perhaps state regulators have finally given up on telephone copper and are getting realistic about the coming end of the copper networks. Or perhaps AT&T has a strong enough lobbying effort in the states to stave off a public inquiry.

AT&T took a different approach than Verizon, which has been following the formal rules for retiring copper networks. Verizon goes through the process of notifying all customers that it is withdrawing copper services, and months later cuts the copper dead. AT&T is not going out of service yet. Customers with DSL are allowed to keep the service. But give the company the slightest reason to disconnect, and AT&T is gone. The company is withdrawing from copper through attrition instead of a formal withdrawal. It’s an interesting tactic because it doesn’t trigger any of the regulatory rules associated with fully walking away from copper. It’s pretty clear that this is AT&T’s first step and that the day will come when the company will disconnect the remaining customers and walk away from the copper business entirely.

I’ve been hearing similar stories for several years about CenturyLink and Frontier. Customers are put on years-long wait lists to get new service. Customers are often told that problems can’t be fixed, and the telco walks away instead of making repairs. But those two companies have not formally stopped providing DSL to new customers in the manner of AT&T.

The reaction that I was expecting was for states to make AT&T prove that it has a fixed cellular alternative in an area before refusing customers on copper. That would be hard for AT&T to do since many rural areas have poor or no cellular service.

The biggest impact of this change is not in rural areas. AT&T’s rural DSL in most places has been so slow that it barely works. Many rural homes walked away from DSL years ago since the monthly rate couldn’t be justified with the tiny bandwidth being delivered. The bigger impact from this change comes in cities and towns where AT&T is the telephone incumbent. DSL has retained a decent market share in many towns because it’s cheaper than cable broadband. In the many towns and cities that AT&T serves, the company has taken away the low-price option from the market. I’m sure that potential customers are still being surprised when they find out that the DSL option is off the table. I haven’t seen any noticeable reaction from the cable companies in these markets, but by now, they realize they are a monopoly. We know over time this will mean a slow return to monopoly practices – higher prices, less attention to maintenance, slower repair times, and less incentive to upgrade.

The Senate Broadband Grants

Now that it looks like the House might pass an infrastructure bill, I figured it was time to take a harder look at the $42.45 billion Senate broadband grant program. There is still more to be done before this becomes law – the House needs to pass its infrastructure legislation, and then any differences between the two bills must be reconciled. That still gives lobbyists a lot of time to try to change the rules. The following is not a complete summary of the Senate bill, just the highlights that I think are the most important.

The Senate created a $42.45 billion grant program to be administered by the NTIA. Administered in this case means the NTIA will interpret the final grant rules within the confines set by the legislation. The grant money will go to states, and states will pick grant winners using the rules dictated by the NTIA. This won’t be the free-for-all we’ve seen with CARES and ARPA funding, and I expect fairly explicit rules from the NTIA based upon whatever is demanded in the final legislation. 10% of the funding will be distributed nationwide to reach the highest-cost areas to serve. That determination probably comes from the FCC’s cost models. Every state will get at least $100 million.
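
Just to make the mechanics concrete, here’s a minimal sketch of how a state allocation along these lines might work, assuming a $100 million floor per state with the rest split in proportion to each state’s share of unserved locations. The three-state universe and the unserved counts are invented for illustration – the real split will depend on the NTIA’s final rules and the FCC maps.

```python
# A minimal sketch of the allocation mechanics described above: a
# $100 million floor per state, 10% reserved for the highest-cost
# areas, and the rest divided by each state's share of unserved
# locations. The three-state universe and unserved counts are
# invented for illustration (which is why the dollar figures come
# out absurdly large); the real split will use the FCC maps.

TOTAL_FUND = 42.45e9
FLOOR = 100e6
HIGH_COST_SHARE = 0.10

unserved = {"State A": 400_000, "State B": 150_000, "State C": 50_000}

base_pool = TOTAL_FUND * (1 - HIGH_COST_SHARE)
remainder = base_pool - FLOOR * len(unserved)
total_unserved = sum(unserved.values())

for state, count in unserved.items():
    award = FLOOR + remainder * count / total_unserved
    print(f"{state}: ${award / 1e9:.2f} billion")
```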

The grants define unserved to mean areas that lack access to 25/3 Mbps broadband. Underserved means areas that have broadband faster than 25/3 Mbps but slower than 100/20 Mbps. Since speed is the determining factor, this means using the FCC mapping data – unless the NTIA allows an alternate method. States must make grant funding available to cooperatives, nonprofit organizations, public-private partnerships, private companies, public or private utilities, public utility districts, and local governments.
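
The unserved/underserved test is simple enough to express directly. Here’s a minimal sketch of the classification as the bill describes it, using the speed thresholds from the legislation:

```python
# A minimal sketch of the unserved/underserved test using the speed
# thresholds in the bill: below 25/3 Mbps is unserved, at least 25/3
# but below 100/20 is underserved, and 100/20 or better is served.

def classify(down_mbps: float, up_mbps: float) -> str:
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"
    if down_mbps < 100 or up_mbps < 20:
        return "underserved"
    return "served"

print(classify(10, 1))    # unserved - typical rural DSL
print(classify(50, 5))    # underserved - a common cable starter tier
print(classify(300, 20))  # served
```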

The funding can be used for several purposes, including building last-mile infrastructure to unserved and underserved locations, connecting to anchor institutions, data collection and mapping, providing WiFi or reduced-cost broadband to eligible MDUs, broadband adoption, or any other use allowed by the NTIA. This list is worth noting because not all of the money goes to last-mile infrastructure.

A state must certify that it will bring broadband to all unserved areas and anchor institutions before money can be used for underserved areas. States must prioritize the following in awarding grants: deployment to persistent poverty counties and areas, the speeds of the proposed technology, the length of time required to build a network, and compliance with federal labor laws.

There is a challenge process, and local governments, nonprofit organizations, or ISPs can challenge the eligibility of proposed grant areas, including if an area is unserved or underserved. The NTIA can intervene in these challenges.

Grant winners must bring at least 25% matching funds. Matching funds can be in cash or in-kind contributions. Matches can be made from CARES and ARPA funds.

Grant winners must build broadband networks that provide speeds of at least 100/20 Mbps. Broadband must be made available to every home and business in a grant area that wants service. Grant recipients must offer at least one low-cost broadband service option for eligible subscribers – the NTIA will determine the definition of an acceptable low-cost option.

It won’t be easy to understand the winners and losers in this grant until after the NTIA crafts the specific grant rules. The cable companies, WISPs, and maybe even the satellite companies already won a huge battle by setting the eligible technology requirement to 100/20 Mbps. The challenge process allows the incumbents to delay the grant process and create havoc.

But the public is also a big winner. There have been shenanigans for years from telcos that have lied about rural areas that can receive 25/3 Mbps broadband. Allowing funding to areas with speeds up to 100/20 Mbps washes away most of the past nonsense.

The big ISPs likely also view this as a victory because they probably feel that they have a decent chance in some states of winning most of the grant funding. How well the grant will work in a state is going to depend upon how well a particular state does its job – many states are not prepared to handle this kind of grant program.

There are still areas that will fall through the cracks. For example, it might be financially infeasible for an ISP to take this funding in high-cost places like Appalachia if an ISP has to provide a 25% match while also offering a low-income broadband product. There are still places where costs are so high that a 75% grant is not sufficient.

Just about everybody won by not using a reverse auction.

Is Wireless Power a Possibility?

Wireless power transmission (WPT) is any technology that can transmit electrical power between two places without wires. As we move towards a future with small sensors in homes, fields, and factories, this is an area of research that is getting a lot more attention. The alternative to wireless power is to put small batteries in sensors and devices, batteries that must periodically be replaced.

There are half a dozen techniques that can be used to create electric power remotely. Most involve transmitting some form of electromagnetic radiation to excite a remote receiver that converts the energy into electricity. There have been trials using frequencies of all sorts, including microwaves, infrared light, and radio waves.

The most common form of wireless power transmission today is the charging pad that can recharge a cellphone or other small device. This technology uses inductive coupling, which involves passing alternating current through an induction coil. Since any moving electrical current creates a magnetic field, the induction coil creates a magnetic field that fluctuates in intensity as the AC current constantly changes, and that fluctuating field induces a current in a second coil inside the device being charged. A cellphone pad only works over a short distance because the coils inside the devices are small.
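
For anyone curious about the underlying physics, here’s a minimal sketch of the induction math using Faraday’s law. The coil and frequency values are illustrative guesses, not the specs of any real charging pad:

```python
# A minimal sketch of the induction physics: an AC current in the
# transmit coil creates a changing magnetic flux, and Faraday's law
# gives the voltage induced in the receive coil. The mutual
# inductance and current are illustrative guesses, not the specs of
# any real charging pad.

import math

M = 5e-6      # mutual inductance between the coils (henries), assumed
I_peak = 1.0  # peak AC current in the transmit coil (amps), assumed
f = 200e3     # drive frequency (hertz), roughly Qi-charger territory

# For i(t) = I_peak * sin(2*pi*f*t), the induced EMF is
#   emf(t) = -M * di/dt = -M * I_peak * 2*pi*f * cos(2*pi*f*t)
emf_peak = M * I_peak * 2 * math.pi * f
print(f"Peak induced EMF: {emf_peak:.2f} V")  # about 6.3 V

# Mutual inductance falls off rapidly with distance between the
# coils, which is why a charging pad only works at very short range.
```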

There are a few household applications where induction charging works over slightly greater distances, such as charging electric toothbrushes and some hand tools. We’ve been using the technology to recharge implanted medical devices since the 1960s. Induction charging has also been implemented on a larger scale. In 1980, scientists in California developed a bus that could be recharged wirelessly. There is currently research in Norway and China on topping off the charge in car and taxi batteries so electric vehicles don’t have to stop to recharge.

There have been successful uses of transmitted radiation to create remote electricity over great distances. Radio and microwaves can be beamed great distances to excite a device called a rectenna, or rectifying antenna, which converts the transmitted energy into electricity. The technique has never produced a lot of power, but scientists are looking at it again because it could be a way to charge devices like farm sensors in fields.

The private sector is exploring WPT solutions for everyday life. Wi-Charge is using safe infrared light to charge devices within a room. Energous has developed a radio transmitter that can charge devices within a 15-meter radius. Ossia is developing wireless charging devices for cars that will automatically charge cellphones and other consumer devices. We’re not far away from a time when motion detectors, smoke alarms, CO2 sensors, and other devices can be permanently powered without a need for batteries or hardwiring.

Scientists and manufacturers are also exploring long-distance power transmission. Emrod in New Zealand is exploring bringing power to remote sites by beaming radio waves. On an even grander scale, NASA is exploring the possibility of beaming power gathered from giant solar arrays in space down to earth.

Remote power was originally envisioned by Nikola Tesla, and perhaps over the next few decades it will become an everyday technology that we take for granted. I’m just looking forward to the day when I’m not awakened in the middle of the night by a smoke detector that wants me to know it’s time to change the battery.

Supply Chain Issues are Here

2021 is already shaping up as the biggest year we’ve ever had for building fiber. As busy as it is now, we are just warming up when looking out at the huge amounts of fiber that might be built as the result of the ARPA grants, aggressive state grant programs, and the possibility of a massive federal infrastructure program. On top of all of this, all of the big telcos have announced aggressive plans to finally build fiber.

AT&T reported recently at an investor conference that supply chain issues will likely mean that the company will only achieve 2.5 million of the 3 million planned new passings for the year. AT&T didn’t name the vendor that was the primary reason for the slowdown, but it’s likely that it’s either Corning or CommScope.

This news has to be sounding loud alarms in boardrooms everywhere in the industry because if AT&T has supply chain issues, then everybody else is likely to have worse ones. It’s hard to imagine that every manufacturer in the industry isn’t giving AT&T the highest priority in its queue. If AT&T can’t buy everything they want, then how will smaller telcos meet fiber expansion goals? How will new fiber overbuilders like cities using ARPA funds be able to break into an overloaded supply chain?

Supply chain issues are arising for a variety of reasons, all of which might come together to create a perfect storm for the industry. One reason for shortages is manufacturing capacity. For example, Corning saw revenues jump by 21% in the recently ended second quarter compared to a year earlier. Factories are already working at or near capacity and can’t flip a switch to produce 20% more product. And demand is going to grow a lot more. The consulting firm RVA LLC recently predicted that the industry plans to build fiber past 61 million homes between this year and 2025 – far more fiber than has ever been built.

Manufacturers are also still suffering from a lack of the raw ingredients needed to make key components. This is one of the key issues behind the chip shortage and the shortage of electronics cases made from resin. Much of the global supply chain has not recovered from the impacts of the pandemic – and as the delta variant sweeps the world, this issue is far from behind us.

There are also more mundane supply chain issues. There is still a shortage of truck drivers and port capacity to deliver the glut of materials and products hitting the market as the economy rapidly improves. Apparently, during the pandemic shutdowns, many truckers decided they were tired of life on the road and are pursuing something else. The industry is having a hard time training new truckers at the needed pace, and truck driving schools are working overtime.

There are also more subtle changes behind the scenes. For example, many manufacturers have quietly looked for sources other than China during the pandemic. Many companies have come to realize that their own success was tied too closely to supply chains that were wholly within specific regions of China. Switching supply sources to other countries is not something that happens overnight, and many of these new relationships are still growing and maturing.

Some of these issues will get solved over time. But the bigger issue of unprecedented demand is likely to plague us for much of the next decade. AT&T is not going to give up on the half-million homes it won’t reach this year – that demand just gets shoved into next year.

Broadband Growth Continues 2Q 2021

Leichtman Research Group recently released the broadband customer statistics for the end of the second quarter of 2021 for the largest cable and telephone companies. LRG compiles most of these numbers from the statistics provided to stockholders, other than for Cox, which is estimated. Leichtman says this group of companies represents 96% of all US landline broadband customers.

The industry added 891,525 net broadband customers for the quarter, a drop from the 1,020,097 customers added in the first quarter.

                      2Q 2021        2Q Change    % Change
Comcast               31,388,000     354,000      1.1%
Charter               29,634,000     400,000      1.4%
AT&T                  15,481,000     46,000       0.3%
Verizon               7,263,000      70,000       1.0%
Cox                   5,485,000      50,000       0.9%
CenturyLink           4,666,000      (62,000)     -1.3%
Altice                4,401,300      200          0.0%
Frontier              2,798,000      (22,000)     -0.8%
Mediacom              1,468,000      14,000       1.0%
Windstream            1,131,800      9,500        0.8%
Cable ONE             1,017,000      14,000       1.4%
WOW!                  826,300        3,400        0.4%
Atlantic Broadband    517,004        6,847        1.3%
TDS                   513,600        11,900       2.4%
Cincinnati Bell       437,800        200          0.0%
Consolidated          393,480        (4,522)      -1.1%
Total                 107,422,131    891,525      0.8%

Total Cable           74,737,451     842,447      1.1%
Total Telco           32,684,680     58,122       0.2%

As we’ve seen for several years, Comcast and Charter are taking most of the new customers in the industry and together captured 754,000 new customers, or 85% of the net customers added for the quarter. The cable companies collectively added 842,447 customers in the second quarter compared to 58,122 for the telcos. If the growth rate so far this year is sustained, the industry will add around 3.8 million customers this year, a drop from the 4.8 million new broadband customers added in 2020.

There are some interesting numbers inside this report. AT&T’s new emphasis on building fiber seems to be paying off as the company added 97,000 net new customers this year. That’s extraordinary considering that the company has stopped installing new DSL customers – the company lost 1 million DSL customers last year, and this year the loss is likely to be higher. The third-highest customer growth comes from Verizon, which has added 134,000 new customers for the year.

We are perhaps seeing why CenturyLink is selling a big pile of copper assets – the company lost a net of 101,000 broadband customers so far this year and continues to get clobbered on DSL. CenturyLink surpassed Frontier as the biggest percentage loser of customers for the year, although Frontier continues to lose DSL customers rapidly as well.

The biggest percentage gainer for the quarter is TDS, with quarterly growth of 2.4%. For those not familiar with the company, about half of its customers are on copper and fiber, with the rest served with cable technology.

It’s getting harder to understand the dynamics behind broadband growth. For instance, the second quarter included the first wave of low-income customers added through the EBB subsidy program that was funded by the Consolidated Appropriations Act. Only the ISPs know how much of the growth this year came from that program.

Are We Ready for Big Bandwidth Applications?

There is a recent industry phenomenon that could have major impacts on ISP networks in the relatively near future. There has been an explosion of households that subscribe to gigabit data plans. At the end of 2018, only 1.8% of US homes subscribed to a gigabit plan. This grew to 2.8% by the end of 2019. With the pandemic, millions of homes upgraded to gigabit plans in an attempt to find a service that would support working from home. By the end of the third quarter of 2020, gigabit households had grown to 5.6% of all households, a doubling in nine months. By the end of last year, this mushroomed to 8.5% of all households. OpenVault reports that as of the end of the first quarter of 2021, 9.8% of all households subscribe to gigabit plans.
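
Those percentages are worth a quick sanity check. Here’s a minimal sketch that computes the growth between the milestones cited above:

```python
# A quick sanity check of the adoption milestones cited above, using
# the OpenVault percentages from the text.

milestones = [
    ("4Q 2018", 1.8),
    ("4Q 2019", 2.8),
    ("3Q 2020", 5.6),
    ("4Q 2020", 8.5),
    ("1Q 2021", 9.8),
]

for (p1, a), (p2, b) in zip(milestones, milestones[1:]):
    print(f"{p1} -> {p2}: {a}% -> {b}% ({b / a:.1f}x)")

# 2.8% -> 5.6% over the first three quarters of 2020 is the doubling
# in nine months described above.
```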

I have to think that a lot of these upgrades came from homes that wanted faster upload speeds. Cable company broadband is stingy with upload speeds on basic 100 Mbps and 200 Mbps plans. Surveys my company has done show a lot of dissatisfaction with urban ISPs, and my guess is that most of that unhappiness is due to sluggish upload performance.

Regardless of how we found ourselves at this place, one out of ten households in the US now buys gigabit broadband. As an aside, that fact alone should remove 25/3 Mbps from any further discussion of what constitutes broadband.

My ISP clients tell me that the average gigabit household doesn’t use a lot more bandwidth than customers buying 100 Mbps broadband – they just get things faster. If you’ve never worked on a gigabit connection, you might not understand the difference – but with gigabit broadband, websites appear on your screen almost instantaneously. The word I’ve always used to describe gigabit broadband is ‘snappy’. It’s like snapping your fingers and what you want appears instantly.

I think the fact that 10% of households have gigabit speeds opens up new possibilities for content providers. In the early days after Google Fiber got the country talking about gigabit fiber, the talking heads in the industry were all asking when we’d see gigabit applications. There was a lot of speculation about what those applications might do – but we never found out because nobody ever developed them. There was no real market for gigabit applications when only a handful of households were buying gigabit speeds. Even at the end of 2019, it was hard to think about monetizing fast web products when less than 3% of all homes could use them.

My instincts tell me that hitting a 10% market share for gigabit subscribers has created the critical mass of gigabit households that might make it financially worthwhile to offer fast web applications. The most likely first applications are probably telepresence and 3D gaming in your living room space. It’s hard to think that there is no market for this.

I know that ISPs are not ready for households to actually use the speeds they have been peddling to them. There is no ISP network anywhere, including fiber networks, that wouldn’t quickly bog down and die if a bunch of subscribers started streaming at fast speeds between 100 Mbps and a gigabit. ISP networks are designed around the concept of oversubscription – meaning that customers don’t all use their full bandwidth at the same time. The normal parameters for oversubscription are already changing due to the proliferation of VPN connections made for working and schooling from home – ISPs must accommodate large chunks of bandwidth that are in constant use and that can’t be shared with other customers. Home VPN connections have paralyzed DSL networks, and even fiber network engineers are watching the trend carefully.

I’ve been imagining what will happen to a network if households start streaming at a dedicated symmetrical 100 Mbps instead of connecting to Zoom at 2 Mbps. It wouldn’t take many such customers in any neighborhood to completely tie up network resources.
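
To put some rough numbers on the problem, here’s a minimal sketch of the oversubscription math. The subscriber counts, ratio, and per-stream speeds are illustrative assumptions, not any real ISP’s engineering numbers:

```python
# A minimal sketch of the oversubscription math described above. The
# subscriber counts, ratio, and per-stream speeds are illustrative
# assumptions, not any real ISP's engineering numbers.

def required_uplink_mbps(subscribers, plan_mbps, oversub_ratio,
                         dedicated_streams=0, stream_mbps=0.0):
    """Estimate the uplink capacity needed for a shared node.

    Shared traffic is divided by the oversubscription ratio, while
    dedicated streams (VPNs, constant video feeds) can't be shared
    and are added at their full rate.
    """
    shared = subscribers * plan_mbps / oversub_ratio
    dedicated = dedicated_streams * stream_mbps
    return shared + dedicated

# 200 homes on 100 Mbps plans at an assumed 50:1 oversubscription:
print(required_uplink_mbps(200, 100, 50))           # 400 Mbps
# The same node if just 20 homes stream a dedicated 100 Mbps each:
print(required_uplink_mbps(200, 100, 50, 20, 100))  # 2400 Mbps
```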

I will be shocked if there aren’t entrepreneurs already dreaming up gaming and telepresence applications that take advantage of the 10% market share for gigabit broadband. Looking back at the past, new technology phenomena seem to hit almost overnight. It’s not hard to imagine a craze where a million gigabit homes are playing live 3D games in the living room. When that finally happens, ISPs are going to be taken by surprise, and not in a good way. We’ll see the instant introduction of data caps to stop customers from using broadband. But we’ll also see ISPs beefing up networks – they’ll have no choice.

The New Broadband Monopolies

I’ve been working in several communities lately where AT&T is the incumbent telephone company. AT&T stopped installing DSL in October 2020, so in my mind, AT&T can no longer be considered a broadband option. AT&T won’t connect a new customer, which means that if somebody suddenly needs to work or attend school from home, AT&T won’t connect them. I’ve heard stories of people who say that if DSL is disconnected for any reason, the company won’t reconnect it. If somebody buys a house already served by AT&T, the new owner can’t get DSL. If somebody gets disconnected for late payment, they aren’t being allowed to reconnect. I’ve heard stories lately of customers who had technical trouble with DSL and were told that the company can’t and won’t fix it.

In a city where AT&T is the incumbent telco and where the only other broadband option is from a cable company, we are witnessing the reemergence of a pure monopoly situation. This is the case in many smaller towns and county seats throughout the country. We have a lot of experience knowing how monopolies act – and it’s just a matter of time until the cable companies in these markets start acting like pure monopolies. It’s been less than a year, but over time we can expect the following behavior from the cable companies:

Higher Prices. Monopolies raise rates because there is nothing to stop them from doing so. The big cable companies all set rates at a national level. We can already see that Comcast and Charter are on the path towards $100 basic broadband, and I think the smaller cable companies are following their lead.

But the price increases in a monopoly market are more subtle than the basic rates being increased. Cable companies have offered special rates in the past to lure customers from DSL. Over time these introductory rates will disappear because where AT&T stopped competing, the cable companies know they will eventually get the rest of the customers without incentives. Cable companies also offer bundling discounts, and over time those discounts will shrink and disappear in monopoly towns. Finally, in a competitive environment, cable companies are open to negotiating with customers who threaten to leave. But in a monopoly town, there is no reason to negotiate – customers can’t leave. These various discounts will continue for a while, but as it finally sinks in on the cable companies that they are in a monopoly position, they will pull back from giving discounts. If no other ISP ever comes to a monopoly market, then in the long run, everybody in that town will pay the full list prices for broadband, telephone, and cable TV.

Degraded Maintenance. This won’t happen overnight, but it’s inevitable due to the way that big ISPs operate. There is always pressure from corporate for local managers to cut costs. Cable companies don’t conspire at the corporate level to be poor ISPs, but they have budgetary practices and bonus structures that reward local employees for cutting corners and costs. One of the easiest things to eliminate in a monopoly environment is maintenance. The local managers for the cable companies likely will not eliminate jobs but rather fail to fill vacant positions over time. We’ve seen this in practice for decades from the big telcos, which have eliminated a huge percentage of technicians over the last few decades. In markets where the cable companies become monopolies, this is bound to happen.

Little Innovation or Upgrades. Cable technology has been improving rapidly over the last decade. Most markets have been upgraded now for download bandwidth to the DOCSIS 3.1 standard. From what I can see, most systems have stuck with the older DOCSIS 3.0 standard for upload speeds. Within a few years, cable companies will have the option to upgrade again to DOCSIS 4.0 – an upgrade that will allow for symmetrical gigabit broadband products to compete against fiber.

Cable companies have no motivation to upgrade in monopoly markets. They may end up doing so over time as part of a corporate-wide effort to have the same technology everywhere. Cable companies will always upgrade competitive markets before non-competitive ones. If they can get by with half-measures to save money, they’ll do so in monopoly towns.

It’s hard to imagine that cable companies aren’t already having these discussions in markets abandoned by AT&T. Those are the markets where the cable monopolies will maximize the bottom line.

Amazon Sharing Your Broadband

Amazon launched its Sidewalk network in June. This is a local network established between Amazon devices in your home and around your neighborhood. It connects devices like Echo smart speakers and Ring cameras, and it will also communicate with Tile devices used to keep track of things like keys and pets.

The network does not use your home WiFi, but instead establishes a new network using a combination of Bluetooth and 900 MHz LoRa signals. While Amazon won’t comment on future plans for the network, it would be a natural way for Amazon to create a smart home network that is separate from WiFi.

Amazon automatically enabled newer Echo and Ring devices to act as bridges in the network. The network can then connect to any other smart home device that has the ability to communicate with the bridges. This doesn’t work with the first few generations of Amazon Echo devices but comes built into fourth-generation Echo and Dot devices. Specifically, the network enables the following kinds of connections:

  • Localized Bluetooth LE connections between the Sidewalk bridges and Sidewalk-enabled devices in your home;
  • Long-range Bluetooth LE and 900MHz connections between your Sidewalk bridges and Sidewalk-enabled devices outside of your home, including other people’s devices;
  • Long-range Bluetooth LE and 900MHz connections between your Sidewalk-enabled devices and Sidewalk bridges outside the home, including other people’s bridges.

The bridges on the network communicate with the Amazon cloud using your home broadband connection. Amazon says that it is limiting an individual connection to no more than 80 Kbps and is capping total data usage from any home at 500 MB of data per month.
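
A little arithmetic shows what those limits mean in practice. This sketch uses only the 80 Kbps and 500 MB figures Amazon has published:

```python
# Back-of-the-envelope math using the two published Sidewalk limits:
# an 80 Kbps per-connection ceiling and a 500 MB monthly cap per home.

KBPS_LIMIT = 80       # kilobits per second per connection
MONTHLY_CAP_MB = 500  # megabytes per month per home

# How long could one connection run flat-out before hitting the cap?
cap_bits = MONTHLY_CAP_MB * 8e6
hours_at_limit = cap_bits / (KBPS_LIMIT * 1e3) / 3600
print(f"Hours at the full 80 Kbps before the cap: {hours_at_limit:.0f}")  # ~14

# And the share of a typical 1 TB ISP data cap that 500 MB represents:
print(f"Share of a 1 TB cap: {MONTHLY_CAP_MB / 1e6 * 100:.2f}%")  # 0.05%
```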

If you have one of these devices in your home, this becomes another use of your monthly data. While half a gigabyte may not sound like a lot of data, it is significant to people using cellular hotspots or other plans with small data caps. This will be one more use of data that contributes to total home usage for anybody saddled with a broadband data cap.

In urban areas where there are a lot of such devices, this creates an interesting network. If you drop your keys while jogging or your dog wanders away, they can be located if connected to a Tile locator device. Amazon is touting the Sidewalk network as something that is good for everybody.

Amazon’s real plans have to be more than making Tile devices work better. The giant retailer probably has visions of selling a range of outdoor sensors that will work as long as homeowners or neighbors have a bridge device. In the future, you might buy an external sensor that makes its broadband connection through your neighbor’s Ring camera.

But this is also a bit troublesome. This creates a free mesh network for Amazon. Over time, it’s likely that many devices sold by the company will be capable of communicating with this network. The richest guy on the planet will have created an incredibly valuable network by taking small amounts of data from anybody using an Echo or Ring device. Users are able to disconnect bridges from the Sidewalk network, but devices are enabled automatically.

Amazon says data is safe on the network. Data is supposedly protected by several layers of encryption, and Amazon plans to delete all data every 24 hours.

This reminds me of the WiFi network created by Comcast using home routers. Comcast swears that the network doesn’t use home broadband, but it’s unlikely that somebody would know it if it did. Amazon isn’t making such a claim and is brazenly using a small slice of people’s home broadband for free. Amazon is the only company that could currently pull this off since it has a huge number of Echo and Ring devices already in homes. But there is nothing stopping other smart home device makers from doing something similar and not even telling us about it.

Did the Senate Just Change the Definition of Broadband?

The recently passed Senate infrastructure legislation includes a new definition of an underserved household: a location that lacks access to reliable broadband service with a speed of not less than 100 megabits per second for downloads and 20 megabits per second for uploads, plus a latency sufficient to support real-time, interactive applications. It’s hard to see this as anything other than a new definition of broadband.

Before jumping completely off this cliff, it’s worth revisiting the history of the federal definition of broadband. The FCC is required to establish a definition of broadband. Congress established this obligation in Section 706 of the Telecommunications Act of 1996, which requires the agency to annually evaluate broadband availability in the country. The FCC must then report the state of broadband to Congress every year using the established definition. In these reports, the FCC compiles data about broadband speeds and availability and offers an opinion on the state of broadband in the country. Further, the FCC must act if broadband is not being deployed in a timely manner – but no FCC to date has concluded that broadband deployment is falling short.

In 2015, the FCC established the current definition of broadband as 25/3 Mbps (that’s 25 Mbps download and 3 Mbps upload). Prior to 2015, the FCC definition of broadband was 4/1 Mbps, set a decade earlier. The FCC didn’t use empirical evidence like speed tests in setting the definition of broadband in 2015. They instead conducted what is best described as a thought experiment. They listed the sorts of functions that a “typical” family of four was likely to engage in and then determined that a 25/3 Mbps broadband connection was enough speed to satisfy the broadband needs of a typical family of four.
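
As a rough illustration of that thought experiment, here’s a minimal sketch that adds up the concurrent demands of a hypothetical family of four. The application list and per-use speeds are my own illustrative guesses, not the FCC’s actual worksheet:

```python
# A minimal sketch of the FCC-style thought experiment: add up the
# concurrent demands of a hypothetical family of four. The
# application list and per-use speeds are illustrative guesses, not
# the FCC's actual worksheet.

household_uses_mbps = {
    "HD video stream": 5.0,
    "second video stream": 5.0,
    "video call": 2.5,
    "online gaming": 4.0,
    "web browsing and email": 2.0,
    "background updates and cloud sync": 3.0,
}

peak_demand = sum(household_uses_mbps.values())
print(f"Concurrent demand: {peak_demand} Mbps")  # 21.5 Mbps

# Under these assumptions, a 25 Mbps connection just clears the bar,
# which is how a definition like 25/3 gets justified.
print("Fits within 25/3:", peak_demand <= 25)
```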

The FCC asked the question again in 2018 and 2020 of whether 25/3 Mbps was still an adequate definition of broadband. Both times the Commission concluded that 25/3 Mbps was still reasonable and took no action, despite comments filed by numerous parties arguing that the definition of broadband should be increased.

Unfortunately, as happens with many regulatory requirements, the FCC has not been an honest broker in looking at the definition of broadband. There are political consequences for any FCC that increases the definition of broadband, because doing so means declaring that millions of households suddenly don’t have adequate broadband. If the FCC changes the definition of broadband from 25/3 to 100/20 Mbps, then every home with speeds between 25/3 and 100/20 Mbps would suddenly be considered to not have adequate broadband. No FCC wants to be the one that increases the number of homes without broadband.

All of this is politics, of course, and homes and businesses know if broadband is adequate without the FCC setting some arbitrary speed as magically being broadband. Is the home that gets 27 Mbps all that different than one that’s getting 23 Mbps? Unfortunately, when it comes to being eligible for federal grant monies it matters.

I think there is a good argument to be made that the Senate just preempted the FCC in setting the definition of broadband. Declaring that every home or business with speeds less than 100/20 Mbps is underserved is clearly just another way to say that speeds under 100/20 Mbps are not good broadband.

Of course, the FCC could continue to use 25/3 Mbps as the definition of broadband for the purposes of the annual report to Congress. But Congress just changed the definition of broadband that matters – the one that comes with money. If the infrastructure legislation becomes law, it will allow states to use the huge $42.45 billion of federal funding to upgrade broadband in places with speeds under 100/20 Mbps.

Of course, the Senate legislation has not yet been enacted; the House of Representatives still needs to pass its own infrastructure legislation, and then the two versions of the law must be reconciled before going into effect. But the Senate legislative language couldn’t be clearer – 25/3 Mbps is no longer the definition of broadband that matters.

Charter Foresees the End of Traditional Cable

An article by Ben Munson in FierceVideo quotes Chris Winfrey, the CFO of Charter, as saying that the breaking point for traditional cable TV is approaching. While Charter is still making money on selling video, all of the industry trends make it hard to think that traditional TV has a lot of legs.

Traditional video has a lot of competition. The most obvious are paid online video services like Netflix, YouTube TV, Hulu, Sling TV, and others. But Winfrey cites a big threat from free programming. There are numerous free ad-supported programming sources like IMDb TV, Crackle, Vudu, Tubi TV, and Pluto TV that offer mountains of programming. Winfrey also cites the millions of people who watch for free by using the passwords of a paid subscriber to online content.

Another huge factor is the ever-increasing cost of buying programming. Winfrey says that Charter has absorbed some of the programming costs in recent years but says that in the future, he sees no alternative to passing the programming increases to customers. Price is already a huge factor in driving customers from traditional cable TV, and continued price increases are bound to drive many more millions from the paid subscriptions.

A final issue cited in the article is the high price of carrying local network stations. The fees to carry ABC, CBS, FOX, and NBC have climbed steadily and are a huge factor in the cost of cable packages.

Winfrey thinks the only hope for companies like Charter to keep offering a cable product is to allow cable companies to package programming in different and creative ways – something the programmers don’t seem interested in considering, because it likely means trimming channels.

This is an industry segment in crisis. The largest dozen traditional cable providers, including the satellite TV companies, lost almost 6 million customers in 2020 – and the rate of loss looks to be accelerating.

Many smaller ISPs have either dropped cable entirely or are strongly considering it. Other smaller ISPs are now buying alternatives like smaller packages of programming from bundlers of content. Windstream started bundling last year with online providers like Sling TV.

My firm does surveys, and we’ve seen a huge uptick in the last two years of homes that have gone back to rabbit ears to receive local content for free. As recently as three or four years ago, we’d sometimes do a survey and find practically nobody watching over-the-air.

This trend is not a surprise and we’ve been watching the wheels fall off the traditional cable industry for years. But as rates keep climbing, and as more free alternatives appear, it seems likely that cord cutting will continue to the point where some big cable companies will throw up their hands and consider alternatives.

It’s hard to picture any big ISPs with millions of traditional cable customers a decade from now. At some point, I guess we’re going to have to stop using the label of cable company for Comcast, Charter, and others.