
Shutting Down Obsolete Technologies

There was an interesting statement during the recent Verizon first quarter earnings report call. The company admitted that shutting down the 3G cellular networks cost it about 1.1 million retail cellular customers along with the corresponding revenues.

This was long expected because there are still a lot of places where 3G was the only cellular signal available to customers in remote areas. There were also a lot of people still happy with 3G flip phones even where 4G was available. Some of these customers will likely come back with 4G phones, but many might be angry with Verizon for cutting them off and go elsewhere.

Verizon has been trying to shut down the 3G network for at least five years. Its original plans got delayed due to discussions with the FCC and then got further delayed because of the pandemic – it didn’t seem like a good idea to cut folks dead when cellular stores were shuttered.

This change was inevitable. The bandwidth that can be delivered on a 3G network is tiny – most of you remember when you used 3G and a flip phone to check the weather and sports scores. Cellular carriers want to repurpose the spectrum used for 3G to support 4G and 5G. Technologies become obsolete and have to be upgraded or replaced. The 3G changeover is particularly abrupt because the only way to make it is to cut the 3G signal dead, at which point 3G phones become bricks.

All of the technologies used for broadband and telecom eventually become obsolete. I remember when we used ISDN to deliver 128 Kbps broadband to businesses. I remember working with N-carrier and other technologies for creating data connections between central offices. Telephone switches took up a big room instead of being housed inside a small computer. The earlier versions of DOCSIS technology were largely abandoned and replaced with newer ones. BPON became GPON and is now becoming XGS-PON.

Most transitions to new technologies are phased in over time. You might be surprised that there are still working ISDN lines chugging along that are being used to monitor remote sensors. There are still tiny rural cable companies operating the early versions of DOCSIS. But the industry inevitably replaces ancient technology in the same way that none of you are reading this blog on an IBM 5150 or a Macintosh 128k.

But some upgrades are painful. Some folks lost cellular coverage entirely when 3G was cut dead because they live in places that can’t receive the 4G replacement signal. A 3G phone needed only a tiny amount of bandwidth to operate – at signal levels that a newer phone would perceive as far under one bar of service.

The other painful technology replacement that keeps getting press is the big telcos killing off the copper networks. When copper is cut off in an area, the traditional copper landlines and DSL go dead. In some cases, customers are offered a move to a fiber network. The price might be higher, but those customers at least get a good permanent technology replacement. But not every DSL customer in a city that loses copper service is offered a fiber alternative, and those customers will likely find themselves paying $30 or $40 more to move to the cable company.

In rural areas, the telcos often offer to move customers to wireless. For a customer that lives within a decent distance from a cell tower, this should be an upgrade. Fixed wireless delivered for only a few miles should be faster than rural DSL. But like all wireless technologies, there is a distance limitation around any given tower, and the FWA signal isn’t going to work for everybody. Some customers that lose rural copper are left with whatever alternatives are available – because the telephone company is basically abandoning them. In many rural areas, the broadband alternatives are dreadful – which is why many were sticking with slow rural DSL.

I hear a lot of complaints from folks who lose traditional copper and are upset that they can no longer use devices that rely on copper technology, such as fax machines and medical monitors. It may sound uncaring, but these folks need to buy something newer that works with today’s broadband. Those are the kinds of changes that come with every technology upgrade. Just like you can’t take your old Macintosh to get fixed at Best Buy, you can’t keep using a technology that nobody supports. That’s an inevitable result of technology getting better over time. This is no comfort to the farmer who just lost his 3G cell coverage – but there is no way to keep older technology operating forever.


Should DSL Cost Less Than Fiber?

As I was going through my pile of unread articles, I found an article from the Associated Press that asked how big ISPs can get away with charging the same prices in urban areas for both slow and fast broadband. The article was about Shirley Neville, in New Orleans, who found that she was paying the same price for 1 Mbps DSL from AT&T as other city residents are paying for a fiber connection.

It’s a great question, and I was surprised that I hadn’t thought to write about it before. I investigate broadband prices around the country, and it’s not unusual to find the price for fiber broadband in a city set close to the price charged for DSL.

It would be easy to justify charging the same price for both technologies if AT&T was in the process of converting everybody in New Orleans to fiber. In fact, if that was the reason, I’d be impressed that AT&T wasn’t charging more for the technology upgrade. But this is not the situation. It’s clear that the AT&T fiber business plan is to build fiber to small pockets of cities, but not everywhere. The chances are high that Shirley Neville’s neighborhood and many others will not be getting fiber soon from AT&T, if ever. For every neighborhood that gets fiber, there will be many that will never see AT&T fiber.

Another possibility is that AT&T’s low price for a fiber connection is an introductory price to lure people to switch from Cox, the cable company. Perhaps when the introductory price expires, the fiber price will be higher than DSL. But this still doesn’t feel like a great answer to Shirley’s question, since it means AT&T is willing to give fiber customers a big break while giving DSL customers none.

The most likely answer to the question is the ugliest. AT&T doesn’t feel like it needs to reduce the price of DSL in the city because DSL customers are a captive audience. Cox has some of the highest broadband prices in the country, and that gives cover for AT&T to charge whatever it wants for DSL as long as the price is lower than Cox.

Another reason that AT&T can charge the same for DSL and fiber is that there isn’t anybody to tell the company that it shouldn’t do so. The FCC eliminated broadband regulation and the Louisiana Public Service Commission doesn’t assert any authority over broadband prices. Folks like Shirley Neville don’t have anybody looking out for them, and the big ISPs can overcharge customers with impunity.

As the article points out, Shirley’s question is germane today because of the FCC’s investigation of digital discrimination. The article cites an investigation by The Markup, which analyzed over 800,000 broadband offerings from AT&T, Verizon, Earthlink, and CenturyLink in 38 cities across America and found that the four ISPs regularly offer broadband speeds at 200 Mbps or faster at the same price as broadband with speeds under 25 Mbps.

The Markup analysis shows that the neighborhoods with the worst speed options have lower median household incomes in 90% of the cities studied. Where The Markup could gather the data, it also looks like the big ISPs offered the worst deals to the least-white neighborhoods.

USTelecom responded to the issue by stating that the high cost of maintaining old copper networks justifies high prices for DSL. The article cites Marie Johnson of USTelecom writing that “Fiber can be hundreds of times faster than legacy broadband—but that doesn’t mean that legacy networks cost hundreds of times less. Operating and maintaining legacy technologies can be more expensive, especially as legacy network components are discontinued by equipment manufacturers”.

That’s exactly the response I would expect to defend monopoly pricing. Nobody expects the price of DSL to be hundreds of times less than fiber – but DSL should cost less. The big telcos have argued for decades that it costs too much to maintain copper networks. But they never finish that statement by telling us how much money they have collected over the years from a customer like Shirley Neville – possibly hundreds of times more than the cost of her share of the network.


A Last Gasp at Regulating Copper

The Minnesota Public Utilities Commission recently ordered a series of public hearings to investigate the quality of service on the CenturyLink copper networks. The hearings were prompted by a complaint filed by the Communications Workers of America (CWA). The complaint listed the failures of CenturyLink to meet state service standards due to the deterioration of the copper network. CWA also noted that CenturyLink is planning to eliminate half of the remaining technicians who work on copper.

Similar inquiries by other state regulators have been instituted in the last few years against CenturyLink and Frontier. I feel sorry for any customers left on deteriorating copper networks, but proceedings like this one feel like the last gasp of regulators trying to score points by beating up on the telcos that still operate copper networks.

Not that CenturyLink doesn’t deserve a lot of criticism. Its copper networks are in dreadful condition and are in the process of dying. The poor condition of the networks is due in large part to the decades-long lack of maintenance and repairs. We know this is the case because copper networks of a similar age are still operating much better in Europe. The big telcos like CenturyLink, Frontier, Verizon, and AT&T stopped caring about copper networks back in the 1990s, and the networks have been in a steady decline since then.

But U.S. copper networks are truly near the end of life. It’s impossible to neglect maintenance for over twenty years and somehow suddenly make the networks perform better. It’s hard to see any purpose for regional hearings on the topic other than letting people vent their frustration with CenturyLink, and it’s hard to imagine anything changing as a result of these hearings that will improve service. There might be new fines levied on CenturyLink, but paying fines is less costly for the company than trying to make the copper work.

Some big telcos are working to convert copper networks to fiber. Frontier and Windstream are building a lot of fiber – and I assume they are overlashing the new fiber wires on the old copper. AT&T and Verizon are selectively expanding fiber in neighborhoods where the cost of construction meets some internally set cost test – but these two companies are quietly moving most copper customers onto cellular connections.

CenturyLink has been up and down on the decision to overbuild residential fiber. It currently looks like the company is only building ‘strategic’ fiber, which I interpret to mean business districts and large apartment complexes. It seems unlikely that CenturyLink will overbuild much more of its residential copper in Minnesota or elsewhere with fiber.

I would bet that if CenturyLink could wave a magic wand and be rid of copper, it would do so. It’s harder each year to maintain copper networks, and a move to eliminate half of the remaining copper technicians shows that the company is finally throwing in the towel. But giving up on copper still means walking away from a lot of revenue.

There are still plenty of customers who want to keep using the copper networks. Say what you want about the inadequacies of DSL, but in most urban markets where my firm does surveys, we still find that 10% to 20% of households are using DSL. These are households for whom price is more important than broadband speed.

CenturyLink and the other big telcos have recaptured the cost of the copper networks many times over and decided many years ago not to reinvest profits back into new and upgraded networks. We’re now reduced to watching the last death throes of copper networks, and it’s not pretty.


Only Twenty Years

I’ve written several blogs arguing that we should award broadband grants based on future-looking broadband demand. I think it is bad policy to provide federal grant funding for any technology that delivers speeds slower than those already available to most broadband customers in the country.

The BEAD grants currently use a 100/20 Mbps definition to identify households that aren’t considered to have broadband today. But inexplicably, the grants then allow winners to build technologies that deliver that same 100/20 Mbps speed. The policymakers who designed the grants would allow federal funding to go to a new network that, by definition, sits exactly at the dividing line between served and unserved today. That is a bad policy for so many reasons that I don’t even know where to begin lambasting it.

One way to demonstrate the shortsightedness of that decision is a history lesson. Almost everybody in the industry tosses out the statistic that a fiber network built today should be good for at least thirty years. I think that number is incredibly low and that modern fiber ought to easily last twice that long. But for the sake of argument, let’s accept a thirty-year life for fiber.

Just over twenty years ago, I lived inside the D.C. Beltway, and I was able to buy 1 Mbps DSL from Verizon or from a Comcast cable modem. I remember a lot of discussion at the time that there wouldn’t be a need for upgrades in broadband speeds for a while. The 1 Mbps speed from the telco and cable company was an 18-times increase in speed over dial-up, and that seemed to provide a future-proof cushion against homes needing more broadband. That conclusion was quickly shattered when AOL and other online content providers took advantage of the faster broadband speeds to flood the Internet with picture files that used all of the speed. It took only a few years for 1 Mbps to feel slow.

By 2004, I changed to a 6 Mbps download offering from Comcast – they never mentioned the upload speed. This was a great upgrade over the 1 Mbps DSL. Verizon made a huge leap forward in 2004 and introduced Verizon FiOS on fiber. That product didn’t make it to my neighborhood until 2006, at which time I bought a 30 Mbps symmetrical connection on fiber. In 2006 I was buying broadband that was thirty times faster than my DSL from 2000. Over time, the two ISPs got into a speed battle. Comcast had numerous upgrades that increased speeds to 12 Mbps, then 30 Mbps, 60 Mbps, 100 Mbps, 200 Mbps, and most recently 1.2 Gbps. Verizon always stayed a little ahead of cable download speeds and continued to offer much faster upload speeds.

The explosion of broadband demand after the introduction of new technology should be a lesson for us. An 18-times speed increase from dial-up to DSL seemed like a huge technology leap, but public demand for faster broadband quickly swamped that upgrade, and 1 Mbps DSL felt obsolete almost as soon as it was deployed. Every time there has been a technology upgrade, the public has found a way to use the greater capacity.

In 2010, Google rocked the Internet world by announcing gigabit speeds. That was a 33-times increase over the 30 Mbps download speeds offered at the time by the cable companies. The cable companies and telcos said nobody needed speeds that fast and that it was a marketing gimmick (but they all went furiously to work to match the faster fiber speeds).
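
As a quick sanity check on those multiples, here’s a back-of-the-envelope sketch in Python using the speeds quoted in this post, with dial-up taken as a 56 kbps modem:

```python
# Speed multiples for the progression described in this post.
# All speeds in Mbps; dial-up is taken as a 56 kbps modem.
milestones = [
    ("dial-up", 0.056),
    ("1 Mbps DSL, circa 2000", 1),
    ("6 Mbps cable, 2004", 6),
    ("30 Mbps FiOS, 2006", 30),
    ("1 Gbps Google Fiber, 2010", 1000),
]

for (prev_name, prev_speed), (name, speed) in zip(milestones, milestones[1:]):
    print(f"{name}: {speed / prev_speed:.0f}x faster than {prev_name}")
# dial-up -> DSL is the ~18x jump; 30 Mbps -> gigabit is the ~33x jump.
```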

I know homes and businesses today that are using most of the gigabit capacity. That is still a relatively small percentage of homes, but the number is growing. Over twenty years, the broadband use by the average home has skyrocketed, and the average U.S. home now uses almost 600 gigabytes of broadband per month – a number that would have been unthinkable in the early 2000s.
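
To put 600 gigabytes a month in context, here’s a minimal sketch that converts the monthly total into a sustained average data rate. The 30-day month and decimal gigabytes are my own simplifying assumptions:

```python
# Convert 600 GB/month of usage into an equivalent sustained data rate.
# Assumes a 30-day month and decimal gigabytes (1 GB = 1e9 bytes).
monthly_bytes = 600e9
seconds_per_month = 30 * 24 * 3600

avg_mbps = monthly_bytes * 8 / seconds_per_month / 1e6
print(f"Average sustained rate: {avg_mbps:.2f} Mbps, around the clock")
# ~1.85 Mbps 24/7 -- more than a 2000-era 1 Mbps DSL line could carry
# even at full, continuous utilization.
```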

I look at this history, and I marvel that anybody would think that it’s wise to use federal funds to build a 100/20 Mbps network today. Already today, something like 80% of homes in the country can buy a gigabit broadband product. The latest OpenVault report says that over a quarter of homes are already subscribing to gigabit speeds. Why would we contemplate using federal grants to build a network with a tenth of the download capacity that is already available to most American homes today?

The answer is obvious. Choosing the technologies that are eligible for grant funding is a political decision, not a technical or economic one. There are vocal constituencies that want some of the federal grant money, and they have obviously convinced the folks who wrote the grant rules that they should have that chance. The biggest constituency lobbying for 100/20 Mbps was the cable companies, which feared that grants could be used to compete against their slow upload speeds. But just as cable companies responded to Verizon FiOS and Google Fiber, the cable companies are now planning for a huge leap upward in upload speeds. WISPs and Starlink also lobbied for the 100/20 Mbps grant threshold, although most WISPs seeking grant funding are now also claiming much faster speed capabilities.

If we learn anything from looking back twenty years, it’s that broadband demand will continue to grow, and that homes in twenty years will use an immensely greater amount of broadband than today. I can only groan and moan that the federal rules allow grants to be awarded to technologies that can deliver only 100/20 Mbps. But I hope that state Broadband Grant offices will ignore that measly, obsolete, politically-absurd option and only award grant funding to networks that might still be serving folks in twenty years.


AT&T in the News

AT&T has not been in the headlines a lot this year, but recently I’ve seen the company’s name everywhere.

In the recently released financial results for the third quarter, AT&T noted that it now has more fiber broadband customers than non-fiber customers. At the end of the quarter, AT&T had 6.93 million fiber customers compared to 6.86 million remaining non-fiber customers. Non-fiber customers are predominantly U-Verse customers served by two pairs of telephone copper. The company also still has 340,000 DSL customers served by a single copper pair. There are also some rural fixed-wireless customers.

In the third quarter, AT&T added 338,000 fiber customers. The company lost 367,000 non-fiber customers in the same quarter – although counting them as lost is probably a misnomer since many were likely upgraded to fiber.

Upgrading to fiber is good for the company’s bottom line. For the quarter, the average revenue per user (ARPU) was $62.62 for fiber customers compared to only $54.60 for non-fiber customers. AT&T has also been saying for years that the cost of maintenance for copper is a lot higher, so the company is likely shedding costs as it sheds customers served on copper.

We also got a peek at AT&T’s market penetration. AT&T says it passes 18.5 million potential customers with fiber, meaning the company has achieved an overall 37% market penetration on fiber. In the third quarter, the company added fiber to pass 500,000 new locations.
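
The penetration figure checks out from the numbers above – a quick sketch:

```python
# Reproduce the fiber penetration math from the figures cited above.
fiber_customers = 6.93e6   # fiber broadband subscribers at end of Q3
fiber_passings = 18.5e6    # potential customers passed with fiber

print(f"Fiber penetration: {fiber_customers / fiber_passings:.0%}")  # ~37%
```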

I saw another interesting news blurb about AT&T. Bloomberg reported that AT&T is looking for an equity partner to invest in a major expansion of fiber. That would be a big departure from the past since AT&T has always funded its own capital expenditures and networks.

But it’s not hard to see from the third quarter results why AT&T might be seeking additional funding. In the third quarter, the company generated $9.87 billion of cash. It invested $4.71 billion in new infrastructure and paid $3.75 billion in dividends – leaving only $1.41 billion in free cash.
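
Here is that cash math laid out, using the quarterly figures cited above (all in billions):

```python
# The quarter's cash math, in billions of dollars, from the figures above.
cash_generated = 9.87
capex = 4.71          # investment in new infrastructure
dividends = 3.75

free_cash = cash_generated - capex - dividends
print(f"Free cash for the quarter: ${free_cash:.2f}B")  # $1.41B
```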

I would conjecture that AT&T wants to invest more heavily in fiber immediately since there is a mad rush nationwide to build fiber in cities. Fiber overbuilders hope that being first to a market with fiber might dissuade other overbuilders – so we are currently seeing a fiber land grab. In the long run, sharing fiber profits with an investor will decrease future AT&T earnings. The bet the company is making is that the market share gained by building first to markets outweighs the cost of sharing profits.

AT&T is also currently debt-heavy and doesn’t have a recent track record of making good investment decisions. It’s been reported that AT&T lost as much as $50 billion on its purchase of DirecTV, and in almost the same time frame, the company lost as much as $42 billion on its purchase and sale of WarnerMedia. The company might not be able to easily borrow the money, particularly at current interest rates.

The final news is that AT&T was fined $23 million to resolve a federal investigation into whether the company had “unlawfully influenced” the former Illinois Speaker of the House, Michael J. Madigan. AT&T admits that it paid Madigan, through an ally, to promote legislation that would eliminate the carrier-of-last-resort obligation in the state – the obligation to serve anybody who asks for a telephone line. That obligation also comes with legacy regulatory requirements that AT&T wanted to ditch.

What always dismays me, but never surprises me, is that nobody at a big company like AT&T got in trouble for breaking the law – in this case, bribing a government official. The size of the fine might be appropriate for the magnitude of the crime, but I’ve always thought that the folks at big companies would be more hesitant to act unethically if they saw others going to jail for breaking the law. The only real consequence for AT&T, in this case, is that they got caught, and the fine will just be viewed as the cost of doing business.


The Birth of the Digital Divide

A lot of the money being spent on broadband infrastructure today is trying to solve the digital divide, which I define as a technology gap where good broadband is available in some places, but not everywhere. The technology divide can be as large as an entire county that doesn’t have broadband or as small as a pocket of homes or apartment buildings in cities that got bypassed.

I can clearly remember when the digital divide came about, and at that time I remember discussing how the obvious differences between technologies were going to someday become a major problem. Today I’m going to revisit the birth of the digital divide.

Until late in the 1990s, the only way for most people to get onto the Internet was dial-up access through phone lines. ISPs like AOL, CompuServe, and MSN flourished and drew millions of people online. At first, dial-up was only available to people who lived in places where an ISP had established local dial-up telephone numbers. But the online phenomenon was so popular that ISPs eventually offered 800 numbers that could be reached from anywhere. There was no residential digital divide, except perhaps in places where telephone quality wasn’t good enough to accommodate dial-up. Some businesses connected to the Internet using a T1, which had a blazingly fast speed of 1.544 Mbps, almost 30 times faster than dial-up. To people connecting at 56 kbps, a T1 sounded like nirvana.
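
The T1 comparison works out as follows – a small sketch, using the standard 1.544 Mbps T1 payload rate and a 56 kbps modem:

```python
# The T1-versus-dial-up comparison above, worked out.
t1_mbps = 1.544       # standard T1 payload rate
dialup_mbps = 0.056   # 56 kbps modem

print(f"A T1 is {t1_mbps / dialup_mbps:.0f}x faster than dial-up")  # ~28x
```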

The digital divide came into being when the faster technologies of DSL and cable modem were offered to homes. My first DSL line had a download speed of almost 1 Mbps, an amazing 18 times increase in speed over the dial-up modem. At almost the same time, some cable companies began offering cable broadband that also had a speed of around 1 Mbps. Homes in urban areas had a choice of two nearly-identical broadband products, and the early competition between telephone and cable companies was loud and fierce.

The advent of DSL created the first digital divide – the gulf between urban areas and rural areas. While telcos theoretically offered DSL in much of rural America, the 2-mile limitation of the DSL signal meant the speed didn’t carry far outside of the towns that housed the DSL transmitters, called DSLAMs. Many telcos were willing to sell rural DSL, even if speeds were often barely faster than dial-up. Soon after the first DSL was offered to customers, the vendors came up with ISDN-DSL that could deliver a speed up to 128 kbps deeper into rural copper networks – twice the speed of dial-up. But decent DSL never made it very far into most of rural America – and still doesn’t today for much of rural America.

The DSL and cable modem technologies improved within a few years after introduction, and the technology improvements created the second digital divide. I recall versions of DSL that had a maximum speed of 3, 6, 12, 15, 24, and eventually 48 Mbps. The big telcos upgraded to later DSL technology in some neighborhoods, but not others. Sadly, even today we continue to find places where the earliest versions of DSL are still offered, meaning there are places where DSL speeds never climbed above 3, 6, or 12 Mbps. That was particularly painful in towns that didn’t have a cable competitor and were stuck with whatever flavor of DSL the telephone company offered. It was also noticeable in big cities where some neighborhoods never saw any DSL upgrades. There was a well-known study done a number of years ago documenting the DSL technologies available in Dallas, Texas. The study showed that poor neighborhoods still had the slowest versions of DSL while more affluent neighborhoods had DSL speeds up to 50 Mbps.

Cable modem technology improved more quickly than DSL, and by 2005 the cable modem had won the speed game. That’s when the cable companies started charging more for cable broadband – something they could do because the broadband was faster. This price difference largely meant that low-income households were stuck with DSL, while folks who cared about speed migrated over the years to the cable companies.

The digital divide in rural areas deepened as older DSL was not upgraded while the DSL that had originally been deployed started to reach end-of-life. Copper networks have lasted far past the expected economic useful life and get a little worse every year. In cities, any parts of the city stuck with only DSL fell far behind the neighborhoods where speeds increased significantly from both DSL and cable modems.

Unfortunately, we are not at the end of this story. There is a huge amount of fiber being constructed today in urban areas. But there is no reason to think that most of the ISPs building fiber are going to serve every neighborhood. The big telcos that build fiber like Verizon, AT&T, Frontier, CenturyLink, and others have always cherry-picked what they think are the best neighborhoods – best in terms of either demographics or cost of deployment.

Unless we reach a time when fiber is everywhere, the digital divide will stick around. Right now, we’re tackling the rural digital divide – I expect in 5 or 10 years we’ll have to do this all over again to tackle the urban digital divide.


AT&T to Chop Copper Networks

In a pronouncement that is news to nobody, AT&T announced at a recent investor day event that it has plans to cut its copper network footprint in half by 2025. This can’t be a surprise from a company that stopped connecting new DSL customers in October 2020. I figured we could start the countdown clock on copper from that date.

However, Jeff McElfresh, the CEO of AT&T’s Communications division, said something that is surprising. He said the company isn’t planning to forcibly move customers off copper as it decommissions the network – customers are naturally migrating off copper on their own. I find that hard to believe.

My consulting firm administers surveys, and we are still seeing DSL penetration rates in cities between 10% and 40%. Our surveys indicate that the people who are staying with DSL are doing so because of price – they largely hate DSL performance, but it’s what they can afford. This is not hard to understand when looking at the rates for broadband from the big cable companies.

In this blog, I’ve often talked about how expensive broadband is from Comcast and Charter, but broadband rates from some of the other cable companies like Cox and Atlantic Broadband are even higher. There are a lot of homes that can’t afford the cable company prices. It’s hard for me to believe that all of these people are going to voluntarily walk away from DSL over the next two or three years. The last estimate I vaguely remember reading was that there are still something like 19 million households using DSL.

McElfresh said AT&T plans to have 75% of its footprint covered by fiber or fixed cellular wireless by 2025 – I have to assume that, in terms of square miles, this will mostly be wireless. AT&T is going to have a PR problem with trying to push customers to wireless. For rural customers within reach of a tower, a switch from DSL to fixed cellular wireless will be a no-brainer. The broadband speeds will be faster, and the price still affordable. But the big problem in rural markets is that there are huge parts of rural America where fixed wireless won’t work. The rural cellular coverage maps for all three big cellular companies are a joke, and anybody who drives into rural areas can see that you don’t usually have to go far to run out of bars of service. It’s worth noting that cellular voice covers a much larger footprint than cellular data. At some point, AT&T will have to drop rural DSL customers who might have no alternative other than satellite broadband. Extrapolating from McElfresh’s statement, covering 75% of the footprint means that AT&T will be abandoning folks in the other 25%.

Urban areas are a bigger issue for AT&T because that’s where most of the DSL customers remain. It’s clear that AT&T has no goal of overbuilding whole cities with fiber but is building in selected neighborhoods. It’s not clear if those neighborhoods are chosen due to the most affordable construction costs or the best demographics – but AT&T will not be building fiber to cover the majority of its footprint in most cities.

With today’s 4G LTE technology that’s been branded as 5G, AT&T is not prepared to deliver fixed cellular broadband to huge numbers of people in cities. That’s what 5G is supposed to fix, and it’s not here yet. But even when AT&T finally implements real 5G (estimated to be 5 – 7 years in the future), the company would have to install a huge number of small cell sites to have enough broadband capacity to migrate DSL customers to fixed cellular broadband. And that means building more fiber deep into neighborhoods to serve the small cell sites. None of that is happening by 2025, so AT&T must be planning on turning down rural copper markets first.

Perhaps AT&T is really counting on everybody else to pick up its DSL customers. T-Mobile is already aggressively rolling out fixed cellular broadband, and Verizon plans a big push starting in late summer of this year. Dish plans to open 25 major markets with cellular data by June. Smaller wireless players like Starry might be making a dent by 2025.

AT&T is ultimately going to have to force people off DSL. The download speeds on much urban DSL are not dreadful, at 15 – 30 Mbps, although upload speeds are practically nonexistent. I don’t see millions of people voluntarily abandoning the product, so AT&T won’t be able to tear down the copper without a public stir.

But maybe there is another motive behind this – as the technicians who understand DSL keep retiring, AT&T might not be able to keep DSL running by 2025. I know that sounds cynical, but I don’t think it’s far from the truth.


A Brief History of Rural Broadband

Last week I lectured on the topic of digital redlining in Western North Carolina for a senior class at the University of North Carolina at Asheville. These students are tackling a senior project of developing an advocacy program to help find broadband solutions for our region. I realized when describing redlining that I was also describing how the history of rural telephony had contributed to the poor condition of rural broadband today. Following is a brief history of the key events that tell the story of why rural broadband in much of the country has fallen so far behind broadband in larger towns and cities.

Bell Builds to Cities. Within a few short years after the invention of the telephone in 1876, the well-funded American Bell telephone company, founded by Alexander Graham Bell and investors, had built telephone networks in all of the major cities in the country. Within a decade, they made it out to county seats like Asheville, here in Western North Carolina. American Bell had no interest in building to rural areas, and rural America was not offered the new telephone technology.

Expiration of the Bell Patent. In 1894 the Bell patent on the telephone expired, and telephone companies sprang up across the country. Most of these new companies were started in rural areas by farmers, businessmen, or groups of citizens that came together to bring telephone service. By 1927 there were over 6,000 local telephone companies.

Bell Becomes a Monopoly. American Bell became a monopoly in an extraordinary move where Theodore Vail asked the government to grant monopoly status to the company. This was done to fight off competitive telephone companies in cities. Regulation of telephone companies came gradually as states accepted the idea and awarded franchise areas, and a consequence was that small telephone companies also became regulated. The regulation trend culminated in the Communications Act of 1934, which created the FCC.

Birth of Cooperatives. The Rural Electrification Administration, a New Deal agency, had been funding rural electric cooperatives since 1935. In 1949 the agency started making loans to create rural telephone cooperatives, and these companies filled in the remaining rural areas that nobody else had built. By 1960 the U.S. was the envy of the world with a 99% landline telephone penetration.

The Growth of Long-Distance Calling. AT&T was created in 1885 to construct the long-line networks to connect cities. The first coast-to-coast call was made in 1915 between New York and San Francisco. Regulation led to low local telephone rates, and telephone companies made up for lost revenues with expensive long-distance rates. The downside of long-distance for rural America was that calls between rural areas and county seats all became long-distance. Coping with long-distance calling became a major concern for many rural families, and rural residents were at a disadvantage compared to city dwellers since they needed to use long-distance to communicate with basic services.

Massive Consolidation. Companies like General Telephone, Continental Telephone, and Citizens Telephone purchased many rural telcos. These companies were the precursors of big rural telcos like Frontier, CenturyTel, and Windstream.

Divestiture of the Bell Companies. In 1984, divestiture separated AT&T, the long-distance company, from the local Bell telcos, with the goal of introducing competition into long-distance. This worked spectacularly, and long-distance prices tumbled. The unfortunate consequence of lower long-distance rates was that the larger telcos saw lower profits in rural areas. Big telcos began to cut staff, close business offices, and generally neglect rural properties.

Local Competition and Deregulation. The Telecommunications Act of 1996 mandated local telephone competition. Over time, this led to the big telcos seeking and winning local deregulation, and as the big telcos were deregulated, the neglect of big telco rural properties accelerated.

The Rise of DSL Broadband. First-generation DSL that delivered a 1 Mbps download was installed by most telcos. Rural speeds were not as fast as in cities due to the distance limitations of DSL. The technology improved rapidly within a few short years, and the DSL in towns was upgraded to faster speeds while rural properties mostly were not. When cities got DSL speeds up to 50 Mbps, rural DSL stayed slow since the big telcos didn’t want to make rural investments.

Small Companies Did Much Better. The remaining smaller telcos and cooperatives did a great job during all of these industry transitions. They maintained copper wiring, upgraded DSL as needed, and eventually started upgrading in many places to rural fiber.

The Large Cellular Companies Shunned Rural America. Big cellular companies built close to major highways to capture roaming traffic, but rural cellular coverage rarely extended to where people live and work. It’s gotten better over a few decades, but rural cellular coverage maps are still largely fictional.

Summary. The poor state of rural broadband can be traced to the ways that the big telcos reacted to industry changes. Small telcos built rural networks, but large telcos gobbled them up over time. The big rural telcos then neglected rural properties in reaction to the changing economics from the deregulation of long-distance and local telephone service. Small telcos showed that it wasn’t necessary to abandon rural properties, but the big telcos stopped making investments in rural networks and for all practical purposes walked away from rural communities.


No Home Broadband Option

We spend a lot of time arguing policy questions, such as asking if 25/3 Mbps is adequate broadband. What policymakers should really be talking about are the huge numbers of homes with dreadful broadband. The worst thing about the deceptive FCC maps is that they often give the perception that most rural areas have at least some broadband options when many rural residents will tell you they have no real broadband options.

Policymakers don’t grasp the lousy choices in many rural areas. The FCC maps might show the availability of DSL, but if it’s even available (often it’s not), the speeds can be incredibly slow. Rural households refuse to pay for DSL that might deliver only 1 or 2 Mbps download and practically no upload.

I think the FCC assumes that everybody has access to satellite broadband. But I’ve talked to countless rural residents who tried satellite broadband and rejected it. Real speeds are often much slower than advertised speeds since trees and hills can quash a satellite signal. The latency can be crippling, and in places where the speeds are impaired, the high latency means a household will struggle with simple real-time tasks like keeping a connection to a shopping site. Satellite plans also come with tiny data caps. I’d like to put a few Washington DC policymakers on a monthly data plan with a 40 GB or 60 GB cap so they can understand how quickly that is used in a month. But the real killer with satellite broadband is the cost. HughesNet told investors last year that its average revenue per customer was over $93 per month. Many rural homes refuse to pay that much for a broadband product that doesn’t work.
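
To see how quickly a cap like that disappears, here’s a rough sketch; the 5 Mbps HD streaming rate is my own assumption for illustration, not a figure from the carriers:

```python
# Rough sketch: how long a monthly satellite data cap lasts under video
# streaming. The 5 Mbps HD stream rate is an assumption for illustration.
cap_gb = 40
stream_mbps = 5

gb_per_hour = stream_mbps / 8 * 3600 / 1000   # Mbps -> GB per hour
hours = cap_gb / gb_per_hour
print(f"A {cap_gb} GB cap buys ~{hours:.0f} hours of HD video per month")
# ~18 hours -- less than 40 minutes of video a day for a whole household.
```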

We hear a lot of stories about how fixed wireless technology is getting better to the point where we’re hearing preposterous conversations about bringing gigabit fixed wireless to rural areas. There are still a lot of places with woods and hills where fixed wireless is a poor technology choice. I worked with one county recently that gathered thousands of speed tests for fixed wireless that showed average download speeds under 5 Mbps and upload speeds below 1 Mbps. There are still a lot of WISPs that are cramming too many customers on towers, chaining too many towers together with wireless backhaul, and selling to customers who are too far from towers. This is not to say that there aren’t great WISPs, but in too many rural places the fixed wireless choices are bleak.

Rural residents have also suffered with cellular hotspots. These are the plans that cellular companies have had for years that basically price home broadband at the same prices and data caps as cellular broadband. I’ve heard from families who were spending $500 to $1,000 per month to enable home-schooling during the pandemic. This product is not available in huge parts of rural America because of poor or nonexistent cellular coverage. We complain about the FCC’s broadband maps, but those are far better than the cellular company coverage maps, which massively overstate rural cellular availability.

There is some relief in sight for some rural homes. I recently talked to farmers who are thrilled with the T-Mobile fixed cellular product – but they said distance from cell sites is key, and many of their neighbors are out of range of the few cell sites found in most rural counties. There are rural folks who are happy with Starlink, but a lot of people are now into their second year on the waiting list for it. Starlink also has reported problems with trees and hills, and it comes with a steep $99 per month price tag.

When a rural household says they have no broadband connection, I’ve learned that you have to believe them. They will have already tried the DSL, fixed wireless, satellite, and cellular hotspots, and decided that none of the options work well enough to justify paying for them. The shame is that the FCC maps might give the impression that residents have two, three, or four broadband options when they really have none.


The New Speed Battle

I’ve been thinking about the implications of having a new definition of broadband at 100/20 Mbps. That’s the threshold that has been set in several giant federal grants that allow grant funding to areas that have broadband slower than 100/20 Mbps. This is also the number that has been bandied about the industry as the likely new definition of broadband when the FCC seats a fifth Commissioner.

The best thing about a higher definition of broadband is that it finally puts the DSL controversy to bed. A 100/20 Mbps definition clearly says that DSL is no longer considered to be broadband, which means we can completely ignore whatever nonsense the big telcos report to the FCC mapping process.

Unfortunately, by killing the DSL controversy we start a whole new set of speed battles with cable companies and WISPs that will be similar to the controversy we’ve had for years with DSL. Telcos have claimed 25/3 Mbps broadband coverage over huge parts of rural America in an attempt to deflect broadband grants. In reality, there is almost no such thing as a rural customer who can get 25/3 Mbps DSL unless they sit next to a DSLAM. But the telcos have been taking advantage of the theoretical capacity of DSL, and the lax rules in the FCC mapping process allowed them to claim broadband speeds that don’t exist. I hate to admit it, but overstating DSL speeds has been a spectacularly successful strategy for the big telcos.

We’re going to see the same thing all over again, but the new players will be cable companies and WISPs. The controversy this time will be more interesting because both technologies can theoretically deliver speeds greater than 100/20 Mbps. But as with DSL, the market reality is that there are a whole lot of places where cable companies and WISPs are not delivering 100/20 Mbps speeds, and those connections would not count as broadband under a 100/20 Mbps yardstick. You can take it to the bank that cable companies and WISPs will claim 100/20 Mbps capability if it helps to block competitors or win grants.

The issue for cable companies is the upload speed. One only has to look at the mountains of speed tests gathered around the country to see that cable upload speeds are rarely even close to 20 Mbps. We’ve helped cities collect speed tests where maybe 5% of customers are reporting speeds over 20 Mbps, while the vast majority of cable upload speeds are measured at between 10 Mbps and 15 Mbps. Usually, the only cable customers with upload speeds over 20 Mbps are ones who have ponied up to buy an expensive 400 Mbps or faster download product – and even many of them don’t see upload speeds over 20 Mbps.
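
As a sketch of how this plays out in the data, here’s how one might classify a batch of speed tests against the 100/20 Mbps threshold. The sample results below are invented for illustration only:

```python
# Hypothetical sketch: classify speed-test results against a 100/20 Mbps
# threshold. The sample (download, upload) results in Mbps are invented.
tests = [
    (210, 12), (185, 14), (240, 11), (450, 22), (120, 9), (300, 15),
]
DOWN_MIN, UP_MIN = 100, 20

passing = [t for t in tests if t[0] >= DOWN_MIN and t[1] >= UP_MIN]
upload_ok = sum(up >= UP_MIN for _, up in tests)

print(f"{len(passing)} of {len(tests)} tests meet 100/20 Mbps")
print(f"Tests meeting the 20 Mbps upload floor: {upload_ok / len(tests):.0%}")
```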

This raises the question of what a definition of broadband means. If 95% of the customers in a market can’t achieve the defined upload speed, is a cable company delivering broadband under a 100/20 Mbps definition? We know how the telcos answered this question in the past with DSL, and it’s not hard to guess how the cable companies are going to answer it.

It’s not a coincidence that this new controversy has materialized. The first draft of several of the big grant programs included a definition of broadband of 100/100 Mbps – a speed that would have shut the door on cable companies. But cable company lobbying began immediately, and the final rules from Congress included the slimmed-down 100/20 Mbps broadband definition.

WISPs have a more interesting challenge because the vast majority of existing WISP connections are nowhere close to meeting either the upload or download speed of 100/20 Mbps. But fixed wireless technology is capable of meeting those speeds. A WISP deploying a new state-of-the-art system can achieve those speeds today for some reasonable number of miles from a tower in an area with good lines of sight. But most existing WISPs are deploying older technology that can’t come close to a 100/20 Mbps test. Even WISPs with new technology will often serve customers who are too far from a tower to get the full speeds. Just like with cable companies, the 100/20 Mbps definition of broadband will allow WISPs to stay in the game to pursue grants even when customers are not receiving the 100/20 Mbps speeds. So brace yourself, because the fights over speeds are far from over.
