Are We Facing the Splinternet?

One of the consequences of the war between Russia and Ukraine is that Russia has largely stopped participating in many large worldwide web applications. Russia has blocked Facebook and Twitter. Companies like Apple, Microsoft, TikTok, and Netflix have withdrawn from Russia.

The European Union is in the process of trying to block Russian-generated content such as the state-owned news outlets RT (formerly Russia Today) and Sputnik. There are discussions of going so far as to block all Russian people and businesses from EU search engines.

Russia has responded by declaring Meta, the owner of Facebook, Instagram, and WhatsApp, to be an extremist organization. This has also led the Russian government to withdraw its participation in organizations that set international policies such as the Internet Council of Europe. The EU countered by suspending Russia from the European Broadcasting Union.

There is a new phrase being used for what is happening with Russia – the splinternet. In a full splinternet scenario, Russia could end up being totally separate from the rest of the world as far as participating in the Internet.

There are already countries that don’t fully participate in the worldwide web. North Korea has blocked participation in much of the web. China and Iran block a lot of western content. However, these countries still participate in supporting the general structure and protocols of the Internet, and not all western applications are blocked.

The folks from the worldwide governing bodies that oversee Internet protocols are concerned that Russia, and perhaps China and Iran, could decide to fully withdraw from the web and develop their own protocols for use inside their countries. If the countries that have peeled off from the rest of the web don’t maintain the same protocols, then communication with the rest of the world eventually becomes difficult or impossible.

This would have a drastic impact on the web as an international means of communication. There are huge amounts of digital commerce between these countries and the rest of the world over and above social apps. Commerce between these countries and the world depends on email, messaging apps, and collaboration platforms. People and businesses in these countries participate in digital meetings in the same manner as the rest of the world. The economic impacts of large countries effectively withdrawing from worldwide e-commerce would be immense.

This is something that we’ve seen coming for many years. For example, Google and Facebook located servers in Russia so that content generated in Russia would stay in the country and not be stored in servers and data centers outside the country.

A Russian withdrawal from the Internet would be far more drastic than Chinese censoring of web content – it would cut communications with the outside world to zero. It’s hard to even imagine the impact this would have on Russian businesses, let alone cutting the ties between the Russian people and everybody else. This would create a digital Berlin Wall.

It doesn’t seem likely that having Russia or China withdraw from the web would have much impact on how the rest of the world uses the web. It would mean that citizens in those countries would not benefit from the newest innovations on the web. But most countries already today understand how important the web is for commerce, and for most countries, that’s a good enough reason not to tinker with something that works.

From my perspective, the whole world suffers if folks stop participating in worldwide communications. The web is the great equalizer where folks with similar interests from around the world get to know each other. But we live in a world full of politics and controversy, so it’s probably inevitable that this will eventually spill over to the Internet, as it does to many other parts of the world economy.

What Happened to Verizon Fiber-to-the-Curb?

Back in 2018, Verizon got a lot of press for the release of a fiber-to-the-curb (FTTC) technology it called Verizon Home. The first big test market was Sacramento. The company built fiber along residential streets and used wireless loops to reach homes. At the time, Verizon touted speeds of 300 Mbps but said that it was shooting for gigabit speeds using millimeter-wave spectrum. Verizon tried to make this a self-installed product, and customers got instructions on how to place the receiver in different windows facing the street to find the best reception and speeds.

There were quotes from the time that Verizon intended to build fiber to pass 25 million homes by 2025 with the technology. But then the product went quiet. In 2020, the Verizon Home product reappeared, but as a totally different product that uses cellular spectrum from cell towers to bring broadband. This is the product that the industry is categorizing as FWA (fixed wireless access). The company no longer quotes a target broadband speed and instead says 5G Home is “reliable and fast to power your whole home with lots of devices connected. So all of your TVs, tablets, phones, gaming consoles and more run on the ultra-fast and reliable Verizon network.” In looking through some Ookla speed tests for the FWA product, it looks like download speeds are in the 100 – 150 Mbps range – but like any cellular product, the speed varies by household according to the distance between a customer and the transmitter and other local conditions.

The new cellular-based product has gone gangbusters, and Verizon had over one million customers on the product by the end of the third quarter of 2022, having sold 342,000 new customers in that quarter. The relaunch of the product was confusing because the company took the unusual step of using the same product name and website when it switched to the wireless product. It even kept the same prices.

But the two products are day and night different. Verizon’s original plan was to pass millions of homes with a broadband product that was fast enough to be a serious competitor to cable broadband. Even if the product never quite achieved gigabit speeds, it was going to be fast enough to be a lower-priced competitor to cable companies.

While the new Verizon Home product is selling quickly, the product is not close in capabilities to the FTTC product. Cellular bandwidth is never going to be as reliable as a landline technology or one where fiber is as close as the curb. Verizon (and T-Mobile) have both made it clear that FWA customers will take second priority for bandwidth availability behind cell phone customers. I don’t know that these companies could do it any other way – they can’t risk making a hundred million cellular customers unhappy in order to serve a much smaller number of FWA customers.

I think everybody understands the way that cellular broadband capabilities change during the day. We all see it as the bars of 4G or 5G at our homes bounce up and down based on a variety of factors such as weather, temperature, and the general network usage in the immediate neighborhood. The most interesting thing about being a broadband customer on a cellular network is that the experience is unique to every customer. The reception will vary according to the distance from the cell tower or small cell and the amount of clutter and interference in a given neighborhood from foliage and other buildings.

I expect that large bandwidth users will get frustrated with the variability of the signal and eventually go back to a landline technology. The FWA product is mostly aimed at bringing broadband to rural customers who have no better broadband alternative or to folks in towns for whom saving money is more important than performance. There are a lot of such people who have stuck with DSL for years rather than upgrading to the more expensive cable broadband, and these are the likely target for FWA. In fact, FWA might finally let the telcos turn off DSL networks.

Verizon says it’s still on track with what it calls the One Fiber initiative which is aimed at building Verizon-owned fiber to cell towers and small cell sites. This backbone was likely the planned starting point for neighborhood fiber, but now this is mostly a cost-cutting step to stop paying fiber leases.

Measuring Sustainability

I’ve seen folks around the country suggesting that State Broadband offices ought to put a priority on sustainability when selecting winners of broadband grant funding. It’s a concept that has instant appeal, but I immediately asked myself what it means. How do you measure sustainability in a way that can be used to score grant requests?

It’s likely that most folks would agree on the definition of sustainability. If we are going to use government grant money to build a broadband network, we want that network to be providing broadband service for as long as possible. We expect sustainability for other kinds of infrastructure, such as roads, bridges, and buildings, so why shouldn’t we expect the same thing from a grant-funded broadband network?

But what does sustainable mean for a broadband network? The first test of sustainability is the expected life of the assets being constructed.

  • The longest-lived asset that is being constructed with grants is conduit. There is no reason why a well-maintained conduit system shouldn’t still be fully functional a century from now.
  • There are big debates about the economic life of fiber. If you go by the economic lives allowed by IRS depreciation, then the expected life of fiber is 25 or 30 years. We know that’s ridiculous because there is plenty of forty-year-old fiber still chugging along in the field. We also know that fiber constructed today is far better than fiber built forty years ago. The manufacturers have learned to make higher-quality glass with fewer impurities. But the big change in the industry is that the folks that install fiber have learned techniques that minimize damage during construction. Poor handling of fiber manifests twenty years later as micro-fissures – and that means cloudy glass. Nobody will give an expected life for well-maintained fiber, but scientists at some of the manufacturers have privately told me that they think it’s at least 75 years – we’ll just have to wait to find out.
  • The assets that cause the most concern for sustainability are electronics – be that fiber electronics or fixed wireless electronics. All electronics must periodically be replaced. I’ve seen some fiber electronics last fifteen years – but that seems to be near the upper end of economic life. The general industry wisdom is that fixed wireless systems have to be replaced every 7 to 10 years.
  • We largely eliminated some ISPs from grant eligibility due to poor sustainability. For example, low-orbit satellites like Starlink are designed to only last 5 to 7 years and then fall from orbit. It’s hard to make an argument that grant funding buys great value with this kind of asset.

This all means that the sustainability of electronics must be a concern for all technologies. Any ISP that wins grant funding will likely be replacing some electronics within a decade. One test of any ISP on sustainability is the financial ability and willingness to replace those electronics. That’s hard to judge.
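One crude way a grant office could put numbers to the asset-life discussion above is a cost-weighted average expected life for the assets in a grant request. This is only a sketch of the idea – the asset lives below are the rough figures from this post, and the scoring approach is my own illustration, not anything a broadband office actually uses.

```python
# Toy sustainability score: weight each asset class in a grant request
# by its expected useful life. The lives below are the rough estimates
# discussed in the text, not official depreciation schedules.

ASSET_LIFE_YEARS = {
    "conduit": 100,                     # well-maintained conduit can last a century
    "fiber": 75,                        # manufacturers' private estimate
    "fiber_electronics": 12,            # near the upper end of economic life
    "fixed_wireless_electronics": 8,    # replaced every 7-10 years
    "leo_satellite": 6,                 # deorbits after 5-7 years
}

def sustainability_score(cost_by_asset: dict[str, float]) -> float:
    """Cost-weighted average expected life of the funded assets, in years."""
    total_cost = sum(cost_by_asset.values())
    if total_cost == 0:
        return 0.0
    weighted = sum(
        cost * ASSET_LIFE_YEARS[asset]
        for asset, cost in cost_by_asset.items()
    )
    return weighted / total_cost

# A hypothetical fiber build: mostly long-lived glass, some electronics.
fiber_grant = {"conduit": 2e6, "fiber": 5e6, "fiber_electronics": 1e6}
print(round(sustainability_score(fiber_grant), 1))  # → 73.4
```

A fixed-wireless or satellite request scored the same way would land an order of magnitude lower, which is the intuition behind steering grant money toward long-lived assets.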

There is another measure of sustainability that is even harder to measure. A big factor in sustainability is the operating philosophy of the ISP that owns the networks. We know there is a big range of what I would call corporate responsibility between ISPs.

If we go strictly by the past, then the ISPs that have the most likely chance of operating a sustainable network for the long term are cooperatives or other ISPs that expect to still be serving the same customers fifty years from now. But not all cooperatives are the same. We see this when looking at how some electric cooperatives have allowed their poles to deteriorate badly over time.

Next in line in trustworthiness might be small telcos that have been around for as long as a hundred years. But over the last few decades, a large percentage of these companies sold to larger ISPs – so the question for a grant reviewer is whether the small telco that gets a broadband grant today will still own the network a decade or two from now.

A big question mark for many folks is the large ISPs. We saw the big telephone companies let copper and DSL networks rot in place by basically ceasing all maintenance years ago. This was clearly done as a cost-saving measure. These companies will argue that there was no sense in continuing to support a dying technology, but we know that is nonsense. The copper networks in places like Germany were well-maintained and still offer DSL today with speeds in many places over 100 Mbps. The big telcos decided to unilaterally cut costs at the expense of customers. Should a grant office award funding to a company that has already failed the public once before? I’m guessing that grant offices will make awards to the big companies by reasoning that fiber networks will last a long time, so maintenance doesn’t matter. But I would argue just the opposite. I think a fiber network can deteriorate even faster without good maintenance than a copper network because the technology is less forgiving. There are still 20-year-old DSL cards chugging away, something that likely won’t happen with fiber. If an ISP ignores and doesn’t maintain fiber network electronics, a fiber network could quickly turn into a brick.

I’ve not said anything above that is not common knowledge. But I am at a loss for how to turn what we’ve learned from the past behavior of ISPs into a way to consider sustainability when awarding grants. If sustainability were the most important factor in awarding a grant, I personally would give all of the money to cooperatives and none to big ISPs. And I wouldn’t fund technologies that must be largely replaced within a decade. This is probably why nobody is asking me to award grants!

Telephony in 1927

This may seem like an odd topic to write about. The topic comes from browsing through the 2.8 million documents from 1927 that just entered the public domain as the original copyrights expired. The big national headlines focused on Winnie the Pooh, Disney’s Steamboat Willie, and Sherlock Holmes entering the public domain. But it’s fascinating to browse through the wide range of documents that provide a snapshot of the science, technology, and culture in 1927.

Today’s blog comes from a 12-page pamphlet on The Telephone: Its History and Methods of Operation, published in 1927 by the Illinois Committee on Public Utility Information. The document is a great primer on the state of the telephone industry that year. The most interesting impression for me is seeing how pervasive the telephone had become only 51 years after the first lab test of the technology.

Some of the more interesting facts from the pamphlet:

  • There were 18 million telephones in the U.S. that year, including 1.6 million in Illinois. That’s one telephone for every 15 people.
  • The U.S. had 61% of all working telephones in the world at the time. Europe had 27%, with only 12% of telephones for the rest of the world. Illinois had 1,189 central offices.
  • We talk about industry consolidation today, but in 1927 there were 39,000 telephone companies in the country, most serving small towns or neighborhoods.
  • There were 380,000 people employed by telephone companies in the U.S., including 15,000 girls employed as private branch exchange operators just in Illinois. Illinois had over 30,000 telephone-related manufacturing jobs.
  • A state-of-the-art handset is shown as the picture at the top of this blog.
  • Telephone service cost an average family less than 1% of household income, and it was the affordability that led to the rapid acceptance and deployment of the technology.

The pamphlet gushes about the deployment of telephone technology. “Yet behind this little act of calling by telephone and holding converse with some distant person, there is the story of a marvel so great as to almost put to shame the wonder of Aladdin’s Lamp. . . Beginning 5 years ago, the record and the development of the telephone has been so wonderful, so vital in the affairs of man, that it has actually changed the course of human history and has played no small part in the civilization of mankind.”

There were three types of central offices in 1927. Small exchanges used the magneto system that had a battery at the base of each telephone that was charged by turning a crank on the telephone. Larger telephone exchanges used a common battery system that supplied power to telephone sets over copper wires. This system alerted an operator that somebody wanted to place a call by lighting a small lamp on a switchboard. Operators manually placed calls to other exchanges to arrange the connection of a call. Large city exchanges were using the new switchboard technology that allowed an operator to complete calls by connecting a jack to the appropriate trunk line, eliminating most of the time-consuming labor needed to set up a call.

There is a fascinating section describing the network used to place transatlantic calls. A U.S. originating call used a voice path to Europe routed to Rocky Point, Long Island, where the calls were transferred to a powerful radio system that transmitted the call to Cupar, Scotland. The return voice path took a similar path from Rugby, England to Houlton, Maine.

Within seven years of this pamphlet, Congress passed the Communications Act of 1934, which put some regulatory restraints on the large Bell Telephone monopoly that was gobbling up telephone systems across the country. Looking at telephony from a 1927 perspective shows us a time when telephony was still new and was a wonderment to most people.

Here is a look at all of the books and periodicals from 1927 that are now in the public domain. Here is the Telephone Pamphlet. Pay particular attention to the last section that instructs people how to talk on the telephone.

Bringing Back Payphones

My last blog for the year includes a nostalgic look back at my days as a telephony guy – many of my older readers will get it. The Washington Post recently had an article about Mike Dank, who is working to bring free payphones to Philadelphia.

I can remember growing up when payphones were ubiquitous. The peak of the market was 1995, when there were 2.6 million payphones in the country. There was an outdoor payphone at every gas station and around many retail stores. There were payphones inside businesses like hotels, malls, and any other place where a lot of people gathered. And there were big banks of payphones at airports. Anybody who did business travel in the days before cellphones can remember rushing off airplanes to try to find an open payphone.

I rode along once with folks in the payphone department at Southwestern Bell who told me about the elaborate process for collecting coins from phones and for making sure that the coin boxes weren’t getting robbed or embezzled. It required an elaborate effort in large cities to continually empty and count the coins from payphones.

In the days before the deregulation of the big telephone companies, most payphones were provided by the big Bell Telephone Companies. After divestiture into the regional Bell companies, the payphones all got relabeled as AT&T. When traveling, I’d run into payphones from smaller telcos, such as the banks of telephones in the Pittsburgh airport that were provided by the North Pittsburgh Telephone Company, which happened to be the monopoly provider at the airport site.

Eventually, the FCC broke up the payphone monopoly, and the payphone business went a bit crazy. There were banks of phones in airports from companies you had never heard of with signs luring travelers to use their phones instead of the telco phones. Hotels were besieged by salespeople trying to get their phones into the lobby. Everybody walked around with a few calling cards in their wallet. But eventually, the ubiquitous presence of cell phones killed the payphone business. One by one, the old telephone booths and wall payphones were torn down and junked.

Payphones aren’t entirely gone, and this Google Map site supposedly shows the remaining payphones. If this site is right, there are still 307 working payphones in the country. Here in North Carolina, the only remaining payphone shown is at the Greensboro airport.

Mike Dank is a 31-year-old guy who became intrigued by payphones. He picked up a payphone for $20 at a flea market a few years ago that has been sitting in his basement. He heard about a group in Portland that had installed ten payphones that provide free calling to the public. This provides a public service to folks who can’t afford a phone or who need to use one in an emergency. In case you don’t remember, you could always call the operator for free from a telco payphone.

He rewired the phone so that it could work using WiFi, and he talked a Philadelphia bookstore, Iffy Books, into installing the phone. The phone uses online software to place a phone call to anywhere in the country (and many places around the world). The bookstore reports that the free phone has been popular and is in steady use.

Dank says that he’d love to rehab more old phones if he can find them. He estimates that it costs around $300 to refit a phone to be able to connect to broadband, so this is a labor of love – but one that I think most old telephone guys will appreciate. Do any of you old telco guys still have old payphones in the basement?

More Assistance for Rural America

The Biden Administration launched an initiative earlier this year that has some interesting benefits for rural communities. The new initiative is called the Rural Partners Network (RPN), and it has the goal of helping rural areas maximize the benefits available from the federal government.

The new program is putting federal employees directly in rural communities and making them available to help rural communities navigate the confusing federal bureaucracy.

As an example, one of the primary roles of the RPN is to help local communities find and apply for grants. I know this would be extremely useful just in the area of broadband. I’ve counted dozens of distinct ways that communities can get grant funding to help with broadband. By now, most of them have heard of the giant $42.5 billion BEAD grants, but there are many other grants available. For example, there are dozens of grants related to broadband that can go to rural schools and libraries. There are grants for strengthening the electric grid that might allow rural electric companies to extend middle-mile fiber where it’s highly needed. There are grants for rural healthcare facilities. And there are grants for digital inclusion, which can be used to buy computers and teach people how to use them. These same grants could be used to establish more formal technical training courses.

This blog clearly focuses on broadband issues, but there is a dizzying array of grants for other purposes as well. There are grants to help people kick-start an interesting new business idea. There are grants that help local non-profits meet their goals. There are grants to pay for online education and training. There are grants to help people to get their finances organized in order to qualify for buying a house. But the list of federal grants is so huge that it’s almost impossible for somebody to wade through the possibilities.

The new program will be part of the Department of Agriculture. But the program promises to help rural people and communities work with the Departments of Commerce, Education, Interior, Treasury, the Small Business Administration, and dozens of other agencies.

The RPN has already deployed people in Alaska, Arizona, Georgia, Kentucky, Mississippi, Nevada, New Mexico, North Carolina, Puerto Rico, West Virginia, and Wisconsin. The goal is to get these folks deployed everywhere, assuming the funding is approved by Congress.

I work in rural America a lot, and in my experience, many counties are not able to navigate the huge number of grant opportunities or complete grant requests even if they know about them. This puts small communities at a huge disadvantage compared to larger cities that have a department of full-time grant writers on board. This program will help make sure that federal funding ends up where it’s needed the most.

FCC Cellular Broadband Mapping

I mostly write about broadband, but one of the most common complaints I hear from rural folks is the lack of good cellular coverage. Poor cellular coverage doesn’t seem to have gotten the same press as poor broadband, but not having access to cell phones might be more of a daily challenge than the lack of broadband.

For the first time, the new FCC maps now show us the claimed coverage everywhere for each cellular carrier. This coverage is shown on the same maps used for broadband.

People are going to find the claimed cellular coverage to be confusing since the FCC is showing coverage using massively out-of-date cellular speeds. The FCC maps only ask a cellular carrier to show if it meets the FCC definition of cellular broadband, which is embarrassingly low. A cellular carrier only needs to achieve a speed of 5 Mbps download and 1 Mbps upload to be considered covered for 4G. The FCC has two claimed speed tiers for 5G at 7/1 Mbps and 35/3 Mbps.

The FCC speed thresholds for cellular are massively out of touch with modern technology. According to Ookla’s nationwide speed tests, the national average cellular speeds at the end of the third quarter of 2022 were 148 Mbps download and 16 Mbps upload. The national median speed (meaning half of people are either faster or slower) was 75 Mbps download and 9 Mbps upload. The FCC is sticking with its obsolete definition of cellular broadband speeds for the same reasons it has stuck with using 25/3 as the official definition of broadband – the primary reason likely being the lack of a fifth FCC Commissioner.
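To see how little the FCC tiers tell you, here is a small sketch that classifies a speed test against the three reporting thresholds described above (5/1 Mbps for 4G, 7/1 and 35/3 Mbps for 5G). The tier labels are my own shorthand, not official FCC terminology.

```python
# Classify a cellular speed test against the FCC's reporting tiers.
# Thresholds come from the text above; labels are informal shorthand.

FCC_TIERS = [  # (download Mbps, upload Mbps, label), fastest first
    (35.0, 3.0, "5G 35/3"),
    (7.0, 1.0, "5G 7/1"),
    (5.0, 1.0, "4G 5/1"),
]

def fcc_tier(download_mbps: float, upload_mbps: float) -> str:
    """Return the fastest FCC tier this connection clears, if any."""
    for down, up, label in FCC_TIERS:
        if download_mbps >= down and upload_mbps >= up:
            return label
    return "unserved"

# Ookla's Q3 2022 national median (75/9 Mbps) clears the top tier easily,
# which is why claiming the 35/3 tier says almost nothing about real speeds.
print(fcc_tier(75, 9))  # → 5G 35/3
print(fcc_tier(6, 2))   # → 4G 5/1
```

Since the median connection in the country already clears the highest tier by a wide margin, a carrier claiming "at least 35/3" at an address could be delivering anything from barely adequate to excellent service.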

That makes the FCC cellular maps largely useless for people in cities. What does it mean if a cellular carrier claims a 5G connection of 7/1 Mbps – that’s probably not even one bar of coverage. My house shows coverage from AT&T, T-Mobile, Verizon, TDS (US Cellular), and Project Genesis, the new Dish Network offering. AT&T claims only 4G coverage at my house and doesn’t claim a speed capability, even though I just tested at over 150 Mbps download as I was writing this blog. The other four carriers claim 5G coverage and speeds of at least 7/1 Mbps, while T-Mobile and Project Genesis claim speeds of at least 35/3 Mbps. The FCC reporting doesn’t give me any idea if I can trust any of these carriers at my house.

That’s because cellular coverage areas are incredibly hard to map. This is something that everybody in America is already an expert on. No matter where you live, you see the bars of available data vary at your house hour-by-hour and day-by-day. Cellular networks are broadcast networks that blast signals to anybody in range of a cell tower. Cellular radio signals can be disturbed by heat, humidity, and air pollution. And the strength of the signal varies depending on the number of users on the network at a given time.

It’s convenient to picture cellular coverage areas as a circle around a tower, with the signal being broadcast outward everywhere – but that is only true in the flattest and most open places in the country. Cellular signals are blocked or deflected by impediments in the environment, like hills and buildings. While cellular signals travel decently through foliage, leaves still add distortion and cut the distance and strength of a signal. A more apt way to picture a cellular coverage area is as an amoeba with different length arms reaching in many directions.

Because of the physics of cellular delivery, the claimed coverage by cellular companies has been badly overstated. For years, cellular companies have published maps that claim they have the best nationwide coverage – but those maps are badly distorted when looking at real places. Every cell phone user understands dead spots. My house is a good example. I live downtown in a city, and cellular coverage is generally good. But I live partway up a hill, and at my house, there is zero Verizon coverage, although folks at the other end of the block can get Verizon. I use AT&T and run into AT&T dead spots as I drive around.

Rural cellular coverage has often been the most exaggerated. Anybody who has driven through rural America knows that a lot of the claimed coverage is bosh. The FCC is hoping to rein in the exaggerated coverage claims of cellular carriers. You can challenge the cellular coverage at your home in the same way that you can challenge landline broadband coverage. The challenge is built directly into the FCC broadband map. When you type in an address, you’ll see a place on the top right to toggle between fixed and mobile broadband. Unfortunately, the method of challenging cellular coverage is cumbersome, and I’ll cover it in another blog.

There is also a process for bulk challenges of cellular broadband by local governments. This means gathering a lot of cellular speed tests around a community, done in a way that meets the FCC rules. I’ve already seen several counties that have started the bulk speed testing to challenge the maps.
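The heart of a bulk challenge is boiling many individual speed tests down into community-level statistics. The FCC’s actual bulk-challenge format has specific requirements (locations, timestamps, device details), so this is only a sketch of the median math, not the filing format.

```python
# Summarize community speed tests for a bulk challenge.
# Each test is a (download Mbps, upload Mbps) pair; we report the
# median of each, since medians resist being skewed by a few outliers.
from statistics import median

def summarize(tests: list[tuple[float, float]]) -> dict[str, float]:
    """Median download/upload across a list of (down, up) Mbps tests."""
    downs = [d for d, _ in tests]
    ups = [u for _, u in tests]
    return {"median_down": median(downs), "median_up": median(ups)}

# Hypothetical tests gathered around one rural community:
community_tests = [(12.0, 2.0), (48.0, 5.0), (3.0, 0.8), (22.0, 3.5), (9.0, 1.2)]
print(summarize(community_tests))  # → {'median_down': 12.0, 'median_up': 2.0}
```

In this made-up example, one fast test (48/5) doesn’t rescue the community: the median still shows service far below what a carrier is likely claiming on the map, which is exactly the kind of evidence a county would assemble.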

Progress Against Robocalling

I mostly write about broadband these days, but we can’t forget that telephony is still a significant part of the industry. While the national penetration rate of residential landline telephones has dropped to about 20%, most businesses continue to have telephones, and practically everybody has a cellphone.

The bane of telephony continues to be robocalling and other nuisance calls that pester anybody with a telephone. There are bad actors that impersonate government or commercial entities with the goal of scamming the elderly and other vulnerable individuals. Scammers pretend to be the Social Security Administration, banks, utilities, the local sheriff, or tech companies in an attempt to solicit credit card numbers or other valuable data from people. In a more recent development, robocalls are used to launch denial of service attacks against hospitals and public service entities to block the ability to send or receive legitimate phone calls.

There is a systematic industry effort to squash robocalling. The Industry Traceback Group is a collaboration of over 400 wireline, wireless, VoIP, and cable companies that are tackling the robocalling issue. This group works with law enforcement to trace, identify, and stop the sources of illegal robocalling. The group’s goal is to block or shut down illegal robocalling.

The effort is having an impact and routinely has been able to block robocall operations. Earlier this year, the FCC issued a record $225 million in fines against two Texas companies, Rising Eagle and JSquared Telecom. These companies had been making billions of illegal spoofed calls (where they used a false call-from number) to sell fraudulent health insurance. The callers claimed to represent major insurance companies like Aetna, Blue Cross Blue Shield, Cigna, and UnitedHealth Group.

There was a Supreme Court decision in 2021 that threatened to weaken the effort to slow and stop robocalling. The case, Facebook v. Duguid, focused on the definition of an automatic telephone dialing system, which is commonly called an autodialer, as defined in the Telephone Consumer Protection Act (TCPA) from 1991. The Act defined an autodialer as equipment that can store telephone numbers to be used by a random or sequential number generator. The Supreme Court ruled in favor of Facebook and found that definition to be narrow and to only apply to a specific type of calling equipment.

This ruling hasn’t slowed down the Industry Traceback Group since most robocalls still violate the 1991 legislation. Calls made for the purposes of scams still violate the law. It is still illegal to call cell phones with a prerecorded or artificial voice without the permission of the user. Telemarketing calls often also violate state laws when spoofing with false caller ID is used with the intent to defraud or cause harm to call recipients.

The large FCC fines and the attempts to shut down robocalling operations have, unfortunately, driven the robocalling industry offshore, and a large percentage of robocalls now originate overseas.

The industry is fighting against robocalling in several ways. First, many carriers provide subscribers with tools to block calls from unwanted numbers. The industry has implemented and continues to refine the STIR/SHAKEN process that makes it harder for robocallers to spoof telephone numbers. Probably most importantly, the industry is working with law enforcement to shut down illegal robocalling operations.
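
Under STIR/SHAKEN, the originating carrier signs each call with a small token called a PASSporT (a JWT carrying the calling and called numbers plus an attestation level: "A" means the carrier vouches that the caller is entitled to use the number), and the terminating carrier verifies it. The sketch below builds and decodes an illustrative PASSporT in Python to show the shape of those claims. The field names follow the PASSporT/SHAKEN specs, but the token, numbers, and certificate URL here are made up, and a real verifier would also check the ES256 signature against the STI certificate referenced in the header's x5u field.

```python
import base64
import json

def b64url(data: dict) -> str:
    """Base64url-encode a JSON object without padding, as JWTs do."""
    raw = json.dumps(data, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def decode_passport(token: str) -> dict:
    """Decode the payload of a PASSporT without verifying the signature.
    A real verifier must also validate the ES256 signature against the
    STI certificate named in the header's x5u field."""
    _header, payload_b64, _sig = token.split(".")
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# Illustrative header and claims; the x5u URL and numbers are hypothetical.
header = {"alg": "ES256", "ppt": "shaken", "typ": "passport",
          "x5u": "https://cert.example.com/sti.pem"}
payload = {"attest": "A",                      # full carrier attestation
           "orig": {"tn": "12025550101"},      # calling number
           "dest": {"tn": ["12025550143"]},    # called number(s)
           "iat": 1650000000,                  # time of signing
           "origid": "123e4567-e89b-12d3-a456-426614174000"}
token = f"{b64url(header)}.{b64url(payload)}.fake-signature"

claims = decode_passport(token)
print(claims["attest"], claims["orig"]["tn"])
```

Calls arriving with a "B" (partial) or "C" (gateway) attestation, or with a signature that fails validation, are the ones carriers flag or label as likely spam.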

One of the most interesting features of the effort is the labeling of calls. I use AT&T for cell service, and my caller ID routinely labels calls as either telemarketing or potential spam. While it’s annoying to keep getting these calls, it’s comforting to be able to ignore them.

Can Frontier Reinvent Itself?

Diana Goovaerts recently wrote an article that quotes Frontier’s Consumer EVP John Harrobin as saying that Frontier expects to become the “un-cable” option in the market. He says Frontier is doing this by simplifying its product lines to eliminate the behaviors that customers hate.

One of the biggest changes is to get rid of special pricing, where a customer signs with an ISP due to a low-price special, only to see the rates jump up at the end of the special period. This is the one characteristic of big ISPs that customers dislike the most. For years I’ve been advising smaller ISPs to avoid the practice and to offer a fair price all of the time.

I have ISP clients who sometimes panic when they see big ISPs offering special prices as they enter a new market. The specials work to some degree, and the big ISPs lure some customers away with the special prices. But small ISPs have learned that after a year or two, when the special pricing ends, they have a good chance of winning back most of these customers. When small ISPs treat customers fairly, those customers don’t bite on new pricing offers. In the long run, the best way to reduce churn is to treat customers fairly and with transparency.

The interview didn’t mention it, but if Frontier is going to be an un-cable ISP, it will also have to eliminate hidden fees. These are the fees that are not included in advertising but appear on the first customer bill, usually as an unpleasant surprise. The biggest hidden fees are for cable TV service, where ISPs hide programming, sports, and set-top box fees – a first bill can be $30 higher than the advertised price. Big ISPs do the same thing with broadband and don’t mention expensive modem fees in advertising – customers are instantly unhappy when they get a first bill where the actual price is $10 or $15 higher than the price they expected.

Frontier has announced plans to build fiber to pass 10 million homes and businesses. The company was getting creamed by competition as long as it primarily offered DSL. In 2018, Frontier lost over 200,000 broadband customers. In 2019 the losses grew to 235,000, and in 2020 the company lost 400,000 broadband customers. But by 2021, Frontier started turning the ship around and only lost 35,000 customers for the year as fiber additions started to outnumber DSL losses.

Frontier has a long way to go to have its customer base come to trust it. The company was guilty of all of the same sins as other big rural telcos. It cut back on maintenance to the point where customers might be out for a week or two before they heard from a technician. In many cases, the company would disconnect a customer that had a problem rather than fix it. The company accepted federal CAF II funding, but many customers saw little or no improvements. Frontier has a long way to go to regain the trust of customers that it largely abandoned for many years.

Building fiber is a huge start to regaining customer trust since delivering broadband that actually works is essential to have customers want to stay with any ISP. But Frontier is still going to have to demonstrate to customers that it cares about them when there is a problem. Most small ISPs try to clear customer problems within a day of a trouble report and will work extra hours to do so. If Frontier really wants to be the un-cable company it will mean adding maintenance staff. Folks have become so reliant on broadband that they are annoyed if they lose service for an hour – they won’t forgive an ISP that puts them out of service for a day or longer.

Frontier has also been embroiled in a few overbilling controversies in recent years, and being the un-cable ISP means taking the attitude that the customer is right, even if that costs the company a few bucks.

It’s going to be interesting to see if Frontier’s practices live up to the public relations hype. There is one sign that perhaps the company has started to turn the ship. In the 2022 American Customer Satisfaction Index, the consumer rating of Frontier jumped from 57 to 61 in one year. In 2020, the only ISP with a worse customer rating was Suddenlink. Within a year, the company surpassed the ratings of Mediacom and CenturyLink. A rating in the low 60s still leaves Frontier (and most other ISPs) among the most disliked companies across the 45 major business sectors in the index. But if Frontier can sustain being an un-cable ISP, then over time, customers will begin to trust the company again.

ISPs and the Digital Divide

It seems that almost monthly I am asked about the role that ISPs should take in making sure that we solve the digital divide. I think that people are somewhat shocked every time I tell them this is not a role for ISPs.

In explaining my answer, let me start by parsing what is meant by the question. We are about to see a lot of grant funding for getting computers into homes and for training folks on how to use them. The folks asking this question are hopeful that ISPs will take up that role in a meaningful way. The reality is that this is rarely going to happen – and it’s not something we should expect from ISPs.

ISPs are in the business of building broadband networks and keeping them running. That’s a full-time job. I think people assume that ISPs want new customers badly enough that they are willing to tackle the digital divide efforts needed so that folks know how to use broadband. But an ISP’s role in solving the digital divide is to bring broadband to homes willing to buy it. To use the old analogy of the three-legged stool, the ISP’s function is to provide the broadband connection – it’s up to somebody else to tackle the other issues of computers and training.

I don’t think that the folks asking this question understand the challenge involved in helping somebody to cross the digital divide. You can’t just hand out computers to homes that don’t have them. A computer is a brick for a household where nobody knows how to use it. It takes a lot of one-on-one effort to sit with people and help them learn how to navigate the possibilities of broadband.

There are programs around that have been doing this the right way. I’ve been told by several people who do this training that the key to getting somebody to learn to use a computer is to help them accomplish something they want to do. That’s different for everybody. It might mean helping them look for a job, talk with relatives on social media, search for knitting patterns, or learn a new language – it doesn’t matter what it is, but helping a new computer user accomplish something useful is the way to prove to them that a computer and broadband are useful tools.

My firm has been doing broadband surveys for many years, and we’ve noticed that folks will not admit to being afraid of or intimidated by computers and technology. Folks won’t tell you that the reason they can’t use a computer is that they can’t read very well. But the people who do computer training tell me that these are some of the basic reasons folks don’t or won’t use computers. Somewhere in the past, they tried and failed, and they don’t want to do that again.

I remember twenty years ago, when cable modems and DSL were new, there were computer training courses everywhere. There were free classes in most towns teaching how to use Excel or Word. The training classes were held in computer labs with twenty students at a time – and the training was largely an abysmal failure. While those skills are important for many jobs, they are not things that most people will use regularly, if ever. But somehow, it became accepted that teaching those skills was the way to make folks computer literate. The classes were such a failure that the training courses died out within a few years and were not replaced. We’ve largely gone two decades with no formal forum for folks to learn how to use a computer except to sit with a friend or relative who would take the time to teach them. And that is a shame – there is an immense richness of content on the web today, with something for everybody.

But back to my original premise of the blog. ISPs do not have the resources to dedicate employees to sit with folks to learn how to use a computer and navigate the web. Some of the big ISPs have given the impression that they are doing this for folks – but that is mostly done for public relations purposes.

There are some ISPs that might be willing to take up this challenge. Some municipal ISPs or cooperatives might take a stab at solving the digital divide, and I could see some of them using grant money to develop a great program for teaching computer use. But even that will be a challenge, because the main focus of these ISPs, like every other ISP, is keeping the network operating. We don’t expect car companies to teach us to drive. We don’t expect banks to teach us how to be wise with our money. We really can’t expect ISPs to teach folks how to use computers.

Communities that want to solve digital divide issues should look elsewhere – perhaps at existing non-profits, or at the local governments that are going to take a stab at this. If no such group exists, then use grant money to kick-start the effort.