Packet Loss and Broadband Performance

In a recent article in FierceWireless, Joe Madden looked at the various wireless technologies he has used at his home in rural central California. Over time he subscribed to a fixed wireless network using WiFi spectrum, cellular LTE broadband, Starlink, and a fixed wireless provider using CBRS spectrum. A lot of rural folks can describe a similar path where they have tried all of the broadband technologies available to them.

Since Joe is a wireless expert who works at Mobile Experts, he was able to analyze his broadband performance in ways that are not easily understood by the average subscriber. Joe came to an interesting conclusion – the difference in performance between various broadband technologies has less to do with speed than with the consistency of the broadband signal.

The average speed tests on the various products varied from 10/2 Mbps on fixed wireless using WiFi, to 117/13 Mbps on Starlink. But what Joe found was that there was a huge difference in consistency as measured by packet loss. Fixed wireless on WiFi had packet loss of 8.5%, while the packet loss on fixed wireless using CBRS spectrum dropped to 0.1%. The difference is stark and is due to the interference that plagues unlicensed spectrum compared to the much cleaner signal on licensed spectrum.

But just measuring packet loss is not enough to describe the difference in the performance of the various broadband connections. Joe also looked at the share of lost packets whose replacements took longer than 250 milliseconds to arrive. That will require some explanation. Packet loss in general describes the percentage of data packets that are not delivered on time. In an Internet transmission, some packets are always lost somewhere in the routing to customers – although most packets are lost due to the local technology at the user end.

When a packet doesn’t show up as expected, the transport protocols ask for that packet to be sent again. If the re-sent packet gets to the user quickly enough, it’s the same, from a user perspective, as if the original packet had been delivered on time. Joe says that re-sent packets that don’t arrive within 250 milliseconds are worthless because by then, the application has already had to move on without them. The easiest way to visualize this is to look at the performance of Zoom calls for folks using rural technologies. Packets that don’t make it on time leave a gap in the video stream that manifests as fuzziness and poor resolution in the picture.

Packet loss is the primary culprit for poor Zoom calls. Not receiving all of the video packets on time is why somebody on a Zoom call looks fuzzy or pixelated. If the packet loss is high enough, the user is booted from the Zoom call.

The difference in the percentage of packets that are delivered late between the different technologies is eye-opening. On the fixed wireless network using WiFi spectrum, an astounding 65% of re-sent packets took longer than 250 ms. Cellular LTE broadband was almost as bad at 57%. Starlink was better at 14%, while fixed wireless using CBRS was lowest at 5%.
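Combining the two measurements makes the gap even clearer. Multiplying the packet loss rate by the share of re-sent packets that arrive too late gives a rough estimate of the packets that are effectively useless to a real-time application. Here is a minimal sketch of that arithmetic, using the two networks in Joe’s comparison where both figures are given (the article doesn’t list the loss rates for LTE or Starlink):

    # Rough estimate of packets that are effectively useless to a real-time
    # application: packets that are lost AND whose re-sent copy arrives too
    # late (after 250 ms) to be of any use. Figures are from Joe's tests.
    networks = {
        # name: (packet loss rate, share of re-sent packets later than 250 ms)
        "Fixed wireless (WiFi spectrum)": (0.085, 0.65),
        "Fixed wireless (CBRS spectrum)": (0.001, 0.05),
    }

    for name, (loss_rate, late_share) in networks.items():
        useless = loss_rate * late_share
        print(f"{name}: {useless:.3%} of all packets never arrive in time")

    # Fixed wireless (WiFi spectrum): 5.525% of all packets never arrive in time
    # Fixed wireless (CBRS spectrum): 0.005% of all packets never arrive in time

By that rough measure, the WiFi-based network effectively loses more than a thousand times as many packets as the CBRS network for real-time purposes.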

Joe is careful to point out that these figures only represent his home and not the technologies as deployed everywhere. But with that said, there are easily explainable technology reasons for the different levels of packet delay. General interference plays havoc with broadband networks using unlicensed spectrum. Starlink adds delay just from the extra time it takes for signals to travel between the ground and the satellite in each direction. The low packet losses on the CBRS network might be due to having very few neighbors yet using the new service.

Joe’s comparison doesn’t include other major broadband technologies. I’ve seen some cable networks with high packet loss due to years of accumulated repairs and unresolved issues in the network. The winner of the packet loss comparison is fiber, which typically has an incredibly low packet loss and also a quick recovery rate for lost packets.

The bottom line from the article is that speed isn’t everything. It’s just one of the characteristics that define a good broadband connection, but we’ve unfortunately locked onto speed as the only important characteristic.

Mass Confusion over FCC Mapping

You might not be surprised to hear that I am tired of talking about the FCC map. I spend way too much time these days answering questions about the maps. I understand why folks are confused because there are several major mapping timelines and issues progressing at the same time. It’s nearly impossible to understand the significance of the many dates that are being bandied around the industry.

The first issue is the FCC mapping fabric. The FCC recently encouraged state and local governments and ISPs to file bulk challenges to the fabric by June 30. This is the database that attempts to identify every location in the country that can get broadband. The first mapping fabric issued in June 2022 was largely a disaster. Large numbers of locations were missing from the first fabric, and it also contained locations that don’t exist.

Most experienced folks that I know in the industry are unhappy with the fabric because its definition of locations that can get broadband is drastically different from the traditional way that the industry counts possible customers, which is commonly called passings. For example, the FCC mapping fabric might identify an apartment building or trailer park as one location, while the industry would count individual living units as potential customers. This disconnect means that the fabric will never be useful for counting the number of folks who have (or don’t have) broadband, which I thought was the primary reason for the new maps. Some folks have estimated that even a corrected fabric might still be short by 30 or 40 million possible broadband customers.

Meanwhile, ISPs were instructed to use the original mapping fabric to report broadband coverage and speeds – the FCC 477 reporting process. The first set of the new 477 reporting was submitted on September 1, 2022. Many folks that have dug into the details believe that some ISPs used the new reporting structure to overstate broadband coverage and speeds even more than was done in the older maps. Overall, the new maps show a lot fewer folks who can’t buy good broadband.

There is a second round of 477 reporting due on March 1. That second 477 reporting is obviously not going to use the revised mapping fabric, which will still be accepting bulk challenges until June 30. It could take much longer for those challenges to be processed. There have been some revisions to the fabric due to challenges that were made early, but some of the folks who made early map challenges are reporting that a large majority of the challenges they made were not accepted. This means that ISPs will be reporting broadband on top of a map that still includes the mistakes in the original fabric.

The FCC’s speed reporting rules still include a fatal flaw in that ISPs are allowed to report marketing broadband speeds rather than actual speeds. This has always been the biggest problem with FCC 477 reporting, and it’s the one bad aspect of the old reporting that is still in place. As long as an ISP that delivers 10 Mbps download still markets and reports its speeds as ‘up to 100 Mbps’, the maps are never going to be useful for any of the stated goals of counting customers without broadband.

Finally, the NTIA is required to use the FCC maps to determine how much BEAD grant funding goes to each state. NTIA announced that it will report the funding allocation on June 30. That timing means that most of the mapping challenges that states and counties have been working on will not be reflected in the maps used to allocate the grant funding. The NTIA announcement implies that only the earliest challenges to the maps might be included in the database used to determine the number of unserved and underserved locations in each state. States that have already made challenges know that those numbers include a lot of mistakes and missed a lot of locations.

Not only will the NTIA decision on funding allocation not include the large bulk challenges filed or underway by many state and local governments, but it won’t reflect the latest 477 reporting being submitted on March 1. There are several states that have made rumblings about suing the NTIA if they don’t get what they consider to be a fair allocation of the BEAD funding. If that happens, all bets are off if a court issues an injunction halting the grant allocation process until the maps get better. I can’t help but be cynical about this since I can’t see these maps ever being good enough to count the number of homes that can’t buy broadband. This whole mapping process is the very definition of a slow-motion train wreck, and that means I’ll likely be answering questions about the maps for the indefinite future.

Getting Serious About Satellite Texting

One of the more interesting telecom announcements at the CES electronics show in Vegas came from the partnership of Qualcomm and Iridium, which plans to bring satellite texting capability to many more cell phones and other devices.

We’ve already seen a few other recent announcements of the ability to send emergency texts when out of reach of cell coverage. The team of T-Mobile and SpaceX say that T-Mobile customers will be able to reach 911 through a satellite some time in 2023. Apple launched an Emergency SOS system for its newest iPhone users in a partnership with Globalstar, but the service is only available in a handful of countries.

Qualcomm is building this feature into its premier Snapdragon 8 Gen 2 chips, so any new phone or other device using that chip will have satellite texting capability. The company says it plans to eventually build the capability into other more affordable chips as well.

For now, Qualcomm has established a 911 service similar to the T-Mobile plans where people can reach 911 when out of the range of the normal cellular network. But the company envisions that cellular carriers will develop price plans to let users text for a fee. That would provide folks with the ability to stay connected while hiking in remote wilderness or during a sea cruise.

Qualcomm is in the business of selling chips, and it would love to see this capability expanded to other places, like built into laptops or new cars. Putting the technology in cars is a major selling point since it would enable features like automatically contacting 911 after an accident.

This first-generation product will be far from perfect, but that’s to be expected from what is basically a beta test. For example, while Iridium satellites blanket the earth, there are times when there is no satellite overhead, and a user might have to wait ten minutes for the next satellite. It seems this issue can be resolved by cell carriers partnering with multiple satellite providers.

This new technology opens up the possibility for people to have some limited connectivity almost anywhere on the globe. For the younger connected generations, this has great appeal. Most people I know with GenZ kids tell me that it’s like banishment to take kids out of reach of connectivity. But more practically, much of the world does not have reliable cellular coverage, and this can bring some form of communication to all.

I know people will read this and assume that the next step is to use satellites to provide data connectivity to cell phones or laptops from anywhere. However, there are limits of physics that make that unrealistic for a handset. The latest Starlink dishy receiver is 19 by 12 inches, and that much surface area is needed to receive the signal from a satellite. However, it’s not hard to imagine a hiker rolling out a flexible receiver to communicate with a satellite – assuming they bring along some kind of power source, perhaps solar.

I track telecom announcements of new technologies and products to give me a baseline a decade from now to see how various technologies performed. It will be interesting to see if satellite texting becomes a routine part of every cellular plan or if it withers on the vine like many other seemingly good ideas that the market didn’t embrace.

Who’s On First?

I saw a short article in Business Wire that said that Comcast Business had landed a project to provide a private wireless network for the guests of The Sound Hotel Seattle Belltown. This is an example of the continuing convergence in the industry where the big cable companies, ISPs, and wireless carriers are freely competing on each other’s turf. For decades we’ve neatly categorized companies as telcos, cable companies, or wireless carriers, but this convenient categorization is starting to fray around the edges, and it’s getting a lot harder to distinguish between the big industry players.

If we look back ten or fifteen years, the distinctions between these companies were clearly defined. The big telcos served residences and small businesses using DSL. The big telcos were clearly structured in silos. There was practically no interaction between the wireless business at Verizon and AT&T and the broadband business. Verizon went so far as to set up Verizon FiOS, its fiber business, separately in every aspect from the copper and DSL business.

The cable companies had faster broadband than DSL after the upgrades were made to DOCSIS 3.0. Speeds up to 300-400 Mbps blew away the capabilities of DSL. Once those upgrades were completed, the cable companies took market share in cities from the telcos year after year until the cable companies had a near-monopoly in many markets.

The market with more balanced competition has been the large business market. This is the market where fiber quickly became king. At one point the telcos controlled most of this market, with their fiercest competition coming from a handful of big CLECs. Verizon responded to this competition by buying MCI, XO, and others in the northeast. CenturyLink became one of the nationwide market leaders through the acquisition of Qwest and then Level 3. The big cable companies cautiously launched fiber ventures for this market twenty years ago and have picked up a decent market share.

But those simple explanations of the business plans of the big ISPs are now history. As the Business Wire announcement showed, the big companies are crossing technology barriers in new ways.

Providing a private wireless network for a large hotel is emblematic of a new trend in competition. In doing this, Comcast is crossing technical lines that it would never have considered years ago. From a business perspective, Comcast is going after the full suite of services for businesses like this hotel, not just the wireless network. The newest word in the competitive market is stickiness, and Comcast is likely tying down this hotel as a customer for a long time, assuming it does a great job.

These crossovers are even more evident in the residential and small business markets. Comcast, Charter, and other cable companies are bundling cellular service with broadband and the triple play, something that the telcos have never managed to pull off. Telcos have decided to reclaim urban market share by building huge amounts of fiber. And the cable companies are reacting to that threat by rushing some early versions of DOCSIS 4.0 to the market in order to fix the upload bandwidth issues. The big wireless companies have joined the fray with FWA cellular broadband products. While these products can’t compete with the bandwidth on fiber or cable networks, they are still adequate for many homes and hit the market at a much lower price.

This has to be confusing to the average residential consumer. Consumers who abandoned DSL years ago are being lured back to the telcos by fiber. Folks who have been paying far too much for cellular service are moving to the more affordable cable company wireless service. And people who can’t afford the high price of cable broadband are seemingly flocking to the more affordable FWA wireless. I have to imagine that the customer service desks at the various ISPs are being flooded by customers canceling service to try something different.

Markets always eventually reach an equilibrium. But for now, both the residential and business markets in many cities are seeing fresh new marketing efforts. A decade from now, it’s likely that we’ll reach a predictable mix of the various technologies. We know this from having watched the markets where Verizon FiOS battled with the cable companies for several decades. But much of the country is just now entering the era of refreshed competition.

Unfortunately, this new competition isn’t everywhere. There is already evidence that new investments are not being made at the same pace in lower-income neighborhoods. Some cities are seeing widespread fiber construction while others are seeing almost none. There will still be a lot of work to do to make sure that everybody gets a shot at the best broadband – but the obvious convergence in the industry shows that we’re headed in the right direction.

No More Underbuilding

Jonathan Chambers wrote another great blog this past week on Conexon where he addresses the issue of federal grants having waste, fraud, and abuse – the reasons given for holding hearings in the House about the upcoming BEAD broadband grants. His blog goes on to say that the real waste, fraud, and abuse came in the past when the FCC awarded federal grants and subsidies to the large telcos to build networks that were obsolete by the time they were constructed. He uses the term underbuilding to describe funding networks that are not forward-looking. This is a phrase that has been around for many years. I remember hearing it years ago from Chris Mitchell, and sure enough, a Google search showed he had a podcast on this issue in 2015.

The term underbuilding is in direct contrast to the large cable and telephone companies that constantly use the term overbuilding to mean they don’t want any grant funding to be used to build any place where they have existing customers. The big ISPs have been pounding the FCC and politicians on the overbuilding issue for well over a decade, and it’s been quite successful for them. For example, the big telcos convinced the FCC to provide them with billions of dollars in the CAF II program to make minor tweaks to rural DSL to supposedly bring speeds up to 25/3 Mbps. I’ve written extensively on the failures of that program, where it looks like the telcos often took the money and made minimal or no upgrades.

As bad as that was – and that is the best example I know of waste, fraud, and abuse – the real issue with the CAF II subsidy is that it funded underbuilding. Rural DSL networks were already dying when CAF II was awarded, mostly due to total neglect by the same big telcos that got the CAF II funding. Those billions could have instead gone to build fiber networks, and a whole lot of rural America would have gotten state-of-the-art technology years ago instead of a tweak to DSL networks that were barely crawling along due to neglect.

The FCC has been guilty of funding underbuilding over and over again. The CAF II reverse auction gave money to Viasat, gave more money for upgrades to DSL, and funded building 25/3 Mbps fixed wireless networks. The classic example of underbuilding came with RDOF, where the areas that were just finishing the CAF II subsidy were immediately rolled into a new subsidy program to provide ten more years of subsidy. Many of the areas in RDOF are going to be upgraded to fiber, but a lot of the money will go into underperforming fixed wireless networks. And, until the FCC finally came to its senses, the RDOF was going to give a billion dollars to Starlink for satellite broadband.

The blame for funding underbuilding lies directly with the FCC and any other federal grant program that funded too-slow technologies. For example, when the CAF II funding was awarded to update rural DSL, cable companies were already delivering broadband speeds of at least 100 Mbps to 80% of the folks in the country. By the time RDOF was awarded, broadband capabilities in cities had been upgraded to gigabit. The policy clearly was that rural folks didn’t need the same quality of broadband that most of America already had.

But the blame doesn’t just lie with the FCC – it lies with all of the broadband advocates in the country. When the ISPs started to talk non-stop about not allowing overbuilding, we should have been lobbying pro-broadband politicians to say that the FCC should never fund underbuilding. We’ve collectively let the big ISPs frame the discussion in a way that gives politicians and regulators a convenient way to support the big ISPs. Both at the federal and state levels the broadband discussion has often devolved into talking about why overbuilding is bad – why the government shouldn’t give money to overbuild existing ISPs.

Not allowing overbuilding is a ludicrous argument if the national goal is to get good broadband to everybody. Every broadband network that is constructed is overbuilding somebody, except in those exceptionally rare cases where folks have zero broadband options. If we accept the argument that overbuilding is a bad policy, then it’s easy to justify giving the money to incumbents to do better – something that has failed over and over again.

It’s time that we call out the overbuilding argument for what it is – pure protectionism. This is monopolies flexing political power to keep the status quo, however poorly that is working. The big ISPs would gladly roll from one subsidy program to another forever without investing any of their own capital to upgrade rural networks.

Every time a regulator or politician says that we should not be using federal money to overbuild existing networks, we need to prod pro-broadband politicians to counter that argument by saying we should not be spending any more money on underbuilding. Broadband is infrastructure, just like roads and bridges, and we should be investing any grant money into the most forward-looking technology possible. If the national goal is to make sure that everybody has good broadband, then we should be ready to overbuild anywhere the incumbents have underperformed, be that in rural areas or inner cities. It’s time we shift the conversation away from protectionism to instead prioritizing bringing broadband that will still be good a decade or two after the grant award. Let’s not spend another penny of grant money on underbuilding networks by investing in slow technologies that are inadequate and obsolete even before they are completed.

The Disappointment of 5G

Karl Bode recently wrote an excellent article highlighting the overhyping of wireless technologies. He’s right, and for the last twenty years, we’ve been told that a world-changing wireless technology is coming soon, but none ever materialized. No wireless technology has been a bigger flop than 5G when comparing the hype to the eventual reality.

The hype for 5G was amazingly over-the-top. The wireless carriers and vendors blitzed the country in a coordinated effort to paint 5G as the solution that would bring broadband everywhere. 5G was going to bring us self-driving cars. 5G would enable doctors to perform surgery remotely from across the country. 5G was going to fuel an explosion of smart factories that would bring complex manufacturing back to the U.S. And 5G was going to use millimeter waves to bring us gigabit-speed broadband everywhere, eliminating the need for investing in expensive fiber networks.

The hype fired up the general public, which bought into the 5G promises, but the public wasn’t the real audience of the hype. The cellular carriers did a non-stop blitz on federal officials, getting them to buy into the amazing wireless future. The cellular companies launched gimmick networks in downtowns to deliver gigabit cellular speeds using millimeter-wave spectrum as a way to sell the 5G vision. It’s clear in retrospect that the rhetoric and gimmicks were aimed at getting the FCC to release more mid-range spectrum for cellular usage – and it worked. There was pressure on the FCC to move more quickly with proceedings that were examining spectrum availability. The wireless carriers even talked the FCC into allowing cellular carriers to poach free WiFi spectrum in cities. The hype worked so well on elected officials that there was a serious discussion about the U.S. buying one of the big wireless vendors like Nokia or Ericsson so that the U.S. wouldn’t lose the 5G war with China.

The main problem with all of this hype is that the rhetoric didn’t match the specifications for 5G that were adopted by international standards bodies. The 5G specifications included a few key goals: get cellular speeds over 100 Mbps, allow for more simultaneous users at a given cell site, allow a cellphone to use two different spectrum bands at the same time, and allow a user to connect to more than one cell site if the demand needed it. The primary purpose of the 5G spec was to eliminate cell site congestion in places where there are a lot of people trying to simultaneously use the cellular network. Nothing in the 5G specification is earth-shattering. The specification, as a whole, seemed like the natural evolution of cellular to better accommodate a world where everybody has a cell phone.

I wrote several blogs during the height of the 5G hype where I was puzzled by the claims that 5G would bring about a broadband revolution because I couldn’t see those claims backed up by the technical capabilities of 5G. I also wrote several blogs asking about the business case for 5G because I couldn’t find one. We will likely never build a dense cellular network along the millions of miles of roads to support self-driving cars. The biggest business use of 5G touted by the carriers was to get people to buy subscriptions to use 5G to support the smart devices in our homes – but people will never buy a subscription to do what WiFi can do for free.

There is still not a good business case that can drive the new revenues needed to justify spending a lot of money on 5G. Because of this, most of the 5G specification has not been implemented. How many people are willing to pay extra for the ability to connect a cellphone to two cell towers simultaneously?

Instead of 5G that follows the specifications, we’ve gotten more marketing hype where the cellular carriers have labeled the new spectrum from the FCC as 5G. There is almost none of the 5G specification in this product, and the product labeled as 5G still uses 4G LTE technology. The introduction of the new spectrum has relieved the pressure on overloaded cell sites, and we’ve seen cellular speeds rise significantly. But that faster speed is wasted on most cellular customers who don’t do anything more data-intensive than watch video.

It was interesting to see how the rhetoric died down once the cellular carriers got access to more spectrum. The big winner from the marketing hype has been the handset manufacturers, which have convinced customers that they must have 5G phones – without really telling them why. Cellular customers are generally pleased that speeds have increased since this means stronger coverage indoors and in outdoor dead spots. But surveys have shown that only a minuscule percentage of people are willing to pay more for faster cellular speeds.

The most ludicrous thing about the 5G story is that the industry is now hyping 6G. This new marketing hoax is focusing on some of the mid-range spectrum that was originally touted as being part of the 5G war – but the marketers rightfully assume that most customers won’t understand or care about the facts. It seems like the industry has embarked on subdividing what was originally considered to be 5G spectrum into small chunks so that the carriers can roll out subsequent generations of 6G, 7G, and 8G – all of which were supposedly part of the original 5G revolution. I have no doubt that the public will buy into the hype and want 6G phones when they hit the market, but I also know that none of them will see any difference in performance. The formula seems simple – announce a new G generation every eighteen months and sell a lot of new handsets.

Competing Against Big Cable Companies

I’m asked at least twenty times a year how a small ISP can compete against the big cable companies. The question comes from several sources – a newly-formed ISP that is nervous about competing against a giant company, a rural ISP that is entering a larger market to compete, or investors thinking of funding a new ISP. These folks are rightfully nervous about competing against the big cable companies. Comcast and Charter together have roughly 55% of all broadband customers in the country, so the assumption is that they are formidable competitors.

It’s more realistic to say that they are decent competitors. They have slick marketing materials to try to lure customers. They have persuasive online marketing campaigns to snag the attention of new customers. They have good win-back programs to try to keep customers from leaving them.

But the two big cable companies have one obvious weakness – their prices are significantly higher than everybody else in their markets. Every marketing push by these companies involves giving temporary low special prices to lure customers – but those prices eventually revert to much higher list prices.

There is a great example of this in the market today. Both Verizon and T-Mobile have been adding large numbers of broadband customers to their fixed wireless FWA products that deliver home broadband using cellular spectrum. The two cellular companies have been highly successful in the marketplace, adding over 2.6 million new broadband customers through the first three quarters of 2022, while Comcast and Charter added about half a million customers during that same time period – mostly at the start of the year.

The FWA wireless product is clearly competing on price. The FWA broadband is not as fast or robust as cable company broadband, but the prices are attractive to a lot of consumers. For example, T-Mobile offers 100 Mbps broadband for a $50 monthly fee for customers willing to use autopay – a price T-Mobile says will never increase. This is far below the prices of the cable companies, which are in the range of $90 per month for standalone broadband.

I thought I’d take a look at how Comcast is competing against the lower-price FWA products. Comcast has two special offers in January 2023 for standalone broadband.

  • In a special offer that ends February 1, Comcast will provide 400 Mbps broadband for $30 per month, which requires autopay. The special requires a one-year contract, but the special price extends for two years (meaning that if a customer terminates during the first year, they have to pay for the remaining months of the contract). The special price for this product was higher in the past and likely has been lowered to compete against FWA.
  • The other offer is ongoing and doesn’t end on February 1. Comcast will provide 800 Mbps download speeds for $60 per month, which requires autopay. This is also a two-year term, with the first year under a contract.

Comcast then adds hidden fees to the special price. Unless a customer brings their own modem, Comcast charges $15 per month for a WiFi modem, a price that was increased by $1 this month. In many markets, Comcast also has data caps, and customers that exceed 1.2 terabytes of usage per month are charged $10 for each additional 50 gigabytes of data used in a month.

For the 400 Mbps product, a customer who brings a modem and who doesn’t exceed the data cap will pay $30 per month using a bank debit and $35 per month using a credit card. Adding the Comcast WiFi modem (which most customers do) raises the monthly price to $45 or $50 – right in line with the T-Mobile FWA product. But the kicker comes at the end of the term when the price, before a cable modem, jumps to $92 per month, and $107 with the modem. The result at the end of the 800 Mbps special is similar, with the price rising to $97 per month before a WiFi modem. Anybody buying the special today must also worry about whatever rate increases Comcast adds to the base broadband price by 2025.
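To make the arithmetic concrete, here is a small sketch that tallies a monthly bill from the figures above: the promotional or post-promotion base price, the optional $15 modem rental, and the $10 charge for each 50 gigabytes used above the 1.2-terabyte cap. The function and its inputs are my own illustration, not a Comcast rate calculator.

    import math

    def monthly_bill(base_price, rents_modem=True, usage_gb=1000,
                     cap_gb=1200, modem_fee=15, block_fee=10, block_gb=50):
        """Estimate a monthly bill from the Comcast figures cited above."""
        bill = base_price + (modem_fee if rents_modem else 0)
        overage_gb = max(0, usage_gb - cap_gb)  # usage above the 1.2 TB cap
        bill += math.ceil(overage_gb / block_gb) * block_fee
        return bill

    # 400 Mbps special with a rented modem: $30 + $15 = $45
    print(monthly_bill(30))
    # The same connection after the promo ends, with 1,500 GB of monthly usage:
    # $92 + $15 + (6 overage blocks x $10) = $167
    print(monthly_bill(92, usage_gb=1500))

Even before any overage charges, the post-promotion price with a rented modem is more than double the T-Mobile FWA price.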

The special prices offered by the big cable companies are alluring – customers can get a significant discount for a year or two. But inevitably, the prices will skyrocket – and in the case of the 400 Mbps special will more than double at the end of the discounted special.

ISPs that compete against the big cable companies have learned that all they have to do to compete is to offer fair prices and wait out the specials. Over time, customers who get tired of the pricing yoyo will come around. ISPs with fiber tell me that customers that come to them from a cable company almost never go back to cable. Customers appreciate fair pricing with no games and a reliable broadband product that delivers the promised speeds – that’s how you compete against the big ISPs.

Competing with ChatGPT

I’ve been writing this blog daily since 2013, and writing it is the favorite part of my day. Writing the blog forces me to research and solidify my thoughts and opinions about various topics. But suddenly, I’m seeing headlines everywhere saying that ChatGPT will soon handle most writing and there will be no need for folks like me who write every day.

I was obviously intrigued and investigated the ChatGPT software. The latest 3.5 version of the software was launched by OpenAI in November 2022. OpenAI is a for-profit software firm that has been researching the field of artificial intelligence (AI) with the stated goal of developing friendly AI. It’s interesting that friendly is a key part of their mission statement because many AI industry pundits predict that AI will likely eventually compete with humans for resources, much like Skynet in the Terminator movies.

ChatGPT is written atop OpenAI’s third generation of software and is aimed at communicating in a written or conversational way so that a reader can’t tell the difference between the software and a human. The company has numerous investors, but Microsoft just offered to buy a 49% stake in the company for $10 billion. This instantly has me wondering when there will be a fee to use the software instead of the free version that is available now.

The press on ChatGPT has been over-the-top. I’ve seen articles comparing the impact of the launch of ChatGPT to other big events in web history, like the first web browser or the iPhone. Articles are touting that the software will mean that programmers will no longer have to write code, that students will no longer have to write papers, and that there will soon be no need for journalists (or bloggers!)

Early-generation AI writing software has been around for a few years and many baseball box scores and press summaries of quarterly earnings reports have been generated by software. These are writing tasks that are formulaic and repetitive, and I doubt that most folks noticed – although the software never captured the magic of a sports reporter like Shirley Povich, who I enjoyed reading every day for years in the Washington Post.

I had to give this platform a try. Was this software capable of writing something like this blog? If so, it would make me reconsider writing every day because if the software is that good there won’t be much need for human writers before long. As I was testing, I also considered the idea of using the software to get a jump start on a new piece of writing – the idea of seeing if the software could structure and organize an idea would be a time saver if the results were usable.

You can give complicated instructions to the software. You can provide the topic, the desired length of the end product, and describe the desired style of writing. I gave the software several topics to write about, and I was impressed with the speed of the process. The finished product is created almost as soon as you say go to the software.

But I was underwhelmed by the results. The sentences are grammatically perfect, and each paragraph has a topic and tries to make a point. Yet the end result was stilted, and some paragraphs were unreadable – I had to reread them several times to try to decipher the point (but for all I know, my readers have to do the same thing!).

The biggest flaw was that the writing was full of factual errors. That makes a lot of sense because the software distills what is written on the web when writing content. It takes the good and the bad, the factual and non-factual, and the easy-to-understand and obtuse writing that exists on the web and mashes it into a synthesis of what it finds. I realized that I would have to fact-check everything ChatGPT writes because the software has no way to discern what is true or untrue. Data scientists call this hallucination, and I read that ChatGPT currently has a hallucination rate of between 15% and 21%, meaning that it seems to make up that percentage of the facts in its writing.

I know there is instant hope among students that this software can churn out the dreaded school essay – but that doesn’t look likely. The software has been out for only two months, and I saw that a software engineer has already developed a program that can detect with more than 90% accuracy if something is written by a human or by ChatGPT. Students beware.

The day will likely come when the ChatGPT writing gets better, but there is nothing in this software today that would make me consider giving up writing or even using this as a tool. The hallucination rate means I can’t trust it to be factual, so it’s not even worth using to create a kernel of an idea for a blog. Most importantly, the output is not readable – it’s all perfect English, but I couldn’t understand the point of about half of what it wrote for me. If my blogs are going to be unreadable, I want the obtuseness to be fully human-generated!

The Slow Death of Operator Services

Starting this month, AT&T customers with digital telephones can no longer dial 411 to get information or dial 0 to get an operator. We’re seeing the beginning of the end of operator services. AT&T still offers these services on copper landlines but already cut them last year for cellular customers.

This is a big change for the telephone industry, although most people will barely notice. When the industry was new, all calls were processed by operators who sat at switchboards and made manual connections to connect a calling party to the called party. Over time the switching function got automated so that people could directly dial a telephone number.

But operator services remained a major part of the telephone business. Young folks who have grown up with the Internet can’t remember back to a time, not that long ago, when everybody dialed 411 from time to time. There was no way to find the phone number of a person or business outside your local area without calling an operator. Every landline customer got a fixed number of free 411 calls every month, and then you’d pay for the service. Calling to ask for a number didn’t always work because about 10% of homes paid extra to have an unlisted number, meaning it wasn’t published in the directory or given out by 411 operators. It was sometimes hard to find a number if you weren’t sure about the spelling of somebody’s name.

I remember how hard it used to be to find businesses in other cities. You’d have to guess how they were listed in the white pages, which was often different than the name you knew them by. A good directory assistance operator was worth their weight in gold. I can remember calling 411 before moving to a new city to connect my utilities. I’d have no idea of the name of the electric company or the number to call to connect water or telephone service. A good operator could get you the right numbers on one try.

There was a time when dialing 0 for operator was a big deal. The operator could help you with all sorts of things. The disabled could get the operator to dial a call for them. In the days before cell phones and long-distance deregulation, the phone companies helped millions of people every day place collect calls. If your car broke down, a collect call was how you called home for help.

The telephone operator played another important role in the community. Operators got calls from kids who were home alone, elderly who were lonely, and folks feeling suicidal. Particularly at non-busy hours, operators were known to provide a sympathetic ear. Operators were also the predecessor to 911 – you dialed 0 for fires, medical emergencies, and to report a crime in progress. See this ad from AT&T that promoted calling the operator.

I have a personal anecdote about the operator function. I worked in management at Southwestern Bell, and during a union strike, I was assigned to an operator center to work a switchboard. A few days into the strike, I got a call from a woman at a payphone whose husband was having a heart attack. It turns out that this payphone was located at the intersection of the borders of three municipalities, each with its own emergency services. I called the City of St. Louis first and was told that address was not in their jurisdiction. I also chose wrong on the next call, and it took three calls to connect to the right ambulance service. As you might imagine, this shook me since I knew time was of the essence in trying to save this man’s life. I never knew what happened to this gentleman because, in the life of the operator, I had to immediately move on to the next call. Operators somehow stayed sane while taking these kinds of calls every day.

Operator services have been diminishing in importance for many years. The introduction of 911 rerouted calls for emergencies directly to the right folks – although to this day, operators still get emergency calls. The Internet became the yellow pages, and you can find most, but not all, businesses online. The real death knell for operator services was the cellphone. I knew operator services would become obsolete after buying a flip phone that remembered every number I had ever called. I can’t recall the last time I called an operator, and I wouldn’t be surprised if it was 20 years ago. But it’s been comforting to know that a friendly operator was there if I ever needed help.

Should You Be Benchmarking?

As recently as fifteen years ago, I was often asked by many of my clients to help them benchmark their ISP against their peers. By this, they wanted to know if they had the right number of employees for their customer base, if their revenues and expenses were in line with other similar ISPs, if they had too much expense from overheads, etc.

I always tried to help them, and I would gather statistics from other ISPs and share them with everybody who contributed information. But after a while, I found something that didn’t surprise me but which surprised most of my clients. It turns out that ISPs were not particularly comparable. They seemed to differ in most of the statistics that my clients wanted to understand. Interestingly, a lot of the folks with significantly different metrics considered themselves to be successful.

I started to dig into these differences, and that’s when I realized that, at least for relatively small ISPs, benchmarking and comparing metrics between peers had little relevance to overall success. What I found on deeper examination was something I already knew but could now prove – that every ISP is unique.

Some of the differences in metrics were due to the way ISPs purposefully operated.

  • ISPs differed significantly in their commitment to fix customer problems. Some of my clients didn’t react to customer outages after hours and on weekends, while some had a philosophy of not going home for the day until customer issues were resolved.
  • Some clients purposefully kept functions in-house rather than outsource for a lower cost due to a commitment to keep jobs for long-time employees.
  • Some ISPs set rates as low as possible to benefit the public, while others strove to maximize profits.
  • Some of my clients used software to automate processes as much as possible, while others kept the same methods in place that had worked for decades.
  • Some clients spent extra money to have pension plans and top-notch health insurance for employees while others were less generous.
  • Some of my clients were fully leveraged with debt to take advantage of growth opportunities, while others took pride in being debt free.
  • Some clients served highly rural areas where a truck roll to visit any customer was a huge time-eater.

These kinds of differences make it nearly impossible to make a side-by-side comparison of two ISPs, even ones with the same number of customers.

There was a time when rural independent telephone companies shared so many common characteristics that it was possible to gather benchmarking data that they found useful. But once these companies started to build networks outside of their regulated core areas, the companies changed. Growth outside the regulated footprint meant competing without subsidies and only tackling opportunities that looked both manageable and profitable.

Today there is such a wide variety of ISPs that it’s even more difficult to compare them. Is it even possible to compare an ISP associated with an independent telco, one started by an electric coop, one created by a municipality, and one that is an overbuilder not associated with any other business?

This is not to say that small ISPs don’t have a lot to learn from each other – but benchmarking is probably the least useful approach to understanding the business. ISPs all benefit from comparing technical solutions, software systems, marketing techniques, and how they market against large competitors. But none of that is benchmarking; it just means small ISPs benefit from sharing information with similar peers.