The Industry

Rural Cellular Coverage

2021 is going to go down in history as the year when the whole country finally started talking about rural broadband. The pandemic made it clear that many millions of households and even entire communities don’t have adequate broadband. Congress and the White House responded by funding billions of dollars for improved broadband in the ARPA legislation. We are perhaps edging closer to an infrastructure bill that will allocate tens of billions of additional dollars to fix a lot of the rural broadband divide.

An issue that seems to have fallen off the radar is rural cellular coverage. In 2020, the FCC halted its plans for a reverse auction for a $9 billion 5G Fund that was intended to improve rural cellular coverage. The program was put on hold after it became apparent that Verizon and others were reporting inaccurate cellular coverage data to the FCC – much like the problem we have with FCC broadband maps. In July of this year, the FCC started another round of data collection, so the plans for the 5G Fund must still be in play.

The topic has come to my attention recently as my consulting firm was doing several surveys in rural counties. In these surveys, between 30% and 50% of survey respondents said that they had little or no cellular coverage at their homes. In all of these cases, the FCC data from cellular carriers showed ubiquitous coverage throughout each county.

To be fair, the FCC reporting for cellular coverage is outdated. In its annual report to Congress, the FCC reports the percentage of homes and businesses in every county that can receive 5/1 Mbps LTE coverage. Ookla recently reported that millions of speed tests show that the average national cellular download speed in early 2021 was 76.6 Mbps for AT&T, 82.4 Mbps for T-Mobile, and 67.2 Mbps for Verizon. The FCC is still reporting on a cellular speed that is far slower than the speed that a majority of cellular customers can receive – much in the manner that it keeps focusing on 25/3 Mbps as the definition of broadband.

But it’s really troublesome when the FCC reports to Congress that whole counties can get 5/1 Mbps cellular broadband when residents tell us they have almost no cellular coverage. It’s hard to understand why members of Congress who live in some of these rural counties don’t push back on the FCC.

I’ve heard the same kinds of stories about rural cellular coverage that I’ve heard about rural broadband. People say that coverage peters out during their commute home, and they have no coverage in their neighborhood. I heard the other day from somebody who told me they only get weak cellular coverage in one tiny part of their backyard – anybody my age remembers wandering around the corners of airports trying to find that magic spot.

I would venture to say that almost everybody reading this blog knows of cellular dead spots. I live in the center of a city, and there is no Verizon coverage at my house – yet there is good Verizon coverage at the other end of my block. I have relatives that live in a nearby older suburb who have weak and intermittent cellular coverage for the entire neighborhood from both AT&T and Verizon. Every Uber driver in this area can paint a pretty accurate picture of the location of cellular dead zones – and there are a lot of them.

Cellular dead zones are largely a matter of geography and topography. I don’t know if it’s realistic to think that Verizon should be required to show the dead zone at my house on its cellular coverage maps. But I wonder how many people buy homes in hilly cities like mine, only to find that cellphones don’t work. I decided to not make an offer to buy a home here when my cellphone didn’t work at the house.

The cellular problems in rural areas are much more of a concern than my half-block of bad coverage. In rural counties, poor coverage comes from the combination of two issues – distance from a tower and terrain. In cities, there are cell towers within a mile or two of everybody, and with the introduction of small cells, even closer in many cases. In rural counties, there might only be a few cell sites, and many residents live more than two miles from the nearest tower.

The Industry

Local is Better

I am lucky enough to live in a place that is ideal for growing vegetables. I’m up in the mountains, where temperatures are moderate, and we have an average of three inches of rain every month all year. For the past several years, I have bought a big box of fresh produce every week from two local farms. I have never eaten so well, and I am comforted by knowing exactly where my vegetables were raised. I consider myself lucky to even have this option.

Most people in the country buy broadband from a handful of giant ISPs – the four largest ISPs serve more than three-fourths of all broadband customers in the country. But some people are lucky enough to benefit from great local ISPs in the same way that I benefit from being near local vegetable farms.

There is a wide variety of local ISPs that includes independent telephone companies, telephone cooperatives, municipal ISPs, electric cooperatives, independent fiber overbuilders, and WISPs. Not all local ISPs are great – but from the stories I hear around the country, most people served by local ISPs feel lucky to have the local option.

There was a time 75 years ago when the federal government acknowledged that local is better. When it became clear that the big commercial electric companies were never going to build and serve in rural America, somebody in the federal government came up with the brilliant idea of letting communities build and operate their own electric companies through cooperatives. This didn’t cost the government much since the federal electrification plan provided long-term loans through what is now the Rural Utilities Service, and the cooperatives repaid the government for funding rural electric networks.

We’re in the process of handing out billions in federal grants to build rural broadband networks, and there isn’t one shred of localism in the new federal programs. Instead, the government is handing out huge grants that are often lucrative enough to attract the biggest ISPs and speculative investors to build rural broadband.

Does anybody think that AT&T or Charter really wants to provide broadband in rural America? AT&T spent the last forty years milking the last vestiges of profits out of rural networks while making zero investments. Charter and other big ISPs are not going to like the much higher operating costs in rural areas that come from long truck rolls and the increased maintenance costs of keeping up with huge numbers of rural road miles.

The big federal grants are providing an incentive for big ISPs or speculative investors to build fiber networks because the grants make it possible to earn high returns. It’s impossible to believe that a big ISP like AT&T is going to provide the same level of customer service, repairs, and network maintenance as would be done by a local ISP. I don’t need a crystal ball to know that there will be a huge difference in twenty years between a fiber network built today by a giant ISP and one built by a rural cooperative. Without future grants, AT&T won’t invest in future electronics upgrades, and the company won’t do the routine maintenance needed to keep the network in good working order. Cooperative fiber networks will be humming along like brand new while the big ISP networks will already be showing signs of age.

The electrification program didn’t include grants, and the newly formed cooperatives eventually repaid the federal government for lending them the money to build power grids. I can’t imagine that the federal government has ever made a better investment or gotten a better return than it did from the electrification loans – the only things that come even close are Interstate highways and the G.I. Bill that sent millions to college after WWII.

In a real slap-down against localism, current federal broadband grant programs are actually stacked against small local ISPs. Federal grants expect a grant recipient to prove upfront that it has the matching funds to pay for the portion of the project not funded from a grant. Small ISPs typically don’t have the kind of balance sheets that traditional banks are looking for, and I know small ISPs that are unable to get grants due to the inability to raise the matching funds. And forget about starting a new cooperative – the grants and banks are not interested in helping start-ups.

It’s a shame that we forgot about the most successful program imaginable for building rural networks. Forty years from now, we are going to see that many of the areas where big ISPs get grant money today will have gone to hell and will need federal funding relief again. But rural markets operated by good local ISPs will be humming nicely along in forty or a hundred years from now. Broadband is one of the areas where local really is better. We all know this, but little guys don’t get a say in writing grant rules.

The Industry

5G for Cars – An Idea that Won’t Die

An industry group calling itself 5G Americas has published a whitepaper that touts the advantages of a smart auto grid powered by 5G and C-V2X technology. C-V2X is the car connectivity standard that much of the industry has gelled around, replacing the older DSRC standard.

Over a decade ago, the FCC became so enamored with the idea of self-driving cars that the agency dedicated the 5.9 GHz spectrum band for the sole use of smart cars. The picture painted to the FCC at the time was the creation of a 5G network along roadways that would communicate with self-driving cars. As engineers experimented with smart cars, they quickly came to understand that the latency involved in making real-time driving decisions in the 5G cloud was never going to be low enough for the split-second decisions we constantly make while driving. Last year, the FCC halved the amount of bandwidth available for smart cars but didn’t eliminate the allocation entirely.
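The latency problem is easy to make concrete with back-of-the-envelope arithmetic. The speed and round-trip figures below are illustrative assumptions, not measurements from any real network:

```python
# How far a car travels while waiting on a network round trip.
# The latency values here are illustrative assumptions, not measured data.

def feet_traveled(speed_mph: float, latency_ms: float) -> float:
    """Distance covered (in feet) during one network round trip."""
    feet_per_second = speed_mph * 5280 / 3600
    return feet_per_second * latency_ms / 1000

# At highway speed, even a 100 ms cloud round trip means the car has
# moved roughly half a car length before a decision can come back.
for latency in (20, 50, 100):
    print(f"{latency} ms at 70 mph -> {feet_traveled(70, latency):.1f} ft")
```

A car at 70 mph covers better than ten feet during a 100 ms round trip, which is why the industry concluded that split-second driving decisions have to be made onboard the vehicle.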

This whitepaper still envisions the concept of a ubiquitous wireless network supporting smart cars. It’s not entirely surprising when looking at the companies that make up 5G Americas – AT&T, Ciena, Cisco, Crown Castle, Ericsson, Intel, Liberty Latin America, Mavenir, Nokia, Qualcomm, Samsung, Shaw Communications, T-Mobile, Telefónica, VMware and WOM. These companies would stand to make a lot of money on the idea if they could talk the government into funding the needed wireless network along roads.

There are still some interesting ideas suggested by the whitepaper. There are a lot of benefits to car-to-car communications. A car can be alerted when a neighboring car wants to change lanes or wants to pass. Drivers could peek into a camera of the car in front of them before trying to pass. Drivers can be alerted about a host of hazards, such as a car running a red light or patches of ice on the road ahead.

Most cars today already include a lot of safety features that weren’t imagined twenty years ago, and the benefits envisioned by C-V2X technology sound like the next generation of safety features that car manufacturers are likely to embrace.

But this whitepaper doesn’t give up on a wireless network positioned along roads to communicate with vehicles. It refers to this as an intelligent transportation system (ITS), which would consist of a system of sensors and communications devices along roads specifically designed to communicate with vehicles. The paper touts additional benefits from a wireless network such as communications between cars and traffic lights and smart parking systems in cities.

Much of this whitepaper could have been written over a decade ago and probably was. The benefits are the same ones that have been discussed for years, although there has been some progress in developing the chips and the technology that could enable smart vehicles.

But the one thing that is largely skipped over in the paper is who pays for the infrastructure to support this. The paper suggests a collaboration between roadbuilders (federal, state, and local governments) and the cellular carriers. There is also an allusion to offering such amazing new features that car owners will pony up for a subscription to use the technology. My guess is that the real purpose of this whitepaper is to lobby Congress for grant funding for roadside networks. The paper largely suggests that government should pay for the 5G infrastructure along roads while the cellular carriers collect any subscription revenues.

The benefits touted by the paper all sound worthwhile. It would be nice to feel safe when passing another vehicle. It would be nice if your car could automatically be directed to the nearest parking place to your planned destination. But it’s hard to think those benefits are enough to entice governments to pony up for the needed infrastructure. Most of the roads in America are funded by local and county governments, and most of the roads outside of major cities are lightly traveled. I imagine most counties would laugh at the idea of funding this when many of these same counties don’t yet have broadband to homes.

If enough cars are equipped with the chips to enable this technology, a few major metropolitan areas might consider the idea. But therein lies the chicken-and-egg question – will a city consider an investment in the technology before most cars have the chips, and will carmakers spend the money to install the chips before there are real-world places where this will work?

I hope that the car industry is pursuing the car-to-car communications ideas. That technology could enable most of the safety aspects touted by this whitepaper without investing in the external cellular network. The chipmakers can still make a lot of money by improving car safety. This idea of having a ubiquitous 5G network along roads is never going to be practical, but it’s an idea that seemingly will never go away.

The Industry

Explaining Growth in Broadband Demand

I haven’t talked about the growth of broadband usage for a while. I was explaining the exponential growth of broadband usage to somebody recently and suddenly realized an easy way to put broadband growth into context.

The amount of data used by the average broadband user has been doubling roughly every three years since the advent of the Internet. This exponential growth has been chugging along since the earliest dial-up days, and we’re still seeing it today. Consider the following numbers from OpenVault showing the average monthly U.S. household broadband usage:

1st Quarter 2018          215 Gigabytes
1st Quarter 2019          274 Gigabytes
1st Quarter 2020          403 Gigabytes
1st Quarter 2021          462 Gigabytes

Average household usage more than doubled in the three years from 2018 to 2021. That growth works out to a compounded rate of 29% annually. That’s a little faster than the historical trend, probably due to the pandemic – in the decade before the pandemic, the compounded annual growth rate was around 26%.
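The 29% figure falls right out of the OpenVault numbers above:

```python
# Compute the compounded annual growth rate implied by the OpenVault data.
usage_2018 = 215  # GB/month, 1st quarter 2018
usage_2021 = 462  # GB/month, 1st quarter 2021
years = 3

cagr = (usage_2021 / usage_2018) ** (1 / years) - 1
print(f"Compounded annual growth: {cagr:.1%}")  # about 29%
```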

What does this kind of growth mean? One way to think about broadband growth is to contemplate what growth might mean in your own neighborhood. Suppose you are served by DSL or by a cable company using HFC technology. If your ISP has the same number of customers in your neighborhood now as in 2018, the local network is now twice as busy, carrying twice as much traffic as just three years earlier. If your ISP hasn’t made any upgrades in that time, the chances are that you can already see some signs of a stressed network. Perhaps you notice a slowdown during the evening prime-time hours when most of the neighborhood is using broadband. You’ve probably run into times when it was a challenge making or maintaining a Zoom call.

To the average person, this kind of broadband traffic growth might not seem like a big deal, because they probably assume that ISPs are doing magic in data centers to keep things working. But any network engineer will tell you that a doubling of traffic is a big deal. That kind of growth exposes the bottlenecks in a network where things get bogged down at the busiest times.

The most interesting way to put broadband growth into perspective is to look into the future. Let’s say that the historical 26% growth rate continues into the future. There is no reason to think it won’t because we are finding more ways every year to use broadband. If broadband keeps growing at the historical rate, then in ten years your neighborhood network will be carrying ten times more traffic than today. In twenty years it will be carrying one hundred times more traffic than today.
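The ten-times and hundred-times figures are just compounding at work, as a two-line check shows:

```python
# Project future traffic as a multiple of today's traffic,
# assuming the historical 26% annual growth rate continues.
growth = 1.26
print(f"In 10 years: {growth ** 10:.1f}x today's traffic")  # about 10x
print(f"In 20 years: {growth ** 20:.0f}x today's traffic")  # about 100x
```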

When you think of growth in this manner, it’s a whole lot easier to understand why we shouldn’t be funding any technologies with grant money today that won’t be able to keep up with the normal expected growth in broadband traffic. Looking at growth from this perspective explains why AT&T made the decision last year to stop selling DSL. Understanding the normal growth rate makes it clear that it was idiotic to give CAF II funding to Viasat. Expected growth might be the best reason to not give RDOF subsidies to Starlink.

I have nothing against Starlink. If I still lived in a rural area, I would have been one of the first people on the list for the beta test. But there are already engineers I respect who believe that the Starlink network will struggle if the company sells to too many customers. If that’s even just a little bit true today, then how will Starlink perform in ten short years when the traffic will be ten times higher? And forget twenty years – Starlink is at heart a wireless network, and there are no tweaks to a wireless network that will ever handle a hundred-fold increase in traffic. If Starlink is still viable in twenty years, it will be because it took the same path as Viasat and imposed severe data caps to restrict usage or else raised rates to restrict the number of customers on the network.

I take flak every time I say this, but if I were in charge of grant programs, I wouldn’t fund anything other than fiber. I can’t think of any reason why we would fund any technology that doesn’t have a reasonable chance to still be viable in ten short years when broadband usage will likely be ten times higher than today. I would hope that a government-funded network will still be viable in twenty years when traffic volumes are likely to be one hundred times greater than today. If we don’t get this right, then we’re going to be talking about ways to build rural fiber a decade from now when other technologies crash and burn.

The Industry

A Fiber Land Grab?

I was surprised to see AT&T announce a public-private partnership with Vanderburgh County, Indiana to build fiber to 20,000 rural locations. The public announcement of the partnership says that the County will provide a $9.9 million grant, and AT&T will pick up the remaining $29.7 million investment.

The primary reason this surprised me is that it is a major reversal in direction for AT&T. The company spent the last thirty years working its way out of rural America, capped by an announcement in October 2020 that the company would no longer connect new DSL customers. AT&T has publicly complained for years about the high cost of serving rural locations and has steadily cut its costs in rural America by slashing business offices and rural technicians. It’s almost shocking to see the company dive back in as a last-mile ISP in a situation that means long truck rolls and higher operating costs.

I’m sure it was the County grant that made AT&T consider this, but even that is surprising since the County is only contributing 25% of the funding. I’ve created hundreds of rural business plans, and most rural builds need grants of 40% or even much more to make financial sense. I assume that there is something unique about this county that makes that math work. AT&T and other telcos have one major advantage for building fiber that might have come into play – they can overlash fiber onto existing copper wires at a fraction of the cost of any other fiber builder, so perhaps AT&T’s real costs won’t be $29.7 million. Obviously, the math works for AT&T, and another county will be getting a rural fiber solution.
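For what it’s worth, the announced funding split works out exactly as described:

```python
# The announced funding split for the Vanderburgh County project.
county_grant = 9.9   # $ millions, county contribution
att_match = 29.7     # $ millions, AT&T's share

total = county_grant + att_match
grant_share = county_grant / total
print(f"Total project: ${total:.1f}M, grant share: {grant_share:.0%}")
```

That 25% grant share is well below the 40%-plus that most rural builds need, which is why the economics of this particular county are so intriguing.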

AT&T is not alone in chasing rural funding. We saw Charter make a major rural play in last year’s RDOF reverse auction. The RDOF reverse auction also attracted Frontier and Windstream, and both of these companies have made it clear that pursuing fiber expansion opportunities and pursuing grants are a key part of their future strategic plan.

My instincts are telling me that we are about to see a fiber land grab. The big ISPs other than Verizon had shunned building fiber for decades. When Verizon built its FiOS network, every other big ISP said they thought fiber was a bad strategic mistake by Verizon. But we’ve finally reached the time when the whole country wants fiber.

This AT&T announcement foreshadows that grant funding might be a big component of a big ISP land grab. The big ISPs have never been shy about taking huge federal funding. I wouldn’t be surprised if the big ISPs are collectively planning in boardrooms to grab a majority of any big federal broadband grant funding program.

I think there is another factor that has awoken the big ISPs, which is also related to a land grab. Consider Charter. If they look out a decade or two into the future, they can see that rural fiber will surround their current footprint if they do nothing. All big ISPs are under tremendous pressure from Wall Street to keep growing. Charter has thrived for the last decade with a simple business plan of taking DSL customers from the telcos. It doesn’t require an Ouija board to foresee the time in a few years when there won’t be any more DSL customers to capture.

I’m betting that part of Charter’s thinking for getting into the RDOF auction was the need to grab more geographic markets before somebody else does. Federal grant money makes this a lot easier to do, but without geographic expansion, Charter will eventually be landlocked and will eventually stop growing at a rate that will satisfy Wall Street.

Charter must also be worried about the growing momentum to build fiber in cities. I think Charter is grabbing rural markets where it can have a guaranteed monopoly for the coming decades to hedge against losing urban customers to competition from fiber and from wireless ISPs like Starry.

My guess is that the AT&T announcement is just the tip of the iceberg. If Congress releases $42 billion in broadband grants, the big companies are all going to have their hands out to get a big piece of the money. And that is going to transform the rural landscape in a way that I would never have imagined. I would have taken a bet from anybody, even a few years ago, that AT&T would never build rural fiber – and it looks like I was wrong.

The Industry

Forecasting Interest Rates and Inflation

This is a topic that I haven’t written about since I started my blog seven years ago because there hasn’t been a reason. We have just gone through a decade that benefitted from both low interest rates and low inflation – a rarity in historical economic terms.

Anybody building a broadband network can tell you they are seeing significant inflation in the prices of components needed to build a fiber network. There are some who shrug off current inflation as a temporary result of supply chain issues. To a large degree, they are right, but the inflation is real nonetheless. As someone who worked in the industry in past times of inflation, my experience is that prices never go back down to former levels. Even if all of the factors leading to current inflation are eventually solved, it’s unlikely that the companies that make conduits and handholes will ever go completely back to the old prices.

To some degree, the lack of inflation has spoiled us. As recently as a year ago, I knew that I could pull a business plan off the shelf from ten years ago, and it probably still made sense. All of the industry fundamentals from a decade ago were all roughly the same, and a business plan that worked then would still have worked.

I hate to say it, but those days of surety might be over for a while. The chart below is all-too-familiar to those of us who have been in the industry a long time. In the not-too-distant past, we saw periods of both high interest rates and high inflation. 1980 is not ancient history, and those of us who were in the industry at the time recall the jarring effect of both high interest rates and high inflation on telephone companies. This chart doesn’t go back to even worse times, like in 1971 when President Nixon ordered a nationwide freeze on wages and prices to try to stop runaway inflation. I remember seeing a talking head economist on a business show a few years ago who said that we now know how to beat inflation and that high inflation and high interest rates were never coming back to the U.S. economy. I had a good laugh because I knew this guy was a total idiot.

We now live in a global economy, and the U.S. doesn’t have any magic pill that somehow keeps us out of worldwide economic upheaval. As one example, West Africa is currently suffering from high inflation. The current inflation rate in Nigeria is 16%, down from over 20%. The Congo is one of the primary sources for metals like cobalt and tantalum that are essential for making things like computer chips and cellphones. When the price of raw materials from the Congo skyrockets, the industries that use those resources have no choice but to raise prices to compensate.

We don’t have to go back to ancient history to remember when we worried about interest rates. I worked with cities that were floating municipal bonds in the 2000s, and I recall times when they delayed selling bonds hoping that rates would be more favorable in the weeks or months to follow. One fiber project I was working with never was launched because the cost of interest on bonds grew larger than the project could support.
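A short amortization calculation shows why the interest rate at the moment bonds are sold can make or break a project. The $30 million principal and 25-year term below are illustrative assumptions, not figures from any actual bond issue:

```python
# Annual debt service on a level-payment bond, to show how sensitive a
# project is to the interest rate. Principal and term are illustrative.

def annual_payment(principal: float, rate: float, years: int) -> float:
    """Standard level-payment amortization formula."""
    return principal * rate / (1 - (1 + rate) ** -years)

principal = 30_000_000  # hypothetical bond issue
for rate in (0.03, 0.05, 0.07):
    print(f"{rate:.0%}: ${annual_payment(principal, rate, 25):,.0f} per year")
```

Moving from 3% to 7% on the same bond adds well over $800,000 per year in debt service, which is easily enough to kill a marginal fiber business plan.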

Everybody who builds financial forecasts for broadband businesses is in a quandary. How do we reflect the rising costs for materials and labor? How can anybody forecast the cost to build fiber two, three, or five years from now? We look out over the next ten years and see an industry that wants to grow faster than the support structure for the industry is ready to handle. Companies like Corning have difficult decisions to make. The company could likely sell twice as much fiber as in recent years if it had more factories. But does it dare build those factories? A factory is a fifty-year investment, and does the company want to have huge idle capacity a decade from now when the fiber craze naturally slows down? Every manufacturer in the industry is having a similar conversation, but nobody knows the calculus for figuring out the right answer. And that calculus will get much harder if we see the return of both inflation and higher interest rates.

Interest rates are going to have to increase at some point. The rates have been held below the natural market as a monetary strategy to fuel the economy. But the Federal Reserve signaled a few weeks ago that it foresees six to seven interest rate increases over the next two years.

I don’t mean for this blog to be gloom and doom. For most of my career, I’ve dealt with both inflation and interest rates when making financial forecasts. The last decade spoiled me like it spoiled many of us, and we need to readjust the way we think about the future and figure out how to deal with an economic world that is returning to normal.

The Industry

Fixing the Supply Chain

Almost everybody in the broadband industry is now aware that the industry is suffering supply chain issues. ISPs are having problems obtaining many of the components needed to build a fiber network in a timely manner, which is causing havoc with fiber construction projects. I’ve been doing a lot of investigation into supply chain issues, and it turns out the supply chain is a lot more complex than I ever suspected, which means it’s not going to be easy to get the supply chain back to normal.

One of the supply chain issues that is causing problems throughout the economy is the semiconductor chip shortage. Looking at just this one issue demonstrates the complexity of the supply chain. A similar story can be told about other supply chain issues like fiber and conduit. Consider all of the following issues that have accumulated to negatively impact the chip supply chain:

  • Intel Stumbled. Leading into the pandemic, Intel stumbled in its transition from 10-nanometer to 7-nanometer chips. This created delays in manufacturing that led many customers to look to other manufacturers like AMD. Changing chip manufacturers is not a simple process since a chip manufacturer must create a template for any custom chip – a process that normally takes 4 to 6 months. Chip customers found themselves caught in the middle of this transition as the pandemic hit.
  • Demand for Specific Chips Changed. Chipmakers tend to specialize in specific types of chips, and they shift gears in anticipation of market demand. Before the pandemic, the makers of DRAM and NAND memory chips had curbed production due to declining sales of smartphones and PCs. When the pandemic caused a spike in demand for those devices, the chipmakers had already shifted to producing other kinds of chips.
  • Labor Issues. Chipmakers were like every other industry with shutdowns due to COVID outbreaks. And like everybody else, the chipmakers had labor shortages due to workers who were unable or unwilling to work during the pandemic.
  • Local Issues. Every industry suffers from temporary local issues, but these issues were far more disruptive than normal during the pandemic. For example, an extended power outage crippled Taiwan’s TSMC. A fire knocked out a factory of auto chipmaker Renesas.
  • A Spike in Demand. One of the consequences of the pandemic has been a huge transition to cloud services, which caused an unexpected spike in the chips needed for data centers. Rental car companies maintained revenue during the pandemic by selling off their fleets of cars – the crunch to replace those cars is creating more temporary chip demand than the industry can supply.
  • Trade War. The ongoing trade issues between the U.S. and China have caused slowdowns in Chinese manufacturing. One estimate I saw said that as many as 40% of Chinese factories were shut during the peak of the pandemic.
  • Global Shipping Logjam. Getting shipped items through ports is taking as long as six weeks due mostly to labor shortages of port workers, ship crews, and truckers. This doesn’t affect just the final chips being shipped but also the raw materials used to make or assemble chips.
  • Raw Material Shortages. The world has tended to lean on single markets for raw materials like lithium, cobalt, nickel, manganese, and rare earth metals. The Brookings Institution says that the pandemic has caused delays and shortages of thirteen critical metals and minerals.
  • Selective Fulfillment. Overseas chipmakers like Taiwan’s TSMC and Korea’s Samsung, along with equipment suppliers like the Netherlands’ ASML, chose to satisfy domestic and regional demand before global demand in places like the U.S.
  • Receive-as-Needed Logistics. Over the last decade, many manufacturers have adopted just-in-time logistics, with materials and parts appearing at the factory only as needed. I recall manufacturers that bragged about having components delivered only an hour before use on the factory floor. Anybody using this logistics method was stopped dead during the pandemic, and many companies are reexamining logistics strategies.

I suspect this list just touches the tip of the iceberg and that there are probably a dozen more reasons why chips are in short supply. Unfortunately, every major industry has a similar list. It’s not going to be easy for the world to work its way out of all of this because the problems in any one industry tend to impact many others. I’ve read opinions of optimists who believe we’ll figure all this out in 2022, while others say some of these issues are going to nag us for years to come.

The Industry

An Update on Robocalling

The FCC has taken a number of actions against robocalling over the last year to try to tamp down on the practice, which every one of us hates. I’ve had the same cellular phone number for twenty-five years, and I attract far more junk calls every day than legitimate business calls.

The FCC has taken a number of specific actions, but so far this hasn’t made a big dent in the overall call volume. Actions taken so far include:

  • The FCC issued cease-and-desist letters to some of the biggest robocallers. For example, in May of this year, the agency ordered VaultTel to stop placing robocalls.
  • The FCC has been fining telemarketers with some of the biggest fines ever issued by the agency. This includes a $225 million fine against a Texas-based health insurance telemarketer for making over one billion spoofed calls. There have been other fines such as $120 million against a Florida time-share company and $82 million against a North Carolina health insurance company.
  • The FCC is hoping that its program for caller ID verification will tamp down significantly on the robocalls. This process, referred to as STIR/SHAKEN, requires that underlying carriers verify that a call is originating from an authorized customer. The new protocol has already been implemented by the big carriers like AT&T, but smaller carriers were given more time. The FCC noted recently that it has seen a big shift of robocalling originating from smaller carriers that are not yet part of STIR/SHAKEN.
  • The agency has begun coordinating efforts with law enforcement to track down and arrest robocallers who continue to flout the rules. That includes working with the U.S. Justice Department and state attorneys general.
  • The FCC also gave telephone companies permission to ‘aggressively block’ suspected robocalls. The agency has also encouraged telephone companies to offer advanced blocking tools to customers.
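The caller-ID verification described above can be sketched in miniature. Under STIR/SHAKEN, the originating carrier attaches a signed token (a "PASSporT") to the call so that the terminating carrier can check who vouched for the caller ID. This is a simplified illustration only: real deployments sign with ES256 and carrier certificates managed by a governance authority, while this stand-in uses an HMAC shared secret so it runs with just the standard library, and the function names are my own.

```python
# Simplified sketch of the STIR/SHAKEN idea: the originating carrier signs
# a token asserting the caller ID; downstream carriers verify the signature
# before trusting it. (Real SHAKEN uses ES256 + certificates, not HMAC.)
import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    # JWT-style base64url encoding without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_call(orig_tn: str, dest_tn: str, secret: bytes) -> str:
    """Originating carrier builds a PASSporT-like token for one call."""
    header = {"alg": "HS256", "typ": "passport", "ppt": "shaken"}
    payload = {
        "attest": "A",                # full attestation: carrier knows this customer
        "orig": {"tn": orig_tn},      # the calling number the carrier vouches for
        "dest": {"tn": [dest_tn]},
        "iat": int(time.time()),
    }
    signing_input = b64url(json.dumps(header).encode()) + "." + \
                    b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_call(token: str, secret: bytes) -> bool:
    """Terminating carrier checks the signature before trusting caller ID."""
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

secret = b"shared-demo-key"
token = sign_call("+12025551234", "+13125556789", secret)
assert verify_call(token, secret)                   # legitimate call verifies
assert not verify_call(token[:-2] + "xx", secret)   # tampered token fails
```

A spoofed call from a carrier that never signed it simply has no valid token to verify, which is why the FCC cares so much about the smaller carriers that haven't yet implemented the protocol.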

So far, the FCC actions haven’t made a big dent in robocalling. In 2020, we saw about 4 billion robocalls per month. The robocallers picked up the pace of calling in anticipation of getting shut down, and in March of this year, there were over 4.9 billion robocalls placed. In the most recently completed month of August, we still saw 4.1 billion robocalls. It appears that the robocallers have simply shifted their methods and are able, at this point, to avoid the STIR/SHAKEN restrictions from the big carriers. Hopefully, a lot of this will get fixed when that protocol is mandatory for everybody. The FCC recently announced that it was accelerating the implementation date for a list of carriers that the agency says are originating a lot of the robocalls.

The FCC knew from the start that this wasn’t going to be easy. The process of generating robocalls is now highly mechanized, and a few companies can generate a huge volume of calls. Apparently, the profits are lucrative enough for robocallers to risk the big FCC fines. When I searched Google for the keywords ‘robocaller’ and ‘FCC’, the first result was a company that is still selling robocalling services.

We saw the same thing a few years ago with access stimulation, where a few unscrupulous companies and carriers were making big dollars from generating huge volumes of bogus calls in order to bill access charges.

Hopefully, the FCC can eventually put a big dent in robocalling. It’s getting to the point where hardly anybody is willing to answer a phone call from a number they don’t recognize. Perhaps more giant fines and a few major convictions will convince the robocalling companies that it’s not worth it.

The Industry

The Pandemic and the Internet

Pew Research Center conducted several polls asking people about the importance of the Internet during the pandemic. The Pew survey report is seven pages filled with interesting statistics and is a recommended read. This blog covers a few of the highlights.

The Overall Impact of the Internet. 58% of adults said that the Internet was essential during the pandemic – that’s up from 52% in April of 2020. Another 33% of adults say the Internet was important but not essential. Only 9% of adults said the Internet wasn’t important to them. The importance of the Internet varied by race, age, level of education, income, and location.

  • As might be expected, 71% of those under 30 found the Internet to be essential compared to 38% of those over 65.
  • 71% of those with a college degree found the Internet to be essential versus 45% of those with a high school degree or less.
  • 66% of those in the upper third of incomes found the Internet to be essential compared to 55% of those in the lower third.
  • 61% of both urban and suburban residents found the Internet to be essential compared to 48% for rural residents.

Video Calling Usage Exploded. Possibly the biggest overall change in Internet usage has been the widespread adoption of video calling. 49% of adults made a video call at least once per week, with 12% doing so several times per day. The usage was most pronounced for those who work from home, with 79% making a video call at least once per week and 35% connecting multiple times per day.

Longing for a Return to Personal Interactions. Only 17% of Americans say that digital interactions have been as good as in-person contacts, while 68% say digital interactions are useful but no replacement for in-person contacts.

Challenges with Online Schooling. Only 18% of households said that online schooling went very well, with 45% saying it went somewhat well. 28% of households reported it was very easy to use the technology associated with online schooling, with another 42% saying it was somewhat easy. Twice as many people in the lower one-third of incomes said online schooling technology was difficult as in the upper one-third of incomes. Nearly twice as many people in rural areas found online schooling technology to be a challenge compared to suburban residents.

Problems with Internet Connections. 49% of all survey respondents said they had problems with their Internet connection during the pandemic. 12% experienced problems often.

Upgrading Internet. 29% of survey respondents said they did something to improve their Internet connection during the pandemic.

Affordability. 26% of respondents said they are worried about the ability to pay home broadband bills. This was 46% among those in the lower one-third of incomes.

Tech Readiness. 30% of Americans say they are not confident using computers, smartphones, or other connected electronics. This was highest for those over 75 (68%), those with a high school degree or less (42%), and those in the lower one-third of incomes (38%).

The Industry

You’ve Got Mail

I’ve always been intrigued by the history of technology, and I think a lot of that is due to having almost everything computer-related happen during my lifetime. I missed a tech anniversary earlier this year when email turned 50.

It was April 1971 when software engineer Ray Tomlinson first used the @ symbol as the key to route a message between computers within ARPANET, the birthplace of the Internet. Tomlinson was working on a project at the U.S. Advanced Research Projects Agency that was developing a way to facilitate communications between government computers. There had been transmission of messages between computers starting in 1969, but Tomlinson’s use of the @ symbol has been identified as the birth of network email.

ARPA became DARPA in 1972 when “Defense” was added to the agency’s name. DARPA kept a key role in the further development of email and created a set of email standards in 1973. These standards include things like having the “To” and “From” fields as headers for emails.

Email largely remained as a government and university protocol until 1989, when CompuServe made email available to its subscribers. CompuServe customers could communicate with each other, but not with the outside world.

In 1993, AOL further promoted email when every AOL customer was automatically given an email address. This led to the “You’ve got mail” slogan, and I can still hear the AOL announcement in my head today.

In 1996, Hotmail made a free email address available to anybody who had an Internet connection. Millions of people got email addresses, and the use of email went mainstream. If you ever used Hotmail, you’ll remember the note at the bottom of every email that said, “P.S. I love you. Get your free email here”. Hotmail was purchased by Microsoft in 1997 and was morphed over time into Outlook. This was one of the first big tech company acquisitions, at $400 million, which showed that huge value could be created by giving away web services for free.

In 1997, Yahoo launched a competing free email service that gave users even more options.

In 2004, Google announced its free Gmail service with the announcement that users could have a full gigabyte of storage, far more than anybody else offered.

Over the years, there have been many communications platforms launched that promised to displace email. This includes Facebook Messenger, WeChat, Slack, Discord, and many others. But with all of these alternate ways for people to communicate, email still reigns supreme and usage has grown every year since inception.

There are over 300 billion emails generated every day. Gmail alone has 1.6 billion email addresses, representing 20% of all people on the planet. In the workplace, the average American employee sends 40 emails each day and receives 121.

The beauty of email is its simplicity. It can work across any technology platform. Messages are built from plain text headers, with MIME formatting used to add attachments. Routing is done with SMTP (Simple Mail Transfer Protocol), which allows messages to be sent to anybody else in the world.
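That simplicity is easy to see in code. The sketch below builds a message with the familiar “To” and “From” headers, a body, and an attachment, then shows where the SMTP handoff would happen. The addresses and server name are hypothetical placeholders, and the actual send is commented out since it needs a live mail server.

```python
# Minimal sketch of what an email actually is: readable text headers
# ("To", "From", "Subject") plus MIME parts for the body and attachments,
# handed to an SMTP server for routing.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "ray@example.com"      # the @ sign still does the routing
msg["To"] = "doug@example.net"
msg["Subject"] = "Happy 50th, email"
msg.set_content("Fifty years of messages between computers.")

# An attachment is just one more MIME part in the same message:
msg.add_attachment(b"hello", maintype="application",
                   subtype="octet-stream", filename="note.bin")

print(msg["To"])  # headers remain plain, human-readable text
# smtplib.SMTP("smtp.example.com").send_message(msg)  # the SMTP handoff
```

Those few lines are essentially the same structure DARPA standardized in 1973, which is a big part of why email has outlived so many would-be replacements.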

On the downside, the ease of email has spawned spam, as marketers found they could sell even the most bizarre products if they sent enough emails. In recent times, emails have been used to implant malware on a recipient’s computer when they unwittingly open attachments.

One downside for the future of email is that many Americans under 30 hate using it. We’ll have to see over time if email gets displaced, but it would be a slow transition.

But email is still a powerful tool that is ingrained in our daily lives. Email was one of the early features that lured millions into joining the web. So happy birthday, email.