Fixing the Supply Chain

Almost everybody in the broadband industry is now aware that the industry is suffering supply chain issues. ISPs are having problems obtaining many of the components needed to build a fiber network in a timely manner, which is causing havoc with fiber construction projects. I’ve been doing a lot of investigation into these issues, and it turns out the supply chain is a lot more complex than I ever suspected, which means it’s not going to be easy to get things back to normal.

One of the supply chain issues that is causing problems throughout the economy is the semiconductor chip shortage. Looking at just this one issue demonstrates the complexity of the supply chain. A similar story can be told about other supply chain issues like fiber and conduit. Consider all of the following issues that have accumulated to negatively impact the chip supply chain:

  • Intel Stumbled. Leading into the pandemic, Intel stumbled in its transition from 10-nanometer chips to 7-nanometer chips. This created delays in manufacturing that led many customers to look to other manufacturers like AMD. Changing chip manufacturers is not a simple process since a chip manufacturer must create a template for any custom chip – a process that normally takes 4 – 6 months. Chip customers found themselves caught in the middle of this transition as the pandemic hit.
  • Demand for Specific Chips Changed. Chipmakers tend to specialize in specific types of chips, and they shift gears in anticipation of market demand. Before the pandemic, the makers of DRAM memory and NAND flash chips had curbed production due to declining sales of smartphones and PCs. When the pandemic caused a spike in demand for those devices, the chip makers had already shifted to producing other kinds of chips.
  • Labor Issues. Chipmakers were like every other industry with shutdowns due to COVID outbreaks. And like everybody else, the chipmakers had labor shortages due to workers who were unable or unwilling to work during the pandemic.
  • Local Issues. Every industry suffers from temporary local issues, but these issues were far more disruptive than normal during the pandemic. For example, an extended power outage crippled Taiwan’s TSMC. A fire knocked out a factory of auto chipmaker Renesas.
  • A Spike in Demand. One of the consequences of the pandemic has been a huge transition to cloud services. This caused an unexpected spike in chips needed for data centers. Rental car companies maintained revenue during the pandemic by selling rental car stocks – the crunch to replace those rental cars is creating more temporary demand than the industry can supply.
  • Trade War. The ongoing trade issues between the U.S. and China have caused slowdowns in Chinese manufacturing. One estimate I saw said that as many as 40% of Chinese factories were shut during the peak of the pandemic.
  • Shipping Logjam. There is a global shipping logjam, with items taking as long as six weeks to get through ports, due mostly to labor shortages of port workers, ship crews, and truckers. This doesn’t affect just the final chips being shipped but also the raw materials used to make or assemble chips.
  • Raw Material Shortages. The world has tended to lean on single markets for raw materials like lithium, cobalt, nickel, manganese, and rare earth metals. The Brookings Institution says the pandemic has caused delays and shortages of thirteen critical metals and minerals.
  • Selective Fulfillment. Overseas chipmakers like Taiwan’s TSMC and Korea’s Samsung, along with equipment suppliers like the Netherlands’ ASML, chose to satisfy domestic and regional chip demand before global demand in places like the U.S.
  • Receive-as-Needed Logistics. Over the last decade, many manufacturers have shifted to a just-in-time process that has materials and parts arriving at the factory only as needed. I recall manufacturers that bragged about having components delivered only an hour before use on the factory floor. Anybody using this logistics method was stopped dead during the pandemic, and many companies are reexamining their logistics strategies.

I suspect this list is just touching the tip of the iceberg and that there are probably a dozen more reasons why chips are in short supply. Unfortunately, every major industry has a similar list. It’s not going to be easy for the world to work its way out of all of this because the problems in any one industry tend to impact many others. I’ve read opinions from optimists who believe we’ll figure all this out in 2022 and from others who say some of these issues are going to nag us for years to come.

An Update on Robocalling

The FCC has taken a number of actions against robocalling over the last year to try to tamp down on the practice, which every one of us hates. I’ve had the same cellular phone number for twenty-five years, and I attract far more junk calls every day than legitimate business calls.

The FCC has taken a number of specific actions, but so far this hasn’t made a big dent in the overall call volume. Actions taken so far include:

  • The FCC issued cease-and-desist letters to some of the biggest robocallers. For example, in May of this year, the agency ordered VaultTel to stop placing robocalls.
  • The FCC has been fining telemarketers with some of the biggest fines ever issued by the agency. This includes a $225 million fine against a Texas-based health insurance telemarketer for making over one billion spoofed calls. There have been other fines such as $120 million against a Florida time-share company and $82 million against a North Carolina health insurance company.
  • The FCC is hoping that its program for caller ID verification will tamp down significantly on the robocalls. This process, referred to as STIR/SHAKEN, requires that underlying carriers verify that a call is originating from an authorized customer (a minimal sketch of how that verification works follows this list). The new protocol has already been implemented by the big carriers like AT&T, but smaller carriers were given more time. The FCC noted recently that it has seen a big shift of robocalling originating from smaller carriers that are not yet part of STIR/SHAKEN.
  • The agency has begun to coordinate efforts with law enforcement to track down and arrest robocallers who continue to flout the rules. That includes working with the U.S. Justice Department and state attorneys general.
  • The FCC also gave telephone companies permission to ‘aggressively block’ suspected robocalls. The agency has also encouraged telephone companies to offer advanced blocking tools to customers.
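For readers curious about the mechanics, below is a minimal sketch of the signed “PASSporT” token that STIR/SHAKEN is built around, based on the publicly documented format: the originating carrier signs a small attestation saying it knows who placed the call, attaches it to the call setup, and the terminating carrier verifies the signature against the carrier’s certificate. The key, certificate URL, and phone numbers here are placeholders, not anything from a real deployment.

```python
# Minimal sketch of a SHAKEN PASSporT, the token an originating carrier signs
# to attest that a call really comes from an authorized customer.
# Requires: pip install pyjwt cryptography. All identifiers are placeholders.
import time
import uuid
import jwt  # PyJWT
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec

# A real carrier signs with the private key behind its STI certificate;
# here we just generate a throwaway key for illustration.
private_key = ec.generate_private_key(ec.SECP256R1())
pem = private_key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.PKCS8,
    serialization.NoEncryption(),
)

headers = {
    "ppt": "shaken",                                  # PASSporT extension for SHAKEN
    "typ": "passport",
    "x5u": "https://sti-ca.example.com/carrier.pem",  # placeholder certificate URL
}

payload = {
    "attest": "A",                    # "A" = full attestation: the carrier knows the
                                      # customer and that it may use this number
    "orig": {"tn": "12025551001"},    # calling number (placeholder)
    "dest": {"tn": ["12025551002"]},  # called number (placeholder)
    "iat": int(time.time()),          # issued-at timestamp
    "origid": str(uuid.uuid4()),      # opaque ID used for call traceback
}

# The signed token rides along with the call setup (the SIP Identity header);
# the terminating carrier fetches the certificate from x5u and verifies it.
token = jwt.encode(payload, pem, algorithm="ES256", headers=headers)
print(token)
```

The “attest” level is what matters in the robocall fight: an “A” means the carrier vouches that its own known customer is authorized to use the calling number, which is exactly the assurance a spoofed call can’t carry.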

So far, the FCC actions haven’t made a big dent in robocalling. In 2020 we saw about 4 billion robocalls per month. The robocallers picked up the pace of calling in anticipation of getting shut down, and in March of this year, there were over 4.9 billion robocalls placed. In the most recently completed month of August, we still saw 4.1 billion robocalls. It appears that the robocallers have just shifted their methods and are able, at this point, to avoid the STIR/SHAKEN restrictions from the big carriers. Hopefully, a lot of this will get fixed when that protocol is mandatory for everybody. The FCC recently announced that it was accelerating the implementation date for a list of carriers that the agency says are originating a lot of the robocalls.

The FCC knew from the start that this wasn’t going to be easy. The process of generating robocalls is now highly mechanized, and a few companies can generate a huge volume of calls. Apparently, the profits are lucrative enough for robocallers to flirt with the big FCC fines. When I searched Google for the keywords robocaller and FCC, the first result was a company that is still selling robocalling services.

We saw the same thing a few years ago with access stimulation, where a few unscrupulous companies and carriers were making big dollars from generating huge volumes of bogus calls in order to bill access charges.

Hopefully, the FCC can eventually put a big dent in robocalling. It’s hard to imagine that anybody is still willing to answer a phone call from a number they don’t recognize. Hopefully, more giant fines and a few major convictions will convince the robocalling companies that it’s not worth it.

The Pandemic and the Internet

Pew Research Center conducted several polls asking people about the importance of the Internet during the pandemic. The Pew survey report runs seven pages filled with interesting statistics and is a recommended read. This blog covers a few of the highlights.

The Overall Impact of the Internet. 58% of adults said that the Internet was essential during the pandemic – that’s up from 52% in April of 2020. Another 33% of adults say the Internet was important but not essential. Only 9% of adults said the Internet wasn’t important to them. The importance of the Internet varied by race, age, level of education, income, and location.

  • As might be expected, 71% of those under 30 found the Internet to be essential compared to 38% of those over 65.
  • 71% of those with a college degree found the internet to be essential versus 45% of those with a high school degree or less.
  • 66% of those in the upper third of incomes found the Internet to be essential compared to 55% of those in the lower third.
  • 61% of both urban and suburban residents found the Internet to be essential compared to 48% for rural residents.

Video Calling Usage Exploded. Possibly the biggest overall change in Internet usage has been the widespread adoption of video calling. 49% of adults made a video call at least once per week, with 12% doing so several times per day. The usage was most pronounced for those who work from home, with 79% making a video call at least once per week and 35% connecting multiple times per day.

Longing for a Return to Personal Interactions. Only 17% of Americans say that digital interactions have been as good as in-person contacts, while 68% say digital interactions are useful but no replacement for in-person contacts.

Challenges with Online Schooling. Only 18% of households said that online schooling went very well, with 45% saying it went somewhat well. 28% of households reported it was very easy to use the technology associated with online schooling, with another 42% saying it was somewhat easy. Twice as many people in the lower one-third of incomes as in the upper one-third said online schooling technology was difficult. Nearly twice as many people in rural areas found online schooling technology to be a challenge compared to suburban residents.

Problems with Internet Connections. 49% of all survey respondents said they had problems with their Internet connection during the pandemic. 12% experienced problems often.

Upgrading Internet. 29% of survey respondents said they did something to improve their Internet connection during the pandemic.

Affordability. 26% of respondents said they are worried about the ability to pay home broadband bills. This was 46% among those in the lower one-third of incomes.

Tech Readiness. 30% of Americans say they are not confident using computers, smartphones, or other connected electronics. This was highest for those over 75 (68%), those with a high school degree or less (42%), and those in the lower one-third of incomes (38%).

You’ve Got Mail

I’ve always been intrigued by the history of technology, and I think a lot of that is due to having almost everything computer-related happen during my lifetime. I missed a tech anniversary earlier this year when email turned 50.

It was April 1971 when software engineer Ray Tomlinson first used the @ symbol as the key to route a message between computers within ARPANET, the birthplace of the Internet. Tomlinson was working on a project at the U.S. Advanced Research Projects Agency that was developing a way to facilitate communications between government computers. There had been transmission of messages between computers starting in 1969, but Tomlinson’s use of the @ symbol has been identified as the birth of network email.

ARPA became DARPA in 1972 when the agency was renamed the Defense Advanced Research Projects Agency. DARPA kept a key role in the further development of email and created a set of email standards in 1973. These standards include things like having the “To” and “From” fields as headers for emails.

Email largely remained a government and university tool until 1989, when CompuServe made email available to its subscribers. CompuServe customers could communicate with each other, but not with the outside world.

In 1993, AOL further promoted email when every AOL customer was automatically given an email address. This led to the “You’ve got mail” slogan, and I can still hear the AOL announcement in my head today.

In 1996, Hotmail made a free email address available to anybody who had an Internet connection. Millions of people got email addresses, and the use of email went mainstream. If you ever used Hotmail, you’ll remember the note at the bottom of every email that said, “P.S. I love you. Get your free email here”. Hotmail was purchased by Microsoft in 1997 and was morphed over time into Outlook. This was one of the first big tech company acquisitions, at $400 million, which showed that huge value could be created by giving away web services for free.

In 1997, Yahoo launched a competing free email service that gave users even more options.

In 2004, Google launched its free Gmail service, announcing that users would get a full gigabyte of storage, far more than anybody else offered.

Over the years, there have been many communications platforms launched that promised to displace email. This includes Facebook Messenger, WeChat, Slack, Discord, and many others. But with all of these alternate ways for people to communicate, email still reigns supreme and usage has grown every year since inception.

There are over 300 billion emails generated every day. Gmail alone has 1.6 billion email addresses, representing 20% of all people on the planet. In the workplace, the average American employee sends 40 emails each day and receives 121.

The beauty of email is its simplicity. It can work across any technology platform. Message headers are plain text, attachments are handled through MIME (Multipurpose Internet Mail Extensions), and routing is done with SMTP (Simple Mail Transfer Protocol), which allows messages to be sent to anybody else in the world.
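To make that concrete, here is a minimal sketch using Python’s standard library, with a hypothetical mail server, addresses, credentials, and attachment: the headers and attachment are assembled into a MIME message and handed off over SMTP.

```python
# Minimal sketch of composing and sending a message. The headers and the
# attachment become MIME parts; SMTP handles delivery. The server, addresses,
# credentials, and file below are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"        # the header fields standardized back in 1973
msg["To"] = "recipient@example.net"
msg["Subject"] = "Happy 50th, email"
msg.set_content("Plain-text body of the message.")

# An attachment is just another MIME part of the same message.
with open("report.pdf", "rb") as f:
    msg.add_attachment(f.read(), maintype="application",
                       subtype="pdf", filename="report.pdf")

# SMTP relays the message toward the recipient's mail server.
with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()
    smtp.login("sender@example.com", "app-password")  # placeholder credentials
    smtp.send_message(msg)
```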

On the downside, the ease of email has spawned spam, as marketers found that they could sell even the most bizarre products if they sent enough emails. More recently, emails have been used to implant malware on a recipient’s computer when an unsuspecting user opens a malicious attachment.

One downside for the future of email is that many Americans under 30 hate using it. We’ll have to see over time if email gets displaced, but it would be a slow transition.

But email is still a powerful tool that is ingrained in our daily lives. Email was one of the early features that lured millions into joining the web. So happy birthday, email.

Preparing for Storm Damage

After every major hurricane, like the Category 4 Hurricane Ida that recently hit Louisiana, there is talk in the telecom and power industries about ways to better protect our essential power and communication grids. Ida did major damage to grids and networks in Louisiana from hurricane winds and storm surges, and then caused massive flooding in the mid-Atlantic from western Maryland to New York City.

One thing that we’ve learned over time is that there is no way to stop storm damage. Strong hurricanes, tornados, floods, and ice storms are going to create damage regardless of the steps we might take to try to keep aerial utilities safe. What matters most is the amount of time it takes to make repairs – obviously, the shorter, the better.

A natural impulse is to bury all of the aerial utilities. However, the cost to bury wires in many places is exorbitant. There are also issues during some weather events from buried facilities. It takes a long time and a lot of effort to find and fix problems in flooded areas. Buried electric lines are also sensitive to the saltwater corrosion that comes from storm surges from big coastal storms.

The Electric Power Research Institute (EPRI) has been working on ideas to better protect wires, poles, and towers during big storms. EPRI operates an outdoor laboratory in Lenox, Massachusetts, to create simulations of storm damage. EPRI’s goal is to find techniques to either minimize storm damage or shorten the time needed to make repairs.

EPRI research is intriguing to anybody who’s been in the industry. For example, the institute is exploring ways that towers and poles can be made to collapse in pre-planned ways rather than be destroyed by winds. A controlled collapse could avoid all of the problems of snapped and dangerous power lines. EPRI hopes that, done properly, a downed pole could be stood back up in hours instead of days.

They are exploring a similar line of research, looking at ways for aerial wires to disconnect from poles before drastic damage occurs. This would stop the domino effect of multiple poles being broken and dragged down by a heavy tree landing on a pole span. It would be a lot easier to put fallen wires back onto poles than to untangle and splice wire breaks caused by catastrophic damage.

EPRI is also exploring other aspects of effecting storm damage repair. For example, there is a white paper on its site that looks at the effectiveness of alternate telecommunications channels that would let key players at a utility communicate and coordinate after a storm. All of the normal modes of communication are likely to go dead when the wires and towers come tumbling down. The white paper looked at using GEO satellites, LEO satellites, and HF radios to communicate. The goal was to find a communications medium that would allow for a 3-way call after the more conventional communication paths are out of service. The best solution was the high-orbit GEO satellites.

This kind of research is both important and interesting because coordination of repair efforts is one of the biggest challenges after every disaster. A utility can have standby crews ready to work, but nothing gets done until somebody can tell them what most needs to be addressed.

Will Hyper-inflation Kill Broadband Projects?

The broadband industry is facing a crisis. We are poised to build more fiber broadband in the next few years than has been built over the last four decades. Unfortunately, this peak in demand hits a market that was already superheated, and at a time when pandemic-related supply chain issues are driving up the cost of broadband network components.

The numbers I am hearing from clients are truly disturbing. Just in the last few weeks, I’ve heard that the cost of conduit and resin-based components like handholes and pedestals is up 40%. I’ve heard mixed messages on fiber – some saying that prices are up as much as 20%, while others are seeing little price increases. I think the increases have to do with the specific kind of fiber being purchased as well as the history of the buyer – the biggest increases are going to new or casual fiber buyers. I’ve heard the cost of fiber-related components like pigtails is also way up.

The kinds of numbers I’m hearing can only be classified as hyper-inflation. It’s way outside the bounds of normalcy when the cost of something is up 20% to 40% in a year. I’ve been listening to a lot of economists lately who say that many pandemic-driven price increases are temporary and that what we are seeing is a price bubble – they predict prices ought to revert to old levels over time in competitive markets where multiple providers of components will be bidding for future business.

But I keep looking at the upcoming plans for the country collectively to build fiber, and it looks like we might be seeing a superheated industry for the rest of this decade. Even when much of the rest of the economy gets back to normal, it’s not hard to envision fiber prices staying elevated.

This leads me to ask whether this hyper-inflation is going to kill fiber projects. I start with the RDOF winners – the amount of subsidy they get is fixed and won’t be adjusted for higher costs. At what point do some of the RDOF projects stop making sense? The business modelers for these companies must be working overtime to see if the projects still work with the higher costs. It won’t be shocking to see some of the RDOF winners give up and return Census blocks to the FCC.

But this same thing affects the winners of every other grant. Consider the recent grant filings with the NTIA. Those were some of the most generous grants ever awarded, with the NTIA program picking up as much as 75% of the cost of a project. What happens to the winners of those grants if materials are much more costly when they go to build the project? Any extra cost must be borne by the grant winner, meaning that the real matching could be a lot more than 25%. Some grant winners are going to have a hard time finding extra funding to cover the extra costs. Some of these projects are in extremely rural places, and one has to worry that having to pay extra might make the difference between a project making sense and not. Even with grants, it’s often a fine line between a project being feasible or not.
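A quick back-of-the-envelope calculation shows how fast the real match grows. The project size and cost overrun below are hypothetical, but the mechanics apply to any grant with a fixed award:

```python
# Hypothetical example: a $10M project awarded a 75% grant, then hit by a
# 20% construction cost overrun that the grant does not cover.
grant_pct = 0.75
budget = 10_000_000
overrun = 0.20

grant = grant_pct * budget             # fixed at award time: $7.5M
actual_cost = budget * (1 + overrun)   # $12.0M after the overrun
winner_share = actual_cost - grant     # $4.5M borne by the grant winner

print(f"Nominal match: {1 - grant_pct:.0%}")                 # 25%
print(f"Effective match: {winner_share / actual_cost:.1%}")  # 37.5%
```

In that hypothetical, a 20% jump in costs pushes the winner’s real share from 25% to 37.5% of the project – an 80% increase in the cash the grant winner has to find.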

This same worry has to be spreading through the big ISPs. Companies like Frontier, Windstream, and Lumen are betting their future viability on building more fiber. How do those plans change when fiber is a lot more expensive?

The worst thing is that we have no idea where prices will peak. We’ve not really yet seen any market impact from RDOF and other big grant programs. We’ve seen some impact from CARES Act spending, but that was a drop in the bucket compared to what we’re likely to see from ARPA and federal infrastructure spending.

I have a lot of nervous clients, and I have no good advice for them on this issue. Should they buy as much fiber and components as they can now before prices rise even more, or should they wait and hope for parts of the market to return to normalcy? What we’re experiencing is so far outside the bounds of normal that we have no basis for making decisions. I chatted with a few folks recently who speculated that the best investment they could make this year would be to buy $1 million of fiber reels and sit on them for a year – they might be right.

Multi-gigabit Broadband

There have been a few ISPs in recent years that have quietly offered residential broadband with speeds up to 10 gigabits. However, this year has seen an explosion of ISPs marketing multi-gigabit broadband.

I recall an announcement from Google Fiber last year offering an upgrade to 2-gigabit service in Nashville and Huntsville for $100 per month. Since then, the company has expanded the offer to other markets, including Atlanta, Charlotte, Kansas City, Raleigh-Durham, Austin, Salt Lake City, Provo, and Irvine.

Not to be outdone, Comcast Xfinity announced a 2-gigabit product, likely available in those markets where Google Fiber is competing. But Comcast doesn’t seem to really want to sell the product yet, having priced it at $299.95 per month. We saw the same high pricing when Comcast first introduced gigabit service – it gave them the bragging rights for having the fastest product, but the company was clearly not ready to widely sell it.

https://www.xfinity.com/gig

Midco, the cable company, markets speeds up to 5 gigabits in places where it has built fiber. In recent months I’ve seen announcements from several rural cooperatives and telcos that are now offering 2-gigabit speeds.

This feels like a largely marketing-driven phenomenon, with ISPs trying to distinguish themselves in the market. It was inevitable that we’d see faster speeds after the runaway popularity of 1-gigabit broadband. OpenVault reported that as of June of this year, 10.5% of all broadband subscribers are buying a gigabit product. It makes sense that some of these millions of customers could be lured to spend more for even faster speed.

There are still a lot of broadband critics who believe that nobody needs gigabit broadband. But you can’t scoff at a product that millions are willing to buy. Industry pundits thought Google Fiber was crazy a decade ago when it announced that its basic broadband speed was going to be 1-gigabit. At that time, most of the big cable companies had basic broadband products at 60 Mbps, with the ability to buy speeds as fast as 200 Mbps.

It was clear then and is still true today that a gigabit customer can rarely, if ever, download from the web at a gigabit speed – the web isn’t geared to support that much speed the whole way through the network. But customers with gigabit broadband will tell you there is a noticeable difference between gigabit broadband and more typical broadband at 100 Mbps. Users can readily perceive the improvement that comes with gigabit speed.

The most aggravating thing about the debate about multi-gigabit speeds is how far the regulators have fallen behind the real world. According to OpenVault, the percentage of homes that subscribe to broadband with speeds of 100 Mbps or faster has grown to 80% of all broadband subscribers. We know in some markets that delivered speeds are less than advertised speeds – but the huge subscription levels are proof that subscribers want fast broadband.

Farm Fresh Broadband

I was lucky enough to get an advance copy of Farm Fresh Broadband by University of Virginia professor Christopher Ali. It’s a great read for anybody interested in rural broadband. The book is published by MIT Press and is now available for pre-order on Amazon.

The first half of the book discusses the history and the policies that have shaped rural broadband, and my review of his book will focus on this early discussion, which is near and dear to my heart. Ali hit on the same topics that I have been writing about in this blog for years. Of particular interest was Ali’s section talking about the policy failures that have led to the poor state of rural broadband today. Ali correctly points out that “we have a series of policies and regulations aimed at serving the interests of monopoly capital rather than the public interest and the public good”. Ali highlights the following policy failures that have largely created the rural digital divide:

  • Definition-Based Policy. The FCC has been hiding behind its 25/3 Mbps definition of broadband since 2015. We still see this today when current federal grants all begin with this massively outdated definition of broadband when defining what is eligible for grant funding. We recently passed a milestone where over 10% of all households in the country subscribe to gigabit broadband, and yet we are still trying to define rural broadband using the 25/3 Mbps standard. Unfortunately, sticking with this policy crutch has led to disastrously poor allocation of subsidies.
  • Technology Neutrality. Ali points to regulators who refuse to acknowledge that there are technologies and carriers that should not receive federal subsidies. This policy is largely driven by lobbying by the big ISPs in the industry. It led to the completely wasted $10 billion CAF II subsidy that was given to shore up DSL at a time when it was already clear that DSL was a failure as a rural technology. This same lack of regulatory backbone has continued as we’ve seen federal subsidies given to Viasat in the CAF II reverse auction and Starlink in the RDOF. The money wasted on these technologies could have instead been invested in bringing permanent broadband solutions to rural areas. It looks like Congress is going to continue this bow to the big ISPs by allowing grants to be awarded for any technology that can claim to deliver 100/20 Mbps.
  • Mapping. Ali highlights the problems with FCC mapping that disguises the real nature of rural broadband. He points to the example of Louisa County, Virginia, where the FCC maps consider the county to have 100% broadband coverage at 25/3 Mbps. It turns out that 40% of this coverage comes from satellite broadband. Much of the rest comes from telcos in the county overstating actual speeds. M-Lab speed tests show the average speeds in the county as 3.91 Mbps download and 1.69 Mbps upload – speeds the FCC would not have considered broadband even a decade ago. Unfortunately, Louisa County is not unique, and there are similar examples all over the country where poor mapping policies have deflected funding away from the places that need it the most.
  • Localism. There are hundreds of examples where small regional telephone companies and cooperatives have brought great broadband to pockets of rural America. We have made zero attempt to duplicate and spread these success stories. In the recent CAF II awards we saw just the opposite, with huge amounts of money given to companies that are not small and not local. We already know how to fix rural broadband – by duplicating the way we electrified America by loaning money to local cooperatives. But regulators would rather hand out huge grants to giant ISPs. When we look back in a few decades at the results of the current cycle of grant funding, does anybody really believe that a big ISP like Charter will bring the same quality of service to communities as rural cooperatives?

The second half of the book is the really interesting stuff, and all I’ll offer here are some teasers. Ali describes why farmers badly need broadband. He describes the giant bandwidth needed for precision agriculture, field mapping, crop and livestock monitoring, and overall management of farms to maximize yields. One of the statistics he cites is eye-opening. Fully-deployed smart agriculture could generate 15 gigabytes of data annually for every 1,000 acres of fields. With current land under cultivation, that equates to more than 1,300 terabytes of data per year. We have a long way to go to bring farms the broadband they need to move into the future.

I do have one criticism of the book, and it’s purely personal. Ali has a huge number of footnotes from studies, articles, and reports – and it’s going to kill many of my evenings as I slog through the fascinating references that I’ve not read before.

An Update on Telemedicine

I’ve been keeping tabs on the news about telemedicine since it is touted throughout the industry as one of the big benefits of having good broadband. One piece of news comes from a survey conducted by Nemours Children’s Health. This is a large pediatric health system with 95 locations in Delaware, Florida, New Jersey, and Pennsylvania. The company treats almost half a million children annually.

Nemours released a report on Telehealth in July. The report was based on a survey of 2,056 parents/guardians of children. The survey had some interesting results:

There is a Need for Telehealth. 48% of the survey respondents said that they had at least one recent experience where there was a hardship in getting a sick child to a live doctor visit. This included reasons such as living in an unsafe community or not having easy access to transportation. 28% of respondents reported two such occasions, and 15% reported three or more. These are the situations for which telehealth is an ideal solution to get a doctor to look at a sick child when care is needed rather than when the child can be transported to a doctor’s office.

Telehealth Good for Parents. Almost 90% of the respondents to the survey said that telehealth makes it easier for parents to take an active role in a child’s health care. A lot of parents said that somebody other than them takes sick children to see a doctor during the workday, and they love being able to participate first-hand in the discussion with a doctor.

Providers Play a Big Role in Enabling Telehealth. 28% of respondents to the survey said they have never been offered a telehealth visit. 12% said they had never heard of telehealth. Respondents who use telehealth said they were more likely to use the service when it is offered as an option by the health provider.

Reimbursement is Still a Barrier. Two-thirds of parents say that having telehealth visits covered by insurance is essential for them to consider using the service. There was a big push during the pandemic for insurance companies to cover telehealth visits, and Nemours is concerned about whether that coverage will continue when things return to normal.

As further evidence that reimbursement is a major issue, a recent article in KHN (Kaiser Health News) shows that there are surprising issues impacting telehealth. The article discusses insurance companies that don’t want to cover telehealth visits where the patient and doctor are in different states. This is based on laws in most states, and also on Medicare and Medicaid rules, that require a licensed clinician to hold a valid medical license in the state where a patient is located.

These laws don’t stop people from voluntarily visiting a doctor in another state, but the law is being raised for telemedicine. This is surfacing as an issue as states start rolling back special rules put in place during the early days of the pandemic.

Johns Hopkins Medicine in Baltimore recently had to cancel over 1,000 telehealth visits with patients in Virginia because such visits would not be covered by insurance. This left patients to find a way to make the physical trip to Johns Hopkins or find another health provider. As someone who has used Johns Hopkins, I know this is the place people in the DC region look to when they need to see the best specialists.

When I first heard about telemedicine a decade ago, the ability to see specialists was one of the biggest cited benefits of telemedicine. These kinds of issues are always more complicated than they seem. For example, state medical boards don’t want to give up the authority to license and discipline doctors that treat patients in the state. Of course, money comes into play since medical licensing fees help to pay for the medical boards. When insurance companies find it too complicated to deal with a gray legal issue, they invariably take the safe path, which in this case is not covering cross-state telemedicine visits.

Probably the only way to guarantee that telemedicine will work would be with legislative action to clear up the gray areas. Add this to the list of broadband topics that need a solution from Congress.

The Migration to Faster Speeds

The OpenVault Broadband Insights Report for the 2nd quarter of 2021 highlights the strength of customer demand for broadband.

The most interesting statistic is the migration of customers to faster broadband tiers. The following table shows the percentage of households subscribed to various broadband speed plans in 2020 and 2021.

                     June 2020    June 2021
Under 50 Mbps          18.4%        10.5%
50 – 99 Mbps           20.4%         9.6%
100 – 199 Mbps         37.8%        47.5%
200 – 499 Mbps         13.5%        17.2%
500 – 999 Mbps          5.0%         4.7%
1 Gbps                  4.9%        10.5%

In just the last year, the number of households subscribed to gigabit broadband has more than doubled, while the share subscribed to speeds under 100 Mbps was nearly cut in half. Many millions of homes upgraded to faster broadband plans over the past year.
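Those shifts come straight from the table above; a quick calculation makes the two headline changes explicit:

```python
# Speed-tier shares from the OpenVault table above (June 2020 vs. June 2021).
tiers_2020 = {"<50": 18.4, "50-99": 20.4, "100-199": 37.8,
              "200-499": 13.5, "500-999": 5.0, "1 Gbps": 4.9}
tiers_2021 = {"<50": 10.5, "50-99": 9.6, "100-199": 47.5,
              "200-499": 17.2, "500-999": 4.7, "1 Gbps": 10.5}

under_100_2020 = tiers_2020["<50"] + tiers_2020["50-99"]   # 38.8%
under_100_2021 = tiers_2021["<50"] + tiers_2021["50-99"]   # 20.1%

gig_growth = tiers_2021["1 Gbps"] / tiers_2020["1 Gbps"]
print(f"Gigabit share: 4.9% -> 10.5% ({gig_growth:.1f}x)")              # ~2.1x
print(f"Under 100 Mbps: {under_100_2020:.1f}% -> {under_100_2021:.1f}%")  # nearly halved
```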

OpenVault provides some clues as to why homes are upgrading to faster broadband. Consider the following table that shows the percentage of households using different amounts of total monthly broadband.

                     June 2018    June 2019    June 2020    June 2021
Less than 100 GB       51.6%        42.7%        34.2%        29.5%
100 – 499 GB           37.7%        39.5%        37.6%        38.6%
500 – 999 GB            8.9%        13.7%        19.4%        21.1%
1 – 2 TB                1.7%         3.7%         7.8%         9.3%
Greater than 2 TB       0.1%         0.4%         1.0%         1.5%

The percentage of homes using less than 100 gigabytes of broadband per month has dropped by 43% over three years. At the same time, the number of homes using more than a terabyte of data per month has grown by 500% over three years. While there may be no direct correlation between having a faster broadband plan and using more broadband, total broadband usage is likely one of the factors leading residential customers to upgrade. Another big factor pushing upgrades is customers looking for faster upload speeds to support working and schooling from home.
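The relative changes cited above fall straight out of the table:

```python
# Usage shares from the OpenVault table above (June 2018 vs. June 2021).
under_100gb_2018, under_100gb_2021 = 51.6, 29.5
over_1tb_2018 = 1.7 + 0.1   # "1 - 2 TB" plus "Greater than 2 TB"
over_1tb_2021 = 9.3 + 1.5

drop = (under_100gb_2018 - under_100gb_2021) / under_100gb_2018
growth = (over_1tb_2021 - over_1tb_2018) / over_1tb_2018

print(f"Share of homes using < 100 GB fell {drop:.0%}")   # ~43%
print(f"Share of homes using > 1 TB grew {growth:.0%}")   # ~500%
```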

The average household broadband usage in June 2021 was 433 gigabytes – the combined download and upload usage for the average American home (405 GB download plus 28 GB upload). To put that number into perspective, look at how it fits into the past trend of average broadband usage.

1st quarter 2018     215 Gigabytes
1st quarter 2019     274 Gigabytes
1st quarter 2020     403 Gigabytes
2nd quarter 2020     380 Gigabytes
1st quarter 2021     462 Gigabytes
2nd quarter 2021     433 Gigabytes

The second quarter 2021 usage is up 14% over 2020, but down compared to the first quarter of this year. OpenVault observed that broadband usage seems to be returning to seasonal patterns, and in past years it’s been normal for broadband usage to decrease during the summer.

The continued increase in household usage has to be good news for ISPs like Comcast that enforce monthly data caps. OpenVault shows 10.8% of homes now use more than a terabyte of data per month. However, OpenVault also shows that having data caps influences broadband customers to curtail usage. In June, the average usage for homes with no data caps was 451.6 GB, versus 421.1 GB for homes with data caps.