Space Weather and Broadband

An interesting phenomenon occurred in February 2022 when Starlink launched 49 new satellites. The rocket deployed the satellites successfully, but as they were being maneuvered toward their final orbital slots, a geomagnetic storm caused 38 of them to fall back to earth.

Space storms happen when solar radiation disturbs the magnetosphere that surrounds the earth. This is a band of charged particles held close to the planet by the earth’s magnetic field. A geomagnetic storm occurs when there is an exchange of energy from outer space to the orbiting particles. The biggest storms are caused by mass ejections of particles and energy that occur during large solar flares. These flares release radiation and highly charged particles into space, which, during a storm, interact with the magnetosphere.

It is the charged particles from the storms that manifest as the Aurora Borealis, or northern lights. The extra energy from the storms can also play havoc with GPS and other space-based communications. The earth’s atmosphere keeps most of the radiation from solar flares away from the planet, but strong storms can interfere with radio communications and can even produce feedback in long-haul electric wires that can disrupt the power grid.

During a geomagnetic storm, energy is pushed from the particles in the magnetosphere to the upper reaches of the ionosphere. This can temporarily heat and expand the upper atmosphere, increasing its density at satellite altitudes – which is what happened to the satellites. They met unexpected atmospheric drag that the tiny thrusters on the small satellites were unable to overcome.

Scientists have been looking at ways to better predict solar flares and the ensuing storms. In this case, with a warning, the satellite launch would have been delayed until the storm had passed. It’s a big challenge to predict the size and location of solar flares. The sun has an eleven-year cycle for the period of the heaviest solar flare activity, but a solar flare can erupt at any time.

Scientists around the world have been studying the sun using NASA’s Solar Dynamics Observatory. Scientists in China have had some success by tracking changes in the magnetic field of the sun, particularly in how that manifests in changes on the sun’s surface. They say that the temperature temporarily drops on the surface of the sun in the areas where flares are forming. They have predicted several solar flares as much as 48 hours before an eruption, but they have a long way to go for this to be accurate. Even when we get to the point of successfully predicting solar flares, it’s an even bigger challenge to predict if the particles from the flare will hit the earth. The worst impacts come when our planet is in the direct path of the ejected particles.

Tracking space weather matters since we are becoming increasingly reliant on space technologies. We’ve all incorporated GPS and satellite weather into our daily routines. We use space monitors for scientific research, to study farm fields, and to keep an eye on the various militaries around the planet. And suddenly, we have a lot of people using satellites for broadband. It was costly for Starlink to lose most of the satellites from a launch, but the potential damage from space storms is going to increase dramatically as we use space more and more. Starlink alone keeps talking about having 30,000 broadband satellites.

It’s not hard to picture the impact of losing these technologies for a few days up to a week. How many of you still carry an atlas in your car in case GPS doesn’t work? Businesses of all types plan outdoor work based on weather predictions that use data gathered by satellites. And having multi-day broadband outages can be devastating, particularly for rural businesses or people working from home. Space technology has become everyday technology, but it’s too easy to take for granted and to assume it will always work.

A New Definition of 6G

We now know how the wireless carriers are going to continue the string of new G generations of cellular technology.

5G was originally defined to include spectrum up to 90 GHz or 100 GHz. In the last few years, international standards bodies have been developing new 6G standards in what are called the terahertz frequencies between 100 GHz and 1 THz. By definition, these higher frequency bands are the remaining part of the radio spectrum, and so the 6G being defined by international scientists will be the final generation of G technology.

These super-high frequencies have a lot of interesting potential for indoor uses since this spectrum can transmit an immense quantity of data over short distances. But the high frequencies might never be used for outdoor broadband because the extremely short radio waves are easily distorted and scattered by everything in the environment, including air molecules.

Scientists have speculated that transmissions in the terahertz frequencies can carry 1,000 times more data than the current 5G spectrum bands. That’s enough bandwidth to create the 3D holograms needed for convincing virtual presence (and maybe my home holodeck).
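A rough sanity check of the 1,000-times claim: channel capacity scales linearly with bandwidth at a given signal-to-noise ratio, and the terahertz band is on the order of a thousand times wider than the spectrum 5G uses today. The 5G bandwidth figure below is an assumption for illustration:

```python
# Rough sanity check: Shannon capacity C = B * log2(1 + SNR) scales
# linearly with bandwidth B, so compare the raw bandwidth available.
terahertz_band_hz = 1e12 - 100e9  # 100 GHz to 1 THz = 900 GHz of spectrum
typical_5g_band_hz = 1e9          # assumed ~1 GHz of deployed 5G spectrum
print(terahertz_band_hz / typical_5g_band_hz)  # 900.0 - on the order of 1,000x
```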

But terahertz frequencies are going to be of little use to the cellular carriers. While cellular companies have still not deployed a lot of the 5G standards, the marketing folks at these companies are faced with a future where there would be no more G generations of cellphones – and that is clearly a lost marketing opportunity.

Several of the wireless equipment vendors have started to refer to bandwidths in the centimetric range as 6G. These are frequencies between 7 GHz and 20 GHz. I have to admit that I got a really good belly laugh when I read this, because much of this spectrum is already in use – so I guess 6G is already here!

When 5G was first announced, the big news at the time was that 5G would open up the millimeter-wave spectrum between 24 GHz and 40 GHz. The equipment vendors and the cellular carriers spent an immense amount on lobbying and advertising, talking up the wonders of millimeter-wave spectrum. Remember the carefully staged cellular commercials that showed gigabit speeds on cell phones? That was done using millimeter-wave spectrum.

But now, the marketing folks have pulled a big switcheroo. They are going to rename currently used spectrum as 6G. I guess that means millimeter-wave spectrum will become 7G. This also leaves room for several more generations of G marketing before reaching the 100 GHz terahertz spectrum.

This will clearly cause a mountain of confusion. The international folks are not going to rename what they have already labeled as 6G to mollify the cellular marketers. We’re going to have articles, advertising, and lobbying talking about two completely different versions of 6G. And before the ink is dry, we’ll also be talking about 7G.

The cellular vendors also want us to change the way we talk about spectrum. The folks at Nokia are already suggesting that the newly dubbed 6G spectrum bands should be referred to as midband spectrum – a phrase that today refers to lower spectrum bands. That sets the stage for talking about the upper bands of frequency as 7G, 8G, and 9G.

What is funniest about this whole process is that there still isn’t any real 5G being used in the world. The cellular carriers have implemented only a small portion of the 5G specification. But that hasn’t deterred the marketers, who have convinced everybody that the new bands of spectrum being used for 4G are actually 5G. It’s a pretty slick marketing trick that lets the cellular carriers avoid explaining why actual 5G isn’t here yet.

Don’t Forget Lifeline

There has been a big push nationwide to get customers enrolled in the Affordable Connectivity Program (ACP), which provides a $30 monthly broadband subsidy for qualifying households – a discount that can be applied to any broadband product. With the ACP discount, a qualifying customer can buy a broadband product normally priced at $60 for $30.

Most ISPs seem to have forgotten about the FCC Lifeline program that can provide a monthly discount of $9.25 off a telephone or broadband bill for qualifying customers. Consumers can qualify for the Lifeline discount if the household income is at or below 135% of the Federal Poverty Guidelines or else by participating in Medicaid, SNAP (formerly Food Stamps), SSI, Federal Housing Assistance, VA Veterans pension, or VA survivor’s pension.

It’s a little easier to qualify for ACP since it is available to homes at or below 200% of Federal Poverty Guidelines. The ACP discount is also available to those who participate in Medicaid, SNAP, Federal Housing Assistance, WIC, SSI, or Lifeline.

That last requirement is the important one – customers can qualify for both the ACP discount and Lifeline, meaning an ISP can collect a total subsidy of $39.25 for a qualifying customer.

The FCC made some changes to the Lifeline program in July. The most important change for ISPs is that the monthly data usage allowance for a Lifeline subscriber was increased to 1.28 terabytes – which is higher than the data cap imposed by ISPs like Comcast. The FCC set the new annual budget for 2023 at $2.57 billion and changed the rules so that the size of the funding will increase each year using the Consumer Price Index.

There have been a few barriers that have kept many ISPs from participating in Lifeline. Many of them thought that the $9.25 subsidy was too small to bother with. There is also a requirement that an ISP must be an Eligible Telecommunications Carrier (ETC), a status that is granted by state regulatory commissions. Years ago, this implied that an ETC took on carrier-of-last-resort obligations, which meant it was required to serve anybody in a service area. But since broadband has been largely deregulated, carrier of last resort doesn’t have much meaning these days.

A lot of ISPs said that Lifeline was a pain to implement. There was no easy way for ISPs to know if a household qualified, and audits of the program would often mean rebating funds to the FCC. That issue has largely been resolved since the FCC now maintains a database, updated monthly, of homes that participate in the various federal subsidy programs. An ISP can feel safe in giving the discount to a household on this list.

Any ISP that is participating in ACP in order to reach low-income households should consider the Lifeline discount as well. Extending a $39.25 discount to households is a significant savings.

There are still a few nuances for ISPs that try this. Practically everybody that qualifies for Lifeline will qualify for ACP, but not everybody that qualifies for ACP can get Lifeline due to the lower limit on household income.
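As a sketch of how the two income thresholds interact – the poverty guideline figure below is the 2022 number for a household of four in the contiguous states, used here only for illustration, since actual Federal Poverty Guidelines vary by household size and year:

```python
# Sketch of the Lifeline vs. ACP income tests, assuming an illustrative
# Federal Poverty Guideline (FPG) of $27,750 (2022, household of four).
FPG = 27_750.00

def qualifies_lifeline(income: float) -> bool:
    # Lifeline: household income at or below 135% of the FPG
    return income <= 1.35 * FPG

def qualifies_acp(income: float) -> bool:
    # ACP: at or below 200% of the FPG, or automatic eligibility
    # through Lifeline participation
    return income <= 2.00 * FPG or qualifies_lifeline(income)

income = 45_000.00
print(qualifies_lifeline(income))  # False - above the $37,462.50 cutoff
print(qualifies_acp(income))       # True - below the $55,500 cutoff
```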

There are still a lot of questions about how many ISPs are actually trying to implement the ACP discount. Most ISPs have a lot of customers that qualify, but ISPs don’t seem to be pushing the discount. But for any ISP that wants to bring broadband to as many folks in a community as possible, a $39.25 customer discount can make it a lot easier to make broadband affordable.

ISPs and the Digital Divide

It seems that almost monthly I am asked about the role that ISPs should take in making sure that we solve the digital divide. I think that people are somewhat shocked every time I tell them this is not a role for ISPs.

In explaining my answer, let me start by parsing what is meant by the question. We are about to see a lot of grant funding for getting computers into homes and training folks on how to use them. The folks asking this question are hopeful that ISPs are going to take up that role in a meaningful way. The reality is that this is rarely going to happen – and it’s not something we should be expecting from ISPs.

ISPs are in the business of building broadband networks and keeping them running. That’s a full-time job. I think that people assume that ISPs want new customers badly enough that they are willing to tackle the digital divide efforts needed so that folks know how to use broadband. An ISP’s role in solving the digital divide is to bring broadband to homes willing to buy it. To use the old analogy of the three-legged stool, the ISP’s function is to provide the broadband connection – it’s up to somebody else to tackle the other issues of computers and training.

I don’t think that the folks asking this question understand the challenge involved in helping somebody to cross the digital divide. You can’t just hand out computers to homes that don’t have them. A computer is a brick for a household where nobody knows how to use it. It takes a lot of one-on-one effort to sit with people and help them learn how to navigate the possibilities of broadband.

There are programs around that have been doing this the right way. I’ve been told by several people who do this training that the key to getting somebody to learn to use a computer is to help them accomplish something they want to do. That’s different for everybody. It might mean helping them look for a job, talk with relatives on social media, search for knitting patterns, or learn a new language – it doesn’t matter what it is, but helping a new computer user to accomplish something useful is the way to prove to them that a computer and broadband are useful tools.

My firm has been doing broadband surveys for many years, and we’ve noticed that folks will not admit to being afraid of or intimidated by computers and technology. Folks won’t tell you that the reason they can’t use a computer is that they can’t read very well. But the folks that do computer training tell me that these are some of the basic reasons folks don’t or won’t use computers. Somewhere in the past, they tried and failed, and they don’t want to do that again.

I remember twenty years ago, when cable modems and DSL were new, there were computer training courses everywhere. There were free classes in most towns teaching how to use Excel or Word. The training classes were held in computer labs with twenty students at a time – and the training was largely an abysmal failure. While those skills are important for many jobs, they are not things that most people will use regularly, if ever. But somehow, it became accepted that teaching those skills was the way to make folks computer literate. These classes were mostly such a colossal failure that the training courses died out within a few years and were not replaced. We’ve largely gone two decades where there has been no formal forum for folks to learn how to use a computer except to sit with a friend or relative willing to take the time to teach them. And this is a shame, because there is an immense richness of content on the web today – there is something for everybody.

But back to my original premise of the blog. ISPs do not have the resources to dedicate employees to sit with folks to learn how to use a computer and navigate the web. Some of the big ISPs have given the impression that they are doing this for folks – but that is mostly done for public relations purposes.

There are some ISPs that might be willing to take up this challenge. Some municipal ISPs or cooperatives might take a stab at solving the digital divide. I could see some of them using grant money to develop a great program for teaching computer use. But even that is going to be a challenge, because the main focus of these ISPs is like every other ISP’s – to keep the network operating. We don’t expect car companies to teach us to drive. We don’t expect banks to teach us how to be wise with our money. We really can’t expect ISPs to teach folks how to use computers.

Communities that want to solve the digital divide issues should look elsewhere – perhaps at existing non-profits. There are some local governments that are going to take a stab at this. If no such group exists, then use grant money to kick-start the effort.

When Broadband Doesn’t Work

I recently lost my company email service. We have been using Rackspace to host our email for a decade, and we loved the customer service. Rackspace was immediately responsive to our questions and was one of our most satisfactory tech vendors.

But then our email went dead, and we got the worst imaginable response. Rackspace posted updates about every twelve hours for days letting its millions of users know that it was investigating the issue, but there were no explanations or communications beyond those periodic uninformative web postings.

My blog today is not to complain about Rackspace, although I would rate their responsiveness to the outage as a one out of ten. Losing email is no fun, but we muddled by. I can imagine how devastating this incident was for retail businesses that lost email during the heart of the Christmas shopping season.

Losing email reminded me of how reliant businesses are on technology and on the web platforms that underlie our businesses. One part of my consulting practice is conducting surveys and interviews of businesses across the country for communities that are trying to understand the broadband environment. The number one issue I hear from businesses is how devastating it is to lose a broadband connection. A lot of businesses basically go dead when losing broadband.

I don’t think the average person realizes how reliant businesses are on broadband. People are not surprised when broadband shuts down consultants, engineers, or architects who rely on broadband to exchange data and files on projects. But the loss of broadband today can shut down businesses that the public doesn’t think of as broadband intensive.

An example I ran into recently was a sports bar. The business had a multiple-day broadband outage that the ISP blamed on a cable cut. The business relied on broadband for a huge number of functions. The bar lost its automated reservation system. The bar made a lot of money from arcade games that shut down because they were controlled in the cloud. The bar could no longer take credit card payments. It lost its automated accounting system that logged every transaction into the books. The online payroll system was gone, and there was no way to easily track hours, or tips, or pay employees. It became difficult to order supplies since the ordering systems for food and drink had largely been automated. The business lost access to the online banking it used every day. And customers lost the free WiFi.

One of the goals that most communities have is to make sure that the business community has good broadband. That used to mean making sure that every business could buy broadband from at least one fast ISP. But I’ve talked to businesses of all sizes that would gladly buy broadband from two ISPs to protect against losing connectivity. A decade ago, only the largest businesses were concerned about redundancy – today, a large percentage of businesses want a backup broadband connection using a different physical path. Businesses have heartily adopted technology but, by doing so, are vulnerable in ways they never were before. A decade ago, this bar would not have had all of these functions online.

My email outage highlights the other kind of outage that worries businesses. Every one of the functions utilized by the bar is provided by a different online vendor. Every online system a business uses is only as good as the ability of the underlying tech vendor to keep the systems running.

There are too many ways for a business to lose functionality. We’ve seen widespread broadband outages that cascade across the country and temporarily knock the biggest tech companies offline. Every system that this bar uses is susceptible to the underlying vendor having a system crash, being hacked, or getting hit with malware. Technology is great, and it has made businesses far more efficient – until it stops working.

I’ll probably never know why I lost my email. But this reminded me that I can’t fully take any online technology for granted – everything is going to fail periodically, whether due to losing broadband or due to the underlying tech company having an issue. It’s not a comfortable feeling knowing that things can go bad instantly when you least expect it.

A Repeat Performance for Cable TV 3Q22

Traditional cable providers continue to lose cable TV customers at the same fast pace as the second quarter of the year. In the third quarter, the cable companies lost 1.68 million customers after losing over 1.65 million customers in the second quarter.

These numbers come from Leichtman Research Group, which compiles most of these numbers from the statistics provided to stockholders, except for Cox, which is privately held and estimated. Leichtman says this group of companies represents 96% of all traditional U.S. cable customers.

                           3Q 2022       Change    % Change
Comcast                 16,582,000    (562,000)      -3.3%
Charter                 15,291,000    (204,000)      -1.3%
DirecTV                 13,500,000    (400,000)      -2.9%
Dish Network             7,607,000    (184,000)      -2.4%
Verizon                  3,383,000     (96,000)      -2.8%
Cox                      3,140,000     (90,000)      -2.8%
Altice                   2,491,800     (82,400)      -3.2%
Mediacom                   525,000     (15,000)      -2.8%
Frontier                   322,000     (21,000)      -6.1%
Breezeline                 323,038      (9,274)      -2.8%
Cable ONE                  202,000     (19,000)      -8.6%
   Total                63,366,838  (1,682,674)      -2.6%
Hulu Live                4,400,000      400,000      10.0%
Sling TV                 2,211,000      214,000       9.7%
FuboTV                   1,231,000      284,265      30.0%
Total Cable             39,536,512    (981,674)      -2.5%
Total Telco / Satellite 25,513,000    (701,000)      -2.7%
Total vMvPD              7,143,735      898,265      12.6%

The losses are fairly even across the industry, with every large provider except Charter losing more than 2% of total cable customers for the quarter. At the current pace, the industry might lose 10% of all cable customers this year. To put these numbers into perspective, these same companies had over 85 million cable customers at the end of 2018 – the industry has lost over a quarter of its customers since then.
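The 10% figure checks out by simply compounding the quarterly loss rate:

```python
# The industry lost about 2.6% of customers in the quarter; at that
# steady pace the compounded annual loss comes to roughly 10%.
quarterly_loss = 0.026
annual_loss = 1 - (1 - quarterly_loss) ** 4
print(f"{annual_loss:.1%}")  # 10.0%
```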

In the quarter, the three online cable alternatives that LRG tracks gained almost 900,000 new customers. A few major online alternatives, like YouTube TV, aren’t on the list since they don’t announce customer counts.

The biggest percentage losers continue to be Frontier and Cable ONE, with Comcast the third largest.

Serving the Most Remote Locations

There is one interesting aspect of the BEAD grant rules that I’ve never written about. There is a provision in the BEAD rules that says that if no ISP seeks funding in an unserved or underserved area, a State broadband office may engage with ISPs to find somebody willing to serve such areas.

In order to make this work, States are allowed to offer additional inducements, such as providing additional state matching funds for the grant areas. State broadband offices would also be able to provide extra scoring points during the grant process to favor applications for such areas – with all of this being done with total transparency.

A State is first required to put out a general request for ISPs to serve such areas, a step that I assume would mean issuing an RFP. If that effort fails, the State can reach out and negotiate with specific ISPs to serve the unclaimed areas.

This option is driven by the overall goal of the BEAD grants to bring broadband to everybody who is unserved or underserved. This requirement will only kick in for states that have leftover money after trying to satisfy BEAD grant applications.

It’s an interesting process, but one that I think will be needed for various reasons. First, the RDOF subsidies created what the industry is calling Swiss cheese or checkerboard coverage areas – little pockets that are geographically separated from other likely grant areas. The FCC RDOF maps really made a mess of the rural areas of many counties, where it will be a challenge to find a solution for every one of these pockets.

There also might be parts of counties that are particularly high cost that ISPs might not pursue. The BEAD grants can award up to 75% of the cost of a grant area, but there are many rural places where 75% grant funding is not enough. My back-of-the-envelope math says that areas with costs higher than $10,000 per passing might need more than a 75% grant – and there are a lot of rural areas with costs higher than that. I think the NTIA and State broadband offices will be shocked to find out how many grant applications fit into the high-cost category.
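A hypothetical illustration of that back-of-the-envelope math. The $2,500 figure for how much of its own capital an ISP can justify per passing is my assumption for the example, not anything from the BEAD rules:

```python
# Hypothetical sketch: if an ISP can only justify about $2,500 of its
# own capital per passing from expected revenues (an assumption for
# illustration), the minimum grant percentage needed rises with the
# cost per passing - and crosses 75% at $10,000 per passing.
MAX_ISP_CAPITAL = 2_500

def min_grant_pct(cost_per_passing: float) -> float:
    # The grant must cover everything above what the ISP can finance.
    return max(0.0, 1 - MAX_ISP_CAPITAL / cost_per_passing)

for cost in (8_000, 10_000, 15_000, 20_000):
    print(f"${cost:,} per passing needs a {min_grant_pct(cost):.0%} grant")
```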

The BEAD rules allow States to make exceptions for grant applications for high-cost areas that need more than 75% grant funding. But the rules are not ISP-friendly and allow a State broadband office to seek alternate proposals or to allow other technologies for such areas – such as fixed wireless using unlicensed spectrum or satellite broadband. It’s not hard to imagine ISPs deciding to pass on a BEAD application for grants greater than 75% if a State broadband office can ignore such grants because of the high cost.

These rules add one more layer of complexity to an already complex grant program. This adds to the calculus an ISP must do in order to decide if it wants to seek a BEAD grant in high-cost areas. Does an ISP put in for a grant request of greater than 75% with a chance that the State will instead seek a lower cost solution? Or should an ISP wait until the State broadband office is soliciting ISPs to serve the high-cost areas? I guess it will all boil down to the philosophy of each State broadband office – and they are not likely to say ahead of time how they feel about high-cost areas.

I get a little more upset every time I write about the BEAD rules. Many states have already held state broadband grant programs similar in size to the money they’ll get from BEAD, and those grants have probably a tenth of the rules of the BEAD grants. These grants were processed quickly, and projects tend to start construction the year after the grant award. I have to imagine that State broadband offices are intimidated by the BEAD rules – it puts them on the spot to do everything perfectly and meet all of the NTIA rules – and that’s probably not possible.

A Look at Smart Agriculture

We are at an interesting time in the history of man. The population just crossed the 8 billion mark. At the same time, we’re seeing big changes in weather patterns all over the globe that are disrupting the traditional ways that we raise crops. Some areas are already looking at prolonged droughts, while other places are a lot wetter than ever before. And just about everywhere is hotter.

I remember when I was a kid that there was a lot of talk about world starvation. The world population in 1960 had just hit 3 billion people, and there were a lot of countries on the edge of starvation. Science came to the rescue with new varieties of wheat, rice, and corn developed by Norman Borlaug and others, and food production around the globe soared.

The way to feed today’s population is through smart agriculture, and we don’t have far to look to see what that looks like. The Netherlands, which is about the same size as Maryland, is one of the major food producers in Europe and the second-biggest food exporter behind the U.S. The small country produces 4 million cows, 13 million pigs, and 104 million chickens annually.

The Netherlands is also one of the major providers of vegetables for Europe. The country has an amazing 24,000 acres of greenhouses that grow crops. The greenhouses are efficient and can raise ten times more crops per acre than traditional fields, using less fertilizer. It takes only a half-gallon of water to grow a pound of tomatoes in a greenhouse compared to the global average of 28 gallons.

The Netherlands is also the world’s top supplier of seeds for ornamental plants and vegetables. There are multiple climate-controlled seed banks that maintain multiple strains of plant seeds to provide the diversity needed in the race to keep crop strains ahead of the diseases that can destroy them.

Greenhouse agriculture is highly dependent on technology. Greenhouses utilize a system called Scoutbox that captures and analyzes insects to allow for a quick reaction to avoid infestations, and farmers have virtually eliminated pesticides in greenhouses. Greenhouses are automated for the watering, tending, and shipping of produce – they are food-producing factories.

Field crop agriculture is taking advantage of smart tractors and other smart equipment. Drones are widely used to monitor field crops. Satellite images are analyzed to pinpoint areas of fields that need water, fertilizer, or other amendments. Computers track and monitor farm animals from birth. The country has developed a side industry that gathers food and crop waste to feed animals.

The country is a hub for agricultural research, with fifteen of the top twenty agribusinesses having research and development labs there. All of this agriculture needs broadband. As in the U.S., the rural areas of the country were the last to get broadband, but the country has put a big push on connectivity. 100% of homes and farms can buy DSL – and this is not like slow rural U.S. DSL, but mostly comes with reliable speeds between 25 Mbps and 50 Mbps. Over 92% of residents have access to cable company broadband. Over 30% of homes now have access to fiber.

It’s obviously easier to fully wire a small country than our humongous far-flung farming areas. But the Netherlands example is highlighting a different way to raise food by putting greenhouses close to the people who consume the crops.

The one drawback to the agricultural methods in the country is that greenhouses require a lot of power. That’s a particularly pressing problem in a year when the Ukraine war is restricting oil and natural gas supplies. Like much of Europe, this tough time is goading the country to move more quickly to alternate energy sources. The country is already getting a lot of energy from wind and is working towards creating electricity with biomass and geothermal technologies.

The U.S. is experimenting with all of the same agricultural technologies being used in the Netherlands. But this small country is way ahead of us in terms of implementation. You have to wonder which region of the country will push these new technologies forward the fastest – it could be a big deal for an area looking to create jobs.

FCC Implements Broadband Labels

The FCC voted recently to implement consumer broadband labels. This was required by section 60504 of the Infrastructure Investment and Jobs Act. The new rules will become effective after the Office of Management and Budget approves them and after the final notice is published in the Federal Register. ISPs will then generally have six months to implement the labels.

The labels look a lot like the nutrition labels that accompany food. The label will include basic information like a customer’s service plan name, the monthly price for standalone broadband, any special pricing currently in place and when that special pricing expires, and a description of other monthly and one-time fees. The labels also must disclose the typical broadband speeds and latency.

It’s going to be interesting next summer to see how ISPs react to the label requirement. The pricing information alone must be giving shivers to the marketing folks at the biggest ISPs. This will make them list the price of standalone broadband to every customer and compare that to what the customer is currently paying. It’s been incredibly easy for consumers to subscribe to broadband in the past and never know the list price of what they are buying.

The requirement that I think will be the most controversial is the requirement to disclose the typical broadband speed and latency. I can’t wait until next year to see big ISPs implement this requirement. In the many surveys we have done, most consumers tell us that they have no idea of the speeds they are supposed to get – and that most of their monthly broadband bills don’t mention the speed.

Some ISPs will have a real dilemma with the speed disclosure.

  • It’s extremely challenging for a DSL or fixed wireless ISP to tell any customer the speed, since speeds vary from home to home and by time of day. Even if one of these ISPs wants to disclose a reasonable estimate of speed, it’s hard to see how it could do so. I can’t imagine how these ISPs can provide a label to a prospective customer, since the ISP won’t know the real speed until it tries to connect the customer.
  • What will ISPs that have been exaggerating speeds in FCC broadband reporting do? Just to use an example I heard yesterday, there are places where Starlink reported 350 Mbps to the FCC while a customer was barely getting 50 Mbps. If ISPs put the FCC speeds on the labels, they are going to hear a mountain of complaints from folks who aren’t seeing those high speeds. But if an ISP tries to be more truthful about speeds on the broadband label, it will have demonstrated that it fudged the speeds for the FCC mapping.
  • The most interesting speed issue might be upload speeds. It’s hard to think that any cable company or WISP is going to report upload speeds under 20 Mbps because doing so would be an admission of not delivering broadband. But declaring 20 Mbps or faster upload speeds won’t sit well with customers who are getting something far slower.

We’ll have to wait and see, but my guess is that ISPs will report the same speeds to customers that they report to the FCC. But an ISP that exaggerates its FCC speeds should be ready for an onslaught of complaints from customers who know the speeds on the label are not right. I think this is part of the reason these labels were mandated – to force ISPs to come closer to telling customers the truth. There are going to be some contentious years ahead for ISPs that claim imaginary speeds on the broadband label or to the FCC.

The FCC is not quite done with the labels. It issued a Further Notice of Proposed Rulemaking (FNPRM) to solicit input on a few issues. One is how to include bundling on the labels. The surveys my firm does still show more than 50% of urban customers on bundles, and bundling allows ISPs to hide the pricing of any component of the bundle.

The FNPRM also asks if there is a better way to disclose speeds, such as average speeds instead of typical speeds. Finally, the FCC is asking if ISPs should disclose network management policies that might harm consumers, such as blocking, throttling, or paid prioritization for some customers.

We won’t see the broadband labels in practice until at least next summer – but I’m expecting an uproar after folks see what ISPs say about prices and speeds.

Interest Rates and Grant Matching

I have a lot of clients looking at broadband grants that will require matching funds, and they are rightfully getting worried about the climb in interest rates.

Back when the upcoming BEAD grants were announced in November 2021, many of my clients had access to loans with interest rates in the range of 3% to 4%. The higher interest rates we are now seeing will clearly have a huge impact on the ability to afford accepting a grant to build in a rural area. Almost by definition, rural areas are sparsely populated, so it is always a challenge to cover the debt payments on grant matching funds.

Consider the following table, which shows the annual debt payments due on a $10 million loan for terms of 20, 25, and 30 years at interest rates ranging from 3% to 8%. This might be the loan for a $40 million BEAD project, where the grant applicant must cover the 25% matching cost of a 75% grant. The second set of numbers shows each loan’s payment as a percentage of the payment on a 20-year loan at 3%.

Interest Rate        3%         4%         5%         6%         7%         8%
20 years        672,157    735,818    802,426    871,846    943,929  1,018,522
25 years        574,279    640,120    709,525    782,267    858,105    936,788
30 years        510,193    578,301    650,514    726,489    805,864    888,274

As a percentage of a 20-year loan at 3%:
20 years           100%       109%       119%       130%       140%       152%
25 years            85%        95%       106%       116%       128%       139%
30 years            76%        86%        97%       108%       120%       132%
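For anybody who wants to check these numbers, the payments follow the standard level-payment annuity formula. Here is a minimal sketch in Python that reproduces the table, assuming the $10 million principal is the 25% match on a $40 million BEAD project as described above:

```python
def annual_payment(principal, rate, years):
    """Level annual payment that fully amortizes a loan:
    payment = principal * r / (1 - (1 + r) ** -n)."""
    return principal * rate / (1 - (1 + rate) ** -years)

# 25% matching funds on a hypothetical $40 million BEAD project
principal = 40_000_000 * 0.25  # $10 million loan

# Compare every term/rate combination to the 20-year, 3% baseline
baseline = annual_payment(principal, 0.03, 20)  # ~672,157 per year
for years in (20, 25, 30):
    for rate in (0.03, 0.04, 0.05, 0.06, 0.07, 0.08):
        pmt = annual_payment(principal, rate, years)
        print(f"{years} yr @ {rate:.0%}: {pmt:>12,.0f}  ({pmt / baseline:.0%} of baseline)")
```

Running this reproduces the payments in the table, including the 40% to 52% jump in annual debt cost when a 20-year loan moves from 3% to 7% or 8%.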

The table demonstrates several things. First, big interest rate increases are a massive disincentive for an ISP to make new investments. If an ISP had a business plan last year built around a 3% loan, the annual debt cost on the same 20-year term has climbed 40% to 52% at a 7% or 8% interest rate. Since debt cost is one of the major expenses of building fiber, this kind of increase could easily kill expansion plans.

I know a lot of ISPs that are putting expansion plans on hold due to interest rates. If an ISP decides to accept a high interest rate, it would only be from a belief that the loan can be refinanced if rates drop – but many loans don’t allow refinancing for some fixed number of years. That is also a gamble. In the past, when interest rates have spiked as they have now, rates have usually dropped back down – but there is never any guarantee that rates will return to the low levels of just a year ago.

The dilemma is bigger when borrowing to match grants. Grant projects have completion requirements, and ISPs might be forced to accept a high-interest loan due to the timing of construction. A grant project is different from a normal planned expansion, which can be delayed while waiting for more favorable interest rates.

One way to offset higher interest rates is a longer loan term, but that’s not always achievable – many lenders don’t like making loans for more than twelve or fifteen years. It’s also worth noting that one of the main consequences of rising interest rates is that banks start to pull back from making new loans. This may be counterintuitive, but the underlying interest rates that banks themselves have to pay also increase when retail interest rates are higher, which raises the risk and financial consequences of loan defaults. Just as home mortgages are harder to find when interest rates are high, the banks that were willing to lend to grant projects might also back off.

The retraction of new debt is exactly what the Federal Reserve intends when it raises interest rates. The whole point of raising rates is to cool off an overheated economy – without going so far as to cause a recession. It’s going to be a shock to any ISP to find out that the bank it was counting on is less interested in lending to it.