Fusion Energy on the Horizon?

This blog isn’t broadband-related, but it’s something that I find intriguing. Fusion energy has been touted as being about thirty years away since I was in college almost fifty years ago. As recently as ten years ago that was still the prediction. There have been huge amounts of investigation and progress during that time, but each new finding uncovered new challenges. The biggest issue has been finding a way to safely contain a ball of plasma that is as hot as the center of the sun. The approach over the years was to develop extremely powerful magnets that could suspend and hold the plasma.

But it looks like the breakthrough may finally have arrived. Helion, a start-up in Everett, Washington, along with a few other companies, looks to be on the path to building and selling a workable fusion reactor. The company is currently building its seventh-generation reactor, which should be completed in 2024, and plans to bring it to market.

One of the unique aspects of the Helion approach is that it is not trying to sustain a ball of plasma – that’s where the big fusion reactors have struggled. Helion instead creates short, repetitive bursts of plasma. The company’s sixth-generation reactor was built in 2020 and has been generating a high-energy pulse every ten seconds since then, achieving a temperature of over 100 million degrees Celsius with each burst. The company has been able to repeatedly sustain plasma, like that in the center of the sun, for longer than one millisecond per burst. The goal of the next-generation machine is to generate a pulse every second.

Helion has also taken a different approach than other fusion efforts in the generation of electricity. The typical approach has been to use the heat generated by the fusion plasma to create steam to drive turbines. Helion instead directly captures the electromagnetic energy released during the creation of the plasma, relying on Faraday’s Law of Induction. Helion has created a magnetic field around the fusion reactor that interacts with the energy released when deuterium and helium-3 ions are smashed together. Helion says this approach achieves 95% energy efficiency, compared to 70% for the more traditional steam approach.
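For reference, Faraday’s Law of Induction is the textbook physics behind this kind of direct energy recovery – a changing magnetic flux through a coil induces a voltage. In LaTeX form:

    \mathcal{E} = -\frac{d\Phi_B}{dt}

Here \mathcal{E} is the induced electromotive force and \Phi_B is the magnetic flux. As each plasma pulse expands against the surrounding magnetic field, it changes the flux through the coils, which induces current directly – no steam or turbines required.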

The seventh-generation fusion reactor will be about the size of a commercial shipping container and will produce about 50 megawatts of clean energy. That’s enough power for 40,000 homes. Helion believes it will be able to generate electricity for about $10 per megawatt-hour, which is about a third of the cost of coal-fired or solar power generation.

Perhaps the best feature of the fusion reactor is that it creates no serious waste. There are two radioactive isotopes created by the reaction. The first is tritium, which has a half-life of twelve years and is in big demand for use in wristwatches and highway exit signs. The other output is helium-3, which is needed to produce the fusion reaction – the fusion generator creates its own fuel. Helium-3 is rare and could also provide the basis for spaceship propulsion systems that might someday let us travel between stars. Approximately 25 tons of helium-3 could generate all of the electricity used by the U.S. in a year – but the whole U.S. supply of helium-3 today is only about 20 kilograms.

The end-product of widespread fusion generators would be endless clean energy. We’d still need electric grids, but since abundant power could be produced locally, this technology would eventually eliminate the energy-wasteful high-voltage transmission systems used today to connect regions of the electric grid together.

The first customers of the technology are likely to be power-hungry data centers. Data centers are most often built in the parts of the country with the most affordable electricity, but fusion power would mean we could put data centers close to the places where data is most used.

The Busy Hour and Data Caps

As states are getting ready to create their broadband plans for the NTIA’s $42.5 billion BEAD grants, we’re starting to see some interesting arguments being made by incumbents to influence state broadband plans. One of the aspects of the BEAD plan that hasn’t been discussed much yet is that the NTIA is stressing affordability. For example, the NOFO states several times that states must develop a middle-class rate plan. Everybody I know is scratching their head about what that means, but to the big ISPs, this must sound like rate caps – something they have vigorously fought everywhere.

One of my readers from Maine says that one aspect of high rates – data caps – is a big topic of discussion in the state. ISPs are repeating the claim they’ve made many times that data caps are needed to manage the network. The rest of the ISP argument is that heavy broadband users create extra costs and should pay for the extra usage.

I’ve written about this before, and the big ISP argument is pure bosh. Broadband costs are not related to the overall volume of broadband being delivered on a network. The cost is determined almost entirely by what network engineers call the busy hour. The busy hour each day is when a network sees the heaviest broadband usage. For networks with a lot of residential customers, the busy hour is usually during the evening when the most families are streaming video. However, I’ve seen a few networks where the busy hour was at some other time of day. One example I know of is a network serving college students that sees more usage after 10:00 PM and into the early hours of the morning, which the ISP attributes to gaming. Networks that serve business customers might have the busy hour at almost any time of the day, depending upon the mix of businesses using the network.

The busy hour is important because it determines how much bandwidth is needed for the various components of the network. For example, the data connection between the ISP core and a neighborhood must be large enough to handle the busy hour – otherwise, the connection gets congested and speeds drop. Another important network segment that depends upon the busy hour is the connection between the ISP and the Internet – there must be enough bandwidth available to accommodate the expected busiest time each month. Most ISPs buy enough bandwidth to the Internet to have at least a 20% cushion of capacity above the expected busy hour.

As long as an ISP is buying enough bandwidth to the Internet to satisfy the busy hour, it doesn’t matter how busy the network is the rest of the time – because all other times are less busy than the peak. The ISP is paying for the peak bandwidth and gets no savings for the times when customers aren’t using it.
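Here’s a minimal sketch of that provisioning arithmetic, using hypothetical hourly traffic numbers for a residential network:

    # Hypothetical hourly traffic averages (Gbps) for one day on a residential network
    hourly_traffic_gbps = [
        2.1, 1.6, 1.2, 1.0, 0.9, 1.1,   # midnight - 6 AM
        1.8, 2.6, 3.0, 3.2, 3.4, 3.6,   # 6 AM - noon
        3.5, 3.4, 3.6, 4.0, 4.8, 6.2,   # noon - 6 PM
        7.8, 8.9, 9.4, 8.7, 6.5, 3.9,   # 6 PM - midnight (evening streaming peak)
    ]

    busy_hour_gbps = max(hourly_traffic_gbps)            # 9.4 Gbps, around 8 PM
    cushion = 0.20                                       # the typical 20% headroom
    transit_to_buy_gbps = busy_hour_gbps * (1 + cushion)

    print(f"Busy hour: {busy_hour_gbps} Gbps")
    print(f"Transit to buy: {transit_to_buy_gbps:.1f} Gbps")   # 11.3 Gbps

Notice that the total daily volume – the sum of the whole list – never enters the calculation. Only the peak matters.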

The data cap for most ISPs is set at or close to 1 terabyte per month (1,000 gigabytes) of total combined upload and download usage for a customer. According to the most recent statistics from OpenVault, 15% of all homes now use more than a terabyte of data per month. I see network performance statistics from my clients, and none of them see a busy hour caused by a few big users – it happens at the time of day when the highest number of homes use broadband at the same time. There is a limit on how much bandwidth a given home can realistically use if it is doing the same things that most homes do – like watching video or gaming. A house that uses a terabyte of data per month is likely no busier at the peak than a home that uses half that. The house with the higher monthly usage normally just uses the broadband for more hours per day to accumulate the terabyte.
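The arithmetic backs this up. A terabyte spread over a month is a surprisingly small average data rate, and even at 4K streaming rates it represents hours of use per day rather than a burst of peak traffic (the 15 Mbps figure for a 4K stream is my assumption):

    TERABYTE_BITS = 1e12 * 8            # 1 TB expressed in bits
    SECONDS_PER_MONTH = 30 * 24 * 3600

    # Average rate if 1 TB is spread evenly across a month
    avg_mbps = TERABYTE_BITS / SECONDS_PER_MONTH / 1e6
    print(f"Average rate: {avg_mbps:.1f} Mbps")          # ~3.1 Mbps

    # Hours of 4K streaming (assumed ~15 Mbps) needed to accumulate 1 TB
    hours_of_4k = TERABYTE_BITS / 15e6 / 3600
    print(f"4K hours in the month: {hours_of_4k:.0f}")   # ~148 hours, about 5 per day

A 1 TB household looks exactly like the one described above – a normal home that streams for more hours, not one that hammers the network during the busy hour.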

The only time that data usage costs an ISP more money is when it has to increase the size of its data connection to the Internet. But interestingly, even that doesn’t always cost an ISP more because, as network usage has grown, the cost per gigabit of traffic to reach the Internet has dropped year after year. I have ISPs that are carrying four times the traffic of only a few years ago but are still paying no more for the connection – which further undercuts the argument that heavy users create more costs for an ISP.

Data caps do nothing significant to lower the busy hour, and any ISP that says they do is being untruthful. Data caps serve only one purpose – to make additional money for ISPs. There is no other explanation, and any ISP that argues that data caps are a network management issue is counting on the technical naivety of regulators and policy people. When Comcast first introduced data caps, only a fraction of one percent of homes used more than a terabyte per month. Comcast made the same argument then that the data caps were to rein in the usage of the few heavy users. The argument was no more valid then than it is now. ISPs have to be thrilled to see so many homes now exceeding the data caps – the overage fees for exceeding the data caps can be as much as $50 per household. Seeing 15% of households exceed the data caps is like Christmas to big ISPs – a big stealth rate increase.

Update on DOCSIS 4.0

LightReading recently reported on a showcase at CableLabs where Charter and Comcast demonstrated the companies’ progress in testing the concepts behind DOCSIS 4.0. This is the big cable upgrade that will allow the cable companies to deploy fast upload speeds – the one area where they have a major disadvantage compared to fiber.

Both companies demonstrated hardware and software that could deliver a lot of speed. But the demos also showed that the cable industry is probably still four to five years away from having a commercially viable product that cable companies can use to upgrade networks. That’s a long time to wait to get better upload speeds.

Charter’s demonstration used frequencies within the coaxial cables up to 1.8 GHz. That’s a big leap up from today’s maximum frequency utilization of 1.2 GHz. As a reminder, a cable network operates as a giant radio system that is captive inside the coaxial copper wires. Increasing the range of frequencies used opens up a big swath of additional bandwidth inside the transmission path. These breakthroughs are akin to G.fast, which harnesses higher frequencies inside telephone copper wires. While engineers can theoretically predict how the higher frequencies will behave, the reason for these early tests is to find all of the unexpected quirks of how the various frequencies interact inside the coaxial network in real-life conditions. A coaxial cable is not a perfectly sealed environment, and interference from the outside world can disrupt parts of the transmission path unexpectedly.
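To put a rough number on what the extra spectrum is worth, here’s a back-of-the-envelope sketch. The ~8 bits/s/Hz spectral efficiency is my assumption – roughly what DOCSIS OFDM with high-order QAM can achieve on a clean plant – not a number from the demos:

    old_top_hz = 1.2e9      # today's typical maximum frequency on the coax
    new_top_hz = 1.8e9      # the DOCSIS 4.0 extended-spectrum target

    added_spectrum_hz = new_top_hz - old_top_hz       # 600 MHz of new spectrum
    assumed_bits_per_hz = 8                           # assumption: high-order QAM

    added_capacity_gbps = added_spectrum_hz * assumed_bits_per_hz / 1e9
    print(f"Added spectrum: {added_spectrum_hz / 1e6:.0f} MHz")
    print(f"Rough added capacity: {added_capacity_gbps:.1f} Gbps")   # ~4.8 Gbps

That extra handful of gigabits is what would let cable companies dedicate serious spectrum to upload without cannibalizing download.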

Charter used equipment supplied by Vicma for the node, Teleste for amplifiers, and ATX Networks for taps. The node is the electronics that sits in a neighborhood and converts the signal from fiber onto the coaxial network. Amplifiers are needed because signals in a coaxial system don’t travel far before they need to be boosted and refreshed. Taps are the devices that peel signals off the coaxial distribution network to feed individual homes. A cable company will have to replace all of these components, plus install new modems, to upgrade to a higher-frequency network – which means the DOCSIS 4.0 upgrade will be expensive.

One of the impressive findings from the Charter demo was that the company said it could overlay the new DOCSIS system on top of an existing cable network without respacing. That’s a big deal because respacing would mean moving existing channels to make room for the new bandwidth allocation.

Charter achieved speeds of 8.9 Gbps download and 6.2 Gbps upload and feels confident it will be able to push this over 10 Gbps. Comcast achieved speeds in its test of 8.2 Gbps download and 5.1 Gbps upload. In addition to researching DOCSIS 4.0, Comcast is also looking for ways to use the new technology to beef up existing DOCSIS 3.1 networks to provide faster upload speeds sooner.

Both companies face a market dilemma. They are both under pressure to provide faster upload speeds today. If they don’t find ways to do that soon, they will lose customers to fiber overbuilders and even the FWA wireless ISPs. It’s going to be devastating news for cable stock prices in the first quarter after Charter or Comcast loses broadband customers – but the current market trajectory shows that’s likely to happen.

Both companies are still working on lab demos and are using a breadboard chip designed specifically for this test. The normal lab development process means fiddling with the chip and trying new versions until the scientists are satisfied. That process always takes a lot longer than executives want but is necessary to roll out a product that works right. But I have to wonder if cable executives are in a big hurry to make an expensive upgrade to DOCSIS 4.0 so soon after upgrading to DOCSIS 3.1.

7G – Really?

I thought I’d check in on the progress that laboratories have made in considering 6G networks. The discussion of what will replace 5G kicked off with a worldwide meeting hosted by the University of Oulu in 2019 in Levi, Lapland, Finland.

6G technology will explore the frequencies between 100 GHz and 1 THz. This is the frequency range that lies between radio waves and infrared light. These spectrum bands could support unimaginable wireless data transmission rates of up to one terabit per second – with the tradeoff that such transmissions will only be effective over extremely short distances.
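A quick sketch shows why this band behaves so differently from today’s cellular spectrum – the wavelengths are tiny. This is pure textbook physics rather than anything from the 6G research itself:

    C = 3e8  # speed of light, m/s

    for freq_hz in (100e9, 300e9, 1e12):
        wavelength_mm = C / freq_hz * 1000
        print(f"{freq_hz / 1e9:>6.0f} GHz -> wavelength {wavelength_mm:.2f} mm")
    #  100 GHz -> 3.00 mm
    #  300 GHz -> 1.00 mm
    # 1000 GHz -> 0.30 mm

Millimeter and sub-millimeter waves are readily absorbed by air, rain, foliage, and walls, which is why terabit rates would only hold up over very short distances.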

Scientists have already said 5G will be inadequate for some computing and communication needs. There is definitely a case to be made for applications that need huge amounts of data in real time. For example, a 5G wireless signal at a few gigabits per second cannot carry enough data to support complex real-time manufacturing processes, and it also can’t support things like realistic 3D holograms or the future metaverse.

Scientists at the University of Oulu say they are hoping to have a lab demonstration of the ability to harness the higher spectrum bands by 2026, and they expect the world will start gelling on 6G standards around 2028. That all sounds reasonable and is in line with what they announced in 2019. One of the scientists at the University was quoted earlier this year saying that he hoped that 6G wouldn’t get overhyped as happened with both 4G and 5G.

I think it’s too late for that. You don’t need to do anything more than search for 6G on Google to find a different story – you’ll have to wade through a bunch of articles declaring we’ll have commercial 6G by 2030 before you even find any real information from those engaged in 6G research. There is even an online 6G magazine with news about everything 6G. These folks are already hyping that there will be a worldwide scramble as governments fight to be the first ones to master and integrate 6G – an upcoming 6G race.

I just shake my head when I see this – but it is nothing new. It seems every new technology these days spawns an industry of supposed gurus and prognosticators who try to monetize its potential. The first technology I recall seeing this happen with was municipal WiFi in the early 2000s. There were expensive seminars and even a paper monthly magazine touting the technology – which, by the way, barely worked and quickly fizzled. Since then, we’ve seen the guru industry pop up for every new technology like 5G, blockchain, AI, bitcoin, and now the metaverse and 6G. Most new cutting-edge technologies find their way into the economy, but at a much slower pace than touted by the so-called early experts.

But before the imaginary introduction of 6G by 2030, we will first need to integrate 5G into the world. Half of the cellphones in the world still connect using 3G. While 3G is being phased out in the U.S., it’s going to be a slower process elsewhere. And while there are hundreds of Google links to articles that predict huge numbers of 5G customers this year – there aren’t any yet. At best, we’re currently at 4.1G or 4.2G – but the engineering reality is obviously never going to deter the marketers. We’ll probably see a fully compliant 5G cell site before the end of this decade, and it will be drastically different, and better, than what we’re calling 5G today. It’ll take another few years after that for real 5G technology to spread across U.S. urban areas. There will be a major discussion among cellular carriers about whether 5G capabilities make any sense in rural areas, since the 5G technology is mostly aimed at relieving overcrowded urban cellular networks.

Nobody is going to see a 6G cellphone in their lifetime, except perhaps as a gimmick. We’re going to need several generations of better batteries before any handheld device can process data at terabit speeds without zapping the battery within minutes. That may not deter Verizon from someday showing a cellular speed test at 100 Gbps – but marketers will be marketers.

Believe it or not, there are already discussions about 7G – although nobody can define it. It seems it will have something to do with AI and the Internet of Things. It’s a little fuzzy how something after 6G will even be related to the evolution of cellular technology – but that won’t stop the gurus from making money off the gullible.

Future-proofing Grants

There has been a lot of discussion in the last few months about how wonderful it was for Congress to have increased the speed requirements for broadband grant eligibility to 100/20 Mbps in the $42.5 billion BEAD grants. But is it really all that wonderful?

It’s obvious that the FCC’s definition of broadband of 25/3 Mbps is badly out of date. That definition was set in 2015, and it seemed like an adequate definition at the time. If we accept that 25 Mbps was a good definition for download speed in 2015 and that 100 Mbps is a good definition in 2022, then that is an acknowledgment that the demand for download broadband speed has grown at about 21% per year, which is shown in the table below.

Historic Download Speed Demand in Megabits / Second

Year:   2015  2016  2017  2018  2019  2020  2021  2022
Mbps:     25    30    37    44    54    65    79    95

We have outside evidence that the 21% growth rate makes sense. Several times over the last decade, both Cisco and Opensignal opined that the residential demand for download speed has been growing at that same 21% rate. Cisco said that it thought business demand was growing at about a 23% clip.

This raises an interesting question: how good is it for a grant program today to use a 100 Mbps definition of broadband? The question is relevant because the networks funded by BEAD grants aren’t going to be constructed for years. My best guess is that the majority of BEAD grants will be awarded in 2024, and ISPs will have four more years to finish network construction – until 2028. The table above shows how much demand for download speed grew from 2015 to 2022. What might this look like by the time the BEAD networks are fully implemented?

Future Projected Download Speed Demand in Megabits / Second

Year:   2022  2023  2024  2025  2026  2027  2028
Mbps:    100   121   146   177   214   259   314

If we accept that 100 Mbps download is adequate today as a definition of download broadband speed, then if broadband demand continues to grow at 21% annually, the definition of download broadband ought to be over 300 Mbps in 2028. I know many cynics will say that broadband demand cannot continue to grow at the historic rate, but those same people would have said the same thing in 2015 – and been proven wrong. In fact, there has been a steady growth curve for broadband speed demand back into the 1980s. There is no evidence I’ve heard that would indicate that the demand growth has slowed down.
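Both tables fall out of simple compounding, and the 21% figure itself is just the compound rate implied by moving from 25 Mbps in 2015 to 100 Mbps in 2022. A quick sketch that reproduces the tables:

    # Compound annual growth rate implied by 25 Mbps (2015) -> 100 Mbps (2022)
    cagr = (100 / 25) ** (1 / 7) - 1
    print(f"Implied growth rate: {cagr:.1%}")     # ~21.9% per year

    def project(start_mbps, start_year, end_year, rate=0.21):
        speed = start_mbps
        for year in range(start_year, end_year + 1):
            print(f"{year}: {speed:.0f} Mbps")
            speed *= 1 + rate

    project(25, 2015, 2022)     # reproduces the historic table, ending near 95
    project(100, 2022, 2028)    # reproduces the projection, ending near 314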

We don’t really need to have this theoretical discussion of adequate broadband speeds because the market is ahead of the above speed growth curves. Since the early 2000s, cable companies have unilaterally raised the speed of basic broadband to keep ahead of the demand curve. The cable companies have raised minimum speeds every few years as an inexpensive way to keep customers happy with cable broadband.

The cable industry is in the process right now of increasing basic download speeds to 200 Mbps – a number higher than the table above predicts for 2022. There is a strong argument to be made that the cable companies have been resetting the definition of broadband while regulators were too timid to do so. I can remember when the cable companies collectively and unilaterally increased speeds to 6 Mbps, 12 Mbps, 30 Mbps, 60 Mbps, 100 Mbps, and now 200 Mbps.

This argument is further strengthened when considering that the big cable companies serve almost 70% of all broadband customers in the country today. When Congress gave the FCC responsibility for broadband in the 1996 Telecommunications Act, one requirement the FCC has largely shoved under the rug was that rural broadband should be in parity with urban broadband. If 70% of new broadband subscribers in the U.S. are offered 200 Mbps broadband as the slowest basic product, it’s hard to argue that any definition under 200 Mbps today represents parity.

Congress wasn’t all that brave in setting the definition of grant eligibility at 100/20 Mbps. That is the lowest defensible definition today, and a number that is already drifting toward obsolescence. Recall the gnashing of teeth in the industry last year while the legislation was being created – cable companies and WISPs both thought that 100/20 Mbps was too aggressive.

If we really wanted to future-proof the BEAD grants, then technology that won’t be built until 2028 should be required to deliver at least 300 Mbps download. Anything less than that means networks that the public will feel are inadequate as they are being deployed.

Another BEAD Grant Complication

I’ve been thinking more about the NTIA’s definition of Reliable Broadband Service that was part of the recently issued Notice of Funding Opportunity (NOFO) for the $42.5 billion BEAD grants. That definition says that grants cannot be used to overbuild a reliable broadband technology that meets or exceeds the 100/20 Mbps speed threshold of the grants. The NOFO says the grants can’t be used where adequate speeds are already delivered by the following technologies: (i) fiber-optic technology; (ii) Cable Modem/Hybrid fiber-coaxial technology; (iii) digital subscriber line (DSL) technology; or (iv) terrestrial fixed wireless technology utilizing entirely licensed spectrum or using a hybrid of licensed and unlicensed spectrum.

The policy behind this makes sense – the NTIA doesn’t think that valuable federal grant dollars should be used where adequate broadband technology is already in use. That would make them a good shepherd of the federal dollars.

But this particular definition is going to cause some complications the NTIA might not have considered. I’ve been running into FWA cellular broadband in rural markets. So far, I’ve only encountered the new technology from T-Mobile and Verizon, but it will also be introduced by Dish Network, and AT&T says it has plans to roll out a faster cellular home product as well.

The FWA technology comes from a cellular company beefing up cell sites to provide home broadband in addition to cell phone service, enabled by the introduction of new spectrum bands. For marketing purposes, the carriers are labeling these new bands as 5G, although the technology is still 4G LTE.

The cell carriers have been offering a weak version of home broadband for years, marketed as a hotspot or jetpack. But that technology shared the same frequencies used for cell phone service, and the broadband has been slow, weak, erratic, and expensive. However, putting home broadband onto new cellular spectrum changes the product drastically.

Recently I heard from a farmer who is getting 200 Mbps download speeds from a rural T-Mobile FWA connection – this farmer sits right next to a large cell tower. According to the NTIA definition, no BEAD grant should subsidize bringing fiber to this farm. But as usual, real life is more complicated than that. This same farmer says that his nearest neighbors, only a little over a mile away, are seeing speeds significantly below 50 Mbps.

This makes sense because that’s how cellular technology works. Most people don’t realize how quickly broadband signal strength weakens with distance from a cell site. In cities, practically everybody is within half a mile or a mile from a cell site, so we never notice. But in rural areas, most people live too far from a cell site to get decent bandwidth from this technology. Consider the following heatmap of a real cell site.

The fastest broadband speeds would be within a few thousand feet of the tower, as with the farmer. The areas that might get 100 Mbps broadband are shown in orange and yellow on the map. Speeds fall below 100 Mbps in the green areas, and by the time the signal reaches the light blue areas, the speeds are almost non-existent. The purple areas show where a voice signal might carry, but only unreliably.
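To illustrate how quickly the signal falls off, here’s a sketch using the standard log-distance path-loss model. The transmit power, frequency, and path-loss exponent are illustrative assumptions, not measurements from this cell site:

    import math

    def received_power_dbm(dist_km, tx_power_dbm=46, freq_mhz=2500, n=3.5):
        # Free-space loss out to a 1 km reference, then a steeper
        # exponent n to approximate terrain and clutter beyond that.
        fspl_1km = 20 * math.log10(freq_mhz) + 32.45    # dB at 1 km
        if dist_km <= 1:
            loss = 20 * math.log10(max(dist_km, 0.01)) + 20 * math.log10(freq_mhz) + 32.45
        else:
            loss = fspl_1km + 10 * n * math.log10(dist_km)
        return tx_power_dbm - loss

    for d in (0.2, 1, 2, 5, 10):
        print(f"{d:>4} km: {received_power_dbm(d):6.1f} dBm")
    #  0.2 km:  -40.4 dBm   (strong - the farmer next to the tower)
    #    1 km:  -54.4 dBm
    #    2 km:  -64.9 dBm   (the neighbors a mile or so out)
    #    5 km:  -78.9 dBm
    #   10 km:  -89.4 dBm   (barely usable)

With these assumptions, every doubling of distance beyond the first kilometer costs roughly 10 dB, and the usable modulation rate – and therefore the speed – drops with it.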

What does this mean for the BEAD grants? As T-Mobile and the other cell carriers start updating rural cell sites they are going to be putting heatmaps like the one above into the FCC mapping system. It’s worth noting that most cell sites don’t create a roughly symmetrical coverage pattern because the wireless signal gets disrupted by any obstacles in the environment, even small rolling hills. It’s also worth noting that cellular coverage is dynamic and changes with temperature, precipitation, and even wind.

Recognizing cellular broadband coverage over licensed spectrum as reliable broadband will have several consequences. First, it disrupts grant coverage areas, since there will be cellular pockets in every county that won’t be eligible for grants. This creates a swiss cheese phenomenon where areas eligible for grants sit next to rural areas that are not. That will complicate the engineering of a broadband solution for the areas that are left. This is the same thing the FCC did with the RDOF awards – it chopped up potential grant areas into incoherent, illogical, and costly swiss cheese.

This also might mean this farmer won’t get fiber. His neighbors who can’t get good speeds on T-Mobile might be covered by a BEAD grant, but an ISP might be unwilling to fund the cost to reach this farmer if the cost is not covered by a grant.

I doubt that the NTIA thought of the practical consequences of the new definition, just like I can’t imagine the FCC had the slightest idea of the absolute mess they made with RDOF coverage areas. The only way to justify building a new network in a rural area, even with grants, is to cover large areas with one coherent network – not by building a network that has to somehow avoid RDOF areas and cell towers.

ISPs interested in BEAD awards are now going to have to wait until the new broadband maps come out to know what this might do to their grant plans. I’m thinking that, at least in some cases, this will be the final straw that breaks the camel’s back and convinces an ISP to walk away and not even try.

The BEAD NOFO – Financial Issues

The NTIA has established the basic rules for the $42.5 billion BEAD grants in the recent Notice of Funding Opportunity (NOFO). One of the most important aspects of the rules that potential applicants need to understand relates to funding and financing. Note that the NOFO instructs the states on what the NTIA expects to be included in each state’s broadband grant program for the BEAD funding.

The first set of rules concerns the amount of grant funding. Since the IIJA passed Congress, the industry has been talking about BEAD grants offering 75% grant funding. It’s not that simple.

The NOFO says that states are required to incentivize matches of greater than 25 percent from subgrantees. That means states must make every effort to award less than a 75% grant. In fact, if two entities request funding to build fiber in the same geographic area, the one asking for the smaller amount of money will automatically win, assuming it meets the basic grant requirements. This makes sense and will stretch the grant money further, but ISPs should be prepared for a sliding scale where the smaller the grant request, the more grant points awarded.

The original Congressional language also held out a big promise for the layering of grants. The legislation specifically promised that an ISP could use ARPA or CARES funding from states and localities as matching for the BEAD grants. But the NTIA rules turn that promise on its head. States are encouraged to require a match from the subgrantee rather than utilizing other sources where the state deems the subgrantee capable of providing matching funds. In other words, if a grant applicant has the ability to fund the grant matching, the NOFO suggests states should not allow the layering of local monies as grant matching. When that sinks in, it’s going to put a lot of public-private partnership discussions on hold.

A more disturbing requirement is that applicants must provide an irrevocable letter of credit along with a grant application: “During the application process, prospective subgrantees shall be required to submit a letter from a bank . . . committing to issue an irrevocable standby letter of credit, in the required form, to the prospective subgrantee. The letter shall at a minimum provide the dollar amount of the letter of credit.”

I have to wonder if the folks at NTIA understand what an irrevocable standby letter of credit (SLOC) means. Consider a grant application for $40 million, with a $10 million grant match. A bank must treat an SLOC as if it were a bank loan. When the bank issues the SLOC, it ties up the $10 million on its balance sheet in the same way it would if it made a loan. The bank can’t lend that money to anybody else – it is frozen. And while the bank is still holding the cash, it can’t be treated as part of the bank’s cash reserves since it is pledged. The bank will charge a minimal amount of interest on the letter of credit – in recent years, something like 2%, though it’s hard to know what that might be with rising interest rates. If the rate is 2% and the grant review takes a year, the ISP will have spent $200,000 in interest expense – even if it doesn’t win the BEAD grant.

It gets worse. When an ISP wins a grant, it must then produce an irrevocable letter of credit for the life of the grant. This is even worse than the first letter of credit. Bank loans for fiber projects typically use construction financing – the same kind of financing used to build a house. For a project built over four years, the ISP would take a draw each month as it needs funds and would only pay interest on the money that has been drawn. But if a letter of credit must be in place on the first day of a grant award, then using my example, interest charges on the full $10 million match would start the day the letter of credit is issued. That completely negates the primary advantage of bank construction financing. My back-of-the-envelope math says that for a $10 million match, the two layers of letters of credit could add $1 million to the cost of the project – all flowing to banks in the form of interest. None of this money is recoverable from the grant funding, and it comes out of the grant winner’s pocket.
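Here’s the back-of-the-envelope math in sketch form. The 2% rate, one-year application window, and four-year build come from the discussion above; the even monthly draw schedule is my simplifying assumption:

    match_amount = 10_000_000
    annual_rate = 0.02          # assumed bank charge on the letter of credit

    # Stage 1: SLOC held during a one-year application/review period
    application_cost = match_amount * annual_rate * 1           # $200,000

    # Stage 2: irrevocable SLOC on the full match for a 4-year build
    build_sloc_cost = match_amount * annual_rate * 4            # $800,000

    total_loc_cost = application_cost + build_sloc_cost
    print(f"Total letter-of-credit cost: ${total_loc_cost:,.0f}")   # $1,000,000

    # Compare: normal construction financing with even monthly draws,
    # paying interest only on the balance actually drawn
    months = 4 * 12
    balance, draw_cost = 0.0, 0.0
    for _ in range(months):
        balance += match_amount / months
        draw_cost += balance * annual_rate / 12
    print(f"Normal construction-draw cost: ${draw_cost:,.0f}")      # ~$408,000

Under these assumptions, the letter-of-credit structure costs roughly a million dollars, versus the roughly $400,000 an ISP would pay under ordinary construction financing – and none of it comes back from the grant.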

To make matters even worse, a lot of smaller ISPs will not be able to obtain the letter of credit needed to apply for the grant. It’s a classic chicken-and-egg scenario: a bank won’t give an ISP an SLOC unless its existing balance sheet supports that much of a loan, but the ISP’s balance sheet won’t justify the SLOC until it wins the grant. This rule will definitely discriminate against smaller ISPs – and by smaller, I’m including some fairly large companies like regional telephone companies and cooperatives.

The NOFO says there will be additional language coming to describe how municipalities will deal with the letter of credit issue. The NTIA is probably struggling with this because bond financing is more complex than a bank loan. A bond doesn’t exist until the day that bond buyers agree to buy the bond. It’s always possible that a bond issue won’t sell, so there can be no bank guarantees tied to future bond issues. I can’t wait to see this solution.

I don’t want to be dramatic, but this seems like massive overkill. It would appear that the NTIA is so fearful of having a few grant winners who will default on projects that they are imposing a billion-dollar industry cost to solve a million-dollar problem.

The Waning of the Bundle

For many years the cable companies (and telcos who sold cable TV) have relied on the power of the bundle as a way to gain and then retain customers. The bundle was one of the predominant marketing tools for such ISPs. They showed customers that there was a big savings from buying multiple products.

I can remember the first time I moved into a Comcast area about eight years ago and wanted to buy only broadband. The company strongarmed me and wouldn’t let me buy broadband without some form of cable product, so I ended up buying the smallest cable product possible – and never even connected the set-top box. The days of forced bundles are now gone, but in this case, Comcast got an extra $30 per month from me for several years, which I had to pay since the only alternative was super-slow DSL.

My consulting firm has done market surveys for twenty years, and I can remember for many years that at least 70% of customers of cable companies were buying a bundle. Most of these bundles were not forced like mine. At the peak, over 75% of homes were buying traditional cable TV, and most cable TV customers who became interested in buying broadband readily agreed to the bundle.

Buying a bundle felt like a good deal since the ISP could show you big savings over buying the various products individually. The bundle savings was the predominant marketing story used for many years to lure existing cable customers into adding broadband. The bundle is possibly the most important reason why the cable companies captured so many DSL customers – because DSL customers in cities were not yet dissatisfied with speeds the way they became during the pandemic.

But then along came Netflix and other online video providers and cord cutting, and millions of customers started dropping cable service. That’s when the downside of bundling became apparent. The big cable companies used bundling to try to bully customers into not dropping cable TV. Let me give an example. If a customer had a $100 bundle of both broadband and cable TV and asked to break the bundle, they might have been told that their new bill was going to be $70. It didn’t matter which of the two services they dropped, the service that remained was going to cost more than half of the current billing. When customers heard this, many of them decided not to break the bundle – they weren’t going to get the savings they hoped for.

The bundle must still be a powerful tool to lure customers, because the web is full of inexpensive one-year bundle deals for new customers – likely aimed at the remaining DSL customers. But interest in buying traditional cable is plummeting in the same manner as telephone service did a decade ago. Last year the cable industry collectively lost nearly five million cable customers, and the rate of homes dropping cable seems to be accelerating. One has to think that a lot of the millions that dropped cable last year decided to break the bundle to do so.

By the end of this year, the national penetration rate of households buying traditional cable TV is going to drop below 50%. I don’t think anybody knows when and where the cable market will bottom out – there will likely remain loyal families for many years to come.

I have to wonder if, at some point, the cable companies will give up on the idea of bundling. The metro cable markets are going to soon be flooded by competitors – first by the cellular companies selling FWA cellular broadband, and over the next few years by a ton of new fiber competition. Those new competitors are going to be focused entirely on offering affordable broadband prices, and the bundle issue might muddy the water more than help the cable companies. It’s going to be an interesting marketing evolution to watch.

The BEAD NOFO – A New Definition of Broadband Technology

The Notice of Funding Opportunity (NOFO) for the $42.5 billion BEAD grants establishes new rules for the grants that might have a wider implication for broadband elsewhere. One of the most interesting aspects of the NOFO is the definition of a new term – Reliable Broadband Service.

The NOFO defines Reliable Broadband Service to mean a broadband service shown in the FCC maps to be provided using (i) fiber-optic technology; (ii) Cable Modem/Hybrid fiber-coaxial technology; (iii) digital subscriber line (DSL) technology; or (iv) terrestrial fixed wireless technology utilizing entirely licensed spectrum or using a hybrid of licensed and unlicensed spectrum.

The key purpose of this new term is to define the technologies that can’t be overbuilt if the speeds on those technologies meet the BEAD speed requirements. For example, BEAD grants can’t be used to overbuild a cable network that delivers speeds faster than 100/20 Mbps today. However, there are some tiny rural cable systems with speeds slower than that – and BEAD could be used to overbuild them.

The more interesting aspect of this definition is that the BEAD grants can be used to overbuild any other technology such as satellite broadband or fixed wireless networks using unlicensed spectrum. It doesn’t matter what the speeds are for these networks since NTIA has declared these technologies to be unreliable.

While this was expected and could be inferred from the Congressional enabling language, this is a clear blow to existing WISPs that are delivering decent speeds in rural places. The vast majority of WISPs use unlicensed spectrum – and I believe this language also covers the use of CBRS spectrum, which is lightly licensed but not exclusive to a given ISP.

Given this definition, it’s also hard to imagine that anybody proposing satellite service or fixed wireless using unlicensed spectrum will be eligible to receive BEAD grants. It doesn’t matter what speeds those technologies can deliver – the NTIA has labeled them as unreliable.

This definition puts a label on those technologies that I’ve never seen used before, but it is descriptive. Fixed wireless coverage varies in practice with factors like temperature, humidity, and precipitation, and the biggest issue with unlicensed spectrum is the possibility of debilitating interference. Not every home can see the satellites due to terrain or tree cover, so satellite technology cannot guarantee serving everybody in a grant footprint – a key requirement in the NOFO for any grant winner. Earlier federal grants allowed for technologies that would reach most, but not all, households – but the BEAD grants insist on total coverage.

Another part of the NTIA’s reasoning for this definition is probably the useful life of satellite and fixed wireless technology. It seems pretty certain that low-orbit satellites will fall out of the sky after 5 – 7 years and will have to be replaced, and the industry generally understands current fixed wireless technology to have a shelf life of around seven years. It’s easy to label these technologies unreliable when there is no guarantee that a grant recipient will replace the technology when needed.

This also raises an interesting question for elsewhere in the industry. Starting last year, the major broadband agencies in the federal government have been required to regularly communicate and coordinate. I have to wonder if this new definition might be the death knell for some of the big open RDOF awards that use satellite and fixed wireless technologies. If the FCC agrees that those technologies can’t serve everybody in an award area, then this new definition gives the FCC an easy way to cancel those awards.

This doesn’t mean that WISPs or satellite providers won’t still be in the market or even that they might not fare well – they are free to compete. But this new definition means that these technologies can be overbuilt with fiber or other technologies that satisfy the NOFO. And it seems likely that this means that satellite companies and WISPs using unlicensed spectrum are not going to be able to get BEAD grants.

Like every requirement in the NOFO, these rules will have to be interpreted by the fifty states. I think this particular language is fairly clear, but it will be interesting to see how states interpret this and the many other new rules and definitions created by the NOFO.


An Odd Appeal to Rural America

USTelecom recently sent a letter to practically every politician who might have a hand in deciding how broadband grants are awarded – the White House and key Cabinet officials, the NTIA, the FCC, members of Congress, Governors, Mayors, other local officials, tribal leaders, and state broadband offices. That’s some mailing list!

The main thrust of the letter is that communities should only rely on experienced broadband partners to build and operate networks – obviously meaning the big ISPs. The letter reminds officials that building a network is only a part of the solution and that communities need partners that know how to operate the business over the long run. The letter specifically calls out municipalities and non-profits as not being good partners because of their “propensity to fail at building and maintaining complex networks over time.”

The letter asks Congress to modify the current grant rules to remove any preferences for municipalities, non-profits, and electric cooperatives. USTelecom wants the grant rules to be changed to favor ISPs with experience and financial wherewithal. The big ISPs also think that communities should only be able to spend grant money by giving it to an ISP partner.

USTelecom also uses the letter to ask for changes that will make it easier to build broadband networks. They ask the various governments to:

  • Eliminate permitting delays and fees.
  • Streamline rights-of-way acquisition.
  • Streamline easements for railroad and other complex situations.
  • Eliminate Title II regulation (which, by the way, was eliminated by the last FCC – they actually fear it coming back).
  • Change the contributions to the Universal Service Fund so that all players pay a fair share.
  • Use only the FCC’s new maps to determine grant eligibility.

This letter is perhaps the most succinct statement of the broadband wish list of the big ISPs that I’ve seen in many years. They have been lobbying for everything on this list, but I can’t recall them asking for everything at the same time.

From a strategic position, this letter is mostly aimed at local officials. It’s unlikely that Congress or the White House is going to change the trajectory of the current grants at this late date. To do so would start the grant clock all over and push grant funding a few more years into the future.

It’s an interesting appeal to make to local governments since city and county officials will have a big hand in determining who gets grant funding when they choose a grant partner. This wish list basically tells local officials that they should have no option other than to fork grant money over to the biggest ISPs. And while asking local officials to change local rules to make it easier to build broadband, the big ISPs don’t want local governments to be able to challenge the FCC maps that the ISPs create. My guess is that most local officials are going to be offended by this document, so I don’t think this is going to get the reaction that USTelecom is hoping for.

The other odd aspect of this appeal is that most current grant money is going to rural America. The letter asks to keep electric cooperatives out of the broadband business – but many rural people still remember how the electric cooperatives bailed them out when nobody else would bring electricity. It’s interesting to stress experience when electric cooperatives have been around a lot longer than ISPs like the cable companies.

The big telephone companies have been around the longest – but they have a very poor name in rural America. A century ago, the large Bell companies refused to build in rural America, just like the big electric companies. Thousands of small local telcos were formed to fill the void but most eventually got gobbled up by companies that morphed into CenturyLink, Frontier, and Windstream. The big telcos have largely abandoned rural America over the last few decades – and it is that neglect that is the primary reason why rural broadband is in such bad shape. I’m sure there are some communities that will partner with the big ISPs – but a lot of communities that I work with would hope to partner with almost anybody else. This letter is not going to change many minds.