Please Don’t Force Low Rates

The NTIA conducts an annual broadband survey, and the 2021 survey asked a question about affordability. The survey asked folks who didn’t have home broadband what they would be willing to pay, with the question, “At what monthly price, if any, would your household buy home Internet service?”

The NTIA estimates that there are over four million households that say they can’t afford broadband. The purpose of the survey was to understand the kind of price points that might be needed to get broadband to more of these households. I think the NTIA was surprised by the results – three-quarters of respondents said they would only get broadband if it was free.

I find this result to be troubling for several reasons. First, many of the homes in this category are the poorest homes that truly can’t afford broadband. I’m sure many of the folks who say they can’t afford broadband would love to have it like most of the rest of us. But as important as broadband is, it’s not more important than rent and food.

My second problem is that not everybody who says they can’t afford broadband is telling the truth. How do I know this? My firm has been doing surveys in the industry for over twenty years, and we’ve asked some version of the same question in hundreds of surveys. What I have discovered in doing surveys is that the responses to any survey question involving money are not fully reliable.

This is well-known by folks who conduct surveys for a living. As an example, as many as half of survey respondents give a false answer when asked about their level of family income. There are a lot of conjectures about why this is so, but surveyors know you can’t fully trust the responses to most questions involving money.

A question I’ve often asked is what people would like to pay for broadband. That’s a little different than the question being asked by the NTIA, but not much different. The big difference is that we ask this question of everybody, not just those who say they can’t afford broadband. It’s not unusual to see 20% or 30% of respondents saying they don’t want to pay more than $10 or $15 per month. Most of the people giving that response are already paying $60 or $70 per month for broadband. I have no doubt that the responses about wanting low rates are serious – folks really would love to save money. But the responses clearly reflect what folks wish broadband cost, which is different from what they are willing to spend – we already know they are spending a lot more. ISPs that get this survey response never know what to do with the answer – they know they can’t afford to sell broadband at super-low rates if they want to stay in business.

My biggest concern is what the NTIA or State Broadband Offices will do with the results of this survey. I’m afraid they are going to come out with a policy that ISPs must offer a $30 broadband rate coupled with the $30 ACP plan reimbursement for qualifying homes. This would provide broadband to the homes that really can’t afford broadband – but it would also give low-price broadband to a lot of homes that are willing to pay more, but who will gladly take the discount.

I’ve looked at dozens of rural broadband feasibility studies this year, and most rural ISPs absolutely cannot make the business work if some portion of customers is only paying $30 (through the ACP). I’ve looked at a lot of plans where the rates have to be $60, $70, or even $80 for the ISP to cover all costs – particularly the debt used to provide grant matching funds.

The NTIA isn’t supposed to be able to direct rates in the BEAD grants. The IIJA legislation clearly says that the NTIA cannot force any kind of price control on grant applicants. But that doesn’t mean they can’t try this in a backdoor way, such as giving more grant points to ISPs willing to offer super-low rates. That probably doesn’t qualify as forcing low rates, but it sure feels the same.

Even with all of these concerns, I sympathize with the NTIA if it tries this. The agency wants to get broadband into every home, and the only way to do this for the poorest homes is to make broadband somehow free. I know that rural cooperatives and telephone companies might try to make this work. But as I tell all of my clients – you have to let the numbers speak. If an ISP needs $70 rates to break even and then offers a $30 broadband product as a way to get grants, it will put the entire business at risk if too many people take that product.

I know forcing low rates is tempting, and it would feel like good policy. But you can’t put rate pressure on ISPs willing to work in rural areas where costs are already sky-high. If the government wants the poorest homes to get broadband, the right solution is something like putting more money into the ACP. The right solution is not to ask ISPs to shoulder the economic burden of too-low rates.

New Science – October 2022

Today’s blog looks at some new technologies that may someday have an impact on computing and broadband. We’re living in a time when labs everywhere are making some big breakthroughs with new technology, and it’s hard to predict which ones will become part of our everyday lives.

Artificial Synapses. Engineers at MIT have developed a new kind of artificial synapse that can process data several million times faster than the human brain. The human brain is still the best computer in the world due to the unique structure of neurons and synapses. Scientists have been working for years to try to mimic the structure of the human brain by developing chips that can perform multiple computations simultaneously using data stored in local memory instead of elsewhere. Early work in the field has created neural networks to mimic the way the brain works.

The new technology differs from past attempts by using protons instead of electrons to shuttle data. The scientists created a new kind of programmable resistor that uses protons and which allows for the use of analog processing instead of precise digital processing. The core of the new device is phosphosilicate glass (PSG), which is silicon dioxide with added phosphorus. This material allows for the passage of protons at room temperature while blocking electrons.  A strong electric field can move protons through the chip at almost the speed of light, allowing for the processing of data a million times faster than earlier neural nets.

Replacement of Silicon? Researchers at the EPFL School of Engineering in Lausanne, Switzerland have discovered some interesting properties of vanadium dioxide that would allow building devices that can remember previous external stimuli. This might allow for making chips out of VO2 that would play the same role today as silicon while also acting as a data storage medium. This would allow for the storage of data directly as part of the structure of a chip.

Scientists found in the past that VO2 can outperform silicon as a semiconductor. VO2 also has an interesting characteristic where it changes from an insulator to a metal at 154 degrees Fahrenheit. Researchers found that when VO2 is heated and then cooled, it remembers any data stored at the higher temperature. The researchers believe that VO2 can be used to create permanent data storage that would be embedded directly into the material comprising a chip.

One-Way Superconductor. Scientists at the Delft University of Technology in the Netherlands, along with scientists from Johns Hopkins, have been able to create one-way superconductivity without using magnetic fields – something that was thought to be impossible.  This would be an amazing breakthrough because semiconductors that use superconducting materials would be hundreds of times faster than chips today with zero energy loss during data processing – something that might remove much of the heat created in data centers.

The researchers achieved this using triniobium octabromide (Nb3Br8). They were able to create diodes with a film of the material only a few atoms thick, producing a Josephson diode – a core component for quantum computing.

The biggest challenge remaining for the team is to get the superconducting diode to function at temperatures above 77 K, which would allow cooling with liquid nitrogen. A perpetual challenge with superconductors has been getting them to work at anything other than super-cold temperatures. But it’s not hard to envision using the technology to create large data centers of quantum computers.

Broadband and Demographics

In August, Roberto Gallardo, the Director of the Purdue Center for Regional Development, wrote an article that takes a quantitative look at measuring the digital divide. Gallardo has created what he calls the Digital Divide Index (DDI) that allows for the mapping of various socioeconomic factors that can limit households from buying broadband. I obviously want you to read the rest of my blog first, but it’s then worth popping out and playing with the DDI map.

The Digital Divide Index creates a value for every county in the country, with values ranging from 1 to 100, with 100 meaning the highest level of digital divide. The DDI map also plots an Infrastructure/Adoption (INFA) score and a Socioeconomic (SE) score for every county.

The Infrastructure/Adoption (INFA) score includes five variables related to broadband adoption:

  • The percentage of homes that are not receiving 100/20 Mbps broadband as measured by the Ookla speed test.
  • Percentage of homes without a computer.
  • Percentage of homes that don’t have a broadband subscription.
  • Download speeds.
  • Upload speeds.

The Socioeconomic Score (SE) includes the following five variables:

  • Percentage of the population over age 65.
  • The percentage of the population age 25 and older that didn’t complete high school.
  • The individual poverty rate.
  • The percentage of the population with a disability.
  • A newly calculated digital inequality or internet income ratio.

The DDI is the first attempt I’ve seen to bring together such a broad range of variables and relate them to the digital divide. Gallardo uses FCC 2020 data in creating the DDI. He also grabbed information from the U.S. Census Bureau’s 5-year American Community Survey, the Bureau of Economic Analysis, Lightcast (formerly known as Economic Modeling Specialists, Inc. or EMSI), and Venture Forward by GoDaddy.

Gallardo assigned values to each county using the many variables and then separated counties into three equal groups by those values. The average DDI score for the lowest third of counties was 17.33, while the average for the top third was 36.58. Numerically, this implies that the third of counties with the most digital divide issues have, on average, twice the digital divide problem of the counties with the least. This is not a surprising finding, but I’ve never seen it expressed numerically.
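Gallardo’s tercile comparison is easy to reproduce in miniature. Here is a minimal Python sketch using made-up county scores (the real DDI combines many weighted variables per county – these numbers are purely illustrative) that shows the mechanics of splitting counties into thirds and comparing group averages:

```python
import statistics

# Hypothetical DDI-style scores for nine counties (illustrative only;
# the real index is built from weighted INFA and SE variables).
scores = {"A": 12, "B": 15, "C": 20, "D": 25, "E": 28,
          "F": 31, "G": 34, "H": 38, "I": 41}

# Sort counties by score and split into three equal groups (terciles).
ordered = sorted(scores.values())
n = len(ordered) // 3
lowest_third, top_third = ordered[:n], ordered[-n:]

low_avg = statistics.mean(lowest_third)   # counties with the least divide
high_avg = statistics.mean(top_third)     # counties with the most divide

# The ratio between the group averages is the "twice the digital
# divide problem" style of comparison from the article.
print(low_avg, high_avg, round(high_avg / low_avg, 2))
```

With the article’s actual averages of 17.33 and 36.58, the same ratio calculation yields roughly 2.1.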

The article goes on to create charts that compare the variables between the lowest and highest-ranking counties. For example, the report compared urban and rural counties. Not surprisingly, the counties with the worst DDI scores are almost four times more likely to be rural than urban. It turns out that other variables, such as the percentage of children in the population, don’t differ much between the highest and lowest DDI rankings.

This kind of report is useful for those interested in fixing the digital divide in a community. The report provides some interesting clues about how to find homes that are on the wrong side of the digital divide. For example, households that include somebody with a disability look to be far more likely to be on the wrong side of the digital divide.

Fearing the Competition

Over the last six months, practically every big carrier in the industry has made a formal announcement that it is not worried about specific competitors. The latest one I read was in LightReading, where Nick Jeffery of Frontier said he’s not worried about competition from the cable companies upgrading to DOCSIS 4.0 or from cellular carriers offering FWA home broadband. Frontier is building a lot of fiber, and Jeffery was commenting that he thinks fiber is a superior technology compared to the alternatives. To be honest, this might be the only such claim I’ve read where the ISP was being truthful. Frontier has been at the bottom of the heap in the industry for many years and led the industry in the percentage of broadband and cable TV customers lost quarter after quarter. It’s got to be refreshing for the company to be deploying a technology that gives it a fighting chance to succeed.

I’m not citing all of the other CEOs that said the same thing – but these announcements were pretty much across the board – basically, no carrier is afraid of other competitors.

I’ve seen all of the big cable companies quoted as saying they aren’t afraid of FWA cellular broadband. And yet, in the second quarter of this year, T-Mobile and Verizon added over 800,000 new customers, while the large cable companies collectively lost 150,000 customers during the quarter. The cable companies rightfully say they have superior technology when competing against 100 Mbps download speeds, but the FWA cellular carriers have much lower rates and are attracting customers who think that cable broadband costs too much.

The big telcos that are building fiber have all made the same claim about not fearing FWA wireless. The big telcos collectively lost less than 100,000 customers in the second quarter of this year, the best they’ve done in ages. The small loss disguises the fact that the telcos continue to lose DSL customers but are largely replacing them with fiber customers – except Lumen, which had a net customer loss for the quarter of 93,000.

I’ve seen most of the big fiber overbuilders scorning cable company broadband and saying they aren’t worried about DOCSIS 4.0 – like Frontier said. That’s a fairly easy thing not to fear for now since we’re a number of years away from any conversions to DOCSIS 4.0. But Comcast and others are talking about soon introducing some of the higher split technologies on DOCSIS 3.1 to boost upload speeds sooner. Will fiber overbuilders fear the cable companies more after some upgrades?

The WISPs that will be installing new versions of fixed wireless, including some technologies that claim to be able to deliver speeds up to a gigabit, say they are not afraid of competing against rural fiber networks built with grant funding. That’s an interesting claim since the general public seems to have grasped that fiber is better. It will be interesting to see what happens in places where fast rural wireless competes against rural fiber.

The big three cellular carriers all claim they are not afraid of Dish Network becoming the fourth major cellular carrier. It’s an odd claim to make since Dish says the only way for it to gain market share is to be extremely aggressive with prices. The cellular industry is already highly competitive, and it can’t be good for any of the bigger carriers to have to lower rates.

I get a chuckle every time I read one of these statements because when a carrier goes out of its way to mention a competitor, it is worried. The reality is that every carrier in a competitive situation has to be concerned about competitors. In the end, this is a battle that is going to be fought at the local level, market by market. I can picture that the various technologies will get a different reception depending on local factors. But for now, apparently, nobody fears the competition.

Digital Equity Foundation Act

In an interesting bill before Congress, Senator Ben Ray Luján of New Mexico and Representative Doris Matsui of California introduced the Digital Equity Foundation Act of 2022. The bill would work to leverage public and private funds to solve the digital equity gap by requiring the Secretary of Commerce, within six months of passage, to establish a nonprofit corporation called the Foundation for Digital Equity.

It’s an interesting concept because it would create an organization that would bridge government and philanthropic organizations to formally tackle problems associated with digital equity. The Foundation would have four specific goals:

  • Work to raise funds from philanthropic organizations, businesses, and state and local governments to tackle issues associated with digital literacy, digital inclusion, and digital equity.
  • Work to promote partnerships and collaborations that would tackle specific hands-on projects that would further digital equity goals.
  • Promote equitable access to broadband and associated applications like telehealth, distance learning, and e-government.
  • Work to supplement programs from the NTIA that are aimed at solving digital equity issues.

The Act gives the Foundation a wide scope of operations and suggests a number of different activities that the Board would be free to pursue. Some examples include:

  • Conduct and support studies, competitions, and projects.
  • Award digital equity, digital inclusion, and digital literacy grants (I assume the grants would be funded by the private sector).
  • Support training programs related to digital equity for researchers, scientists, government officials, and those in higher education.
  • Create for-profit subsidiaries to stimulate economic development.
  • Engage in data collection related to inequities and community needs related to digital equity.
  • Write, edit, print, publish, and sell books and other materials relating to efforts carried out by the Foundation, the Department of Commerce, or the FCC.
  • Develop a publicly available evaluation process to enable communities to identify and quantify digital equity problems.

The Foundation would not be a government agency but instead would be a 501(c) nonprofit. However, there would be some government funding to pay for the administrative operation of the Foundation. Interestingly, donations to the Foundation would not be tax deductible. The governance of the Foundation would start with a Committee of named government officials who would have six months to seek a Board of five non-compensated Board members to permanently operate the Foundation. At least three of the Board members must have broad and general experience with digital equity, digital inclusion, or digital literacy. One member must have experience working with private nonprofit organizations. The Secretary of Commerce, the Director of the NTIA, the Chairman of the FCC, the Secretary of the Treasury, and the Under Secretary of Agriculture for Rural Development will also serve as non-voting members of the Board.

This is an interesting approach and recognizes that it’s going to take a long time to tackle digital equity problems. The IIJA provides for $3 billion in digital equity grants over the next three years, but this Act recognizes that a long-term approach is needed to go beyond the work started with these grants.

The one thing I’ve learned about digital equity in the last year is that the only successful approach requires working with people one by one. Somebody has to sit with the recipient of a first computer to help them learn how to use it. Training somebody to effectively use the web requires individual training. This Foundation is not going to itself solve any digital equity problems. But if done well, it will match government, philanthropic, and business donations with organizations that are going to do the hard work that is needed.

The Demand for Middle-Mile Fiber

The deadline for the NTIA’s middle-mile grant program just closed, and the NTIA said that it received 235 applications totaling $5.5 billion in grant requests for a $1 billion grant program. Applicants in parts of Florida, South Carolina, Puerto Rico, and Alaska were given more time to apply due to recent natural disasters, so there may still be a few more requests. I think the program would have received many more requests, but folks already assumed it would be massively oversubscribed.

I was surprised when the IIJA legislation allocated only $1 billion to middle-mile fiber. That works out to only $20 million per state. That may sound like a lot, but to put it into perspective, California set aside $3.25 billion of its ARPA funding just for middle-mile. The $1 billion is nice, but it is not nearly enough to satisfy the nationwide need for more fiber backbones reaching into rural areas and connecting cities.

What exactly is middle-mile fiber? It’s the fiber used to connect communities to the Internet. Middle-mile fiber brings the transport that is needed to serve last-mile ISPs, cell towers, and any large broadband users like hospitals, factories, or other key anchor institutions.

It’s easy to understand why middle-mile fiber is needed. Much of rural America is connected to the Internet by a single fiber route provided by one of the big rural telephone companies. If there are fiber cuts or problems with the electronics on the only existing fiber route, an entire region will lose broadband. Just over the last month, I’ve talked with three counties that have experienced broadband outages this year that lasted from half a day to several days. It’s easy to imagine in today’s world how these outages can decimate a local economy.

Middle-mile is needed for several reasons. First, some of the fiber routes reaching remote areas were built in the 1980s and 1990s and are aging. There have been big improvements in the manufacturing of fiber since then, and new fiber should have a much longer life, but some of the fiber built in those years is wearing out. Part of the problem with older fiber is that we used poor construction techniques decades ago – we tugged fiber through conduits and created small stress points that went bad prematurely – and we are much gentler with fiber installation today. Aerial fiber reaching into rural areas tends to follow the main roads, and those fibers have likely been cut repeatedly over the years by storms and by accidents that broke poles.

The other reason we need more fiber is resiliency. Until recently we used the word redundancy to describe this need. Redundancy meant building fiber into rings so that a single fiber cut wouldn’t knock out a town or region from broadband. Resiliency stretches that definition further to talk about building fiber in such a way that it is better protected from fiber cuts and can be repaired more quickly.

The final reason we need more middle-mile fiber is cost – monopoly providers tend to charge a lot for transport on monopoly routes. Prices tumble when there is middle-mile competition.

Grants are needed to build rural middle-mile fiber because there is likely not going to be enough revenue on most rural fiber routes to justify funding a middle-mile route with normal financing. Grant funding for middle-mile makes the statement that rural communities are important. It doesn’t do much good to build rural last-mile networks if there is no affordable and reliable way to bring bandwidth to the new networks.

It will be interesting to see how the NTIA spreads the funding. I have to imagine that some of the grant requests are from states or groups of counties asking to build large statewide or regional networks. It’s likely that most of the grant requests hope to build fiber routes that immediately solve existing problems. But unfortunately, more than 80% of the requests are not going to get funded. Maybe the great demand for this grant program will prompt Congress to find more funding for middle-mile. It’s one of the best investments they can make.

Broadband Satellite Issues

One of the most interesting aspects of serving broadband from low-orbit satellites is that it brings issues related to space into the broadband discussion. Space issues were less important for high earth orbit satellites that sit 20,000 miles above the earth. Other than an occasional impact from sunspots, there wasn’t much of note. But there are two recent events that highlight our new focus on low-earth orbit satellites. I would never have imagined a decade ago that I would be interested in these topics in terms of the impact on broadband.

The first is a piece of legislation introduced by Senators Maria Cantwell (D-WA), John Hickenlooper (D-CO), Cynthia Lummis (R-WY), and Roger Wicker (R-MS). The legislation is called the Orbital Sustainability (ORBITS) Act. The bill is intended to begin the development of a technology called active debris removal (ADR) that would be used to remove dangerous debris from low earth orbit.

The risk of space debris has been well documented by NASA and others. There are over one hundred million pieces of debris orbiting the earth today, ranging in size from dust particles up to out-of-service satellites and rocket boosters. Space is going to get a lot more crowded as the industry launches tens of thousands of additional satellites in the coming years.

So why is debris a problem? The issue was described by NASA scientist Donald Kessler in 1978. He postulated that as mankind put more objects into orbit, the inevitability of collisions would increase, and over time there would be more and more debris. This is easy to understand when you realize that every piece of debris in low orbit is circulating at over 17,000 miles per hour. When objects collide, even more debris is created, and Kessler postulated that there would eventually be a cloud of debris that would destroy anything in orbit, making low-space unusable.
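Kessler’s cascade argument can be sketched as a toy model (all numbers here are invented for illustration – this is not real orbital data or an orbital mechanics simulation): if the number of fragments created each year scales with the number of colliding pairs, which is roughly the square of the object count, then growth starts slowly and accelerates over time, which is the runaway Kessler described.

```python
def project_debris(start_objects, collision_rate, years):
    """Yearly object counts when fragment creation scales with pairs."""
    counts = [start_objects]
    objects = float(start_objects)
    for _ in range(years):
        # Expected annual collisions scale with the number of pairs of
        # objects, roughly objects**2; each collision adds fragments.
        new_fragments = collision_rate * objects * objects
        objects += new_fragments
        counts.append(objects)
    return counts

# Invented parameters: 10,000 tracked objects, a tiny collision rate.
counts = project_debris(start_objects=10_000, collision_rate=1e-6, years=50)
# Growth starts slow (about 100 new fragments in year one), but each
# collision creates more targets, so the yearly increase accelerates.
```

The key property is that the yearly increase keeps growing even with no new launches, because every fragment is both a potential target and a source of more fragments.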

The legislation would fund research into different technologies that can be used to clean debris, with NASA tackling some of the trials. The hope is for an eventual system that scrubs space of debris as it is created to keep the valuable low-orbit space usable.

In other news, President Putin of Russia has threatened to destroy Starlink and other satellites that are helping Ukraine in the war between the two countries. Targeting satellites as part of war is an idea that has been used by Hollywood for years. The first such movie I remember is Moonraker, the James Bond movie that sent the British secret service agent into space.

In September, a Russian diplomat said at the United Nations that satellites could be legitimate military targets. He argued that civilian satellites that provide broadband might be a violation of the Outer Space Treaty that provides for only peaceful uses of satellite technology. He is obviously aiming his comments at Starlink, although in a few years, there will be multiple companies in the same category.

Russia has already been targeting Starlink with cyberwarfare hacking to try to corrupt the satellite software. It’s been reported that Russia was also looking for a way to identify the location of the satellite receivers on the ground.  But it was clear from recent threats that Russia is hinting at some method of crippling or destroying satellites in orbit.

The earth has become massively reliant on satellite technology. Satellites are now becoming a source of broadband, but they support many other vital functions such as GPS, weather forecasting, and the study and tracking of resources like water and minerals.

The idea of attacks on satellites is scary. An attack might come from hunter satellites that target other satellites, or more indiscriminately from something like a nuclear blast that would disable all nearby electronics. But the investment in satellites is huge and would not easily be replaced. The bigger question raised is whether it is worth spending money on satellites that can be destroyed.

It’s likely that the threats are just rhetoric because every country depends on satellites for a lot of everyday functions. But countries have done insane things in wartime before, so it’s not off the table.

Pinpointing Urban Broadband Gaps

The City of Chicago asked some researchers at the University of Chicago for help to identify the neighborhoods and the number of households that are not connected to broadband. It’s been well known that large numbers of people in cities don’t have broadband, but there have been no easy ways to pinpoint where solutions are needed. The researchers, along with folks from the University of California at Santa Barbara, tackled the challenge and reported on their findings in this report.

The report highlights something that’s obvious but hard to do – you can’t tackle the digital divide until you can identify where help is needed. Cities often concentrate efforts on public housing because they understand the needs there, but there is no easy way to identify other blocks and neighborhoods that could use help solving the digital divide.

The research began with the U.S. Census ACS survey data and also the FCC Form 477 data – the researchers fully understood the weaknesses of both sets of data. They layered on Ookla speed test data in an attempt to identify areas that have broadband that underperforms other parts of the City. Finally, they layered in the results of earlier work where millions of online queries were made to ISPs asking for the availability of broadband at specific addresses.

They found the following, which is not a surprising list for anybody working with urban broadband:

  • Broadband adoption varied widely across Chicago with broadband penetration rates in neighborhoods varying from 58% to 93%.
  • Neighborhoods with the lowest broadband adoption rates correlated well with the majority-Black neighborhoods that reflect the City’s historical patterns of residential segregation.
  • Adoption rates also correlated well with other factors such as the percentage of the Hispanic population, low incomes, low education attainment, and a higher percentage of elderly residents.
  • 90% of Census blocks have at least one ISP that offers high-speed broadband. There was a big variance in the number of high-speed broadband options. 50.6% of Census blocks have only one high-speed option.

Some of the findings are not surprising. There have been numerous studies that have correlated race, income, level of education, and age to broadband adoption. The big difference with this study is that the researchers were able to specifically identify the parts of the City with the lowest broadband adoption and could pinpoint which of these factors likely played a role for each pocket.

Some of the findings are unexpected. If 90% of Census blocks have at least one high-speed ISP, then 10% of the Census blocks in Chicago do not have a high-speed broadband option. We’ve been spending a lot of money to make sure that rural areas get good broadband, but this finding means there may be far more people in cities without a good broadband option than rural folks. This should lead to some interesting discussions on how to use some of the BEAD grant funding.

The fact that 50% of Census blocks in Chicago have only one high-speed ISP means that half of the city is served by an ISP that holds a virtual broadband monopoly. For a large city with multiple ISPs, that is a surprising finding.

One of the most important questions I have is how this research can be duplicated in other cities. If cities want to get the best results from digital literacy plans, they need to know where to concentrate resources. If cities want to push ISPs to make sure that everybody has at least one high-speed broadband option, they need to pinpoint those areas that don’t have fast broadband.

ISP Liability

Charter was recently ordered to pay over $1.1 billion to the family and estate of an 83-year-old Charter customer who was murdered by a Spectrum technician in 2019. A jury had originally ordered Charter to pay $337 million in compensation plus $7 billion in punitive damages. The judge lowered the punitive damages to be more in line with comparable punitive damage awards.

This was a case that should concern all ISPs. The technician, Roy Holden, had by all appearances been a good employee – he had completed over 1,000 service calls with no customer complaints. It turns out that he had stolen credit cards and checks from a few elderly customers, but this wasn’t discovered until after the murder. Charter had done a routine background check when he was hired that showed no arrests, convictions, or other criminal behavior. There was nothing about Roy Holden that made him look any different on paper than the many technicians hired by other ISPs.

It’s likely that the award was so large due to Charter being such a large and profitable company. But even the base award of $337 million would ruin all but the largest ISPs in the country.

This is obviously a pretty rare event and, as Charter argued in court, was totally unforeseeable. How can any ISP know when it has a rogue or unbalanced technician? Unless an employee is acting erratically, it’s hard to see how an ISP, or any of the many other companies that make in-home service calls, can protect against this kind of event.

ISPs have no financial backstop for this kind of large court award. Most of my clients carry general business insurance in the range of perhaps $5 million. That level of coverage won’t come close to covering the damages awarded in this case. I don’t know many ISPs that could survive even a much lower award – a $20 million to $50 million judgment would ruin most of my clients.

This kind of event is rare, and I can’t imagine that insurance can be purchased to protect against it. If there is such a policy, it would have to be extraordinarily expensive, and ISPs would have a hard time justifying the premiums due to the low risk of ever having such an event.

Facility-based ISPs generally don’t carry a large amount of insurance. It’s not feasible to insure expensive networks against things like storm damage. Instead, ISPs rely on FEMA to cover major network damage, along with the other infrastructure damaged in big natural disasters like storms, fires, and floods.

I suspect this award will send some ISPs to talk to their insurance agent – and they will find that there is no practical way to insure against this kind of event. But that doesn’t make ISPs any different than companies that install appliances, countertops, or air conditioners. I think this is one of those things that ISPs shouldn’t think too hard about. I’ve read articles on the issue that suggest that ISPs need a more vigorous vetting process for new employees. But realistically, that probably makes almost no difference, although it might convince a jury to set a smaller award.

Taxability of Grant Funding

Congress will be considering a bill in a lame duck session after the election to make grant revenues non-taxable. This is a big deal because the taxability of grants is another huge hurdle for smaller ISPs trying to accept large amounts of grant funding. The new legislation is co-sponsored by Senators Mark Warner (D-Va.) and Jerry Moran (R-Kan.). They are hoping the bill will be considered as part of a larger year-end package that will address other issues like keeping the government open past December, extending the enhanced child tax credit, and other tax and financial issues.

Grant funding is normally taxable. The majority of federal grants go to state and local government agencies, which by definition don’t pay income taxes. But businesses and researchers that accept grant funding have always had to declare the grants as income. That makes a lot of sense for a grant that is largely used to pay salaries – the grant is income while the salaries are deductions, and there would typically be little tax liability for most grant recipients.

But that’s not true for grants used to build infrastructure. Consider a $20 million grant to build fiber. If grant income is taxable, a grant recipient accepting the grant in one year will incur roughly a $4.2 million federal tax liability for that year at the current 21% corporate rate. Over the next 20-30 years, the grant recipient will see taxable income reduced by the original $20 million as the grant recipient writes off the asset through depreciation expense. Theoretically, a $20 million grant recipient has no overall tax liability over the life of the grant assets. But a grant recipient must write a big check to the IRS now and wait for decades to slowly get the money back.
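The timing problem above is easy to see in a back-of-the-envelope calculation. This is a minimal sketch, assuming a 21% federal corporate rate and a hypothetical 25-year straight-line depreciation life for the fiber asset (the article only gives a 20-30 year range):

```python
# Cash-flow timing of a taxable infrastructure grant.
# Illustrative assumptions: 21% federal corporate tax rate and
# 25-year straight-line depreciation of the grant-funded asset.

GRANT = 20_000_000        # grant used to build fiber
TAX_RATE = 0.21           # federal corporate rate
DEPRECIATION_YEARS = 25   # assumed straight-line life of the asset

# Year-1 tax bill: the entire grant counts as income immediately.
upfront_tax = GRANT * TAX_RATE

# Each later year, depreciation shelters a slice of income,
# returning a fraction of that tax to the grant recipient.
annual_depreciation = GRANT / DEPRECIATION_YEARS
annual_tax_savings = annual_depreciation * TAX_RATE

# Over the asset's life, the savings exactly offset the upfront tax,
# but only after the ISP fronts the full amount in year one.
total_recovered = annual_tax_savings * DEPRECIATION_YEARS

print(f"Upfront tax:        ${upfront_tax:,.0f}")
print(f"Recovered per year: ${annual_tax_savings:,.0f}")
print(f"Recovered in total: ${total_recovered:,.0f}")
```

The net tax over the asset's life is zero, but the ISP must write a $4.2 million check now and recover it at only $168,000 per year for decades – which is exactly why the upfront bill is a disincentive for cash-poor ISPs.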

No ISP likes the idea of a huge upfront tax bill as the cost of taking a grant. The largest ISPs can handle this since their corporations have a lot of cash. This is not true for smaller ISPs. I know very few small ISPs sitting with millions of cash in the bank that could be used to cover the immediate tax liability. Having to pay this tax is a major disincentive to taking grants – and could be a disaster for an ISP not ready to make the big tax payment.

The IRS used to have the authority to make this kind of change, but that ability was rescinded in the big tax bill that passed in 2017. The IRS waived the taxability of grants in 2009 for broadband infrastructure grants that were awarded under the BIP and BTOP programs. But with the IRS out of the picture, it now requires a specific Act of Congress to excuse the taxability for broadband grants.

This taxability issue doesn’t only apply to the big upcoming $42.5 billion in BEAD grants but to all other broadband grants. The billions in grants being awarded by the NTIA, the USDA ReConnect program, and the many state broadband grants are also taxable. Even if Congress passes this law, it’s unlikely to reach back to past years to excuse grants accepted recently.

ISPs in most states must also contend with the state income taxes that would follow the grants. Most states automatically adopt changes made in federal tax laws – but a few do not and would also need legislative intervention to exclude infrastructure grants from taxability.

ISPs should reach out to legislators in the coming month to be heard on the issue. If this law doesn’t pass, I foresee grant recipients being unable to make the needed large tax payments immediately after accepting grant funding. This is one more item that will significantly add to the cost of accepting grant funding – and in many cases, will make it infeasible to accept funding for rural broadband projects that are barely going to break even.