Regulation - What is it Good For?

Viasat and the CAF II Reverse Auction

The FCC has done some very odd things over the years to try to solve the rural broadband problem. One of the oddest was the CAF II reverse auction that awarded $122.5 million to Viasat. Things move so fast in this industry that this award at times feels like ancient history, but the reverse auction just ended in August 2018.

There is a lot of grant money currently raining down on the rural broadband industry, but when this award was made, that wasn’t happening. It’s hard to fathom that only three short years ago the FCC deemed that giving $122.5 million to a geosynchronous satellite company was good policy.

This blog is not intended as a criticism of Viasat. If you live in a place where DSL is not available, then satellite broadband is likely your only alternative. Viasat satellite broadband has gotten better over time. The broadband on the ViaSat-1 satellite launched in 2011 was dreadfully slow. The company markets broadband as fast as 100 Mbps download on the ViaSat-2 satellite launched in 2017. The company plans three new ViaSat-3 satellites with even higher capacity, with the first to launch sometime in 2022.

My consulting firm does detailed market research in rural counties, and we’ve heard from satellite customers across the country. We’ve never met anybody who fully loves the product. The most common complaints are high prices, small data caps, and high latency. Prices are high compared to other forms of broadband, with the latest pricing from Viasat as follows:

Plan                 Price     Speed      Data Cap
Unlimited Bronze     $84.99    12 Mbps    40 GB
Unlimited Silver     $119.99   25 Mbps    60 GB
Unlimited Gold       $169.99   100 Mbps   100 GB
Unlimited Platinum   $249.99   100 Mbps   150 GB

There is a $12.99 per month additional fee for equipment on top of these prices. A customer must sign a 2-year contract to get these prices, with a fee of $15 per remaining month if a customer breaks a contract.

We’ve been told by customers that download speed slows to a crawl after a customer exceeds the monthly data allotment. To put these data caps into perspective, OpenVault says that at the end of the first quarter of this year, the average U.S. home used 462 gigabytes of data. It’s not easy for a modern home to curtail usage down to 60 or 100 GB.
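As a rough illustration, the effective price per gigabyte at each tier can be computed from the listed prices plus the equipment fee. This assumes, hypothetically, that a household uses exactly its monthly cap:

```python
# Rough cost-per-gigabyte at each Viasat tier, assuming (hypothetically)
# that a household uses exactly its monthly data cap.  Prices and caps
# come from the Viasat price list; the $12.99 equipment fee is added in.
EQUIPMENT_FEE = 12.99
AVERAGE_HOME_GB = 462  # OpenVault's average U.S. household usage

tiers = {
    "Unlimited Bronze": (84.99, 40),
    "Unlimited Silver": (119.99, 60),
    "Unlimited Gold": (169.99, 100),
    "Unlimited Platinum": (249.99, 150),
}

for name, (price, cap_gb) in tiers.items():
    total = price + EQUIPMENT_FEE
    print(f"{name}: ${total:.2f}/month, ${total / cap_gb:.2f} per GB, "
          f"cap is {100 * cap_gb / AVERAGE_HOME_GB:.0f}% of average usage")
```

Even the top $249.99 tier’s 150 GB cap covers less than a third of what OpenVault says the average home uses in a month.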

The biggest performance problem is probably the latency, which can be 10 to 15 times higher than with terrestrial broadband. The lag comes from the time required for signals to travel to and from satellites parked more than 22,000 miles above the earth, which adds time to every round-trip connection to the web. Most real-time web connections, such as using voice-over-IP or connecting to a school or corporate WAN, work best with a latency of less than 100 ms (milliseconds). We see satellite speed tests with reported latencies between 600 ms and 900 ms.
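A back-of-the-envelope calculation shows why this latency can never be engineered away. The sketch below simplifies the signal path to four straight vertical hops (user to satellite, satellite to ground station, and back), which if anything understates the real slant-path distance:

```python
# Theoretical latency floor for a geostationary satellite link.
# A request climbs to the satellite and back down to a ground station,
# and the response retraces the path: four traversals of the altitude
# at minimum, ignoring all processing, queuing, and slant-path distance.
GEO_ALTITUDE_KM = 35_786        # geostationary orbit (~22,236 miles)
SPEED_OF_LIGHT_KM_S = 299_792

round_trip_km = 4 * GEO_ALTITUDE_KM
floor_ms = 1_000 * round_trip_km / SPEED_OF_LIGHT_KM_S
print(f"Propagation floor: {floor_ms:.0f} ms")
```

That physics-imposed floor of roughly 477 ms, before any processing delay at all, is consistent with the 600 ms to 900 ms reported on speed tests.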

I wonder about the long-term repercussions of this reverse auction grant award. Most federal programs prohibit providing a government subsidy to an area that is already receiving a federal broadband subsidy. Viasat is going to be collecting $12.25 million per year from the FCC through 2027. Will this mean that people unlucky enough to live where Viasat won the reverse auction can’t get faster broadband out of the wave of new grant funding? If so, these homes might be doomed to not get a landline broadband solution for decades.

The FCC should never have allowed geosynchronous satellite broadband into the reverse auction. Perhaps we can’t fully blame the FCC for not foreseeing the pandemic, during which rural people have been screaming for better broadband. But it didn’t take much of a crystal ball in 2018 to understand that something better would come along sooner than the 10-year window the FCC is providing in the Viasat award areas.

The FCC is likely to repeat this same mistake if it awards nearly $1 billion to Starlink in the RDOF reverse auction. If the federal infrastructure funding becomes available, there will be ISPs willing to build fiber to areas where Starlink will be getting a 10-year RDOF subsidy. It’s not too late for the FCC to change course and not make the RDOF award. Otherwise, the agency might be dooming a lot more people to not getting a permanent broadband solution.


Is Defining Broadband by Speed a Good Policy?

I’ve lately been looking at the policies that have shaped broadband, and I don’t think there has been a more disastrous FCC policy than the one that defines broadband by speed. This one policy has led to the misallocation of funding and to delays in getting broadband to communities that need it.

The FCC established the definition of broadband as 25/3 Mbps in 2015, and before then, the definition of broadband was 4/1 Mbps, set five years earlier. The FCC defines broadband to meet a legal requirement established by Congress in Section 706 of the Telecommunications Act of 1996. The FCC must annually evaluate broadband availability in the country – and the agency must act if adequate broadband is not being deployed in a timely manner. The FCC chose broadband speed as the way to measure its success, and that decision has become embedded in policies both inside the FCC and elsewhere.

There are so many reasons why setting an arbitrary speed as the definition of broadband is a poor policy. One major reason is that if a regulatory agency is going to use a measurement index to define a key industry parameter, that numerical value should regularly be examined on a neutral basis and updated as needed. It’s ludicrous not to have updated the speed definition since 2015.

Cisco has reported for years that the demand for faster speeds has been growing at a rate of about 21% per year. Let’s assume that the 25/3 definition of broadband was adequate in 2015 – I remember thinking at the time that it was a fair definition. How could the FCC not have updated such a key metric since then? If you accept 25 Mbps download as an adequate definition of broadband in 2015, then compounding the expected 21% annual growth in demand produces the following results.

Download Speeds in Megabits / Second

2015 2016 2017 2018 2019 2020 2021
25 30 37 44 54 65 79

This is obviously a simplified way to look at broadband speeds, but a minimum definition of broadband at 79 Mbps feels a lot more realistic today than 25 Mbps. Before arguing about whether that is a good number, consider the impact of extending this chart a few more years. This would put the definition of broadband at 96 Mbps in 2022 and 116 Mbps in 2023. Those higher speeds not only feel adequate – they feel right. 80% of the homes in the country already have access to cable company broadband where a speed of at least 100 Mbps is available. Shouldn’t the definition of broadband reflect the reality of the marketplace?
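The chart above is simple compound growth, which anyone can verify in a few lines. Small differences of a megabit or so in the later years depend on how intermediate values are rounded:

```python
# Compound the FCC's 2015 definition of 25 Mbps download at Cisco's
# reported ~21% annual growth in speed demand.  Rounding differences
# of a megabit or so versus the chart in the text are expected.
BASE_MBPS, GROWTH = 25, 1.21

for n, year in enumerate(range(2015, 2024)):
    print(year, round(BASE_MBPS * GROWTH ** n))
```

By any reasonable rounding, the compounded demand passes 100 Mbps within eight years of the 2015 definition.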

We know why the FCC stuck with the old definition – no FCC wanted to redefine broadband in a way that would reclassify millions of homes as not having broadband. But in a country where 80% of households can buy 100 Mbps or faster, it’s hard to see how 100 Mbps could be anything other than the bare minimum definition of broadband.

There have been negative consequences of this definition-based policy. One of the big problems is that the 25/3 Mbps speed is slow enough that DSL and fixed wireless providers can claim to be delivering broadband even if they are delivering something less. Most of the FCC mapping woes come from sticking with the definition of 25/3 Mbps. If the definition of broadband today was 100 Mbps, then DSL providers would not be able to stretch the truth, and we would not have misallocated grant funding in recent years. Stubbornly sticking with the 25/3 definition is why we gave federal broadband grants to companies like Viasat.

As long as we are going to define broadband using speeds, then we’ll continue to have political fights over the definition of broadband. Congress recently ran headlong into this same issue. The original draft of the Senate bill had proposed a definition of broadband as 100/100 Mbps. An upload speed set at that level would have prohibited broadband grants for cable companies, WISPs, and Starlink. Sure enough, by the time that lobbyists made their calls, the definition of upload speed was lowered to 20 Mbps in the final legislation. Congress clearly gave in to political pressure – but that’s the line of business they are in. But we’ve had an FCC unwilling to be honest about broadband speeds for political reasons – and that is totally unacceptable.


Do We Still Need the Universal Service Fund?

There is currently a policy debate circulating asking who should pay to fund the FCC’s Universal Service Fund. For decades the USF has collected fees from telephone carriers providing landline and cellular phones – and these fees have been passed on to consumers. As landline telephone usage has continued to fall, the fees charged to customers have increased. There have been calls for years to fix the USF funding mechanism by spreading the fees more widely.

Since the fund today is mostly being used to support broadband, the most logical way to expand funding is by collecting the fee from ISPs – which would also likely pass the fees on to consumers. A new idea has surfaced that suggests that the USF should instead be funded by the biggest users of the Internet – namely Netflix, Google, Facebook, and others. This argument was likely started by the big ISPs, which wanted to deflect the fee obligations elsewhere. The argument is that the big web companies get tremendous benefits from the Internet without paying towards the basic infrastructure.

As I’ve read this back-and-forth debate, I was struck by a different thought. Instead of expanding funding for the USF, we ought to be talking about curtailing it. The Universal Service Fund is used for several purposes. USF funds the subsidies to get cheaper broadband for schools and libraries. The fund also pays for getting better broadband for rural health care facilities. These seem like worthwhile programs that should continue to be funded.

But the USF also has been supporting the Lifeline program that gives a $9.25 monthly discount to qualifying low-income homes. The amount of that monthly subsidy hasn’t changed in years and has become more irrelevant over time. Some of the big ISPs have completely dropped out of the program, such as AT&T, which ended participation in most of the states where it is still the incumbent telco. There were always rumors that the fund included a lot of fraud – but we never saw enough detail to know if this was true.

It seems like the current White House and Congress have a better alternative to Lifeline. Congress created the Emergency Broadband Benefit, which gives low-income homes a $50 monthly discount on broadband during the pandemic. Congress has suggested replacing that with a more permanent $30 discount. If Congress gets its act together and passes the infrastructure bill, then it’s time to have a serious talk about eliminating the FCC’s Lifeline program. There is no need to have both programs.

The final use of the Universal Service Fund is what I often refer to as the FCC’s slush fund. The FCC lets this fund accumulate and supposedly uses it to improve broadband in the country. But frankly, the FCC is terrible at this. Consider the history of this piece of the USF:

  • This money was originally intended to support rural telephone companies. State regulators capped telephone rates in most states in the range of $15 – $20 per month, and that was not enough revenue to support the telephone networks in high-cost areas. Congress and the FCC decided many years ago that the U.S. economy was best served if everybody was connected to the telephone network, and this might have been the biggest boon to rural America after electrification. This was an effective policy, and at one point, we had a 99% telephone penetration rate in the country. This fund was needed but had big flaws. The FCC handed out the money based on formulas instead of looking at the needs of individual telcos. This resulted in some telcos and commercial telco owners getting incredibly rich from an over-generous subsidy. There was never any serious attempt at the FCC to get this right.
  • But as landline telephone service has been supplanted by cellular service and VoIP, the FCC transitioned this fund to subsidize rural broadband. Perhaps the best use of the funding was the ACAM program that gave money to rural telcos, many of which leveraged the money and took big loans to build rural fiber. When people marvel at the amount of rural fiber in the Dakotas – it was funded by the ACAM program. But this plan also had faults, as some telcos used the ACAM money to upgrade DSL and pocketed much of the subsidy.
  • After this, the FCC used the slush fund for a series of disastrous funding plans. The first was CAF II, where the FCC gave $11 billion to the largest telcos to upgrade rural DSL to 10/1 Mbps. This funding was given at a time when 10/1 Mbps was already too slow. A few telcos used the money properly but made little dent in improving broadband, since tweaking out-of-date rural DSL doesn’t make broadband much better. I’m not alone in thinking that some of the big telcos pocketed much of this money. They made a few cosmetic upgrades but largely took the money straight to bottom-line profits. The FCC was so aghast at the way this funding was wasted that it tacked on an extra $2 billion payment to the telcos after the end of the program.
  • Next, the FCC held a small reverse auction with some money left over from CAF II. Some of this money went to worthwhile fiber projects, but money also went to ISPs like Viasat – a mind-numbing use of federal subsidies.
  • Next came the RDOF reverse auctions. I think we’ll look back a decade from now and judge that this funding did far more harm than good. If you follow my blog, you know I believe that the FCC mucked up this program in half a dozen ways, each of which will have long-term consequences in the neighborhoods where the FCC got it wrong.
  • Finally, the FCC proposed a $6 billion 5G fund that would have handed subsidies to cellular carriers to extend cell coverage into areas where it’s needed. But there was so much deception in the reporting of rural cellular speeds that the FCC finally pulled the plug on this – although I think this idea is likely to roar back to life one of these days.

The bottom line is that the FCC is incredibly inept in administering the slush fund. I don’t know why anybody would think that a regulatory agency made up mostly of industry lawyers could be the best place to entrust billions of dollars of broadband funding. It’s hard to imagine that the FCC could have done any worse over the last decade with this slush fund. I’m pretty sure that any six readers of this blog could have chatted over beers and come up with better ways to use the money.

So rather than have the debate of whether AT&T or Facebook should fund the Universal Service Fund – why don’t we have the debate about largely eliminating the fund? I can’t think of any reason why we should continue to let the FCC gum up rural subsidy programs. Let’s find a way to fund the schools, libraries, and rural health care programs, and let’s get the FCC out of the business of goofing up subsidies.


Technology Neutrality

Christopher Ali, a professor at the University of Virginia, says in his upcoming book Farm Fresh Broadband that technology neutrality is one of the biggest policy failures of our time. I completely agree, and today’s blog explores the concept and the consequences.

Over the last decade, every time a pot of grant money has appeared on the horizon, we’ve heard talk at the FCC about making sure that there is technology neutrality when choosing the winners and losers of federal grants. This phrase had to have been invented by one of the big ISPs because, as is typical of DC politics, technology neutrality means exactly the opposite of what you might think it means.

Technology neutrality is a code word for allowing slower technologies to be funded from grants. The first time I remember hearing the phrase was in 2018, during the lead-up to the CAF II reverse auction. This was a $2 billion reverse auction for locations that hadn’t been claimed in the original FCC CAF II program. Many in the industry thought that federal grant funds ought to only be used to support forward-looking technologies. The term technology neutrality was used to support the argument that all ISPs and technologies should be eligible for grant funding. It was argued (mostly by ISPs that use slower technologies) that the FCC should not be in the game of picking winners and losers.

The technology neutrality proponents won the argument, and the FCC allowed technologies with capabilities as slow as 25/3 Mbps into the reverse auction. The results were what might be expected. Since lower-speed technologies tend to also be the least expensive to build, the slower technologies were able to win in a reverse auction format. It was not surprising at the end of that auction to see that three of the four top winners will collect $580 million to deploy slower technologies. This included fixed wireless providers AMG Technology (Nextlink) and WISPER, as well as high-orbit satellite provider Viasat.

The same argument arose again as the rules were being developed for the RDOF reverse auction. The first auction offered $14 billion in subsidies for ISPs to build last-mile broadband in places that the FCC thought had no broadband with speeds of at least 25/3 Mbps. The FCC heard testimony from the industry about the technologies that should be eligible for the subsidies. In the end, in the name of technology neutrality, the FCC allowed every technology into the reverse auction. The following is a quote from the FCC order that authorized the RDOF funding:

Although we have a preference for higher speeds, we recognize that some sparsely populated areas of the country are extremely costly to serve and providers offering only 25/3 Mbps may be the only viable alternative in the near term. Accordingly, we decline to raise the required speeds in the Minimum tier and we are not persuaded that bidders proposing 25/3 Mbps should be required to build out more quickly or have their support term reduced by half.

Again, it was not surprising to see that the list of RDOF winners included companies that will use the funding to build slower technologies, including fixed wireless and DSL. Only two of the top winners promised to build gigabit-capable broadband everywhere (a consortium of electric cooperatives and Charter). The FCC also decided at the last minute to allow Starlink into the auction – even though nobody knew at the time what speeds could be delivered. The FCC really goofed up the technology issue by allowing some WISPs to bid and grab major winnings in the auction by promising to deliver gigabit speeds with fixed wireless technology – a technology that doesn’t exist for a rural setting.

We recently saw the technology neutrality issue rear its head again in a big way. As the Senate was crafting legislation for a major infrastructure program, the original draft language included a requirement that any technologies built with the money should be able to immediately deliver speeds of 100/100 Mbps. That requirement would have locked out fixed wireless and cable companies from the funding – and likely also satellite companies. In backroom wrangling (meaning pressure from the big ISPs), the final legislation lowered that threshold to 100/20 Mbps.

The reason that Ali says that this is a policy failure is that the broadband policymakers are refusing to acknowledge the well-known fact that the need for broadband speeds continues to increase year after year. We just went through a miserable pandemic year where millions of homes struggled with inadequate upload broadband speeds, and yet the technology neutrality canard was rolled out yet again to justify building technologies that will be inadequate almost as soon as they are built. I would argue that the FCC has an obligation to choose technology winners and losers and shouldn’t waste federal broadband money on technologies that have no long-term legs. The decision by regulators and legislators to allow grant funding for slower technology means that the speed that current ISPs can deliver is being given priority over the speed people need.


Treasury Defines Capital Project Fund Grants

The U.S. Department of the Treasury finally released the rules for the $10 billion Capital Projects Fund that will be distributed to states for broadband. The full rules are here.

This blog is not going to spit back all of the rules. Those have already been outlined well by others. Here is a great summary from Kevin Taglang from the Benton Institute.

States must apply to Treasury for the funds. The amount that each state can receive is here. A lot of the recently released rules tell states how to go about the process of claiming the money. States must make an application by December 27 and have until a year later to file details of the specific grants made within the state.

It’s hard to think that states won’t pursue this money, although a few small states might have problems finding enough eligible projects. I’m going to concentrate below on a few of the Treasury rules that will carry into state grant rules.

States Will Administer Grants. States will make awards to specific projects. Each state will need a grant program that follows the federal rules for this money. Since these new rules are different from the rules governing many existing state grant programs, the states will have to adjust quickly in order to follow these rules for at least this one grant. Some states are going to need legislative changes if their current grant rules were established by the legislature.

Communities and States Can Define Eligible Areas. These grants do not use FCC mapping in determining eligibility. A grant area must only be shown to not have reliable 100/20 Mbps broadband in order to be eligible – and this is a very loose test. Treasury provides wide leeway in defining eligible areas, and almost any reasonable form of proof of poor broadband can suffice to prove an area is eligible. Of course, states will have some say in defining eligible areas, and I foresee a huge tug-of-war over this issue between state grant offices, communities, ISPs, and legislators.

Symmetrical 100 Mbps Speeds. Grant technologies must be able to provide symmetrical 100 Mbps speeds. This is going to cause confusion all over the industry as different grant programs have different speed requirements. This might also require legislative changes in some states. There is a provision that says that speeds can be slower where 100/100 Mbps isn’t practical, so expect a lot of challenges by ISPs trying to fund slower technologies.

Eligible Projects. A project must meet all of the following requirements: it must invest in capital assets that enable work, education, and health monitoring; it must address a critical need that resulted from or was made obvious during the pandemic; and it must address a critical community need.

Mostly for Infrastructure. Treasury wants a priority for last-mile infrastructure. States can request middle-mile projects, but Treasury must approve them. Some money will be allowed for devices, but the state must retain ownership of the devices. Money can also go toward improvements to government facilities that meet all of the eligibility rules.

No Required Matching. Treasury allows states to fund projects 100%, with no matching. But states might require matching in order to spread the grant benefits to more projects.

Some Prior Costs. Costs back to March 1, 2021 can be included in a grant under some circumstances. This might cover costs like a feasibility or engineering study.

Labor Standards. The rules do not mandate paying Davis-Bacon wages, but they encourage projects to pay a living wage.

Projects Completed by End of 2026. Projects must be completed by then, although Treasury has the ability to grant extensions.

Summary. I expect most states will grab the available funding. This funding should result in a state-administered grant program in every state in 2022 since states have to demonstrate having awarded this money by the end of next year. Since states are likely to put their own twist on these rules, keep an eye out for specific state rules. And start getting projects ready!


Using Private Rights-of-Way for Fiber

As if the broadband industry didn’t already have enough obstacles, a new issue has arisen in Virginia. A couple in Culpeper County, John and Cynthia Grano, have sued the Rappahannock Electric Cooperative to stop it from putting fiber on existing pole lines that are located on a private easement.

To put this lawsuit into perspective, Virginia law in the past would have required a utility to negotiate a private easement to gain access to the placement of utility networks on private land. But in 2020, the legislature passed a new law that allows electric and communications utilities to add fiber along existing aerial and buried rights-of-way without getting additional permission from property owners. This law was passed to make it easier to build fiber in rural Virginia.

The Cooperative was getting ready to embark on a $600 million rural fiber project to bring broadband to rural customers, but this lawsuit has caused the Cooperative to halt plans for now.

As is usual with lawsuits, there are always additional facts to consider. The rights-of-way in question are not along a road in a public right-of-way. Instead, the fiber route cuts across the landowner’s property, which also is the site for one of the Cooperative’s electric substations. Prior to the law being passed, the Cooperative had offered a $5,000 fee to use the rights-of-way on the property.

It might seem logical that the new law would have preempted this kind of lawsuit – because this situation is exactly what legislators had in mind when they passed the law. But I’ve learned in this industry that a new law is only truly secure after the law has been successfully tested in court.

This case has already made it through the first round of the courts, where a U.S. District Court sided with the property owners. The ruling said that the new law stripped the property owners of existing rights that had been established in the 1989 easement agreement with the Cooperative. The court said that the landowners lost property value from the law itself, even before the Cooperative tried to hang new fiber on the existing easements.

This lawsuit has to bring a chill to any fiber builder in the country that relies on private rights-of-way and easements to build their project. The right to use public rights-of-way has been long established and cemented by challenges to laws early in the last century. This new Virginia law tried to grant the same status to private easements that have always been given to public rights-of-way – and that is a new area of law.

I would have to assume that for this issue to stop the fiber expansion that the Cooperative must have a lot of electric lines that use private rights-of-way. Electric grids routinely cross private land – the large tower transmission grids mostly use private rights-of-way, and utilities rarely build high-voltage routes along public roads. If the issue was only with this one farm, the Cooperative could probably bypass it, but I’m sure the issue applies to many other properties as well.

The lawsuit should raise a red flag for any ISP that has rights-of-way on private land. There are a lot more private easements in place than you might suppose. Many subdivisions own their own roads. Private roads are routine in rural areas. ISPs routinely rent land for huts and cabinets.

None of this will be any comfort to the many households that were slated to get fiber broadband. Electric cooperatives like Rappahannock are leading the way in much of rural America in bringing fiber to areas with little or no current broadband. Virginia has a state goal to solve the rural broadband gap by the end of 2024, and this lawsuit will put a damper on those plans. As a little side note that will drive broadband advocates crazy: the property owner in this case has subscribed to Starlink and is not impacted by having to wait for better broadband.


Satellite Companies Fighting over RDOF

There has been an interesting public fight going on at the FCC as Viasat has been telling the FCC that Elon Musk’s Starlink should not be eligible for funding from the Rural Digital Opportunity Fund (RDOF). At stake is the $886 million that Starlink won in December’s RDOF auction that is still under review at the FCC.

Viasat had originally filed comments at the FCC stating that the company did not believe that Starlink could fulfill the RDOF requirements in some of the grant award areas. Viasat’s original filings listed several reasons why Starlink couldn’t meet its obligations, but the primary one was that Starlink technology was incapable of serving everybody in some of the more densely populated RDOF award areas. Viasat calculated the number of potential customers inside 22-kilometer diameter circles – the area that it says can be covered by one satellite. According to Viasat’s math, the most customers that could reasonably be served is 1,371 customers – and the company identified 17 RDOF areas with a greater number of households, with the maximum one having 4,126 locations.
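Viasat’s full capacity model isn’t reproduced in the public summary, but the shape of its argument is easy to sketch: compare each award area’s location count against a per-beam subscriber ceiling. In the sketch below, the 22-km beam diameter, the 1,371-subscriber ceiling, and the 4,126-location maximum come from the filing as described above; the example area names and other counts are invented placeholders:

```python
# Sketch of Viasat's coverage argument.  One satellite beam covers a
# 22-km-diameter circle, and Viasat's math caps reasonable service at
# 1,371 subscribers per beam, so any award area with more locations
# than the cap can't be fully served by one beam.  The example areas
# are hypothetical; only the cap and the 4,126 figure are from Viasat.
import math

BEAM_DIAMETER_KM = 22
MAX_SUBS_PER_BEAM = 1_371

beam_area_km2 = math.pi * (BEAM_DIAMETER_KM / 2) ** 2  # ~380 sq km

award_areas = {
    "hypothetical-area-A": 900,
    "hypothetical-area-B": 2_500,
    "largest-area-viasat-cited": 4_126,
}

for name, locations in award_areas.items():
    status = "exceeds" if locations > MAX_SUBS_PER_BEAM else "fits within"
    print(f"{name}: {locations} locations {status} the per-beam ceiling")
```

By this logic, the largest area Viasat identified holds roughly three times as many locations as a single beam can reasonably serve.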

There have been similar claims made by others in the industry who say that Starlink will be good for serving remote customers, but that the technology is not capable of being the only ISP in an area and serving most of the homes simultaneously.

Last month, Viasat made an additional claim that Starlink does not have sufficient backhaul bandwidth to serve a robust constellation. This stems from an ongoing tug-of-war at the FCC over 12 GHz spectrum. Starlink wants this spectrum to enable it to create more ground stations for transferring data to and from the satellite constellation. This is spectrum that Dish Network owns and wants to repurpose for 5G. Dish Network has offered a spectrum-sharing plan that would greatly reduce Starlink’s use of the spectrum. The FCC filings on the topic are interesting reading, as wireless engineers on both sides of the issue essentially argue that everything the other side says is wrong. I’m not sure how the FCC ever decides which side is right.

The latest Viasat criticism of Starlink is based upon public statements made by Elon Musk at the Barcelona MWC conference, where he commented on how hard it is to fund the satellite business. Musk said that the business is likely to need between $20 billion and $30 billion in additional investment to reach the goal of over 11,000 satellites. Musk said his first priority is just to make sure that Starlink doesn’t go bankrupt. Viasat says that this is evidence that Starlink is a ‘risky venture’, something the FCC originally said should not be eligible for the federal RDOF subsidy.

Starlink recently asked the FCC to ignore everything that Viasat has filed and said that the Viasat comments are anti-competitive and are a ‘sideshow’. This has to be a huge puzzler for the FCC. We already see Starlink bringing good broadband to remote places that don’t have any broadband today. But the question in front of the FCC is not if Starlink can be a good ISP, but whether the company deserves a 10-year federal subsidy to support the business. Obviously, if Starlink needs at least $20 billion more to be viable, then getting or not getting the $886 million spread over ten years is not going to make a difference in whether Starlink makes it as a company.

The FCC is in a bind because many of these same issues were raised before the RDOF auction in an attempt by others to keep Starlink out of the auction. It wasn’t hard to predict that Starlink would win the subsidy in some of the most remote places in the country since it was willing to bid lower than other ISPs. The FCC voted to allow Starlink into RDOF just before the auction, and is now seeing that original decision challenged.

It’s also an interesting dilemma because of the possibility of an infrastructure plan from Congress that would fund fiber in most of the places won by Starlink. Would the FCC have allowed Starlink into the RDOF had it known about the possibility of such federal grants? I would have to guess not. The FCC is now faced with depriving these areas of a permanent broadband solution if it continues with the plan to give the RDOF awards to Starlink. That would just be bad policy.


Another Problem with RDOF

I have been critical of the RDOF awards for a number of reasons, but one of the worst problems isn’t being discussed. When the FCC picked the eligible areas for the RDOF awards, there was no thought about whether the grant award areas make any sense as a service area for an ISP. Instead, the FCC picked Census blocks that met a narrow definition of speed eligibility without any thought of the nearby Census blocks. The result is that RDOF serving areas can best be described as a checkerboard, with eligible Census blocks scattered among ineligible ones.

The easiest way to show this is with an example. Consider the community of Bear Paw in western North Carolina. This is a community of 200 homes, 42 cottages, and 23 condominiums that sticks out on a peninsula in Lake Hiwassee. The community was founded to house the workers who originally built the Tennessee Valley Authority’s Nottley dam on the Hiwassee River. Today’s community has grown from the original cottages. As you might expect for a small town deep in Appalachia, the town has poor broadband, with the only option today being slow DSL offered by Frontier. Residents describe the DSL as barely functional. This is exactly the kind of area where the RDOF awards were supposed to improve broadband.

Below are two maps. The first is printed from the FCC’s RDOF map – it’s a little hard to read because whoever created the map at the FCC chose a bizarre color combination. The second is a more conventional map of the same area. The red areas on the FCC map are the places where RDOF was claimed by an ISP. As you can see, in a community with only 265 households, the FCC awarded RDOF to some parts of the community and not to others.







The checkerboard RDOF award causes several problems. First, any ISP will tell you that the RDOF award areas are ludicrous – it’s impossible for an RDOF winner to build only to the red areas.

And that’s where the second problem kicks in. The RDOF award winner in Bear Paw is Starlink, the satellite company. Starlink is not going to be building any landline broadband. Unfortunately for Bear Paw, giving the award to Starlink makes no sense. All of the lots in Bear Paw are in heavy woods – that’s one of the attractions of living in the community. Everything I’ve read says that satellite broadband from Starlink and others will be sketchy or even impossible in heavily wooded areas.

The obvious solution if Starlink doesn’t work well is for the community to try to find another ISP to build fiber to the community. But getting another ISP to build in Bear Paw won’t be easy. Other federal and state grant programs will not fund the red RDOF areas on the FCC map. Even should Congress pass the infrastructure bill, there might not be enough grant money made available to an ISP to make a coherent business case to build to Bear Paw. The FCC checkerboard awards significantly curtail any future grant funding available to serve the community.

The shame of all of this is that any other grant program would have brought a real solution for Bear Paw. With most grants, an ISP would have proposed to build fiber to the entire community and would have applied for the grant funding to make that work. But the RDOF awards are going to make it hard, or even impossible, to ever find solutions for the parts of the checkerboard that the RDOF left behind.

By spraying RDOF awards willy-nilly across the landscape, the FCC has created hundreds of places with the same situation as Bear Paw. The FCC has harmed Bear Paw in several ways. It first allowed a company to win the RDOF using a technology that is not suited to the area. Why wasn’t Starlink banned from bidding in wooded parts of the country? (An even better question might be why Starlink was allowed into the RDOF process at all.) Since no other grants can be given to cover the RDOF areas, there will probably not be enough grant money available from other sources for an ISP to bring fiber to the community. Even if the federal infrastructure funding is enacted and the federal government hands out billions in broadband grant money, towns like Bear Paw are likely to get left behind. How do you explain to the residents of Bear Paw that the FCC gave out money in a way that might kill their once-in-a-generation chance to get good broadband?


It’s Time for Collaboration

There has never been a better time for communities to collaborate to fund better broadband solutions. It almost seems like it’s raining grant money this year, and there is likely a lot more in grants coming over the next few years.

Communities are going to get the biggest bang for the buck with a collaborative effort. If each stakeholder in a community seeks its own solution, the community will see a wasteful overlap of broadband construction instead of seeing money spent wisely to make sure that everybody gets the broadband solution they want.

What do I mean by collaboration? I mean coordinating funding efforts to take the best advantage of the grant monies that are available to different community stakeholders right now. Consider the following sources of funding available today:

  • In the most unexpected grants of all, local counties, cities, towns, and townships got a share of the $350 billion Coronavirus State and Local Fiscal Recovery Fund that can be used for broadband.
  • Schools can fund some fiber infrastructure through the E-Rate capital program.
  • Libraries have more grants headed their way than ever before.
  • Rural health care grants through the Universal Service Fund are higher than ever.
  • A $1 billion grant program for tribes is just now closing, but more will be on the way.
  • States have announced huge amounts of state grants, with funding from the ARPA being added to existing state broadband grants.

There are also huge amounts of more traditional broadband grant funds to consider:

  • Many rural communities have areas that will hopefully get broadband through the RDOF awards. But these awards rarely cover everything in an area and are often a strange checkerboard of RDOF and non-RDOF Census blocks. The FCC is still wading through the long forms for the RDOF winners, and once funds are released, the winners will have four years to build infrastructure.
  • A $288 million grant program with the NTIA just closed.
  • There is a $3 billion grant program that will be coming from the EDA later this year.
  • There is a second round of ReConnect grants that will likely commence before the end of the year.
  • Some areas saw awards made in last year’s CAF II reverse auction. If these areas are not already under construction, they will be within the next year or two.

Smart communities are going to organize the many stakeholders to take the best advantage of this funding. There are a number of ways that collaboration can make for the best broadband result:

  • Communities need to make sure that somebody is going to somehow fill in the checkerboard of grant areas awarded in the RDOF and the CAF II reverse auction. Those two grants relied on faulty FCC mapping data, and there are huge swaths of equally needy areas nearby to most of these grant areas. Areas that don’t get covered by good broadband in the next few years could be left behind for a long time.
  • There is a huge opportunity for anchor institutions to get a long-term facility-based fiber solution. If all of the anchor institutions in an area join together, they can collectively negotiate for long-term IRUs for a private fiber network that can connect schools, libraries, health care facilities, city and town anchor institutions, non-profits, rural electric substations, and tribal facilities together. A collaborative network is the ultimate way to cut long-term costs through the collective purchase of transport and broadband. Such collaboratives will be even stronger if they stretch across multiple counties. These collaboratives will be great for the companies building RDOF or other grant-funded networks – they get a large long-term revenue stream.

This is the time for communities to have these discussions. The big grant monies are going to be awarded over the next few years – so it’s not too late to figure this out. I think communities will be surprised to find out how much buying power all of the anchor institutions and other stakeholders have when they combine forces.


Why No Outcry about AT&T DSL?

I’ve been a little surprised that there hasn’t been any regulatory reaction to AT&T pulling out of the DSL market last October. The company stopped taking orders for DSL connections. I’ve heard of instances where the company won’t even connect a voice customer on copper. I’ve heard stories that once DSL is disconnected for any reason, the company won’t reconnect it. If somebody buys a house served only by AT&T, the new owner can’t get DSL, even if the old owner had it. If somebody gets disconnected for late payment, they aren’t being allowed to reconnect. I’ve heard a few stories lately of customers who had technical trouble with DSL and were told that the company can’t and won’t fix it.

This is a huge change for customers. In towns where there is a cable competitor, AT&T has suddenly made the cable company into a de facto monopoly as the only broadband option for a new customer. In rural areas where there is no cable competitor, the change strands homes with no landline alternative. AT&T says publicly that rural DSL is being replaced by fixed cellular broadband – but it seems like the wireless product is far from universally available. Many homes are left with no alternative other than satellite broadband, assuming they have a decent view of the sky.

I really thought that at least a few states would react to the change. Just a year ago, the Public Service Commission in New Mexico took CenturyLink to task for doing a poor job for copper customers. I expected at least a few states to be up in arms about AT&T. Perhaps state regulators have finally given up on telephone copper and are getting realistic about the coming end of the copper networks. Or perhaps AT&T has a strong enough lobbying effort in the states to stave off a public inquiry.

AT&T took a different approach than Verizon, which has been following the formal rules for retiring copper networks. Verizon goes through the process of notifying all customers that it is withdrawing copper services, and months later cuts the copper dead. AT&T is not turning off service yet. Customers with DSL are allowed to keep the service. But give the company the slightest reason to disconnect, and AT&T is gone. The company is withdrawing from copper through attrition instead of a formal withdrawal. It’s an interesting tactic because it doesn’t trigger any of the regulatory rules associated with fully walking away from copper. It’s pretty clear that this is AT&T’s first step and that the day will come when it will disconnect the remaining customers and walk away from the copper business entirely.

I’ve been hearing similar stories for several years about CenturyLink and Frontier. Customers are put on a years-long wait list to get new service. Customers are often told that problems can’t be fixed, and the telco walks away instead of repairing problems. But those two companies have not formally stopped providing DSL to new customers in the same manner as AT&T.

The reaction that I was expecting was for states to make AT&T prove that it has a fixed cellular alternative in an area before refusing customers on copper. That would be hard for AT&T to do since many rural areas have poor or no cellular service.

The biggest impact of this change is not in rural areas. AT&T’s rural DSL in most places has been so slow that it barely works. Many rural homes walked away from DSL years ago since the monthly rate couldn’t be justified by the tiny bandwidth being delivered. The bigger impact from this change comes in the cities and towns where AT&T is the telephone incumbent. DSL has retained a decent market share in many towns because it’s cheaper than cable broadband. In the many towns and cities that AT&T serves, the company has taken the low-price option off the market. I’m sure that potential customers are still being surprised when they find out that the DSL option is off the table. I haven’t seen any noticeable reaction from the cable companies in these markets, but by now, they realize they are a monopoly. We know that over time this will mean a slow return to monopoly practices – higher prices, less attention to maintenance, slower repair times, and less incentive to upgrade.