Why No Outcry about AT&T DSL?

I’ve been a little surprised that there hasn’t been any regulatory reaction to AT&T pulling out of the DSL market last October. The company stopped taking orders for DSL connections. I’ve heard of instances where the company won’t even connect a voice customer on copper, and stories that once DSL is disconnected for any reason, the company won’t reconnect it. If somebody buys a house served only by AT&T, the new owner can’t get DSL, even if the old owner had it. If somebody gets disconnected for late payment, they aren’t being allowed to reconnect. I’ve also heard a few stories lately of customers who had technical trouble with DSL and were told that the company can’t and won’t fix it.

This is a huge change for customers. In towns where there is a cable competitor, AT&T has suddenly made the cable company into a de facto monopoly as the only broadband option for a new customer. In rural areas where there is no cable competitor, the change strands homes with no landline alternative. AT&T says publicly that rural DSL is being replaced by fixed cellular broadband – but it seems like the wireless product is far from universally available. Many homes are left with no alternative other than satellite broadband, assuming they have a decent view of the sky.

I really thought that at least a few states would react to the change. Just a year ago, the Public Service Commission in New Mexico took CenturyLink to task for doing a poor job for copper customers. I expected at least a few states to be up in arms about AT&T. Perhaps state regulators have finally given up on telephone copper and are getting realistic about the coming end of the copper networks. Or perhaps AT&T has a strong enough lobbying effort in the states to stave off a public inquiry.

AT&T took a different approach than Verizon, which has been following the formal rules for turning down copper networks. Verizon goes through the process of notifying all customers that they are withdrawing copper services, and months later cuts the copper dead. AT&T is not going out of service yet. Customers with DSL are allowed to keep the service. But give the company the slightest reason to disconnect, and AT&T is gone. The company is withdrawing from copper through attrition instead of a formal withdrawal. It’s an interesting tactic because it doesn’t trigger any of the regulatory rules associated with fully walking away from copper. It’s pretty clear that this is AT&T’s first shot and that the day will come when they’ll disconnect the remaining customers and walk away from the copper business entirely.

I’ve been hearing similar stories for several years about CenturyLink and Frontier. Customers are put on a years-long wait list to get new service. Customers are often told that problems can’t be fixed, and the telco walks away instead of repairing problems. But those two companies have not formally stopped providing DSL to new customers in the same manner as AT&T.

The reaction that I was expecting was for states to make AT&T prove that it has a fixed cellular alternative in an area before refusing customers on copper. That would be hard for AT&T to do since many rural areas have poor or no cellular service.

The biggest impact of this change is not in rural areas. AT&T’s rural DSL in most places has been so slow that it barely works. Many rural homes walked away from DSL years ago since the monthly rate couldn’t be justified with the tiny bandwidth being delivered. The bigger impact from this change comes in cities and towns where AT&T is the telephone incumbent. DSL has retained a decent market share in many towns because it’s cheaper than cable broadband. In the many towns and cities that AT&T serves, the company has taken away the low-price option from the market. I’m sure that potential customers are still being surprised when they find out that the DSL option is off the table. I haven’t seen any noticeable reaction from the cable companies in these markets, but by now, they realize they are a monopoly. We know over time this will mean a slow return to monopoly practices – higher prices, less attention to maintenance, slower repair times, and less incentive to upgrade.

The Senate Broadband Grants

Now that it looks like the House might pass an infrastructure bill, I figured it was time to take a harder look at the $42.45 billion Senate broadband grant program. There is still more to be done before this becomes law – the House needs to pass infrastructure legislation, and then any differences in the two bills must be reconciled. That still gives lobbyists a lot of time to try to change the rules. The following is not a complete summary of the Senate bill, just the highlights that I think are the most important.

The Senate created a $42.45 billion grant program to be administered by the NTIA. Administered in this case means the NTIA will interpret the final grant rules within the confines set by this legislation. The grant money will go to states, and states will pick grant winners using the rules dictated by the NTIA. This won’t be the free-for-all we’ve seen with CARES and ARPA funding, and I expect fairly explicit rules from the NTIA based upon whatever is demanded in the final legislation. 10% of the funding will be distributed nationwide to reach the highest-cost areas to serve. That determination probably comes from the FCC’s cost models. Every state will get at least $100 million.

The grants define unserved to mean areas that lack access to 25/3 Mbps broadband. Underserved means areas that have broadband greater than 25/3 but less than 100/20 Mbps. Since speed is the determining factor, this means using the FCC mapping data – unless the NTIA allows an alternate method. States must offer grant funding to cooperatives, nonprofit organizations, public-private partnerships, private companies, public or private utilities, public utility districts, or local governments.
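The speed tiers above translate into a simple classification rule. Here’s a minimal sketch (the function name is my own shorthand; the thresholds come from the bill’s definitions):

```python
def classify_area(down_mbps: float, up_mbps: float) -> str:
    """Classify an area under the Senate bill's speed tiers.

    Unserved: lacks access to 25/3 Mbps broadband.
    Underserved: at least 25/3 Mbps but less than 100/20 Mbps.
    Served: at least 100/20 Mbps.
    """
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"
    if down_mbps < 100 or up_mbps < 20:
        return "underserved"
    return "served"
```

Note that an area must clear both the download and upload thresholds to move up a tier – a 200/10 Mbps connection is still underserved because the upload falls short of 20 Mbps.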

The funding can be used for several purposes, including building last-mile infrastructure to unserved and underserved locations, connecting to anchor institutions, data collection and mapping, providing WiFi or reduced-cost broadband to eligible MDUs, broadband adoption, or any other use allowed by the NTIA. This list is worth noting because not all of the money goes to last-mile infrastructure.

A state must certify that it will bring broadband to all unserved areas and anchor institutions before money can be used for underserved areas. States must prioritize the following in awarding grants: deployment to persistent poverty counties and areas, the speeds of the proposed technology, the length of time required to build a network, and compliance with federal labor laws.

There is a challenge process, and local governments, nonprofit organizations, or ISPs can challenge the eligibility of proposed grant areas, including if an area is unserved or underserved. The NTIA can intervene in these challenges.

Grant winners must bring at least 25% matching funds. Matching funds can be in cash or in-kind contributions. Matches can be made from CARES and ARPA funds.
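To illustrate the matching math (the project cost below is hypothetical): with a 25% minimum match, the grant covers at most 75% of a project.

```python
def required_match(total_project_cost: float, match_rate: float = 0.25) -> float:
    """Minimum matching funds (cash or in-kind) a grant winner must bring.

    The grant itself covers at most (1 - match_rate) of the project cost.
    """
    return total_project_cost * match_rate

# Hypothetical example: a $10 million last-mile build
cost = 10_000_000
match = required_match(cost)   # $2.5 million the ISP must bring
grant = cost - match           # up to $7.5 million in grant funding
```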

Grant winners must build broadband networks that provide speeds of at least 100/20 Mbps. Broadband must be made available to every home and business in a grant area that wants service. Grant recipients must offer at least one low-cost broadband service option for eligible subscribers – the NTIA will determine the definition of an acceptable low-cost option.

It won’t be easy to understand the winners and losers in this grant until after the NTIA crafts the specific grant rules. The cable companies, WISPs, and maybe even the satellite companies already won a huge battle by setting the eligible technology requirement to 100/20 Mbps. The challenge process allows the incumbents to delay the grant process and create havoc.

But the public is also a big winner. There have been shenanigans for years from telcos that have lied about rural areas that can receive 25/3 Mbps broadband. Allowing funding to areas with speeds up to 100/20 Mbps washes away most of the past nonsense.

The big ISPs likely also view this as a victory because they probably feel that they have a decent chance in some states of winning most of the grant funding. How well the grant will work in a state is going to depend upon how well a particular state does its job – many states are not prepared to handle this kind of grant program.

There are still areas that will fall through the cracks. For example, it might be financially infeasible for an ISP to take this funding in high-cost places like Appalachia if an ISP has to provide a 25% match while also offering a low-income broadband product. There are still places where costs are so high that a 75% grant is not sufficient.

Just about everybody won by not using a reverse auction.

Did the Senate Just Change the Definition of Broadband?

The recently passed Senate infrastructure legislation includes a new definition of an underserved household as a location that lacks access to reliable broadband service offered at speeds of not less than 100 Mbps for downloads and 20 Mbps for uploads, with latency sufficient to support real-time, interactive applications. It’s hard to see this as anything other than a new definition of broadband.

Before jumping completely off this cliff, it’s worth revisiting the history of the federal definition of broadband. The FCC is required to establish a definition of broadband. Congress established this obligation in Section 706 of the FCC governing rules, which requires the agency to annually evaluate broadband availability in the country. The FCC must then report the state of broadband to Congress every year using the established definition. In these reports, the FCC compiles data about broadband speeds and availability and offers an opinion on the state of broadband in the country. Further, the FCC must act if broadband is not being deployed in a timely manner – but no FCC to date has been willing to conclude that broadband deployment is falling short.

In 2015, the FCC established the current definition of broadband as 25/3 Mbps (that’s 25 Mbps download and 3 Mbps upload). Prior to 2015, the FCC definition of broadband was 4/1 Mbps, set a decade earlier. The FCC didn’t use empirical evidence like speed tests in setting the definition of broadband in 2015. They instead conducted what is best described as a thought experiment. They listed the sorts of functions that a “typical” family of four was likely to engage in and then determined that a 25/3 Mbps broadband connection was enough speed to satisfy the broadband needs of a typical family of four.

The FCC asked the question again in 2018 and 2020 of whether 25/3 Mbps was still an adequate definition of broadband. Despite comments filed by numerous parties arguing that the definition should be increased, the Commission took no action and concluded that 25/3 Mbps was still a reasonable definition of broadband.

Unfortunately, as happens with many regulatory requirements, the FCC has not been an honest broker in looking at the definition of broadband. There are political consequences for any FCC that increases the definition of broadband because doing so means declaring that millions of households are suddenly classified as not having adequate broadband. If the FCC changes the definition of broadband from 25/3 to 100/20 Mbps, then every home with speeds between 25/3 and 100/20 Mbps would suddenly be considered to not have adequate broadband. No FCC wants to be the one that increases the number of homes without broadband.

All of this is politics, of course, and homes and businesses know if broadband is adequate without the FCC setting some arbitrary speed as magically being broadband. Is the home that gets 27 Mbps all that different from one that’s getting 23 Mbps? Unfortunately, when it comes to being eligible for federal grant money, it matters.

I think there is a good argument to be made that the Senate just preempted the FCC in setting the definition of broadband. Declaring that every home or business with speeds less than 100/20 Mbps is underserved is clearly just another way to say that speeds under 100/20 Mbps are not good broadband.

Of course, the FCC could continue to use 25/3 Mbps as the definition of broadband for the purposes of the annual report to Congress. But Congress just changed the definition of broadband that matters – the one that comes with money. If the infrastructure legislation becomes law, it will allow states to use the huge $42.45 billion of federal funding to upgrade the broadband in places that have broadband speeds under 100/20 Mbps.

Of course, the Senate legislation has not yet been enacted because the House of Representatives now needs to pass its own infrastructure legislation, and then the two versions of the new law must be reconciled before going into effect. But the Senate legislative language couldn’t be clearer – 25/3 Mbps is no longer the definition of broadband that matters.

State versus Federal Regulation

I’m often asked questions by clients about specific federal or state regulations, and my clients often don’t understand how or why certain regulations apply to them. One of the things I’ve always relied on in understanding a specific regulation is something that I learned many years ago in a class on regulatory theory. Regulatory theory defines five different ways that the federal government and states can impose regulations. I often understand a regulation better when I understand how that regulation was created.

Sometimes federal regulations clearly preempt all state regulatory action. Such federal rules generally stem from a law passed by Congress that is interpreted by a regulatory agency like the FCC. A simple example is the CALEA rules that allow federal law enforcement officials to wiretap or gain access to customer ISP data through a valid subpoena. Congress passed the Communications Assistance for Law Enforcement Act (CALEA) in 1994, with some modifications that came from the Patriot Act. The FCC followed the Congressional action with a list of specific procedures that ISPs must follow to comply with the law. I’m not aware of any cases of state regulatory rules trying to modify the CALEA requirements, so this is a federal mandate that applies in all states, without question or challenge.

More common are federal mandates that preempt less stringent state laws but permit states to establish more stringent standards. A good example of this is pole attachment rules. There have been requirements since the Communications Act of 1934 that pole owners must allow attachers onto poles. These rights were further reinforced by the Telecommunications Act of 1996. The FCC followed the 1996 Act with revised federal pole attachment standards and rules. Thirty states still follow those federal rules, but another twenty states have gone on to clarify and require even stricter pole attachment rules. All states are still governed by the general principles established in the federal rules, but the FCC explicitly invited states to get more specific as they see fit.

We are currently seeing the third example of federal regulation, which comes in the form of rules included in federal grants or other forms of assistance. Every federal broadband grant program has included a different definition of broadband and of the geographic areas that might be funded by that grant. This is confusing to most people because they want to know the federal definition for terms like unserved and underserved. The FCC has never set a definition of these terms, so there is no federal standard. The confusion comes because every grant program is free to redefine these terms. Sometimes grant programs have no choice – for example, many of the rules used by the USDA for the ReConnect grants were established when Congress provided the money for that program. But the rules used for ReConnect grants have no bearing on grants issued by other agencies. Even when there are Congressional rules for a given grant, the agency issuing the grant always has leeway in setting specific rules that were not mandated by Congress. Anybody who has ever applied for a federal grant knows that the grant rules are unique to each grant and also immutable – you either follow the rules or you don’t get the grant.

We sometimes see regulations established by a cooperative program in which voluntary national standards are formulated by federal and state officials working together. The best historical example of this in the telecom world has been the Federal-State Joint Board. The Joint Board is a cooperative effort by FCC and state regulators to create rules. When I first joined the industry, the Joint Board established many of the rules relevant to cost separations and settlements between carriers. In more recent years, the Joint Board has focused on Universal Service Fund issues.

Finally, some regulations are clearly in the state domain, either from common practice or because the federal government never chose to regulate an issue. There are a number of common state regulations that affect telecom carriers. Some common examples are the establishment of a state Universal Service Fund, annual reporting rules for carriers to a State Commission, state rules that might prohibit municipalities from becoming ISPs, and state rules for specific taxes and levies.

Update on RDOF

The RDOF reverse auction was completed in December 2020, and since then, the FCC has been silent about the disposition of any of the winning bidders. Part of the quiet period has been due to the FCC processing the long-form applications where grant winners demonstrate that they have the technical, managerial, and financial capability of fulfilling the RDOF buildouts. The FCC’s silence came to an end recently with several actions taken by the agency.

The FCC first published a list of almost 11,000 Census blocks where RDOF grant winners have elected to reject funding totaling over $78.5 million. The defaults came for a variety of reasons. For example, there are some cases where investigation showed that an ISP was already offering fast broadband in a grant area. Other defaults come from the bidders deciding they couldn’t make an economic case for the Census block clusters won in the auction. The FCC has the ability to fine grant winners that default on winning grants, and the agency will likely consider the fines on a case-by-case basis.

Next, the FCC sent letters to dozens of RDOF winners asking them to voluntarily give up the RDOF funding for specific Census blocks. These are places that should never have been included in the auction. The list included such places as large metropolitan airports and big parking lots. Starlink alone was asked to give up funding to 6,500 Census blocks.

The FCC also sent a different letter to 197 RDOF grant winners asking them to reevaluate taking funding in specific blocks where the FCC now believes that one or more ISPs already deliver speeds of 25/3 Mbps or faster. The FCC acknowledges that most of these areas were erroneously included in the auction due to bad mapping data. These grant winners were warned that if they don’t relinquish these Census blocks, the FCC will look hard at blocking the funding for these areas – an action that could delay approval of overall funding for an ISP.

The FCC also announced that it was rejecting some of the claimed areas for LTD Broadband. The FCC killed funding for the company in California, Kansas, and Oklahoma due to the company not getting regulatory approval in those states to become an Eligible Telecommunications Carrier. This rejection killed $187.5 million in California to serve 76,856 locations, $81.1 million in Oklahoma to reach 39,889 locations, and $3.2 million in Kansas to reach 2,122 locations.
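Dividing those rejected dollars by the location counts gives a rough sense of the per-location subsidy at stake (my arithmetic, using the figures above):

```python
# Rejected LTD Broadband funding: (dollars, locations) per state
rejected = {
    "California": (187_500_000, 76_856),
    "Oklahoma": (81_100_000, 39_889),
    "Kansas": (3_200_000, 2_122),
}

for state, (dollars, locations) in rejected.items():
    per_location = dollars / locations
    print(f"{state}: ${per_location:,.0f} per location")

# Prints roughly:
# California: $2,440 per location
# Oklahoma: $2,033 per location
# Kansas: $1,508 per location
```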

There has been some question of what happens to the Census blocks that are now back in play. Some have speculated that these areas would roll into the next RDOF auction. However, the FCC has no power to claim or reserve Census blocks, and any of the areas that fall out of RDOF become eligible for the many other grant programs now underway.

Finally, the FCC started making RDOF awards. It awarded $311 million to companies that have successfully made it through the long-form review process. This will bring broadband to over 200,000 homes. This is only a small fraction of the $9 billion that was claimed in December. But for these companies and these Census blocks, the 10-year clock will soon start on deploying whatever technology was promised in the winning bid.

The FCC still has a long way to go. These actions clear up some of the messes caused by the FCC using bad mapping in setting grant areas. There are industry analysts that say that this effort is not complete and that additional areas are yet to be identified. But the FCC has disposed of less than $1 billion of the $9 billion in awards, so expect to hear periodic awards for more grant winners.

The FCC still has to wrestle with a few huge issues. There have been numerous complaints saying that grant winners like LTD Broadband should never have been allowed to participate in the auction at the scale they did. There are other grant winners that claimed the ability to deploy gigabit rural wireless – a technology that nobody I know thinks exists. There are likely ISPs that will not make it through the long-form review for some other reason, such as the inability to guarantee funding. The agency is likely a long way from resolving these many issues.

California Tackles Middle-Mile Fiber

The California legislature unanimously passed legislation in both the Assembly and Senate to fund $3.2 billion for middle-mile fiber and another $2 billion for last-mile networks. It’s an interesting use of broadband money and recognizes something that we don’t talk about enough. There are currently huge federal grants aimed at bringing good last-mile broadband to rural areas, but many of the rural places in America still have inadequate backhaul connecting them to the Internet.

The California legislation recognizes several things. First, there is a lot of federal money currently aimed at providing last-mile networks, but barely any grant funding currently for middle-mile fiber. Second, I imagine legislators in California have all heard stories about how rural communities today lose all Internet access for hours or days at a time when the single fiber reaching the community gets cut or has an electronics failure.

The best fiber last-mile network in the world can’t function if the fiber that routes traffic to and from the Internet goes out of service. The middle-mile networks in rural America often run on some of the oldest fiber still operating in the country. Many of these routes were built years ago by the big incumbent telephone companies to provide a fiber path to support long-distance traffic. Most of these networks are configured like a wagon wheel, with straight-line fiber paths built from rural communities to a larger hub city in a region.

Like the rest of the rural networks maintained by the big telcos, many of these fiber routes have been poorly maintained and are likely still using electronics that are past their useful life. I wrote a blog last year about how the communities in northwest Colorado banded together to build an alternative to the inadequate middle-mile network still operated by CenturyLink. The existing fiber would maddeningly go out of service, sometimes for days, and would leave communities and vital institutions like hospitals and public safety with no Internet access. The communities built a new middle-mile network they labeled Project THOR, and almost immediately after activation, the new network saved communities from another big regional middle-mile outage.


Rural communities not only need reliable middle-mile networks to deliver traffic to and from the Internet, but these networks must be redundant so that a single fiber cut doesn’t kill broadband for an entire community. That means building fiber rings. Too much of our daily lives now rely on broadband, and it is poisonous to a local economy when broadband access dies for an hour or a day.

The California middle-mile plan anticipates an open-access model where affordable transport can be provided to any ISP that wants to use the network. The legislation has a dual stated purpose – first to provide reliable broadband access to the rural parts of the state, and secondarily to make it easier for ISPs to serve rural parts of the state. Buying connectivity on the traditional rural middle-mile networks is often unreasonably expensive and has been a barrier to serving pockets of rural customers.

The new networks will focus on reaching parts of the state where businesses and residents don’t have access to broadband faster than 25/3 Mbps. Almost invariably, in most states, these are the regions that don’t have adequate middle-mile networks.

It’s always interesting to see any legislators pass something unanimously – it only happens when a topic is indisputably important. Once built, the new middle-mile fiber routes will serve rural California for many decades to come. There won’t be any headlines when this network is functioning and meeting its purpose – instead, there will no longer be news stories of small towns that lost broadband access for a few days.

The Battle Over Grant Rules

There has been a huge battle going on behind the scenes in Washington as Congress wrestles with including broadband grants in the infrastructure bill. Every lobbyist in Washington has been working overtime to try to influence the process. We’ve now seen the Senate’s vision of legislation, but there will still be a big fight when it’s time to reconcile the Senate and House broadband grant rules. Here are some of the key issues being contested in creating final infrastructure legislation:

What is Grant Eligible? This is the biggest area of contention. The big cable companies and telcos only want to see grants being awarded to places that don’t have 25/3 Mbps broadband today. More importantly, they want to see the eligible areas defined by the FCC’s lousy mapping. The big ISPs don’t want to make it easy for communities to make claims that actual broadband speeds are far slower than what has been reported to the FCC.

What the big ISPs definitely don’t want to see is the definition of unserved and underserved to be updated to something closer to reality. They do not want to see grant funding available to areas that don’t have 100/20 Mbps speeds today.

The worst possible scenario for the big ISPs is that local communities get to decide what areas need better broadband, as is happening with the ARPA funding that’s been given to cities, counties, and states. They know that cities intend to build fiber to poor neighborhoods, or even to whole cities – and the ISPs want to maintain their monopoly in these areas.

Who Decides the Grant Rules? There is also a lot of arm wrestling about who gets to decide the rules for grants. The big ISPs are not happy that Treasury got to set the rules for ARPA grants because that was a new group of decision-makers who have never been lobbied before. Giving the rulemaking ability to Treasury also meant that the White House could provide input by making its wishes known.

There is no perfect answer to this question from the perspective of the big ISPs. They don’t want Treasury to set the rules again. There are several alternatives. One is to let the FCC award the funding through another reverse auction, the option with the highest chance of following the FCC mapping. The rules could be set by the NTIA – a group that lobbyists know well – but which is likely unpredictable if handed tens of billions of dollars. Grant money could be given to the USDA to administer through the ReConnect grant program and the arcane rules at the RUS. Or funding could be given to states to decide – but only if the rules first restrict how freely states can use the money.

Interestingly, I’ve heard credible rumors over the last few months that each of these options has been considered during various permutations of writing the legislation. The key goal for the big ISPs is to be able to influence grant rules, regardless of who will dispense the money. If the rules are set tight enough, much of the grant money could be unusable – and nothing would please the big ISPs more. For example, if money is divvied up evenly by state, then there are many states that can’t spend a pro rata share of the billions. If the money can only be used in places that strictly can’t get 25/3 Mbps, then it’s probably impossible to spend much of the infrastructure money.

Broadband Speeds for New Infrastructure. A fight over speeds is the same thing as a fight over the technologies that can be built with grant funds. It seems the Senate has accepted the goal for new infrastructure at 100/20 Mbps. This speed enables grant funding to go to cable companies, WISPs, and satellite broadband. If grants can only be used to fund 100 Mbps symmetrical speeds, those technologies are largely eliminated. You may have noted a spate of opinion pieces lately throughout the industry claiming that we don’t need symmetrical speeds. This is what that argument is all about.

Summary. You can quickly see who is winning the lobbying war by skimming through proposed legislation for these critical elements. On one side of the battle are broadband and community advocates who think we should largely use grant money to build fiber. This side argues for symmetrical speeds. They want communities to decide the areas that need better broadband rather than stick with erroneous FCC maps.

On the other side are big ISPs that don’t want to see fiber everywhere. They are pushing for a strict definition of areas that are eligible for grants, and they want technologies that barely meet 100/20 Mbps to be grant eligible. They want the ability to influence the writing of the grant rules.

We are now deep into the sausage-making part of legislation, and all of these issues are still open for debate. The Senate legislation clearly favors the big ISP position. There is still work to be done to get a bill that reconciles the House and Senate plans – and much more infighting to come. But at this point, the big ISPs and their lobbyists are winning the fight – which likely means in ten years, we’ll still be wondering why many parts of the country didn’t get adequate broadband.

Let’s Not Forget the Lobbyists

Common Cause recently released a report, Broadband Gatekeepers, that describes the influence that lobbyists have on broadband policies. The numbers are staggering – the ISP industry spent $234 million lobbying the 116th Congress (2019 and 2020). That number is likely understated since the rules on reporting lobbying are lax, and enforcement is almost nonexistent. That number doesn’t include the huge amounts of lobbying efforts at State legislatures.

The evidence of lobbying is all around us in the industry, yet we don’t talk about it very much. Consider the massive push for 5G a few years ago by the federal government. Industry lobbyists convinced both Congress and the White House that the country was facing a 5G crisis. The lobbyists injected a sense of urgency under the guise of arguing that we are losing the 5G race to China and that our economy will never recover. It’s hard to fully fault the FCC for passing pro-5G rules when it was being pushed to do so by Congress and the White House, all of which were prodded by lobbyists.

The cellular carriers had some legitimate concerns, and that’s usually the case for most issues that are heavily lobbied. The FCC was dragging its feet in approving new cellular spectrum. Cities were taking a long time to approve the location of small cell sites.

The intense lobbying on 5G paid off. The FCC gave the carriers carte blanche authority to place small cells anywhere and at a low licensing cost. The FCC sped up and pushed through a ton of new spectrum for the industry. But the 5G effort went too far, and there was serious talk about the US Government buying Nokia or Ericsson so that the US could control its 5G future. All of the government reactions to the supposed 5G crisis were crazy when you consider that we still don’t have any phones served with 5G – and the world has not come tumbling down around us.

5G wasn’t the only issue on lobbyists’ plates during the last decade. Intense lobbying got the last FCC to eliminate broadband regulation by killing its Title II authority. State regulatory commissions have largely deregulated the big ISPs over the last decade. The big telcos pocketed most of the $10 billion in CAF II grants with no repercussions. Numerous state legislatures have passed prohibitions that bar municipalities and even electric cooperatives from offering broadband. AT&T decided last year to unilaterally stop selling DSL, with no regulatory pushback that I can see. The big ISPs have regularly redlined poor urban neighborhoods. The big telcos stopped maintaining rural copper networks. The two biggest cable companies are on a trajectory toward a basic $100 broadband product. This list could go on for a few pages. It’s pretty obvious that lobbying has paid off big time for big ISPs and cellular carriers.

The Common Cause report looks at the political spending and lobbying spending of the big ISPs. The report demonstrates how specific lobbying efforts have derailed attempts at broadband regulation. The report specifically focuses on ISP spending during the 116th Congress as an example of how political spending impacts legislation.

Everything detailed in this report is dwarfed by the current lobbying efforts of the big ISPs, which are trying to stave off real competition from the billions in broadband grants that are raining down on the industry. The big ISPs are genetically opposed to competition in any form, even in the rural markets they wrote off decades ago. There is intense lobbying at every level of government to keep grants from funding fiber anywhere except rural areas.

The report discusses some commonsense legislation that could put some brakes on lobbying by requiring more openness and disclosures. While this blog looks only at the broadband industry, it’s scary to think that there are many similar lobbying efforts by large corporations throughout the economy.

Federal Broadband Coordination

The White House is now requiring that the three agencies involved with broadband funding – the National Telecommunications and Information Administration (NTIA), the Federal Communications Commission (FCC), and the U.S. Department of Agriculture (USDA) – share information about broadband funding.

The agencies have agreed to share the following information about any location that is receiving or is under consideration for federal broadband funding:

  • Every ISP serving the area
  • A description of the broadband technology and speeds being delivered by each ISP
  • Maps showing the geographic coverage of each ISP
  • The identity and details of any ISP that has received or will receive broadband funds from any of the three agencies

This kind of coordination seems vital in the current environment, where all three agencies are awarding sizable grants. It’s not hard to imagine different ISPs seeking grants from different federal grant programs to serve the same geographic areas.

But then what happens? Will two agencies collaborate to decide which grant program will make the award? That would add another layer of complexity to grants if a grant application filed with one agency is suddenly conflicting with a grant request at another agency. Will ISPs be informed if discussions are happening behind the scenes between agencies concerning a grant request?

This also raises the issue of different agencies having significantly different grant requirements. We’re already seeing differences among grants in terms of identifying areas that are eligible for grant awards, different definitions of qualifying broadband speeds, different lists of technologies that will or won’t be funded, etc. How can the agencies collaborate if grants trying to serve the same area are following different grant rules? For example, what does collaboration mean when grants at one agency allow for wireless technologies when grants at another agency don’t?

One of the most troublesome aspects of this arrangement is that the agencies are going to share information on existing broadband speeds and coverage. The whole industry understands that the FCC’s database for this data is often badly flawed. Some grant programs today are open to examining facts that prove the errors in the FCC mapping data – but will the FCC be open to having its data challenged by a grant request filed with a different agency?  For collaboration to be effective, all three agencies have to be working with the same set of facts.

One of the oddest aspects of this collaborative effort is that it’s only required to last two years and any of the three agencies is free after that to end the collaboration. That makes it sound like somebody doesn’t think this is going to work.

The collaboration sounds like a worthwhile effort if the agencies can work together effectively. But it’s not hard to imagine the collaborative effort adding complexity and possibly even paralysis when considering multiple grants for the same location. How will the three agencies resolve differences between grant programs? My biggest fear is that this effort will add paperwork and time to the grant process without improving the process.

Time to Revisit the Small Cell Order

One of the most controversial orders from the last FCC chaired by Ajit Pai was the 2018 order that gave priority to small cell sites. That order made two specific rulings. First, the FCC issued a declaratory ruling stating that the FCC has the authority to override local and state regulations concerning small cell deployments. Second, the order created a ‘shot clock’ that requires localities to process small cell applications quickly while also setting a cap on local fees.

Earlier this year, NATOA (the National Association of Telecommunications Officers and Advisors) and the CWA (Communications Workers of America) released a letter and a report arguing that it’s time to revisit that FCC order. They argued that the timeline set by the order is ridiculously short, considering the complexity of some of the installations. They also point out that cellular carriers are not using the FCC order to install the ‘pizza boxes’ on poles that they originally promised. Instead, they are placing devices as large as refrigerators on poles, creating dangerous situations for technicians from other utilities who have to navigate around the large devices. Finally, the letter argues that there is no justifiable reason for setting small cell application fees below cost – cities are being required to subsidize the big cellular companies.

It’s important to put the original FCC order into context before taking a position on the issues raised in this letter. Starting around 2015, the cellular industry declared an emergency and said that the US was falling badly behind China in the race towards 5G. Both the White House and Congress jumped on board and said that quickly deploying 5G must be a top priority for the US economy. You might recall that the US government went so overboard on the 5G race that there was even talk about the US government buying Nokia or Ericsson so that the US wouldn’t be left behind.

In this environment, where pushing 5G forward was considered a national emergency, it was easy for the FCC to push through this order that gave cellular carriers everything on their wish list concerning small cell deployments. Just a few years later, we can see that 5G deployment was not an emergency. None of the big promises made about 5G have materialized, and in fact, the cellular carriers are still struggling to define a business plan that will monetize 5G.

The real reason for the push for 5G was that the 4G cellular networks were getting overloaded – and small cell sites were needed to bolster the existing cellular networks. Everybody relies on our cellular networks, and that was a legitimate reason for the FCC to take action – but the cellular companies never publicly made this argument. The carriers didn’t want the public to know that their 4G networks were in trouble since that would hurt their stock prices. Instead, the cellular companies pulled off one of the biggest public relations scams in history and invented the 5G race to push through regulations that benefitted them.

I agree with the CWA and NATOA that it’s time to put the genie back in the bottle and revisit the small cell order. As with all regulatory policy disputes, both sides of the issue have some legitimate concerns. The cellular carriers had a legitimate beef when they said that some cities took far too long to process permits for small cell sites. The cities also had legitimate concerns – they wanted some say in the placement and aesthetics of the small cell deployments, and they wanted to be able to say no to putting a refrigerator-sized device in the middle of a busy pole line.

It’s time for the FCC to reopen this docket and try again. We now know the kinds of devices that the cellular carriers want to place, and there can be separate rules for placing pizza boxes versus refrigerators on poles. We also now have thousands of examples of the effort required by cities to review and implement small cell requests. A new docket could examine the facts instead of being pushed forward by an imaginary 5G national emergency.

The cellular carriers got everything they wanted, and any regulatory ruling that is this one-sided is almost always a bad one. We now understand that there is no 5G race with China – but we also recognize that cellular carriers have a legitimate need to keep placing small cell sites. It’s time for the FCC to weigh the facts and reissue rules that strike a balance between cellular carrier and city interests – because that’s what good regulations are supposed to do.