The Government Needs to Address the Homework Gap

I’ve been at a bit of a loss over the last few days on what to write about, because suddenly newspapers, blogs, and social media are full of stories of how impossible it is for some students to work at home during the Covid-19 shutdowns. I’ve been writing about this topic for years and there doesn’t seem to be a lot I can add right now – because the endless testimonials from students and families struggling with the issue speak louder than anything I can say.

There have been a few small reactions from the federal government to help address the issue. For example, the FCC lifted the E-Rate restriction that said that E-Rate-funded broadband couldn’t be used by the general public. This allowed schools and libraries to aim their broadband outside for the general public and for students trying to keep up with homework. This was always a stupid restriction and I hope whatever DC bureaucrat originally dreamed this up is forced to use satellite broadband for the next year.

I’ve also seen notices from small ISPs that are distributing WiFi hotspots to students who need them. That is a great idea that I fully support. What I haven’t seen is anybody talking about who is going to pay the cellular data bills on those hotspots when they come due. Verizon has helped a little by temporarily adding 15 GB of usage to its data plans, but it doesn’t take long to rack up a big cellular data bill working on a hotspot.

These fixes are temporary band-aids. I’m sure any students benefiting from these recent changes are grateful. But it’s still second-class broadband that makes families park in cars while kids do homework. And as much as cellular hotspots are a great solution that brings broadband to the home – it’s also a curse if it brings broadband bills of hundreds of dollars per month just to do homework.
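
To put some rough numbers on that concern, here is a quick back-of-the-envelope sketch. The hours-per-day and gigabytes-per-hour figures are my own illustrative assumptions, not data from any carrier or school district.

```python
# Rough sketch: how long might a 15 GB hotspot allowance last for remote schoolwork?
# The usage assumptions below are illustrative guesses, not carrier or school data.

video_gb_per_hour = 1.0      # assumed: video conferencing at modest quality
video_hours_per_day = 4      # assumed: hours of video instruction per school day
other_gb_per_day = 0.5       # assumed: homework portals, downloads, email

gb_per_school_day = video_gb_per_hour * video_hours_per_day + other_gb_per_day
bonus_allowance_gb = 15      # the temporary data bump mentioned above

print(f"Estimated usage per school day: {gb_per_school_day:.1f} GB")
print(f"A {bonus_allowance_gb} GB allowance lasts roughly "
      f"{bonus_allowance_gb / gb_per_school_day:.1f} school days")
```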

I’m sure that most school systems will somehow slog through the rest of this school year. However, I’ve talked to several rural school administrators in the last week who worry that half of the children sent home are learning little or nothing. I’ve seen school systems already asking if they should push all students to the next grade this year, whether they are ready or not.

The big challenge is going to come if this crisis carries forward into the next school year starting this fall. I doubt that there are many school systems with rural students that are ready to face this for a whole school year. Let’s hope that doesn’t happen, but if it does then our lack of broadband for students becomes a national shame.

I don’t have many suggested quick solutions that will help the homework gap by the fall. It’s hard to even predict how much fiber construction will be done this summer due to social distancing – likely less than was planned.

One might hope that communities will install many more outdoor-facing hotspots. It would be nice to see these at every government building and at socially-minded businesses everywhere. This is a fix that is within the reach of every community. Any business that has broadband ought to consider sharing it during the times of the day or night when the business isn’t using it. Let’s turn all parking lots for towns of all sizes into WiFi zones.

It would also be nice if the FCC could somehow turn up the pressure on the wireless carriers to provide fixed cellular broadband. This is the technology used by AT&T that beams data using cellular frequencies from cell sites to small dishes at homes. This provides a better indoor signal than regular cellular service, and the cellular companies price this more like a broadband service than cellular service. AT&T has halfheartedly rolled out the product as a way to implement their CAF II obligations – but the word from rural areas is that it’s not marketed and nearly impossible for customers to buy. T-Mobile promised to roll this product out in every rural market as part of the agreement to merge with Sprint and the government needs to hold their feet to the fire to make this happen quickly this year.

Unfortunately, the FCC sabotaged their ability to push for better broadband solutions when they killed Title II authority and stopped regulating broadband. The solution we really need this year is for Congress to resolve the Title II issue once and for all and to make the FCC responsible for finding broadband solutions. Right now everything the FCC says on the topic is rhetoric because they have no power to compel ISPs to do anything. This is no time for politics and rhetoric, but a time for action.

The FCC is Redlining Rural America

Recent statistics on broadband usage in the US provide evidence that the FCC is, perhaps unwittingly, redlining rural America. OpenVault recently released its Broadband Industry Report for 4Q 2019 that tracks the way that the US consumes data. OpenVault has been collecting broadband usage data for more than ten years, and the last two reports have been eye-opening.

The most important finding is that the average data consumed by households grew by 27% from 2018 to 2019 – in the fourth quarter of 2019 the average US home used 344 gigabytes of data, up from 275 gigabytes a year earlier.

The report also looks at power users – homes that consume a lot of broadband. They report that nearly 1% of homes now use 2 terabytes per month and 7.7% use over 1 terabyte per month. A terabyte is 1,000 gigabytes. The percentage of homes using over 1 terabyte almost doubled from 4% a year earlier. This statistic is important because it shows that the number of homes hitting the 1-terabyte data caps of companies like Comcast, AT&T, Cox, and Mediacom is growing quickly.

Homes are starting to buy gigabit broadband when it’s available and affordable. 2.8% of homes in the country now subscribe to gigabit speeds, up 86% from the 1.5% of homes that bought gigabit in 2018.

54% of homes now purchase broadband plans with speeds of 100 Mbps or faster. Another 23.6% of homes are subscribing to broadband between 50-75 Mbps. This means that nearly 78% of homes are subscribing to data plans of greater than 50 Mbps. The average subscribed speed grew significantly in 2019, up from 103 Mbps to 128 Mbps.
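
For anyone who wants to check the arithmetic, the sketch below reproduces the year-over-year comparisons using only the numbers quoted above from the OpenVault report.

```python
# Quick check of the arithmetic behind the OpenVault figures quoted above.

over_1tb_2018, over_1tb_2019 = 4.0, 7.7     # % of homes using over 1 terabyte per month
share_100mbps_plus = 54.0                   # % of homes on plans of 100 Mbps or faster
share_50_to_75mbps = 23.6                   # % of homes on plans between 50-75 Mbps
avg_speed_2018, avg_speed_2019 = 103, 128   # average subscribed speed in Mbps
fcc_definition_mbps = 25                    # FCC download definition of broadband

print(f"Growth in homes over 1 TB: {over_1tb_2019 / over_1tb_2018:.2f}x (almost doubled)")
print(f"Homes subscribing above 50 Mbps: {share_100mbps_plus + share_50_to_75mbps:.1f}%")
print(f"Average subscribed speed gain: {avg_speed_2019 - avg_speed_2018} Mbps")
print(f"Average speed versus the FCC definition: {avg_speed_2019 - fcc_definition_mbps} Mbps faster")
```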

What’s the point of all of these statistics? They show that broadband usage and speeds in urban America are growing by leaps and bounds while broadband in rural America sits still. Urban broadband speeds have increased so rapidly that the average home in the US in 2019 got speeds that were 25 Mbps faster than what it had in 2018. The average subscribed speed in 2019 was more than 100 Mbps faster than the FCC definition of broadband. I contend that FCC actions and inaction have now culminated in the redlining of rural broadband households. It may sound drastic to call the FCC inaction redlining, but I think the word fits the situation.

Redlining historically has been used to describe how big corporations discriminate against poor neighborhoods. Redlining is more often due to neglect than to conscious decisions – grocery stores don’t consider poor neighborhoods as places to build; cable companies and telcos make upgrades in neighborhoods where they have the most customers or the highest revenue per customer. The consequence of redlining is that some neighborhoods get left behind.

The FCC has taken a series of actions that is dooming large parts of rural America to poor broadband for decades to come. One of the most egregious actions by the FCC is refusing to consider a faster definition of broadband, although every statistic shows that urban America is leaping far ahead of rural America and the broadband gap is now growing rapidly each year.

The decision to stick with the outdated 25/3 definition of broadband then boxes the FCC into allowing federal grant dollars to go toward building technologies that only meet the 25/3 definition of broadband. Considering how fast broadband speeds and consumption are growing, this is an amazingly shortsighted decision, especially since grant recipients for programs like RDOF have six years to construct the new networks. There will be ISPs still constructing 25/3 broadband networks using federal money in 2026.

Next, the FCC has made it clear that any rural area that gets any federal or state subsidy – even if it’s to support 25/3 Mbps service or satellite broadband – is not going to be eligible for future federal assistance. Once the FCC sticks you with poor broadband, they’re done with you.

Finally, the FCC continues to hide behind ludicrously dreadful maps that show good broadband available for millions of homes that have no broadband option. The rules for the 477 data collection are lousy, but that’s only half the problem, and I can’t recall ever hearing any discussion at the FCC about penalizing ISPs that file fraudulent speeds. There should be huge financial penalties for a telco that claims 25/3 speeds when nobody gets speeds even close to that or for WISPs that claim 100 Mbps speeds and deliver 15 Mbps. These ISPs are stopping whole counties from being eligible for broadband grants.

All of these FCC actions and inaction have blocked huge swaths of rural America from even participating in federal grant programs to get better broadband. If that’s not redlining, I don’t know what else to call it.

Old Regulation Rears its Head

The way that we regulate telecom services is interesting. The FCC has effectively eliminated federal regulation of broadband, the service that over 90% of households now use. Meanwhile, landline telephone service, the telecom product used by an ever-decreasing number of homes, is still heavily regulated.

The targets of much of the remaining regulation are the big telephone companies that still operate large copper networks. It’s easy to bash the big telephone companies because of the poor quality of services offered on those copper networks, and I’ve done so many times in this blog.

When you stop and think about it, those companies are still using copper networks built in the 50s, 60s, and 70s. Even if the telcos had been good stewards of those networks and maintained them meticulously, those networks would still be 50 to 70 years old – well past the 35-40 year expected life for copper networks. The big telcos largely ignored maintenance of copper for the last 30 years or more, and frankly, it’s a miracle that the old copper networks are still working.

Perhaps the oddest aspect of telephone regulation is that a regulatory body will occasionally punish a big telco for still being in the copper business. A good example is a proceeding in New Mexico last fall where CenturyLink asked to be deregulated for landline telephone services. This doesn’t mean that they would stop offering the services, but rather that many of the old regulations put in place in the heyday of the telephone monopolies would be relaxed. Most states have already deregulated the big telcos from a lot of the old telephone rules.

The New Mexico Public Regulation Commission (NMPRC) rejected the request and said that CenturyLink had not demonstrated that there was ‘effective competition’ for residential telephone service. It’s hard to find any way to defend that decision. First, in many states, there are now more residential telephone customers using cable company telephone services than the old telephone company copper. Interestingly, cable companies face almost no regulation in providing telephone service, and cable companies in New Mexico don’t operate under the same rules that CenturyLink must follow.

Further, the latest surveys I’ve seen show that 96% of US adults now have a cell phone. It’s hard to say with a straight face that cellular service is not a direct competitor to landline telephone service. Considering the big recent stir at the FCC where cellular 4G coverage maps were shown to be largely fictional, perhaps a lot of rural New Mexico doesn’t have cellular coverage – and perhaps that’s what drove the Commission’s decision. It’s worth noting that cellular companies are also not as heavily regulated as landline telephone providers.

The regulation that is most relevant in this case is the obligation to be the carrier of last resort. The telcos like CenturyLink are still expected, within some regulatory exceptions, to provide service to anybody who asks for service. That obligation doesn’t extend to the cable companies, to the cellular companies, or even to rural broadband – just to telephone service.

I have no doubt that there are rural homes in the state for which CenturyLink is the only communications link to the world. In areas where there is no cellular service and where the cable companies refuse to build networks, there are rural homes that rely on CenturyLink and other telcos to keep them connected. The regulatory question that must be asked is whether such homes are sufficient reason to still strongly regulate telephone service in a state. Hopefully, the number of homes without cellular service will decrease significantly when the FCC awards the $9 billion in the 5G Fund program to extend cellular service to more remote communities.

It’s not an easy question to answer. We know CenturyLink could have done a better job of taking care of their copper. In this country, the smaller independent telephone companies did the needed maintenance to keep copper in the best shape possible. We saw the same thing in Germany, where the copper networks were built at the same time as the US networks but have been maintained better.

But in this country, most of the smaller telcos have already replaced, or have plans to replace, the old copper with fiber. In Germany, there are vigorous public debates on the topic, with engineers saying that the copper networks are not likely to last more than another decade. Where copper remains, the Germans have invested in the fastest DSL possible – something the big telcos here inexplicably have not done.

To some degree the decision in New Mexico is meaningless. No regulatory decision can make the old copper perform better or last longer, so there are not many practical ramifications of the Commission’s decision. CenturyLink didn’t even own these networks for most of the years when the maintenance wasn’t done – although they have likely cut back further on maintenance in recent years, as have the other big telcos.

I’m not singling out New Mexico on this issue – many other states have made similar regulatory decisions. Regulators are rightfully mad at the big telcos for neglecting copper, and even madder that there are no plans to upgrade the copper to something better. But the time for regulators to do something about this was twenty or thirty years ago. The copper wires in New Mexico are going to die, and at some future date the networks will go dark. The regulators can choose to regulate copper down to the last day of the last customer – but to a large degree, the remaining regulations don’t mean a whole lot.

Why are US Broadband Prices so High?

I’ve wondered for years about why broadband prices are higher in the US than in the rest of the world. The average price in other industrialized countries is significantly lower. In France broadband averages $31, Germany is $35, Japan is $35, South Korea is $33, and the UK is $35. The average price of broadband in the US is approaching $70, so we’re at roughly twice the price of other countries.
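
A back-of-the-envelope comparison using just the prices quoted above shows where the "twice the price" figure comes from.

```python
# Compare the average broadband prices quoted above.

prices_abroad = {"France": 31, "Germany": 35, "Japan": 35, "South Korea": 33, "UK": 35}
us_price = 70   # approximate average US broadband price cited above

average_abroad = sum(prices_abroad.values()) / len(prices_abroad)
print(f"Average across the five countries: ${average_abroad:.2f}")                  # $33.80
print(f"US price as a multiple of that average: {us_price / average_abroad:.1f}x")  # ~2.1x
```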

Thomas Philippon tackles this question in his new book The Great Reversal: How America Gave Up on Free Markets. He’s an economist at NYU who moved to the US in the 1990s but has kept an eye on Europe. The book looks at a lot more than just broadband prices and Philippon looks at other major industries like airlines, pharmaceuticals, and the US food chain.

He says something that was a wake-up call to me. Go back 30-40 years and the situation was reversed. At that time the US had some of the lowest prices in the world for things like telecom, airline tickets, pharmaceuticals, and food – and not just a little cheaper. Prices here were 30-40% lower than in Europe at that time. In just a few decades the situation has completely reversed and US prices from major industries are now much higher than in Europe.

How did this happen? He attributes the cause almost entirely to what he calls corporate concentration. In every one of the industries where prices have climbed in the US, there have been numerous large corporate mergers that have had the net impact of reducing competition. As fewer and fewer giant companies control a market there is less competition. One result of corporate concentration is the ability of industries to squash regulations through corporate lobbying – and lowering regulations inevitably leads to higher profits and higher prices.

It’s not hard to trace the history of consolidation through any of the major industries in this country. Since the readers of the blog are telecom folks, consider the telecom landscape in 1990:

  • At that time the Baby Bell companies were still separate from AT&T.
  • There was a vigorous CLEC market starting to grow led by companies like MCI.
  • There were probably a hundred different medium-sized regional cable companies.
  • There were not as many cellular companies due to the limited licenses granted for spectrum, but there was still strong regional competition from smaller cellular companies.
  • There were dozens of thriving manufacturers of telecom electronics.
  • In 1990 we had vigorous regulation, and at the state level there was still a lot of telecom rate regulation.

In just thirty years that picture changed. Most of the Baby Bells came back together under the AT&T umbrella. Comcast and Charter went on wild buying sprees and consolidated most of the medium-sized cable companies. Telcos purchased and neutered their competition, like the purchase of MCI by Verizon. Comcast and AT&T went on to merge with giant content providers to further consolidate the industry supply chain.

Telecom regulation has been all but killed in the country. This is almost entirely at the bidding of lobbyists. The current FCC went so far as to write themselves out of regulating broadband. All of these events resulted in US broadband that now costs twice as much as the rest of the industrialized world.

Meanwhile, Europe took the opposite approach. In 1990, regulation in Europe was local to each country and concentrated on protecting local industries in each country – and that led to high prices. However, after the creation of the European Union in 1993, regulators adopted the philosophy of promoting competition in every major industry. From that point forward, European regulators made it extremely difficult for competing corporations to merge. Regulators took special care to protect new market entrants to give them a chance to grow and thrive.

The regulatory policies in the US and Europe have completely flipped since 1990. The US was pro-competition in the 90s, as well-evidenced by the Telecommunications Act of 1996. Today’s FCC is working hard to eliminate regulation. European regulators now put competition first when making decisions.

It’s never too late for the US to swing back to being more competitive. However, for now, the monopolies are winning, and it will be hard to break their hold on politicians and regulators. But this is something we’ve seen before. At the turn of the twentieth century, big corporations had a stranglehold on the US. Monopolies inevitably abuse their market power and eventually there is enough public backlash to push the government to re-regulate industries.

Sharing Grant-funded Fiber

The FCC misses no opportunity to talk about how much they support rural broadband, so hopefully they will take advantage of an opportunity to open up a lot of new fiber in rural America. The FCC is going to award $9 billion later this year through the 5G Fund, which is intended to bring better cell phone coverage to rural areas. That funding will go to cellular carriers.

A lot of the 5G Fund is going to be used to build fiber to rural cell towers and the FCC should make any such middle-mile fiber available to others at affordable rates. One of the biggest impediments to building last-mile networks in remote areas is still the absence of fiber backhaul. If the FCC is going to pay to run fiber to rural areas, then it only makes sense they would make such fiber available to last-mile ISPs.

The big cellular carriers will say that this is a burden they don’t want to bear, but that is bosh. Big companies like Verizon and AT&T are already among the largest sellers of fiber transport in the country, so they have everything needed to sell transport on these new fiber routes. The cellular companies will already be obligated to maintain the new fiber routes, so carrying additional traffic in the fibers doesn’t increase ongoing costs. Since the fiber will be free to the cellular carriers, the transport rates ought to be set low – any revenue derived on these fibers would still be pure gravy for the cellular companies.

There will be smaller cellular carriers in the auction, and I would expect most of them to already be planning on selling transport on any new fiber routes. But not all of the smaller carriers will do so, so the FCC should make this mandatory – as they should for any middle-mile fiber route funded by the federal coffers.

States should also adopt this same policy. I’ve seen state grants go towards middle-mile fiber that was not made available to other carriers at affordable rates. Middle-mile fiber subsidized by the government should always be made available to others, and at subsidized rates that recognize the government contribution towards paying for the fiber.

I don’t think the same thing should be true for last-mile fiber. Most grant funding today is being used to build last-mile fiber in areas of low density. Even with grant funding, many of these last-mile projects barely pay for themselves. It would make no sense to allow competitors into last-mile fiber, because doing so might bankrupt the ISP that won the grant to build to a remote area.

Sharing was mandated for the middle-mile fiber built with the stimulus grants a decade ago. Many of those middle-mile networks have been leveraged to enable last-mile broadband projects that might otherwise never have materialized. But there are middle-mile projects from that program that didn’t follow the rules, like the middle-mile network in West Virginia that was basically handed to Frontier to use and charge as they wish.

The big carriers have a poor record of sharing fiber with competitors. The Telecommunications Act of 1996 mandated that the big telcos make excess dark fiber available to others with rates set at incremental cost. While some persistent ISPs have been able to lease dark fiber under those rules, the big telcos have worked hard to make it too difficult for somebody to buy. The telcos have also convinced the FCC over the years to change the rules to make it harder to buy dark fiber.

If this new batch of fiber is made available to others there must be rules. Without guidelines, the big telcos will declare that they need all of the fiber strands being built, even if they only use two fibers out of a 24-fiber bundle. The FCC rules should include guidelines for setting a reasonable number of spare and reserve fibers.

The rules for the 5G fund have not yet been finalized, and hopefully, the FCC will do the right thing. These new fiber routes are going to some of the most remote places in the country and not all middle-mile routes will be of any use to others. Even if only one out of ten of the fiber routes built with the 5G Fund is used to create last-mile networks, the 5G Fund will have accomplished more than just improving rural cellular coverage.

How FCC Policies Hurt Communities

I was recently looking at one of the counties where the winner of the CAF II reverse auction was Viasat, a satellite broadband provider. There are many other rural counties with an identical outcome. As I thought about these counties, I came to realize that a series of FCC policies and decisions have hurt these counties in their search for better broadband. There is no single FCC action that hurt them, but a cascading series of individual decisions have made it harder for them to find a broadband solution.

The first decision that created the current situation came when the current FCC declined to consider an increase in the definition of broadband from 25/3 Mbps. That definition was set in 2015 and there is an ample record on file in FCC proceedings that 25/3 is already an obsolete definition of broadband.

The most recent evidence comes from OpenVault. The company just released its Broadband Industry Report for 4Q 2019 that shows the average subscribed speed in the US grew from 103 Mbps in 2018 to 128 Mbps in 2019. That result is largely being driven by the cable companies and the fiber providers that serve more than 2/3 of all of the broadband customers in the country. The FCC is stubbornly sticking to the 25/3 Mbps definition of broadband even as a large majority of households in the country are being given speeds greater than 100 Mbps.

The decision to stick with the outdated 25/3 Mbps then created a second problem for rural America when the outdated FCC speed definition was used to award federal grants. The FCC decided in the CAF II reverse auction grants that any technology that met the 25/3 Mbps speed was acceptable. The FCC boxed themselves in since they couldn’t set a higher speed threshold for grants without admitting that the 25/3 Mbps threshold is inadequate. That auction awarded funding for technologies that can’t deliver much more than 25 Mbps. What’s worse is that the winners don’t have to finish building new networks until 2025. When the FCC blessed the use of the 25/3 threshold in the reverse auction they also implicitly declared that 25/3 Mbps broadband will still be adequate in 2025.

The next FCC decision that is hurting these specific counties came when the FCC decided to allow satellite broadband companies to bid for scarce federal broadband grant monies. The FCC probably thought they had no choice since the satellite providers can meet the 25/3 Mbps speed threshold. This was a dreadful decision. Satellite broadband is already available everywhere in the US, and a grant given to a satellite provider brings no new broadband option to a rural area and only pads the bottom line of the satellite companies – it doesn’t push rural broadband coverage forward by a millimeter.

Finally, the FCC recently rubbed salt in the wound by saying that areas that got a previous state or federal broadband grant won’t be eligible for additional federal grants out of the upcoming $20.4 billion RDOF grant program. This means that a county where a broadband grant was given to a satellite provider is ineligible for grant money to find a real broadband solution.

Such counties are possibly doomed to be stuck without a broadband solution due to this chain of decisions by the FCC. I’m sure that the FCC didn’t set out to hurt these rural counties – but their accumulated actions are doing just that. Each of the FCC decisions I described was made at a different time, in reaction to a different issue facing the FCC. Each new decision built on prior ones, and together they culminated in a real dilemma for these counties. Through no fault of their own, these counties are now saddled with satellite broadband and a prohibition against getting additional grant monies to fund an actual broadband solution.

A lot of this is due to the FCC not having a coherent rural broadband policy. Decisions are made ad hoc without enough deliberation to understand the consequences of decisions. At the heart of the problem is regulatory cowardice where the FCC is refusing to acknowledge that the country has moved far past the 25/3 Mbps broadband threshold. When 2/3 of the country can buy speeds in excess of 100 Mbps it’s inexcusable to award new grant monies for technologies that deliver speeds slower than that.

It’s obvious why the FCC won’t recognize a faster definition of broadband, say 100 Mbps. Such a decision would instantly classify millions of homes as not having adequate broadband. There is virtually no chance that the current FCC will do the right thing – and so counties that fell through the regulatory cracks will have to find a broadband solution that doesn’t rely on the FCC.

California’s New Privacy Law

If you use the web much, you noticed a flurry of new privacy notices at the end of last year, either through pop-up notifications when you visited a website or by email. These notifications were all due to the California Consumer Privacy Act, the new privacy law that went into effect on January 1.

The law applies to companies that use the web and that have annual revenues over $25 million, companies that buy, sell or collect data on 50,000 or more consumers, and companies of any size that make more than 50% of their revenue by selling customers’ personal information.
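
As a plain-language illustration of that applicability test, here is a small sketch of the three thresholds as summarized above. This is my own simplification for illustration, not legal advice or the statutory text.

```python
# Sketch of the CCPA applicability thresholds as summarized above.
# Illustrative only - not legal advice or the statutory language.

def ccpa_applies(annual_revenue: float,
                 consumers_with_data: int,
                 share_of_revenue_from_selling_data: float) -> bool:
    """A company meeting any one of the three thresholds is covered."""
    return (annual_revenue > 25_000_000
            or consumers_with_data >= 50_000
            or share_of_revenue_from_selling_data > 0.50)

# Hypothetical example: modest revenue, but a large consumer database.
print(ccpa_applies(annual_revenue=5_000_000,
                   consumers_with_data=80_000,
                   share_of_revenue_from_selling_data=0.10))   # True
```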

The new law has a lot of requirements for web companies operating in California. Web companies must provide California consumers the ability to opt-out from having their personal information sold to others. Consumers must be given the option to have their data deleted. Consumers must be provided the opportunity to view the data collected about them. Consumers also must be shown the identity of third parties that have purchased their data.

The new law defines personal data broadly to include things like name, address, online identifiers, IP addresses, email addresses, purchasing history, geolocation data, audio/video data, biometric data, or any effort made to classify customers by personality type or trends.

The penalties for violating the law are severe – companies face fines of up to $2,500 per violation if they didn’t offer these options by January 1, and up to $7,500 per violation if they intentionally violate the law. It’s not too hard to anticipate the class action lawsuits already brewing that will result from this law.

While these new rules only apply to web companies and how they interact with California consumers, many websites have taken the cautious route and are applying the new rules to everybody. That’s the safe approach because it’s difficult for web companies to always know where a web visitor is from, especially for people who use VPNs to hide their location.

California isn’t the only state with new privacy rules. Washington has new rules that are not as severe as the California ones but that still layer a lot of new requirements onto ISPs. New York is working on a privacy law that is said to be even tougher than the California one.

These state laws are only in place because Congress seems unable to pass a set of federal privacy rules. The issue has been debated over the last two years, and draft bills have been written, but no proposed law has come before the Senate for a vote, so the issue has gone nowhere. People are rightfully concerned that their data is being used and many people want the government to set some guidelines to protect them. The states are filling the legislative void in the absence of federal legislators taking action.

Web companies will face dilemmas with a proliferation of state privacy laws. Do they try to apply each state’s rules only to customers in that state? What’s most concerning for web companies is that as more states pass privacy laws, some of the laws will inevitably conflict. There is also a big question about how these laws apply to foreign companies. The California law is written to apply to every company interfacing with California consumers. To complicate matters for web companies, European Union privacy rules are also tough and will inevitably conflict with parts of the California rules.

Like all new laws, this one will be tested in court. The more interesting challenges will be how the law might impact companies from outside California. The $25 million of revenue is a low threshold and there are numerous companies across the country with revenues of that size that have likely done nothing in response to this law. If companies keep even the most rudimentary database of customer information, then theoretically they violate this law if anybody in the database resides in California. There are going to be lawyers trying to make a living from chasing companies that violate the law, and I doubt that it will take long for the lawsuits to surface.

Will the Big Telcos Pursue RDOF Grants?

One of the most intriguing questions concerning the upcoming $16.4 billion RDOF grant program is whether the big telcos are going to participate. I’ve asked the question around the industry, and I’ve talked to folks who think the big telcos will fully wade into the reverse auction, while others think they’ll barely play. We’re not likely to know until the auction begins.

The big telcos were the full beneficiaries of the original CAF II program when the FCC surprisingly decided to unilaterally award them the full $9 billion in funding. In that grant program, CenturyLink received over $3 billion, AT&T almost $2.6 billion, Frontier nearly $2 billion, and Windstream over $1 billion. The telcos were supposed to upgrade much of their most rural properties to broadband speeds of at least 10/1 Mbps.

CenturyLink and Frontier both recently told the FCC that they are behind in the CAF II build out and didn’t meet their obligation at the end of 2019 to be 80% finished with the upgrades. From what I hear from rural communities, I think the problem is a lot more severe than just the telcos being late. Communities across the country have been telling me that their residents aren’t seeing faster speeds and I think we’re going to eventually find out that a lot of the upgrades aren’t being made.

Regardless of the problems with the original CAF II, the FCC is now offering the $16.4 billion RDOF grant program to cover much of the same areas covered by CAF II. The big telcos are faced with several dilemmas. If they don’t participate, then others are going to get federal assistance to overbuild the traditional big telco service territories. If the big telcos do participate, they have to promise to upgrade to meet the minimum speed obligations of the RDOF of 25/3 Mbps.

Interestingly, the upgrades needed to raise DSL speeds on copper to 25/3 Mbps are not drastically different than the upgrades needed to reach 10/1 Mbps. The upgrades require building fiber deeper into last-mile networks and installing DSL transmitters (DSLAMs) in the field to be within a few miles of subscribers. Fiber must be a little closer to the customer to achieve a speed of 25/3 Mbps rather than 10/1 Mbps – but not drastically closer.
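
As a rough illustration of that point, the table below sketches the commonly cited relationship between copper loop length and attainable DSL speed. These are rule-of-thumb numbers I’m using for illustration only; real-world results vary widely with copper condition, wire gauge, and the DSL technology deployed.

```python
# Rough rule-of-thumb illustration of DSL speed versus copper loop length.
# These numbers are illustrative approximations, not vendor specifications -
# actual performance depends on copper condition, gauge, and DSL flavor.

approx_download_mbps_by_loop_miles = {
    0.5: 50,   # short loops (VDSL2-class service)
    1.0: 25,   # roughly where a 25/3 target becomes plausible
    2.0: 10,   # roughly where a 10/1 target becomes plausible
    3.0: 5,    # long loops - single-digit speeds
    5.0: 1,    # very long loops - dial-up-era performance
}

def loops_meeting(target_mbps):
    """Loop lengths (miles) from the rough table that clear a speed target."""
    return [miles for miles, mbps in approx_download_mbps_by_loop_miles.items()
            if mbps >= target_mbps]

print("Loops that might support 25 Mbps:", loops_meeting(25))   # [0.5, 1.0]
print("Loops that might support 10 Mbps:", loops_meeting(10))   # [0.5, 1.0, 2.0]
```

The takeaway is the one made above – the fiber-fed DSLAM has to sit somewhat closer to the customer to hit 25/3 than to hit 10/1, but not drastically closer.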

I think the big telcos encountered two problems with the CAF II DSL upgrades. First, they needed to build a lot more fiber than was being funded by CAF II to get fiber within a few miles of every customer. Second, the condition of their rural copper is dreadful and much of it probably won’t support DSL speeds. The big telcos have ignored their rural copper for decades and found themselves unable to coax faster DSL speeds from the old and mistreated copper.

This raises the question of what it even means if the big telcos decide to chase RDOF funding. Throwing more money at their lousy copper is not going to make it perform any better. If they were unable to get 10/1 speeds out of their networks, then they are surely going to be unable to upgrade speeds to 25/3 Mbps.

We can’t ignore that the big telcos have a natural advantage in the RDOF auction. They can file for the money everywhere, and any place where a faster competitor isn’t vying for the money, the big telcos will have a good chance of winning the reverse auction. There are bound to be plenty of places where nobody else bids on RDOF funding, particularly in places like Appalachia where the cost is so high to build, even with grant funding.

It would be a travesty to see any more federal grant money spent to upgrade rural DSL, particularly since the FCC already spent $9 billion trying to upgrade the same copper networks. The copper networks everywhere are past their expected useful lives, and the networks operated by the big telcos are in the worst shape. I’ve known many smaller telcos that tried in the past to upgrade rural DSL to 25/3 and failed – and those companies had networks that were well-maintained and in good condition. It would be impossible to believe the big telcos if they say they can upgrade the most remote homes in the country to 25/3 Mbps speeds. Unfortunately, the way I read the RDOF rules, there is nothing to stop the big telcos from joining the auction, taking big chunks of the grant money, and then failing again like they did with the original CAF II.

Kari’s Law

ISPs should be aware of two new laws that went into effect in January. The first is Kari’s Law. This law requires that all phone systems sold, leased or installed after February 16, 2020 must be pre-configured so that a user can directly dial 911 with no other dialing steps required. The law is aimed at bringing direct 911 access to buildings served by PBXs, key systems, Centrex, and cloud-based VoIP. The intent is to improve 911 for places like hotels, universities, hospitals, and businesses. This law puts the responsibility to comply not only on phone system manufacturers, but also on anybody who installs, manages, or operates a voice system.

The law also creates new requirements on existing phone systems that went into effect on January 6. Phone systems must be configured so that any call placed to 911 will immediately notify a ‘central location’ that a call has been placed to 911. This must be implemented immediately for any existing phone system that can provide the notification without needing a software or hardware upgrade. The FCC believes that a large percentage of phone systems are capable of making such notifications, so those notifications must be activated. It’s worth noting that there is no exemption for small businesses – anybody operating a private phone system is expected to comply with the law. Interestingly, the law applies to outward-calling locations like an outbound call center that can’t receive calls.

The FCC leaves a lot of interpretive room in defining a ‘central location’ for delivering the notification. Their goal is that a notification of a call to 911 must be sent to a location where there is a high likelihood that somebody will see it. The FCC wants 911 centers to be able to contact somebody at a business to gain entrance and to hopefully locate the person that made the 911 call.

Notifications can be in any format including emails, text messages, pop-up notifications, alarms, etc. The new rules also require some ancillary information to be included in the notification, where technically feasible. This includes information like a callback number and as much information as possible about the location of the 911 caller (room number, wing of the building, etc.).
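
To make the notification requirement concrete, here is a minimal sketch of what a 911 notification hook in a phone system might look like. The function names and the email transport are my own illustrative choices under these assumptions – they are not part of the law or any vendor’s API.

```python
# Minimal sketch of a Kari's Law-style notification hook: when a station dials 911,
# send an alert to a monitored "central location" with a callback number and whatever
# location detail the system knows. The names and email transport are illustrative
# assumptions, not any vendor's API or the regulatory text.

import smtplib
from email.message import EmailMessage

CENTRAL_NOTIFICATION_ADDRESS = "front-desk@example.com"   # hypothetical monitored inbox

def on_outbound_call(dialed_number: str, callback_number: str, station_location: str):
    """Hook a phone system would invoke for every outbound call."""
    if dialed_number == "911":
        notify_central_location(callback_number, station_location)
    # The 911 call itself completes regardless of whether the notification succeeds.

def notify_central_location(callback_number: str, station_location: str):
    msg = EmailMessage()
    msg["Subject"] = "911 CALL PLACED"
    msg["From"] = "pbx-alerts@example.com"                # hypothetical alert sender
    msg["To"] = CENTRAL_NOTIFICATION_ADDRESS
    msg.set_content(
        "A 911 call was placed.\n"
        f"Callback number: {callback_number}\n"
        f"Location: {station_location}"
    )
    with smtplib.SMTP("localhost") as server:             # assumes a local mail relay
        server.send_message(msg)
```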

To the extent possible this also applies to ‘mobile’ devices that are controlled by a phone system. This might include cordless phones used inside of a business or desksets that can be moved anywhere within the business. Companies are not expected to track commercial cellphones that aren’t on their system or the location of devices that are carried off-site.

The second law that went into effect in January is Ray Baum’s Act. One of the many provisions of this law requires that 911 centers be provided with ‘dispatchable location’ information. In plain English, that means that first responders want to know ‘the right door to kick down’ when responding to a 911 call. This goes into effect concurrently with Kari’s Law and means that businesses must provide more information to 911 centers about how to respond to calls made from their location.

This new law is also aimed at the same kinds of buildings as Kari’s Law – places like hotels or businesses where a 911 responder doesn’t know how to locate the person who called 911. At a minimum, every call to 911 must convey a validated 911 street address. That’s routine information for calls made from single-family homes, but not necessarily for a complex location like a hospital or a high-rise office complex. If a validated 911 address can be conveyed today, it must be. Businesses are given one year to implement this change and are expected to coordinate with 911 centers if they want to provide complicated information on room numbers, building layouts, etc.

The law also requires better reporting for mobile devices that are controlled by a phone system. The rules expect the notification to 911 to include the best information possible about the location of a caller with a mobile device such as a cordless phone. This could be as detailed as a room number or something less accurate such as the location of the nearest WiFi hotspot. Companies have two years to implement this change.

The changes that come from Ray Baum’s Act are intended to be coordinated with the nearest 911 PSAP so that they understand the nature and quality of the location data when they get a call. Businesses are expected to notify PSAPs, and test as needed, to make sure that PSAPs know how to locate callers at the business. The FCC has the ability to fine parties that don’t comply with the law, so expect a few test cases within the next year when businesses fail to implement the new rules or else fail to convey information to their 911 center.

Is the FCC Killing State Matching Grants?

In a bizarre last-minute change to the language approved for the upcoming $16.4 billion RDOF grant program, the FCC inserted new language into the rules that would seem to bar grant applicants from accepting matching state grants for projects funded by the RDOF grants.

The new language specifically says that the RDOF grant program now excludes any geographic area that the Commission “know[s] to be awarded funding through the U.S. Department of Agriculture’s ReConnect Program or other similar federal or state broadband subsidy programs or those subject to enforceable broadband deployment obligations.”

It’s fully understandable that the FCC doesn’t want to award grant money from multiple federal grant programs for the same project, and that was a loophole that is sensible to close. I think most industry folks understood this to be true even if it wasn’t in writing.

But the idea of blocking states from making grants to supplement RDOF is counterintuitive. More than half of the states now have state broadband grant programs. It makes no sense for the FCC to tell states how they can spend (or in this case how they cannot spend) their state grant monies.

The whole concept of blocking state matching grants goes against the tradition of federal funding. The vast majority of federal funding programs for infrastructure encourage state matching funds and many programs require it. Matching state grants are used along with federal grants for building infrastructure such as roads, bridges, water and sewer systems, airports, etc. Why would the FCC want to block this for broadband?

The state grant programs that I’m most familiar with were planning to provide matching grants for some RDOF grants. Broadband offices at the state level understand that building broadband networks can be expensive and they know that in some cases the extra funding is needed to make broadband projects viable.

It’s important to remember that the RDOF grants are aimed at the most remote customers in the country – customers that, by definition, will require the largest investment per customer to bring broadband. This is due almost entirely to the lower household densities in the RDOF grant areas. Costs can also be driven up by local conditions like rocky soil or rough terrain. Federal funding that provides enough money to build broadband in the plains states is likely not going to be enough to induce somebody to build in the remote parts of Appalachia where the RDOF grants are most needed.
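
A simple worked example shows why density dominates the economics. The per-mile construction cost below is a hypothetical round number chosen for illustration, not a figure from the RDOF rules or any actual bid.

```python
# Why household density drives the cost per home passed.
# The $40,000-per-mile figure is a hypothetical round number for illustration only.

cost_per_route_mile = 40_000

for homes_per_mile in (20, 5, 2):
    cost_per_home = cost_per_route_mile / homes_per_mile
    print(f"{homes_per_mile:>2} homes per mile -> ${cost_per_home:,.0f} per home passed")
```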

State grant programs often also have other agendas. For example, the Border-to-Border grants in Minnesota won’t fund broadband projects that can’t achieve at least 100 Mbps download speeds. This was a deliberate decision so that government funding wouldn’t be wasted building broadband infrastructure that will be too slow and obsolete soon after it’s constructed. By contrast, the FCC RDOF program is allowing applicants proposing speeds as slow as 25 Mbps. It’s not hard to argue that that speed is already obsolete.

I know ISPs that were already hoping for a combination of federal and state grants to build rural infrastructure. If the FCC kills matching grants, then it will be killing the plans of ISPs that wanted to use the combined grants to build fiber networks – a permanent broadband solution. Even with both state and federal grants, these ISPs were planning to take on a huge debt burden to make it work.

If the matching grants are killed, I have no doubt that the RDOF money will still be awarded to somebody. However, instead of going to a rural telco or electric coop that wants to build fiber, the grants will go to the big incumbent telephone companies to waste money by pretending to goose rural DSL up to 25 Mbps. Even worse, much of the funding might go to the satellite companies that offer nothing new and a product that people hate. I hate to engage in conspiracy theories, but one of the few justifications I can see for killing matching grants is to make it easier for the big incumbent telcos to win, and waste, another round of federal grant funding.