We Need Public 5G Spectrum

Last October the FCC issued a Notice of Proposed Rulemaking proposing to expand WiFi into the 6 GHz band of spectrum (5.925 to 7.125 GHz). WiFi has been a huge economic boon to the country, and the FCC recognizes that providing more free public spectrum is a vital piece of the spectrum puzzle. Entrepreneurs have found a myriad of inventive ways to use WiFi that go far beyond what carriers have provided with licensed spectrum.

In much of the country the 6 GHz spectrum is likely to be limited to indoor usage due to possible outdoor interference with Broadcast Auxiliary Service, where remote crews transmit news feeds to radio and TV stations, and Cable Television Relay Service, which cable companies use to transmit data within their own networks. The biggest future needs for WiFi are going to be indoors, so restricting this spectrum to indoor use doesn’t feel like an unreasonable limitation.

However, WiFi has some inherent limitations. The biggest problem with the WiFi standard is that a WiFi network pauses to give every device a chance to use the bandwidth. In a crowded environment with a lot of devices the constant pausing adds latency, and in heavy-use environments like a business hotel the constant pauses can nearly shut down a WiFi network. Most of us don’t notice that contention inside our homes today, but as we add more and more devices over time we will start to see this inherent WiFi interference on our own networks. The places where WiFi interference is already a big concern are heavy wireless environments like hospitals, factories, airports, business hotels, and convention centers.
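
To make the contention problem concrete, here is a minimal sketch in Python of a deliberately simplified sharing model – not the actual 802.11 CSMA/CA algorithm – where every active device takes a turn on the shared channel. The device counts and the 2 ms per-frame airtime are illustrative assumptions, not measured values.

    # Crude illustration only: round-robin sharing of one channel, not real CSMA/CA.
    def average_wait_ms(devices: int, airtime_per_frame_ms: float = 2.0) -> float:
        """Average wait before a device gets the channel when every device
        has one frame queued and they take turns transmitting."""
        # On average a device waits for half of the other devices to transmit first.
        return (devices - 1) / 2 * airtime_per_frame_ms

    for n in (5, 25, 100):  # a quiet home, a gadget-filled home, a hotel ballroom
        print(f"{n:>3} devices -> ~{average_wait_ms(n):.0f} ms average wait per frame")

Even this toy model shows why a room with a hundred active devices feels sluggish while a home with a handful of devices does not.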

Many of our future computing needs are going to require low latency. For instance, creating home holograms from multiple transmitters is going to require timely delivery of packets to each transmitter. Using augmented reality to assist in surgery will require delivery of images in real time. WiFi promises to get better with the introduction of WiFi 6 using the 802.11ax standard, but that new standard does not eliminate the innate limitations of WiFi.

The good news is that we already have a new wireless standard that can create low-latency, dedicated signal paths to users. Fully implemented 5G with network slicing can be used to satisfy those situations where WiFi doesn’t meet the need. It’s not hard to picture a future indoor network where a single router satisfies some user needs using the WiFi standard and others using 5G – the router will choose the best standard to use for a given need.

To some degree the cellular carriers have this same vision. They talk of 5G taking over IoT needs instead of WiFi. They talk about using 5G for low-latency uses like augmented reality. But when comparing the history of the cellular networks and WiFi, it’s clear that WiFi has been used far more creatively. There are thousands of vendors working in today’s limited WiFi spectrum that have developed a wide array of wireless services. Comparatively, the cellular carriers have been quite vanilla in their use of cellular networks to deliver voice and data.

I have no doubt that AT&T and Verizon have plans to offer million-dollar 5G solutions for smart factories, hospitals, airports and other busy wireless environments. But in doing so they will tap only a tiny fraction of the capability of 5G. If we want 5G to actually meet the high expectations that the industry has established, we ought to create a public swath of spectrum that can use 5G. The FCC could easily empower the use of the 6 GHz spectrum for both WiFi and 5G, and in doing so would unleash wireless entrepreneurs to come up with technologies that haven’t even been imagined.

The current vision of the cellular carriers is to somehow charge everybody a monthly subscription to use 5G – and there will be enough devices using the spectrum that most people will eventually give in and buy the subscription. However, the big carriers are not going to be particularly creative, and instead are likely to be very restrictive on how we use 5G.

The alternate vision is to set aside a decent slice of public spectrum for indoor use of 5G. The public will gain use of the spectrum by buying a 5G router, with no monthly subscription fee – because it’s using public spectrum. After all, 5G is just a standard, developed worldwide, and is not the proprietary property of the big cellular companies. Entrepreneurs will jump on the opportunity to develop great uses for the spectrum and the 5G standard. Rather than being held captive by the limited vision of AT&T and Verizon, we’d see a huge number of devices using 5G creatively. This could truly unleash things like augmented reality and virtual presence. Specialty vendors would develop applications that make great strides in hospital health care. We’d finally see smart shopping holograms in stores.

The public probably doesn’t understand that the FCC has complete authority over how each swath of spectrum is used. Only the FCC can determine which spectrum can or cannot be used for WiFi, 5G and other standards. The choice ought to be an easy one. The FCC can let a handful of cellular companies decide how society will use 5G or they can unleash the creativity of thousands of developers to come up with a myriad of 5G applications. We know that creating public spectrum creates immense societal and economic good. If the FCC hadn’t set aside public spectrum for WiFi we’d all still have wires to all our home broadband devices and many of the things we now take for granted would never have come to pass.

Another Story of Lagging Broadband

We don’t really need any more proof that the FCC broadband data is massively out of touch with reality. However, it seems like I see another example of this almost weekly. The latest news comes from Georgia where the Atlanta Journal-Constitution published an article that compared actual broadband speeds measured by speed tests to the FCC data. The newspaper analyzed speed tests from June through December 2017 and compared those results to the FCC databases of supposed broadband speeds for the same time period. Like everywhere else that has done this same comparison, the newspaper found the FCC data speeds to be overstated – in this case, way overstated.

The newspaper relied on speed tests provided by Measurement Labs, an Internet research group that includes Google, the Code for Science & Society, New America’s Open Technology Institute, and Princeton University’s PlanetLab. These speed tests showed an average Internet speed of only 6.3 Mbps in areas where the FCC data reported that 25 Mbps speeds are available.

Anybody that understands the FCC mapping methodology knows that you have to make such a comparison carefully. The FCC maps are supposed to show available speeds and not actual speeds, so to some degree the newspaper is comparing apples and oranges. For instance, when multiple speeds are available, some people still elect to buy slower speeds to save money. I would expect the average speed in an area where 25 Mbps is the fastest broadband to be something lower than that.
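
For readers who want to make this comparison themselves, here is a minimal sketch of the idea using Python and pandas, with made-up numbers and hypothetical column names (the real M-Lab exports and FCC Form 477 files use different schemas): compare the median measured speed in an area to the fastest speed an ISP claims to offer there.

    import pandas as pd

    # Hypothetical speed test results and FCC-reported availability, by census block.
    tests = pd.DataFrame({
        "census_block": ["A", "A", "A", "B", "B"],
        "download_mbps": [5.1, 6.8, 7.2, 30.0, 24.5],
    })
    fcc = pd.DataFrame({
        "census_block": ["A", "B"],
        "max_advertised_mbps": [25, 100],
    })

    # Median measured speed per block vs. the fastest claimed speed.  Some gap is
    # expected because people buy slower tiers; a median far below the claimed
    # maximum suggests the claimed speed isn't really available.
    measured = tests.groupby("census_block")["download_mbps"].median()
    merged = fcc.set_index("census_block").join(measured.rename("median_measured_mbps"))
    merged["measured_share_of_claim"] = merged["median_measured_mbps"] / merged["max_advertised_mbps"]
    print(merged)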

However, the ultralow average speed test result of 6.3 Mbps points out a big problem in rural Georgia – homes electing to buy lower speeds can’t possibly account for that much of a difference. One thing we do know is that an area shown by the FCC as having 25 Mbps broadband is probably served by DSL and perhaps by fixed wireless. The vast majority of cable companies now offer speeds much faster than 25 Mbps, and areas on the maps that are served by cable companies will show available speeds of at least 100 Mbps, and in many cases now show 1 Gbps.

The only way to explain the speed test results is that the FCC maps are wrong and the speeds in these areas are not really at the 25 Mbps level. That highlights one of the big fallacies in the FCC database, which is populated by the ISPs. The telcos are reporting speeds of ‘up to 25 Mbps’ and that’s likely what they are also marketing to customers in these areas. But in reality, much of the DSL is not capable of speeds close to that level.

The newspaper also gathered some anecdotal evidence. One of the areas that showed a big difference between FCC potential speed and actual speed is the town of Social Circle, located about 45 miles east of Atlanta. The newspaper contacted residents there who report that Internet speeds are glacial and nowhere near the 25 Mbps reported on the FCC maps. Several residents told the newspaper that the speeds are too slow to work from home – one of the major reasons that homes need faster broadband.

Unfortunately, there are real-life ramifications from the erroneous FCC maps. There have been several grant programs that could have provided assistance for an ISP to bring faster broadband to places like Social Circle – but those grants have been limited to places with speeds less than 25 Mbps – the FCC definition of broadband. Areas where the maps are wrong are doubly condemned – they are stuck with slow speeds but also locked out of grant programs that could help to upgrade the broadband. The only beneficiaries of the bad maps are the telcos who continue to sell inadequate DSL in towns like Social Circle where people have no alternative.

The State of Georgia has undertaken an effort to produce its own broadband maps in an attempt to accurately identify the rural broadband situation. The University of Georgia analyzed the FCC data, which shows there were 638,000 homes and businesses that couldn’t get Internet with speeds of at least 25 Mbps. The state mapping effort is going to tell a different story, and if the slow speeds indicated by the speed tests still hold today then there are going to be many more homes that actually don’t have broadband.

It seems like every examination of the FCC mapping data shows the same thing – widespread claimed broadband coverage that’s not really there. Every time the FCC tells the public that we’re making progress with rural broadband, they are basing their conclusions on maps they know are badly flawed. It’s likely that there are many millions more homes without broadband than the FCC claims – something they don’t want to acknowledge.

Continued Lobbying for White Space Spectrum

In May, Microsoft submitted a petition to the FCC calling for some specific changes that will improve the performance of white space spectrum used to provide rural broadband. Microsoft has now taken part in eleven white space trials and makes these recommendations based upon the real-life performance of the white space spectrum. Not included in this filing is Microsoft’s long-standing request for the FCC to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has long favored creating just one channel of unlicensed white space spectrum per market – depending on what’s available.

A number of other parties have subsequently filed comments in support of the Microsoft proposals, including the Wireless Internet Service Providers Association (WISPA), Next Century Cities, New America’s Open Technology Institute, Tribal Digital Village and the Gigabit Libraries Network. One of the primary entities opposed to earlier Microsoft proposals is the National Association of Broadcasters (NAB), which worries about interference with TV stations from white space broadband. However, the group now says that it can support some of the new Microsoft proposals.

As a reminder, white space spectrum consists of the unused blocks of spectrum that are located between the frequencies assigned to television stations. Years ago, at the advent of broadcast television, the FCC provided wide buffers between channels to reflect the capability of the transmission technology at the time. Folks my age might remember back to the 1950s when neighboring TV stations would bleed into each other as ghost signals. As radio technology has improved the buffers are now larger than needed and are larger than buffers between other blocks of spectrum. White space spectrum is using those wide buffers.

Microsoft has proposed the following:

  • They are asking for higher power limits for transmissions in cases where the spectrum sits two or more channels away from a TV station signal. Higher power means greater transmission distances from a given transmitter.
  • They are asking for a small power increase for white space channels that sit next to an existing TV signal.
  • They are asking for white space transmitters to be placed as high as 500 meters above ground (1,640 feet). In the US there are only 71 existing towers taller than 1,000 feet.
  • Microsoft has shown that white space spectrum has a lot of promise for supporting agricultural IoT sensors. They are asking the FCC to change the white space rules to allow narrowband transmissions for this purpose.
  • Microsoft is asking that the spectrum be allowed to support portable broadband devices used for applications like school buses, agricultural equipment and IoT for tracking livestock.

The last two requests highlight the complexity of FCC spectrum rules. Most people would probably assume that spectrum licenses allow for any possible use of the spectrum. Instead, the FCC specifically defines how spectrum can be used, and the rural white space spectrum is currently only allowed for use as a hot spot or for fixed point-to-point data using receiving antennas at a home or business. The FCC has to modify the rules to allow IoT uses for farm sensors, tractors and cows.

The various parties are asking the FCC to issue a Notice of Proposed Rulemaking to get comments on the Microsoft proposal. That’s when we’ll learn if any other major parties disagree with the Microsoft proposals. We already know that the cellular companies oppose providing multiple white space bands for anything other than cellular data, but these particular proposals are to allow the existing white space spectrum to operate more efficiently.

Is One Touch Make-Ready Really Faster?

The new federal rules for one-touch make ready (OTMR) finally went into effect on May 21, after having been passed by the FCC last November. For those not familiar with the term make-ready, this refers to any work that has to be done to a pole to make it ready to add a new wire, like a fiber cable. There are national safety standards that define the distance required between different kinds of wires and also clearance required from wires to the ground – and often existing poles can’t accommodate a new wire that meets all of the needed spacing. The make-ready that’s needed to get onto an existing pole often involves rearranging existing wires to create the needed clearance, or in drastic cases a replacement of an old pole with a taller pole.

The new OTMR rules apply only in the thirty states that follow FCC pole attachment rules. The FCC has strongly encouraged other states to implement something similar, but they are not mandated to do so. The new rules also don’t change the fact that poles owned by electric cooperatives and municipalities are exempt from federal pole attachment rules.

The new rules speed up the process of getting onto most poles – but as I’ve dug into the new rules, I’m not sure they are really going to drastically cut the timeline needed to build fiber on poles.

The most significant change in the rules is a new classification of poles as needing either simple or complex make-ready. The order defines how to make this classification. In real-life practice, the new attacher will suggest this classification, although the pole owner could overturn it.

There are streamlined new rules and timelines for completing the make-ready on simple poles. If the pole owner is unwilling to commit to fixing simple poles in the needed time frame, then the new attacher is allowed to make the changes after properly notifying the pole owner. The new attacher is free to rearrange any existing wires as needed, again after having properly notified all of the parties. These new rules eliminate situations where a pole owner refuses to cooperate with a new attacher, as happened in a few cities where AT&T fought Google Fiber. Something to consider is that the rules require using a make-ready contractor that has been pre-approved by the pole owner – but there are ways around this in some circumstances.

This sounds like a huge improvement in the pole attachment process because new fiber builders now have a guaranteed process for getting onto poles with simple make-ready. In most places, the majority of poles ought to be classified as simple. This isn’t true everywhere and we’ve seen cities where the majority of poles are crowded and might be classified as complex.

The problem that remains is complex poles. Those are poles where the make-ready could damage existing wires or where the old pole must be replaced. The make-ready process for complex poles has always been slow. The new rules tighten up the time frames a little, but getting onto a complex pole can still take a long time.

For complex poles the process will still allow the existing wire owners to work sequentially. That coordination has to be scheduled by the pole owner. The process could still take six months even if done perfectly. What’s troubling is that I still don’t see any easy resolution for when the pole owner or the existing attachers drag their feet on complex poles. Other than some slightly improved timelines, the work on complex poles looks to be just as dreadful as it is today.

What does this mean for aerial construction? Consider a long run of 30 poles where 2 of the poles require complex make-ready. The new attacher can get the make-ready done on the 28 simple poles more quickly than in the past. Those simple poles might be ready to hang the new fiber within 60 days. But new fiber still can’t be hung on this route until all 30 poles are ready.

A new fiber builder still faces the same bad choices they have today. They can wait six months or more for the complex make-ready to be completed. If the complex work bogs down the new attacher faces the prospect of going to the state regulatory commission for help – something that can add an additional six months. The only other alternative is to bury around the complex poles – something that can add a lot of cost, especially when there is rocky soil.
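
A back-of-the-envelope way to see the problem, using illustrative timelines (60 days for simple poles and 180 days for complex ones – my own assumptions, not numbers from the FCC order):

    # The route is only ready when its slowest pole is ready.
    simple_pole_days = 60     # assumed streamlined OTMR timeline
    complex_pole_days = 180   # assumed complex make-ready timeline

    poles = [simple_pole_days] * 28 + [complex_pole_days] * 2
    print(f"First poles ready in {min(poles)} days; "
          f"fiber can be hung only after {max(poles)} days.")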

The one-touch make-ready rules would be awesome if networks were composed mostly of simple poles. A fiber overbuilder could have fiber on poles within a few months of starting a project. However, the reality is that there are many poles in the world that won’t be classified as simple. Many urban poles are too short to add another wire and have to be replaced with taller poles. Poles at busy intersections can already hold a maze of wires. Some poles today carry other impediments, like small cell sites, that are going to make it harder to add fiber.

We’re going to have to see these new rules in practice before we can conclude that one-touch make-ready provides a major benefit. The FCC’s motives for OTMR are good and they are trying to favor easier fiber construction. We’re just going to have to wait to see if the new rules make any actual difference with the overall timeline for aerial construction.

The Penn State Broadband Study

Penn State conducted an intensive study of broadband in rural Pennsylvania. The study was funded by the Center for Rural Pennsylvania, a legislative agency of the Pennsylvania General Assembly. The results will surprise nobody who works with rural broadband: the study concluded that actual broadband speeds are significantly slower than the speeds the ISPs report to the FCC.

The study concluded that there was not one rural county in the state where more than 50% of residents actually achieve the 25/3 Mbps that the FCC has defined as broadband. The study came to these conclusions by conducting more than 11 million speed tests. Residents voluntarily provided an additional 15 million speed test results.

These results are similar to what’s been reported by Microsoft – the company measures the actual speeds at which millions of customers download Microsoft software every month. Microsoft says such tests are the best measure of real broadband speeds and that roughly half of all broadband connections in the country run at speeds slower than the definition of broadband.

Some of the Penn State results are dramatic. For example, in Westmoreland County the FCC maps show the whole county has access to 25/3 Mbps broadband and yet the average download speed for the county was only 12.3 Mbps. Allegheny County also shows 100% broadband coverage on the FCC maps and yet the average download speed in the County is only 20 Mbps.

The study further showed that the difference between actual and reported speeds has been widening since 2014. That likely means the FCC maps are showing improvements that aren’t really happening in the rural networks.

I have to point out, in the FCC’s favor, that households don’t always buy faster broadband when it’s available – many households continue to purchase older, slower DSL to save money. However, this phenomenon can’t come close to explaining the results in Westmoreland County, where the actual speeds are only 12 Mbps – half the FCC’s definition of broadband. A more likely explanation is that the maps for the county show broadband available in rural areas where actual DSL speeds are only a few Mbps.

CCG helps our clients conduct similar tests on a smaller scale and we’ve seen similar results all across the country. The FCC maps are often pure fantasy. We routinely find rural areas that supposedly have fast broadband where there is no broadband. We often study county seats that supposedly have fast data speeds and yet where actual speed tests show something far slower. The speeds on the FCC maps come from data that is self-reported by ISPs, and some of the ISPs clearly have reasons to overreport the available speeds.

What is really irksome is that the FCC knows all of this already. They know that ISP-reported broadband speeds are overstated, and yet the FCC compiles the faulty data and makes policy decisions based upon garbage data. The FCC recently published its 2019 Broadband Deployment Report, which concluded that broadband is being deployed in the US on a reasonable and timely basis. In my opinion, that conclusion borders on fraud since the FCC knows that much of the data used to reach it is wrong. The real broadband situation in rural America is much more like what is being reported by Penn State and Microsoft. Rural residents in places like Allegheny County, Pennsylvania should be incensed that the FCC is telling the world that their broadband is up to snuff.

The FCC is starting a multi-year process to ‘improve’ the broadband maps – but this will just push the problem a few years into the future. The fact is that it’s almost impossible to map real broadband speeds in rural America. How can you map broadband speeds when real networks in rural America are in lousy shape? How can you map broadband speeds when two neighbors can experience drastically different broadband speeds due to the nuances in their copper wires? The big telcos have neglected maintenance on copper networks for decades and it’s no surprise that broadband speeds vary widely even within a neighborhood.

The best solution is to throw the maps away. The fact is that every place served by copper ought to be considered as underserved, and locations more than a few miles from a DSLAM ought to be considered as unserved. We need to stop pretending that we can somehow make a realistic map of broadband speed availability – the proposed new mapping might be a little better, but it can never be accurate. Every ISP technician that works in the field will tell you how ridiculous it is to try to map rural broadband speeds.

We need to face facts and recognize that we’re going to have these same issues until rural America gets fiber. There are now enough places in rural America with fiber to show it can be done. The FCC’s ACAM program has shown that fiber can work if there are subsidies to help with the construction costs. We’ve understood this ever since we built the rural electric grids. But we probably can’t fix the problem until we’re honest about the scope of poor broadband. I have big doubts that this FCC is ever going to acknowledge that the real state of broadband is the one highlighted by this study.

Will Congress Be Forced to Re-regulate Broadband?

Last year the current FCC largely deregulated broadband. They killed Title II regulation and also handed off any remaining vestiges of broadband regulation to the Federal Trade Commission. The FCC is still left with a few tasks associated with broadband. For instance, they still have to track broadband adoption rates. They are still required to try to solve the rural digital divide. They still approve electronics used to provide broadband. But this FCC has killed its own authority to make ISPs change their behavior.

I wrote a blog a month ago talking about the regulatory pendulum. Industries that become dominated by monopolies are always eventually regulated in some manner – governments either prescribe operating rules or else break up monopolies using antitrust laws. One only has to look at the conversation going on in Washington (and around the world) about somehow regulating Facebook, Google and other big web platforms to see that this is inevitable. Big monopolies always grow to trample consumers and eventually the public demands that monopoly abuses be curbed.

It’s only been a little over a year since the FCC deregulated broadband and there are already topics looming that beg for regulation. There is nothing to stop this FCC or a future FCC from reintroducing regulation – the courts already gave approval for regulating using Title II. Regulation can also come from Congress – which is the preferred path to stop the wild swings every time there’s a new administration. Even the ISPs would rather be regulated by Congress than to bounce back and forth between FCCs with differing philosophies.

Over half of the states have introduced bills that seek to regulate data privacy. Consumers are tired of data breaches and tired of having their personal information secretly peddled to the highest bidder. A year ago the California legislature passed data rules that largely mimic what’s being done in Europe. The Maine legislature just passed rules that are even more stringent than California in some ways.

It’s going to be incredibly expensive and complicated for web companies to try to comply with rules that differ by state. Web companies are in favor of one set of federal privacy rules – the big companies are already complying with European Union rules and they’ve accepted that providing some privacy to consumers is the cost of doing business. Privacy rules need to apply to ISPs as much as they do to the big web companies. Large ISPs are busy gathering and selling customer data in the same manner as web companies. Cellular companies are gathering and selling huge amounts of customer data.

There are other regulatory issues that are also looming. It seems obvious that if the administration and the Senate turn Democratic, one of their priorities will be to reimplement net neutrality. The ISPs are already starting to quietly violate net neutrality rules. They are first tackling things that customers like, such as sponsored video as part of a cellular plan – but over time you can expect the worst kinds of abuses that were the reasons behind the net neutrality rules in the first place.

I think that broadband prices are going to become a major issue. The big ISPs have all acknowledged that one of the few tools they have to maintain earnings growth is to raise broadband prices. Cord cutting is accelerating, and in the first quarter the ISPs lost traditional cable customers at an annualized rate of 6%. Cord cutting looks like it’s going to go much faster than the industry anticipated as millions of customers bail on traditional cable each quarter. The pressure to raise broadband rates is growing.

We’ve already seen the start of broadband price increases. Over the last few years the ISPs have been raising rates around the edges, such as increasing the monthly price for a broadband modem. More recently we’ve seen direct broadband price increases, such as the $5 rate increase for bundled broadband by Charter. We’re seeing Comcast and other ISPs start billing people for crossing data caps. Most recently, we know that several ISPs are talking about significantly curtailing special rates and discounts for customers – eliminating those discounts probably equates to a 10% – 15% rate increase.

At some point, the FCC will have to deal with rising broadband rates. Higher broadband rates will increase the digital divide as households get priced out from affording broadband. The public will put a lot of pressure on politicians to do something about ISP prices.

Deregulating broadband at a time when a handful of ISPs have the vast majority of broadband customers was one of the most bizarre regulatory decisions I’ve ever seen. All monopolies, regardless of industry, need to be regulated – we’ve known this for over a hundred years. It’s just a matter of time before Congress is forced to step up and re-regulate broadband. It may not be tomorrow, but I find it highly unlikely that broadband will still be deregulated a decade from now, and I expect re-regulation much sooner.

Is the FCC Really Solving the Digital Divide?

The FCC recently released the 2019 Broadband Deployment Report, with the subtitle: Digital Divide Narrowing Substantially. Chairman Pai is highlighting several facts that he says demonstrate that more households now have access to fast broadband. The report highlights rural fiber projects and other efforts that are closing the digital divide. The FCC concludes that broadband is being deployed on a reasonable and timely basis – a determination they are required to make every year by Congressional mandate. If the FCC ever concludes that broadband is not being deployed fast enough, they are required by law to rectify the situation.

To give the FCC some credit, there is a substantial amount of rural fiber being constructed – mostly from the ACAM funds being provided to small telephone companies, with some other fiber being deployed via rural broadband grants. To provide an example, two years ago Otter Tail County, Minnesota had no fiber-to-the-premises. Since then the northern half of the county has seen fiber deployed by several telephone companies. This kind of fiber expansion is great news for rural counties, but counties like Otter Tail are now wondering how to upgrade the rest of the county.

Unfortunately, this FCC has zero credibility on the issue. The 2018 Broadband Deployment Report reached the same conclusion, but it turned out that there was a huge reporting error in the data supporting that report: the ISP Barrier Free had erroneously reported that it had deployed fiber to 62 million residents in New York. Even after the FCC recently corrected for that huge error, they still kept the original conclusion. That raises a question about what defines ‘reasonable and timely deployment of broadband’ if having fiber to 62 million fewer people doesn’t change the answer.

Anybody who works with rural broadband knows that the FCC databases are full of holes. The FCC statistics come from the data that ISPs report to the FCC each year about their broadband deployment. In many cases, ISPs exaggerate broadband speeds and report marketing speeds instead of actual speeds. The reporting system also contains a huge logical flaw in that if a census block has only one customer with fast broadband, the whole census block is assumed to have that speed.
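
A tiny sketch of that census-block flaw, using made-up numbers: when coverage is reported as the best speed available to any location in the block, one fast connection makes the whole block look served.

    # Hypothetical block: one fiber drop and four slow DSL lines.
    block_locations_mbps = [3, 4, 5, 6, 100]

    reported_block_speed = max(block_locations_mbps)  # what block-level reporting shows
    share_at_broadband = sum(s >= 25 for s in block_locations_mbps) / len(block_locations_mbps)

    print(f"Reported speed for the block: {reported_block_speed} Mbps")
    print(f"Locations actually at 25 Mbps or better: {share_at_broadband:.0%}")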

I work with numerous rural counties where broadband is still largely non-existent outside of the county seat, and yet the FCC maps routinely show swaths of broadband availability in many rural counties where it doesn’t exist.

Researchers at Penn State recently looked at broadband coverage across rural Pennsylvania and found that the FCC maps grossly overstate the availability of broadband for huge parts of the state. Anybody who has followed the history of broadband in Pennsylvania already understands this. Years ago, Verizon reneged on a deal to introduce DSL everywhere – a promise made in exchange for becoming deregulated. Verizon ended up ignoring most of the rural parts of the state.

Microsoft has blown an even bigger hole in the FCC claims. Microsoft is in an interesting position in that customers in every corner of the country ask for online upgrades for Windows and Microsoft Office. Microsoft is able to measure the actual speed of customer downloads for tens of millions of upgrades every quarter. Microsoft reports that almost half of all downloads of its software are done at speeds slower than the FCC’s definition of broadband of 25/3 Mbps. Measuring a big download is the ultimate test of broadband speeds since ISPs often boost download speeds for the first minute or two to give the impression they have fast broadband (and to fool speed tests). Longer downloads show the real speeds. Admittedly, some of Microsoft’s findings are due to households that subscribe to slower broadband to save money, but the Microsoft data still shows that a huge number of ISP connections underperform. The Microsoft figures are also understated since they don’t include the many millions of households that can’t download software because they have no access to home broadband.
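
The measurement idea is simple to sketch: time a long download in fixed windows so that any ‘speed boost’ at the start shows up as a fast first window followed by slower ones. This is a minimal Python illustration, not Microsoft’s actual telemetry; the URL is a placeholder and must be replaced with a large file you are entitled to download.

    import time
    import requests  # third-party package: pip install requests

    URL = "https://example.com/large-file.bin"  # placeholder, not a real test file

    def report_throughput(url: str, window_seconds: float = 30.0) -> None:
        """Print throughput for each window of a long, streamed download."""
        window_start, window_bytes = time.time(), 0
        with requests.get(url, stream=True, timeout=30) as resp:
            resp.raise_for_status()
            for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MB chunks
                window_bytes += len(chunk)
                elapsed = time.time() - window_start
                if elapsed >= window_seconds:
                    print(f"{window_bytes * 8 / elapsed / 1e6:.1f} Mbps")
                    window_start, window_bytes = time.time(), 0

    # report_throughput(URL)  # uncomment after pointing URL at a real large file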

The FCC is voting this week to undertake a new mapping program to better define real broadband speeds. I’m guessing that effort will take at least a few years, giving the FCC more time to hide behind bad data. Even with a new mapping process, the data is still going to have many problems if it’s self-reported by the ISPs. I’m sure any new mapping effort will be an improvement, but I don’t hold out any hopes that the FCC will interpret better data to mean that broadband deployment is lagging.

How’s CAF II Doing in Your County?

The CAF II program was tasked with bringing broadband of at least 10/1 Mbps to large parts of the country. I’ve been talking to folks in rural counties all over the country who don’t think that their area has seen much improvement from the CAF II plan.

The good news is that there is a way to monitor what the big telcos are reporting to the FCC in terms of areas that have seen the CAF II upgrades. This web site provides a map that reports progress on several different FCC broadband plans. The map covers reported progress for the following programs:

  • CAF II – This was the $11 billion subsidy to big telcos to improve rural broadband to at least 10/1 Mbps.
  • CAF II BLS – This was Broadband Loop support that was made available to small telcos. Not entirely sure why the FCC is tracking this using a map.
  • ACAM – This is a subsidy given to smaller telcos to improve broadband to at least 25/3 Mbps, but which many are using to build gigabit fiber.
  • The Alaska Plan – This is the Alaska version of ACAM. Alaska is extremely high cost and has a separate broadband subsidy plan.
  • RBE – These are the Experimental Broadband Grants from 2015.

Participants in each of these programs must report GIS data for locations that have been upgraded, and those upgraded sites are then shown on the map at this site. There is, of course, some delay between the time upgrades are completed and the information showing up on the map. We are now 4.5 years into the six-year CAF II plan, and the carriers have told the FCC that many of the required upgrades are completed. All CAF II upgrades must be finished by the end of 2020 – and most will likely be completed next year during the summer construction season that dictates construction schedules in much of the country.

The map is easy to use. For example, if you change the ‘Fund’ box at the upper right of the map to CAF II, then all of the areas that were supposed to get CAF II upgrades are shown in light purple. In these areas, the big telcos were supposed to upgrade every residence and business to be able to receive 10/1 Mbps or better broadband.

The map allows you to drill down into more specific detail. For example, if you want to see how CenturyLink performed on CAF II, then choose CenturyLink in the ‘Company Name’ box. This will place a pin on the map for all of the locations that CenturyLink has reported as complete. As you zoom in on the map the upgraded locations will show as dark purple dots. You can zoom in on the map to the point of seeing many local road names.

The map has an additional feature that many will want to see. Down on the left bottom of the map under ‘Boundaries’ you can set political boundaries like County borders.

Most counties are really interested in the information shown on the map. The map shows the areas that were supposed to see upgrades along with the areas that have been upgraded to date. This information is vital to counties for a number of reasons. For example, new federal grants and most state grant programs rely on this data to determine if an area is eligible for additional funding. For instance, the current $600 million Re-Connect grants can’t be used for areas where more than 10% of homes already have 10/1 Mbps broadband. Any areas on this map that have the purple dots will probably have a hard time qualifying for these grants. The big telcos will likely try to disqualify any grant requests that propose to build where they say they have upgraded.
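
As a sketch of how a county might use that data, the check against the Re-Connect threshold boils down to the share of locations in a proposed grant area that are reported as already upgraded. The numbers and column name below are made up and hypothetical; in practice the location data would come from the map data plus your own field verification.

    import pandas as pd

    # Hypothetical per-location data for a proposed grant area.
    locations = pd.DataFrame({
        "location_id": range(1, 11),
        "caf2_upgrade_reported": [True, False, False, False, False,
                                  False, False, False, False, False],
    })

    served_share = locations["caf2_upgrade_reported"].mean()
    print(f"Share of locations reported as upgraded to 10/1 Mbps: {served_share:.0%}")
    print("Likely eligible for Re-Connect" if served_share <= 0.10
          else "Likely blocked by the 10% rule")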

Probably the most important use of the map is as a starting point for counties to gather accurate data about broadband. For example, you might want to talk to folks that live in the upgraded areas to see if they can really now buy 10/1 Mbps DSL. My guess is that many of the areas shown on these maps as having CAF II upgrades are still going to have download speeds less than 10/1 Mbps. If you find that to be the case I recommend documenting your findings because areas that didn’t get a full upgrade should be eligible for future grant funding.

It’s common knowledge that rural copper has been ignored for decades, often with no routine maintenance. It’s not surprising to anybody who has worked in a DSL environment that many rural lines are incapable of carrying faster DSL. It’s not easy for a big telco to bring 10/1 Mbps broadband over bad copper lines, but unfortunately, it’s easy for them to tell the FCC that the upgrades have been done, even if the speed is not really there.

This map is just one more piece of the puzzle and one more tool for rural counties to use to understand their current broadband situation. For example, it’s definitely a plus if the big telcos really upgraded DSL in these areas to at least 10/1 Mbps – many of these areas had no DSL or incredibly slow DSL before. On the flip side, if the big telcos are exaggerating about these upgrades and the speeds aren’t there, they are going to likely block your region from getting future grant money to upgrade to real broadband. The big telcos have every incentive to lie to protect their DSL and telephone revenues in these remote areas. What’s not tolerable is for the big telcos to use incorrect mapping data to deny homes from getting better broadband.

Should Rural Fiber be a Utility?

I’ve heard or read half a dozen people in the last month say that the way we get rural fiber everywhere is to make fiber a regulated utility. This idea certainly has appeal for the many rural places that don’t have fiber today. On the surface this sounds like a way to possibly get fiber everywhere, and it’s hard to see a downside to that.

However, I can think of a number of hurdles and roadblocks to this concept that might be hard to overcome. This blog is too short to properly explore most of these ideas and it would require a 40-page whitepaper to do this topic justice. With that caveat, here are some of the big issues to be solved if we wanted to create rural fiber utilities.

What About Existing Fiber? What would we do about all of those who have already built rural fiber? There are small telcos, cooperatives, and rural communities that have already acted and found a way to fund a rural fiber network. Would we force someone who has already taken the commercial risk to convert those existing fiber properties into a utility? Most small companies that have built rural fiber took on a huge debt burden to do so. Rural communities that have built fiber likely put tax revenues on the line to do so. It seems unfair to force those who already had the vision to tackle this to transform into a regulated utility.

What About Choice? One of the most important goals of almost every community I have worked with is to have broadband choice. One of the key aspects of a fiber utility is that it will almost certainly be a monopoly. Are we going to kick out WISPs in favor of a fiber utility? Would a fiber monopoly be able to block satellite broadband?

The Definition of Rural. What areas are eligible to be part of a regulated fiber utility? If eligibility is defined by customer density, then we could end up with farms with fiber and county seats without it. There’s also the more global consideration that most urban areas don’t have fiber today. Do we ask cities that don’t have fiber to help subsidize rural broadband? It’s impractical to think that you could force city networks to become a utility, because that would effectively confiscate networks from the big cable companies.

Who Pays for It? Building fiber in rural America would probably require low-interest loans from the government for the initial construction – we did this before when we built rural electric grids, so this can be made to work. But what about keeping fiber utilities solvent for the long run? The rural telephone network functioned so well because revenues from urban customers were used to subsidize service in rural places. When the big telcos were deregulated the first thing they did was to stop the internal subsidies. Who would pay to keep fiber networks running in rural America? Would urban ISPs have to help pay for rural broadband? Alternatively, might this require a tax on urban broadband customers to subsidize rural broadband customers?

Who Operates It? This might be the stickiest question of all. Do we hand utility authority to local governments, even those that are reluctant to take on the responsibility? Would people favor a fiber utility if the government handed over the operations to AT&T, Verizon, CenturyLink or Frontier? What do we do about cooperatives where the customers want to own their fiber network? Do we force existing fiber owners to somehow sell or give their networks to a new utility?

What About Carrier of Last Resort? One of the premises of being a utility is the idea that everybody in the monopoly service area can get service. Would we force fiber utilities to serve everybody? What about a customer who is so remote that it takes hundreds of thousands of dollars of construction to reach them? Who gets to decide who gets service? Does a fiber utility have to build to reach every new home?

What About Innovation? Technology never sits still. How do we force fiber utilities to upgrade over time to stay current and relevant? Upgrading infrastructure is an expensive problem for existing utilities – as I found out recently when a water problem uncovered the fact that my local water utility still has some of the original main feeder pipes built out of wood. The common wisdom is that fiber will last a long time – but who pays to replace it eventually, like we are now doing with the wooden water pipes? And what about electronics upgrades that happen far more often?

Government’s Role. None of this can be done without strong rules set and enforced by the government. For example, the long-term funding mechanisms can only be created by the government. This almost certainly would require a new telecom act from Congress. Considering how lobbyists can sideline almost any legislative effort, is it even possible to create a fiber utility that would work? Fiber utilities would also require a strong FCC that agrees to take back broadband regulation and enforce it.

Summary. I’ve only described a partial list of the hurdles faced in creating rural fiber utilities. There is no issue on this list that can’t be solved – but collectively they create huge hurdles. My biggest fear is that politics and lobbying would intervene, and we’d do it poorly. I suspect that similar hurdles faced those who created the rural electric and telephone companies – and they found a way to get it done. But done poorly, fiber utilities could be a disaster.

Summary Conclusions for Designing an FCC Broadband Grant

The earlier series of blogs looked at a number of ideas on how the FCC could create the most effective federal grant program for the upcoming $20.4 billion of announced grants. Following is a summary of the most important conclusions of those blogs:

Have a Clearly Defined Goal. If a federal grant program’s goal is something soft, like ‘improve rural broadband’, then the program is doomed to failure and will fund solutions that only incrementally improve broadband. The grant program should have a bold goal, such as bringing a permanent broadband solution to a significant number of households. For example, done well, this grant could bring fiber to 4 – 5 million homes rather than make incremental broadband improvements everywhere.
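
As a quick back-of-the-envelope check (my own arithmetic, not a figure from the FCC), here is what the announced $20.4 billion would imply per home if it reached 4 – 5 million homes:

    # Implied grant dollars per home passed, assuming every dollar goes to construction.
    total_grant = 20.4e9
    for homes in (4_000_000, 5_000_000):
        print(f"{homes/1e6:.0f} million homes -> ${total_grant/homes:,.0f} of grant per home")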

Match the Grant Process with the Grant Goals. Past federal grants have often had grant application rules that didn’t match the goals. Since the results of grants are governed by the application rules, those are all that matter. Stated goals for a grant are just rhetoric if those goals are not realized in the grant application requirements. As an example, if a grant goal is to favor the fastest broadband possible, then all grant application rules should be weighted towards that goal.

Match Speed Requirement with the Grant Construction Period. The discussion for the proposed $20.4 billion grant contemplates a minimum speed goal of 25/3 Mbps. That’s a DSL speed and is already becoming obsolete today. A goal of 25/3 Mbps will be badly outdated by the time any grant-funded networks are built. The FCC should not repeat their worst decision ever that gave out $11 billion for CAF II funding to build 10/1 Mbps networks – a speed that was obsolete even before the grants were awarded. The FCC should be requiring future-looking speeds.

Make the Grants Available to Everybody. FCC grant and loan programs often include a statement that they are available to every kind of entity. Yet the actual award process often discriminates against some kinds of applicants. For example, grants that include a loan component make it generally impossible for most municipal entities to accept the awards. Loan rules can also eliminate non-RUS borrowers. Grant rules that require recipients to become Eligible Telecommunications Carriers – a regulatory designation – discriminate against open access networks where the network owner and the ISP are separate entities. If not written carefully, grant rules can discriminate against broadband partnerships where the network owner is a different entity than the operating ISP.

Reverse Auction is not a Good Fit. Reverse auctions are a good technique to use when taking bids for some specific asset. Reverse auctions won’t work well when the awarded area is the whole US. Since reverse auctions favor those who will take the lowest amount of funding a reverse auction will, by definition, favor lower-cost technologies. A reverse auction will also favor parts of the country with lower costs and will discriminate against the high-cost places that need broadband help the most, like Appalachia. A reverse auction also favors upgrades over new construction and would favor upgrading DSL over building faster new technologies. From a political perspective, a reverse auction won’t spread the awards geographically and could favor one region, one technology or even only a few grant applicants. Once the auction is started the FCC would have zero input over who wins the funds – something that would not sit well with Congress.
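
To see why the lowest-cost bids dominate, here is a toy single-round reverse auction in Python – a deliberate simplification with made-up bids, not the FCC’s actual multi-round auction format: bids are ranked by subsidy requested per location and funded until the budget runs out.

    # Toy reverse auction: fund the cheapest subsidy-per-location bids first.
    bids = [
        {"area": "flat plains, DSL upgrade",    "locations": 20_000, "subsidy": 30_000_000},
        {"area": "flat plains, fixed wireless", "locations": 15_000, "subsidy": 35_000_000},
        {"area": "Appalachia, fiber",           "locations": 10_000, "subsidy": 80_000_000},
    ]
    budget = 70_000_000

    for bid in sorted(bids, key=lambda b: b["subsidy"] / b["locations"]):
        if bid["subsidy"] <= budget:
            budget -= bid["subsidy"]
            print(f"funded:     {bid['area']}")
        else:
            print(f"not funded: {bid['area']}")

With illustrative numbers like these, the high-cost fiber bid for Appalachia is the first one left unfunded, which is exactly the concern described above.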

Technology Matters. The grants should not be awarded to technologies that are temporary broadband band-aids. For example, if the grants are used to upgrade rural DSL or to provide fixed cellular broadband, then the areas receiving the grants will be back at the FCC in the future asking for something better. It’s hard to justify any reason for giving grants to satellite providers.

States Need to Step Up. The magnitude of the proposed federal grant program provides a huge opportunity for states. Those states that increase state grant funding should attract more federal grants to their state. State grants can also influence the federal awards by favoring faster speeds or faster technologies.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.