Ideas for Better Broadband Mapping

The FCC is soliciting ideas on better ways to map broadband coverage. Everybody agrees that the current broadband maps are dreadful and misrepresent broadband availability. The current maps are created from data that the FCC collects from ISPs on Form 477, where each ISP lists broadband coverage by census block. One of the many problems with the current mapping process (I won’t list them all) is that census blocks can cover a large geographic area in rural America, and reporting at the census block level tends to blur together different circumstances where some folks have broadband and others have none.

There have been two interesting proposals so far. Several parties have suggested that the FCC gather broadband speed availability by address. That sounds like the ultimate database, but there are numerous reasons why this is not practical.

The other recommendation is a three-step process recommended by NCTA. First, data would be collected as polygon shapefiles. I’m not entirely sure what that means, but I assume it means using smaller geographic footprints than census blocks. Collecting the same data as today using a smaller footprint ought to be more accurate. Second, and the best idea I’ve heard suggested, would be to allow people to challenge the data in the mapping database. I’ve been suggesting that for several years. Third, NCTA wants to focus on pinpointing unserved areas. I’m not sure what that means either, but perhaps it means creating shapefiles that match the different availability of speeds.
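
A polygon shapefile is essentially a list of coordinate vertices outlining a coverage area, and a home counts as served only if it sits inside some polygon. As a rough sketch of how that test works (the names, coordinates, and data here are hypothetical, not from NCTA’s proposal), a standard ray-casting check might look like this:

```python
# Illustrative sketch: a coverage polygon is a list of (x, y) vertices,
# and a home is "served" only if its coordinates fall inside the polygon.

def point_in_polygon(x, y, polygon):
    """Standard ray-casting test: count how many polygon edges a
    horizontal ray from (x, y) crosses; an odd count means inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray's height
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# A small coverage polygon inside a much larger census block.
coverage_polygon = [(0, 0), (4, 0), (4, 4), (0, 4)]

homes = {"near_road": (2, 2), "back_of_block": (9, 9)}

for name, (x, y) in homes.items():
    served = point_in_polygon(x, y, coverage_polygon)
    print(name, "served" if served else "unserved")
```

A finer-grained shapefile just means smaller polygons, so the inside/outside answer tracks reality more closely than a census-block-sized yes/no.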

These ideas might provide better broadband maps than we have today, but I’m guessing they will still have big problems. The biggest issue with trying to map broadband speeds is that many of the broadband technologies in use vary widely in actual performance in the field.

  • Consider DSL. We’ve always known that DSL performance decreases with distance from the DSLAM that serves a customer. However, DSL performance is not as simple as that. DSL also varies for other reasons, like the gauge of the copper serving a customer or the quality of that copper. Next-door neighbors can have a significantly different DSL experience if they have different-sized wires in their copper drops, or if the wires at one of the homes have degraded over time. DSL also differs by technology. A telco might operate different DSL technologies out of the same central office and see different performance from ADSL versus VDSL. There really is no way for a telco to predict the DSL speed available at a home without installing the service and testing the actual speed achieved.
  • Fixed wireless and fixed cellular broadband have similar issues. Just like DSL, the strength of a signal from a wireless transmitter decreases with distance. However, distance isn’t the only issue, and things like foliage affect a wireless signal. Neighbors might have a very different fixed wireless experience if one has a maple tree and the other has a pine tree in the front yard. To make defining a speed even harder, the speeds on wireless systems are affected to some degree by precipitation, humidity, and temperature. Anybody who’s ever lived with fixed wireless broadband understands this variability. WISPs these days also use multiple spectrum blocks, so the speed delivered at any given time is a function of the particular mix of spectrum being used.

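The variability described above can be caricatured with a toy model. To be clear, this is purely my own illustrative assumption, not a real propagation or loop-loss formula: a technology ceiling, degraded by distance from the node, then scaled by a per-home factor standing in for copper gauge, wire condition, or foliage:

```python
# Toy model (hypothetical numbers throughout): delivered speed starts at
# a technology ceiling, falls off with distance, then gets scaled by a
# per-home multiplier for local conditions (copper gauge, a tree, etc.).

def estimated_speed_mbps(ceiling_mbps, distance_km, loss_per_km, home_factor):
    """Hypothetical falloff model: exponential decay with distance,
    scaled by a per-home condition factor between 0 and 1."""
    speed = ceiling_mbps * (1 - loss_per_km) ** distance_km
    return round(speed * home_factor, 1)

# Two neighbors at the same distance from the node, with different drops:
# one has fresh wiring, the other thin or degraded copper.
neighbor_a = estimated_speed_mbps(50, 2.0, 0.25, home_factor=0.9)
neighbor_b = estimated_speed_mbps(50, 2.0, 0.25, home_factor=0.5)
print(neighbor_a, neighbor_b)
```

The point of the sketch is that the per-home factor is unknowable from a map; the only way to pin it down is to install service and measure.
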
Regardless of the technology being used, one of the biggest issues affecting broadband speeds is the customer home. Customers (or ISPs) might be using outdated, obsolete WiFi routers or modems (as Charter did for many years in upstate New York). DSL speeds are just as affected by the condition of the inside copper wiring as by the outdoor wiring. The edge devices in the home can also be an issue – when Google Fiber first offered gigabit fiber in Kansas City, almost nobody owned a computer capable of handling that much speed.

Any way we try to define broadband speeds – even by individual home – is still going to be inaccurate. Trying to map broadband speeds is a perfect example of trying to fit a square peg in a round hole. It’s obvious that we can do a better job of this than we are doing today. I pity a fixed wireless ISP that is somehow required to report broadband speeds by address, or even by a small polygon. A WISP only knows the speed at a given address after going to the roof of a home and measuring it.

The more fundamental issue here is that we want to use the maps for two different policy purposes. One goal is to be able to count the number of households that have broadband available. The improved mapping ideas will improve this counting function – within all of the limitations of the technologies I described above.

But mapping is a dreadful tool when we use it to start drawing lines on a map defining which households can get grant money to improve their broadband. At that point the mapping is no longer a theoretical exercise and a poorly drawn line will block homes from getting better broadband. None of the mapping ideas will really fix this problem and we need to stop using maps when awarding grants. It’s so much easier to decide that faster technology is better than slower technology. For example, grant money ought to be available for anybody that wants to replace DSL on copper with fiber. I don’t need a map to know that is a good idea. The grant process can use other ways to prioritize areas with low customer density without relying on crappy broadband maps.

We need to use maps only for what they are good for – to get an idea of what is available in a given area. Mapping is never going to be accurate enough to use to decide which customers can or cannot get better broadband.

How Bad is the Digital Divide?

The FCC says that approximately 25 million Americans living in rural areas don’t have access to an ISP product that would be considered broadband – currently defined as 25/3 Mbps. That number comes from the FCC’s mapping efforts using data supplied by ISPs.

Microsoft tells a different story. They say that as many as 163 million Americans do not use the Internet at speeds the FCC considers broadband. Microsoft might be in the best position of anybody in the industry to understand actual broadband performance, because the company can see data speeds for every customer that updates Windows or Microsoft Office – a huge percentage of all computer users, spread across every corner of the country.

Downloading a big software update is probably one of the best ways possible to measure actual broadband performance. Software updates tend to be large files, and the Microsoft servers will transmit them at the fastest speed a customer’s connection can accept. Because the files are large, Microsoft gets to see real ISP performance – not just the performance for the first minute of a download. Many ISPs use a burst technology that downloads relatively fast for the first minute or so, but then slows for the rest of the download – a customer’s true broadband speed is the one that kicks in after the burst is finished. The burst technology has a side benefit for ISPs in that it inflates performance on standard speed tests – but Microsoft gets to see the real story.
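
The burst effect is easy to see with a little arithmetic. In this sketch (the plan numbers are hypothetical), a speed test that finishes inside the first minute sees only the burst rate, while a multi-gigabyte update sees something close to the sustained rate:

```python
# Sketch of why a short speed test overstates a "burst" plan: the link
# runs fast for the first minute, then drops to the sustained rate.
# All speeds and file sizes below are made up for illustration.

def effective_mbps(file_mb, burst_mbps, sustained_mbps, burst_seconds=60):
    """Average throughput for a whole download, burst period included."""
    burst_mb = burst_mbps / 8 * burst_seconds          # MB moved during the burst
    if file_mb <= burst_mb:
        return burst_mbps                              # finished inside the burst
    remaining_seconds = (file_mb - burst_mb) / (sustained_mbps / 8)
    total_seconds = burst_seconds + remaining_seconds
    return round(file_mb * 8 / total_seconds, 1)

print(effective_mbps(file_mb=100, burst_mbps=50, sustained_mbps=10))   # small speed-test file
print(effective_mbps(file_mb=4000, burst_mbps=50, sustained_mbps=10))  # large software update
```

The small file reports the full burst speed; the big update averages out near the sustained rate, which is what a Microsoft-sized download actually observes.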

I’ve ranted about the FCC’s broadband statistics many times. There are numerous reasons why the FCC data is bad in rural America. Foremost, the data is self-reported by the big ISPs, who have no incentive to tell the FCC or the public how poorly they are doing. It’s also virtually impossible to accurately report DSL speeds that vary from customer to customer according to the condition of specific copper wires and the distance from the DSLAM. We also know that much of the reporting to the FCC represents marketing speeds, or ‘up-to’ speeds, that don’t reflect what customers really receive. Even the manner of reporting to the FCC, by census block, distorts the results, because when a few customers in a block get fast speeds the FCC assumes that everyone does.
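
The census-block distortion can be sketched in a few lines (the speeds below are made up): when the map records the best speed claimed anywhere in a block, a couple of fast addresses make the whole block look served:

```python
# Hypothetical census block: a handful of fast homes near the road,
# the rest on slow DSL or nothing at all.
block_homes_mbps = [100, 100, 3, 3, 3, 3, 0, 0]    # actual speed per home

reported_block_speed = max(block_homes_mbps)        # what the map shows
homes_at_reported = sum(1 for s in block_homes_mbps if s >= reported_block_speed)

print(f"map says the block has {reported_block_speed} Mbps")
print(f"homes actually getting it: {homes_at_reported} of {len(block_homes_mbps)}")
```

Two homes out of eight get the speed the map advertises for all of them – which is exactly the gap between the FCC’s availability numbers and Microsoft’s measured speeds.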

To be fair, the Microsoft statistics measure the speeds customers are actually achieving, while the FCC is trying to measure broadband availability. The Microsoft data includes any households that elect to buy slower broadband products to save money. However, there are not 138 million Americans who purposefully buy slow broadband (the difference between Microsoft’s 163 million and the FCC’s 25 million). The Microsoft numbers tell us that actual speeds in the country are far worse than described by the FCC – for half of us, slower than 25/3 Mbps. That is a sobering statistic, and it doesn’t just reflect that rural America is getting poor broadband – many urban and suburban households also aren’t achieving 25/3 Mbps.

I’ve seen many real-life examples of what Microsoft is telling us. At CCG Consulting we do community broadband surveys, and we sometimes see whole communities where the speeds customers actually achieve are lower than the speeds advertised by the ISPs. We often see far more households claim to have no broadband, or poor broadband, than would be expected from the FCC mapping data. We constantly see residents in urban areas complain that broadband sold at a relatively fast speed seems slow and sluggish.

Microsoft reported their findings to the FCC, but I expect the FCC to ignore them. The findings are a drastic departure from the narrative that the FCC is telling Congress and the public. I wrote a blog just a few weeks ago describing how the FCC is claiming that big ISPs are delivering the speeds they market. Deep inside the recent reports the FCC admitted that DSL often wasn’t up to snuff – but the Microsoft statistics mean that a lot of cable companies and other ISPs are also under-delivering.

In my mind the Microsoft numbers invalidate almost everything that we think we know about broadband in this country. We are setting national broadband policy and goals based upon false numbers – and not numbers that are a little off, but numbers that are largely a fabrication. We have an FCC that is walking away from broadband regulation because it has painted a false narrative that most households in the country have good broadband. It would be a lot harder for politicians to allow broadband deregulation if the FCC admitted that over half of the homes in the country aren’t achieving the FCC definition of broadband.

The FCC has been tasked by Congress to find ways to improve broadband in areas that are unserved or underserved – with those categories defined by the FCC maps. The Microsoft statistics tell us that there are huge numbers of underserved households, far more than the FCC recognizes. If the FCC were to acknowledge the Microsoft numbers, they’d have to declare a state of emergency for broadband. Sadly, the FCC has instead blocked millions of homes from getting better broadband by declaring those homes already served with adequate broadband – something the Microsoft numbers say is not true.

The current FCC seems hellbent on washing its hands of broadband regulation, and the statistics it uses to describe the industry provide the needed cover to do so. To be fair, this current FCC didn’t invent the false narrative – it’s been in place since the creation of the national broadband maps in 2009. I, and many others, predicted back then that allowing the ISPs to self-report performance would put us right where we are today – with statistics that aren’t telling the true story. Microsoft has now pulled back the curtain – but is there anybody in a position of authority willing to listen to the facts?