Testing the FCC Maps

USTelecom has been advocating the use of geocoding to make broadband maps more accurate. As part of that advocacy, the association tested their idea by looking at the FCC mapping in parts of Virginia and Missouri.

What they found was not surprising, but still shocking. They found that in those two states as many as 38% of households in rural census blocks were classified as served when in fact they were unserved. In FCC-speak, a served home is one where broadband of at least 25/3 Mbps is available; an unserved home either has no broadband available or can only buy broadband slower than 10/1 Mbps.

This distinction has huge significance for the industry. First, it’s been clear that the FCC has been overcounting the number of homes that have broadband. But far worse, the FCC has been awarding grants to provide faster broadband in unserved areas, and none of the misclassified places have been eligible for those grants. We’re about to enter the biggest grant program ever, which will award $20.4 billion, but only to places that don’t have 25/3 Mbps speeds – meaning these misclassified homes will be left out again if the maps aren’t fixed soon.

The USTelecom effort is not even complete, since several cable companies in the two states did not participate in the trial – which might mean that the percentage of misclassified homes is even larger. The misclassified homes are likely to be in census blocks that also contain at least some homes with fast broadband. Homes just past where a cable company’s network ends might be listed as capable of buying a gigabit and yet have no broadband option.

The existing FCC maps use data that is reported by ISPs using the Form 477 process. In that process, ISPs report speed availability by census block. There are two huge flaws with this reporting method. First, if even one customer in the census block can get fast broadband, then the whole census block is assumed to have fast broadband. Second, many ISPs have been reporting marketing speeds instead of actual speeds, and so there are whole census blocks counted as served when nobody can get real broadband.

The trial also uncovered other problems. The ISPs have not been accurate in counting homes by census block. Many ISPs have never accurately mapped their customers, and so the test found numerous examples of customers reported in the wrong census blocks. Additionally, the counts of buildings by census block are often far off, due in part to the confusing nature of rural addresses.

The bottom line is that the FCC has been collecting and reporting highly inaccurate data concerning rural broadband. We’ve known this for a long time because there have been numerous efforts to test the maps in smaller geographic areas that have highlighted these same mistakes. We also have evidence from Microsoft that shows that a huge number of homes are not connected to the Internet at speeds of at least 25/3 Mbps. That’s not just a rural issue, and for the Microsoft numbers to be true there must be a massive number of urban homes that are getting speeds slower than what is being reported to the FCC.

As dramatic as this finding is from USTelecom, it doesn’t tell the whole story. Unfortunately, no mapping strategy is going to be able to truthfully report the broadband speeds for DSL and fixed wireless. The speed of these products varies by home. Further, there is no way to know if a given home can use these technologies until somebody tries to connect it. Perhaps this isn’t important for DSL, since there is almost no rural DSL capable of delivering 25/3 Mbps broadband. But any mapping of the coverage area of fixed wireless is going to be suspect, since many homes are blocked from seeing a transmitting antenna or receive slower speeds than their neighbors due to impediments. The USTelecom effort mostly fixes the boundary issues where homes are assumed to have broadband today but don’t. The 38% misreporting would be much higher if we could somehow magically know the real capabilities of DSL and fixed wireless.

The current FCC didn’t create this problem – it goes back several FCCs, to the start of the 477 reporting system. However, I have to wonder if this FCC will change its mind about the status of rural broadband in the country even with better maps. The current FCC released broadband data for 2016 that included a huge error. A new ISP, Barrier Free, had reported serving 25/3 Mbps broadband in census blocks covering 62 million people, when in June of that year the company didn’t yet have any customers. The FCC gleefully reported that the number of homes without broadband had dropped by 25%, mostly due to this reporting error. Even after correcting the error, the FCC still declared that broadband in rural America was on the right trajectory and didn’t need any extraordinary effort from the FCC. I’m sure they will decide that rural broadband is fine, even if the number of unserved homes jumps significantly due to better mapping.

Setting the Definition of Broadband

One of the commenters on my blog asked a good question – can’t we set the definition of broadband by looking at the broadband applications used by the typical household? That sounds like a commonsense approach to the issue and is exactly what the FCC did when they set the definition of broadband to 25/3 Mbps in 2015. They looked at combinations of applications that a typical family of four might use in an evening, with the goal that a household ought to have enough broadband to comfortably do those functions at the same time. This might best be described as a technical approach to defining broadband – look at what households are really using and make sure that the definition of broadband is large enough to cover the expected usage for a typical household.

Taking this approach raises the bigger question – what should the policy be for setting the definition of broadband? I don’t know that I have any answers, but I ask the following questions:

  • The FCC largely conducted a thought experiment when setting the 25/3 definition of broadband – they didn’t try to measure the bandwidth used in the scenarios they considered. If the FCC had measured real homes doing those functions they likely would have found that bandwidth needs were different than they had estimated. Some functions use less bandwidth than they had supposed. But usage also would have been larger than they had calculated, because the FCC didn’t compensate for WiFi overheads and machine-to-machine traffic. As a household makes use of multiple simultaneous broadband functions, the WiFi networks we all use bog down when those applications collide with each other inside the home network. The busy-hour behavior of our home networks needs to be part of a mathematical approach to measuring broadband.
  • The FCC could have gotten a better answer had they hired somebody to measure evening broadband usage in a million homes. We know that broadband usage is like anything else – there are households that barely use broadband and others that use it intensely. The idea of pinpointing the usage of a typical family is a quaint idea when what’s needed is to understand the curve of broadband usage – what percentage of homes are light, average, and heavy users. I’m sure that one of the big companies that track broadband usage could measure this. But even after making such measurements we need a policy. Should the definition of broadband be set to satisfy the biggest broadband users, or something else, like the median speed used by households? Analytics can only go so far, and at some point there has to be a policy. It’s not an easy policy to establish – if the definition of broadband is set anywhere below the fastest speeds used by households, then policy makers are telling some households that they use too much broadband.
  • If we are going to use measurements to determine the definition of broadband, then this also has to be an ongoing effort. If 25/3 was the right definition of broadband in 2015, how should that definition have changed when homes routinely started watching 4K video? I don’t think anybody can deny that households use more broadband each year, and homes use applications that are more data intensive. The household need for speed definitely increases over time, so any policy for setting a definition of broadband needs to recognize that the definition must grow over time.
  • One fact that is easy to forget is that the big cable companies now serve two-thirds of the broadband customers in the country, and any discussion we have about a definition of broadband only considers how to handle the remaining one-third of broadband users. There is a good argument to be made that the cable companies already define the ‘market’ speed of broadband. The big cable companies all have minimum broadband speeds for new customers in urban markets today between 100 and 200 Mbps. The companies didn’t set these speeds in a vacuum. The cable companies have unilaterally increased speeds every 3-4 years in response to demands from their customers for faster speeds. I think there is a valid argument that the market speeds used to serve two-thirds of the customers in the country should be the target broadband speed for everybody else. Any policymaker arguing that 25/3 Mbps should still be the definition of broadband is arguing that one-third of the country should settle for second-class broadband.
  • In a related argument, I harken back to a policy discussion the FCC used to have when talking about broadband speeds. I can remember a decade or more ago when the FCC generally believed that rural broadband customers deserved access to the same speeds as urban customers. That policy was easy to support when cable networks and telco copper networks both delivered similar speeds. However, as cable broadband technology leaped ahead of copper and DSL, these discussions disappeared from the public discourse.
  • When looking at grant programs like the upcoming RDOF program, where the funded networks won’t be completed until 2027, any definition of broadband for the grants needs to look ahead to what the speeds might be like in 2027. Unfortunately, since we can’t agree on how to set the definition of broadband today, we have no context for talking about future speeds.

These are not easy questions. If the FCC was doing its job we would be having vigorous discussions on the topic. Sadly, I don’t foresee any real discussions at the FCC about the policy for setting the definition of broadband. The FCC has hunkered down and continues to support the 25/3 definition of broadband even when it’s clear that it’s grown obsolete. This FCC is unlikely to increase the definition of broadband, because in doing so they would be declaring that millions of homes have something less than broadband. It seems that our policy for setting the definition of broadband is to keep it where it is today because that’s politically expedient.

FCC – Please Don’t Fund 25/3 Broadband

The current FCC recognizes the disaster that was created when the original CAF II grant program subsidized the construction of broadband that supports speeds of only 10/1 Mbps. Several FCC commissioners have said that they don’t want to repeat that disaster. Had the CAF II grant monies been open to companies other than the big telcos, much of the money would have gone to fiber ISPs and we’d see a lot more areas covered with good broadband today (meaning fewer headaches for the FCC).

Today I ask the question: what speeds should the new $20.4 billion RDOF grant fund support? In the NPRM for the RDOF grant program, the FCC suggests that the minimum speed it will fund is 25/3 Mbps. It looks like the funding for these grants will start in 2021, and like the CAF II program, anybody taking the money will have six years to complete the broadband construction. The right way to think about the speeds for these grants is to look at likely broadband speeds at the end of the construction period in 2027, not at where the world is two years before the RDOF even starts. If the FCC bases the program on broadband speeds today, it will repeat the error of the original CAF II – using federal money to build broadband that is obsolete before it’s even constructed.

I start by referring to a recent blog where I challenge the idea that 25/3 Mbps should be the definition of broadband today. To quickly summarize that blog: we know that broadband demand has been growing constantly since the days of dial-up – and the growth in broadband demand applies to speeds as well as the volume of monthly downloading. Both Cisco and Ookla have shown that broadband demand has been growing at a rate of about 21% annually for many years.

At a bare minimum, the definition of broadband today ought to be 50 Mbps download – and that definition is a minimum speed, not a goal that should be used for building tomorrow’s broadband. As I said earlier, in a world where demand continues to grow, today’s definition of broadband shouldn’t matter – what matters is the likely demand for broadband in 2027 when the RDOF networks are operational.

Trending the demand curve chart for download speeds forward presents a story that the FCC doesn’t want to hear. The need for speed is going to continue to increase. If the growth trend holds (and these trends have been steady since the days of dial-up), then the definition of broadband by 2027 ought to be 250 Mbps – meaning by then nobody should build a network that can’t meet that speed.

Year:   2019   2020   2021   2022   2023   2024   2025   2026   2027
Mbps:     54     65     78     95    115    139    168    204    246
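The chart above is simple compound growth. A minimal sketch (the ~54 Mbps starting point for 2019 and the 21% rate come from the discussion above; rounding may differ from the chart by a megabit or two in some years):

```python
# Project the broadband speed definition forward at 21% annual growth,
# starting from the ~54 Mbps figure for 2019.
start_year, start_mbps, growth = 2019, 54, 0.21

for year in range(start_year, 2028):
    mbps = start_mbps * (1 + growth) ** (year - start_year)
    print(year, round(mbps))  # 2027 comes out near 250 Mbps
```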

The big cable companies already recognize what the FCC won’t acknowledge. The minimum speed offered to new customers on urban cable networks today is at least 100 Mbps, and most users can order a gigabit. The cable companies know that if they provide fast speeds they get a lot fewer complaints from customers. In my city of Asheville, NC, Charter unilaterally increased the speed of broadband in 2018 from 60/6 Mbps to 135/20 Mbps. Anybody who has watched the history of cable company broadband knows that they will increase speeds at least once before 2027 to stay ahead of the demand curve. It wouldn’t be surprising by 2027 if cable company minimum speeds are 300 – 500 Mbps. Do we really want to be funding 25/3 rural broadband when speeds in cities will be fifteen times faster?

Will the world behave exactly like this chart? Not likely. But will homes in 2027 be happy with 25/3 Mbps broadband? Most definitely not. Given a choice, homes don’t even want 25/3 Mbps broadband today. We are already seeing hordes of urban customers abandoning DSL that delivers speeds between 25 Mbps and 50 Mbps.

If the FCC funds 25/3 Mbps broadband in the RDOF grant they will be duplicating one of the dumbest FCC decisions ever made – when CAF II funded 10/1 Mbps broadband. The FCC will be funding networks that are massively obsolete before they are even built, and they will be spending scarce federal dollars to again not solve the rural digital divide. There will continue to be cries from rural America to bring real broadband that works and by 2027 we’ll probably be talking about CAF IV grants to try this all over again.

The Definition of Broadband

When the FCC set the definition of broadband at 25/3 Mbps in January of 2015, I thought it was a reasonable definition. At the time the FCC said that 25/3 Mbps was the minimum speed that defined broadband – anything at that speed or faster was considered broadband, and anything slower wasn’t.

2015 was forever ago in terms of broadband usage and there have been speed increases across the industry since then. All of the big cable companies have unilaterally increased their base broadband speeds to between 100 Mbps and 200 Mbps. Numerous small telcos have upgraded their copper networks to fiber. Even the big telcos have increased speeds in rural America through CAF II upgrades that increased speeds to 10/1 Mbps – and the telcos all say they did much better in some places.

The easiest way to look at the right definition of broadband today is to begin with the 25/3 Mbps level set at the beginning of 2015. If that was a reasonable definition at the beginning of 2015, what’s a reasonable definition today? Both Cisco and Ookla track actual speeds achieved by households, and both say that actual broadband speeds have been increasing nationally by about 21% annually. Applying a 21% annual growth rate to the 25 Mbps download speed set in 2015 predicts that the definition of broadband today should be 54 Mbps:

Year:   2015   2016   2017   2018   2019
Mbps:     25     30     37     44     54
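That projection is just the 21% rate compounded from the 2015 baseline; a quick sketch:

```python
# Compound the 2015 broadband definition (25 Mbps download) at the
# ~21% annual growth rate reported by Cisco and Ookla.
mbps = 25.0
for year in range(2015, 2020):
    print(year, round(mbps))
    mbps *= 1.21
# The 2019 value comes out at 54 Mbps.
```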

We also have a lot of anecdotal evidence that households want faster speeds. Households have been regularly bailing on urban DSL and moving to faster cable company broadband. A lot of urban DSL can be delivered at speeds between 25 and 50 Mbps, and many homes are finding that to be inadequate. Unfortunately, the big telcos aren’t going to provide the detail needed to understand this phenomenon, but it’s clearly been happening on a big scale.

It’s a little sketchier to apply this same logic to upload speeds. There was a lot of disagreement about the 3 Mbps upload speed standard established in 2015. It seems to have been set to mollify the cable companies, which wanted to assign most of their bandwidth to download. However, since 2015 most of the big cable companies have upgraded to DOCSIS 3.1 and can now provide significantly faster uploads. My home broadband was upgraded by Charter in 2018 from 60/6 Mbps to 135/20 Mbps. It seems ridiculous to keep upload speed goals low, and if I were magically put onto the FCC, I wouldn’t support an upload speed goal of less than 20 Mbps.

You may recall that the FCC justified the 25/3 Mbps definition of broadband by looking at the various download functions that could be done by a family of four. The FCC examined numerous scenarios that considered uses like video streaming, surfing the web, and gaming. The FCC scenario was naive because they didn’t account for the fact that the vast majority of homes use WiFi. Most people don’t realize that WiFi networks generate a lot of overhead due to collisions of data streams – particularly when a household is trying to do multiple big bandwidth applications at the same time. When I made my judgment about the 25/3 Mbps definition back in 2015, I accounted for WiFi overheads and I still thought that 25/3 Mbps was a reasonable definition for the minimum speed of broadband.

Unfortunately, this FCC is never going to unilaterally increase the definition of broadband, because by doing so they would reclassify millions of homes as not having broadband. The FCC’s broadband maps are dreadful, but even with the bad data, it’s obvious that if the definition of broadband was 50/20 Mbps today that a huge number of homes would fall below that target.

The big problem with the failure to recognize the realities of household broadband demand is that the FCC is using the already-obsolete definition of 25/3 Mbps to make policy decisions. I have a follow-up blog to this one that will argue that using that speed as the definition of the upcoming $20.4 billion RDOF grants will be as big of a disaster as the prior FCC decision to hand out billions to upgrade to 10/1 Mbps DSL in the CAF II program.

The fact that household broadband demand grows over time is not news. We have been on roughly the same demand curve growth since the advent of dial-up. It’s massively frustrating to see politics interfere with what is a straight engineering issue. As homes use more broadband, particularly when they want to do multiple broadband tasks at the same time, their demand for faster broadband grows. I can understand that no administration wants to recognize that things are worse than they want them to be – so they don’t want to set the definition of broadband at the right speed. But it’s disappointing to see when the function of the FCC is supposed to be to make sure that America gets the broadband infrastructure it needs. If the agency was operated by technologists instead of political appointees we wouldn’t even be having this debate.

Summary Conclusions for Designing an FCC Broadband Grant

The earlier series of blogs looked at a number of ideas on how the FCC could create the most effective federal grant program for the upcoming $20.4 billion of announced grants. Following is a summary of the most important conclusions of those blogs:

Have a Clearly Defined Goal. If a federal grant program’s goal is something soft, like ‘improve rural broadband’, then the program is doomed to failure and will fund solutions that only incrementally improve broadband. The grant program should have a bold goal, such as bringing a permanent broadband solution to a significant number of households. For example, done well, this grant could bring fiber to 4 – 5 million homes rather than make incremental broadband improvements everywhere.

Match the Grant Process with the Grant Goals. Past federal grants have often had grant application rules that didn’t match the goals. Since the results of grants are governed by the application rules, those are all that matter. Stated goals for a grant are just rhetoric if those goals are not realized in the grant application requirements. As an example, if a grant goal is to favor the fastest broadband possible, then all grant application rules should be weighted towards that goal.

Match Speed Requirement with the Grant Construction Period. The discussion for the proposed $20.4 billion grant contemplates a minimum speed goal of 25/3 Mbps. That’s a DSL speed and is already becoming obsolete today. A goal of 25/3 Mbps will be badly outdated by the time any grant-funded networks are built. The FCC should not repeat their worst decision ever that gave out $11 billion for CAF II funding to build 10/1 Mbps networks – a speed that was obsolete even before the grants were awarded. The FCC should be requiring future-looking speeds.

Make the Grants Available to Everybody. FCC grant and loan programs often include a statement that they are available to every kind of entity. Yet the actual award process often discriminates against some kinds of applicants. For example, grants that include a loan component make it generally impossible for most municipal entities to accept the awards. Loan rules can also eliminate non-RUS borrowers. Grant rules that require recipients to become Eligible Telecommunications Carriers – a regulatory designation – discriminate against open access networks where the network owner and the ISP are separate entities. If not written carefully, grant rules can discriminate against broadband partnerships where the network owner is a different entity than the operating ISP.

Reverse Auction is not a Good Fit. Reverse auctions are a good technique to use when taking bids for some specific asset. Reverse auctions won’t work well when the awarded area is the whole US. Since reverse auctions favor those who will take the lowest amount of funding a reverse auction will, by definition, favor lower-cost technologies. A reverse auction will also favor parts of the country with lower costs and will discriminate against the high-cost places that need broadband help the most, like Appalachia. A reverse auction also favors upgrades over new construction and would favor upgrading DSL over building faster new technologies. From a political perspective, a reverse auction won’t spread the awards geographically and could favor one region, one technology or even only a few grant applicants. Once the auction is started the FCC would have zero input over who wins the funds – something that would not sit well with Congress.

Technology Matters. The grants should not be awarded to technologies that are temporary broadband band-aids. For example, if the grants are used to upgrade rural DSL or to provide fixed cellular broadband, then the areas receiving the grants will be back at the FCC in the future asking for something better. It’s hard to justify any reason for giving grants to satellite providers.

States Need to Step Up. The magnitude of the proposed federal grant program provides a huge opportunity for states. Those states that increase state grant funding should attract more federal grants to their state. State grants can also influence the federal awards by favoring faster speeds or faster technologies.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Technology and FCC Grants

This is the next in the series of blogs looking at the upcoming $20.4 billion FCC grant program. I ask the question of how the FCC should consider technology in the upcoming grant program.

Should Satellite Companies be Eligible? I think a more fundamental question is if the current generation of high-orbit satellites really deliver broadband. Over the last few years I’ve talked to hundreds of rural people about their broadband situation and I have never met anybody who liked satellite broadband – not one person. Most people I’ve talked to have tried it once and abandoned it as unworkable.

This goes back to the basic definition of broadband. The FCC defines broadband as speeds of at least 25/3 Mbps. In its original order in 2015 the FCC discussed latency, but unfortunately never made latency part of the broadband definition. As a reminder, latency is a measure of the time it takes for a data packet to travel from its point of origin to its destination.

A few years ago, the FCC did a study of the various last mile technologies and measured the following ranges of performance of last-mile latency, measured in milliseconds: fiber (10-20 ms), coaxial cable (15-40 ms), and DSL (30-65 ms). Cellular latencies vary widely depending upon the exact generation of equipment at any given cell site, but 4G latency can be as high as 100 ms. In the same FCC test, satellite broadband was almost off the chart with latencies measured as high as 650 ms.

Latency makes a big difference in the perceived customer experience. Customers will rate a 25 Mbps connection on fiber as being much faster than a 25 Mbps connection on DSL due to the difference in latency. The question that should be asked for federal grants is if satellite broadband should be disqualified due to poor latency.
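To see why latency drives the perceived experience, consider that fetching even a small web object takes several round trips (DNS, TCP handshake, TLS, the request itself) before any data flows. The sketch below uses the FCC latency ranges cited above; the round-trip count and object size are my own illustrative assumptions:

```python
# Rough model of fetching a 100 KB web object over a 25 Mbps link:
# a few round trips of pure latency, then the transfer itself.
def fetch_time_ms(latency_ms, mbps, object_kb=100, round_trips=4):
    transfer_ms = object_kb * 8 / mbps  # kilobits / (megabits per sec) = ms
    return round_trips * latency_ms + transfer_ms

for tech, latency in [("fiber", 15), ("cable", 30), ("DSL", 50),
                      ("4G", 100), ("satellite", 650)]:
    print(f"{tech:10s} {fetch_time_ms(latency, 25):6.0f} ms")
```

Under these assumptions the identical 25 Mbps connection completes the fetch in under 100 ms on fiber but takes over 2.5 seconds on satellite – which matches the experience rural users report.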

I was unhappy to see so much money given to the satellite providers in the recent CAF II reverse auction. Even ignoring the latency issue, I question whether the satellite companies deserve broadband subsidies. There is no place in rural America where folks don’t already know that satellite broadband is an option – most people have rejected the technology as an acceptable broadband connection. It was particularly troubling to see satellite providers getting money in a reverse auction. Once a satellite is in orbit its costs are fixed, which means the satellite providers will be happy to take any amount of federal subsidy – they can bid lower than any other grant applicant in a reverse auction. I have to question the wisdom of providing federal subsidies to companies that are already failing at marketing.

I don’t have enough information to know how to feel about the upcoming low-orbit satellites that are just now being tested and launched. Because of lower orbits they will have lower latency. However, the satellite companies still have a huge advantage in a reverse auction since they can bid lower than anybody else – a satellite company would be happy with only a few dollars per potential customer and has no bottom limit on the amount of grant they are willing to accept. If the new satellite companies can bid in the same manner as everybody else we could end up with the situation where these companies claim 100% of the new grant funds.

What About DSL? My nightmare scenario is that the FCC hands most or all of the $20.4 billion to the big telcos to upgrade rural DSL from 10/1 Mbps to 25/3 Mbps. This is certainly within the realm of possibility. Remember that the first CAF II program was originally going to be open to everybody but at the last minute was all given to the big telcos.

I find it troublesome that the big telcos have been quiet about the announced plans for this grant. The money will be spent in the big telco service areas, and you’d think they’d be screaming about plans for federal money to overbuild them. Recall that the big telcos recently were able to derail the ReConnect grants by inserting the rule that only 10% of the grant money could be used for customers who already receive at least 10/1 Mbps broadband. This FCC clearly favors the big telcos over other ISPs and could easily hand all of this money to the big telcos and call it CAF III.

Even if they don’t do that, the question is if any federal grant money should be used to upgrade rural DSL. Rural copper is in dreadful condition due to the willful neglect of the big telcos who stopped doing maintenance on their networks decades ago. It’s frankly a wonder that the rural copper networks even function. It would be a travesty to reward the telcos by giving them billions of dollars to make upgrades that they should have routinely made by reinvesting customer revenues.

I think when the dust clears on CAF II we’re going to find out that the big telcos largely cheated with that money. We’re going to find that they only upgraded the low-hanging fruit and that many households in the coverage areas got no upgrades or minor upgrades that won’t achieve the 10/1 Mbps goals. I think we’ll also find that in many cases the telcos didn’t spend very much of the CAF II funds but just pocketed it as free revenue. I beg the FCC to not repeat the CAF II travesty – when the truth comes out about how the telcos used the funding, the CAF II program is going to grab headlines as a scandal. Please don’t provide any money to upgrade DSL.

This blog is part of a series on Designing the Ideal Federal Broadband Grant.


Broadband Have-nots

In one of my recent blogs I talked about a few cities that had broadband penetration north of 90%, meaning that nearly all households in those cities have broadband. I’ve run across three such cities this year. But there are also cities with a very different story. I saw a recent article about Syracuse, New York claiming that 66% of the homes in the city have a landline broadband connection and that only a little more than half of households have a connection that meets the FCC definition of broadband at 25/3 Mbps.

It’s easy to look at the national average broadband penetration rate of 84% and think that most people in cities across the country have broadband. This is particularly true when you adjust that national average to remove the millions of rural households that still have no landline broadband option, which adjusts the national average to over 90%.

We’ve always known that there is a correlation between income and broadband subscription rates – in fact, the basic definition of the urban digital divide is households that can’t afford broadband. We also know that in every larger city the broadband penetration rates are not uniform, but are lower in poorer neighborhoods.

I am concerned that the urban digital divide is going to get worse. Most industry analysts believe that we’ll see significant increases in broadband prices over the next decade. The big cable companies have little choice but to raise broadband rates if they want to maintain the steady bottom-line revenue growth expected by Wall Street. This means it’s likely that over time broadband penetration rates in cities are going to drop even lower.

Cities badly want to find a solution to the digital divide that so heavily impacts low-income neighborhoods. They know there are huge negative impacts on households without broadband. Several recent studies have shown that school students without home broadband lag behind students with broadband and never close the gap. Having whole neighborhoods that can’t afford broadband condemns whole generations of students to underperform, helping to perpetuate the cycle of poverty.

Syracuse is considering a solution that would bring some broadband to the neighborhoods that most need it. The city has a plan to buy 18,000 streetlights that would include outdoor WiFi hotspots. These WiFi units can produce decent broadband outdoors, but the strength of WiFi signals decreases significantly when passing through the exterior walls of buildings. While any broadband is better than nothing, outdoor WiFi units are not going to provide the same quality of broadband as a landline connection. Such efforts will likely be welcomed by residents without broadband, but this is still second-rate broadband compared to what households that can afford to buy from the incumbent ISPs receive.

The dilemma for cities is that there is no easy solution to the digital divide. For Syracuse, the problem is mostly affordability and not access. Most of the homes without broadband probably have the option to buy from the incumbent providers. I say most because there are still poor neighborhoods in almost every city that don’t have the same broadband infrastructure as the rest of the city. I’ve seen estimates that there are nearly as many residences in cities with no broadband option as there are rural homes without broadband. It’s hard to know for sure because the areas without broadband are made up of an apartment building here and a dead-end street there rather than big neighborhoods.

Cities often consider building their own broadband network as a solution to the digital divide. I undertake numerous broadband feasibility studies every year, and almost every city I’ve ever worked for has universal access to fiber as one of their primary goals. However, building fiber or any broadband infrastructure is expensive, and it’s usually hard to justify the cost of providing free or low-cost broadband to low-income homes. It’s challenging in a competitive environment to make enough profit from normal broadband customers to subsidize low-income homes.

We’ve been talking about the digital divide since the late 1990s when we saw the introduction of DSL and cable modems. In my mind, the problem is far worse today than it was then since broadband has grown to become a necessity of the same magnitude as having electric or water in a home. Unfortunately, I think the urban digital divide will be growing as broadband prices climb year after year.

Broadband Usage Continues to Grow

The firm OpenVault, a provider of software that measures data consumption for ISPs, reported that the average monthly data use by households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. The company also reported that the median use per household grew from 103.6 gigabytes in 2017 to 145.2 gigabytes in 2018 – a growth rate of 40%. The median represents the midpoint of users, with half of all households above and half below it.

To some degree, these statistics are not news because we’ve known for a long time that broadband usage at homes, both in total download and in desired speeds, has been doubling every three years since the early 1980s. The growth in 2018 is actually a little faster than that historical average, and if the 2018 growth rate were sustained, in three years usage would grow to roughly 235% of today’s level. What I find most impressive about these new statistics is the magnitude of the annual change – the average home used 67 more gigabytes of data per month in 2018 than the year before – a number that would have seemed unbelievable only a decade ago when the average household used a total of only 25 gigabytes per month.
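As a sanity check, the growth rates and the three-year projection follow directly from the OpenVault numbers quoted above. This is a back-of-the-envelope sketch (the compounding of the 2018 growth rate is my arithmetic, not an OpenVault forecast):

```python
# OpenVault figures quoted above (gigabytes per household per month)
avg_2017, avg_2018 = 201.6, 268.7
med_2017, med_2018 = 103.6, 145.2

avg_growth = avg_2018 / avg_2017 - 1   # ~0.33, i.e. 33%
med_growth = med_2018 / med_2017 - 1   # ~0.40, i.e. 40%

# Compounding the 2018 average growth rate over three years lands
# close to the ~235% figure cited above.
three_year_factor = (1 + avg_growth) ** 3

print(f"average growth: {avg_growth:.0%}")
print(f"median growth: {med_growth:.0%}")
print(f"three-year factor: {three_year_factor:.2f}x")
```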

There are still many in the industry who are surprised by these numbers. I’ve heard people claim that now that homes are watching all the video they want, the rate of growth is bound to slow down – but if anything, the rate of growth seems to be accelerating. We also know that cellular data consumption is now doubling every two years.

This kind of growth has huge implications for the industry. From a network perspective, this kind of bandwidth usage puts a big strain on networks. Typically the most strained part of a network is the backbones that connect to neighborhood nodes. That’s the primary stress point in many networks, including FTTH networks, and when there isn’t enough bandwidth to a neighborhood then everybody’s bandwidth suffers. Somebody that designed a network ten years ago would never have believed the numbers that OpenVault is reporting and would likely not have designed a network that would still be sufficient today.

One consequence of the bandwidth growth is that it’s got to be driving homes to change to faster service providers when they have the option. A household that might have been happy with a 5 Mbps or 10 Mbps connection a few years ago is likely no longer happy with it. This has to be one of the reasons we are seeing millions of homes each year upgrade from DSL to cable modem in metropolitan areas. The kind of usage growth we are seeing today has to be accelerating the death of DSL.

This growth also should be affecting policy. The FCC set the definition of broadband at 25/3 Mbps in January of 2015. If that was a good definition in 2015, then by 2019 the definition should have been increased to about 63 Mbps. At the time the FCC set that threshold I thought they were a little generous. In 2014, as the FCC was having this debate, the average home downloaded around 100 gigabytes per month, and the right definition of broadband was probably more realistically 15 – 20 Mbps – the FCC was obviously being a little forward-looking in setting the definition. Even so, the definition should be increased – if the right definition of broadband in 2014 was 20 Mbps, then today it ought to be at least 50 Mbps.
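The 63 Mbps figure falls out of the doubling-every-three-years trend. A minimal sketch of that arithmetic (the function name and the three-year doubling-period default are my assumptions):

```python
def projected_threshold(base_mbps: float, years: float,
                        doubling_period: float = 3.0) -> float:
    """Carry a broadband-speed threshold forward, assuming demand
    doubles every `doubling_period` years (the trend cited above)."""
    return base_mbps * 2 ** (years / doubling_period)

# The 25 Mbps threshold set in January 2015, carried forward to 2019:
print(round(projected_threshold(25, 2019 - 2015)))   # ~63 Mbps

# A 20 Mbps threshold for 2014, carried forward four years:
print(round(projected_threshold(20, 4)))             # ~50 Mbps
```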

The current FCC is ignoring these statistics for policy purposes – if they raise the definition of broadband, then huge numbers of homes will be classified as not having broadband. The FCC does not want to do that since they are required by Congressional edict to make sure that all homes have broadband. When the FCC set a realistic definition of broadband in 2015 they created a dilemma for themselves. That 2015 definition is already obsolete, and if they don’t change it, in a few years it is going to be absurd. One only has to look forward three years from now, when the definition of broadband ought to be 100 Mbps.

These statistics also remind us of the stupidity of handing out federal subsidies to build technologies that deliver less than 100 Mbps. We still have two more years of CAF II construction to upgrade speeds to an anemic 10 Mbps. We are still handing out new subsidies to build networks that can deliver 25/3 Mbps – networks that are obsolete before they are completed.

Network designers will tell you that they try to design networks to satisfy demands at least seven years into the future (which is the average life of many kinds of fiber electronics). If broadband usage keeps doubling every three years, then looking forward seven years to 2026, the average home is going to download 1.7 terabytes per month and will expect download speeds of 318 Mbps. I wonder how many network planners are using that target?
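The 1.7 terabyte and 318 Mbps targets can be reproduced with the same doubling math. In this sketch I assume the usage projection runs from the 2018 OpenVault average and the speed projection from a roughly 63 Mbps 2019 demand level – those baselines are my assumptions for reproducing the figures, not stated in the OpenVault data:

```python
def double_every(base: float, years: float, period: float = 3.0) -> float:
    """Compound growth, assuming a doubling every `period` years."""
    return base * 2 ** (years / period)

# 2018 average monthly usage (268.7 GB) projected out to 2026:
usage_2026_gb = double_every(268.7, 2026 - 2018)
print(f"{usage_2026_gb / 1000:.1f} TB/month")   # ~1.7 TB

# A ~63 Mbps 2019 demand level projected seven years to 2026:
speed_2026 = double_every(63, 2026 - 2019)
print(f"{speed_2026:.0f} Mbps")                 # ~318 Mbps
```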

The final implications of this growth are for data caps. Two years ago when Comcast set a terabyte monthly data cap they said that it affected only a few homes – and I’m sure they were right at the time. However, the OpenVault statistics show that 4.12% of homes used at least a terabyte per month in 2018, almost double the 2.11% in 2017. We’ve now reached the point when the terabyte data cap is going to have teeth, and over the next few years a lot of homes are going to pass that threshold and have to pay a lot more for their broadband. While much of the industry has a hard time believing the growth statistics, I think Comcast knew exactly what they were doing when they established the terabyte cap that seemed so high just a few years ago.
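A rough sense of how quickly the cap gains teeth: the share of homes over a terabyte nearly doubled in one year, and a naive extrapolation of that pace (my assumption, not an OpenVault projection) puts nearly a third of homes over the cap within three years:

```python
# OpenVault share of homes using at least 1 TB/month
share_2017, share_2018 = 0.0211, 0.0412

annual_factor = share_2018 / share_2017        # ~1.95x in one year

# Naive extrapolation: hold that pace for three more years.
share_2021 = share_2018 * annual_factor ** 3
print(f"{share_2021:.0%} of homes over 1 TB")  # ~31%
```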

How Bad is the Digital Divide?

The FCC says that approximately 25 million Americans living in rural areas don’t have access to an ISP product that would be considered broadband – currently defined as 25/3 Mbps. That number comes out of the FCC’s mapping efforts using data supplied by ISPs.

Microsoft tells a different story. They say that as many as 163 million Americans do not use the Internet at speeds that the FCC considers broadband. Microsoft might be in the best position of anybody in the industry to understand actual broadband performance because the company can see data speeds for every customer that updates Windows or Microsoft Office – a huge percentage of all computer users, covering every inch of the country.

Downloading a big software update is probably one of the best ways possible to measure actual broadband performance. Software updates tend to be large files, and the Microsoft servers will transmit the files at the fastest speed a customer can accept. Since the software updates are large files, Microsoft gets to see the real ISP performance – not just the performance for the first minute of a download. Many ISPs use a burst technology that downloads relatively fast for the first minute or so, but then slows for the rest of a download – a customer’s true broadband speed is the one that kicks in after the burst is finished. The burst technology has a side benefit to ISPs in that it inflates performance on standard speed tests – but Microsoft gets to see the real story.

I’ve ranted about the FCC’s broadband statistics many times. There are numerous reasons why the FCC data is bad in rural America. Foremost, the data is self-reported by the big ISPs who have no incentive to tell the FCC or the public how poorly they are doing. It’s also virtually impossible to accurately report DSL speeds that vary from customer to customer according to the condition of specific copper wires and according to distance from the DSL core router. We also know that much of the reporting to the FCC represents marketing speeds or ‘up-to’ speeds that don’t reflect what customers really receive. Even the manner of reporting to the FCC, by Census block, distorts the results because when a few customers in a block get fast speeds the FCC assumes that everyone does.

To be fair, the Microsoft statistics measure the speeds customers are actually achieving, while the FCC is trying to measure broadband availability. The Microsoft data includes any households that elect to buy slower broadband products to save money. However, there are not roughly 140 million Americans (the difference between Microsoft’s 163 million and the FCC’s 25 million) who purposefully buy slow broadband. The Microsoft numbers tell us that the actual speeds in the country are far worse than described by the FCC – and for half of us slower than 25/3 Mbps. That is a sobering statistic that doesn’t just reflect that rural America is getting poor broadband, but also that many urban and suburban households aren’t achieving 25/3 Mbps.

I’ve seen many real-life examples of what Microsoft is telling us. At CCG Consulting we do community surveys for broadband, and we sometimes see whole communities where the achieved speeds for customers are lower than the speeds advertised by the ISPs. We often see a lot more households claim to have no broadband or poor broadband than would be expected using the FCC mapping data. We constantly see residents in urban areas complain that broadband with a relatively fast speed seems slow and sluggish.

Microsoft reported their findings to the FCC, but I expect the FCC to ignore them, since the Microsoft story is a drastic departure from the narrative that the FCC is telling Congress and the public. I wrote a blog just a few weeks ago describing how the FCC is claiming that big ISPs are delivering the speeds that they market. Deep inside the recent reports the FCC admitted that DSL often wasn’t up to snuff – but the Microsoft statistics mean that a lot of cable companies and other ISPs are also under-delivering.

In my mind the Microsoft numbers invalidate almost everything that we think we know about broadband in the country. We are setting national broadband policy and goals based upon false numbers – not numbers that are a little off, but numbers that are largely a fabrication. We have an FCC that is walking away from broadband regulation because they have painted a false narrative that most households in the country have good broadband. It would be a lot harder for politicians to allow broadband deregulation if the FCC admitted that over half of the homes in the country aren’t achieving the FCC definition of broadband.

The FCC has been tasked by Congress to find ways to improve broadband in areas that are unserved or underserved – with those categories being defined by the FCC maps. The Microsoft statistics tell us that there are huge numbers of underserved households, far higher than the FCC is recognizing. If the FCC was to acknowledge the Microsoft numbers, they’d have to declare a state of emergency for broadband. Sadly, the FCC has instead doomed millions of homes from getting better broadband by declaring these homes as already served with adequate broadband – something the Microsoft numbers say is not true.

The current FCC seems hellbent on washing their hands of broadband regulation, and the statistics they use to describe the industry provide the needed cover for them to do so. To be fair, this current FCC didn’t invent the false narrative – it’s been in place since the creation of the national broadband maps in 2009. I, and many others, predicted back then that allowing the ISPs to self-report performance would put us right where we seem to be today – with statistics that aren’t telling the true story. Microsoft has now pulled back the curtain – but is there anybody in a position of authority willing to listen to the facts?

What’s Next for Rural Broadband?

Now that most of the CAF II money and A-CAM money has been awarded, what’s next for rural broadband? If you ask the FCC that question they are likely to answer that there might yet be one more CAF II auction to fund the 261,000 homes that went unclaimed in the last auction. However, I think this is a much bigger question.

There are still tens of millions of homes that don’t have a broadband option that meets the FCC’s current definition of 25/3 Mbps. That includes all of the places that were funded by the CAF II funds provided to the big telcos and that were only required to provide broadband with speeds of 10/1 Mbps. It also includes numerous other homes that don’t have fast broadband and that are mis-categorized by the inadequate FCC broadband maps that are populated falsely by the big ISPs.

One of CCG’s products is performing surveys and related market research in rural areas. We’ve done a lot of surveys and also asked people to take speed tests in rural communities where the actual speeds at homes are significantly lower than the advertised speeds and the speeds shown on the FCC maps. I’m not just talking about rural farms, but also in sizable towns like county seats where the broadband is still pretty crappy.

It’s obvious that this FCC is working hard to be able to claim that they have taken care of the rural broadband problem. They want to say that they’ve funded broadband everywhere and that their job is done. What they are never going to admit is that the job will never be done until rural areas have the same kind of broadband infrastructure as cities.

This particular FCC is pretending that the need for broadband is sitting still, when in fact the demand for household broadband – both for speeds and for total download volumes – keeps doubling every three or four years. By the time the current FCC chairman has been in his seat for four years, the comparative quality of rural broadband will have halved due to this increase in demand.

Don’t interpret what I just said to mean that I have disdain only for the current FCC. The last FCC under Chairman Tom Wheeler was a huge contributor to the problem when they awarded billions of dollars to the big telcos to make broadband upgrades over seven years to 10/1 Mbps – at a time when 10/1 Mbps already didn’t meet the definition of broadband. That was obviously a political decision since the original plan was to award all of the CAF II funds by reverse auction – which would have helped to fund a lot of rural fiber.

Even if the FCC was highly motivated to solve the rural broadband gap, they don’t have the tools to do so. The FCC’s only tool for funding more broadband is the Universal Service Fund. I wrote a blog last week noting how this fund is already overcommitted. Since I wrote that blog I looked at my own cellphone bills and realized my family alone is contributing several hundred dollars per year towards the USF. We are not going to get the many billions we need to expand broadband by taxing landline and cellphone users.

The fix needs to come from Congress. That doesn’t seem likely from the current Congress that already approved a $600 million fund for rural broadband grants and then added on a provision that made the grants nearly impossible to implement. Clearly influenced by lobbyists, Congress added a provision that the grants couldn’t be used in areas where more than 10% of homes already have 10/1 Mbps broadband – and there are very few such areas.

I honestly have a hard time understanding Congress’s reluctance to address rural broadband. When I go to rural counties these days I’m told that getting better broadband has become the number one local issue. I know that rural folks and rural politicians are pleading with their state and national representatives to find broadband funding.

I also know that most politicians say they are in favor of rural broadband. I’ve only seen a handful of politicians in the last decade who told their constituents that they don’t support rural broadband funding. I’ve also found that rural broadband is a nonpartisan issue and at the local level politicians of both parties understand that communities need better broadband.

I wish I could end this blog by suggesting a solution for the problem, but there isn’t any unless the states and the federal government decide at some point to help. State broadband programs providing matching grants have seen some success. I’m sure that federal matching grants would also help as long as they weren’t structured to be giveaways to the big ISPs.