Summary Conclusions for Designing an FCC Broadband Grant

The earlier series of blogs looked at a number of ideas on how the FCC could create the most effective federal grant program for the announced $20.4 billion in grants. Following is a summary of the most important conclusions from those blogs:

Have a Clearly Defined Goal. If a federal grant program’s goal is something soft, like ‘improve rural broadband’, then the program is doomed to failure and will fund solutions that only incrementally improve broadband. The grant program should have a bold goal, such as bringing a permanent broadband solution to a significant number of households. For example, done well, this grant could bring fiber to 4 – 5 million homes rather than make incremental broadband improvements everywhere.
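
As a quick sanity check on that 4 – 5 million figure, the arithmetic works out under an assumed per-home cost (the cost is my own assumption; rural fiber construction costs vary widely by terrain and density):

```python
# Back-of-envelope: how many homes $20.4 billion could reach with fiber.
# The cost per home passed is my own assumption, not an FCC figure.
grant_total = 20.4e9        # announced grant program, dollars
cost_per_home = 4_500       # assumed average rural fiber cost per home
print(f"~{grant_total / cost_per_home / 1e6:.1f} million homes")  # ~4.5 million
```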

Match the Grant Process with the Grant Goals. Past federal grants have often had application rules that didn’t match the stated goals. Since the results of a grant program are governed by its application rules, those rules are all that matter – stated goals are just rhetoric if they are not realized in the application requirements. As an example, if a grant’s goal is to favor the fastest broadband possible, then all of the application rules should be weighted towards that goal.

Match Speed Requirements with the Grant Construction Period. The discussion for the proposed $20.4 billion grant contemplates a minimum speed goal of 25/3 Mbps. That’s a DSL speed that is already becoming obsolete, and it will be badly outdated by the time any grant-funded networks are built. The FCC should not repeat its worst decision ever – handing out $11 billion in CAF II funding to build 10/1 Mbps networks, a speed that was obsolete before the grants were even awarded. The FCC should be requiring future-looking speeds.

Make the Grants Available to Everybody. FCC grant and loan programs often include a statement that they are available to every kind of entity. Yet the actual award process often discriminates against some kinds of applicants. For example, grants that include a loan component make it generally impossible for most municipal entities to accept the awards. Loan rules can also eliminate non-RUS borrowers. Grant rules that require recipients to become Eligible Telecommunications Carriers – a regulatory designation – discriminate against open access networks where the network owner and the ISP are separate entities. If not written carefully, grant rules can discriminate against broadband partnerships where the network owner is a different entity than the operating ISP.

Reverse Auction is not a Good Fit. Reverse auctions are a good technique when taking bids for some specific asset, but they won’t work well when the awarded area is the whole US. Since reverse auctions favor those who will take the lowest amount of funding, a reverse auction will, by definition, favor lower-cost technologies. A reverse auction will also favor parts of the country with lower costs and will discriminate against the high-cost places that need broadband help the most, like Appalachia. A reverse auction also favors upgrades over new construction and would favor upgrading DSL over building faster new technologies. From a political perspective, a reverse auction won’t spread the awards geographically and could favor one region, one technology, or even only a few grant applicants. Once the auction is started, the FCC would have no input into who wins the funds – something that would not sit well with Congress.

Technology Matters. The grants should not be awarded to technologies that are temporary broadband band-aids. For example, if the grants are used to upgrade rural DSL or to provide fixed cellular broadband, then the areas receiving the grants will be back at the FCC in the future asking for something better. It’s hard to justify any reason for giving grants to satellite providers.

States Need to Step Up. The magnitude of the proposed federal grant program provides a huge opportunity for states. States that increase their own grant funding should attract more of the federal grants. State grants can also influence the federal awards by favoring faster speeds or faster technologies.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Technology and FCC Grants

This is the next in the series of blogs looking at the upcoming $20.4 billion FCC grant program. Today I ask how the FCC should consider technology in awarding the grants.

Should Satellite Companies be Eligible? I think a more fundamental question is whether the current generation of high-orbit satellites really delivers broadband. Over the last few years I’ve talked to hundreds of rural people about their broadband situation, and I have never met anybody who liked satellite broadband – not one person. Most people I’ve talked to tried it once and abandoned it as unworkable.

This goes back to the basic definition of broadband. The FCC defines broadband by download speeds of at least 25/3 Mbps. In their original order in 2015 the FCC discussed latency but unfortunately never made it part of the broadband definition. As a reminder, latency is a measure of the time it takes for a data packet to travel from its point of origin to its destination.

A few years ago, the FCC did a study of the various last-mile technologies and measured the following ranges of last-mile latency, in milliseconds: fiber (10-20 ms), coaxial cable (15-40 ms), and DSL (30-65 ms). Cellular latencies vary widely depending upon the exact generation of equipment at any given cell site, but 4G latency can be as high as 100 ms. In the same FCC test, satellite broadband was almost off the chart, with latencies measured as high as 650 ms.

Latency makes a big difference in the perceived customer experience. Customers will rate a 25 Mbps connection on fiber as being much faster than a 25 Mbps connection on DSL due to the difference in latency. The question that should be asked for federal grants is whether satellite broadband should be disqualified due to poor latency.
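
To see why latency matters so much, consider the ceiling it puts on a single TCP connection. This is a simplified sketch of my own – real connections use window scaling and multiple streams – but it shows the mechanics: a TCP stream can never move data faster than its window size divided by the round-trip time.

```python
# TCP throughput ceiling = window / round-trip time. The 64 KB window is an
# assumption (a common value when window scaling isn't effective); the
# latencies echo the FCC figures cited above.
WINDOW_BITS = 64 * 1024 * 8   # 64 KB receive window, in bits

def ceiling_mbps(latency_ms: float) -> float:
    return WINDOW_BITS / (latency_ms / 1000) / 1_000_000

for tech, ms in [("fiber", 20), ("cable", 40), ("DSL", 65),
                 ("4G", 100), ("satellite", 650)]:
    print(f"{tech:9s} {ms:4d} ms -> ceiling ~{ceiling_mbps(ms):5.1f} Mbps")
```

On those assumptions, a 650 ms satellite link tops out below 1 Mbps per connection, no matter what the advertised speed is.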

I was unhappy to see so much money given to the satellite providers in the recent CAF II reverse auction. Even ignoring the latency issue, I question whether the satellite companies deserve broadband subsidies. There is no place in rural America where folks don’t already know that satellite broadband is an option – and most people have rejected the technology as an acceptable broadband connection. It was particularly troubling to see satellite providers win money in a reverse auction. Once a satellite is in orbit its costs are fixed, which means the satellite providers will be happy to take any amount of federal subsidy – they can bid lower than any other grant applicant in a reverse auction. I have to question the wisdom of providing federal subsidies to companies that are already failing in the marketplace.

I don’t have enough information to know how to feel about the upcoming low-orbit satellites that are just now being tested and launched. Because of their lower orbits they will have lower latency. However, the satellite companies still have a huge advantage in a reverse auction since they can bid lower than anybody else – a satellite company would be happy with only a few dollars per potential customer and has no bottom limit on the amount of grant it is willing to accept. If the new satellite companies can bid in the same manner as everybody else, we could end up with a situation where these companies claim 100% of the new grant funds.

What About DSL? My nightmare scenario is that the FCC hands most or all of the $20.4 billion to the big telcos to upgrade rural DSL from 10/1 Mbps to 25/3 Mbps. This is certainly within the realm of possibility. Remember that the first CAF II program was originally going to be open to everybody but at the last minute was all given to the big telcos.

I find it troublesome that the big telcos have been quiet about the announced plans for this grant. The money will be spent in the big telco service areas, and you’d think they’d be screaming about plans to use federal money to overbuild them. Recall that the big telcos recently were able to derail the ReConnect grants by inserting the rule that only 10% of the grant money can be used for customers who already receive at least 10/1 Mbps broadband. This FCC clearly favors the big telcos over other ISPs and could easily hand all of this money to the big telcos and call it CAF III.

Even if they don’t do that, the question is whether any federal grant money should be used to upgrade rural DSL. Rural copper is in dreadful condition due to the willful neglect of the big telcos, who stopped doing maintenance on their networks decades ago. It’s frankly a wonder that the rural copper networks even function. It would be a travesty to reward the telcos by giving them billions of dollars to make upgrades they should have routinely made by reinvesting customer revenues.

I think when the dust clears on CAF II we’re going to find out that the big telcos largely cheated with that money. We’re going to find that they only upgraded the low-hanging fruit and that many households in the coverage areas got no upgrades, or got minor upgrades that don’t achieve the 10/1 Mbps goal. I think we’ll also find that in many cases the telcos didn’t spend much of the CAF II funding and just pocketed it as free revenue. I beg the FCC not to repeat the CAF II travesty – when the truth comes out about how the telcos used the funding, the CAF II program is going to grab headlines as a scandal. Please don’t provide any money to upgrade DSL.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Broadband Have-nots

In one of my recent blogs I talked about a few cities that have broadband penetration north of 90%, meaning that nearly all households in those cities have broadband. I’ve run across three such cities this year. But there are also cities with a very different story. I saw a recent article about Syracuse, New York claiming that only 66% of the homes in the city have a landline broadband connection, and that only a little more than half of households have a connection that meets the FCC definition of broadband at 25/3 Mbps.

It’s easy to look at the national average broadband penetration rate of 84% and think that most people in cities across the country have broadband. This is particularly true when you adjust that national average to remove the millions of rural households that still have no landline broadband option – an adjustment that lifts the average to over 90%.
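
Here’s the arithmetic behind that adjustment, with round numbers of my own choosing (the household counts are assumptions, just to show the math):

```python
# Removing homes with no landline option from the denominator lifts the
# penetration rate among homes that can actually buy broadband.
total_households = 120_000_000     # rough US total (assumption)
no_landline_option = 10_000_000    # assumed rural homes with no wired option
subscribers = 0.84 * total_households
print(f"adjusted penetration: "
      f"{subscribers / (total_households - no_landline_option):.0%}")  # ~92%
```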

We’ve always known that there is a correlation between income and broadband subscription rates – in fact, the basic definition of the urban digital divide is households that can’t afford broadband. We also know that in every larger city broadband penetration rates are not uniform and are lower in poorer neighborhoods.

I am concerned that the urban digital divide is going to get worse. Most industry analysts believe that we’ll see significant increases in broadband prices over the next decade. The big cable companies have little choice but to raise broadband rates if they want to maintain the steady bottom-line revenue growth expected by Wall Street. This means it’s likely that broadband penetration rates in cities will drop even lower over time.

Cities badly want to find a solution to the digital divide that is so heavily impacting low-income neighborhoods. They know there are huge negative impacts on households without broadband. Several recent studies have shown that school students without home broadband lag behind students with broadband and never close the gap. Having whole neighborhoods that can’t afford broadband condemns generations of students to underperform, helping to perpetuate the cycle of poverty.

Syracuse is considering a solution that would bring some broadband to the neighborhoods that most need it. The city has a plan to buy 18,000 streetlights that would include outdoor WiFi hotspots. These WiFi units can produce decent broadband outdoors, but the strength of a WiFi signal decreases significantly when passing through the exterior walls of buildings. While any broadband is better than nothing, outdoor WiFi is not going to provide the same quality of broadband as a landline connection. Such efforts will likely be welcomed by residents without broadband, but this is still second-rate broadband compared to what households that can afford to buy from the incumbent ISPs receive.

The dilemma for cities is that there is no easy solution to the digital divide. For Syracuse, the problem is mostly affordability, not access. Most of the homes without broadband probably have the option to buy from the incumbent providers. I say most because almost every city still has poor neighborhoods that don’t have the same broadband infrastructure as the rest of the city. I’ve seen estimates that there are nearly as many residences in cities with no broadband option as there are rural homes without broadband. It’s hard to know for sure because the urban areas without broadband are an apartment building here and a dead-end street there rather than whole neighborhoods.

Cities often consider building their own broadband network as a solution to the digital divide. I undertake numerous broadband feasibility studies every year, and almost every city I’ve ever worked for has universal access to fiber as one of its primary goals. However, building fiber or any broadband infrastructure is expensive, and it’s usually hard to justify the cost of providing free or low-cost broadband to low-income homes. It’s challenging in a competitive environment to make enough profit from normal broadband customers to subsidize low-income homes.

We’ve been talking about the digital divide since the late 1990s, when DSL and cable modems were introduced. In my mind, the problem is far worse today than it was then, since broadband has grown to become a necessity of the same magnitude as having electricity or water in a home. Unfortunately, I think the urban digital divide will keep growing as broadband prices climb year after year.

Broadband Usage Continues to Grow

The firm OpenVault, a provider of software that measures data consumption for ISPs, reported that the average monthly data use by households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. The company also reported that the median use per household grew from 103.6 gigabytes in 2017 to 145.2 gigabytes in 2018 – a growth rate of 40%. The median is the midpoint of users, with half of all households above it and half below.

To some degree, these statistics are not news because we’ve known for a long time that broadband usage at homes – both in total download and in desired speeds – has been doubling every three years since the early 1980s. The growth in 2018 was actually a little faster than that historical average, and if the 2018 growth rate were sustained, in three years usage would grow to 235% of today’s level. What I find most impressive about these new statistics is the magnitude of the annual change – the average home used 67 more gigabytes of data per month in 2018 than the year before, a number that would have seemed unbelievable only a decade ago when the average household used a total of only 25 gigabytes per month.
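
Here’s the compounding behind those numbers (the three-year projection is my own extension of OpenVault’s reported growth rate):

```python
# 33% annual growth, compounded for three years.
avg_2017, avg_2018 = 201.6, 268.7          # GB/month, per OpenVault
growth = 1.33                              # the reported 33% annual growth
print(f"implied growth: {avg_2018 / avg_2017 - 1:.0%}")            # ~33%
print(f"three years out: {avg_2018 * growth ** 3:.0f} GB/month "
      f"({growth ** 3:.0%} of 2018 usage)")                        # ~632 GB, 235%
```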

There are still many in the industry who are surprised by these numbers. I’ve heard people claim that now that homes are watching all the video they want, the rate of growth is bound to slow down – but if anything, the rate of growth seems to be accelerating. We also know that cellular data consumption is now doubling every two years.

This kind of growth has huge implications for the industry. From a network perspective, this level of usage puts a big strain on networks. Typically the most strained part of a network is the backbone that connects to neighborhood nodes. That’s the primary stress point in many networks, including FTTH networks, and when there isn’t enough bandwidth to a neighborhood, everybody’s bandwidth suffers. Somebody who designed a network ten years ago would never have believed the numbers that OpenVault is reporting and would likely not have designed a network that would still be sufficient today.

One consequence of the bandwidth growth is that it’s got to be driving homes to change to faster service providers when they have the option. A household that might have been happy with a 5 Mbps or 10 Mbps connection a few years ago is likely no longer happy with it. This has to be one of the reasons we are seeing millions of homes upgrade from DSL to cable modems each year in metropolitan areas. The kind of usage growth we are seeing today has to be accelerating the death of DSL.

This growth should also be affecting policy. The FCC set the definition of broadband at 25/3 Mbps in January of 2015. If that was a good definition in 2015, then the definition should have been increased to 63 Mbps in 2019. At the time the FCC set that threshold I thought they were a little generous. In 2014, as the FCC was having this debate, the average home downloaded around 100 gigabytes per month, and the right definition of broadband was probably more realistically 15 – 20 Mbps, so the FCC was obviously a little forward-looking in setting the definition. Even so, the definition of broadband should be increased – if the right definition in 2014 was 20 Mbps, then the definition ought to be at least 50 Mbps today.
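
The speed math I’m using follows the same doubling rule – a sketch of my own arithmetic, not an FCC formula:

```python
# Speeds that double every three years scale by 2^(years / 3).
def grown_mbps(base: float, years: float) -> float:
    return base * 2 ** (years / 3)

print(f"25 Mbps set in 2015 -> {grown_mbps(25, 4):.0f} Mbps by 2019")  # ~63 Mbps
```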

The current FCC is ignoring these statistics for policy purposes – if they raise the definition of broadband, then huge numbers of homes will be classified as not having broadband. The FCC does not want to do that since they are required by Congressional edict to make sure that all homes have broadband. When the FCC set a realistic definition of broadband in 2015 they created a dilemma for themselves. That 2015 definition is already obsolete, and if they don’t change it, in a few years it is going to be absurd. One only has to look forward three years from now, when the definition of broadband ought to be 100 Mbps.

These statistics also remind us of the stupidity of handing out federal subsidies to build technologies that deliver less than 100 Mbps. We still have two more years of CAF II construction to upgrade speeds to an anemic 10 Mbps. We are still handing out new subsidies to build networks that can deliver 25/3 Mbps – networks that are obsolete before they are completed.

Network designers will tell you that they try to design networks to satisfy demand at least seven years into the future (the average life of many kinds of fiber electronics). If broadband usage keeps doubling every three years, then looking forward seven years to 2026, the average home is going to download 1.7 terabytes per month and will expect download speeds of 318 Mbps. I wonder how many network planners are using that target.
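
Worked out, that seven-year projection looks like this (my own sketch of the same doubling rule, starting from OpenVault’s 2018 average rolled forward one year):

```python
# Demand that doubles every three years grows 2^(7/3) ~ 5x in seven years.
factor = 2 ** (7 / 3)                 # ~5.0x
usage_2019 = 268.7 * 2 ** (1 / 3)     # 2018 average grown one year, ~339 GB
print(f"2026 usage: {usage_2019 * factor / 1000:.1f} TB/month")   # ~1.7 TB
print(f"2026 speed: {63 * factor:.0f} Mbps")                      # ~318 Mbps
```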

The final implication of this growth concerns data caps. Two years ago, when Comcast set a terabyte monthly data cap, they said it affected only a few homes – and I’m sure they were right at the time. However, the OpenVault statistics show that 4.12% of homes used a terabyte per month in 2018, almost double the 2.11% in 2017. We’ve now reached the point where the terabyte data cap is going to have teeth, and over the next few years a lot of homes are going to pass that threshold and have to pay a lot more for their broadband. While much of the industry has a hard time believing the growth statistics, I think Comcast knew exactly what it was doing when it established a terabyte cap that seemed so high just a few years ago.
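
If that share of homes keeps roughly doubling each year – a simple extrapolation of my own, not an OpenVault projection – the cap bites quickly:

```python
# Share of homes using 1+ TB/month, extended at the 2017-2018 growth rate.
share = 0.0412                     # 2018, per OpenVault
rate = 0.0412 / 0.0211             # ~1.95x per year
for year in (2019, 2020, 2021):
    share *= rate
    print(year, f"~{share:.0%}")   # ~8%, ~16%, ~31%
```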

How Bad is the Digital Divide?

The FCC says that approximately 25 million Americans living in rural areas don’t have access to an ISP product that would be considered broadband – currently defined as 25/3 Mbps. That number comes out of the FCC’s mapping efforts using data supplied by ISPs.

Microsoft tells a different story. They say that as many as 163 million Americans do not use the Internet at speeds the FCC considers broadband. Microsoft might be in the best position of anybody in the industry to understand actual broadband performance because the company can see the data speeds of every customer that updates Windows or Microsoft Office – a huge percentage of all computer users, covering every corner of the country.

Downloading a big software update is probably one of the best ways to measure actual broadband performance. Software updates tend to be large files, and the Microsoft servers will transmit them at the fastest speed a customer can accept. Because the files are large, Microsoft gets to see real ISP performance – not just the performance for the first minute of a download. Many ISPs use a burst technology that downloads relatively fast for the first minute or so, but then slows for the rest of a download – a customer’s true broadband speed is the one that kicks in after the burst is finished. The burst technology has a side benefit for ISPs in that it inflates performance on standard speed tests – but Microsoft gets to see the real story.
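
Here’s a toy model of the burst effect (the rates and burst window are hypothetical numbers of my own, just to show why a short speed test and a long download disagree):

```python
# A connection that bursts fast for the first minute, then settles down.
BURST_MBPS, SUSTAINED_MBPS, BURST_SECONDS = 25.0, 4.0, 60.0

def average_mbps(duration_s: float) -> float:
    burst = min(duration_s, BURST_SECONDS) * BURST_MBPS
    steady = max(0.0, duration_s - BURST_SECONDS) * SUSTAINED_MBPS
    return (burst + steady) / duration_s

print(f"30-second speed test: {average_mbps(30):.1f} Mbps")    # 25.0 - all burst
print(f"20-minute download:   {average_mbps(1200):.1f} Mbps")  # ~5 - the real speed
```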

I’ve ranted about the FCC’s broadband statistics many times. There are numerous reasons why the FCC data is bad in rural America. Foremost, the data is self-reported by the big ISPs, who have no incentive to tell the FCC or the public how poorly they are doing. It’s also virtually impossible to accurately report DSL speeds, which vary from customer to customer according to the condition of the specific copper wires and the distance from the DSL hub. We also know that much of the reporting to the FCC represents marketing speeds, or ‘up-to’ speeds, that don’t reflect what customers really receive. Even the manner of reporting to the FCC, by Census block, distorts the results, because when a few customers in a block get fast speeds the FCC assumes that everyone does.
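
The Census block problem is easy to illustrate with made-up numbers:

```python
# If any home in a block can get a fast speed, the whole block is mapped as
# served at that speed. The speeds below are hypothetical.
blocks = {
    "block A": [100, 6, 4, 3],   # one fiber customer, three slow DSL homes
    "block B": [10, 8, 2],
}
for name, speeds in blocks.items():
    print(f"{name}: mapped at {max(speeds)} Mbps; "
          f"slowest home actually gets {min(speeds)} Mbps")
```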

To be fair, the Microsoft statistics measure the speeds customers are actually achieving, while the FCC is trying to measure broadband availability. The Microsoft data includes any households that elect to buy slower broadband products to save money. However, there are not 138 million people who purposefully buy slow broadband (the difference between 163 million and 25 million). The Microsoft numbers tell us that actual speeds in the country are far worse than described by the FCC – and that for half of us, speeds are slower than 25/3 Mbps. That is a sobering statistic, and it doesn’t just reflect that rural America is getting poor broadband, but also that many urban and suburban households aren’t achieving 25/3 Mbps.

I’ve seen many real-life examples of what Microsoft is telling us. At CCG Consulting we do community surveys for broadband, and we sometimes see whole communities where the achieved speeds for customers are lower than the speeds advertised by the ISPs. We often see a lot more households claim to have no broadband or poor broadband than would be expected using the FCC mapping data. We constantly see residents in urban areas complain that broadband with a relatively fast speed seems slow and sluggish.

Microsoft reported their findings to the FCC, but I expect the FCC to ignore them, since they are a drastic departure from the narrative that the FCC is telling Congress and the public. I wrote a blog just a few weeks ago describing how the FCC is claiming that big ISPs are delivering the speeds they market. Deep inside the recent reports the FCC admitted that DSL often isn’t up to snuff – but the Microsoft statistics mean that a lot of cable companies and other ISPs are also under-delivering.

In my mind the Microsoft numbers invalidate almost everything we think we know about broadband in the country. We are setting national broadband policy and goals based upon false numbers – and not numbers that are a little off, but numbers that are largely a fabrication. We have an FCC that is walking away from broadband regulation because it has painted a false narrative that most households in the country have good broadband. It would be a lot harder for politicians to allow broadband deregulation if the FCC admitted that over half of the homes in the country aren’t achieving the FCC definition of broadband.

The FCC has been tasked by Congress with finding ways to improve broadband in areas that are unserved or underserved – with those categories defined by the FCC maps. The Microsoft statistics tell us that there are huge numbers of underserved households, far more than the FCC is recognizing. If the FCC were to acknowledge the Microsoft numbers, they’d have to declare a state of emergency for broadband. Sadly, the FCC has instead doomed millions of homes by declaring them already served with adequate broadband – something the Microsoft numbers say is not true.

The current FCC seems hellbent on washing their hands of broadband regulation, and the statistics they use to describe the industry provide the needed cover for them to do so. To be fair, this current FCC didn’t invent the false narrative – it’s been in place since the creation of the national broadband maps in 2009. I, and many others, predicted back then that allowing the ISPs to self-report performance would put us right where we are today – with statistics that don’t tell the true story. Microsoft has now pulled back the curtain – but is there anybody in a position of authority willing to listen to the facts?

What’s Next for Rural Broadband?

Now that most of the CAF II money and A-CAM money has been awarded, what’s next for rural broadband? If you ask the FCC that question they are likely to answer that there might yet be one more CAF II auction to fund the 261,000 homes that went unclaimed in the last auction. However, I think this is a much bigger question.

There are still tens of millions of homes that don’t have a broadband option that meets the FCC’s current definition of 25/3 Mbps. That includes all of the places funded by the CAF II money given to the big telcos, where the only requirement was broadband speeds of 10/1 Mbps. It also includes numerous other homes that don’t have fast broadband and are mis-categorized by the inadequate FCC broadband maps, which are populated with false data by the big ISPs.

One of CCG’s products is performing surveys and related market research in rural areas. We’ve done a lot of surveys and also asked people to take speed tests in rural communities where the actual speeds at homes are significantly lower than the advertised speeds and the speeds shown on the FCC maps. I’m not just talking about farms, but also about sizable towns like county seats where the broadband is still pretty crappy.

It’s obvious that this FCC is working hard to be able to claim that they have taken care of the rural broadband problem. They want to say that they’ve funded broadband everywhere and that their job is done. What they are never going to admit is that the job will never be done until rural areas have the same kind of broadband infrastructure as cities.

This particular FCC is pretending that the need for broadband is sitting still, when in fact the demand for household broadband – both for speeds and for total download volumes – keeps doubling every three or four years. By the time the current FCC chairman has been in his seat for four years, the comparative quality of rural broadband will have halved due to this increase in demand.

Don’t interpret what I just said to mean that I have disdain only for the current FCC. The last FCC under Chairman Tom Wheeler was a huge contributor to the problem when it awarded billions of dollars to the big telcos to make broadband upgrades over seven years to 10/1 Mbps – at a time when 10/1 Mbps already didn’t meet the definition of broadband. That was obviously a political decision, since the original plan was to award all of the CAF II funds by reverse auction – which would have helped to fund a lot of rural fiber.

Even if the FCC was highly motivated to solve the rural broadband gap, it doesn’t have the tools to do so. The FCC’s only tool for funding more broadband is the Universal Service Fund. I wrote a blog last week noting how this fund is already overcommitted. Since I wrote that blog I looked at my own cellphone bills, and my family alone is contributing several hundred dollars per year towards the USF. We are not going to get the many billions we need to expand broadband by taxing landline and cellphone users.

The fix needs to come from Congress. That doesn’t seem likely from the current Congress, which already approved a $600 million fund for rural broadband grants and then attached a provision that made the grants nearly impossible to implement. Clearly influenced by lobbyists, Congress added a rule that the grants can’t be used in areas where more than 10% of homes already have 10/1 Mbps broadband – and very few areas can pass that test.

I honestly have a hard time understanding Congress’s reluctance to address rural broadband. When I go to rural counties these days I’m told that getting better broadband has become the number one local issue. I know that rural folks and rural politicians are pleading with their state and national representatives to find broadband funding.

I also know that most politicians say they are in favor of rural broadband. I’ve only seen a handful of politicians in the last decade who told their constituents that they don’t support rural broadband funding. I’ve also found that rural broadband is a nonpartisan issue and at the local level politicians of both parties understand that communities need better broadband.

I wish I could end this blog by suggesting a solution for the problem, but there isn’t any unless the states and the federal government decide at some point to help. State broadband programs providing matching grants have seen some success. I’m sure that federal matching grants would also help as long as they weren’t structured to be giveaways to the big ISPs.

The Definition of Broadband

The FCC recently issued a Notice of Inquiry (NOI) seeking input on next year’s broadband progress report. As usual – and perhaps as it will every year into the future – this annual exercise stirs up the industry as we fight over the regulatory definition of broadband speed. That definition matters because Congress has tasked the FCC with making sure that everybody in the country has access to broadband. Today broadband is defined as 25 Mbps downstream and 3 Mbps upstream, and households that can’t buy that speed are considered underserved if they can get some broadband and unserved if they have no broadband option.

The NOI proposes keeping the 25/3 Mbps definition of broadband for another year. The FCC knows that if it raises the definition, millions of homes will suddenly be considered underserved. However, the FCC is bowing to pressure and this year will gather data to see how many households have access to 50/5 Mbps broadband.

It was only a year ago when this FCC set off a firestorm by suggesting a reversion to the old definition of 10/1 Mbps. That change would have instantly classified millions of rural homes as having adequate broadband. The public outcry was immediate, and the FCC dropped the idea. For last year’s report the FCC also considered counting mobile broadband as a substitute for landline broadband – another move that would have reclassified millions into the served category. The FCC is not making that same recommendation this year – but they are gathering data on the number of people who have access to cellular data speeds of 5/1 Mbps and 10/3 Mbps.

The FCC has also been tasked by Congress with getting faster broadband to schools. This year’s NOI recommends keeping the current FCC goal for all schools to immediately have access to 100 Mbps per 1,000 students, with a longer-term goal of 1 Gbps per 1,000 students.

Commissioner Jessica Rosenworcel has suggested in the current NOI that the official definition of broadband be increased to 100 Mbps download. She argues that our low target for defining broadband is why “the United States is not even close to leading the world” in broadband.

I think Commissioner Rosenworcel is on to something. The gap between the fastest and slowest broadband speeds is widening. This year both Comcast and Charter are unilaterally raising broadband speeds to customers. Charter kicked up the speed at my house from 60 Mbps to 130 Mbps a few weeks ago. AT&T is building fiber to millions of customers. Other fiber overbuilders continue to invest in new fiber construction.

The cable companies decided a decade ago that their best strategy was to stay ahead of the speed curve. This is at least the third round of unilateral speed increases that I can remember. A customer who purchased and kept a 20 Mbps connection a decade ago is probably now receiving over 100 Mbps for that same connection. One way to interpret Commissioner Rosenworcel’s suggestion is that the definition of broadband should grow over time to meet the market reality. If Charter and Comcast both think that their 50 million urban customers need speeds of at least 100 Mbps, then that ought to become the definition of broadband.

However, a definition of broadband at 100 Mbps creates a major dilemma for the FCC. The only two widely deployed technologies that can achieve that kind of speed today are fiber and the cable companies’ hybrid fiber/coaxial networks. As I wrote just a few days ago, there are new DSL upgrades that can deliver up to 300 Mbps to customers within 3,000 – 4,000 feet of a DSL hub – but none of the US telcos are pursuing the technology. Fixed wireless technology can deliver 100 Mbps – but only to customers living close to a wireless tower.

If the FCC were to adopt a definition of broadband at 100 Mbps, they would finally be recognizing that the fixes for rural broadband they have been funding are totally inadequate. They spent billions in the CAF II program to bring rural broadband up to 10/1 Mbps. They are getting ready to give out a few more billion in the CAF II reverse auction, which will do the same, except for the few grant recipients that use the money to help fund fiber.

By law, the FCC would have to undertake programs to bring rural broadband up to a newly adopted 100 Mbps standard. That would mean finding many billions of dollars somewhere. I don’t see this FCC being bold enough to do that – they seem determined to ignore the issue hoping it will go away.

This issue can only be delayed for a few more years. The country is still on the curve where the need for broadband at households doubles every three or so years. As the broadband usage in urban homes grows to fill the faster pipes being supplied by the cable companies, it will become more apparent each year that the right definition of broadband is a lot faster than the FCC wants to acknowledge.

The Lack of Broadband Competition

There is one statistic from the FCC annual report on the state of broadband that I’ve been meaning to write about. There is still a massive lack of broadband competition at speeds that most households are coming to think of as broadband.

Here are the key statistics from that report:

  • 13% of all households can’t get broadband that meets the FCC’s definition of 25/3 Mbps
  • 31% of homes have access to 25/3 Mbps, but not speeds of 100 Mbps
  • 15% have access to 100 Mbps from more than one provider
  • 41% have access to 100 Mbps from only one provider

It’s the last statistic that I find astounding. The current FCC declared with this report that the state of broadband in the country is healthy and that the market is taking care of the country’s broadband needs. I’ve written numerous blogs about the households in the bottom 13% that have little or no broadband, but I want to look closer at the top two categories.

Households in the 15% category are in markets where there is a fiber provider in addition to the incumbent cable company. The biggest fiber provider is still Verizon FiOS, but there are numerous others building fiber like AT&T, CenturyLink, Google Fiber, smaller telcos, small fiber overbuilders and municipalities.

This means that 41% of households (51 million homes) only have one option for fast broadband – the cable company. I see numerous problems related to this huge monopoly that has been won by the big cable companies. Consider the following:

  • The US already has some of the most expensive broadband in the developed world. The high prices are directly the result of the lack of competition.
  • This lack of competition is likely the driving factor for why most of the big ISPs in the US are rated at the bottom of all US corporations in terms of customer service. We know that customer service improves in markets where there is broadband competition, but the big ISPs don’t make the same effort elsewhere.
  • We also know that competition between a cable company and a smaller fiber overbuilder lowers broadband prices. For example, there are markets where competitors like Google have set the price of a gigabit connection at $70, and the cable companies generally come close to matching the lower price. But preliminary pricing from Comcast and Charter for their new gigabit products in markets with no competitors is significantly north of $100 per month.
  • Even where there are competing networks, if both networks are owned by large ISPs we see duopoly competition where the big ISPs don’t push each other on price. For example, Comcast is largely able to charge the same prices when competing against Verizon FiOS as it does in markets where there is no fiber provider.
  • Industry analysts expect the big ISPs to start raising broadband rates for various reasons. The ISPs continue to lose telephone and cable customers and the national penetration rate for broadband is nearing a market saturation point. In order to satisfy Wall Street the big ISPs will have little choice other than raising broadband prices to maintain earnings growth.

I’m sure that the households in the bottom 13% of the market that can’t get good broadband are not sympathetic to those who can only buy fast broadband from one provider. But these statistics say that 41% of the whole market is dealing with a monopoly for fast broadband. Telecom is supposed to be a competitive business – but for the majority of the country the competitors have never shown up. For the FCC to declare that we have a healthy broadband market astounds me when so many households are hostage to a broadband monopoly.

There is always the chance that over the next decade fixed 5G will bring more broadband competition. My guess, however, is that at least for a few years this is going to be a lot more competition by press release than real competition. Deploying gigabit 5G like the big ISPs are all touting is going to require a lot more fiber than we have in place today. Deploying 5G without fiber backhaul might still result in decent broadband, but it’s not going to be the robust gigabit product that the ISPs are touting. But even poorly deployed 5G networks might bring 100+ Mbps broadband to a lot more homes after the technology gets a little more mature.

Unfortunately there is also the risk that 5G might just result in a lot more duopoly competition instead of real competition. If 5G is mostly deployed by big ISPs like Verizon and AT&T there is no reason to think that they will compete on price. Our only hope for real market competition is to see multiple non-traditional ISPs who will compete on price. However, it’s so tempting for ISPs to ride the coattails of the big ISPs in terms of pricing that 5G might bring more of the same high prices rather than real competition.

Setting the FCC Definition of Broadband

In the recently released 2018 Broadband Progress Report the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record favoring either eliminating the standard altogether or reverting to the older definition of 10/1 Mbps.

I’m guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted to allow cellular data to count the same for a household as landline broadband – and that desire was a big factor behind the push to lower the definition, since cellphones rarely meet the 25/3 speed standard.

The deliberation on the topic this year raises the question of whether there is a better way to define the broadband speed households need. It’s worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to use broadband, adding together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC’s method was too simple because it assumed that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four, which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that’s not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP connection and your home router try to receive and untangle multiple simultaneous streams, packets collide, get lost, and have to be retransmitted. This is described as adding ‘overhead’ to the transmission process. Depending on the nature of the data streams, the amount of collision overhead can be significant.

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to the various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I’ve covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your home – it also slows down when it pauses to recognize demands for connection from your neighbors’ WiFi networks.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to pull 25 Mbps from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.
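
A back-of-envelope version of that point (the overhead percentages are illustrative assumptions of mine, not measured values):

```python
# Why simultaneous streams need more than their simple sum.
streams_mbps = [5.0, 5.0, 5.0]   # e.g., three HD Netflix streams
wifi_overhead = 0.15             # assumed WiFi protocol/contention overhead
retransmit = 0.10                # assumed retransmission overhead per extra stream

naive = sum(streams_mbps)
needed = naive * (1 + wifi_overhead) * (1 + retransmit * (len(streams_mbps) - 1))
print(f"naive sum: {naive:.0f} Mbps; realistic need: ~{needed:.0f} Mbps")  # 15 vs ~21
```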

The FCC’s definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it’s available. That means a download stream of 15 – 20 Mbps. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

Finally, the FCC’s speed definition needs to consider the busy hour of the day – the time when a household uses the most broadband – because that’s the speed the home really needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that measures actual download speeds at a number of homes over time to understand what homes are really using. There are ways this could be done. For example, the FCC could do for broadband something similar to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage, such as Akamai, to sample a large number of US homes. A voluntary sample of homes meeting specific demographics could be monitored for bandwidth usage. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC’s definition of broadband. Rather than changing the official speed periodically, the FCC could change the definition as dictated by the real-world data.
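
As a sketch of how that could work (the panel data below is entirely hypothetical), the FCC could set the definition from the distribution of measured peak-hour demand:

```python
import statistics

# Hypothetical peak-hour demand (Mbps) from a demographically weighted panel.
panel_peak_mbps = [12, 18, 22, 25, 31, 38, 44, 52, 61, 75]

print(f"median peak-hour demand: {statistics.median(panel_peak_mbps)} Mbps")
print(f"90th percentile: {statistics.quantiles(panel_peak_mbps, n=10)[8]:.0f} Mbps")
```

Pegging the definition to a high percentile rather than the median would keep it ahead of demand instead of trailing it.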

The FCC does some spot checking today of the broadband speeds reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn’t provide the same kind of feedback that a formal ongoing measurement program would. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors in setting the official definition of broadband, and so perhaps the FCC doesn’t want real facts to get in the way.

Regulating From Broadband Maps

One of the more bizarre things we do in the US is regulate broadband based upon broadband maps. There are numerous federal grant and subsidy programs that rely upon these maps (and the underlying databases that support them) as well as various state programs. The FCC also uses this same data when reporting broadband penetration in the country to Congress each year, as just occurred on February 9.

The maps are intended to show how many households can purchase broadband of various speeds. Currently the arbitrary speed thresholds tested are download speeds of 10 Mbps, 25 Mbps and 100 Mbps. These speeds are measured due to past decisions by the FCC. For example, the FCC chose a 10/1 Mbps speed goal for any company that accepted CAF II money to upgrade rural broadband. The FCC’s current definition of broadband is still set at 25/3 Mbps.

Anybody that understands broadband networks knows that much of the data included in the databases and the mapping is incorrect, and sometimes pure fantasy. That makes sense when you understand that the speeds in this mapping process are all self-reported by ISPs.

There are numerous reasons why the speeds in these databases are not an accurate reflection of the real world:

  • There are still ISPs that report advertised speeds rather than actual speeds received by customers.
  • Any speeds represented for a whole DSL network are inaccurate by definition. DSL speeds vary according to the size of the copper wires, the condition of the copper cable, and the distance from the source of the DSL signal. That means that in a DSL network the speeds available to customers vary street by street, and even house by house (the sketch after this list illustrates the falloff). We’ve always known that the DSL speeds reported in the mapping databases are overstated and that most telcos report theoretical speeds. I’m not sure I blame them, but the idea of any one speed being used to represent the performance of a DSL network is ludicrous.
  • The speeds in the database don’t recognize network congestion. There are still many broadband networks around that bog down under heavy usage, which means evenings in a residential neighborhood. Nobody wants to be told that their network is performing at 10 Mbps if the best speed they can ever get when they want to use it is a fraction of that.
  • The speeds don’t reflect that ISPs give some customers faster speeds than others. In networks where bandwidth is shared among all users on a neighborhood node, if a few customers are sold a faster-than-normal speed, then everybody else suffers correspondingly slower speeds. Network owners can deliver extra speed to customers who pay a premium, but to the detriment of everybody else.
  • The maps don’t reflect the way networks were built. In most towns you will find homes and businesses that were somehow left out of the initial network construction. For example, when cable networks were first built they largely ignored business districts that didn’t want to buy cable TV. There are lots of cases of apartment and subdivision owners that didn’t allow in the incumbent telco or cable company. And there are a lot of homes that just got missed by the network. I was just talking to somebody in downtown Asheville, where I live, who is not connected to the cable network for some reason.
  • Not all ISPs care about updating the databases. There are many wireless and other small ISPs that don’t update the databases every time they make some network change that affects speeds. In fact, there are still some small ISPs that just ignore the FCC mapping requirement. At the other extreme there are small ISPs that overstate the speeds in the databases, hoping that it might drive customer requests to buy service.
  • One of the most insidious speed issues in networks is the data burst that many ISPs frontload into their broadband products. They send a fast burst of speed for the first minute or two of any demand for bandwidth. This improves the customer experience, since a large percentage of requests are for web searches or other short uses of bandwidth. Any customer with this feature will get much faster results from a speed test than their actual long-use speeds, since the test measures only the burst. A rural customer on a burst product might see 4 Mbps on a speed test and still find themselves unable to maintain a connection to Netflix.
  • Sometimes there are equipment issues. The best-known case of this is a widespread area of upstate New York where Charter has kept old DOCSIS 1.0 cable modems in homes that are not capable of receiving the faster data speeds the company is selling. It’s likely that the faster network speed is what is included in the database, not the speed that is choked by the old modems.
  • And finally, speed isn’t everything. Poor latency can ruin the utility of any broadband connection, to the point where the speed is not that important.
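
To illustrate the DSL point from the list above, here’s a crude rate-vs-distance curve (the numbers are rough ADSL2+-style assumptions, not measurements):

```python
# Why one reported speed can't describe a whole copper network: DSL rates
# fall off with loop length. The linear falloff here is a simplification.
def dsl_mbps(loop_feet: float) -> float:
    return max(1.0, 24.0 - loop_feet * 23.0 / 15_000)

for feet in (1_000, 5_000, 10_000, 15_000):
    print(f"{feet:6,d} ft loop -> ~{dsl_mbps(feet):4.1f} Mbps")
```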

Unfortunately, most of the errors in the broadband databases and maps overstate broadband speeds rather than under-report them. I’ve worked with numerous communities and talked to many people who are not able to get the broadband speeds suggested by the FCC databases for their neighborhoods. Many times the specific issue can be pinned down to one of the causes above. But that’s no consolation for somebody who is told by the FCC that they have broadband when they don’t.