NTIA Proposes BEAD Performance Measurement Rules

One of the many requirements for BEAD winners will be to regularly report customer speeds after networks are built. NTIA recently issued a draft of the measurement requirements, and the final rules should be similar.

Following are a few highlights of the measuring requirements:

  • Tests have to be done twice per year through the federal interest period. That means ten years for fiber, with the ten years starting when a network is completed.
  • At least 95% of speed tests must meet 100/20 Mbps for normal residential and business passings and 1/1 Gbps for anchor institution passings.
  • Latency must be under 100 milliseconds.
  • ISPs must provide outage reports showing that the network was out of service no more than 48 hours during a year.
  • Speed tests can be done using the MBA testing program (which the FCC has discontinued) or can rely on network management tools and software that allows for testing.
  • Tests are to be administered to 10% of the customers of a given ISP across a whole state for each technology deployed. If an ISP deploys both fiber and fixed wireless across multiple projects in a state, it would have to test 10% of the fiber customers and 10% of the wireless customers. ISPs can elect to test more than 10%.
  • The customers for the speed tests must be chosen at random, using a publicly available random sampling program.
  • ISPs must upgrade customers to the target speed during the speed test period. If an ISP offers a 50/10 Mbps package, the customer must be upgraded during the testing period to 100/20 Mbps. The same rule applies to anchor institutions that might be buying something less than a gigabit product.
  • One requirement that will drive folks crazy is that test locations must match the FCC broadband map – and every ISP understands there is a difference between the FCC maps and real life.
  • Tests are to be done between the customer gateway and an Internet exchange point in the closest of New York City, Washington DC, Atlanta, Miami, Chicago, Dallas-Fort Worth, Los Angeles, San Francisco, Seattle, Denver, Salt Lake City, St. Paul, Helena, Kansas City, Phoenix, or Boston.
  • The testing period is one week, with tests required between 6:00 PM and 12:00 AM (local prime time). Testing must be done every hour. Tests should last at least 10 to 15 seconds.
  • To comply with the speed standard, certified test results for each state or territory and speed tier must show that 80% of the speed tests are at or above 80% of the required speed. For example, for projects that have committed to 100/20 Mbps, 80% of measurements must meet or exceed 80/16 Mbps.
  • ISPs that don’t pass the test requirements must report to the State within 15 days of completing the tests and must begin testing quarterly.
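The sampling and pass/fail arithmetic in the bullets above is simple enough to sketch. This is a hypothetical illustration of the draft rules, not NTIA's actual tooling; the function names and data shapes are my own assumptions.

```python
import random

def select_test_panel(customer_ids, fraction=0.10, seed=None):
    # Draft rule: test a random sample of at least 10% of customers per
    # technology, chosen with a publicly available sampling method.
    rng = random.Random(seed)
    count = max(1, round(len(customer_ids) * fraction))
    return rng.sample(customer_ids, count)

def meets_speed_standard(tests, down_req=100.0, up_req=20.0):
    # Draft rule: 80% of certified tests must be at or above 80% of the
    # required speed -- e.g., 80/16 Mbps for a 100/20 Mbps commitment.
    passing = [t for t in tests
               if t["down"] >= 0.8 * down_req and t["up"] >= 0.8 * up_req]
    return len(passing) / len(tests) >= 0.80
```

Note that the draft contains both a 95%-at-full-speed requirement and the 80%-of-80% compliance test; the sketch above implements only the latter.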

There are some interesting aspects to these rules. Reporting goes to State broadband offices, which assumes these offices will remain staffed for the next decade. A lot of states only created a broadband office because of the Capital Projects Fund and BEAD funding, and this requirement implies they will have to maintain some staffing for a long time. It also implies that States must have somebody on board who can verify that ISPs are testing properly. It’s not hard to envision some States losing interest in broadband once most rural areas are served.

The tests should be no challenge for a fiber network since tests are to the router and don’t include the impact of indoor WiFi. It almost seems like a waste to make a fiber network do these tests for ten years. But there will likely be wireless networks that will not meet the test requirements everywhere, and it will be interesting to watch satellite performance over time.

The biggest question not addressed in the rules is what States will do if ISPs fail the tests. Experience from past federal speed test programs suggests there will be few repercussions unless somebody fails dramatically.

One of the parts of the test program that ISPs are going to find troubling is that NTIA seemingly wants them to provide a detailed list of every customer and the speed they are buying. ISPs have never been required to submit data at that level of detail, for the basic reason that a customer list is probably the most important trade secret of every ISP. I have to think this will change before implementation.

A New Complaint About BEAD Maps

Earlier this month, the National Rural Electric Cooperative Association (NRECA) made an ex parte filing with the FCC warning that the current FCC maps do not reflect the reality of rural broadband on the ground. NRECA warns that there are a lot of places that need broadband but will not be covered by BEAD, and that areas that don’t get BEAD funding now will be left behind.

Anybody who reads this blog knows that I’ve been making this same argument for the last several years. As NRECA points out, the new FCC broadband maps are a big step up from the previous FCC mapping. The old maps reported broadband by Census block, and in doing so often showed an area as having good broadband when only a few places in the block had a faster technology.

NRECA points out that the fatal flaw in the new maps is that ISPs self-report broadband speeds and are free to report marketing speeds instead of something closer to what is actually delivered to customers. It was an interesting policy choice for the FCC to make, since this doesn’t match what the FCC is doing elsewhere. For example, the FCC requires ISPs to report broadband speeds for each product on the new broadband labels. The rules for the labels suggest that ISPs should report speeds that have some basis in internal ISP speed testing. In my early examination of broadband labels, most ISPs are ignoring this requirement and claim the same speed on the broadband labels that they report on the FCC maps.

ISPs know the speeds they are delivering to customers. Most broadband networks have the ability to measure speeds from their core network to the customer location – a speed that doesn’t get influenced by the performance of WiFi inside of a customer premise.

I’ve seen numerous examples of speeds reported for the FCC maps that are far faster than speeds measured by speed tests. For example, I recently found a big telco reporting 100/20 Mbps for rural DSL, while Ookla speed tests show an average speed around 25 Mbps – and no speed test faster than 40 Mbps. I’ve seen the same thing for WISPs and FWA cellular broadband, where Ookla speed tests are far slower than what is being reported to the FCC. There are cable companies with no speed tests of upload speeds faster than 20 Mbps, which means their areas should be eligible for BEAD grant funding.

In its filing, NRECA suggested that the public should be allowed to take speed tests to report to the FCC. The FCC certainly has the ability to crowdsource speed tests since it already does so for cellular broadband – customers can take a cellular speed test using an FCC speed test app. I can’t think of any reason why the FCC couldn’t collect speed tests directly from customers using the same or a similar app. I’m also mystified why the FCC couldn’t partner with one of the big speed test sites like Ookla to gather the many millions of speed tests that are already being taken every day.

ISPs do not want customer speed tests to be part of the equation, and they have some valid arguments that the results of any given speed test can’t be trusted. Every ISP will tell you that a big part of the problem that customers have with broadband is the WiFi signal inside their home. They might have old or inadequately configured routers. They might be taking speed tests from a computer located far from the in-home router.

But interestingly, if you gather enough speed tests, a good picture of broadband performance emerges. I’ve always focused on the maximum speeds measured for a given ISP. If an ISP says it can deliver 100 Mbps or gigabit speed, then there should be some speed tests close to that speed. My experience is that looking at large numbers of speed tests will quickly identify ISPs who are reporting speeds far faster than what they are delivering.
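That "look at the maximums" heuristic is easy to sketch. This is a minimal illustration, and the 70% cutoff and function name are my own assumptions, not anyone's official methodology:

```python
def claim_looks_overstated(claimed_mbps, measured_mbps, tolerance=0.7):
    # A single slow test proves nothing (old routers, distant WiFi),
    # but across many tests the *fastest* result should come close to
    # the advertised speed. If even the best test falls well short,
    # the reported speed is suspect.
    return max(measured_mbps) < tolerance * claimed_mbps
```

For the rural DSL example above – 100 Mbps claimed, no test faster than 40 Mbps – this flags the claim.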

To be fair to ISPs, speed tests can also show the opposite. I’ve seen ISPs that claim a speed on the FCC maps of 25 Mbps or 50 Mbps but are delivering much faster speeds.

The NRECA is absolutely right about the BEAD grants. There are a lot of rural areas that will be excluded from BEAD because of overstated broadband speeds. Broadband offices and the NTIA will say that there is a BEAD map challenge process to address this issue. But I could write a whole series of blogs describing the ridiculous steps that NTIA requires to mount a successful map challenge for BEAD. Even if the map challenge process were reasonable, many local governments don’t have the resources or budget to mount a serious challenge. This means counties that were unable to mount a successful BEAD map challenge have a good chance of having locations improperly excluded from BEAD.

When the dust settles from BEAD grants, there are going to be a whole lot of rural neighborhoods that will not get a broadband solution – and they are going to be vocal about it. My prediction is that this is going to end up being a few million such rural locations. As much as the industry wants to pretend that BEAD is going to solve the rural broadband issue, anybody who looks closely at the FCC maps and the BEAD process knows this is not the case.

Comparing State Broadband Performance

Ookla recently published a report that compares broadband connectivity and performance in each state. The report highlights the percentage of broadband customers who are receiving broadband speeds that meet the FCC’s definition of broadband of 100/20 Mbps.  This is also the speed threshold being used for the $42.5 billion BEAD grant program, which is supposed to provide grants to every part of the country that can’t achieve 100/20 Mbps. Ookla is the largest and most commonly used speed test in the country and receives millions of tests each day, so these comparisons are based on huge numbers of speed tests.

The Ookla results are interesting and give states a way to compare themselves to peer states. Connecticut, North Dakota, Maryland, Delaware, Rhode Island, and Tennessee had the highest percentage of speed tests that met the threshold of 100 Mbps downstream and 20 Mbps upstream. Each of these states had over 62% of speed tests faster than 100/20 Mbps – with Connecticut at 65.8% and Tennessee at 62.2%.

Ookla also got more granular in its analysis. For example, it compared average speed test results in each state for urban and rural broadband customers. There is a map in the report that industry folks are going to want to explore. This comparison produced some interesting results:

  • Connecticut, which has the overall highest percentage of homes receiving 100/20 Mbps, had 72.4% of urban households and 62.3% of rural households receiving that speed. The second-fastest state overall was North Dakota, at 69.7% urban and 64.6% rural.
  • The state with the biggest urban/rural digital divide was Washington, with 61.1% of urban households and only 28.7% of rural households receiving 100/20 Mbps.
  • South Carolina has a higher percentage of rural homes receiving fast speeds (56.4%) than urban homes (55.1%). The other states where urban and rural broadband performance is similar are North Dakota and Nevada.
  • Some of the most populous states had low rural broadband speeds including Illinois (38.7%), New York (39.4%), and California (40.1%).
  • The states with the lowest percentage of rural homes meeting 100/20 Mbps are also the least densely populated – Alaska (17.3%), Montana (20.8%), and Wyoming (25.3%).
  • The other states with rural coverage under 40% include New Mexico (29.4%), Wisconsin (31.4%), Oregon (32.2%), Idaho (34.1%), Michigan (37.5%), and Maine (37.6%). These are the states that will require a heavy lift from BEAD grants.
  • Some states are probably surprising to those outside of the industry. The best example is Mississippi, which historically had poor broadband coverage. However, the analysis shows urban coverage at 62.3% and rural at 56.6%. There is a lot of industry derision aimed at the RDOF program, but that program enabled rural electric coops in the state to build fiber.
  • Finally, a few states showed big improvements between the first two quarters of 2023 and the first two quarters of 2024. The states with the biggest improvements are New Mexico (50%), Arizona (45%), Minnesota (38%), and Nevada (37%).

Anybody who looks closely at speed test results will quickly understand that any given speed test might not be accurate because of issues inside a home. A home might receive adequate broadband, but an old or underperforming WiFi router might lower the speed delivered to devices. WiFi is also subject to distance and interference issues, and computers located at the far end of a house might receive significantly slower speeds.

However, when taken in aggregate, speed tests provide an accurate comparison – since WiFi is a problem everywhere, it drags down results in every state roughly equally. This means every state actually has a higher percentage of homes receiving 100/20 Mbps than the Ookla numbers show. But the relative differences between states, or between urban and rural parts of states, are believable.

Grants and Upload Speeds

The NTIA set a new definition of broadband at 100/20 Mbps for purposes of the BEAD grants – a customer below that threshold is considered either unserved or underserved. Everybody nationwide has been so focused on download speeds that we are largely ignoring the fact that a huge number of broadband customers are not getting upload speeds of 20 Mbps. All of the speed test efforts I’ve seen have focused on whether homes and businesses receive 100 Mbps download and have largely ignored the implications of customers not achieving the NTIA’s 20 Mbps upload threshold.

My consulting firm helps clients conduct a lot of speed tests, and I have also been poring through the large number of speed tests gathered by Ookla and MLabs. I mostly work in rural counties, county seats, and suburban cities. I would venture to say that the vast majority of speed tests we see from cable customers do not meet the 20 Mbps upload threshold. The same is true for a large percentage of WISPs.

Sometimes the evidence is overwhelming. I recently worked with a county seat of about 20,000 people where the only customers in the community seeing upload speeds of 20 Mbps or faster were those who subscribed to the cable company’s gigabit product. Not one other cable customer had a 20 Mbps upload speed, and most weren’t even close, with an average of 11 Mbps. This was true for customers buying 100 Mbps, 200 Mbps, and 400 Mbps download products.

This raises an interesting question, which I’m sure is going to be at the core of the cable company’s response. In that particular city, the gigabit customers were getting upload speeds between 30 Mbps and 40 Mbps. I’m sure the cable company will argue that since a few customers are getting speeds over 20 Mbps, the network is capable of faster speeds.

I’ve talked to several knowledgeable engineers on the topic, and they tell me that the cable company in this case could not give faster speeds to everybody – or they would. The cable company is somehow giving a preference for gigabit customers at the expense of everybody else. If the cable operator opened the gates for everybody to get the fastest upload speed possible, the likely outcome would be that the gigabit customer speeds would drop to match everybody else’s speeds – the other customers would not get any faster.

This is an interesting question for state broadband grant offices to consider because it’s inevitable that people are going to seek grants where there is a cable company operating, using the argument that the cable company doesn’t meet the NTIA’s definition of broadband.

It makes sense to me that an ISP must meet both components of the speed definition to be considered as served. It shouldn’t matter if an ISP misses on the download or upload speed – if it fails one of the two benchmarks, it is not meeting the NTIA’s definition of served. If you don’t believe that logic, consider an ISP that is delivering 50/20 Mbps on licensed fixed wireless. I think there would be a consensus that this customer is not served since it is achieving only half of the definition of download speeds. But isn’t the same true for an ISP that is delivering 120/10 Mbps broadband?
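The served/unserved logic argued for here reduces to requiring both components at once. A minimal sketch of that rule (the function name and defaults are mine, not NTIA's):

```python
def is_served(down_mbps, up_mbps, down_req=100.0, up_req=20.0):
    # A location counts as served only if BOTH benchmarks are met;
    # failing either component means the location is not served.
    return down_mbps >= down_req and up_mbps >= up_req

# Both examples from the text fail, for symmetric reasons:
# 50/20 fixed wireless fails on download; 120/10 cable fails on upload.
```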

To be fair to cable companies, they deliver speeds greater than 20 Mbps in many markets. I buy 400 Mbps download from Charter and routinely see upload speeds of 30 Mbps. But we all know that the performance of cable companies varies widely from town to town, and often inside of a town.

I had to laugh last year when the big cable companies fought so hard to reduce the definition of served from 100/100 Mbps to 100/20 Mbps. I knew then that this battle would be coming since the majority of cable customers, at least in the markets I have studied, are not seeing upload speeds of 20 Mbps.

One thing I think we can all count on is that if any grant office awards funding to overbuild a cable company because of this issue, we’re going to see the cable industry go ape. They’ve been quiet about the poor upload speeds, but they won’t stay that way if they see grant money coming to overbuild them.

What We’ve Learned About Upload Bandwidth

Until the pandemic hit, I rarely thought about upload bandwidth. I mostly used upload bandwidth to send files to people, and I rarely cared if they received the files immediately – I was happy as long as files got sent. But the pandemic changed everything for millions of people. All of a sudden, homes were unable to function well due to problems with uploading.

The big change from the pandemic came when many millions of people were sent home to work while students were sent home to attend school remotely. It turns out that connecting to schools and offices requires steady and reliable upload bandwidth, and many homes found they didn’t have that. My consulting firm has done several surveys per month during the pandemic, and we routinely have seen that at least 30% of those working or schooling from home, including those using cable company broadband, say that their bandwidth was not adequate for the needs created by the pandemic. Homes that tried to accommodate multiple people working online at the same time had the worst experiences.

We also changed a lot of other behavior during the pandemic that uses more upload bandwidth. Many who work from home started using software that automatically saves all work in the cloud. We started using collaborative software to connect to others working from home. And we began making Zoom calls to such an extent that this is now the largest use of upload broadband nationwide and has grown from practically nothing to consume over 15% of all upload broadband usage. Spending more time at home led millions to take up gaming – an activity that just started transitioning to the cloud before the pandemic.

We also got a stark reminder that broadband technologies are shared services. We saw that even homes with only one person working at home could suffer if the bandwidth for the whole neighborhood got bogged down from overuse.

It seems that everybody started collecting speed tests to figure out what was going wrong. Local governments, States, and the NTIA started gathering and looking at speed test results. We know that an individual speed test result is not reliable, but we’ve seen that masses of speed tests tell a great story about a given ISP in a given community.

We also learned that broadband networks vary by neighborhood – something that I don’t recall ever being discussed before the pandemic. Speed tests often showed that the performance of a cable company in a city could be drastically different by neighborhood. There have always been those who complained about cable company broadband, but they weren’t taken seriously by those in the same town that had adequate broadband. But we now often see some parts of cities with speeds drastically lower than the rest of the city – something cable companies have known about but never fixed.

We learned how awful rural broadband technologies can be when most rural folks had problems working and schooling from home. We figured this out when speed tests showed that rural upload speeds are often less than 1 Mbps.

Lately, I’ve been learning more about jitter, which measures the variation in latency over time. Many people learned about jitter the hard way when they got booted from school connections or Zoom calls as latency fluctuated and spiked.
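For readers who want to quantify it: jitter is commonly computed from the differences between consecutive latency samples. Here is a simplified sketch (real-time protocols like RTP, per RFC 3550, use an exponentially smoothed version of the same idea):

```python
from statistics import mean

def jitter_ms(latency_samples_ms):
    # Mean absolute change between consecutive latency measurements.
    # A steady 100 ms connection has zero jitter; a connection that
    # bounces between 20 ms and 80 ms has high jitter even though its
    # average latency looks fine.
    diffs = [abs(b - a)
             for a, b in zip(latency_samples_ms, latency_samples_ms[1:])]
    return mean(diffs)
```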

We also learned that cable companies use the worst spectrum on a cable system for upload. Data rides on spectrum inside the coaxial cable, and the slice of that spectrum assigned to upload is the part most prone to interference from microwave ovens, small engines, and background radiation.

We’ve also learned that the pandemic has been good for the ISPs, although they aren’t talking about it. Millions of homes upgraded to faster broadband to try to get enough bandwidth during the pandemic. Unfortunately for many of them, their problem was not the download speeds, but the upload speeds, and the upgrade may not have brought much of a solution.

Challenging the FCC Broadband Maps

I’ve written many times about the absurdity of using the FCC mapping data for identifying areas with or without broadband. I’ve lately been looking at the FCC mapping data in West Virginia and New Mexico – two of the states with the worst broadband coverage in the country – and the FCC maps are atrocious. I see counties where the claimed broadband coverage in the FCC maps is wrong for more than half of the geographic area.

Unfortunately, the FCC is about to award $20.4 billion in RDOF grants later this year based solely on these dreadful maps. Luckily, there are other grant programs that allow grant applicants to challenge the FCC data. This includes the USDA ReConnect grants and many of the state grant programs.

One of the few ways to challenge the FCC maps is with speed tests. Anybody undertaking such a challenge needs to be aware that the incumbent telcos might challenge your speed test results, and unfortunately, some of their criticisms will be right. This means that anybody challenging the FCC maps has to take some steps to maximize the effectiveness of speed tests. Here are a few aspects of administering speed tests that should be considered.

  • A speed test needs to distinguish between cellular and landline connections. Rural folks with no broadband connection or those using cellular for home broadband are going to take the test with their cellphone. While such results are interesting, cellular speed tests can’t be mixed into a challenge of landline broadband coverage.
  • Everybody needs to use the identical speed test, because each speed test measures speed using a different algorithm. Never use a speed test from the incumbents – it might be rigged to show overly favorable results.
  • A challenge is most effective if it includes feedback from folks with no broadband available at their home. You need to somehow solicit and include results from those who can’t take a speed test at all.
  • You also should be aware that a speed test sometimes doesn’t work for somebody with really slow broadband or high latency. We recently sat on the phone with somebody using satellite broadband who couldn’t get the speed test to complete, even after many attempts.
  • The biggest challenge is in mapping the results. If you map the results so precisely that they can be overlaid on individual homes in Google Earth, you have given an incumbent ISP the chance to challenge the test results. They can likely identify homes where they aren’t the ISP, or homes that have broadband that meets the FCC speed thresholds, meaning that slow speed test results might be due to poor WiFi or some other reason. Ultra-precise mapping might also violate the privacy of the people taking the speed test. This is an issue that many state speed test programs have wrestled with – some take such care to mask the precise location of the data that their final product can’t be used to challenge the FCC maps. For example, if speed test results are summarized by Census block, the results incorporate the same kinds of problems as the FCC maps. Probably the best approach is to embed the final results in a PDF with low enough resolution not to identify individual homes.

There is one other way to map broadband coverage. An experienced field technician or engineer can drive around an area and identify every broadband asset in the field. They can precisely identify where the cable TV networks end, down to the house. They can identify field DSLAMs that generate DSL signals out of rural cabinets – and they can often precisely identify the flavor of DSL and know the maximum speed capability of a given unit. They can identify the location and height of wireless transmitters and map out the likely coverage areas. This kind of effort is most effective at identifying where there is no broadband. A good technician can make a decent map of the likely maximum broadband speeds available in a given area – speeds that are rarely achieved on most rural networks. This kind of challenge could be expensive and time-consuming, and I’ve never seen a challenge done this way. But I know engineers and technicians capable of making highly accurate maps.

Communities can tackle speed tests themselves – they can get households to take the same speed test, such as the one from Ookla, and then match and map the results using GIS data. This can be a lot of work. Mapping can also be provided by many telecom engineering companies. One of the lowest-cost solutions is a speed test from Neo Partners, which administers the speed test and overlays the results automatically on Google Maps.

Even if you aren’t challenging a grant, communities ought to consider speed tests to better understand the broadband in their community. As an example, I worked for a city where the speed tests showed that one neighborhood had far slower speeds than the rest of the city – something the city hadn’t known before the speed test. We’ve done speed tests that showed that the incumbent was delivering more than the advertised speed – again, something worth knowing.

FCC Further Defines Speed Tests

The FCC recently voted to tweak the rules for speed testing for ISPs who accept federal funding from the Universal Service Fund or from other federal funding sources. This would include all rate-of-return carriers including those taking ACAM funding, carriers that won the CAF II reverse auctions, recipients of the Rural Broadband Experiment (RBE) grants, Alaska Plan carriers, and likely carriers that took funding in the New York version of the CAF II award process. These new testing rules will also apply to carriers accepting the upcoming RDOF grants.

The FCC had originally released testing rules in July 2018 in Docket DA 18-710. Those rules applied to the carriers listed above as well as to all price cap carriers and recipients of the CAF II program. The big telcos will start testing in January of 2020 and the FCC should soon release a testing schedule for everybody else – the dates for testing were delayed until this revised order was issued.

The FCC made the following changes to the testing program:

  • Modifies the schedule for commencing testing by basing it on the deployment obligations specific to each Connect America Fund support mechanism;
  • Implements a new pre-testing period that will allow carriers to become familiar with testing procedures without facing a loss of support for failure to meet the requirements;
  • Allows greater flexibility to carriers in identifying which customer locations should be tested and in selecting the endpoints for testing broadband connections. This last change sounds to me like the FCC is letting the CAF II recipients off the hook by allowing them to test only the customers they know meet the 10/1 Mbps speeds.

The final order should be released soon and will hopefully answer carrier questions. One of the areas of concern is that the FCC seems to want to test the maximum speeds that a carrier is obligated to deliver. That might mean having to give customers the fastest connection during the time of the tests even if they have subscribed to slower speeds.

Here are some of the key provisions of the testing program that were not changed by the recent order:

  • ISPs can choose between three methods for testing. First, they may elect what the FCC calls the MBA program, which uses an external vendor, approved by the FCC, to perform the testing. This firm has been testing speeds for the networks built by the large telcos for many years. Second, ISPs can use existing network tools if they are built into customer CPE that allows test pinging and other testing methodologies. Finally, an ISP can install ‘white boxes’ that provide the ability to perform the tests.
  • Testing, at least for now, is perpetual, and carriers need to recognize that this is a new cost they will have to bear for taking federal funding.
  • The number of tests to be conducted varies by the number of customers for which a recipient is getting support: for 50 or fewer households, 5 customers must be tested; for 51–500 households, 10% of households; and for more than 500 households, 50 households. ISPs declaring high latency must test more locations, with a maximum of 370.
  • Tests for a given customer run for one full week, including a weekend, in each quarter. Tests must be conducted in the evenings between 6:00 PM and 12:00 AM. Latency tests must be done every minute during the six-hour testing window. Speed tests – run separately for upload and download – must be done once per hour during the six-hour window.
  • ISPs are expected to meet latency standards 95% of the time. Speed tests must achieve 80% of the expected upload and download speed 80% of the time. For example, a carrier guaranteeing a gigabit of speed must achieve 800 Mbps 80% of the time. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
  • There are financial penalties for ISPs that don’t meet these tests.
  • ISPs that have between 85% and 100% of households that meet the test standards lose 5% of their FCC support.
  • ISPs that have between 70% and 85% of households that meet the test standards lose 10% of their FCC support.
  • ISPs that have between 55% and 70% of households that meet the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.
  • The penalties only apply to funds that haven’t yet been collected by an ISP.
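The penalty tiers above map directly to a simple lookup. A sketch of that mapping (the handling of exact boundary values, e.g. exactly 85%, is my own reading of the order):

```python
def support_reduction_pct(households_compliant_pct):
    # Percentage of (not-yet-disbursed) FCC support withheld, per the
    # tiers above. Full compliance also earns relief from quarterly
    # testing -- annual testing only.
    if households_compliant_pct >= 100:
        return 0
    if households_compliant_pct >= 85:
        return 5
    if households_compliant_pct >= 70:
        return 10
    if households_compliant_pct >= 55:
        return 15
    return 25
```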

FCC Says Big ISPs Delivering the Speeds they Market

The FCC recently released the reports from its speed test program for both 2017 and 2018. The reports summarize the results of the FCC’s Measuring Broadband America (MBA) program, which samples the actual performance of broadband customers by installing measuring devices at their homes. This program began in 2011, and these are the 7th and 8th reports from the program. These used to be issued as separate reports, but they are now released along with a number of other FCC reports in one large annual filing. The link to the reports can be found here. The 2017 report begins on page 349 of the document and the 2018 report on page 463.

These tests are given to volunteer households of large ISPs only – those that collectively cover 80% of all broadband customers in the country. The list of ISPs includes the big cable companies, the big telcos, and the satellite broadband providers.

The primary conclusion of both reports is that “For most of the major broadband providers that were tested, measured speeds were 100% of advertised speeds or better between the peak hours (1 p.m. to 11 p.m. local time).”

Frankly, that conclusion is impossible for me to believe and indicates that there is something in this testing program that is different from the experience reported by many customers in the real world. Consider all of the following:

  • It’s possible that the FCC is somehow doctoring the speed data, or at least not reporting all of the data they gather. Ars Technica reports that SamKnows, the firm doing the measuring for these tests, said it has been collecting data from between 6,000 and 10,000 homes during the time of these tests. But the reports base their findings on only about 4,500 locations. This is an FCC that seems averse to reporting things it doesn’t like, so there is certainly a chance of selective editing of the data used to create the report.
  • It’s clear that the reported users in these test results are not from rural America. My experience over the last decade is that virtually nobody in rural America is receiving the advertised broadband speeds. It’s virtually impossible for a rural DSL customer to get the advertised speeds since they live far away from the core DSLAM modems that provide broadband. It’s worth noting that both reports admit that satellite broadband underperforms.

My experience comes from working extensively in rural America across the country. When we do broadband studies we ask households to take speed tests so we can see actual performance. Admittedly, speed tests have issues and are not as accurate as the measuring being done by SamKnows, whose devices likely connect directly to the incoming broadband signal at the modem, while most households today use WiFi, which degrades self-administered speed test results. But in the many rural speed tests we’ve seen households perform, it’s rare to see a rural customer getting the speed they are paying for, and often they get just a tiny fraction of that speed, with results sometimes barely better than dial-up.

In general I trust speed tests because we also do broadband studies for larger towns, and sometimes the speed tests show good performance by the ISP. For example, we recently studied a city with about 40,000 homes, and for the most part Comcast was delivering speeds that often exceeded the advertised speeds. This makes me believe that the major speed test sites, while not perfect, are not terrible and can be trusted to represent a whole community.

However, I’ve also studied larger communities where a major ISP underperforms across the board. I’ve rarely seen DSL meet advertised speeds for the majority of customers in a community. And I’ve studied communities where the cable company was slower than advertised for everybody.

The FCC results are also hard to believe because we know from the press that there are whole communities where a major ISP underperforms. An example is the long-running battle in upstate New York, where Charter has been delivering speeds at a fraction of the advertised speeds – the performance was so poor that the State is trying to kick Charter out of the state.

I have similar anecdotal evidence at my own house. My ISP is also Charter. They currently tell me that my speed ought to be 200 Mbps but I’m getting about 135 Mbps. Before the recent upgrade I was also getting less than what they said. I’m not unhappy with the 135 Mbps, but if my house was part of the FCC test it would show a connection getting only 2/3 of the advertised speeds.

The Ars Technica article I cited above is worth reading because it digs deeper into the data. I must admit that I got stopped at the first page of each report, where they said that the large ISPs are mostly delivering the speeds they advertise, because I know for much of the country that is not true. That makes me suspect that the data is somehow being doctored. Perhaps the results mostly come from larger communities where the speeds are okay. Maybe the FCC is excluding poor test results. Perhaps the ISPs know which homes are being measured and give them special attention. I don’t know the details of how the report was generated, but I have too much experience in the real world to accept the conclusion that big ISPs deliver the speeds they advertise.

FCC Speed Tests for ISPs

ISPs awarded CAF II funding in the recent auction need to be aware that they will be subject to compliance testing for both latency and speeds on their new broadband networks. There are financial penalties for those failing to meet these tests. The FCC revised the testing standards in July in Docket DA 18-710. The new testing standards become effective with testing starting in the third quarter of 2019, and will replace the standards already in place for ISPs that receive funding from earlier rounds of the CAF program as well as ISPs getting A-CAM or other rate-of-return USF funding.

ISPs can choose between three methods for testing. First, they may elect what the FCC calls the MBA program, which uses an external vendor, approved by the FCC, to perform the testing. This firm has been testing speeds for the networks built by the large telcos for many years. Second, ISPs can use existing network management tools built into the customer CPE that allow test pinging and other testing methodologies. Finally, an ISP can install ‘white boxes’ that provide the ability to perform the tests.

The households to be tested are chosen at random by the ISP every two years. The FCC doesn’t describe a specific method for ensuring that the selections are truly random, but the ISP must describe to the FCC how this is done. It wouldn’t be hard for an ISP to fudge the results of the testing if they make sure that customers from slow parts of their network are not in the testing sample.

The number of tests to be conducted varies by the number of customers for which a recipient is getting CAF support: if the number of CAF households is 50 or fewer, they must test 5 customers; if there are 51–500 CAF households, they must test 10% of households; and for more than 500 CAF households they must test 50. ISPs that declare a high latency must test more locations, with the maximum being 370.
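The location counts above form a simple tiered lookup. A minimal sketch in Python (the function name is mine, and rounding the 10% tier up is my assumption – the rules as summarized here don't specify rounding; the high-latency maximum of 370 is omitted):

```python
import math

def required_test_locations(caf_households: int) -> int:
    """Number of customer locations an ISP must test, per the tiers
    described above (standard-latency case)."""
    if caf_households <= 50:
        return 5                                 # 50 or fewer: test 5
    if caf_households <= 500:
        return math.ceil(caf_households / 10)    # 51-500: test 10% (rounding assumed)
    return 50                                    # more than 500: test 50

# Example: an ISP with 200 CAF households must test 20 locations
print(required_test_locations(200))
```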

ISPs must conduct the tests for a solid week, including weekends, in every quarter to eliminate seasonality. Tests must be conducted in the evenings between 6:00 PM and 12:00 AM. Latency tests must be done every minute during the six-hour testing window. Speed tests – run separately for upload speeds and download speeds – must be done once per hour during the six-hour testing window.

The FCC has set expected standards for the speed tests. These standards are based upon the required speeds of a specific program – such as the first CAF II program that required speeds of at least 10/1 Mbps. In the latest CAF program the testing will be based upon the speeds that the ISP declared it could meet when entering the auction – speeds that can be as fast as 1 Gbps.

ISPs are expected to meet latency standards 95% of the time. Speed tests must achieve 80% of the expected upload and download speed 80% of the time. This might surprise people living in the original CAF II areas, because the big telcos only need to achieve download speeds of 8 Mbps for 80% of customers to meet the CAF standard. The 10/1 Mbps standard was low enough, but this lets the ISPs off the hook for underperforming even for that incredibly slow speed. This requirement means that an ISP guaranteeing gigabit download speeds needs to achieve 800 Mbps 80% of the time. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
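The 80/80 rule lends itself to a short check: at least 80% of the speed samples for a customer must reach at least 80% of the committed speed. An illustrative sketch, not official FCC tooling:

```python
def meets_speed_standard(samples_mbps, committed_mbps):
    """80/80 rule: at least 80% of speed-test samples must reach
    at least 80% of the committed (advertised) speed."""
    threshold = 0.8 * committed_mbps
    passing = sum(1 for s in samples_mbps if s >= threshold)
    return passing >= 0.8 * len(samples_mbps)

# A gigabit commitment: samples must hit 800 Mbps in 80% of tests.
# Here 4 of 5 samples clear 800 Mbps, so the customer passes.
print(meets_speed_standard([950, 810, 790, 900, 860], 1000))
```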

There are financial penalties for ISPs that don’t meet these tests.

  • ISPs that have at least 85% (but less than 100%) of households meeting the test standards lose 5% of their FCC support.
  • ISPs that have between 70% and 85% of households that meet the test standards lose 10% of their FCC support.
  • ISPs that have between 55% and 70% of households that meet the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.

For CAF II auction winners these reductions in funding would only be applied to the time periods remaining after they fail the tests. This particular auction covers a 10-year period, and the testing would start once the new networks are operational – construction must be completed between years 3 and 6 after funding.

This will have the biggest impact on ISPs that overstated their network capability. For instance, numerous ISPs claimed the ability in the CAF auction to deliver 100 Mbps, and they stand to lose up to 25% of their funding if they deliver speeds slower than 80 Mbps.

The Fastest ISPs

PC Magazine has been rating ISPs in terms of speed for a number of years. It develops its rankings based upon speed tests taken at its own speed test site. About 124,000 speed tests fed into this year’s rankings. The score for each ISP is a composite number based 80% on download speed and 20% on upload speed. To be included in the rankings an ISP needed to have 100 or more customers take the speed test.
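The stated weighting can be expressed directly. A sketch assuming the score is a plain weighted sum of the measured speeds (the exact PC Magazine formula may include normalization this omits):

```python
def composite_score(download_mbps: float, upload_mbps: float) -> float:
    """Composite speed score weighted 80% download, 20% upload,
    per the weighting described in the text."""
    return 0.8 * download_mbps + 0.2 * upload_mbps

# Example: a 100/20 Mbps connection scores 84.0 under this weighting
print(composite_score(100.0, 20.0))
```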

You always have to take these kinds of rankings with a grain of salt, for several reasons. Speed tests don’t measure only the ISP but also the customer’s environment. The time of day can affect a speed test, but the type of connection probably affects it the most. We know these days that a lot of people are using out-of-date or poorly located WiFi routers that reduce the speeds at their computers.

Measured speeds vary between the different speed tests. In writing this blog I took four different speed tests just to see how they compare. I took the one at the PC Magazine site and it showed my speeds at 27.5 Mbps down / 5.8 Mbps up. I then used Ookla, which showed 47.9 Mbps down / 5.8 Mbps up. The Speakeasy speed test showed 17.6 Mbps down and 5.8 Mbps up. Finally, I took the test from Charter Spectrum, my ISP, which showed 31.8 Mbps down / 5.9 Mbps up. That’s a pretty startling set of different speeds measured just minutes apart – which demonstrates why speed test results are not a great measure of actual speeds. I look at these results and I have no idea what speed I am actually receiving. With that said, one would hope that any given speed test would at least be somewhat consistent in measuring the difference between ISPs.
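To quantify the spread in those four results, a quick calculation (download numbers taken from the text above):

```python
# Four download speed tests taken minutes apart on the same connection
results = {"PC Magazine": 27.5, "Ookla": 47.9, "Speakeasy": 17.6, "Spectrum": 31.8}

speeds = list(results.values())
mean = sum(speeds) / len(speeds)      # average of the four readings
spread = max(speeds) - min(speeds)    # gap between fastest and slowest

# The spread is nearly as large as the average itself
print(f"mean {mean:.1f} Mbps, spread {spread:.1f} Mbps")
```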

The results of the speed test ‘contest’ are broken into different categories of ISPs. For years the winner of the annual speed test among the large incumbents was Verizon FiOS. However, in this year’s test it fell to third in its group. Leading that category now is Hotwire Communications, which largely provides broadband to multi-tenant buildings, with a score of 91.3. Second was Suddenlink at 49.1, with Verizon, Comcast, and Cox close behind. The lowest in the top 10 was Wow! at a score of 26.7.

Another interesting category is the competitive overbuilders and ISPs. This group is led by Google Fiber with a score of 324.5. EPB Communications, the municipal network in Chattanooga, is second at 136.1. Also in the top 10 are companies like Grande Communications, Sonic.net, RCN, and Comporium.

PC Magazine also ranks ISPs by region, and it’s interesting to see how the speeds for a company like Comcast vary in different parts of the country.

Results are also ranked by state. I find some of the numbers on this list startling. For instance, Texas tops the list with a score of 100.3. Next is South Dakota at 80.3 and Vermont at 70.6. If anything, this goes to show that the rankings are not any kind of random sample – it’s impossible to think that this represents the true composite speeds of all of the people living in those states. The results of this contest also differ from results shown by others like Ookla, which looks at millions of actual connection speeds at Internet POPs. Consider Texas. Certainly there are fast broadband speeds in Austin, where Google Fiber has made all of the competitors pick up their game. There are rural parts of the state with fiber networks built by telcos and cooperatives. But a lot of the state looks much like anywhere else, with a lot of people on DSL or using something less than the top speeds from the cable companies.

But there is one thing this type of study shows very well: over the years the cable companies have gotten significantly faster. Verizon FiOS used to be far faster than the cable companies and now lies in the middle of the pack.

This test is clearly not a statistically valid sample. And as I showed above with my own results from various speed tests, the results are likely not even very accurate. But ISPs care about these kinds of tests because being near the top of one of the charts gives them bragging rights. And, regardless of the flaws, one would think the shortcomings of this particular test apply similarly across the board, which means it does provide a decent comparison between ISPs. That is further validated by the fact that the results of this exercise are pretty consistent from year to year.