USTelecom Blasts Municipal Broadband

In a typical knee-jerk reaction, USTelecom released an Issues Brief claiming that government-owned broadband networks aren’t built for the long haul. This was clearly prompted by the federal grants now flowing directly to towns, cities, counties, and states that can be used to build infrastructure, including broadband.

The Issues Brief trots out the same old lame arguments that the big ISPs have made for years against municipal broadband – that some municipal broadband efforts have failed. This is true, but my math shows that municipal failures amount to something less than 5% of municipal ventures. We never hear the comparative figures about how many new commercial ISP startups fail – because there are no newspaper headlines when a small fiber overbuilder over-borrows and quietly closes up shop. And somehow the big ISP industry totally overlooks its own big, dramatic failures. Frontier just went through bankruptcy and walked away from $10 billion in debt – any industry expert can tell you that Frontier’s problems were all self-made and not due to unforeseeable market forces. AT&T recently walked away from a disastrous foray into cable TV and content and lost over $50 billion in just five years. This is not an industry that should talk too loudly about failures.

The real truth behind this Issues Brief is that the big companies don’t want any competition. The Issues Brief trots out the ridiculous FCC statistic claiming there is “fierce competition among providers—over 92% of American homes have at least two fixed broadband providers competing for their business”. That statistic comes from an FCC that hasn’t wanted to admit that urban areas are largely a monopoly for the cable companies. Urban homes can either buy decent broadband from a cable company or slow broadband over ancient, out-of-date DSL from a telco – where DSL is even still for sale, because that FCC statistic hasn’t been updated to reflect that AT&T completely walked away from selling new DSL. There is no fierce broadband competition in most markets. My consulting firm conducts surveys, and we generally see over half of the public everywhere saying they have no choice of broadband providers.

Another great quote from the Issues Brief is that “The private sector has the best track-record of success for broadband deployment”. Really? I work with dozens of communities every year that are only looking for a better broadband solution because the big incumbent ISPs have not invested in their communities. The big telcos walked away from maintaining or upgrading copper networks several decades ago. When cable companies decided to upgrade to DOCSIS 3.1 a few years ago, most of them decided to leave upload bandwidth on the older DOCSIS 3.0 technology – which hurt millions of homes that struggled with working and schooling from home during the pandemic. The big companies only spend capital when they feel they have no alternative. This is another topic the big ISPs ought not to tout too loudly, because the claim is demonstrably not true.

The Issues Brief finishes by touting public/private partnerships as the answer – the big ISPs want cities to hand them all of the current grant funding to get them to invest in the networks they should have been investing in all along. The truth here is that the vast majority of communities do not want to be an ISP and are looking for partnerships. But in most cases, communities first look for smaller ISP partners that have a track record of providing good customer service before considering the incumbents. There is a clear reason why the big ISPs are always at the bottom of every measure of customer satisfaction – they treat customers like dirt.

I can summarize this Issues Brief a lot more succinctly than did USTelecom: “We know cities are getting a lot of grant money right now. Don’t try to use this money to become a municipal ISP because cities are too stupid to make it work. Give the grant money to us instead and we’ll take care of your future broadband needs, just like we have over the last twenty years. Wink wink.”

Grant Funding is Not Free

It’s easy to think of grants as free money, but nothing is ever really free. I know ISPs that have won grants and then decided to return the grant funding once they fully understood all of the costs and requirements that come along with the grant funds. The following are a few of the issues that ISPs might not like about grants.

  • Filing for grants can be expensive, particularly if the grant requires engineering studies or market studies that the ISP would not otherwise have undertaken. There is, of course, no guarantee of winning a grant after going through the cost of grant preparation.
  • Environmental studies can add a lot of cost to a grant, and they seem unnecessary for networks built entirely in existing utility rights-of-way that already contain other utilities – particularly when putting fiber on poles. In some circumstances these studies are extremely expensive. I helped a client that wanted to replace abandoned cell sites on national parkland, where the environmental study for a $2 million project cost over $250,000.
  • Archaeological studies can also be expensive and can again seem unneeded when a project will use existing rights-of-way and previously disturbed soil.
  • Some grants require historic preservation studies, which seem unneeded when the entire network is being built outdoors and not in or near historic buildings.
  • Prevailing wages can add a lot of cost to fiber construction – particularly if engineering estimates didn’t anticipate paying higher wages.
  • I’ve seen these extra costs add as much as 20% to the cost of a project. It’s easy to think that’s okay as long as the grant covers the extra costs – but the extra costs also increase the needed matching funds, which might change the attractiveness of the grant (see the sketch after this list).
  • Grants can require significant paperwork to show compliance with all of the grant rules. Sometimes paperwork is needed annually for many years after the end of grant construction.
  • Grants can require the grantee to provide free broadband to anchor institutions or others for long periods.
  • There have been grants that put restrictions on the grantee selling the grant property far into the future.
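To make the matching-funds point concrete, here is a minimal sketch of the arithmetic. The dollar amounts, the 75% grant share, and the function name are illustrative assumptions of mine, not terms from any particular grant program:

```python
# Hypothetical illustration of how grant compliance costs inflate the
# matching funds an ISP must bring to the table.

def required_match(base_cost: float, extra_cost_pct: float, grant_share: float) -> dict:
    """Return total project cost and the grantee's matching obligation,
    assuming the grant pays a fixed share of the (inflated) total cost."""
    total_cost = base_cost * (1 + extra_cost_pct)
    grant_funds = total_cost * grant_share
    return {"total_cost": total_cost, "grant": grant_funds, "match": total_cost - grant_funds}

# A $10M project with a 75% grant: with no compliance overhead the match is
# $2.5M; with 20% added costs the match grows to $3.0M -- a 20% bigger check.
print(required_match(10_000_000, 0.00, 0.75))
print(required_match(10_000_000, 0.20, 0.75))
```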

ISPs have sometimes found lenders unwilling to lend to a grant-funded project because of the restrictions. For example, a prohibition against selling a grant-funded asset for a decade might make an asset worthless as collateral for a bank loan. You can’t assume that a commercial bank will accept all of the requirements of a grant.

There are obviously many ISPs willing to accept these various provisions when grants provide enough dollars to make it worthwhile. But these various restrictions can be a lot more problematic when taking grant money to fund only a small portion of a larger project – because the restrictions come whether the grant is funding 90% or only 10% of a project.

It can be even more problematic when accepting multiple grants for a single project since each grant likely comes with different rules. It’s sometimes a Gordian knot to reconcile multiple grants with external financing.

The bottom line is that anybody accepting a grant needs to read the fine print and be aware of the restrictions that might come with the grant. Many rural projects are not viable without substantial grants – but as the above list shows, grants are not free.

NTIA’s New Broadband Map

The National Telecommunications and Information Administration surprised the broadband industry by issuing a new broadband map for the whole U.S. The map differs in dramatic ways from the FCC’s broadband map, which is derived from broadband speeds that are reported by the ISPs in the country. It’s commonly understood that the FCC broadband map overstates broadband coverage significantly. The NTIA map draws upon varied sources in an attempt to create a more accurate picture of the availability of broadband.

The NTIA map was created by overlaying layers from various data sources onto Google Maps. This includes speed test data from both Ookla and M-Lab. The map shows the results of Microsoft’s measurements of the speeds experienced during software updates. There are two layers of data from the American Community Survey, showing homes that report having no Internet access at home and homes that have no computer, smartphone, or tablet. The NTIA also includes the FCC Form 477 data that is the sole basis for the FCC broadband map. The NTIA map then offers an additional layer showing high-poverty areas – places where 20% or more of households are below the national poverty level. The data in the map can be viewed at both the Census tract and Census block level.
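Conceptually, combining these layers is just a join on Census geography. The sketch below shows one way to do that comparison; the file names and column labels are my own hypothetical stand-ins, not actual NTIA data products:

```python
# A minimal sketch (hypothetical file names and columns) of joining
# NTIA-style layers -- speed tests, ACS survey data, and FCC Form 477
# claims -- at the Census-tract level for side-by-side comparison.
import pandas as pd

ookla = pd.read_csv("ookla_tract_medians.csv")    # geoid, ookla_down_mbps
mlab = pd.read_csv("mlab_tract_medians.csv")      # geoid, mlab_down_mbps
acs = pd.read_csv("acs_internet_access.csv")      # geoid, pct_no_internet
fcc = pd.read_csv("fcc_477_max_claimed.csv")      # geoid, claimed_down_mbps

layers = ookla.merge(mlab, on="geoid").merge(acs, on="geoid").merge(fcc, on="geoid")

# Flag tracts where ISP-claimed speeds look far better than measured reality.
layers["overstatement"] = layers["claimed_down_mbps"] / layers["ookla_down_mbps"]
print(layers.sort_values("overstatement", ascending=False).head(10))
```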

The data on the map is eye-opening. I live in Asheville, North Carolina. The map shows that broadband adoption varies widely across the city. In my Census tract, 6% of homes are without broadband access. In the neighboring tracts, between 6% and 10% of households are without broadband. But there is also a part of the city where 43% of homes don’t have broadband. This is clearly an issue of poverty and not availability, because in the rural areas surrounding the city – where there is little option except slow DSL – the percentage of homes without broadband access is around 20%.

The NTIA map sticks it in the eye of the FCC for being so slow to change its broadband maps. The exaggerated coverage in the FCC maps first became obvious to me in 2009, when clients were seeking ARRA stimulus grants that for the first time defined homes as served, unserved, or underserved. It was clear then that some of the ISP reporting to the FCC was pure fantasy. There has been a loud call ever since to fix the FCC maps, a call that largely went unheeded until a recent effort to begin modifying the maps. That effort is expected to take a year or two.

The NTIA is making the point that there are many sources of broadband data other than the FCC data. For example, we all know that speed test data isn’t perfect – but taken in the aggregate, speed tests create a pretty accurate picture of broadband speeds. One of the most interesting data points comes from Microsoft – one can argue that the speeds encountered when downloading a big file like a software update are the best measurement of actual speed. The Microsoft data shows that actual download speeds are far below what ISPs claim in most of the country.

There are even more data points that could be layered onto the NTIA maps. For example, I wish the NTIA had also layered on the maps created by state broadband offices, because the states have taken a stab at undoing the worst fictions in the FCC mapping.

As might be expected, the industry reacted almost immediately to the new maps. The NCTA (The Internet and Television Association) quickly blasted the NTIA maps. The big trade associations probably have a good reason to hate the maps after the Department of Treasury said just last week that cities and counties can rely on mapping data from any federal agency when deciding where ARPA grant funding can be spent. Localities are going to find these maps to be interesting and useful as they consider spending money on broadband infrastructure.

Hopefully, the NTIA will continue to update these maps as new data becomes available. Readers of this blog probably know that I am not a fan of using speed definitions when allocating broadband grants. I think it would be far easier, as an example, to say that grants can always be used to overbuild rural DSL. But if the government continues to base grants upon something like the 25/3 Mbps definition of broadband, then maps like this new one are extremely helpful, both by showing more realistic speed numbers and by reminding us that there are a lot of other factors – like poverty and broadband adoption rates – to consider when deciding which parts of the country need grant assistance.

Treasury Makes it Easier to Fund Broadband

On June 17, the US Treasury Department clarified the rules for using the federal ARPA broadband money that is being given to states, counties, cities, and smaller political subdivisions. The new FAQs make it a lot clearer that local governments can use the funds in project areas that include businesses and households considered served – meaning locations that already receive broadband speeds over 25/3 Mbps. My first reading of the rules came to the same conclusion, but these clarifications hopefully make this clear for everybody. There was language in the original Treasury Interim Final Rule that might have scared city and county attorneys away from using the funding for broadband. Following is some of the clarifying language from the revised FAQs:

FAQ 6.8 adds the clarifying language that unserved or underserved households or businesses do not need to be the only ones in the service area funded by the project. This is a massively helpful clarification that discloses Treasury’s intent for the funds. The response to this FAQ could have previously been interpreted to mean that the money could only be used to bring broadband to places that have less than 25/3 Mbps. FAQ 6.9 further makes this same point, that while the goal of a broadband project must be to provide service to unserved or underserved areas, a sensible solution might require serving a larger area to be economical – and again, unserved and underserved locations need not be the only places funded by the ARPA funding.

FAQ 6.11 looks at the original use of the term ‘reliably’ when defining the broadband provided to homes and businesses. The Treasury response makes it clear that advertised speeds don’t define broadband speeds, but rather the actual broadband performance experienced by customers.

The use of “reliably” in the IFR provides recipients with significant discretion to assess whether the households and businesses in the area to be served have access to wireline broadband service that can actually and consistently meet the specified threshold of at least 25/3 Mbps – i.e., to consider the actual experience of current broadband customers that subscribe to service at or above the 25/3 Mbps threshold. Whether there is a provider serving the area that advertises or otherwise claims to offer speeds that meet the 25 Mbps download and 3 Mbps upload speed threshold is not dispositive.

FAQ 6.11 goes on to clarify that governments can consider a wide range of information as proof that broadband is not reliably meeting the 25/3 threshold, including federal or state broadband data (meaning state broadband maps or the newly released NTIA broadband map), speed tests, interviews with residents, or any other relevant information. Local governments can look at issues such as whether speeds are adequate at all times of the day – do speeds bog down at the busiest times? Latency and jitter can also be considered.

Maybe most significantly, the FAQ gives an automatic pass to overbuilding DSL or cable systems still using DOCSIS 2.0. While there are very few homes still served by DOCSIS 2.0, Treasury is allowing localities to basically declare DSL to be obsolete, regardless of any speed claims made by the telcos. This negates the tens of thousands of Census blocks where telcos claim rural DSL speeds of 25/3 Mbps – an area served only by DSL is justification to use the funding.
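A rough sketch of this decision rule follows. This is my reading of the Treasury FAQs, not official guidance, and the function and its inputs are illustrative:

```python
# A hedged sketch of the ARPA eligibility test as I read it: DSL and
# DOCSIS 2.0 areas qualify outright, and other areas qualify when service
# is not *reliably* 25/3 Mbps based on measured performance.

def arpa_eligible(technology: str, measured_down_mbps: float, measured_up_mbps: float) -> bool:
    # Automatic pass: Treasury lets localities treat DSL and DOCSIS 2.0
    # as obsolete regardless of the speeds the ISP claims.
    if technology.lower() in ("dsl", "docsis 2.0"):
        return True
    # Otherwise use measured performance (speed tests, resident interviews,
    # state or NTIA map data), not advertised speeds.
    return measured_down_mbps < 25 or measured_up_mbps < 3

print(arpa_eligible("dsl", 30, 5))         # True -- DSL is an automatic pass
print(arpa_eligible("docsis 3.1", 18, 2))  # True -- not reliably 25/3
print(arpa_eligible("fiber", 940, 880))    # False -- already well served
```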

In a clarification that some states and counties will find reassuring, FAQ 6.10 says that the ARPA funding can be used to fund middle-mile fiber as long as it is done with a goal of supporting last-mile fiber.

These were critically important clarifications since there has been a lot of debate at the local level about whether ARPA money can be used in various circumstances. The clarifications make it clear that ARPA money can always be used to overbuild rural DSL. It’s also clear that ARPA money can be used in urban settings as long as the funded area includes at least one location that doesn’t have a broadband option of at least 25/3 Mbps. There are numerous little pockets in every city where the cable companies didn’t build and where DSL is the only option. Cities can clearly use this funding to support low-income neighborhoods and places the big ISPs have bypassed or ignored.

Why Do We Give Grants to Huge ISPs?

The blog title is a rhetorical question because we all know why we give federal money to big ISPs – they are powerful companies that have a lot of lobbyists and that make a lot of contributions to politicians. But for some reason, the rest of us don’t talk enough about why giving money to the big ISPs is bad policy.

I could write a week’s worth of blogs detailing reasons why big ISPs don’t deserve grant funding. The public dislikes big ISPs and has rated them for two decades as having the worst customer service among all corporations and entities, disliked even more than insurance companies and the IRS. The public hates talking to big ISPs, because every call turns into a sales pitch to spend more money.

The big ISPs routinely deceive their customers. They routinely advertise special prices and then proceed to bill consumers more than what was promised. They have hidden fees and try to disguise their rates as taxes and fees. The big telcos unashamedly bill rural customers big fees for decrepit DSL that barely works. The telcos have known for over a decade that they can’t deliver what they are peddling.

Cable companies come across as better than the telcos only because their broadband technology is faster. But in every city, there are some neighborhoods where speeds are far slower than advertised speeds – neighborhoods where longstanding network problems never get fixed. I hear stories all of the time about repeated slowdowns and outages. About 30% of the folks we’ve surveyed during the pandemic have said that they couldn’t work from home due to problems with cable company upload speeds.

And then there are the big reasons. The big telcos created the rural broadband crisis. They made a decision decades ago to walk away from rural copper. They quietly cut back on all upgrades and maintenance and eliminated tens of thousands of rural technicians, meaning that customers routinely wait a week or longer to even see a technician.

What’s worse, the big telcos didn’t walk away from rural America honestly. They kept talking about how they could provide good service, to the point that the FCC awarded them $11 billion in the CAF II program to improve rural DSL – we paid them for what they should have routinely done by reinvesting the billions they have collected from rural customers. But rather than use the CAF II money to improve rural DSL, most of the money got pocketed to the benefit of stockholders.

While I think the decision to walk away from rural broadband was made in the boardroom, the worst consequences of that decision were implemented locally. That’s how giant companies work, and it is the primary reason we shouldn’t give money to big ISPs. Upper management puts pressure on regional vice presidents to improve the bottom line, and it’s the regional managers who quietly cut back on technicians and equipment. Rural broadband didn’t die from one big sweeping decision – it was murdered by thousands of small cutbacks by regional bureaucrats trying to earn better bonuses. I’ve talked to many rural technicians who tell me that their companies have taken away every tool they have for helping customers.

What does this all boil down to? If we give money to the big ISPs to build rural networks, they are going to pocket some of the money like they did with CAF II. But even if they use grant money to build decent rural networks, it’s hard to imagine them being good stewards of those networks. The networks will not get the needed future upgrades. There will never be enough technicians. And every year the problems will get a little worse until we look up in twenty years and see rural fiber networks owned by the big ISPs that are barely limping along. Meanwhile, we’ll see networks operated by cooperatives, small telcos, and municipalities that work perfectly, that offer good customer service, and that have responsive repair and maintenance.

I have a hard time thinking that there is a single policy person or politician in the country who honestly thinks that big ISPs will take care of rural America over time. They’ll take federal money and build the least they can get away with. Then, within only a few years they’ll start to nickel and dime the rural properties as they have always done.

I have to laugh when I hear somebody comparing current rural broadband grant programs to the rural electrification effort of a century ago. That electrification money went mostly to cooperatives and not to the big commercial corporations. We’ve lost track of that important fact when we use the electrification analogy. The government made the right decision by lending money to citizens to solve the electricity gap and didn’t give money to the big commercial electric companies that had already shunned rural America.

The main reason we shouldn’t give grants to big ISPs is that solving the rural broadband gap is too important to entrust to companies that we know will do a lousy job. There is nobody who thinks that the big telcos or cable companies will do the right thing in rural America over the long run if we’re dumb enough to fund them.

A Security Warning

Today I am going to talk about something that happened outside of our industry but that should be a concern of every ISP. There is a lesson to be learned from the Colonial Pipeline hack by the DarkSide ransomware group from Russia.

I am positive that if I call my ISP clients, every one of them will tell me that their broadband networks are secure and that there is no way for malware to shut down their broadband network. I would trust that response, since most broadband networks are encrypted end-to-end between the core and customers.

But the ISPs would still be wrong. The hack of Colonial Pipeline did not attack the software that operates the pipeline. Instead, the hackers found their way into the computers used for the billing system. When that 10-year-old software got locked, Colonial had no way to take orders, pay the gas suppliers, or bill customers for delivering gas. The money side of the business was locked. Colonial made the decision that it couldn’t operate without that software.

I think that if I asked ISPs whether every computer, laptop, and tablet connected to their OSS/BSS software is totally secure, I would get a different answer. Hackers only need to get into one computer to shut down an ISP’s OSS/BSS. Without that software, most ISPs would not be able to take new orders, answer billing questions, send out bills, take trouble tickets, or dispatch repair crews. With the OSS/BSS software locked, an ISP wouldn’t even be able to look at customer records. Most ISPs would be unable to somehow switch to a manual way of doing things. Most ISPs would have little choice but to pay the ransom if they found themselves in the same position as Colonial.

This is the same approach that ransomware hackers take with many large targets. They shut down the billing systems of hospitals to bring them to a halt. They shut down the supply chain and inventory software of factories to bring them to a screeching halt. Businesses of all types now have sophisticated software suites that are equivalent to our industry’s OSS/BSS software. Over the last decade, most larger businesses have migrated to a master software platform that controls most of the day-to-day back-office functions of the business. That automation has been a huge time and dollar saver – but it is also the point of attack for ransomware hackers.

I advise every ISP to take a hard look at the security of the computers used by staff. That’s where the vulnerabilities are – and that’s what the ransomware folks exploit. Very few ISPs pay the same kind of attention to PCs, laptops, and cellphones as they do to the broadband network. We often don’t keep up with software updates for every device. We let employees take devices home or travel with them and use hotel WiFi.

I would bet that we’ve already had ISPs hacked – because most of the businesses that are hit with ransomware don’t talk about it. They pay the ransom and hope they get up and running again. A company like Colonial had to disclose it because the gas supply chain works on a 24/7 cycle and gas stations started running out of gas soon after the attack.

I am not a security expert, and I don’t have any answers. But I know a lot of clients do not have ironclad security for the backoffice side of the business. As soon as I heard about this hack I realized how this could happen easily in our industry as well.

AT&T’s Plan for Ditching Copper

Jeff Baumgartner of Light Reading recently reported on a wide-ranging discussion by AT&T CEO John Stankey. One of the most interesting parts of the discussion was about AT&T’s plan to use cellular wireless in rural markets to replace DSL.

I’m not going to repeat everything in the article, but the gist is that AT&T hopes to be able to start walking away from rural copper. Stankey was quoted as saying that there is already a voice alternative in rural markets – meaning cellphones. Unfortunately, that ignores the many rural homes with poor cellular coverage. The FCC was going to plow something like $4 billion into a grant program to expand rural cellular coverage, but the misreporting of existing cellular coverage areas by the big cellular carriers put that plan on hold.

Stankey believes that cellular broadband will be the alternative to rural DSL. Verizon has the same strategy but doesn’t serve as many rural markets after having unloaded most of them to Frontier over recent years.

What might a rural cellular data network look like? In most rural counties there are only a few existing cell towers – it’s not unusual for this to be a half dozen or fewer. The traditional older cell towers often don’t reach a lot of rural homes, since the towers were built for the old cellular model of making sure that cars could get a cell signal along numbered highways. But over time, many counties have added a few more towers for public safety purposes that reach a lot more homes for voice service.

Most people don’t realize that cellular broadband has a lot of the same characteristics as other rural wireless broadband. The signal from a cell tower dies quickly with distance. Depending upon the spectrum being used, cellular broadband can hit speeds of 50-100 Mbps for the first mile from a rural cell site, but speeds drop off rapidly past that point. Cellular broadband does not travel nearly as far as cellular voice, and rural people are used to the idea of being able to make a call but not being able to grab the web. Cellular data also gets slowed or stopped by hills and other impediments. Any county without flat topography will have lots of cellular dead spots.
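To illustrate that falloff, here is a deliberately simplified model. The 100 Mbps near-tower speed and the halving distance are assumptions I chose to match the rough 50-100 Mbps first-mile figures above; real results vary with spectrum band, terrain, and tower load:

```python
# An illustrative (not engineering-grade) model of how cellular broadband
# speed falls off with distance from a rural cell site.

def est_speed_mbps(miles: float, near_tower_mbps: float = 100.0,
                   half_distance_miles: float = 1.0) -> float:
    """Assume throughput halves every `half_distance_miles` beyond the tower."""
    return near_tower_mbps * 0.5 ** (miles / half_distance_miles)

for d in (0.5, 1, 2, 3, 4, 5):
    print(f"{d} miles: ~{est_speed_mbps(d):.0f} Mbps")
# 1 mile: ~50 Mbps; 3 miles: ~12 Mbps; 5 miles: ~3 Mbps -- which is why
# homes a few miles from a rural tower get little usable broadband.
```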

What this means is that cellular broadband is not a pure replacement for landline service. For the typical rural county with a limited number of cellular towers, there are going to be plenty of homes that can’t get a cell signal. There will be a lot more homes that can’t get enough broadband speed to be meaningful.

What Stankey failed to mention in the interview is that AT&T has already walked away from the DSL market. As of last October, the company won’t sign up new DSL customers anywhere in the country – in towns or rural areas. That means anybody buying or building a rural home in an AT&T area doesn’t have DSL as a broadband option. I’m sure AT&T will continue to milk existing DSL revenues for the next few years. But is the company going to care a whit if some rural households can’t get cellular data?

The various rural grant programs are filling in some of the rural broadband gaps – but not close to all. As large as the RDOF grants were, the FCC says those grants will reach 5 million rural homes if the grants are all awarded. There are still 10 to 15 million more homes in rural America that don’t have adequate broadband – maybe more. Unfortunately, some recent federal grants went to providers like Viasat or to ISPs that might not be able or willing to fulfill the RDOF requirements.

Don’t get me wrong. I’m happy for the rural home that can finally get a decent cellular data plan. I just don’t want regulators or politicians to think that companies like AT&T are taking care of rural America with this new strategy. I would characterize AT&T’s strategy as providing cover for the company to pull down rural copper. The copper is old and at end of life and has to come down – but it’s disingenuous to not tell the public that cellular broadband means the end of copper.

An Attack on WiFi Spectrum

A little over a year ago, the FCC approved the use of 1,200 MHz of spectrum in the 6 GHz band for public use – for new WiFi. WiFi is already the most successful deployment of spectrum ever. A year ago, Cisco predicted that WiFi will be carrying more than 50% of global IP traffic by 2022.

These are amazing statistics when you consider that WiFi has been limited to 70 MHz of spectrum in the 2.4 GHz band and 500 MHz in the 5 GHz band. The additional 1,200 MHz of spectrum will vastly expand the capabilities of WiFi. WiFi performance was already slated to improve with the introduction of WiFi 6 technology, and adding the new spectrum will drive WiFi performance to a new level. The FCC order adds seven 160 MHz channels to the WiFi environment (or alternately, fifty-nine 20 MHz channels). For the typical WiFi environment, such as a home in an urban setting, this is enough new channels that big-bandwidth devices ought to be able to grab a full 160 MHz channel. This is going to increase the performance of WiFi routers significantly by allowing homes and businesses to separate devices by channel to avoid interference.
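A quick back-of-the-envelope check shows where those channel counts come from. This sketch ignores guard bands and the exact FCC channel plan, so treat it as arithmetic, not a channelization spec:

```python
# How many non-overlapping channels of each width fit into the new
# 1,200 MHz of 6 GHz spectrum, ignoring guard bands and channel spacing.
NEW_SPECTRUM_MHZ = 1200

for width_mhz in (20, 40, 80, 160):
    print(f"{width_mhz} MHz channels: {NEW_SPECTRUM_MHZ // width_mhz}")

# Prints 60, 30, 15, and 7. The actual 6 GHz plan defines fifty-nine
# 20 MHz channels (and seven at 160 MHz) once spacing is accounted for.
```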

One minor worry about the 6 GHz band is that it isn’t being treated the same everywhere. China has decided to allocate the entire 6 GHz spectrum band to 5G. Europe has allocated only 500 MHz for WiFi with the rest going to 5G. Other places like Latin America have matched the US allocation and are opting for a greatly expanded WiFi. This means that future WiFi devices won’t be compatible everywhere and will vary by the way the devices handle the 6 GHz spectrum. That’s not the ideal situation for a device maker, but this likely can be handled through software in most cases.

The GSMA, the worldwide association of large cellular carriers, is lobbying for the US to allow 6 GHz to be used for 5G. It argues that since the 6 GHz spectrum is available to the public, cellular carriers ought to be able to use it like anybody else. The carriers would like to use it for License Assisted Access (LAA), which would let them use the spectrum for cellular broadband. If that is allowed, cellular traffic could flood the spectrum in urban areas and kill the benefits of 6 GHz for WiFi.

This is not the first time this issue has been raised. The cellular industry lobbied hard to be able to use LAA when the FCC approved the 5 GHz spectrum for WiFi. Luckily, the FCC understood the huge benefits of improved WiFi and chose to exclude cellular carriers from using the spectrum.

It would be a huge coup for the cellular carriers to get to use the 6 GHz spectrum because they’d get it for free at a time when they’ve paid huge dollars for 5G spectrum. The FCC already heard these same arguments when it made the 6 GHz decision, so hopefully the idea goes nowhere.

I talk to a lot of ISPs that tell me that poor WiFi performance is to blame for many of the perceived problems households have with broadband. Inefficient, out-of-date routers, along with situations where too many devices try to share a few channels, cause many of the problems blamed on broadband. The 6 GHz WiFi spectrum will bring decades of vastly improved WiFi performance. It’s something that every homeowner will notice immediately when they connect a properly configured WiFi router using the 6 GHz spectrum.

For now, there are not many devices that are ready to handle the new WiFi spectrum and WiFi 6 together. Some cellphones are now coming with the capability, and as this starts getting built into chips it will start working for laptops, tablets, PCs, and smart televisions. But homes will only see the real advantage over time as they upgrade WiFi routers and the various devices.

Interestingly, improved WiFi is a direct competitor for the cellular carriers in the home. The carriers have always dreamed of being able to sell subscriptions for homes to connect our many devices. WiFi allows for the same thing with just the cost of buying a new router. It would be an obvious boon to cellular carriers to both kill off the WiFi competitor while getting their hands on free spectrum.

Hopefully, the FCC will reject this argument as something that has already been decided. The GSMA argues that 5G will bring trillions of dollars in benefits to the world – but it can still do that without this spectrum. The benefit of improved WiFi has a huge value as well.

What’s the Right Definition of Upload Speed?

I read a blog on the WISPA website written by Mark Radabaugh suggesting that the best policy for broadband speeds would be met by an asymmetrical architecture (meaning that upload speeds don’t need to be as fast as download speeds). I can buy that argument to some extent because there is no doubt that most homes download far more data than they upload.

But then the blog loses me when Mr. Radabaugh suggests that an adequate definition of speed might be 50/5 Mbps or 100/10 Mbps. I have seen enough evidence during the pandemic to know that 5 Mbps or 10 Mbps is not an adequate upload speed today. My consulting firm conducts speed tests and surveys for communities, and during the pandemic we’ve learned a lot about upload demand.

We’ve seen consistently in surveys that between 30% and 40% of the families that worked or schooled from home during the pandemic said that the upload connections were not adequate. Many of the respondents making this claim have lived in cities using cable company broadband with upload speeds between 10 Mbps and 20 Mbps. While those speeds may be adequate for one person working from home, they were clearly not adequate for multiple people trying to use the upload connection at the same time.

But it’s not quite that simple, and we need to stop fixating on speed as the only way to measure whether a broadband connection is adequate. Many of the people who find home upload speeds to be inadequate complain about inconsistent speeds. They’ll connect easily enough to a work or school server or a Zoom call, but eventually get dumped out of the connection. Speeds on most technologies are not constant and bounce up and down as demand changes in the neighborhood. A 10 Mbps upload connection is not adequate if there are times when the speed drops below that level and connections get dropped. Inconsistent broadband connections can also be related to poor latency and heavy jitter.

The latest rules from the Treasury for using ARPA funding describe the issue succinctly. Treasury says that ARPA grants can be used in places where an ISP is not “reliably” delivering speeds of 25/3 Mbps. Reliability is a concept that is long overdue in our discussion of broadband because, unfortunately, many of our broadband technologies deliver different speeds from minute to minute and hour to hour. We cannot keep pretending that a WISP, DSL, or cable modem service that delivers 15 Mbps upload some of the time but 5 Mbps at other times is reliable. Such a connection ought to be labeled a 5 Mbps connection that sometimes bursts to faster speeds.
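One way to put a number on “reliably” is to take repeated speed tests over several days and report a low percentile rather than the best burst. This sketch uses made-up sample values purely for illustration:

```python
# Quantifying "reliable" upload speed from repeated speed tests:
# report a low percentile, not the peak burst.
import statistics

upload_tests_mbps = [15, 14, 5, 12, 4, 15, 6, 13, 5, 14, 11, 5]

peak = max(upload_tests_mbps)
# 10th percentile: roughly the speed the customer gets ~90% of the time
# (index 1 of the 12 sorted samples).
reliable = sorted(upload_tests_mbps)[int(0.10 * len(upload_tests_mbps))]

print(f"Peak burst:          {peak} Mbps")     # 15 Mbps
print(f"Reliable (10th pct): {reliable} Mbps") # 5 Mbps here
print(f"Median:              {statistics.median(upload_tests_mbps)} Mbps")
```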

The WISPA blog also fails to mention the context in which a speed requirement is being discussed. There is a big difference between setting a definition of speed for today’s broadband market and setting an expected speed for a project being funded by a federal grant. We should never use federal grant funding to build broadband that just barely meets today’s definition of broadband. It’s an undisputed fact that households, on average, use a lot more broadband every year – the amount of bandwidth used by households is likely to double every three years. If we build to meet today’s broadband demand, a network will be obsolete within a decade. Grant funding should be used to build networks that meet expected future needs.
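The growth math is worth working through once, since the conclusion follows directly from the doubling assumption stated above:

```python
# If household demand doubles every three years, demand a decade out
# is roughly ten times today's.
def demand_multiplier(years: float, doubling_period_years: float = 3.0) -> float:
    return 2 ** (years / doubling_period_years)

print(f"{demand_multiplier(10):.1f}x")  # ~10.1x after 10 years
```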

I often tell stories about network engineers who undersize new transport electronics. While they know that broadband demand and traffic have been doubling on their networks every three years, they just can’t bring themselves to recommend new electronics that have ten times today’s capacity. Even experienced engineers have a mental block against truly believing the impact of growth. But anybody designing a network needs to be looking to make sure the new network is still going to be robust a decade from now.

It’s just as essential for policymakers to understand the incessant growth in broadband demand. When talking about awarding grants, we shouldn’t be discussing the current definition of broadband, but instead the likely definition of broadband in a decade. That’s the big point the WISPA blog misses. I think it’s easy to demonstrate that 5 Mbps or 10 Mbps are inadequate upload speeds today – we have plenty of evidence that upload speeds probably need to be at least 25 Mbps. But we can’t design to today’s upload needs when funding new networks. Grant-funded networks should be forward-looking – and I think the NTIA’s suggestion of 100 Mbps upload for future networks is reasonable.

Cord Cutting Accelerates in 1Q 2021

The largest traditional cable providers collectively lost over 1.6 million customers in the first quarter of 2021 – an overall loss of 2.2% of customers. To put the quarter’s loss into perspective, the big cable providers lost almost 18 thousand cable customers per day throughout the quarter.

The numbers below come from Leichtman Research Group, which compiles them from reports made to investors, except for Cox, which is estimated. The numbers reported are for the largest cable providers, and Leichtman estimates that these companies represent 95% of all cable customers in the country.

Following is a comparison of first-quarter subscriber numbers to those at the end of 2020:

Provider             1Q 2021      4Q 2020      Change        % Change
Comcast              19,355,000   19,846,000   (491,000)       -2.5%
Charter              16,062,000   16,200,000   (138,000)       -0.9%
AT&T                 15,885,000   16,505,000   (620,000)       -3.8%
Dish TV               8,686,000    8,816,000   (130,000)       -1.5%
Verizon               3,845,000    3,927,000    (82,000)       -2.1%
Cox                   3,590,000    3,650,000    (60,000)       -1.6%
Altice                2,906,600    2,961,000    (54,400)       -1.8%
Mediacom                626,000      643,000    (17,000)       -2.6%
Frontier                453,000      485,000    (32,000)       -6.6%
Atlantic Broadband      313,591      318,387     (4,796)       -1.5%
Cable One               252,000      261,000     (9,000)       -3.4%

Total                71,974,191   73,612,387   (1,638,196)     -2.2%
Total Cable          43,105,191   43,879,387     (774,196)     -1.8%
Total Other          28,869,000   29,733,000     (864,000)     -2.9%


Some observations about the numbers:

  • The big loser continued to be AT&T, which lost a net of 620,000 traditional video customers between DirecTV and AT&T TV. In the second quarter of this year, AT&T spun these customers off into a separate company.
  • The big percentage loser continues to be Frontier which lost 6.6% of its cable customers in the quarter.
  • Big customer losses finally hit Comcast, which lost 491,000 traditional cable customers in a quarter where it added 460,000 broadband customers.
  • Charter continues to lose cable customers at a slower pace than the rest of the industry. I have to wonder if this is due to bundles that are hard to break or some similar issue.
  • This is the ninth consecutive quarter that the industry lost over one million cable subscribers.

To put these losses into perspective, these same companies had over 85.4 million cable customers at the end of 2018 and 79.5 million by the end of 2019. That’s a loss of 13 million customers (16% of customers) since the end of 2018.
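A quick check of the arithmetic behind these figures, using the totals from the table above and approximating the quarter as 90 days:

```python
# Verifying the per-day loss rate and the cumulative loss since 2018
# from the subscriber totals cited in this post.
q1_loss = 73_612_387 - 71_974_191
print(f"Lost per day in 1Q 2021: {q1_loss / 90:,.0f}")  # ~18,202 per day

since_2018 = 85_400_000 - 71_974_191
print(f"Loss since end of 2018: {since_2018 / 85_400_000:.0%}")  # ~16%
```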

The big losses in cable subscribers happened at the same time that the biggest ISPs in the country are adding a lot of broadband customers. The biggest ISPs added over 1 million new broadband subscribers in the first quarter of 2021.

In 2020, we saw that a lot of customers dropping traditional video were switching to online versions of the full cable line-up. That didn’t carry into the first quarter of 2021, when the combination of Hulu + Live TV, Sling TV, AT&T TV, and FuboTV collectively lost over 257,000 customers. I have to suspect that has to do with affordability.