The FCC is Redlining Rural America

Recent statistics on broadband usage in the US provide evidence that, unwittingly, the FCC is redlining rural America. OpenVault recently released its Broadband Industry Report for 4Q 2019, which tracks the way the US consumes data. OpenVault has been collecting broadband usage data for more than ten years, and the last two reports have been eye-opening.

The most important finding is that the average data consumed by households grew by 27% from 2018 to 2019 – in the fourth quarter of 2019 the average US home used 344 gigabytes of data, up from 275 gigabytes a year earlier.

The report also looks at power users – homes that consume a lot of broadband. They report that nearly 1% of homes now use 2 terabytes per month and 7.7% use over 1 terabyte per month. A terabyte is 1,000 gigabytes. The percentage of homes using over 1 terabyte almost doubled from 4% a year earlier. This statistic is important because it shows that the number of homes hitting the 1 terabyte data caps of companies like Comcast, AT&T, Cox, and Mediacom is growing quickly.
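These growth numbers compound quickly. Here's a quick back-of-the-envelope check in Python – the projection to a 1-terabyte cap is my own extrapolation, not an OpenVault forecast:

```python
import math

# Average household usage from the OpenVault 4Q reports (gigabytes per month).
usage_2018 = 275
usage_2019 = 344

# Year-over-year growth implied by these two figures. (This works out to
# about 25%, a bit under the 27% headline rate, likely due to rounding
# in the published base-year number.)
growth = usage_2019 / usage_2018 - 1
print(f"2018 -> 2019 growth: {growth:.1%}")  # 25.1%

# If usage keeps compounding at this rate, how long until the *average*
# home bumps into a 1-terabyte (1,000 GB) monthly data cap?
years_to_cap = math.log(1000 / usage_2019) / math.log(1 + growth)
print(f"Years until the average home hits 1 TB: {years_to_cap:.1f}")  # 4.8
```

By this arithmetic, the average home – not just the power users – runs into today's 1-terabyte caps in under five years.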

Homes are starting to buy gigabit broadband when it’s available and affordable. 2.8% of homes in the country now subscribe to gigabit speeds, up 86% from the 1.5% of homes that bought gigabit in 2018.

54% of homes now purchase broadband plans with speeds of 100 Mbps or faster. Another 23.6% of homes subscribe to broadband between 50 Mbps and 75 Mbps. This means that nearly 78% of homes subscribe to data plans faster than 50 Mbps. The average subscribed speed grew significantly in 2019, up from 103 Mbps to 128 Mbps.

What’s the point of all of these statistics? They show that broadband usage and speeds in urban America are growing by leaps and bounds while broadband in rural America sits still. Urban broadband speeds have increased so rapidly that the average home in the US in 2019 got speeds that were 25 Mbps faster than what it had in 2018. The average subscribed speed in 2019 was more than 100 Mbps faster than the FCC definition of broadband. I contend that FCC actions and inaction have now culminated in the redlining of rural broadband households. It may sound drastic to call the FCC inaction redlining, but I think the word fits the situation.

Redlining historically has been used to describe how big corporations discriminate against poor neighborhoods. Redlining is more often due to neglect than to conscious decisions – grocery stores don’t consider poor neighborhoods as places to build; cable companies and telcos make upgrades in neighborhoods where they have the most customers or the highest revenue per customer. The consequence of redlining is that some neighborhoods get left behind.

The FCC has taken a series of actions that are dooming large parts of rural America to poor broadband for decades to come. One of the most egregious is its refusal to consider a faster definition of broadband, even though every statistic shows that urban America is leaping far ahead of rural America and the broadband gap is growing rapidly each year.

The decision to stick with the outdated 25/3 definition of broadband then boxes the FCC into allowing federal grant dollars to go toward building technologies that meet only the 25/3 definition of broadband. Considering how fast broadband speeds and consumption are growing, this is an amazingly shortsighted decision when grant recipients for programs like RDOF have six years to construct the new networks. There will be ISPs still constructing 25/3 broadband networks using federal money in 2026.

Next, the FCC has made it clear that any rural area that gets any federal or state subsidy – even if it’s to support 25/3 Mbps service or satellite broadband – is not going to be eligible for future federal assistance. Once the FCC sticks you with poor broadband, they’re done with you.

Finally, the FCC continues to hide behind ludicrously dreadful maps that show good broadband available to millions of homes that have no broadband option. The rules for the Form 477 data collection are lousy, but that’s only half the problem – I can’t recall ever hearing any discussion at the FCC about penalizing ISPs that file fraudulent speeds. There should be huge financial penalties for a telco that claims 25/3 speeds when nobody gets speeds even close to that, or for WISPs that claim 100 Mbps speeds and deliver 15 Mbps. These ISPs are stopping whole counties from being eligible for broadband grants.

All of these FCC actions and inaction have barred huge swaths of rural America from even participating in the federal grant programs that fund better broadband. If that’s not redlining, I don’t know what else to call it.

Low-orbit Satellite Security

I’ve been watching the progress of the low-orbit satellite providers that are promising to bring broadband solutions across the planet. There has been some serious movement since the last time I discussed their status.

On January 29, Starlink launched its latest round of low-orbit satellites, bringing the number in space to 242. Not all of these will be delivering broadband. The first half dozen satellites were test units to try out various concepts. Starlink will use 10 of the most recent batch to test the ability to ‘de-orbit’ and bring satellites back to earth.

The latest Starlink satellites weigh 260 kilograms, up from 227 kilograms for the first satellites launched in May 2019. The latest satellites are designed to be 100% demisable, meaning they will completely burn up in the atmosphere upon reentry.

Starlink still has a long way to go to meet its business plan. If they meet all of the planned launches this year, they’ll have 1,500 satellites in orbit. They’ve told the FCC that they plan to have 6,000 satellites in orbit by the end of 2024 and 12,000 by the end of 2027. As they add new satellites the company must also replace the short-lived satellites that have a planned life of only about five years. That means that by 2026 they’ll have to launch 1,200 satellites a year forever just to maintain the first fleet of 6,000 satellites.
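The replacement math in that plan is worth spelling out. A minimal sketch – the fleet sizes and five-year life come from Starlink's public statements, but the function itself is mine:

```python
def replacement_rate(fleet_size: int, lifetime_years: float) -> float:
    """Satellites that must be launched per year just to hold a fleet steady.

    Once a constellation is fully deployed, the entire fleet turns over
    every `lifetime_years`, so steady-state launches = size / lifetime.
    """
    return fleet_size / lifetime_years

print(replacement_rate(6_000, 5))   # the 2024 fleet: 1,200 satellites per year
print(replacement_rate(12_000, 5))  # the full 2027 fleet: 2,400 per year
```

Note that the full 12,000-satellite constellation would double the treadmill again, to 2,400 replacement launches per year, forever.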

We also saw some progress being made by OneWeb, the satellite company founded by Greg Wyler with backing from Virgin, Airbus, SoftBank, and Qualcomm. The company launched 6 satellites last year. They recently launched 34 more satellites and the company’s goal is to put 200 satellites in orbit this year.

These launches show that the industry is for real and that over the next few years we’ll see big numbers of low-orbit satellites in the sky. We finally heard just last week from Elon Musk that he does not intend to compete with rural ISPs and will only sell satellite broadband in the most remote places. He still hasn’t disclosed prices – but if he doesn’t compete with existing ISPs he’s not going to have to be competitively priced. Starlink hints that it might add some customers by the end of this year, but the serious launch of broadband service will start next year.

It’s starting to feel odd that these companies won’t talk about broadband speeds. As with any broadband technology, the degree of oversubscription will affect broadband performance. The first customers to use the satellites might see blazingly fast speeds – but speeds will drop quickly as customers are added. One of the biggest temptations facing these companies will be to oversubscribe the technology.
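Oversubscription is simple division, which is why speeds fall so fast as customers are added. A hypothetical sketch – every number here is an assumption of mine, not anything Starlink or OneWeb has published:

```python
# Per-user speed on a shared link degrades as subscribers are added.
# `capacity_mbps` is the usable throughput serving one area; `active_share`
# is the assumed fraction of subscribers online at the busy hour.
def busy_hour_speed(capacity_mbps: float, subscribers: int,
                    active_share: float = 0.1) -> float:
    """Rough per-user speed during the busy hour, in Mbps."""
    active_users = max(1, subscribers * active_share)
    return capacity_mbps / active_users

# A hypothetical 20 Gbps satellite beam:
print(busy_hour_speed(20_000, 100))     # 100 early subscribers: 2,000 Mbps each
print(busy_hour_speed(20_000, 10_000))  # 10,000 subscribers: 20 Mbps each
```

The same capacity that feels like gigabit broadband to the first hundred customers becomes 20 Mbps once the beam is loaded up – the heart of the oversubscription temptation.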

Like any new technology, satellite broadband brings a new set of worries. A recent article in Fast Company by William Akoto asks how we’re going to protect satellite fleets from hacking. If the proposed satellite constellations grow as promised, there will be tens of thousands of satellites circling the earth delivering broadband. Akoto points out that the satellite supply chain is far from secure and open to tampering. The satellites are being constructed by a number of different vendors using off-the-shelf components. The satellites are not much more than a router connected to a solar array.

It’s clear that there is virtually no hardware or software system that can’t be hacked by a determined effort. The satellites will fly over every country on earth, giving hackers ample opportunity to attack satellites directly overhead. The satellites will be controlled by earth station hubs, which might also be hacked in the same manner as big corporate server farms.

The consequences of hacking are more dire for satellites than for land-based technology. Hackers could turn satellites off, making them dead weights in space. They could misalign the solar collectors to make the satellites run out of power. Hackers could direct satellites to fall back to earth and burn up in the atmosphere.

In the worst scenario, hackers could crash satellites together, creating a lot of space debris. NASA scientist Donald Kessler described the dangers of space debris in 1978 in what’s now called the Kessler syndrome. Every space collision creates more debris and eventually creates a cloud of circling debris that makes it impossible to maintain satellites in space. Many scientists think such a cloud is almost inevitable, but malicious hacking could create one quickly.

Hacking won’t only affect rural broadband. The ability of satellites to connect remote locations into a unified network is going to be attractive to a wide range of industries. It’s not hard to imagine the satellite constellations being used to connect to critical infrastructure like rural electric grids, rural dams, and industries of all sorts that connect to rural or third-world locations.

Industry experts are already calling for regulation of satellite security. They believe that governments need to step in to mandate that satellite constellations be as safe as possible. While this could be done voluntarily by the industry there doesn’t seem to be any such effort afoot. The consequences of not getting this right could be a disaster for the planet.

A New Paradigm for Conventions?

In the last few days, I’ve seen numerous notices of telecom conventions and meetings being canceled or postponed. Many big corporations that attend conventions have already decided that their employees can’t undertake non-essential travel. Local governments are likely to start canceling conventions soon even if meeting organizers won’t. It seems, at least for this year, that big public telecom events will be rare, if they happen at all.

I’ve been thinking about this for a few days, and it seems like a good time to reexamine how we hold telecom conventions. Considering how much technology has changed, the way we hold telecom conventions hasn’t changed in the forty years I’ve been going to them. There is usually a string of speakers during the day using PowerPoint presentations (it used to be overhead slides), mixed in with panels of folks discussing various topics. There are vendors that pay for coffee breaks and meals hoping that people will stop by their booths to chat.

You probably wouldn’t be able to tell much difference if you were plopped down into a convention from twenty years ago – other than that the laptops were larger and the speakers were talking about the big breakthroughs in DSL. There is one big difference I’ve noted that should concern convention planners – there are not nearly as many young people attending conventions today as there were twenty years ago. They find them boring and unproductive.

I got a few glimpses of a different way to meet. FierceWireless just announced a completely online ‘convention’ for 5G. I call it a convention because it stretches over multiple days and includes an array of speakers you’d expect to see at a live 5G convention. I also got a notice that WISPAmerica 2020 is going virtual – no details yet of how they’ll do it.

Having virtual portions of conventions is an idea that’s long overdue. It’s got to be a lot easier to assemble good speakers for virtual presentations. Virtual speakers can devote a few hours rather than a few days to talk at a convention. People like FCC Commissioners or presidents of major telecom firms might speak at a lot more events if they are able to speak from their office for an hour instead of making a trip. Online sessions might also make it easier to ask questions of presenters – sessions are freed from the constraints of clearing out meeting halls for the next presentation, and question sessions could be extended as needed.

If we really want to duplicate the convention experience, then having virtual speakers is not enough. The main reason that a lot of people, including me, go to conventions is the networking and the chance to make new connections in the industry. As a consultant, I invariably meet a few potential new clients and I get to catch up with existing clients. I also go to check in with the various vendors to see what’s new.

I don’t think it would be hard to duplicate the networking in a virtual convention. Speakers, vendors, and attendees could post calendars and make appointments to speak virtually with each other for 15 or 30-minute slots. This would be a lot more productive than a live convention because I always come home feeling like I’ve not met with everybody that I should have.

The coronavirus isn’t going to last forever, and it will die out or we’ll eventually find an effective vaccine. Virtual meetings like the one I describe above could keep communications in the industry flowing this year and not put the industry on hold. If anything, the giant increase in the demand to work-from-home and the demand for telemedicine means that the broadband industry will likely be busier than ever.

My hope is that after this crisis is over we don’t return to the existing convention format. Future live conventions would benefit from these same ideas. Bringing in virtual speakers can improve the quality of the message being conveyed. Most conventions have a few good speakers but also a host of the same folks who speak year after year. Having a mix of live and virtual speakers would be an upgrade. Scheduling meetings between attendees is an idea that’s ten years overdue.

This would also be a boon to vendors. The current system – having valuable employees man booths for several days to meet with folks in hurried snatches of time – is incredibly unproductive. Having a reservation system to easily schedule virtual meetings with vendors would be incredibly attractive to me. It ought to also be attractive to vendors, who would get quality time with interested attendees instead of trying to juggle several folks standing around their booth at the same time. I can’t tell you how many vendor booths I’ve walked away from because they were busy with somebody else.

Of course, this raises the question of eventually having virtual attendees as well. Paying a fee could give virtual attendees access to the speaker sessions. It would also allow for one-on-one meetings with speakers and vendors. I know there are many conventions that I’ve considered attending but that didn’t fit into my schedule. I would participate in more events virtually if I could buy a half-day, full-day, or several-day pass, priced appropriately.

The above scenario is a big break from the way we’ve traditionally held conventions. I know that I would find the virtual format I’ve described to be a lot more efficient and productive than what actually happens at conventions. We already have the technology that could make this work – although somebody has to bundle this into a convention product. There are folks who attend conventions to get out of the office and have a beer with colleagues – and that’s one reason conventional conventions won’t totally lose their appeal in the future. But if we want to make conventions relevant to the next generation of telecom employees and make them more efficient for everybody today, then mixing a virtual component into conventions ought to become the new norm.

The Dirty Secret of Coaxial Broadband

The US has clearly pinned its hopes for providing modern broadband on the big cable companies. At the end of 2019, the big cable companies had almost 68 million customers compared to 33 million for the big telcos. Any discussion of broadband in urban markets is mostly a discussion of big cable company broadband. Cable companies will continue to grow their market dominance as urban DSL customers continue to migrate to cable modems. In 2019 the big cable companies added 3.1 million customers while the big telcos lost over 600,000 customers.

The big cable companies have all advertised to their customers that they have upgraded to the latest technology, DOCSIS 3.1, and can now provide gigabit broadband – at a price set well over $100 per month in most markets.

It’s easy to think of urban cable systems as up-to-date, high-tech, and able to deliver fast broadband speeds. While this is true in some cities and some neighborhoods, the dirty secret of the cable industry is that its networks are not all up to snuff. Everybody is aware of the aging problems that have plagued the telephone copper network – but it’s rare to hear anybody talk about the aging of the cable companies’ coaxial networks.

Most of the cable networks were built in the 1970s, some even a little earlier. Just like telephone copper networks, the coaxial networks are getting old, and a network built around 1970 is now fifty years old.

Cable coaxial networks suffer more from deterioration than do telephone copper networks. The copper wires in a coaxial system are much larger and the wires hanging on poles act like a giant antenna that can receive a range of different frequencies. Any physical opening into the wire through a splice point or from aging creates a new ingress point for external frequencies – and that equates to noise on the coaxial network. Increased noise translates directly to decreased performance of the network. The capacity of the older coaxial networks is significantly lower than when the networks were first constructed.

Another issue with coaxial networks is that the type of coaxial cable used has changed over time and some of the coax used in the early networks can’t handle the capacity needed today. Some older coax has been replaced in urban networks, but not all. Coaxial networks in smaller towns still can contain a lot of older-generation coaxial cables.

These issues mean that coaxial networks don’t always perform as well as the cable companies tout. I can use the network in my city of Asheville, NC as an example. Charter announced nationally that when it upgraded to DOCSIS 3.1 it had a goal of raising broadband speeds everywhere to 200 Mbps. My speed at the modem is 135 Mbps. I’m not complaining about my speed, and I’m glad they increased it, but there must be issues in the local network that stopped Charter from achieving its 200 Mbps goal.

We undertake surveys and citywide speed tests across the country, and we often see that the performance of coaxial networks varies by neighborhood. We’ve seen neighborhoods with more outages, more variance in download speeds, and overall slower speeds than the rest of the city. These problems are almost certainly due to differences in the quality of the coaxial network from one part of a city to another.

Cable companies could bring older neighborhoods up to snuff, but such upgrades are expensive. It might mean replacing a lot of drops and any runs of older coaxial cable. It might mean replacing or re-spacing amplifiers. It often means replacing all of the power taps (the devices that connect homes to the distribution cables). The upgrading effort is labor-intensive, and that means costly.

I think this means that many cities will never see another unilateral increase in broadband speeds unless the cable companies first make big investments. The cable companies have increased speeds every few years since 2000 to keep ahead of the telcos and to make customers happier with their service. I fear that since cable companies are becoming de facto monopolies in most cities that they have lost the incentive to get faster if that means spending money. The coaxial networks and speeds that we have in place today might be what we still have a decade from now, only with coaxial networks that are another ten years older.

Apple Satellites?

Word has leaked out that Apple is working on a satellite project. The company is at the beginning of the research project, so there is no way to know exactly what they have in mind. For example, is the company considering launching satellites or would they lease capacity from one of the other planned satellite networks?

The fact that Apple is working on the concept is a good segue to discuss the many ways that satellite connectivity could be useful to Apple or other companies. It’s hard to find any press that doesn’t assume that the satellite constellations will be used mostly for rural broadband, but there are numerous other ways that Apple or others could use low-orbit satellites.

One of the more obvious ways that Apple could use satellites is by offering its own branded broadband to go with their devices. It’s not hard to imagine iMacs or iPads having the option to be bundled with Apple satellite broadband, particularly for customers that don’t have adequate home broadband today. With the current vision of satellite technology, any customer connected this way would need the same sort of dish at their home as envisioned by Starlink – a flat dinner-plate-sized antenna that doesn’t have to be ‘aimed’ at the satellites.

Apple might instead be thinking of using satellites to communicate with cellphones, which would allow the company to un-tether from cellular companies. It’s unlikely that the fleets of low-orbit broadband satellites could communicate with something as small as a cellphone. However, a new company – AST & Science – recently announced that they have found a way that cellphones can communicate through satellites. This involves putting up large satellites that would act as a cellular repeater in the sky. For a space nerd like me this brings back memories of Echo 1, pictured above, which was a giant balloon that acted as a passive reflector of microwave signals. AST & Science says that this kind of satellite would act as a cellular repeater rather than as a cell site – it would connect cellphones to a cell site elsewhere.

Apple might also be considering an automobile antenna that can work with satellites. A satellite-to-car antenna would open up a host of products for Apple including smart car connectivity products. This would not be the data-intensive connections imagined by the self-driving car folks, but even a relatively slow satellite connection of 25 Mbps would open up a whole range of broadband products for use in vehicles.

Apple’s early research might go nowhere and they might just be brainstorming on what is practically possible. The fact that companies like Apple are looking at satellites points out that there are likely many applications for satellite broadband that nobody is talking about. It makes sense that the press, for now, is concentrating on whether any of the proposed satellite constellations ever get launched, because until they are in the sky all of this discussion is purely speculative.

However, the possibilities are endless. How many uses can be developed for a worldwide broadband network that’s available everywhere? Some applications seem obvious, like tying together communications for all of the locations of a worldwide corporation into a big private network. It’s not hard to imagine school systems using the satellites as the way to get broadband for homework to every student. I’m betting there are hundreds of other ideas that have market potential. It will be interesting to see which ones are of the most interest to Apple.

The Explosive Growth of M2M Traffic

The Cisco Annual Internet Report for 2018 – 2023 is full of interesting predictions this year. One of the more intriguing predictions is that Machine-to-Machine (M2M) traffic (which they also refer to as Internet of Things (IoT) traffic) will account for a little more than half of all the traffic on the web by 2023. That’s an amazing prediction until you stop and think about all of the devices that communicate with the Internet without needing a human interface.

Cisco forecasts several reasons why M2M traffic will grow so much in the next few years. The primary driver is the proliferation of M2M devices. Cisco predicts that over the 5-year period connected devices will grow 2.4 times, from 6.1 billion in 2018 to 14.7 billion in 2023. That’s a 19% compounded growth rate, and by 2023 it equals 1.8 connected devices for every person on earth.
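Cisco's numbers are internally consistent, which is easy to verify – the only figure of my own below is the rough 8 billion world population for 2023:

```python
# Compound annual growth rate implied by Cisco's device forecast.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

devices_2018, devices_2023 = 6.1e9, 14.7e9

rate = cagr(devices_2018, devices_2023, 5)
print(f"Implied CAGR: {rate:.0%}")                             # 19%
print(f"Growth multiple: {devices_2023 / devices_2018:.1f}x")  # 2.4x

# Devices per person, assuming roughly 8 billion people in 2023.
print(f"Per capita: {devices_2023 / 8e9:.1f} devices")         # 1.8 devices
```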

The second reason for the growth is that we are using M2M devices for a lot more functions than just a few years ago. Cisco is predicting fast growth in the following categories of M2M:

  • They predict the number of worldwide connected home devices will grow by 20% per year. This is the largest category of devices and will represent just under 50% of connected devices by 2023. This category includes things like home automation, home security and video surveillance, connected white goods (the new term for connected appliances), and our communications and entertainment devices like smart TVs, laptops, desktops, and smartphones.
  • They predict that connected car applications will be the fastest-growing sector, growing at 30% per year. This includes connections made for things like fleet management, in-vehicle entertainment, emergency calling systems, vehicle diagnostics, and vehicle navigation.
  • Cisco predicts that connected city applications will be the second fastest-growing M2M category, with 26% compounded growth. This includes things like smart traffic systems, surveillance cameras, weather and environmental monitors, smart parking systems, gunshot monitors, etc.
  • They predict that connected health will grow 19% annually. This category mostly consists of telemedicine monitors used for outpatient monitoring.
  • Connected energy applications are predicted to grow by 24%. This includes smart grid monitors that track utility usage and loads and pinpoint network outages quickly. It includes energy monitors that can turn off air conditioners during times of heavy peak usage. It includes sensors in water systems that track pressure and usage and that predict underground leak locations.
  • Cisco predicts connected work will grow by 15%. This is used for things like inventory tracking, surveillance and security monitoring, and tracking and connecting to employees working in the field.
  • They predict that connected retail will grow by 11% annually. M2M traffic is being used to track inventory. Big chain stores are starting to track the shopping pattern of individual shoppers to see how they traverse the various departments.
  • Connected manufacturing and supply chain will grow by 8% annually. Supply chain monitoring tracks the delivery status of components needed in the manufacturing process. This also includes smart warehousing that automates the packing and shipping of orders. Smart manufacturing supports monitors that track the performance of machinery and manufacturing processes.
  • They predict all other M2M traffic will grow by 19%. This would include things like smart agriculture where monitors are tracking individual herd animals and are just starting to be deployed to monitor crop conditions. This would include other things like sports monitors.

The volume of traffic generated by M2M devices surprises people. So much of what we do happens in the background, and we either forget about it or don’t even know it’s happening. For example, there was an article in the Washington Post last year by a reporter who left the country for a month and left his cellphone at home. During his absence, the phone used a significant portion of his monthly data plan updating apps and communicating regularly with remote web sites. My wife’s car connects to the web through our WiFi every time she pulls into the driveway, uploads diagnostics from its various monitors, and checks for and downloads needed software updates. Whether for good or bad, our machines and electronics are connecting to the web and using broadband.

New Emphasis on Working from Home

One of the hottest topics in the news related to coronavirus is working from home. Companies of all sizes are telling employees to work from home as a way to help curb the spread of the virus. Companies without work-at-home policies are scrambling to define how to make this work to minimize disruption to their business.

Allowing employees to work at home is not a new phenomenon. Most large corporations have some portion of the workforce working at home at least part-time. Studies have shown that home-based employees are often more productive than those working in the office. Those working at home enjoy big savings, both in dollars and time, from not commuting to an office.

There are a few communities around the country that have offered incentives to attract employees who work from home. The first such program I heard of was in 2018, when Vermont offered a cash incentive of between $5,000 and $10,000 for families with a home worker to relocate to the state. The state has an aging population and wanted to attract families with good incomes to help energize the local economy. The state recognized that the long-term benefit of attracting high-paying jobs is worth a lot more than the cash incentive it is offering.

Since then, other communities have tried the same thing. I recently read about a similar effort in Tulsa, Oklahoma, which has watched its population drop since 2016. In Tulsa, a foundation is fronting the $10,000 payments used to attract home workers to the community. There are similar programs in Topeka, Kansas, and in northwest Alabama.

I’ve been working from home for twenty years, and during that time I’ve seen a big shift in the work-from-home movement. When I first worked from home, I didn’t know anybody else who was doing so. Over time that has changed and in my current neighborhood over a third of the homes on my block include at least one adult working from home. According to Bloomberg, about 4% of the full-time workforce, not counting self-employed people, now work from home. Adding in self-employed people means that work-from-home is a major segment of the economy.

Wall Street seems to have recognized the value of working at home. As I write this article the Dow Jones average has dropped over 11% since February 14th. During that same time, the stock price of Zoom, a company that facilitates remote meetings, has climbed over 27%.

I’m sure that most of the people being sent home to work are going to eventually return to the office. However, this current crisis is likely to make many companies reexamine their work-from-home philosophy and policies. Companies that allow people to work from home, at least part-time, are going to be the least disrupted by future economic upheavals.

If you read my blog regularly, you know what’s coming next. The one group of people who can’t work from home are those who can’t get a decent home broadband connection. Huge numbers of rural homes in the country still have no broadband option or can only buy broadband that is not sufficient for working from home. Most corporations test the home broadband connection before letting employees work from home, and homes can be disqualified due to poor download speed, poor upload speed, or poor latency. A home broadband connection that meets the FCC definition of broadband at 25/3 Mbps might still be deemed by a corporation to be inadequate for working from home.
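To make that corporate test concrete, here's a sketch of the kind of qualification check an IT department might run. The thresholds are hypothetical examples of my own, not any particular company's policy:

```python
# A hypothetical work-from-home broadband qualification test. Note that it
# checks upload speed and latency, not just download speed - the two areas
# where rural connections most often fall short.
def qualifies_for_work_at_home(download_mbps: float, upload_mbps: float,
                               latency_ms: float) -> bool:
    return (download_mbps >= 50      # video calls plus VPN overhead
            and upload_mbps >= 10    # screen sharing and file uploads
            and latency_ms <= 100)   # responsive remote desktop and VoIP

# A connection that meets the FCC's 25/3 Mbps definition of broadband
# can still fail a stricter corporate test:
print(qualifies_for_work_at_home(25, 3, 40))    # False
print(qualifies_for_work_at_home(100, 20, 30))  # True
```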

My consulting firm CCG talked to a homeowner this week who moved to a rural area looking for an improved lifestyle. The wife works from home, and before they bought the new home they were assured that the broadband there was fast enough to support work at home. It turns out the home is served by a WISP that is delivering less than the advertised speed, and that working from home is impossible in the new home. This family is now facing a crisis caused by lack of good broadband – and there may be no solution for their problem.

Sadly, a whole lot of America is losing economically by not being able to attract and support good-paying jobs from those working at home. If a city like Tulsa is willing to pay $10,000 to attract one work-from-home employee, imagine the negative impact on rural counties where nobody can work from home.

Introducing 6 GHz into WiFi

WiFi is already the most successful deployment of spectrum ever. In its recent Annual Internet Report, Cisco predicts that by 2022 WiFi will cross the threshold and carry more than 50% of global IP traffic. Cisco also predicts that by 2023 there will be 628 million WiFi hotspots – most used for home broadband.

These are amazing statistics when you consider that WiFi has been limited to using 70 MHz of spectrum in the 2.4 GHz band and 500 MHz in the 5 GHz band. That’s all about to change as two major upgrades are being made to WiFi – the upgrade to WiFi 6 and the integration of 6 GHz spectrum into WiFi.

The Impact of WiFi 6. WiFi 6 is the new consumer-friendly name given to the next generation of WiFi technology (replacing the term 802.11ax). Even without the introduction of new spectrum, WiFi 6 will significantly improve performance over WiFi 5 (802.11ac).

The problem with current WiFi is congestion. Congestion comes in two ways – from multiple devices trying to use the same router, and from multiple routers trying to use the same channels. My house is probably typical, and we have a few dozen devices that can use the WiFi router. My wife’s Subaru even connects to our network to check for updates every time she pulls into the driveway. With only two of us in the house, we don’t overtax our router – but we can when my daughter is home from college.

Channel congestion is the real culprit in our neighborhood. We live in a moderately dense neighborhood of single-family homes and we can all see multiple WiFi networks. I just looked at my computer and I see 24 other WiFi networks, including the delightfully named ‘More Cowbell’ and ‘Very Secret CIA Network’. All of these networks are using the same small number of channels, and WiFi pauses whenever it sees a demand for bandwidth from any of these networks.

Both kinds of congestion slow down throughput due to the nature of the WiFi specification. The demands for routers and for channels are queued, and each device has to wait its turn to transmit or receive data. Theoretically, a WiFi network can transmit data quickly by grabbing a full channel – but that rarely happens. The existing 5 GHz band has six 80-MHz and two 160-MHz channels available. A download of a big file could go quickly if a full channel could be used for the purpose. However, if there are overlapping demands for even a portion of a channel, the whole channel can’t be dedicated to a single task.

WiFi 6 introduces a few major upgrades in the way that WiFi works to decrease congestion. The first is the introduction of orthogonal frequency-division multiple access (OFDMA). This technology allows devices to transmit simultaneously rather than wait for a turn in the queue. OFDMA divides channels into smaller sub-channels called resource units. The analogy used in the industry is that this will turn WiFi from a single-lane road into a multi-lane freeway. WiFi 6 also uses other techniques like improved beamforming to make a focused connection to a specific device, which lowers the chances of interference from other devices.
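The resource-unit idea can be sketched with some simple arithmetic. This is a toy illustration, not the actual 802.11ax scheduler; the tone counts below are the published 802.11ax resource-unit sizes for a 20 MHz channel, and everything else is illustrative:

```python
# A toy sketch of the OFDMA idea in WiFi 6: instead of one device owning a
# whole channel while everyone else queues, the channel's data subcarriers
# ("tones") are split into resource units (RUs) that several devices can
# use simultaneously. Tone counts follow the 802.11ax RU sizes for a
# 20 MHz channel; the arithmetic here is illustrative, not a scheduler.
RU_TONES = {"26-tone": 26, "52-tone": 52, "106-tone": 106, "242-tone": 242}
CHANNEL_TONES_20MHZ = 242  # usable data subcarriers in one 20 MHz channel

def max_parallel_devices(ru_size: str) -> int:
    """How many devices can share one 20 MHz channel at a given RU size."""
    return CHANNEL_TONES_20MHZ // RU_TONES[ru_size]

for ru in RU_TONES:
    print(f"{ru:>8} RUs -> up to {max_parallel_devices(ru)} devices at once")
```

At the smallest RU size, nine devices can transmit in the same 20 MHz channel at the same time – the multi-lane freeway in the industry analogy.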

The Impact of 6 GHz. WiFi performance was already getting a lot better due to WiFi 6 technology. Adding the 6 GHz spectrum will drive performance to yet another level. The 6 GHz spectrum adds seven 160 MHz channels to the WiFi environment (or, alternately, fifty-nine 20 MHz channels). For the typical WiFi environment, such as a home in an urban setting, this is enough new channels that a big bandwidth demand ought to be able to grab a full 160 MHz channel. This is going to increase the perceived speed of WiFi routers significantly.
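Those channel counts fall out of simple division of the new band. A back-of-the-envelope sketch, assuming the full 1,200 MHz of 6 GHz spectrum under consideration; real channel plans lose a little to guard bands, which is why the actual plan has 59 rather than 60 of the 20 MHz channels:

```python
# Back-of-the-envelope channel math for the new 6 GHz WiFi band.
# Assumes the full 1,200 MHz under consideration; actual channel plans
# trim slightly for guard bands (59 rather than 60 20-MHz channels).
BAND_MHZ = 1200

for width_mhz in (20, 40, 80, 160):
    print(f"{width_mhz:>3} MHz wide: up to {BAND_MHZ // width_mhz} channels")
```

For comparison, the entire existing 5 GHz allocation is only 500 MHz – the new band more than triples the spectrum available to WiFi.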

When the extra bandwidth is paired with OFDMA technology, interference ought to be a thing of the past, except perhaps in super-busy environments like a business hotel or a stadium. Undoubtedly, we’ll find ways over the next decade to fill up WiFi 6 routers and we’ll eventually be begging the FCC for even more WiFi spectrum. But for now, this should solve WiFi interference in all but the toughest WiFi environments.

It’s worth a word of caution that this improvement isn’t going to happen overnight. You need both a WiFi 6 router and WiFi 6-capable devices to take advantage of the new technology. You’ll also need devices capable of using the 6 GHz spectrum. Unless you’re willing to throw away every WiFi device in your home and start over, it’s going to take most homes years to realize the combined benefits of WiFi 6 and 6 GHz spectrum.

Old Regulation Rears its Head

The way that we regulate telecom services is interesting. The FCC has effectively eliminated federal regulation of broadband, the service that over 90% of households now use. Meanwhile, landline telephone service, the telecom product used by an ever-decreasing number of homes, is still heavily regulated.

The target of much of the remaining regulation is the big telephone companies that still operate large copper networks. It’s easy to bash the big telephone companies because of the poor quality of services offered on those copper networks, and I’ve done so many times in this blog.

When you stop and think about it, those companies are still using copper networks built in the 50s, 60s, and 70s. Even if the telcos had been good stewards of those networks and maintained them meticulously, the networks would still be 50 to 70 years old – well past the 35-40 year expected life for copper networks. The big telcos largely ignored maintenance of copper for the last 30 years or more, and frankly, it’s a miracle that the old copper networks are still working.

Perhaps the oddest aspect of telephone regulation is that a regulatory body will occasionally punish a big telco for still being in the copper business. A good example is a proceeding in New Mexico last fall where CenturyLink asked to be deregulated for landline telephone services. This doesn’t mean that the company would stop offering the services, but rather that many of the old regulations put in place in the heyday of the telephone monopolies would be relaxed. Most states have already deregulated the big telcos from a lot of the old telephone rules.

The New Mexico Public Regulation Commission (NMPRC) rejected the request and said that CenturyLink had not demonstrated that there was ‘effective competition’ for residential telephone service. It’s hard to find any way to defend that decision. First, in many states, there are now more residential telephone customers using cable company telephone services than the old telephone company copper. Interestingly, cable companies face almost no regulation in providing telephone service, and cable companies in New Mexico do not operate under the same rules that CenturyLink must follow.

Further, the latest surveys I’ve seen show that 96% of US adults now have a cell phone. It’s hard to say with a straight face that cellular service is not a direct competitor to landline telephone service. Considering the big recent stir at the FCC where cellular 4G coverage maps were shown to be largely fictional, perhaps a lot of rural New Mexico doesn’t have cellular coverage – and perhaps that’s what drove the Commission’s decision. It’s worth noting that cellular companies are also not as heavily regulated as landline telephone providers.

The regulation that is most relevant in this case is the obligation to be the carrier of last resort. The telcos like CenturyLink are still expected, within some regulatory exceptions, to provide service to anybody who asks for service. That obligation doesn’t extend to the cable companies, to the cellular companies, or even to rural broadband – just to telephone service.

I have no doubt that there are rural homes in the state for which CenturyLink is the only communications link to the world. In areas where there is no cellular service and where the cable companies refuse to build networks, there are rural homes that rely on CenturyLink and other telcos to keep them connected. The regulatory question that must be asked is if such homes are sufficient reason to still strongly regulate telephone service in a state. Hopefully, the number of homes without cellular service will decrease significantly when the FCC awards the $9 billion in the 5G Fund program to extend cellular service to more remote communities.

It’s not an easy question to answer. We know CenturyLink could have done a better job of taking care of its copper. In this country, the smaller independent telephone companies did the needed maintenance to keep copper in the best shape possible. We saw the same thing in Germany, where the copper networks were built at the same time as US copper but have been maintained better.

But in this country, most of the smaller telcos have already replaced, or have plans to replace, the old copper with fiber. In Germany, there are vigorous public debates on the topic, with engineers saying that the copper networks are not likely to last more than another decade. Where copper remains, the Germans have invested in the fastest DSL possible – something the big telcos here inexplicably have not done.

To some degree the decision in New Mexico is meaningless. No regulatory decision can make the old copper perform better or last longer, so there are not many practical ramifications of the Commission’s decision. CenturyLink didn’t even own these networks for most of the years when the maintenance wasn’t done – although they have likely cut back further on maintenance in recent years, as have the other big telcos.

I’m not singling out New Mexico on this issue, because many other states have made similar regulatory decisions. Regulators are rightfully mad at the big telcos for neglecting copper, and even madder that there are no plans to upgrade the copper to something better. But the time for regulators to do something about this was twenty or thirty years ago. The copper wires in New Mexico are going to die, and at some future date the networks will go dark. The regulators can choose to regulate copper down to the last day of the last customer – but to a large degree, the remaining regulations don’t mean a whole lot.

Broadband is Now a Mature Market

One of the most interesting things revealed by Cisco’s latest Annual Internet Report is the extent to which North America is now largely a mature broadband market. In this case, North America is the combination of the US and Canada, and Cisco does not provide separate data for each country.

Consider the following statistics for North America between 2018 and 2023:

  • The percentage of people using the Internet in 2023 will grow to 92% (345 million users) up from 90% (328 million users) in 2018.
  • The percentage of people using cellphones in 2023 will grow to 88% (329 million users) up from 86% (313 million users) in 2018.

This paints a picture of North America as a mature broadband market. While there are still new customers to land, ISPs collectively will not be winning many new customers. The growth of Internet users from 328 million to 345 million over 5 years represents an annual growth rate of only about 1%. In economic terms, that’s a mature market.

There will likely continue to be movement within the market. In the year ending in the third quarter of 2019, the big cable companies took at least 2 million customers from telcos – a trend that is likely to continue. However, some telcos are fighting back by building fiber, such as the 12 million fiber passings built by AT&T over the last few years.

Cisco is painting the same bleak future for cellular customers and is predicting the same slow 1% annual growth for North America. The cellular companies have been waging a marketing war and stealing customers from each other – a largely zero-sum game. The cellular market is getting tougher as Comcast and Charter continue to win cellular customers and Dish Network is poised to enter the market in a few years.

The mature nature of the broadband and cellular industries exposes the FCC’s fiction that carriers will be spending a lot more capital due to relaxed regulations. It’s hard for any ISP to justify spending a lot of capital in a stagnant, slow-growing market. Most capital spending is being done to upgrade to newer technologies, and little is needed to accommodate customer growth.

Equipment manufacturers aren’t focusing on North America. While the US will add a net of 17 million people to the Internet over 5 years, Asia Pacific will be adding a billion new people, the Middle East and Africa will be adding 230 million people, and Latin America will be adding 83 million people. Europe is also a mature market and will only be adding 25 million people to the Internet over 5 years.

These numbers show why the administration’s attempt to somehow squelch Huawei is likely doomed to failure. Huawei doesn’t need North America or Europe to succeed and can far outstrip European and American vendors by concentrating on Asia and Africa.

This slow growth also highlights the dilemma of the publicly traded ISPs. With a 1% annual growth rate, the big ISPs start to look like electric utilities in terms of growth potential. Comcast and Charter are still meeting Wall Street expectations by taking customers from DSL, but even that growth has to slow and eventually shrink away. All of the other big cable companies are faced with trying to please Wall Street with stagnant customer counts – something that can only be done by raising rates, cutting costs through mergers, or introducing new revenue streams.

The industry doesn’t have far to grow after 2023. Numerous surveys have shown that most of the people who don’t buy Internet access either can’t afford it or live in a rural market where it’s not available. Since the big ISPs aren’t chasing either of those customer segments, they are already collectively at their peak. Growth now comes only from general population growth – and even that news is not great, as the US birth rate keeps dropping and immigration has been curtailed.

There is nothing wrong with a mature market from an economics perspective. Unfortunately, the high growth of broadband customers over the past twenty years has created an expectation on Wall Street that telecom companies have fast growth potential. I look at the basic numbers and wonder how long it will be until Wall Street resets that expectation. We’d all be a lot better off if the big ISPs didn’t feel huge pressure to grow the bottom line.