Trade Associations

Anybody new to the broadband industry quickly finds out that the industry is full of trade associations. There are associations of service providers and other associations representing various segments of the industry, such as electronics manufacturers and construction contractors.

Trade associations play several roles for their members. Most trade associations lobby on behalf of their members, and some associations are primarily lobbyists. Trade associations routinely file position papers with federal and state regulatory agencies on behalf of members. Other trade associations concentrate on providing services to members. Perhaps one of the most useful functions of trade associations is holding conventions and meetings so that like-minded members can talk to their peers. Most of the trade associations that you’ll find quoted in the press are national associations, but there are also numerous regional and state-level trade associations.

Many trade associations are focused on a specific industry segment. CTIA largely represents cellular companies. NTCA is largely comprised of rural ISPs. But a few trade associations are interesting because they include members that seemingly compete against each other. For example, ACA Connects has a wide-ranging membership that includes cable, telco, and fiber companies under the same umbrella. One thing is for sure – an industry newcomer needs a lineup card to keep them all straight.

Viewed from inside the industry, it’s easy to think of trade associations as powerful entities since they can speak for an entire industry segment. But it’s interesting to look at the big picture. There are over 26,000 national trade associations in the U.S. – and every industry imaginable has them.

It’s been interesting to watch the evolution of industry trade associations during my career. When I first started in the 1970s, there were not many associations. But as the industry evolved, new trade associations came to life. The newest association, APPB, was formed in 2022 to represent the interests of municipal broadband providers. Existing trade associations sometimes splinter if the association starts taking positions that not all members endorse. One of the most common disagreements within trade associations is between large members and small members – it’s common for a trade association to adopt positions that favor its largest members.

One of the best examples of this is USTelecom – The Broadband Association. It is the oldest industry trade association, first created as the Independent Telephone Association in 1897 by a group of non-Bell telephone companies. The first members were looking for a unified voice to lobby against the power of the Bell telephone companies that were running roughshod over regulators at the time.

Over time, the association rebranded as USTA (United States Telecom Association). After the divestiture of AT&T into regional Bell companies, USTA underwent a big change when it allowed former Bell companies to join. Over a few years, a lot of smaller members of the association went elsewhere as the association started to be dominated by the large companies. When I see a USTelecom position now, I always assume it is primarily speaking for AT&T, Verizon, Lumen, and the other large telcos.

USTelecom is also a good example of how trade associations change to reflect the evolution of the industry. The association rebranded to add “The Broadband Association” to recognize that telephone companies have morphed primarily into broadband companies. We see the same thing happening all over the industry. When I recently looked at the website for NCTA, the association representing the largest cable companies, one of the predominant articles talks about needing more balanced policies on spectrum – because cable companies are increasingly also becoming cellular companies. Evolution in the industry is inevitable, and we’re seeing unprecedented convergence between the companies in the industry and the technologies and businesses they pursue. I have to wonder how convergence will affect trade associations. For example, what will eventually happen to an association like NCTA if cable companies all migrate to fiber? Similar questions loom all around the industry.

Next-generation PON is Here

At some point during the last year, practically every ISP I know that uses PON technology has quietly upgraded to next-generation PON. For now, that mostly means XGS-PON, which can deliver 10 gigabits of bandwidth to a neighborhood. We’re on the verge of seeing even faster PON cards that will be able to deliver 40 gigabits and probably beyond to 100 gigabits.

This is a big upgrade over GPON which delivers 2.5 Gbps download speed to a neighborhood node. In recent years ISPs have been able to use GPON technology to sell reliable gigabit speeds to homes or businesses that share the network in a neighborhood.

We saw a similar upgrade a dozen years ago when the industry upgraded from BPON, which delivered 622 Mbps to a neighborhood – the upgrade to GPON was a 4-fold increase in available bandwidth. Upgrading to XGS-PON is another 4-fold increase. 40-gigabit PON will be another 4-fold increase.
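Those 4-fold jumps fall straight out of the nominal line rates. A quick sketch, using the standard nominal downstream rates for BPON, GPON, and XGS-PON; the 40-gigabit figure is an assumed 4x next step rather than a finalized standard:

```python
# Nominal downstream line rates (Mbps) for successive PON generations.
# BPON, GPON, and XGS-PON are the nominal ITU-T rates; the 40-gigabit
# entry is an assumed 4x step for the next generation.
generations = [
    ("BPON", 622.08),
    ("GPON", 2488.32),
    ("XGS-PON", 9953.28),
    ("40G PON", 39813.12),
]

# Each generation is exactly 4x the prior one.
for (old, old_rate), (new, new_rate) in zip(generations, generations[1:]):
    print(f"{old} -> {new}: {new_rate / old_rate:.0f}x")
```

Each line prints a 4x ratio, which is why the industry talks about these upgrades as a steady series of quadruplings.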

The best thing about the current upgrade to faster PON is that the vendors got smarter this time. I still have clients who were angry that the upgrade from BPON to GPON meant a total replacement of all electronics – even though the vendors had declared that there would be an easy upgrade path from BPON. Many ISPs decided to change vendors for the upgrade to GPON, and I think vendors got the message.

The PON architecture for most vendors allows upgrading some customers to XGS-PON by adding a faster card to an existing GPON platform. This smart kind of upgrade means that ISPs don’t need to make a flash-cut to faster PON but can move customers one at a time or neighborhood by neighborhood. Upgrades to even faster generations of PON are supposed to work in the same way.

The impact of going to GPON was the widespread introduction of gigabit-speed broadband. A decade ago, gigabit broadband was declared by cable companies to be a gimmick – likely because they couldn’t touch gigabit speeds that fast at the time. But now, all large cable companies are successfully selling gigabit products. According to the latest report from OpenVault, a quarter of homes now subscribe to gigabit or faster broadband products and almost 20% of homes regularly use more than a terabyte of data in a month.

We’ve already seen changes in the market due to next-generation PON. I know a number of ISPs that are now selling 2 Gbps and 5 Gbps broadband products using the new technology. A few are now offering 10 Gbps connections.

One of the biggest decisions faced by an ISP is how many customers to load onto a single PON card at the chassis. GPON allowed for putting up to 128 customers on a PON card, but most ISPs I know only loaded 32 customers. This was a conservative decision, but it built in enough safety margin that customers almost always get the bandwidth they subscribe to.

It’s possible to load a lot more customers onto an XGS-PON card. Most of my clients are still configuring with 32 customers per card, although I’m now seeing a few ISPs load 48 or 64 customers per card. There is enough bandwidth on a 10-gigabit card to give everybody a gigabit product, even with higher customer counts, except perhaps in business districts where there might be some customers using a lot of bandwidth all of the time. The main consideration for loading extra customers on a card is the consequence of a bad card knocking out a greater number of customers.
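The loading tradeoff is simple arithmetic. A rough sketch, ignoring PON framing overhead and assuming the worst case where every subscriber bursts at the same time (in practice customers rarely do, which is why 32-way GPON splits still support gigabit tiers):

```python
def worst_case_mbps(card_gbps: float, subscribers: int) -> float:
    """Per-subscriber bandwidth (Mbps) if every subscriber bursts at once."""
    return card_gbps * 1000 / subscribers

# Compare GPON (2.5 Gbps) and XGS-PON (10 Gbps) at common split sizes.
for subs in (32, 48, 64, 128):
    print(f"{subs:>3} subs: GPON {worst_case_mbps(2.5, subs):6.1f} Mbps, "
          f"XGS-PON {worst_case_mbps(10, subs):7.1f} Mbps")
```

At 32 subscribers, XGS-PON guarantees over 300 Mbps even in the worst case; at 64 it still guarantees over 150 Mbps, which shows why some ISPs feel safe doubling the load per card.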

While you never hear them talking about it, the widespread introduction of XGS-PON is one of the driving factors behind cable companies scrambling to upgrade to faster bandwidth. While the cable companies initially scoffed at gigabit speeds on GPON, I think they’ve learned that claims of faster speeds by fiber ISPs have convinced the public that fiber is superior, even when a cable company can match fiber speeds.

The race for faster technologies is clearly on. Many industry skeptics still scoff that people don’t need faster speeds – but ISPs have learned that people will buy it. That’s a fact that is hard to argue with.

A Last Gasp at Regulating Copper

The Minnesota Public Utilities Commission recently ordered a series of public hearings to investigate the quality of service on the CenturyLink copper networks. The hearings were prompted by a complaint filed by the Communications Workers of America (CWA). The complaint listed the failures of CenturyLink to meet state service standards due to the deterioration of the copper network. CWA also noted that CenturyLink is planning to eliminate half of the remaining technicians who work on copper.

Similar inquiries by other state regulators have been instituted in the last few years against CenturyLink and Frontier. I feel sorry for any customers left on deteriorating copper networks, but proceedings like this one feel like the last gasp of regulators trying to score points by beating up on the telcos that still operate copper networks.

Not that CenturyLink doesn’t deserve a lot of criticism. Its copper networks are in dreadful condition and are in the process of dying. The poor condition of the networks is due in large part to the decades-long lack of maintenance and repairs. We know this is the case because copper networks of a similar age are still operating much better in Europe. The big telcos like CenturyLink, Frontier, Verizon, and AT&T stopped caring about copper networks back in the 1990s, and the networks have been in a steady decline since then.

But U.S. copper networks are truly near the end of life. It’s impossible to neglect maintenance for over twenty years and somehow suddenly make the networks perform better. It’s hard to fathom the intentions of having regional hearings on the topic for any purpose other than letting people vent their frustration with CenturyLink. It’s hard to imagine anything changing as a result of these hearings that will improve service. There might be new fines levied on CenturyLink, but that’s less costly for the company than trying to make the copper work.

Some big telcos are working to convert copper networks to fiber. Frontier and Windstream are building a lot of fiber – and I assume they are overlashing the new fiber wires on the old copper. AT&T and Verizon are selectively expanding fiber in neighborhoods where the cost of construction meets some internally set cost test – but these two companies are quietly moving most copper customers onto cellular connections.

CenturyLink has been up and down on the decision to overbuild residential fiber. It currently looks like the company is only building ‘strategic’ fiber, which I interpret to mean business districts and large apartment complexes. It seems unlikely that CenturyLink will overbuild much more of its residential copper in Minnesota or elsewhere with fiber.

I would bet that if CenturyLink could wave a magic wand and be rid of copper, it would do so. It’s harder each year to maintain copper networks, and a move to eliminate half of the remaining copper technicians shows that the company is finally throwing in the towel. But giving up on copper still means walking away from a lot of revenue.

There are still plenty of customers who want to keep using the copper networks. Say what you want about the inadequacies of DSL, but in most urban markets where my firm does surveys, we still find 10% to 20% of households are still using DSL. These are households for whom the price is more important than broadband speed.

CenturyLink and the other big telcos have recaptured the cost of the copper networks many times over and decided many years ago not to reinvest profits back into new and upgraded networks. We’re now reduced to watching the last death throes of copper networks, and it’s not pretty.

Ten Years

Today is the tenth anniversary of writing this blog every day. That equates to 2,527 blogs, and that got me thinking about why I write this blog. It also got me thinking about the things I have gotten right and wrong over the years in my daily musings about the broadband industry.

I give full credit for this blog to my wife Julie. Ten years ago, I told her that I was having trouble keeping up with the rapid changes in the industry. Julie suggested that I start writing a daily blog as a way to force myself to read and think about the industry. Writing a blog every day was incredibly difficult at first. I struggled to find topics, and I struggled to condense my thoughts into 700-word essays. But I stuck with it until writing became a habit. I now can’t imagine not writing a blog, and I usually have a longer list of potential topics than there are days to write about them.

Before writing this blog, I went back and read some of my blogs over the years to see what I got right and wrong. One thing about having a public blog is that you can’t escape what you’ve said in the past – it’s all still out there to read.

One of the first things I got wrong happened in the first year of writing the blog. I was highly skeptical of Tom Wheeler being named Chairman of the FCC. Mr. Wheeler had an interesting career as CEO of several high-tech companies but had also served as the President of the National Cable Television Association (NCTA) and the Cellular Telecommunications & Internet Association (CTIA). I assumed that his experience in lobbying for the biggest companies in the industry meant that he was going to bring a bias to the FCC strongly in favor of big companies over everybody else. I couldn’t have been more wrong. Tom Wheeler ended up being one of the most even-handed heads of the FCC during my career. He sometimes sided with large corporations, but he also was a champion of consumers and municipal broadband – something that I think surprised everybody in the industry. He was what you want to see in an FCC Chairman – somebody who independently supported what he thought was right instead of what was wanted by corporate lobbyists.

Another thing I got wrong was something I wrote near the end of 2019. By that time, I had heard for years from rural communities that despaired that they had no broadband and were being left behind. I wrote that I sadly didn’t see any real hope on the horizon and that rural communities were on their own to get creative and find a way to fund broadband – even though I knew that the financial lift was beyond most communities. There was no way to know that we were only a few months away from a pandemic that would change everything. We sent students and workers home to somehow cope with school and work without broadband, and the cry for better broadband could no longer be ignored. We’re now awash in broadband grant funding. It’s going to take a few years to see if the grant funding is enough to serve everybody, but broadband solutions are on the way for most rural communities that were unimaginable in 2019.

I also got some things right. From the first time that I heard about the supposed wonders of 5G, I was extremely skeptical because I couldn’t find a business case for the technology. Almost everybody in the country already had a cellphone, and it was hard to imagine that people would be willing to spend more to get the rather obscure benefits promised by 5G. If anything, the trend seemed to be in the opposite direction, with competition driving cellular prices lower. I watched in amazement as the power of large corporate lobbying invented a fervor for 5G out of thin air. The public and politicians were sold on the idea that 5G meant a broadband revolution, and the 5G message was suddenly everywhere. There is still no great business case for 5G and there has been very little actual 5G technology introduced into networks. Yet even today, I keep reading about how 5G will soon change everything.

I also got it right in predicting that broadband demand would continue to grow. Akamai reported in 2013, when this blog started, that the average broadband download speed in the U.S. was 8.6 Mbps. Pew said that 2013 was the year when home broadband connections hit a 70% market penetration. The digital divide was already evident in 2013 when 90% of homes that included a college graduate had broadband compared to only 37% for homes where the adults didn’t have a high school degree. From the beginning of writing my blog, I predicted that home broadband consumption would double every three years – and it has grown even faster. Amazingly, politicians and policymakers still act like broadband demand is static. In 2015, the FCC handed out $1.5 billion annually for six years of CAF II funding to support the rural DSL provided by the largest telcos. Even today, policymakers are ignoring the broadband growth trends by allowing BEAD grants to be given to technologies as slow as 100/20 Mbps. We embarrassingly still have a national definition of broadband of only 25/3 Mbps at a time when a large majority of folks are able to buy gigabit speeds.
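That doubling claim compounds fast. A quick sketch of what “doubles every three years” implies; the ~75 GB/month starting value for 2013 is an assumed round number for illustration, chosen because it lands almost exactly on today’s roughly 600 GB/month average:

```python
def projected_usage_gb(start_gb: float, years: float,
                       doubling_years: float = 3.0) -> float:
    """Monthly usage after `years`, assuming it doubles every `doubling_years`."""
    return start_gb * 2 ** (years / doubling_years)

# Assumed ~75 GB/month average household usage in 2013 (illustrative),
# doubling every three years.
for year in (2013, 2016, 2019, 2022):
    print(year, round(projected_usage_gb(75, year - 2013)), "GB/month")
```

Three doublings over nine years is an 8-fold increase, which is why a grant program sized for today’s demand looks undersized almost immediately.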

People often ask me how long I’ll keep writing this blog, and my answer is easy. I’ll keep writing for as long as there are interesting topics to talk about – and for as long as it’s fun.

It’s the ISP, Not Just the Technology

Davis Strauss recently wrote an article for Broadband Breakfast that reminded me that good technology does not always mean a good ISP. There are great and not so great ISPs using every technology on the market. Mr. Strauss lists a few of the ways that an ISP can cut costs when building and operating a fiber network that are ultimately detrimental to customers. Following are a few examples.

Redundant Backhaul. Many of the BEAD grants will be built in areas where the existing broadband networks fail regularly due to cuts in the single fiber backhaul feeding the area. I hear stories all of the time of folks who lose broadband regularly for a few days at a time and sometimes much longer. Building fiber last-mile will not solve the backhaul issue if the new network relies on the same unreliable backhaul.

Oversubscription. It’s possible to overload a local fiber network just like any other network if an ISP loads more customers into a network node than can be supported by the bandwidth supplying the node. There are multiple places where a fiber network can get overstressed, including the path between the core and neighborhoods and the path from the ISP to the Internet.

Lack of Spares. Fiber circuit cards and other key components in the network fail just like any other electronics. A good ISP will have spare cards within easy reach to be able to quickly restore a network in the case of a component failure. An ISP that cuts corners by not stocking spares can have multi-day outages while a replacement is located.

Poor Network Records. This may not sound like an important issue, but it’s vital for good customer service and network maintenance. Fiber wires are tiny and are not easy for a field technician to identify if there are not good records that match a given fiber to a specific customer. There is an upfront effort and cost required to organize records, and an ISP that skimps on record keeping will be forever disorganized and will take longer to perform routine repairs and maintenance.

Not Enough Technicians. Possibly the most important issue in maintaining a good network is having enough technicians to support the network. The big telcos have historically understaffed field technicians, which has resulted in customers waiting days or weeks just to have a technician respond to a problem. ISPs can save a lot of money by running a too-lean staff to the detriment of customers.

Inadequate Monitoring. ISPs that invest in good network monitoring can head off a huge percentage of customer problems by reacting to network issues before customers even realize there is a problem. A huge percentage of network problems can be remedied remotely by a skilled technician if the ISP is monitoring the performance of all segments of a network.
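The oversubscription item above boils down to a single ratio that a careful ISP tracks for every segment of the network. A minimal sketch, with a hypothetical node of 500 gigabit subscribers and an assumed 10 Gbps uplink (what counts as an acceptable ratio depends on actual usage patterns):

```python
def oversubscription_ratio(subscribers: int, tier_mbps: float,
                           uplink_gbps: float) -> float:
    """Ratio of total sold bandwidth to the capacity feeding a segment."""
    return subscribers * tier_mbps / (uplink_gbps * 1000)

# Hypothetical node: 500 subscribers sold gigabit service, 10 Gbps uplink.
ratio = oversubscription_ratio(500, 1000, 10)
print(f"Oversubscription: {ratio:.0f}:1")
```

Some oversubscription is normal engineering, since customers rarely peak simultaneously; the corner-cutting ISP is the one that never measures the ratio or lets it climb unchecked as it adds customers.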

These are just a few examples of ways that ISPs can cut corners. It is these behind-the-scenes decisions on how to operate that differentiate good and poor ISPs. Mr. Strauss doesn’t come right out and say it, but his article implies that there will be ISPs chasing the giant BEAD funding that will be in the business of maximizing profits early to be able to flip the business. An ISP with this mentality is not going to spend money on redundant backhaul, record-keeping, spares, or network monitoring. An ISP with this mentality will hope that a new fiber network can eke by without the extra spending. They might even be right about this for a few years, but eventually, taking shortcuts always comes back to cost more than doing things the right way.

We already know that some ISPs cut corners, because we’ve seen them for the last several decades. The big telcos will declare loudly that DSL networks perform badly because of the aging of the networks. There is some truth in that, but there are other ISPs still operating DSL networks that perform far better. The rural copper networks of big telcos perform so poorly because the big telcos cut every cost possible. They eliminated technicians, didn’t maintain spare inventories, and invested nothing in additional backhaul.

I honestly don’t know how a state broadband office is going to distinguish between an ISP that will do things right and one that will cut corners – that’s not the sort of thing that can be captured in a grant application since every ISP will say it plans to do a great job and will offer superlative customer service.

A Repeat Performance for Cable 4Q 2022

The traditional cable companies lost over 6.25 million cable subscribers in 2022, up from 5.6 million in 2021. That means that almost one in every twenty homes in the country dropped traditional cable TV during the last year.

These numbers come from Leichtman Research Group, which compiles most of these numbers from the statistics provided to stockholders, except for Cox, which is privately held and estimated. Leichtman says this group of companies represents 96% of all traditional U.S. cable customers.

                 4Q 2022     4Q Change   % 4Q Change   Annual Change
Comcast       16,142,000     (440,000)      -2.7%       (2,034,000)
Charter       15,147,000     (144,000)      -0.9%         (686,000)
DirecTV       13,100,000     (400,000)      -3.0%       (1,500,000)
Dish TV        7,416,000     (191,000)      -2.5%         (805,000)
Verizon        3,301,000      (82,000)      -2.4%         (343,000)
Cox            3,050,000      (90,000)      -2.9%         (340,000)
Altice         2,439,000      (52,800)      -2.1%         (293,300)
Mediacom         510,000      (15,000)      -2.9%          (62,000)
Breezeline       309,627      (13,411)      -4.2%          (37,102)
Frontier         306,000      (16,000)      -5.0%          (74,000)
Cable ONE        181,500      (20,500)     -10.1%          (79,500)
Total         61,902,127   (1,464,711)      -2.3%       (6,253,902)

Hulu Live      4,500,000       100,000       2.3%          200,000
Sling TV       2,334,000      (77,000)      -3.2%         (152,000)
FuboTV         1,445,000       214,000      17.4%          323,000

Total Cable   37,779,127     (775,711)      -2.0%       (3,531,902)
Total Other   24,123,000     (689,000)      -2.8%       (2,722,000)
Total vMvPD    8,279,000       237,000       2.9%          371,000

The losses are fairly even across the industry, with most cable providers seeing around a 10% drop in cable customers during the year. The exceptions were Charter, which lost only 4.3%, Frontier that lost almost 20%, and Cable One (Sparklight) that lost over 30% of customers. If these trends continue for another year, Charter will pass Comcast and become the largest traditional cable provider.

The magnitude of the losses is staggering, with Comcast losing over 2 million cable customers during the year and DirecTV losing 1.5 million.

To put the loss of cable customers into context, these same large companies had over 85 million cable customers at the end of 2018 and are now down to under 62 million customers.
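The annual attrition rates quoted above can be checked directly from the table: the year-end subscriber count plus the annual loss gives the year-ago base. A quick sketch using a few rows of the table:

```python
# (year-end 2022 subscribers, annual loss) pulled from the table above
providers = {
    "Comcast": (16_142_000, 2_034_000),
    "Charter": (15_147_000, 686_000),
    "Frontier": (306_000, 74_000),
    "Cable ONE": (181_500, 79_500),
}

for name, (ending, annual_loss) in providers.items():
    starting = ending + annual_loss          # subscribers a year earlier
    pct = 100 * annual_loss / starting       # share of the base lost in 2022
    print(f"{name}: lost {pct:.1f}% of subscribers in 2022")
```

This reproduces the figures in the narrative: Charter at roughly 4.3%, Frontier near 20%, Cable One over 30%, with Comcast and most of the rest of the industry clustered around 10%.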

For the year, the three online cable alternatives that LRG tracks gained 371,000 customers, including 237,000 in the fourth quarter. A few major online alternatives, like YouTube TV, aren’t on the list since they don’t announce customer counts.

Good Enough Broadband

I’ve lately been asked by several local politicians why they should pursue getting grant funding for their county since Starlink satellite and FWA cellular broadband seem like good broadband alternatives that are already here today. It’s a reasonable question to ask since they have likely heard from rural households that are happy with both technologies. The question usually includes some degree of wishful thinking because the officials want to be able to tell constituents that good broadband is already available and that the broadband gap has been solved.

I hate to tell them that these technologies are not a good permanent solution. At the same time, I stress that they should be promoting these technologies to make sure that folks know there are better alternatives available today than the extremely slow broadband options many are stuck with. But I don’t think either of these technologies is a long-term broadband solution.

FWA cellular broadband is home broadband that is delivered by cellular companies from cellular towers. It uses the same technology as the broadband delivered to cellphones, with the only real difference being that there is an in-home receiver that can be used for home broadband.

The primary problem with thinking of FWA cellular as a permanent solution is the reach of the technology. Somebody living right under a tower might be able to get 200 Mbps broadband today, and for somebody who has been suffering with rural DSL or cellular hotspots, this is an amazing upgrade. But a strong cellular signal doesn’t carry far. Speeds drop rapidly with the distance between a customer and the cell tower. A customer living a mile from a tower might see maximum speeds of 100 Mbps, and beyond that, speeds drop precipitously until the product looks like other slow broadband technologies.

The distance issue wouldn’t be a big problem if rural counties were peppered with cell towers – but most rural counties don’t have nearly enough towers to support this technology. In fact, in most rural counties I’ve worked in, a lot of the county doesn’t have good enough cellular coverage for voice calls. There doesn’t seem to be any mad rush to build new towers to support FWA – and I wouldn’t expect a cellular carrier to want to be on a tower that might only see a few dozen potential customers.

A final issue with FWA is that cellular carriers give priority to cell phones over home broadband. If cellphone traffic gets heavy, then the carriers will throttle the FWA speeds. This is probably less of an issue in a rural area than in a city, but it means that the broadband is not fully reliable.

Satellite broadband is also not a great long-term solution for several reasons. Starlink has already said that it will only serve some fixed number of customers in a given geographic area – a number it won’t disclose. That makes sense to any network engineer because the bandwidth from a single satellite overhead is shared by all homes using the service. This means that if too many households try to use a satellite at the same time, broadband speeds will bog down. Starlink is never going to be willing to serve all of the rural customers in a county – when it reaches its target customer count, it won’t sell more connections.

The other issue with satellite broadband is that customers need a clear view of the sky. Homes located amidst trees or near hills or mountains may not be able to get the service at all, or may only get a slowed connection.

The final issue with both technologies is the speed being delivered. FWA today most typically delivers only 50-100 Mbps to households that are within range of a tower. Speed tests for Starlink show a similar range of 50-150 Mbps. These are amazing speeds for a home with no broadband alternatives. But these speeds are already at the low end of acceptable broadband today – particularly since these technologies have much higher latency than fiber.

In twenty years, we’ve grown from DSL and cable modems that delivered 1 Mbps to fiber technology today that can deliver multiple gigabit speeds. There are those who claim that the fastest speeds are just marketing gimmicks, but I’m hearing from more households over time that need the faster speeds. The reality of the marketplace is that technologies will spring up to take advantage of faster broadband. We’re already seeing 8K TVs today, and telepresence should be here in the near future. A rural customer receiving 50-100 Mbps will be locked out of future faster applications.

Any county that decides not to pursue the grants to get faster broadband will regret the decision in a decade when neighboring counties have blazingly fast broadband and are the places where folks will want to live. We’ve learned that fast home broadband now equates to economic development due to the work-at-home phenomenon. I worked with a county recently where 30% of the homes include at least one person working full time from home. That means higher incomes which translates into local prosperity.

I really like both of these technologies, and I recommend them to rural folks all of the time. But these are not the broadband solution that a county needs for long-term prosperity.

Only Twenty Years

I’ve written several blogs that make the argument that we should only award broadband grants based on future-looking broadband demand. I think it is bad policy to provide federal grant funding for any technology that delivers speeds slower than those already available to most broadband customers in the country.

The current BEAD grants use a speed of 100/20 Mbps to define which households are considered to lack broadband today. But inexplicably, the BEAD grants then allow grant winners to build technologies that deliver that same 100/20 Mbps speed. The policymakers who designed the grants would allow federal funding to go to a new network that, by definition, sits at the nexus between served and unserved today. That is a bad policy for so many reasons that I don’t even know where to begin lambasting it.

One way to demonstrate the shortsightedness of that decision is a history lesson. Almost everybody in the industry tosses out a statistic that a fiber network built today should be good for at least thirty years. I think that number is incredibly low and that modern fiber ought to easily last for twice that time. But for the sake of argument, let’s accept a thirty-year life of fiber.

Just over twenty years ago, I lived inside the D.C. Beltway, and I was able to buy 1 Mbps DSL from Verizon or from a Comcast cable modem. I remember a lot of discussion at the time that there wouldn’t be a need for upgrades in broadband speeds for a while. The 1 Mbps speed from the telco and cable company was an 18-fold increase in speed over dial-up, and that seemed to provide a future-proof cushion against homes needing more broadband. That conclusion was quickly shattered when AOL and other online content providers took advantage of the faster broadband speeds to flood the Internet with picture files that used all of the speed. It took only a few years for 1 Mbps to feel slow.

By 2004, I changed to a 6 Mbps download offering from Comcast – they never mentioned the upload speed. This was a great upgrade over the 1 Mbps DSL. Verizon made a huge leap forward in 2004 and introduced Verizon FiOS on fiber. That product didn’t make it to my neighborhood until 2006, at which time I bought a 30 Mbps symmetrical connection on fiber. In 2006 I was buying broadband that was thirty times faster than my DSL from 2000. Over time, the two ISPs got into a speed battle. Comcast had numerous upgrades that increased speeds to 12 Mbps, then 30 Mbps, 60 Mbps, 100 Mbps, 200 Mbps, and most recently 1.2 Gbps. Verizon always stayed a little ahead of cable download speeds and continued to offer much faster upload speeds.

The explosion of broadband demand after the introduction of new technology should be a lesson for us. An 18-times speed increase from dial-up to DSL seemed like a huge technology leap, but public demand for faster broadband quickly swamped that technology upgrade, and 1 Mbps DSL felt obsolete almost as soon as it was deployed. It seems that every time there has been a technology upgrade, the public has found a way to use the greater capacity.

In 2010, Google rocked the Internet world by announcing gigabit speeds. That was a 33-time increase over the 30 Mbps download speeds offered at the time by the cable companies. The cable companies and telcos said at the time that nobody needed speeds that fast and that it was a marketing gimmick (but they all went furiously to work to match the faster fiber speeds).

I know homes and businesses today that are using most of the gigabit capacity. That is still a relatively small percentage of homes, but the number is growing. Over twenty years, the broadband use by the average home has skyrocketed, and the average U.S. home now uses almost 600 gigabytes of broadband per month – a number that would have been unthinkable in the early 2000s.

I look at this history, and I marvel that anybody would think that it’s wise to use federal funds to build a 100/20 Mbps network today. Already today, something like 80% of homes in the country can buy a gigabit broadband product. The latest OpenVault report says that over a quarter of homes are already subscribing to gigabit speeds. Why would we contemplate using federal grants to build a network with a tenth of the download capacity that is already available to most American homes today?

The answer is obvious. Choosing the technologies that are eligible for grant funding is a political decision, not a technical or economic one. There are vocal constituencies that want some of the federal grant money, and they have obviously convinced the folks who wrote the grant rules that they should have that chance. The biggest constituency lobbying for 100/20 Mbps was the cable companies, which feared that grants could be used to compete against their slow upload speeds. But just as cable companies responded to Verizon FiOS and Google Fiber, the cable companies are now planning for a huge leap upward in upload speeds. WISPs and Starlink also lobbied for the 100/20 Mbps grant threshold, although most WISPs seeking grant funding are now also claiming much faster speed capabilities.

If we learn anything from looking back twenty years, it’s that broadband demand will continue to grow, and that homes in twenty years will use an immensely greater amount of broadband than today. I can only groan and moan that the federal rules allow grants to be awarded to technologies that can deliver only 100/20 Mbps. But I hope that state broadband grant offices will ignore that measly, obsolete, politically absurd option and only award grant funding to networks that might still be serving folks in twenty years.

Picking a Good Steward

I’ve been working with a lot of counties and cities that are providing funding to ISPs to expand last-mile broadband. Some of them think of the arrangement as a public-private partnership, while others think of it as making local broadband grants. Almost universally, the hardest question I get asked is how to know if they can trust an ISP to fulfill its promises. They want to know who is going to be a good steward of their money. No local politician wants to provide money to an ISP, and a year later hear the public complaining about that choice.

I rarely have a specific response to the question, but I provide a way for them to think about it. I’ve been suggesting a series of questions that make them dig deeper into the real nature of a given ISP and why it wants the local funding.

Does the ISP do a good job today in other markets? Understanding this requires real due diligence, but it’s a question that can be answered. I have yet to see an ISP claim anything other than that it has stellar customer service – and we know that is not true for a lot of ISPs. One way to check on an ISP is to contact local officials in some of the communities the ISP serves today.

Will the ISP you partner with today still be around in a decade? I never asked this question in the past, but it feels relevant today. A lot of experts foresee a huge roll-up of fiber networks, and an ISP you partner with today might not be the same ISP that will be serving your community a decade from now. There is a lot of venture capital money in the market today, and at least some VCs likely have the philosophy of building a network and dumping it in ten years. Unfortunately, this question doesn’t only apply to VC-backed firms, and there will likely also be a roll-up of family-owned and sole proprietor ISPs.

Is the ISP trying to grow too quickly? I’ve lately seen ISPs seeking local grant funding that brag about how they already have deals in queue to build hundreds of thousands of other passings. The hardest thing for any ISP to do is to grow quickly – failure to master the complexities of rapid growth has been the fatal flaw for a lot of ISP start-ups over the years. It’s a fair question to ask if the ISP you are talking to is overreaching.

Where does the money come from? I’m seeing ISPs that have been in the business for many years suddenly talking about being able to fund huge expansion. When you partner with somebody like this, are you really partnering with the ISP with the known name or with a venture capitalist hidden in the background?

Are you being offered a too-good deal? I’ve seen several ISPs offering partnerships to cities where there will be a guaranteed profit share paid to the City. Is the ISP dangling money to a community to cover for other weaknesses? This takes me back to the advice I’ve heard for my whole life – if something sounds too good to be true, it probably has an underlying flaw.

Can I trust an ISP that has done a lousy job for many years but now swears it is different? I’m obviously talking about the big telcos. These companies abandoned rural DSL networks and customers. Suddenly, they want localities to believe that they will be a different company with a fiber network – because of the new technology. But new technology doesn’t change an ISP’s underlying philosophy. Will these big telcos keep enough staff to make timely repairs, and will they do the maintenance to keep the fiber network operating in the future?

One of the hardest questions I’ve been asked is how to evaluate a new ISP taking over a terrible network. In recent years, the three most prominent ISPs in this category are Consolidated, Ziply, and now Brightspeed. I don’t have a clue how to judge the intentions of a new company. My best advice is to be at least a little wary – companies that have purchased telephone copper networks in the past struggled badly for many years because the copper networks acted like an anchor holding them down.

Sometimes the offers of partnerships have an easy and obvious choice. For example, I know counties that have partnered with electric or telephone cooperatives. They trust that these businesses will still be local and operating for many decades to come. After that, picking a partner is a lot harder.

Is the Broadband Market Mature?

Craig Moffett, of MoffettNathanson, was recently quoted in FierceTelecom asking if the broadband industry is reaching maturity. Other than in rural areas, where a lot of homes are still hungry for better broadband, the broadband penetration rate in cities is approaching 90%. It’s a fair question to ask if there is room for much more growth in the industry.

This is a question that has bounced around for the last five years. But there was still significant growth in broadband over the last few years. In 2019, national broadband subscribers grew by 2.6%. That leaped to 4.5% in the 2020 pandemic year. In 2021, broadband growth slowed to 2.8% but rebounded to 3.3% in 2022.

The 2022 growth rate is likely inflated by rural broadband growth, as practically all the overall industry growth for the year came from cellular FWA broadband provided by T-Mobile and Verizon. We can’t know for sure since those companies haven’t reported on the mix of rural and urban FWA customers.

What would a mature broadband market look like? It would first mean that annual subscriber growth would likely not be greater than the growth of total households. In recent years that has been in the 1% annual range and would mean perhaps 1.2 million new broadband subscribers each year nationally. This is a drastic change for the broadband industry. Consider Comcast and Charter, the two largest ISPs. These two companies represent almost 55% of all broadband subscribers. In 2019 the two companies grew by over 5%. In 2020 that leaped to over 7%. Growth for the two fell to 4% in 2021, but in 2022 was only around 1%. The stock price for these companies for the last decade has been based upon an ever-growing customer base – and annual rate increases.
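The arithmetic above can be sanity-checked with a quick back-of-the-envelope sketch. The figures here come from the text itself (roughly 1% annual household growth, about 1.2 million new subscribers a year, and Comcast plus Charter at about 55% of all subscribers) – none of it is drawn from company filings:

```python
# Rough sanity check of the mature-market arithmetic in the text.
# All inputs are the article's approximate figures, not reported data.

new_subs_per_year = 1_200_000      # ~1.2M new broadband subscribers per year
household_growth_rate = 0.01       # ~1% annual household growth

# If net adds merely track household growth, the implied total
# national subscriber base is new adds divided by the growth rate.
implied_subscriber_base = new_subs_per_year / household_growth_rate
print(f"Implied subscriber base: {implied_subscriber_base / 1e6:.0f} million")

# The two largest cable companies at roughly 55% of that base.
big_two_share = 0.55
big_two_subs = implied_subscriber_base * big_two_share
print(f"Implied Comcast + Charter base: {big_two_subs / 1e6:.0f} million")
```

The implied base of about 120 million subscribers is in the right ballpark for the U.S. market, which suggests the article’s 1.2 million-per-year figure is internally consistent with its 1% growth assumption.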

We already have an idea of what a mature telecom market looks like by looking at the big cellular companies. Practically everybody has a cellphone, and the industry now spends huge marketing dollars trying to pry market share from competitors.

Broadband differs from cellular in one important way: cell service in much of the country is a commodity, meaning there is not much real difference between the products or performance of the cellular carriers. This isn’t true everywhere, and in some places, one of the cellular companies has a superior network. But in most urban markets, where most folks live, there isn’t a lot of difference between cell companies.

The broadband market is different because, in many markets, there is only one fast ISP – usually the cable company. Such markets are effectively broadband monopolies, and the monopoly provider doesn’t have to worry about a competitor taking market share. That means that if overall growth permanently slows, all of the wrestling for market share is going to happen in the markets that have both a cable company and a fiber competitor.

But there is another possibility. In markets where Verizon FiOS has competed against a cable company for many years, the two sides have reached a duopoly equilibrium – meaning that neither Verizon nor the cable company won the competition battle. We saw Verizon and the cable companies duking it out heavily in the early years of FiOS, but the marketing in these markets today has none of the desperation or vehemence of cellular competition. In a duopoly market, the two big players are happy to maintain a relatively steady market share – and the equilibrium is fine with both competitors as long as it doesn’t get too badly skewed.

If overall broadband growth slows, we’ll see different responses depending on the market. Markets without a major fiber provider will continue to be cable monopolies. This is where prices will go up every year. Markets that settle into a steady duopoly will compete with low-key advertised specials to lure folks back and forth between the two ISPs. The biggest marketing battles and the real competition will come from markets where a cable company is competing against an independent fiber provider other than the big telcos. When broadband growth inevitably slows, the industry will naturally change. But I don’t expect to see a clear-cut national response. A mature broadband market will differ according to the local mix of competitors.