Federal Broadband Coordination

The White House is now requiring the three agencies involved with broadband funding – the National Telecommunications and Information Administration (NTIA), the Federal Communications Commission (FCC), and the U.S. Department of Agriculture (USDA) – to share information about broadband funding.

The agencies have agreed to share the following information about any location that is receiving or is under consideration for federal broadband funding:

  • Every ISP serving the area
  • A description of the broadband technology and speeds being delivered by each ISP
  • Maps showing the geographic coverage of ISPs
  • Identity and details of any ISP that has received or will receive broadband funds from any of the three agencies.

This kind of coordination seems vital in the current environment, where all three agencies are awarding sizable grants. It’s not hard to imagine different ISPs seeking grants from different federal grant programs to serve the same geographic areas.

But then what happens? Will two agencies collaborate to decide which grant program will make the award? That would add another layer of complexity to grants if a grant application filed with one agency suddenly conflicts with a grant request at another agency. Will ISPs be informed if discussions are happening behind the scenes between agencies concerning a grant request?

This also raises the issue of different agencies having significantly different grant requirements. We’re already seeing differences among grants in terms of identifying areas that are eligible for grant awards, different definitions of qualifying broadband speeds, different lists of technologies that will or won’t be funded, etc. How can the agencies collaborate if grants trying to serve the same area are following different grant rules? For example, what does collaboration mean when grants at one agency allow for wireless technologies when grants at another agency don’t?

One of the most troublesome aspects of this arrangement is that the agencies are going to share information on existing broadband speeds and coverage. The whole industry understands that the FCC’s database for this data is often badly flawed. Some grant programs today are open to examining facts that prove the errors in the FCC mapping data – but will the FCC be open to having its data challenged by a grant request filed with a different agency? For collaboration to be effective, all three agencies have to be working with the same set of facts.

One of the oddest aspects of this collaborative effort is that it’s only required to last two years and any of the three agencies is free after that to end the collaboration. That makes it sound like somebody doesn’t think this is going to work.

The collaboration sounds like a worthwhile effort if the agencies can work together effectively. But it’s not hard to imagine the collaborative effort adding complexity and possibly even paralysis when considering multiple grants for the same location. How will the three agencies resolve differences between grant programs? My biggest fear is that this effort will add paperwork and time to the grant process without improving the process.

Starry Coming to Columbus, Ohio

I’ve been predicting for several years that wireless broadband is coming to metropolitan areas. There are several factors aligning to support this trend. Most importantly, there have been big advances in millimeter-wave radios that can bring decent broadband. Second, DSL is clearly on the way out – in fact, AT&T will no longer install DSL or even new copper voice lines. Finally, the big cable companies feel that they’ve won the broadband battle and are flexing their monopoly power by aggressively raising rates – both Comcast and Charter are on a path to have $100 broadband in just a few years. These factors leave a void for an ISP with low prices and decent broadband.

Perhaps the splashiest wireless company to tackle this market niche is Starry. The company was founded by Chet Kanojia, whom you may remember as the founder of Aereo, which tried to offer a cheap alternative to local programming.

Starry has been operating for several years by beaming gigabit broadband to high-rise apartment buildings in Boston, New York City, Washington D.C., Denver, and Los Angeles. In those markets, Starry has been offering broadband using the 37 GHz spectrum band through a market test license from the FCC.

Starry offers an interesting broadband product. The company always posts its speeds on its website, and as I wrote this article, the average speed for Starry customers is 204 Mbps download and 201 Mbps upload – an attractive product in today’s market. The company offers low prices – in current markets, the price is $50 per month with no contracts, no connect fee, and no gimmicks. The company also provides 24/7 live customer service.

Starry is ready to roll out the next generation of millimeter-wave technology and has chosen Columbus, Ohio as the first market. Rather than offer broadband only to high-rise apartments, the wireless technology will be available to everybody from high-rises to single-family homes and will cover downtown and stretch into nearby suburbs.

Starry is taking advantage of radios that can bounce signals from one customer to the next, creating a neighborhood mesh network around each base transmitter. Starry is also deploying Time Division Duplex (TDD), which handles upload and download bandwidth differently than other ISPs. With TDD, both download and upload timeslots are built automatically into the transmission path. This allows a single frequency channel to handle both upload and download functions, alternating between the two directions so quickly that they effectively run simultaneously – one user in a household can be downloading while somebody else uploads over the same channel. The Starry technology further varies the number of upload and download timeslots depending upon demand.
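
To make the idea concrete, here is a minimal sketch of demand-driven TDD slot allocation. The frame size and the proportional split are illustrative assumptions – Starry’s actual scheduler is proprietary and not publicly documented.

```python
FRAME_SLOTS = 10  # timeslots per TDD frame (assumed value)

def allocate_slots(download_demand, upload_demand):
    """Split one frame's timeslots between the two directions in
    proportion to queued demand, keeping at least one slot each way."""
    total = download_demand + upload_demand
    if total == 0:
        half = FRAME_SLOTS // 2
        return half, FRAME_SLOTS - half
    down = round(FRAME_SLOTS * download_demand / total)
    down = min(max(down, 1), FRAME_SLOTS - 1)
    return down, FRAME_SLOTS - down

# A download-heavy household gets an asymmetric split that shifts
# as demand shifts -- both directions stay active in every frame.
print(allocate_slots(90, 10))   # -> (9, 1)
print(allocate_slots(50, 50))   # -> (5, 5)
```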

Because of the TDD technology, the Starry home receiver is a sophisticated piece of electronics and not a simple receiver. When Starry first launched, the cost of the receiver was nearly $500 but is now approaching $200.

Starry will be launching in Columbus with a $25 introductory price for early adopters but will likely return soon to its standard $50 rate. Starry has big plans to eventually pass up to 40 million urban households. Starry also won $268.9 million in the RDOF grants to bring broadband to over 100,000 rural homes, with a network that will be a hybrid of both fiber and fixed wireless.

Starry will be the first splashy wireless company to hit urban markets, but it won’t be the last. There is a solid market in every city for decently priced broadband with decent speeds. There are folks in every market who want an alternative to the big cable companies, and wireless technology is poised to fill the niche being vacated by DSL. Expect to hear a lot more from Starry and from others that start popping up in urban markets.

Buying Copper Networks

There was a surprise announcement recently that Apollo Global Management is in serious talks to acquire $5 billion of copper assets from Lumen (CenturyLink). This is not a done deal and could fizzle, but it raises the question of why companies would spend anything to buy dying copper networks.

There are some clear downsides to buying an existing copper-based telco. The current DSL technology is obsolete. One has to assume that a buyer plans to walk away from the copper networks as soon as reasonably possible after buying. And that is the major dilemma to overcome. Buying and then upgrading a copper property effectively means paying for the property twice.

Anybody willing to spend $5 billion to buy copper networks and then billions more to upgrade the properties has other options. Why not just go out and overbuild fiber in many dozens of county seats where a fiber competitor could thrive?

I’ve had clients who were faced with the same opportunity over the last decade, and it’s not an easy decision. One of the big upsides of buying is getting the existing revenue stream. In the case of Lumen, though, that revenue stream has been dying as households find alternatives to DSL. Still, the immediate revenue stream, even if small, can help to fund the new property.

Perhaps the big plus to such a transaction is the many county seats and other towns where Lumen operates as an ISP. In most cases, Lumen has already lost the battle against the cable company competitor – but in rural towns it’s not unusual for the cable company network to also be outdated and underperforming. Competing against Charter and Comcast in rural markets is not the same as competing against these companies in upgraded urban markets.

Another interesting upside to buying these properties is that Lumen has some vitally needed fiber. Even where there is no fiber to customers, a fiber network connects all of the small towns in a region together, providing the backhaul needed to support upgrades in each town. Another network upside is that new fiber can be overlashed onto old copper wires for a lot lower cost than adding new standalone fiber to poles. Lumen already owns the rights-of-way, and there would be little or no make-ready costs needed to overlash fiber. From this perspective, buying Lumen almost equates to buying a huge messenger-wire network ready to accept fiber.

Of course, there are big downsides to buying an existing telco. If the buyer is smart, they will want to walk away from much of the rural areas, much like AT&T has recently walked away from DSL. The dilemma is that a buyer is probably going to have to make promises to regulators that it won’t abandon rural areas quickly.

There is also the huge operational challenge of taking over a big-company network. I’ve helped clients do this several times, and it’s a bear. The purchase generally transfers the assets but not the operational systems supporting the assets. Even if those systems somehow come along, the big telcos are using software that is massively out of date and obsolete. Big telcos also are highly decentralized, and many of the functions that support the properties are a challenge to work with during a transition and hard to replace quickly.

We’ve seen big companies in the past that failed at the process of consolidating obsolete copper networks. One only has to look at Frontier and Windstream to see how hard it is to make this work. A buyer of this many properties will likely have grandiose plans to overbuild all of the lucrative parts of markets quickly – before somebody else does so. But there doesn’t seem to be an easy answer for what to do with the rural copper networks. No big company is ever going to be satisfied with the returns from rural markets, even if it does everything right.

Time to Revisit the Small Cell Order

One of the most controversial orders from the last FCC, chaired by Ajit Pai, was the 2018 order giving priority to small cell sites. That order made two specific rulings. First, the FCC issued a declaratory ruling saying that the FCC has the authority to override local and state regulations concerning small cell deployments. Second, the order created a ‘shot clock’ that requires localities to process small cell applications quickly while also setting a cap on local fees.

Earlier this year, NATOA (the National Association of Telecommunications Officers and Advisors) and the CWA (Communications Workers of America) released a letter and a report arguing that it’s time to revisit that FCC order. They argued that the timeline set by the order is ridiculously short considering the complexity of some of the installations. They also point out that cellular carriers are not using the FCC order to install ‘pizza boxes’ on poles as they originally promised but are placing devices as large as refrigerators on poles, creating dangerous situations for technicians of other utilities who have to navigate around the large devices. Finally, the letter argues that there is no justifiable reason for setting small cell application fees below cost – cities are being required to subsidize the big cellular companies.

It’s important to put the original FCC order into context before taking a position on the issues raised in this letter. Starting around 2015, the cellular industry declared an emergency and said that the US was falling badly behind China in the race towards 5G. Both the White House and Congress jumped aboard and said that quickly deploying 5G must be a top priority for the US economy. You might recall that the US government went so overboard on the 5G race that there was even talk about the US government buying Nokia or Ericsson so that the US wouldn’t be left behind.

In this environment, where pushing 5G forward was considered a national emergency, it was easy for the FCC to push through this order that gave cellular carriers everything on their wish list concerning small cell deployments. Just six years later, we can see that 5G deployment was not an emergency. None of the big promises made about 5G have materialized, and in fact, the cellular carriers are still struggling to define a business plan that will monetize 5G.

The real reason for the push for 5G was that the 4G cellular networks were getting overloaded – and small cell sites were needed to bolster the existing cellular networks. Everybody relies on our cellular networks, and that was a legitimate reason for the FCC to take action – but the cellular companies never publicly made this argument. The carriers didn’t want the public to know that their 4G networks were in trouble since that would hurt their stock prices. Instead, the cellular companies pulled off one of the biggest public relations scams in history and invented the 5G race to push through regulations that benefitted them.

I agree with the CWA and NATOA that it’s time to put the genie back in the bottle and revisit the small cell order. Like with all regulatory policy disputes, both sides of the issue have some legitimate concerns. The cellular carriers had a legitimate beef when they said that some cities took far too long to process permits for small cell sites. The cities also had legitimate concerns – they wanted some say in the placement and aesthetics of the small cell deployments, and they wanted to be able to say no to putting a refrigerator-sized device in the middle of a busy pole line.

It’s time for the FCC to reopen this docket and try again. We now know the kinds of devices that the cellular carriers want to place, and there can be separate rules for placing pizza boxes versus refrigerators on poles. We also now have thousands of examples of the effort required by cities to review and implement small cell requests. A new docket could examine the facts instead of being pushed forward by an imaginary 5G national emergency.

The cellular carriers got everything they wanted, and any regulatory ruling that is this one-sided is almost always a bad one. We now understand that there is no 5G race with China – but we also recognize that cellular carriers have a legitimate need to keep placing small cell sites. It’s time for the FCC to weigh the facts and reissue rules that put a balance between cellular carrier and city interests – because that’s what good regulations are supposed to do.

The Reemergence of Holding Times

There is an interesting phenomenon happening with ISP networks that I don’t see anybody discussing. During the last year, we saw a big change in the nature of our broadband usage – many of us now connect to remote work or school servers or sit on long Zoom calls.

We can already see that these changes have accelerated the average home usage of broadband. OpenVault reports that average broadband usage per home grew from 274 gigabytes per month just before the pandemic up to 462 gigabytes per month measured at the end of the first quarter of this year. Since much of the new usage came during the daytime, most ISPs reported that they were able to handle the extra usage. This makes sense because ISP networks in residential neighborhoods were relatively empty during the daytime before the pandemic – adding the additional usage at these non-busy times did not stress networks. Instead, the daytime hours started to become as busy as the evening hours, which have historically been the busiest time for residential networks.

But there is one impact of the way networks are now being used that is affecting ISPs. Before the pandemic, most of the use of the Internet in residential neighborhoods was bursty. People shopped or surfed the web, and each of these events resulted in short bursts to the Internet. Even video streaming is bursty – when you watch Netflix, you’re not downloading a video continuously. Instead, Netflix feeds you short, fast bursts of content that cache on your computer and keep you ahead of what you are watching.

But our new network habits are very different. People are connecting to a school or work server with a VPN and keeping the connection for hours. Most Zoom video calls last 30 minutes to an hour. Suddenly, we’re using bandwidth resources for a long time.

In telephone networks, we used to refer to this phenomenon as holding times. Holding times were important because they helped to determine how many trunks, or external connections, were needed to handle all of the demand. A longer holding time for a given kind of traffic meant that more external trunks were needed for that kind of calling. This is pure math – you can fit twice as many calls into an hour if the average holding time is five minutes instead of ten minutes. A telephone company would have multiple kinds of trunks leaving a central office – some trunks for local traffic between nearby exchanges and other trunks for different types of long-distance traffic. Traffic engineers measured average holding times to calculate the right number of trunks for each kind of traffic.
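
That trunk math was formalized in the Erlang B formula, which is still the textbook tool for sizing circuits. Here is a small sketch showing how longer holding times translate directly into more trunks for the same number of calls. The call volume and the 1% blocking target are illustrative numbers, not figures from any particular telco.

```python
def erlang_b(offered_load, trunks):
    """Blocking probability for an offered load (in erlangs) on a
    given number of trunks, using the standard iterative recursion."""
    b = 1.0
    for n in range(1, trunks + 1):
        b = (offered_load * b) / (n + offered_load * b)
    return b

def trunks_needed(offered_load, target_blocking=0.01):
    """Smallest trunk count that keeps blocking at or below target."""
    n = 1
    while erlang_b(offered_load, n) > target_blocking:
        n += 1
    return n

calls_per_hour = 120
for holding_minutes in (5, 10):
    load = calls_per_hour * holding_minutes / 60  # offered load in erlangs
    print(f"{holding_minutes}-minute holds -> {trunks_needed(load)} trunks")
# 5-minute holds -> 18 trunks; 10-minute holds -> 30 trunks
```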

The fact that residents are maintaining Internet connections for hours is having the same kind of impact on broadband networks. The easiest place to understand this is in the neighborhood network. Consider a neighborhood served by DSL that has a DS3 backhaul provided by the telephone company – that’s 45 Mbps of capacity. Such a connection can support a lot of bursty traffic because requests to use the Internet come and go quickly. But the new, long-duration broadband holding times can quickly kill a DSL neighborhood connection, as we saw during the pandemic. If only 20 homes in the neighborhood (which might consist of 100 homes) connect to a school or work server using a 2 Mbps connection, then 40 of the 45 Mbps is fully occupied for that use and can’t be used for anything else. It’s possible for this local network to become totally locked up with heavy VPN usage.
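
The arithmetic from that example, made explicit below. The home count and per-session bandwidth come straight from the paragraph above; the point is how little headroom persistent sessions leave for everyone else.

```python
DS3_MBPS = 45          # DS3 backhaul capacity in Mbps
homes_on_vpn = 20      # homes holding an all-day VPN or video session
vpn_mbps = 2           # bandwidth held open per session

held = homes_on_vpn * vpn_mbps
headroom = DS3_MBPS - held
print(f"{held} Mbps held continuously; {headroom} Mbps left "
      f"for the other 80 homes' bursty traffic")
# -> 40 Mbps held continuously; 5 Mbps left for the other 80 homes
```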

This kind of network stress doesn’t just affect DSL networks, but every broadband technology. The connections inside the networks between homes and the hub have gotten far busier as people lock up Internet links for long periods of time. For technologies like DSL with small backhaul pipes, this phenomenon has been killing usage for whole neighborhoods. This is the phenomenon that killed the upload backhaul for cable companies. For technologies with larger backhaul bandwidth, this phenomenon means the backhaul paths are much fuller and will have to be upgraded a lot sooner than anticipated.

This phenomenon will ease somewhat if schools everywhere return to in-person classes. However, it appears that we’re going to continue to have people working from home. And video calling has moved into the mainstream. That means that backhaul connections inside ISP networks are a lot busier than any network engineer would have predicted just two years ago. While some of the extra traffic comes from increased broadband volumes, much of it is related to the much longer customer holding times – a term we’ve never used before with broadband networks.

Hollow Core Fiber

BT, formerly known as British Telecom, has been working with Lumenisity to greatly improve the performance of hollow core fiber. This is fiber that takes advantage of the fact that light travels faster through air than it does through glass. In a hollow core fiber, air fills center tubes surrounded by glass. As can be seen in the picture accompanying this blog, multiple tubes of glass and air are created inside a single fiber, creating a honeycomb effect.
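
The speed difference is easy to quantify. Light in standard silica fiber travels at roughly c divided by the group index of the glass (about 1.47), while light in an air core travels at very nearly c. A rough one-way latency comparison over a 100 km route:

```python
C_KM_PER_MS = 299_792.458 / 1000   # speed of light in km per millisecond
GLASS_INDEX = 1.468                # typical group index of silica fiber
AIR_INDEX = 1.0003                 # air core, very close to vacuum

distance_km = 100
glass_ms = distance_km * GLASS_INDEX / C_KM_PER_MS
air_ms = distance_km * AIR_INDEX / C_KM_PER_MS
print(f"glass core: {glass_ms:.3f} ms, air core: {air_ms:.3f} ms one way")
# -> glass core: 0.490 ms, air core: 0.334 ms -- roughly a one-third
#    reduction in propagation delay over the same route
```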

There was news about hollow core fiber a decade ago when a lab at DARPA worked with Honeywell to improve the performance of the fiber. They found then that they could create a single straight path of light in the tubes that was perfect for military applications. The light could carry more bandwidth for greater distances without having to be regenerated. By not bouncing through glass, the signal maintained intensity for longer distances. DARPA found the fixed orientation of light inside the tubes to be of great value for communication with military-grade gyroscopes.

Until the recent breakthrough, hollow core fiber was plagued by periodic high signal loss when the light signal lost its straight-path coherence. Lumenisity has been able to lower signal loss to 1 dB per kilometer, which is still higher than the 0.2 dB per kilometer expected for traditional fiber. However, the lab trials indicate that better manufacturing processes should be able to significantly lower signal loss.
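
Those loss figures translate directly into how far a signal can travel before it needs regeneration. Here is a rough sketch assuming an illustrative 20 dB optical power budget between regeneration points – the budget is my assumption, not a figure from Lumenisity:

```python
POWER_BUDGET_DB = 20  # assumed optical budget between regeneration points

for name, loss_db_per_km in [("standard fiber", 0.2), ("hollow core", 1.0)]:
    max_span_km = POWER_BUDGET_DB / loss_db_per_km
    print(f"{name}: max span ~{max_span_km:.0f} km")
# standard fiber: ~100 km, hollow core: ~20 km -- which is why driving
# the loss down further is the key manufacturing goal
```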

The Lumenisity breakthrough comes from the ability to combine multiple wavelengths of light while avoiding the phenomenon known as interwave mixing, where different light frequencies interfere with each other. By minimizing signal dispersion, Lumenisity has eliminated the need for the digital signal processors that are used in other fiber to compensate for chromatic dispersion. This means repeater sites can be placed farther apart and can use simpler and cheaper electronics.

Lumenisity doesn’t see hollow core fiber being used as a replacement on most fiber routes. The real benefits come in situations that require low latency along with high bandwidth. For example, hollow core fiber might be used to feed the trading desks on Wall Street or to improve performance for connections leaving big data centers.

Lumenisity is building a factory in the U.K. to manufacture hollow core fiber and expects to have it in mass production by 2023.

Beavers Kill Fiber Route

An article from CBC earlier this year reported that beavers had chewed through an underground fiber and had knocked 900 customers in Tumbler Ridge, British Columbia off broadband for 36 hours. The beavers had chewed through a 4.5-inch conduit that was buried three feet underground. This was an unusual fiber cut because it was due to beavers – but animals damaging fiber is a common occurrence.

Squirrels are the number one source of animal damage to fiber. It’s believed that rodents love to chew on fiber as a way to sharpen their teeth, which grow continuously throughout their life. For example, squirrel teeth grow as much as eight inches per year, and the animals are forced to gnaw to keep teeth under control and sharp. For some reason, squirrels seem to prefer fiber cables over other kinds of wires hanging on poles.

I remember reading a few years ago that Level 3 reported that 17% of their aerial fiber outages were caused by squirrels. A Google search turns up numerous network outages caused by squirrels. I have a client with a new fiber network, and the only outage over the last year came from a squirrel chewing through the middle-mile fiber route that carried broadband to and from the community.

ISPs use a wide variety of techniques to try to prevent squirrel damage – but anybody who has ever put out a bird feeder knows how persistent squirrels can be. One deterrent is to use hardened cables that are a challenge for squirrels to chew through. However, there have been cases reported where squirrels still partially chew through such cables and cause enough damage to let in water and cause future damage.

A more common solution is to use some sort of physical barrier to keep squirrels away from the cable. There are barrier devices that can be mounted on the pole to block squirrels from moving higher – but these can also be an impediment for technicians. Another kind of barrier is mounted where the fiber connects to a pole to keep the squirrels away from the fiber. There are more exotic solutions like deploying ultrasonic blasters to drive squirrels away from fiber. In other countries, ISPs sometimes deploy poison or obnoxious chemicals to keep squirrels away from the fiber, but such techniques are frowned upon or illegal in the US.

What was most interesting about the beaver fiber cut was that the cut was far underground – supposedly out of any danger. In parts of the country there are similar threats to buried fiber from pocket gophers. There are thirteen species of pocket gophers in the US that range from 5 to 13 inches in length. The two regions of the country with pocket gophers are the Midwest plains and the Southwest. Gophers live on plants and either eat roots or pull plants down through the soil.

Pocket gophers can cause considerable damage to buried fiber. These rodents will chew through almost anything and there have been reported outages from gophers chewing through gas, water, and buried electric lines. Gophers typically live between 6 and 12 inches below the surface and are a particular threat to buried drops.

There are several ways to protect against gophers. The best protection is to bury fiber deep enough to be out of gopher range, but that can add a lot of cost to buried drops. I have a few clients that bore drops to keep them away from gophers. Another protection is to enclose the fiber in a sheath that is over 3 inches in diameter. Anything that large and tubular is generally too big for a gopher to bite. Another solution is to surround the buried fiber with 6 – 8 inches of gravel of at least 1-inch size – anything smaller gets pushed to the side by the gophers. Unfortunately, all of these solutions add a lot of cost to fiber drops.

An unexpected risk for aerial fiber is from birds. Large birds with sharp talons can create small cuts in the sheath and introduce water. Flocks of birds sitting on a fiber can stretch the cable and cause sag. That may sound like a small risk, but when I lived in Florida it was common to see hundreds of birds sitting shoulder to shoulder on the wires between two poles. While most people would find that many birds to be an interesting sight, being a broadband guy, my first reaction was always to see which wire they were sitting on.

Satellites and Cellular Backhaul

Elon Musk recently announced that he was going to be providing cellular backhaul from the Starlink constellation of satellites. This makes a lot of sense from a financial perspective in that it avoids the costly fiber construction needed to reach rural cell sites.

This is clearly a shot across the bow for companies that currently bring fiber connectivity to rural cell sites. There are numerous rural middle-mile networks that mostly survive by providing backhaul to cell sites. While there has been downward pressure from the cellular carriers on transport rates – it’s likely that Starlink or other satellite providers could drop the bottom out of the market pricing for transport.

Since we hear so much about how the US is losing the 5G war, people may not realize how far the cellular networks around the world are behind those in the US and other developed countries. According to statistics from GSMA, in 2020 there were 7.9 billion cellular users in the world, 48% of whom were still using 2G or 3G cellular technology. The percentage of users on older technologies is expected to drop to about 23% by 2025, with a big transition to 4G.

But even then, cellular data speeds are likely to remain slow in many countries due to the lack of fiber backhaul and to the fact that in many countries the vast majority of people get almost all of their broadband from cellphones.

It’s been predicted for many years that satellites would play a big role in supporting cell sites. The worldwide consulting firm NSR predicted last year that there would be 800,000 cell sites worldwide connected via satellite by 2029. Over that same time period, NSR pegs the US market for satellite backhaul at $39 billion.

But it’s still a bit of a surprise to hear Starlink talking about providing cellular backhaul. A rural cell site is a large data user and requires far more bandwidth than the average residential or business customer. It would be a big challenge for Starlink or any satellite network to carry both cellular backhaul and residential broadband – because cellular backhaul would consume much of the capacity of any one satellite. One would think that cell sites would get priority routing, which means other broadband users would suffer.
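
Here is a back-of-envelope illustration of that capacity squeeze, using purely assumed numbers – none of these figures are published Starlink specifications:

```python
satellite_gbps = 20     # assumed usable capacity of one satellite
cell_site_gbps = 1      # assumed backhaul demand per rural cell site
home_mbps_avg = 10      # assumed average concurrent demand per home

for cell_sites in (0, 5, 10):
    backhaul_gbps = cell_sites * cell_site_gbps
    homes = (satellite_gbps - backhaul_gbps) * 1000 / home_mbps_avg
    print(f"{cell_sites} cell sites -> capacity left for ~{homes:.0f} homes")
# 0 -> ~2000 homes, 5 -> ~1500, 10 -> ~1000: giving priority to ten
# cell sites halves the residential capacity in this example
```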

It’s been less than five years since the new generation of satellite companies said they would be launching big constellations in the overhead sky. My first thought when I heard of the new satellite technology was that the companies would be far better off financially by supporting a handful of cellular companies rather than millions of residential customers. What I never expected is that somebody would try to handle both on the same network.

And perhaps that’s not Starlink’s plan. The company has been talking about launching 30,000 satellites over time (currently at 1,500). It would be possible to have different satellites for different customers with a constellation that large. But Elon Musk’s disclosure made it sound like discussions about cellular backhaul are already in the works.

I think we’re many years away from fully understanding how satellite companies will operate. It’s possible that cellular companies and big corporate users will make it worthwhile for the satellite companies to give them priority over residential broadband customers. It’s not hard to envision satellites providing connectivity to large fleets like FedEx, UPS, or the US Postal Service. Satellite broadband could become the connectivity of choice for the large trucking companies. It’s going to be hard for a constellation owner to say no to those kinds of opportunities – but saying yes to big commercial opportunities will mean diluting the broadband available for residential customers.

The Natural Evolution of Technology

I’ve been thinking lately about the future of current broadband technologies. What might the broadband world look like in twenty years?

The future of broadband technology will be driven by the continued growth in broadband demand, both in the amount of bandwidth we use and in the broadband speeds the public will demand. Technologies that can’t evolve to keep up with future demand will fade away – some slowly and some virtually overnight.

I don’t think it’s a big stretch to say that within twenty years fiber will be king. There is a huge national push to build fiber now, funded by federal and state grants but also by unprecedented amounts of commercial investment. Fiber will be built in a lot of rural America through subsidies and in a lot of small and medium towns because it makes financial sense. The big challenge will continue to be urban neighborhoods where fiber construction costs are high. Twenty years from now we’ll look back on today as the time when we finally embraced fiber, much like we look back twenty years ago when DSL and cable modems quickly killed dial-up.

It goes without saying that telephone copper will be dead in twenty years. To the extent copper is still on poles, it will be used to support overlashed fiber. DSL will serve as the textbook posterchild for how technologies come and go. DSL is already considered obsolete, a mere twenty years after its introduction to the market. In twenty more years, it will be a distant memory.

I don’t see a big future for rural WISPs. These companies will not fare well in the fierce upcoming competition with fiber, low-orbit satellite, and even fixed cellular. Some stubborn WISPs will hang on with small market penetrations, but research into new and better radios will cease as demand for WISP services fades. The smart WISPs are going to move into towns and cities. WISPs willing to adapt to using millimeter-wave radios can grab a decent market share in towns by offering low prices to consumers who value price over big bandwidth. I predict that WISPs will replace DSL as the low-price competitor against the large ISPs in towns and cities.

Low orbit satellites will still serve the most remote customers in twenty years – but this won’t be the technology of choice due to what will be considered in the future as very slow bandwidth. Two decades from now, a 150 Mbps download connection is going to feel like today’s DSL. The satellite companies will thrive in the third world, where they will be the ISP of choice for most rural customers. Interestingly, when I look out forty years, I think it’s likely that residential satellite broadband will fade into history. It’s hard to envision this technology having a forty-year shelf life in a world where broadband demand continues to grow.

The technology that is hard to predict is cable broadband. From a technology perspective, it’s hard to see cable companies still wanting to maintain coaxial copper networks. In twenty years, these networks will be 70 years old. We don’t talk about it much, but age affects coaxial networks even more than telephone copper networks. Over the next decade, cable companies face a hard choice – convert to fiber or take one more swing at upgrading to DOCSIS 4.0 and its successors. It’s hard to imagine the giant cable companies like Comcast or Charter making the decision to go all fiber – they will worry too much about how the huge capital outlay will hurt their stock prices.

I expect there will still be plenty of coaxial networks around in twenty years. Unfortunately, I foresee that coaxial copper will stay in the poorest urban neighborhoods and smaller rural towns while suburbs and more affluent urban neighborhoods will see a conversion to fiber. For anybody who doesn’t think that can happen, I point to AT&T’s history of DSL redlining. Cable companies might even decide to largely abandon poorer neighborhoods to WISPs and municipal fiber overbuilders, similar to the way that AT&T recently walked away from DSL.

It’s easy to think of technologies as being permanent and that any broadband technology used today will be around for a long time. One only has to look at the history of DSL to see that broadband technologies can reach great success only to be obsolete within just a few decades. We’re going to see the evolution of technology for as long as the demand for broadband continues to grow. Much of the technology being touted today as broadband solutions will quietly fade into obscurity over the next twenty years.

This is the biggest reason why I think that only technologies that can be relevant a decade or two from now should be eligible for federal grant funding. It’s shortsighted to give tax dollars to technologies that are not likely to be relevant in the somewhat near future. We saw a great example of that with the CAF II program that funded already-obsolete DSL. More recently, we saw federal grant money going to Viasat and to rural WISPs in the CAF II reverse auction. There are smarter ways to spend valuable tax dollars.


Standalone Broadband

Parks Associates recently announced the release of its Home Services Dashboard, a for-pay service that tracks consumer adoption of telecom services like Internet, pay-TV, and cellphones. As part of the announcement, the company released a blog showing that at the end of the first quarter of 2021, 41% of US homes were buying standalone broadband – meaning broadband that’s not bundled with cable TV or a home telephone.

This is further evidence of cord-cutting, since in 2018 only 33% of homes had standalone broadband. That means that in just three years, another 8% of all US homes have ditched all services from their ISP other than broadband. One doesn’t have to trend this too many years into the future before over half of all homes will be buying only broadband.

ISPs are working hard to keep some kind of bundle because they understand the power of the bundle to control churn. It’s much easier for a customer who is only buying broadband to switch to another ISP. As the bundle with cable TV is losing appeal, the biggest cable companies, Comcast and Charter, have been busy bundling customers with cellular service. Comcast has also been successful in bundling millions of homes with smart home packages. For a short time, Comcast even tested the idea of bundling with solar power.

My consulting firm does broadband surveys, and we’ve seen a wide range of bundling levels in different markets across the country. Just in the last two years, we’ve seen communities where the level of bundling is as low as 45% or as high as 80%. We’ve noticed that cities with older populations seem to have the highest percentage of homes still buying a bundle. Bundling is also still fairly common in rural America, although the rural bundle is most typically a DSL connection and a telephone landline. A few telcos like CenturyLink still bundle with satellite TV.

The one statistic from Parks Associates that I have to challenge is the average price for unbundled broadband of $64. I have to wonder what is included in that figure. Consider Comcast as an example. The Comcast price for its most popular broadband product, Performance, is $76. There is currently a web special where a new customer can buy the product for as little as $34.99. But that price is only good for 12 months and then reverts to the higher price. More importantly, Comcast charges $14 for the modem. That means the price of standalone broadband without a promotional price is $90. Even the introductory product is at $49 when including the modem. The same is true for Charter and the other major cable companies – the standalone price without a special discount at every big cable company is more than $64 – sometimes a lot more.
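
The Comcast math above, spelled out. The rates and modem fee are the figures quoted in this article and will of course change over time:

```python
MODEM_FEE = 14.00   # monthly modem rental

def effective_price(advertised_rate, modem_fee=MODEM_FEE):
    """What a standalone broadband customer actually pays per month."""
    return advertised_rate + modem_fee

print(effective_price(76.00))   # standard Performance rate -> 90.0
print(effective_price(34.99))   # 12-month web special -> 48.99 (~$49)
```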

Over the last year, I’ve done rate studies in over a dozen communities, and I’ve never seen the average price for standalone broadband below the $75 range. I can accept the Parks Associates price if it doesn’t include the extra modem fee. Parks Associates is obviously looking at a larger database of numbers than I am, but I’ve seen the $70+ average price in urban, suburban, and rural markets.

I agree with Parks Associates that broadband prices have steadily been climbing; they observed that standalone broadband prices have increased by 64% since 2011. I have no way to judge the percentage increase, but I agree there have been substantial rate increases.

What might surprise a lot of people is how many households still cling to their bundles. After accounting for homes that don’t buy broadband, there are still slightly more homes that buy broadband through a bundle than homes that buy standalone broadband. However, at the rate that homes are cutting the cord, it seems likely that within a year or so there will be more homes without a bundle. In 2020, the traditional cable industry lost almost 6 million customers, or almost 7.4% of the base of cable customers – and most of those customers broke a bundle when they ditched cable.