What’s the Future for CenturyLink?

I don’t know how many of you watch industry stock prices. I’m certainly not a stock analyst, but I’ve always tracked the stock prices of the big ISPs as another way to try to understand the industry. The stock prices for big ISPs are hard to compare because every big ISP operates multiple lines of business these days. AT&T and Verizon are judged more as cellular companies than as ISPs. AT&T and Comcast stock prices reflect that both are major media companies.

With that said, the stock price for CenturyLink has performed far worse than other big ISPs over the last year. A year ago a share of CenturyLink stock was at $19.24. By the end of the year the stock price was down to $15.44. As I wrote this blog the price was down to $10.89. That’s a 43% drop in share price over the last year and a 30% drop since the first of the year. For comparison, following are the stock prices of the other big ISPs and also trends in broadband customers:

                Stock Price 1 Year Ago    Stock Price Now    % Change    2018 Change in Broadband Customers
CenturyLink            $19.24                 $10.89          -43.4%            -262,000
Comcast                $32.14                 $43.15           34.3%           1,353,000
Charter               $272.84                $377.89           38.5%           1,271,000
AT&T                   $32.19                 $30.62           -4.9%             -18,000
Verizon                $48.49                 $56.91           17.4%               2,000

As a point of comparison to the overall market, the Dow Jones Industrial Average was up 4% over this same one-year period. The above chart is not trying to draw a correlation between stock prices and broadband customers, since customer counts are just one of dozens of factors that affect the performance of these companies.
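As a quick sanity check on the table above, the percentage changes follow directly from the two price columns. A minimal Python sketch, with the prices hard-coded from the table rather than pulled from any market data source:

```python
# Verify the year-over-year share-price changes shown in the table.
prices = {
    "CenturyLink": (19.24, 10.89),
    "Comcast": (32.14, 43.15),
    "Charter": (272.84, 377.89),
    "AT&T": (32.19, 30.62),
    "Verizon": (48.49, 56.91),
}

def pct_change(year_ago, now):
    """Percentage change from the year-ago price, rounded to one decimal."""
    return round((now - year_ago) / year_ago * 100, 1)

for name, (year_ago, now) in prices.items():
    print(f"{name}: {pct_change(year_ago, now):+.1f}%")
```

CenturyLink's 43.4% decline stands out even against AT&T, the only other loser in the group.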

Again, I’ve never fully understood how Wall Street values any given company. In reading analyst reports on CenturyLink it seems that the primary reason for the drop in stock price is that all of the company’s business units are trending downward. In the recently released 1Q 2019 results the company showed a year-over-year drop in results for the international, enterprise, small and medium business, wholesale, and consumer business units. It seems that analysts had hoped that the merger with Level 3 would reverse some of the downward trends. Stock prices also dropped when the company surprised the market by cutting its dividend payment in half in February.

CenturyLink faces the same trends as all big ISPs – traditional business lines like landline telephone and cable TV are in decline. Perhaps the most important trend affecting the company is the continued migration of broadband customers from copper-based DSL to cable company broadband. CenturyLink is not replacing the DSL broadband customers it’s losing. In 2018 CenturyLink lost a lot of broadband customers with speeds under 20 Mbps, but had a net gain of customers using more than 20 Mbps. CenturyLink undertook a big fiber-to-the-home expansion in 2017 and built fiber to pass 900,000 homes and businesses – but currently almost all expansion of last-mile networks is on hold.

It’s interesting to compare CenturyLink as an ISP with the big cable companies. The obvious big difference is the trend in broadband customers and revenues. Where CenturyLink lost 262,000 broadband customers in 2018, the two biggest cable companies each added more than a million new broadband customers for the year. CenturyLink and other telcos are losing the battle of DSL versus cable modems with customers migrating to cable companies as they seek faster speeds.

It’s also interesting to compare CenturyLink to the other big telcos. From the perspective of being an ISP, AT&T and Verizon are hanging on to total broadband customers. Both companies are also losing the DSL battle with the cable companies, but each is adding fiber customers to compensate for those losses. Both big telcos are building a lot of new fiber, mostly to provide direct connectivity to their own cell sites, but secondarily to then take advantage of other fiber opportunities around each fiber node.

Verizon has converted over a hundred telephone exchanges in the northeast to fiber-only and is getting out of the copper business in urban areas. Verizon has been quietly filling in its FiOS fiber network to cover the copper it's abandoning. While nobody knows yet if it's real, Verizon has also been declaring big plans to expand into new broadband markets using 5G wireless loops.

AT&T was late to the fiber game but has been quietly yet steadily adding residential and business fiber customers over the last few years. They have adopted a strategy of chasing pockets of customers anywhere they own fiber.

CenturyLink had started down the path to replace DSL customers when they built a lot of fiber-to-the-home in 2017. Continuing with fiber construction would have positioned the company to take back a lot of the broadband market in the many large cities it serves. It’s clear that the new CenturyLink CEO doesn’t like the slow returns from investing in last-mile infrastructure and it appears that any hopes to grow the telco part of the business are off the table.

Everything I read says that CenturyLink is facing a corporate crisis. Plunging stock prices always put strain on a company. CenturyLink faces extra pressure since the activist investor Southeastern Asset Management holds more than a 6% stake in CenturyLink and made an SEC filing arguing that the company's fiber assets are undervalued.

The telco side of the business has underperformed compared to its peers ever since it was spun off from AT&T as US West. US West then had what turned out to be a disastrous merger with Qwest. There was hope a few years back that the merger with CenturyLink would help to right the company. Most recently came the merger with Level 3, and at least for now that hasn't made a big difference. It's been reported that CenturyLink has hired advisors to consider if they should sell or spin off the telco business unit. That analysis has just begun, but it won't be surprising to hear about a major restructuring of the company.

AT&T and Augmented Reality

Lately it seems like I find a news article almost every week talking about new ways that people are using broadband. The latest news is an announcement that AT&T is selling Magic Leap augmented reality headsets in six cities plus online.

The AT&T launch is being coordinated with the release of an immersive augmented reality experience that brings Game of Thrones into people's homes with a themed gaming experience called The Dead Must Die, with a teaser in this trailer.

Augmented reality differs from virtual reality in that augmented reality overlays images into the local environment. A user will see characters in their living room as opposed to being immersed in a total imaginary environment with virtual reality.

Magic Leap is one of the most interesting tech start-ups. They started in 2014 with a $542 million investment, and since then have raised over $2.3 billion. The company's investors and advisors include people like Alibaba executive vice chair Joe Tsai and director Steven Spielberg. There have been rumors over the years of an impending product, but until now they've never brought a product to market. AT&T will be selling Magic Leap's first headset, the Magic Leap One Creator Edition, for $2,295. The eventual mass-market headset will surely cost a lot less.

AT&T's interest in the technology extends past selling the headsets. Magic Leap recently signed a deal with the NBA and its broadcast partner Turner, which is now owned by AT&T, and the companies will obviously be looking at augmented reality broadcasts of basketball games.

AT&T's interest goes even further than that: they are looking at the Magic Leap technology as an entry into the spatial Internet – moving today's web experience to three dimensions. AT&T sees the Magic Leap headset as a way to bring augmented reality to industries like healthcare, retail and manufacturing. They envision people shopping in 3D, doctors getting 3D computer assistance for visualizing a patient during an operation, and manufacturing workers aided by 3D blueprints overlaid on the factory floor.

While the Magic Leap headset will work on WiFi today, AT&T is promoting Magic Leap as part of their 5G Innovation Program. AT&T is touting this as a technology that will benefit greatly from 5G, which will allow users to go mobile and use the augmented reality technology anywhere.

I couldn’t find any references to the amount of bandwidth used by this first-generation headset, but it has to be significant. Looking at the Game of Thrones application, a user is immersed in a 3D environment and can move and interact with elements in the augmented reality. That means a constant transmission of the elements in the 3D environment. I have to think that is at least equivalent to several simultaneous video transmissions. Regardless of the bandwidth used today, you can bet that as augmented reality becomes mainstream that content makers will find ways to use greater bandwidth.

We are already facing a big increase in bandwidth that is needed to support gaming from the cloud – as is now being pushed by the major game vendors. Layering augmented reality on top of that big data stream will increase bandwidth needs by another major increment.

AT&T Withdraws from Lifeline Program

In March the Public Utility Commission of Ohio allowed AT&T to withdraw from the federal Lifeline program. This is a program that lets qualified low-income homes get a monthly discount of $9.25 on either their landline telephone or their broadband connection – only one discount per home. AT&T successfully withdrew from Lifeline in Illinois in 2018 and in twelve other states in 2017.

AT&T apparently hasn’t been advertising or pushing the potential discount since they only had 7,300 homes in the state on the Lifeline program. The Communications Workers of America say there are almost 1.6 million households in Ohio that qualify for the discount – although not all of them are served by AT&T.

Looking at AT&T's web site, you might think the company supports Lifeline. However, clicking through to Ohio notifies customers that the discount will end in June and provides a list of other companies that might offer them the discount.

The Lifeline program started in 1985, and at the time the discount was a significant savings for customers. Because of inflation the $9.25 discount represents a far smaller portion of today's monthly telecommunications bill.

Participation in the Lifeline program has dropped significantly in the past few years, and the way the fund is used has shifted. The following revenue numbers come from the 2018 annual report from USAC – the entity that operates the Lifeline Fund. I extrapolated the number of participants assuming $9.25 per month.

                        2016              2018
Telephone       $1,477,548,000      $312,300,000
Bundle              $25,554,000      $293,707,000
Broadband           $18,610,000      $536,424,000
Total            $1,521,712,000    $1,142,431,000
Participants         13,700,000        10,250,000
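The participant counts in the last row are my extrapolation, and the arithmetic is simple: divide each year's total disbursements by a full year of the $9.25 monthly discount. A minimal sketch of that calculation:

```python
# Back-of-the-envelope: participants ≈ annual disbursements / ($9.25/month × 12)
MONTHLY_DISCOUNT = 9.25
totals = {2016: 1_521_712_000, 2018: 1_142_431_000}

for year, total in totals.items():
    participants = total / (MONTHLY_DISCOUNT * 12)
    print(f"{year}: ~{participants / 1e6:.1f} million participants")
```

These are rough figures – actual reimbursements vary by carrier and month, which is why the table shows rounded estimates.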

Since 2016 the program has almost 3.5 million fewer participants – many certainly lost as carriers like AT&T withdrew from the plan. The USAC numbers also show a big shift since 2016 toward participants applying the discount to their broadband bill rather than to a landline telephone or cellphone bill.

The Lifeline Program was in the news recently when the FCC Inspector General issued a fraud advisory that says there are a lot of duplicate names requesting Lifeline and a number of deceased people still getting the discount. Chairman Ajit Pai immediately issued a statement saying that the program needs to be cleaned up.

Fraud has always been a concern in the program. However, it’s a little odd for the FCC to be complaining about fraud today since they are in the process of taking over validation of Lifeline subscribers. Eligibility to participate in Lifeline was previously the responsibility of the states, but in June, 2018 USAC launched the National Verifier, a database that lists everybody eligible to receive a Lifeline credit. As of the end of last year, the federal verifier was active in 18 states, with the remaining states and territories joining the program this year. It seems odd to be yelling about problems of the older state programs when the FCC has already implemented a solution that they believe will solve most of the fraud issues.

I published a blog several days ago saying how regulators are letting the public down. It's mystifying to me why the Ohio PUC and so many other states are letting AT&T out of the Lifeline program. The Lifeline Fund reimburses AT&T for every discount given to customers, so there is zero net cost to AT&T to participate in the plan. With the new National Verifier, AT&T takes no role in enrolling customers, who must enter through the National Verifier portal. I don't know why regulators don't insist that AT&T and every other company that sells residential telephone and broadband be required to participate in the program.

Another Rural Wireless Provider?

T-Mobile announced the start of a trial for a fixed wireless broadband product using LTE. The product is being marketed as “T-Mobile Home Internet”. The company will offer the product by invitation only to some existing T-Mobile cellular customers in “rural and underserved areas”. The company says they might connect as many as 50,000 customers this year. The company is marketing the product as 50 Mbps broadband, with a monthly price of $50 and no data cap. The company warns that speeds may be curtailed during times of network congestion.

The company further says that their ultimate goal is to offer speeds of up to 100 Mbps, but only if they are allowed to merge with Sprint and gain access to Sprint’s huge inventory of mid-range spectrum. They said the combination of the two companies would enable them to cover as many as 9.5 million homes with 100 Mbps broadband in about half of US zip codes.

There are positive aspects to the planned deployment, but also a number of issues that make me skeptical. One positive aspect is that some of the spectrum used for LTE passes through trees better than the spectrum used for the fixed wireless technology being widely deployed in the open plains and prairies of the Midwest and West. This opens up the possibility of bringing some wireless broadband to places like Appalachia – with the caveat that heavy woods are still going to slow down data speeds. It's also worth noting that fixed LTE will still be blocked by hills and other solid physical impediments.

The other positive aspect of the announced product is the price and lack of a data cap. Contrast this to the AT&T fixed LTE product that has a price as high as $70 along with a stingy 160 GB monthly cap, and with overage charges that can bring the AT&T price up to $200 per month.

I am skeptical of a number of the claims made or implied by the announcement. The primary concern is download speeds. Fixed LTE will be the same as any other fixed wireless product and speeds will decrease with the distance of a customer from the serving tower. In rural America distances can mount up quickly. LTE broadband is similar to rural cellular voice and works best where customers can get 4 or 5 bars. Anybody living in rural America understands that there are a lot more places with 1 or 2 bars of signal strength than of 4 or 5 bars.

The 50 Mbps advertised speed is clearly an ‘up-to’ speed and in rural America it’s doubtful that anybody other than those who live under a tower could actually get that much speed. This is one of the few times when I’ve seen AT&T advertise truthfully and they market their LTE product as delivering at least 10 Mbps speed. I’ve read numerous online reviews of the AT&T product and the typical speeds reported by customers range between 10 Mbps and 25 Mbps, with only a few lucky customers claiming speeds faster than that.

The online reviews of the AT&T LTE product also indicate that signal strength is heavily influenced by rain and can completely disappear during a downpour. Perhaps even more concerning are reports that in some cases speeds remain slow after a rain due to wet leaves on trees that must be scattering the signal.

Another concern is that T-Mobile is touting this as a solution for underserved rural America.  T-Mobile has far less presence in rural America than AT&T and Verizon and is on fewer rural cellular towers. This is evidenced by their claim that even after a merger with Sprint they’d only be seeing 9.5 million passings – that’s really small coverage for a nationwide cellular network. I’m a bit skeptical that T-Mobile will invest in connecting to more rural towers just to offer this product – the cost of backhaul to rural towers often makes for a lousy business case.

The claim also says that the product will have some aspects of both 4G and 5G. I’ve talked to several wireless engineers who have told me that they can’t see any particular advantage for 5G over 4G when deploying as fixed wireless. A carrier already opens up the available data path fully with 4G to reach a customer and 5G can’t make the spectrum perform any better. I’d love to hear from anybody who can tell me how 5G would enhance this particular application. This might be a case where the 5G term is tossed in for the benefit of politicians and marketing.

Finally, this is clearly a ploy to keep pushing for the merger with Sprint. The claim of the combined companies being able to offer 100 Mbps rural broadband has even more holes than the arguments for achieving 50 Mbps. However, Sprint does have a larger rural presence on rural towers today than T-Mobile, although I think the Sprint towers are already counted in the 9.5 million passings claim.

But putting aside all my skepticism, it would be great if T-Mobile can bring broadband to any rural customers that otherwise wouldn't have it. Even should they not achieve the full 50 Mbps claim, many rural homes would be thrilled to get speeds at half that level. A wireless product with no data caps would also be a welcome product. The timing of the announcement is clearly aimed at promoting the merger process with Sprint and I hope the company's deployment plans don't evaporate if the merger doesn't happen.

Reality Pricing Coming for Online Video

I’ve been a cord cutter for many years and over the last few years, I’ve tried the various vMVPDs that offer channel line-ups that somewhat mimic traditional cable TV. I’ve tried Sling TV, DirecTV Now and Playstation Vue. In every case I’ve always scratched my head wondering how these products could offer prices that are lower than the wholesale price of the content from programmers. There are only two possibilities – either these companies have been setting low prices to gain market share or they had been able to negotiate far better deals for content than the rest of the industry.

Of course, the answer is that they’ve been subsidizing these products. And Wall Street is now pressuring these companies to end the subsidies and become profitable. There is probably no better example of this than AT&T’s DirecTV Now service. When DirecTV Now launched it carried a price tag of $35 per month for about a hundred channels of programming. The low price was clearly set as a reaction to a similarly low price from Sling TV which was the first big successful vMVPD.

Both companies offered line-ups including the channels that most households watch. This included the high-price programming from ESPN and numerous other quality networks. The initial pricing was crazy – a similar package on traditional cable was priced at $60 – $70.

The low pricing has worked for DirecTV Now. They are getting close to surpassing Sling TV in subscribers. AT&T has featured DirecTV Now in its advertising and has been shuttling customers from the satellite-based DirecTV to the online product.

But AT&T just got realistic with the product. They have collapsed from four options down to two, now priced at $50 and $70 per month. The company got ready for this shift by eliminating special promotional prices in the fourth quarter of last year. They had roughly half a million customers who were paying even less than their published low prices. When AT&T raised the rates they immediately lost over half of those promotional customers.

Not only are prices rising, but the company has significantly trimmed the channel counts. The new $50 package will have only about 40 channels while the $70 package will have 50 channels. It’s worth noting that both packages now include HBO, which is the flagship AT&T product. HBO is by far the most expensive programming in the industry and AT&T has now reconfigured DirecTV Now to be HBO plus other premium channels.

The new prices are realistic and also include a profit margin. It will be interesting to see how the DirecTV Now customer base reacts to such a drastic change. I'm sure many of them will flee to cheaper alternatives. But the company may also attract existing standalone HBO subscribers to upgrade.

The big question is whether there will be cheaper alternatives. The online industry has been around long enough that it is now out of its infancy and investors are starting to expect profits from any company in this space. The new realistic pricing by AT&T is likely to drive the other online programmers to also get more realistic.

These price increases have ramifications for cord-cutting. It’s been easy to justify cutting the cord when you could ditch a $70 per month traditional cable product for a $35 online one that has the channels you most watch. But there is less allure from going online when the alternative choice is just as expensive as the traditional one. There is always going to be some savings from jumping online – if nothing else customers can escape the exorbitant fees for renting a settop box.

It’s clear that AT&T is counting on HBO as the allure for its online offering. That product is available in a number of places on the web for a monthly rate of $15, so including that in the $50 and $70 product still distinguishes DirecTV Now from the other vMVPD providers.

What is clear from this move is that we are nearing the end of the period when companies were willing to eat huge losses to gain online market share. That market share is worthless if customers leave in droves when there is a rate increase. These big companies don't seem to have fully grasped that there is zero customer loyalty online. Viewers don't really care who the underlying company is that is carrying their favorite programming – it's the content they care about. The big cable companies have to break their long history of making decisions like near-monopolies.

Where’s the CAF II Success?

If you've read this blog you know I've been a big critic of the FCC's CAF II program that gave over $10 billion in federal subsidies to the biggest telcos to improve rural broadband. My complaint is that the program set the embarrassingly low goal of improving rural broadband to speeds of at least 10/1 Mbps. This money could have done a huge amount of good had it been put up to reverse auction, as was done last year with the areas left over from this program – many ISPs would have used the funding to help build rural fiber. Instead, the telcos are using the money mostly to upgrade DSL.

While I think the program was ill-conceived and was a giveaway to the big telco lobbyists, I am at least glad that it is improving rural broadband. For a household with no broadband, a 10 Mbps product might provide basic access to broadband services for the first time. We are now into the fifth year of the six-year program, so we ought to be seeing the results of these upgrades. USTelecom just published a blog saying that deployments are ahead of schedule and that CAF II is a quiet success.

The telcos have told the FCC they are largely on track – by the end of 2018 they should have upgraded broadband for at least 60% of the required households. AT&T and Windstream report that they have made at least 60% of the needed upgrades everywhere. Frontier says they are on track in 27 of the 29 states needing upgrades. CenturyLink says they are on track in only 23 of 33 states that are getting CAF II upgrades. According to USTelecom, over 2.1 million households should now be seeing faster speeds.

It’s also worth noting that the CAF II program should improve broadband for many more households that are not covered directly by the program. For example, when upgrading DSL for a CAF II area that surrounds a town, those living in the town should also see better broadband. The secondary benefit of the CAF program is that rural towns should be seeing speeds increasing from 6 Mbps or slower to as fast as 25 Mbps. By now many more millions of households should be seeing faster broadband due to CAF II.

What I find puzzling is that I would expect to see an upward burst of broadband customers for the big telcos because of CAF II. But the numbers aren’t showing that. There were four telcos that accepted more than $1 billion from the program, as follows, and three of them lost broadband customers in 2018:

               Funding    Households    Per Household    2018 Broadband Customers
CenturyLink    $3.09 B     1,190,016        $2,593             (262,000)
AT&T           $2.96 B     1,265,036        $2,342              (18,000)
Frontier       $1.7 B        659,587        $2,578             (203,000)
Windstream     $1.07 B       413,345        $2,595                8,400
Total CAF II   $10.05 B    4,075,840        $2,467
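The per-household column is simply funding divided by the households covered. Recomputing from the rounded funding totals shown above (the rounding means the results differ from the published figures by a few dollars):

```python
# Per-household CAF II subsidy from the rounded funding totals in the table.
# Note: the "Total CAF II" row covers the whole program, not just the four
# telcos listed, which is why its households exceed the sum of the four rows.
caf2 = {
    "CenturyLink": (3.09e9, 1_190_016),
    "AT&T": (2.96e9, 1_265_036),
    "Frontier": (1.7e9, 659_587),
    "Windstream": (1.07e9, 413_345),
    "Total CAF II": (10.05e9, 4_075_840),
}

for telco, (funding, households) in caf2.items():
    print(f"{telco}: ${funding / households:,.0f} per household")
```

Roughly $2,500 per household is a meaningful subsidy – in many rural areas it would have covered a large share of the cost of a fiber passing.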

Windstream is the only telco of the four that gained customers last year. Windstream’s footprint is probably the most rural of the four telcos. We know that every telco is losing the battle for customers in towns where cable companies are increasing speeds on coaxial networks. Windstream seems to be offsetting those losses, and I can conjecture it’s because they have been selling more rural broadband.

AT&T is in a category all by itself. It's impossible to know how AT&T is faring with CAF II. They are largely implementing CAF II using their cellular network (with the goal of tearing down rural copper). The company has also been deploying fiber past millions of homes and businesses in urban areas. They are clearly losing the residential broadband battle in urban markets to companies like Comcast and Charter. However, I can tell you anecdotally that AT&T hasn't given up on urban copper. They have knocked on my door in Asheville, NC at least three times in the last year trying to sell DSL. I have to assume that they are also marketing broadband improvements in rural areas.

CenturyLink and Frontier are clearly bleeding broadband customers and each lost over 200,000 customers just in the last year. I have to wonder how hard these companies are marketing improved rural broadband. Both companies work in urban and suburban markets but also in numerous county seats situated in rural counties. Like every telco they are losing DSL customers in these markets to the cable company competitors.

Just like I have anecdotal evidence that AT&T is still pushing copper I hear stories that say the opposite for CenturyLink and Frontier. I worked in a few rural counties last year where the CAF II upgrades were reported as complete. And yet the communities seemed unaware of the improvements. Local politicians who bear the brunt of complaints from households that want better broadband weren’t aware of any upgrades – which tells me their rural constituents weren’t aware of upgrades.

I honestly don't know what this all means. I really expected to find more positive evidence of the impact of CAF II. From what I know of rural America, households ought to leap at the opportunity to buy 10/1 Mbps DSL if they've had no broadband in the past. Are the upgrades being done but not being followed up with a marketing and public awareness campaign? Are actual upgraded speeds not meeting the 10/1 Mbps goal? Are the upgrades really being made as reported to the FCC? We're perhaps a year and a half away from the completion of CAF II, so I guess we'll find out soon enough.

The Impending Cellular Data Crisis

There is one industry statistic that isn't getting a lot of press – the fact that cellular data usage is more than doubling every two years. You don't have to plot that growth rate very many years into the future to realize that existing cellular networks will be inadequate to handle the increased demand in just a few years. What's even worse for the cellular industry is that this growth rate is a nationwide average. I have many clients who tell me there isn't nearly that much growth at rural cellular towers – meaning there is likely even faster growth at some urban and suburban towers.
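To see how quickly doubling-every-two-years compounds, total demand after t years is 2^(t/2) times today's traffic. A quick sketch:

```python
# Compounding of cellular data demand that doubles every two years.
def demand_multiple(years):
    """Multiple of today's traffic after the given number of years."""
    return 2 ** (years / 2)

for years in (2, 4, 6, 10):
    print(f"In {years} years: {demand_multiple(years):.0f}x today's traffic")
```

In a decade the networks would need to carry 32 times today's traffic, which is why engineers can't simply over-build their way ahead of the curve.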

Much of this growth is a self-inflicted wound by the cellular industry. They've raised monthly data allowances and are often bundling in free video with cellular service, thus driving up usage. The public is responding to these changes by using the extra bandwidth made available to them.

There are a few obvious choke points that will be exposed by this kind of growth. Current cellphone technology limits the number of simultaneous connections that can be made from any given tower. As customers watch more video they eat up slots on the cell tower that otherwise could have been used to process numerous short calls and text messages. The other big chokepoint is going to be the broadband backhaul feeding each cell site. When usage grows this fast it's going to get increasingly expensive to buy leased backbone bandwidth – which explains why Verizon and AT&T are furiously building fiber to cell sites to avoid huge increases in backhaul costs.

5G will fix some, but not all of these issues. The growth is so explosive that cellular companies need to use every technique possible to make cell towers more efficient. Probably the best fix is to use more spectrum. Adding spectrum to a cell site immediately adds capacity. However, this can't happen overnight. New spectrum is only useful if customers can use it, and it takes a number of years to modify cell sites and cellphones to work on a new band. The need to meet growing demand is the primary reason that the CTIA recently told the FCC they need an eye-popping 400 MHz of new mid-range spectrum for cellular use. The industry painted that as being needed for 5G, but it's needed now for 4G LTE.

Another fix for cell sites is to use existing frequency more efficiently. The most promising way to do this is with MIMO antenna arrays – a technology that uses multiple antennas at the tower and in handsets to carry parallel data streams, creating a larger effective data pipe. MIMO technology can make it easier to respond to a request from a large bandwidth user – but it doesn't relieve the overall pressure on a cell tower. If anything, it might do the exact opposite and let cell towers prioritize those who want to watch video over smaller users who might then be blocked from making voice calls or sending text messages. MIMO is also not an immediate fix and needs to work through the same cycle of getting the technology into cellphones.

The last strategy is what the industry calls densification, which is adding more cell sites. This is the driving force behind placing small cell sites on poles in areas with big cellular demand. However, densification might create as many problems as it solves. Most of the current frequencies used for cellular service travel a decent distance, and placing cell sites too close together creates a lot of interference and noise between neighboring towers. While adding new cell sites adds local capacity, it also decreases the efficiency of all nearby cell sites using traditional spectrum – the overall improvement from densification is going to be a lot less than might be expected. The worst thing about this is that interference is hard to predict and is very much a local issue. This is the primary reason that the cellular companies are interested in millimeter wave spectrum for cellular – the spectrum travels a short distance and won't interfere as much between cell sites placed closely together.

5G will fix some of these issues. The ability of 5G to do frequency slicing means that a cell site can provide just enough bandwidth for every user – a tiny slice of spectrum for a text message or IoT signal and a big pipe for a video stream. 5G will vastly expand the number of simultaneous users that can share a single cell site.
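A toy illustration of why right-sized slices let more users share a site. All the numbers here are made-up assumptions for the sketch: a hypothetical 100 Mbps cell serving 50 tiny IoT/text users plus two 5 Mbps video streams, compared against a coarser scheme where every user consumes a full fixed-size channel:

```python
def users_served(cell_capacity_mbps, demands_mbps, channel_mbps=None):
    """Count how many requesting users fit within a cell's capacity.

    channel_mbps=None models slicing (each user gets exactly what it
    asks for); a fixed channel_mbps models coarse allocation where
    every user consumes a full channel regardless of actual need.
    """
    served, remaining = 0, cell_capacity_mbps
    for demand in demands_mbps:
        cost = demand if channel_mbps is None else channel_mbps
        if cost <= remaining:
            served += 1
            remaining -= cost
    return served

# 50 tiny users (10 kbps each) plus two 5 Mbps video streams.
demands = [0.01] * 50 + [5.0, 5.0]
coarse = users_served(100, demands, channel_mbps=5.0)  # 20 users fit
sliced = users_served(100, demands)                    # all 52 fit
```

The point of the sketch is only the shape of the result: when tiny users no longer consume full channels, the same capacity supports far more simultaneous connections.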

However, 5G doesn’t provide any additional advantages over 4G in terms of the total amount of backhaul bandwidth needed to feed a cell site. And that means that a 5G cell site will get equally overwhelmed if people demand more bandwidth than a cell site has to offer.

The cellular industry has a lot of problems to solve in a relatively short period of time. I expect that in the middle of the much-touted 5G roll-out we are going to start seeing some spectacular failures in cellular networks at peak times. I feel sympathy for cellular engineers, because it’s nearly impossible to keep a network ready for data usage that doubles every two years. Even if engineers figure out strategies to handle five or ten times more usage, within a few years demand will catch up to those fixes.
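The doubling math is worth making concrete: if demand doubles every two years, even a large one-time capacity gain is exhausted quickly. A quick calculation (the two-year doubling period comes from the text; the multiples are the five- and ten-fold figures mentioned above):

```python
import math

def years_until_exhausted(headroom_multiple, doubling_period_years=2.0):
    """How long a capacity upgrade lasts when demand doubles every
    doubling_period_years: solve 2**(t / period) = headroom for t."""
    return doubling_period_years * math.log2(headroom_multiple)

five_x = years_until_exhausted(5)   # ~4.6 years of headroom
ten_x = years_until_exhausted(10)   # ~6.6 years of headroom
```

Even a ten-fold fix buys well under seven years – which is why engineering ahead of this curve is so hard.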

I’ve never believed that cellular broadband can be a substitute for landline broadband. Every time somebody at the FCC or a politician declares that the future is wireless, I roll my eyes. Anybody who understands networks and the physics of spectrum can easily demonstrate that there are major limits on the total bandwidth capacity at a given cell site, along with limits on how densely cell sites can be packed into an area. Cellular networks carry only about 5% of the total broadband traffic in the country, and it’s ludicrous to think they could be expanded to carry most of it.

The Slow Deployment of 5G

Somebody asked me a few days ago why I write so much about 5G. My response is that I am intrigued by the 5G hype. The major players in the industry have been devoting big dollars to promoting a technology that is still mostly vaporware. The most interesting thing about 5G is how politicians, regulators, and the public have bought into the hype. I’ve never seen anything like it. I can remember other times when the world was abuzz over a new technology, but that was usually a reaction to something you could actually buy – the first laptop computers, the first iPhone, the first iPod.

Anybody who understands our industry knows that it takes years to roll out any major new technology, particularly a wireless one, since wireless behaves differently in the field than in the lab. We’re only a year past the release of the 5G standards, and it’s unrealistic to think those standards could be translated into operational hardware and software in such a short time. You only have to look back at the history of 4G, which started as slowly as 5G and finally saw its first fully-compliant cell site late last year. It’s going to take just as long until we see a fully functional 5G cell site. What we will see, over time, is the incremental introduction of some aspects of 5G as they are translated from lab to field. For cellular use, the rollout is further complicated by the time needed to get 5G-ready handsets into people’s hands.

This blog was prompted by a Verizon announcement that 5G mobile services will be coming to 30 cities later this year. Of course, the announcement was short on details, probably because those details would be embarrassing for Verizon. I expect the company will introduce a few aspects of 5G into cell sites in the business districts of major cities and claim that as a 5G roll-out.

What does a roll-out this year mean for cellular customers? There are not yet any 5G-capable cellphones. Both AT&T and Verizon have been working with Samsung to introduce a 5G version of its S10 phone later this year, and Verizon has reportedly been working with Lenovo on a 5G modular upgrade. I’m guessing these phones will come with a premium price tag for early adopters willing to pay for 5G bragging rights. They will only work as 5G near the handful of cell sites with 5G gear – and then only for a tiny subset of the 5G specifications. I remember when one of my friends bought one of the first 4G phones and crowed about how well it worked in downtown DC. At the time I told him his great performance was because he was probably the only guy using 4G – and sure enough, his performance dropped as others joined the new technology.

On the same day that I saw this Verizon announcement I also saw a prediction by Cisco that only 3% of cellular connections will occur over a 5G network by the end of 2022. This might be the best thing I’ve seen that pops the 5G hype. Even for folks buying the early 5G phones, there will be a dearth of cell sites around the country that will work with 5G for a number of years. Anybody who understands the lifecycle of cellular upgrades agrees with the Cisco timeline. It takes years to work through the cycle of upgrading cell sites, upgrading handsets and then getting those handsets to the public.

The same is true for the other technologies that are also being called 5G. Verizon made a huge splash just a few months ago about introducing 5G broadband using millimeter wave spectrum in four cities. Even at the time of that announcement, it was clear that those radios were not using the 5G standard, and Verizon quietly announced recently that they were ceasing those deployments while they wait for actual 5G technology. Those deployments were actually a beta test of millimeter wave radios, not the start of a rapid nationwide deployment of 5G broadband from poles.

AT&T made an even more ludicrous announcement at the end of 2018, unveiling a 5G broadband offering that involved deploying WiFi hotspots supposedly fed by 5G. This was a true phantom product: there was no pricing, and nobody could order it. And since no AT&T cell sites had been upgraded to 5G, one has to wonder how it involved any 5G technology at all. Clearly this was a roll-out by press release, done so the company could claim bragging rights as the first to have 5G.

The final announcement I saw that same day was from T-Mobile, which said it would begin deploying early 5G in cell sites in 2020. But the real news is that the company isn’t planning to charge anything extra for 5G speeds or features.

I come back to my original question about why I write about 5G so often. A lot of my clients ask me if they should be worried about 5G and I don’t have an answer for them. I can see that actual 5G technology is going to take a lot longer to come to market than the big carriers would have you believe. But I look at T-Mobile’s announcement on price and I also have to wonder what the cellular companies will really do once 5G works. Will AT&T and Verizon both spend billions to put 5G small cells in residential neighborhoods if it doesn’t drive any new cellular revenues? I have to admit that I’m skeptical – we’re going to have to wait to see what the carriers do rather than listen to what they say.

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures in its fiber network. The company is giving customers two months of free service and sending them back to the incumbent ISPs in the city. Google Fiber used a construction technique called micro-trenching: cutting a tiny slit in the road, an inch wide and a few inches deep, to carry the fiber. Only a year after construction, the fiber is popping out of the micro-trenches all over the city.

Everybody I’ve talked to is guessing that it’s a simple case of ice heaving. While a micro-trench is sealed, small amounts of moisture likely seep in and freeze when it gets cold. The first freeze creates tiny cracks, and with each subsequent freeze the cracks get a little larger, until the trench finally fills with water, freezes solid, and ejects the fill material. The only way to stop this would be a permanent seal that never lets in moisture – a tall order in a city like Louisville, which might freeze and thaw practically every night during the winter.

Nobody other than AT&T and Charter can be happy about this. The reason Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block it from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google’s favor, while the Charter suit is still in court. Perhaps Google Fiber should have waited out the lawsuits – but the business pressure was there to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson learned is not to launch a new network using an untried and untested construction technique. In this case, the micro-trenches didn’t just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix for the problem would be to build the network again from scratch, which makes no financial sense.

Certainly the whole industry is now going to be extremely leery of micro-trenching, but there is a larger lesson to be learned here. For example, I’ve heard from several small ISPs that are ready to leap into the 5G game and build networks using millimeter wave radios installed on poles. That is every bit as new and untested as micro-trenching was. I’m not predicting that anybody pursuing that business plan will fail – but I can assuredly promise they will run into unanticipated problems.

Over my career, I can’t think of a single example where an ISP that took a chance on cutting-edge technology didn’t have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. I can remember half a dozen companies that tried to deploy broadband networks using LMDS spectrum. In one case the radios literally never worked, and the venture lost its $2 million investment. In several others, the radios had glitches that caused major customer outages and were largely market disasters.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generations of other technologies, including DSL, WiFi mesh networks, PON fiber-to-the-home, and IPTV. The companies that pioneered these technologies had costly, and sometimes fatal, problems. Perhaps the lesson is that pioneers pay a price. I’m sure this failure will result in micro-trenching being changed or abandoned. Perhaps we’ll learn not to use micro-trenches in certain climates, or a way will be found to seal them against moisture. But none of those future fixes will make up for Google Fiber’s spectacular failure.

The real victims here are the households in Louisville that had switched to Google Fiber – and everybody else in the city. Because of Google Fiber’s lower prices, both Charter and AT&T lowered prices across the city. You can bet it won’t take long for the market to return to full prices. Customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no incentive to give them a low-price deal. In aggregate, households across the city will end up spending $10 or $20 more per month for broadband – a significant penalty on the local economy.

AT&T’s 5G Strategy

AT&T recently described its long-term 5G strategy in terms of what it calls the three pillars of 5G – the three areas where the company is putting its 5G focus. The first pillar is 5G cellular: the company’s goal is to launch 5G-based cellular service, with the first cities coming on board in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. The admission that there won’t be any AT&T 5G until at least 2020 contradicts the company’s marketing folks, who are currently trying to paint AT&T’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to launch two phones later this year with some 5G capability. As always with a new generation of wireless technology, the bottleneck will be handsets. Phone makers can’t just build generic 5G phones – they have to work with the carriers to support the specific subset of 5G features each carrier releases. You might recall that the 5G cellular specification contains thirteen improvements, and only the first generation of a few of them will be included in the first 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers, because a 5G phone will become obsolete quickly. A 5G phone sold in late 2019 probably won’t include all of the 5G features on the market by late 2020 – and this is likely to remain true for the next three or four years as carriers roll out incremental 5G improvements. It’s also a gamble for customers: anybody who buys an early 5G phone gets early bragging rights, but those cool benefits could be out of date in six months. I think most people will be like me and wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher, because the company is talking about the fixed cellular product it has already been selling for several years – and that product is not 5G. It delivers broadband to homes using existing low-band cellular frequencies. This is not the same as Verizon’s product that delivers hundreds of megabits per second; instead it delivers speeds of up to 50 Mbps depending on how far a customer lives from a cell tower, with reports that most households get 15 Mbps at best. This is the product AT&T is mostly using to satisfy its CAF II requirements in rural America. None of the engineers I’ve talked to think 5G will materially improve it.

The final pillar of AT&T’s strategy is edge computing. By this, AT&T means placing fast processors at customer sites that need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud: sending big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G, and AT&T has already been deploying edge routers. However, 5G will enhance the capability at customer sites that need to connect a huge number of devices simultaneously – say, the IoT devices in a hospital or 50,000 cellphones in a stadium. The bottom line is that the migration to edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement – AT&T has been talking about all three applications for some time – but it does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.