The Future of AT&T and Verizon

The cellphone companies have done such a great job of getting everybody to purchase a smartphone that cellular service in the country is quickly turning into a commodity. And, as is typical with most commodity products, that means less brand loyalty from customers and lower market prices for the products.

We’ve recently seen the cellular market demonstrate this turn toward becoming a commodity. In the first quarter of this year the cellular companies had their worst performance since the industry began. Both AT&T and Verizon posted net losses of post-paid customers for the quarter. T-Mobile added fewer customers than expected and Sprint continued to lose money.

This is a huge turnaround for an industry where the big two cellular companies were each making over $1 billion per month in profits. The change in the industry comes from two things. First, people are now shopping for lower prices and are ready to change carriers to get lower monthly bills. The trend toward lower prices was started by T-Mobile to gain market share, but low prices are also being pushed by cellular resellers – who are fed by the big carriers. The cellular industry is only going to get more competitive when the cable companies soon enter the market. That will provide enough big players to make cellular service a true commodity. The cable companies have said they will be offering low prices as part of packages aimed at making customers stickier, which will put real price pressure on the other cellular providers.

But the downturn in the first quarter was almost entirely due to the rush by all of the carriers to sell ‘unlimited’ data plans – which, as I’ve noted in some earlier blogs, are really not unlimited. But these plans offer lower prices for data and are freeing consumers to use their smartphones without the fear of big overage fees. Again, this move was started by T-Mobile, but it was also driven heavily by public demand. AT&T and Verizon recognized that if they didn’t offer this product set they were going to start bleeding customers to T-Mobile.

It will be really interesting to watch what happens to AT&T and Verizon, who are now predominantly cellular companies that also happen to own networks. The vast majority of revenue for both companies comes from the cellular side of the business. When I looked at both of their annual reports last year I had a hard time finding evidence that these companies were even in the landline network business. Discussions of those business lines are buried deeply within the annual reports.

These companies obviously need to find new forms of revenues to stay strong. AT&T is tackling this for now by going in a big way after the Mexican market. But one only has to look down the road a few years to see that Mexico and any other cellular market will also trend towards commoditization.

Both companies have their eyes on the same potential growth plays:

  • Both are making the moves necessary to tackle the advertising business. They look at the huge revenues being made by Facebook and Google and realize that as ISPs they are sitting on customer data that could make them major players in the targeted marketing space. Ad revenues are the predominant revenue source at Google and if these companies can grab even a small slice of that business they will make a lot of money.
  • Both are also chasing content. AT&T’s bid for the purchase of Time Warner is still waiting for government approval. Verizon has made big moves with the purchases of AOL and Yahoo and is rumored to be looking at other opportunities.
  • Both companies have been telling stockholders that there are huge amounts of money to be made from the IoT. These companies want their cellular networks to be the default networks for collecting data from IoT devices. They certainly ought to win the business for things like smart cars, but there will be a real battle between cellular and WiFi/landline connections for most other IoT usage.
  • Both companies are making a lot of noise about 5G. They are mostly concentrating on high-speed wireless connections using millimeter wave spectrum that they hope will make them competitive with the cable companies in urban areas. But even that runs a risk because if we see true competition in urban areas then prices for urban broadband might also tumble. And that might start the process of making broadband into a commodity. On the cellular side it’s hard to think that 5G cellular won’t quickly become a commodity as well. Whoever introduces faster cellphone data speeds might get a bump upward for a few years, but the rest of the industry will certainly catch up to any technological innovations.

It’s hard to foresee any business line where AT&T and Verizon are going to get the same monopoly power that they held in the cellular space for the past few decades. Everything they might undertake is also going to be available to competitors, meaning they are unlikely to make the same kind of huge margins they have historically made with cellular. No doubt they are both going to be huge companies for many decades to come since they own the cellular networks and spectrum. But I don’t think we can expect them to be the cash cows they have been in the past.

White Space Spectrum for Rural Broadband – Part II

Word travels fast in this industry, and in the last few days I’ve already heard from a few local initiatives that have been working to get rural broadband. They’re telling me that the naysayers in their communities are now pushing them to stop working on a broadband solution since Microsoft is going to bring broadband to rural America using white space spectrum. Microsoft is not going to be doing that, but some of the headlines could make you think they are.

Yesterday I talked about some of the issues that must be overcome in order to make white space spectrum viable. It certainly is no slam dunk that the spectrum is going to be viable for unlicensed use under the FCC spectrum plan. And as we’ve seen in the past, it doesn’t take a lot of uncertainty for a spectrum launch to fall flat on its face, something I’ve seen a few times just in recent decades.

With that in mind, let me discuss what Microsoft actually said in both their blog and whitepaper:

  • Microsoft will partner with telecom companies to bring broadband by 2022 to 2 million of the 23.4 million rural people that don’t have broadband today. I have to assume that these ‘partners’ are picking up a significant portion of the cost.
  • Microsoft hopes their effort will act as a catalyst for this to happen in the rest of the country. Microsoft is not themselves planning to fund or build to the remaining rural locations. They say that it’s going to take some combination of public grants and private money to make the numbers work. I just published a blog last Friday talking about the uncertainty of having a federal broadband grant program. Such funding may or may not ever materialize. I have to wonder where the commercial partners are going to be found who are willing to invest the $8 billion to $12 billion that Microsoft estimates this will cost.
  • Microsoft only thinks this is viable if the FCC follows their recommendation to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has been favoring creating just one channel of unlicensed spectrum per market. The cellular companies that just bought this spectrum are screaming loudly to keep this at one channel per market. The skeptic in me says that Microsoft’s white paper and announcement are a clever way for Microsoft to put pressure on the FCC to free up more spectrum. I wonder if Microsoft will do anything if the FCC sticks with one channel per market.
  • Microsoft admits that for this idea to work manufacturers must mass produce the needed components. This is the classic chicken-and-egg dilemma that has killed other deployments of new spectrum. Manufacturers won’t commit to mass producing the needed gear until they know there is a market, and carriers are going to be leery about using the technology until there are standardized mass market products available. This alone could kill this idea just as the FCC’s plans for the LMDS and MMDS spectrum died in the late 1990s.

I think it’s also important to discuss a few important points that this whitepaper doesn’t talk about:

  • Microsoft never mentions the broadband data speeds that can be delivered with this technology. The whitepaper does talk about being able to deliver broadband to about 10 miles from a given tower. One channel of white space spectrum can deliver about 30 Mbps up to 19 miles in a point-to-point radio shot. From what I know of the existing trials these radios can deliver speeds of around 40 Mbps at six miles in a point-to-multipoint network, with less speed as the distance increases. Microsoft wants multiple channels in a market, because bonding multiple channels could greatly increase speeds to perhaps 100 Mbps. Even with one channel this is great broadband for a rural home that’s never had broadband. But the laws of physics mean these radios will never get faster, and those will still be the speeds offered a decade or two from now, when those speeds are going to feel like slow DSL does today. Too many broadband technology plans fail to recognize that our demand for broadband has been doubling every three years since 1980. What are pretty good speeds today can become inadequate in a surprisingly short period of time.
  • Microsoft wants to be the company to operate the wireless databases behind this and other spectrum. That gives them a profit motive to spur the wireless spectrums to be used. There is nothing wrong with wanting to make money, but this is not a 100% altruistic offer on their part.
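The doubling claim above is easy to make concrete with a quick calculation. This sketch is purely illustrative (the 40 Mbps starting point is the trial speed mentioned above, and the doubling-every-three-years rate is the blog’s premise):

```python
# Illustrative only: household broadband demand has roughly doubled
# every three years since 1980 (the premise stated above). This shows
# how a fixed-speed wireless link compares to growing demand over time.

DOUBLING_PERIOD_YEARS = 3

def demand_multiplier(years: float) -> float:
    """How many times larger demand is after `years`, doubling every 3 years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Suppose a fixed wireless link delivers 40 Mbps today and never improves.
fixed_speed_mbps = 40
for years in (0, 3, 6, 9, 12):
    needed = fixed_speed_mbps * demand_multiplier(years)
    print(f"Year {years:2d}: demand-equivalent speed = {needed:,.0f} Mbps")
```

By year 12 a household would want roughly 640 Mbps to feel as well-served as 40 Mbps feels today – which is why a technology with a fixed speed ceiling ages so quickly.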

It’s hard to know what to conclude about this. Certainly Microsoft is not bringing broadband to all of rural America. But it sounds like they are willing to help make it happen. Still, we can’t ignore the huge hurdles that must be overcome to realize the vision painted by Microsoft in the white paper.

  • First, the technology has to work, and the interference issues I discussed in yesterday’s blog need to be solved before anybody will trust using this spectrum on an unlicensed basis. Nobody will use this spectrum if unlicensed users constantly get bumped off by licensed ones. The trials done for this spectrum to date were not done in a busy spectrum environment.
  • Second, somebody has to be willing to fund the $8B to $12B Microsoft estimates this will cost. There may or may not be any federal grants ever available for this technology, and there may never be commercial investors willing to spend that much on a new technology in rural America. The fact that Microsoft thinks this needs grant funding tells me that a business plan based upon this technology might not stand on its own.
  • Third, the chicken-and-egg issue of getting over the hurdle to have mass-produced gear for the spectrum must be overcome.
  • Finally, the FCC needs to adopt Microsoft’s view that there should be three unlicensed channels available everywhere – something that the license holders are strongly resisting. And from what I see from the current FCC, there is a good chance that they are going to side with the big cellular companies.

White Space Spectrum for Rural Broadband – Part I

Microsoft has announced that they want to use white space spectrum to bring broadband to rural America. In today’s and tomorrow’s blogs I’m going to discuss the latest thoughts on the white space spectrum. Today I’ll discuss the hurdles that must be overcome to use the spectrum, and tomorrow I will discuss in more detail what I think Microsoft is really proposing.

The spectrum referred to as white space has historically been used to broadcast television over the air. In the recent FCC incentive auction the FCC got a lot of TV stations to migrate their signals elsewhere to free up this spectrum for broadband uses. And in very rural America much of this spectrum has been unused for decades.

Before Microsoft or anybody can use this spectrum on a widespread basis the FCC needs to determine how much of the spectrum will be available for unlicensed use. The FCC has said for several years that they want to allocate at least one channel of the spectrum for unlicensed usage in every market. But Microsoft and others have been pushing the FCC to allocate at least three channels per market and argue that the white space spectrum, if used correctly, could become as valuable as WiFi. It’s certainly possible that the Microsoft announcement was aimed at putting pressure on the FCC to provide more than one channel of spectrum per market.

The biggest issue that the FCC is wrestling with is interference. One of the best characteristics of white space spectrum is that it can travel great distances. The spectrum passes easily through things that kill higher frequencies. I remember as a kid being able to watch UHF TV stations in our basement that were broadcast from 90 miles away from a tall tower in Baltimore. It is the ability to travel significant distances that makes the spectrum promising for rural broadband. Yet these great distances also exacerbate the interference issues.

Today the spectrum has numerous users. There are still some TV stations that did not abandon the spectrum. There are two bands used for wireless microphones. There was a huge swath of this spectrum just sold to various carriers in the incentive auction that will probably be used to provide cellular data. And the FCC wants to create the unlicensed bands. To confound things, the mix between the various users varies widely by market.

Perhaps the best way to understand white space interference issues is to compare the spectrum to WiFi. One of the best characteristics (and many would also say one of the worst characteristics) of WiFi is that it allows multiple users to share the bandwidth at the same time. These multiple uses cause interference, and so no user gets full use of the spectrum. But this sharing philosophy is what made WiFi so popular – except in the most crowded environments, anybody can create an application using WiFi and know that in most cases the bandwidth will be adequate.

But licensed spectrum doesn’t work that way, and the FCC is obligated to protect all spectrum license holders. The FCC has proposed to solve the interference issues by requiring radios that dynamically check to make sure there are no licensed uses of the spectrum in the area before transmitting. If a radio senses a licensed use it cannot broadcast, and if it senses one while broadcasting it must abandon the channel.

This would all be done by using a database that identifies the licensed users in any given area, along with radios that can search for licensed usage before making a connection. This sort of frequency scheme has never been tried before. Rather than sharing spectrum, like WiFi, the unlicensed user will only be allowed to use the spectrum when there is no interference. As you can imagine, the licensed cellular companies, which just spent billions for this spectrum, are worried about interference. But there are also concerns from churches, city halls and musicians who use wireless microphones.
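As a rough sketch of the idea (entirely my own illustration – the real FCC white space database protocol, its data model and its APIs are all different and more involved), the database-first access logic looks something like this:

```python
# Hypothetical sketch of database-coordinated unlicensed access.
# All class names, fields and data here are illustrative inventions;
# real white space databases use geographic coordinates, protected
# contours and registered device IDs defined by FCC rules.

from dataclasses import dataclass

@dataclass(frozen=True)
class LicensedUse:
    channel: int
    location: str   # simplified stand-in for a protected service contour

class WhiteSpaceDatabase:
    """Toy registry of licensed users by location and channel."""
    def __init__(self, licensed_uses):
        self._uses = {(u.location, u.channel) for u in licensed_uses}

    def channel_is_free(self, location: str, channel: int) -> bool:
        return (location, channel) not in self._uses

def pick_channel(db: WhiteSpaceDatabase, location: str, candidates):
    """Return the first channel with no licensed user at this location,
    or None. An unlicensed radio may only transmit on a channel the
    database reports as free, and must vacate if a licensed use appears."""
    for ch in candidates:
        if db.channel_is_free(location, ch):
            return ch
    return None

# A busy county seat vs. an open rural area:
db = WhiteSpaceDatabase([LicensedUse(14, "county_seat"),
                         LicensedUse(15, "county_seat")])
print(pick_channel(db, "county_seat", [14, 15, 16]))  # 16
print(pick_channel(db, "rural_area", [14, 15, 16]))   # 14
```

The sketch also shows the structural worry in the paragraph above: in a market where licensed users occupy every candidate channel, `pick_channel` returns nothing and the unlicensed radio simply cannot operate.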

It seems unlikely to me that in an urban area with a lot of usage on the spectrum unlicensed white space spectrum is going to be very attractive. If it’s hard to make or maintain an unlicensed connection then nobody is going to try to use the spectrum in a crowded-spectrum environment.

The question that has yet to be answered is whether this kind of frequency plan will work in rural environments. There have been a few trials of this spectrum over the past five years, but those tests really proved the viability of the spectrum for providing broadband and did not test the databases or the interference issue in a busy spectrum environment. We’ll have to see what happens in rural America once the cellular companies start using the spectrum they just purchased. Because of the great distances over which the spectrum is viable, I can imagine a scenario where the use of licensed white space in a county seat might make it hard to use the spectrum in adjoining rural areas.

And like any new spectrum, there is a chicken-and-egg situation with the wireless equipment manufacturers. They are not likely to commit to making huge amounts of equipment, which would make this affordable, until they know that this is really going to work in rural areas. And we might not know if this is going to work in rural areas until there have been mass deployments. This same dilemma largely sank the LMDS and MMDS spectrum bands fifteen years ago.

The white space spectrum has huge potential. One channel can deliver 30 Mbps to the horizon on a point-to-point basis. But there is no guarantee that the unlicensed use of the spectrum is going to work well under the frequency plan the FCC is proposing.

AT&T’s Broadband Trials

John Donovan, the chief strategy officer for AT&T, spoke at the Mobile World Congress recently and said that the company was trying five different technologies for the last mile. This includes WLL (wireless local loop), G.Fast, 5G, AirGig and fiber-to-the-premise. He said the company would be examining the economics of all of the different technologies. Let me look at each one, in relation to AT&T.

Wireless Local Loop (WLL). The technology uses the company’s LTE bandwidth but utilizes a point-to-multipoint network configuration. By using a small dish on the house to receive the signal the company is getting better bandwidth than can be received from normal broadcast cellular. The company has been doing trials on various versions of the technology for many years. But there are a few recent trials of the newest technology that AT&T will be using for much of its deployment in rural America as part of the CAF II plan. That plan requires the ISP to deliver at least 10/1 Mbps. AT&T says that the technology is delivering speeds of 15 to 25 Mbps. The company says that even at the edge of a cellular network a customer can get 10 Mbps about 90% of the time.

G.Fast. This is a technology that uses high frequencies to put more bandwidth on telephone copper wire. Speeds are reported to be as high as 500 Mbps, but only for very short distances under 200 feet. AT&T recently announced a G.Fast trial in an apartment building in Minneapolis. The technology is also being tested by CenturyLink and Windstream. All of these trials are using existing telephone copper inside of existing apartment buildings to deliver broadband. So this is not really a last mile technology. AT&T brings fiber to the apartment complex and then uses G.Fast as an inside wire technology. If they find it to be reliable this would be a great alternative to rewiring apartments with fiber.

5G. AT&T recently announced a few trials of early 5G technologies in Austin. They are looking at several technology ideas such as carrier aggregation (combining many frequencies). But these are just trials, and AT&T is one of the companies helping to test pre-5G ideas as part of the worldwide effort to define the 5G specifications. These are not tests of market-ready technologies, but are instead field trials for various concepts needed to make 5G work. There is no doubt that AT&T will eventually replace LTE wireless with 5G wireless, but that transition is still many years in the future. The company is touting 5G tests largely for the press release benefits – these are not tests of a viable last mile technology, just tests that are moving lab concepts to early field trials.

AirGig. This one remains a mystery. AT&T says it will begin trialing the technology later this year with two power companies. There has been a little bit of clarification of the technology since the initial press release. This is not a broadband over powerline technology – it’s completely wireless and uses the open lines-of-sight on top of power poles to create a clear path for millimeter wave radios. The company has also said that they don’t know yet which wireless technology will be used to go from the poles into the home – they said the whole range of licensed spectrum is under consideration, including the LTE frequencies. And if that’s the case then AirGig is a fiber replacement, but the delivery to homes would be about the same as WLL.

FTTP. Donovan referred to fiber-to-the-home as a trial, but by now the company understands the economics of fiber. The company keeps stretching the truth a bit about its fiber deployments, saying it has deployed fiber to 4 million homes, with 8 million more coming in the next three years. But the fact is it has actually only passed the 4 million homes that it can market to, as disclosed on its own web site. The twelve million home target was something that was dictated by the FCC as part of the settlement allowing the company to buy DirecTV.

We don’t know how many fiber customers AT&T has. They are mostly marketing this to apartment buildings, although there are residential customers around the country saying they have it. But they have not sold big piles of fiber connections like Verizon FiOS. This can be seen by looking at the steady drop in total AT&T data customers – 16.03 million in 2014, 15.78 million in 2015 and 15.62 million at the end of the third quarter of 2016. AT&T’s fiber is not really priced to be super-competitive, except in markets where they compete with Google Fiber. Their normal prices elsewhere on fiber are $70 for 100 Mbps, $80 for 300 Mbps and $99 for a gigabit.

The Resurgence of Rabbit Ears

There is perhaps no better way to understand the cord cutting phenomenon than by looking at the booming sales of home TV antennas known as ‘rabbit ears’ used to receive local television off the airwaves. A study released by Parks Associates shows that 15% of households now use rabbit ears, and that is a pretty amazing statistic. That is up from 8% of households as recently as 2013. And I recall an earlier time when this had fallen below 5%.

For the longest time the TV-watching public was counted in three groups – those who had cable TV (including satellite), those who used rabbit ears to watch local TV only, and those with no TV. We now have a fourth category – those who only watch OTT programming such as Netflix.

I was once in the category of not watching TV at all. I remember twenty years ago I went to Circuit City (now gone) to consider buying a set of rabbit ears and the clerks there weren’t even sure if the store carried them. With some asking around they found that they had a few units of one brand that had been gathering dust.

But today there is a resurgence in rabbit ears and there are easily a dozen major brands. And there are new rabbit ear options coming on the market all of the time. For example, Sling TV just launched AirTV, a $99 box that integrates Sling TV, Netflix and high-quality rabbit ears together with a voice-activated remote control that makes it easy to cut the cord. This looks to be one of the better voice-activation systems around and lets you search programming options by using the names of shows, actors’ names or genres of programming.

Since most people have had cable TV for a long time many have no idea of what they can receive off air for free. The FCC has an interesting map that shows you the expected reception in your area. In my case the map shows that I can get a strong signal from every major network including CW and PBS along with signals from MyTV, Univision and a few independent local stations.

The Parks study also looks at other industry statistics. A few of the most interesting ones include:

  • Penetration of pay-TV was down to 81% in 2016 and has fallen every year since 2014. Parks cites the normal reasons for the decline including the growth of OTT programming, the increasing cost of a cable TV subscription and growing consumer awareness that there are viable alternatives to cable TV.
  • Satisfaction with pay-TV keeps dropping and only one-third of households now say that they are very satisfied with their pay-TV service.
  • OTT viewing continues to rise and 63% of US households now subscribe to at least one OTT offering like Netflix while 31% of households subscribe to more than one.
  • In 2016 12% of households downgraded their pay-TV service (meaning dropped it or went to a less expensive option). This was double the percentage (6%) who upgraded their pay-TV service in 2016.
  • Very few cord nevers (those who have never had cable TV) are deciding to buy pay-TV, with only 2% of them doing so in 2016. This is the statistic that scares the cable companies because cord nevers include new Millennial households. This generation is apparently not interested in being saddled with a pay-TV subscription. In past generations the percentage of new homes that bought pay-TV closely matched the overall penetration of the market – buying TV was something you automatically did when you moved to a new place.

These statistics show how much choice the OTT phenomenon has brought to the marketplace. Ten years ago there wouldn’t have been industry experts predicting the resurgence of rabbit ears. In fact, rabbit ears were associated with other obsolete technologies like buggy whips and were used as the butt of jokes to make fun of those who didn’t like the modern world. But this is no longer true and new rabbit ear homes are perhaps some of the most tech savvy, who know that they can craft an entertainment platform without sending a big check to a cable company.


The FCC’s Cable Price Report

Once a year the FCC releases a Report on Cable Industry Prices and this year’s report came out a few weeks ago. This current report has some very odd findings that make me think that perhaps this report is no longer needed.

The report looked at the prices charged for basic cable and expanded basic cable in 485 communities in the US, some where the FCC has made a declaration of effective competition and others with no competition.

I think the results shown in the report are off because the findings show average rate increases that are far below what is reported everywhere else in the industry. The FCC says that the price of basic cable increased by only 2.3% over the last year to reach a price of $23.79. More surprisingly, the average price of expanded basic cable increased by only 2.7% to reach $69.03 which was slightly lower than the increase in inflation. This compares to the 10-year historical average of 4.8% increases per year from this same report.

The increase in basic cable might be accurate because there are years when many companies don’t increase this rate. But the expanded basic rate increase is baffling. I wrote a blog back in the beginning of the year showing much larger increases for all of the big cable companies this year except Charter, due to their impending merger – and they caught up later in the year.

I think that perhaps the FCC is no longer asking the right questions. It’s certainly possible that the published prices for expanded basic cable increased as they have said – but that doesn’t tell us anything about what customers are really paying.

I suspect the FCC is not picking up the plethora of new ‘fees’ that are being used to disguise the price of cable. These might be called network programming fees to cover the cost of buying local programming. Or they might be called sports charges to cover the ever-rising cost of sports programming. Every big company labels these fees a little differently. But these fees are part of the cable bill that people pay each month and the primary purpose of the fees is to allow the cable companies to claim lower cable rates. These fees also confuse customers who often think they are taxes. My guess is that the FCC did not include these fees – and they must be included because they are nothing more than a small piece of the cable bill labeled differently.
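A hypothetical example shows how these fees can mask the real increase. The $69.03 list price and the ~2.7% reported increase are from the FCC report cited above; the fee amounts are invented purely for illustration:

```python
# Hypothetical illustration of how separate 'fees' understate rate
# increases. The fee dollar figures below are invented for the example;
# only the list prices reflect the FCC report's numbers.

last_year_list = 67.22   # expanded basic list price implied by a 2.7% increase
this_year_list = 69.03   # list price this year (from the FCC report)

last_year_fees = 5.00    # broadcast/sports fees last year (invented)
this_year_fees = 9.00    # fees this year (invented)

list_increase = this_year_list / last_year_list - 1
real_increase = ((this_year_list + this_year_fees)
                 / (last_year_list + last_year_fees) - 1)

print(f"Reported list-price increase: {list_increase:.1%}")
print(f"Increase in what customers actually pay: {real_increase:.1%}")
```

With these made-up fee numbers the reported increase is about 2.7% while the bill customers actually pay rises about 8% – which is exactly the kind of gap that would reconcile the FCC’s findings with the larger increases reported elsewhere in the industry.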

Additionally, I’ve seen a number of estimates that say that around 70% of households buy cable as part of a bundle, and for these households the change in the list price of the components of the bundle doesn’t matter – customers only care about the overall increase in the price of the bundle. Customers don’t know or care which piece of the bundle increases since they are rarely shown the cost of bundle components.

And this leads to a discussion of the fact that cable companies have recently begun increasing the prices of other products in order to keep cable rates lower. Rather than raise the price of cable they might instead raise the fees mentioned above, raise the price of the cable modem or the settop box, or raise the price of the broadband products. And all the cable companies care about – and all most customers see – is the increase in the total bill.

Finally, we know that there are now many different rates in every market. Cable companies sell specials or negotiate contract renewals with customers. At CCG we often gather customer bills to try to understand a market and we often see customers with an identical package with prices varying by as much as 10 or 15 dollars. None of the variation in actual rates makes it into the FCC report. I think this report only looks at the published list price and those prices are largely irrelevant since they don’t reflect what customers really pay.

So I think the usefulness of this report is over. If I recall this report was mandated by Congress, and so the FCC is probably obligated to keep producing it. But the results it now shows have almost nothing to do with the rates that customers actually pay for cable TV in the real world.

ESPN and the Cable Industry

I’ve been writing periodically about ESPN because they seem to be the poster child for what is happening to cable TV and to programmers in the country. It’s been obvious over the last year or two that ESPN is bleeding customers, and the many articles about them concentrate on that issue.

ESPN is a good bellwether for the industry because they are carried by practically every cable TV provider, and because their contracts require that the channel be carried in the expanded basic tier – the tier that generally has between 50 and 75 channels. Only a few tiny rural cable systems don’t carry ESPN since they carry only a small number of channels.

When ESPN loses customers it can only come from one of two reasons – people that cut the cord and drop cable altogether or from cord shavers who downsize to the smallest basic cable package. Basic cable is the small package of 10 – 15 channels that includes the local network affiliates, government channels and a few cheap throw-ins like shopping channels.

But it’s not easy to figure out the real number of cord cutters and cord shavers. The largest cable companies report total subscriber numbers each quarter but they don’t report on the packages that customers buy. Various analysts estimate the number of cord cutters each quarter, but they differ on these estimates – and I haven’t seen anybody try to estimate the number of cord shavers.

Nielsen tracks the number of customers of each cable network and that tells us how the various cable TV networks are faring. The latest article on ESPN comes from Sports TV Ratings, a website that tracks subscribers to the various sports networks. That site shows that ESPN lost 621,000 subscribers just last month (October 2016). That is an astounding number since ESPN has roughly 89 million customers – it’s a drop of seven-tenths of a percent, which annualized would be over 8% of ESPN customers.

But that number may not be a huge aberration. FierceCable reported earlier this year that ESPN had lost 2.2 million customers between February and August of this year, which is a clip of 440,000 lost customers per month. And the network has lost more than 11 million customers since its peak in 2013 when it had almost 100 million customers.
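The annualized figure cited above is easy to verify with a quick compounding calculation:

```python
# Verify the annualized-loss arithmetic: 621,000 subscribers lost in
# one month out of roughly 89 million, compounded over twelve months.

subscribers = 89_000_000
monthly_loss = 621_000

monthly_rate = monthly_loss / subscribers          # about 0.70% per month
annualized = 1 - (1 - monthly_rate) ** 12          # compounds to just over 8%

print(f"Monthly loss rate: {monthly_rate:.2%}")
print(f"Annualized loss:   {annualized:.1%}")
```

Compounding matters here: twelve straight months at that pace loses a bit over 8% of the base, not simply 12 × 0.7% = 8.4%, because each month’s loss comes off a slightly smaller base.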

Trying to count cord shaving gets even more complicated because of OTT content. The cited drop of 621,000 ESPN customers comes from the Nielsen numbers for carriage on cable systems, and doesn’t count online services that carry ESPN. For instance, the basic package on Sling TV includes ESPN, and Goldman Sachs estimates that Sling TV will have almost 2 million customers by the end of this year. There are a number of new OTT offerings just hitting the market that will include the network, but for now Sling TV has most of the online ESPN subscribers.

ESPN has an advantage over many other networks in that it probably can win back customers by selling directly to people on the web. And so perhaps the network can find an equilibrium number of customers at some threshold lower than today’s. But this is not going to be true for a lot of other content. As an example, in October the Golf Channel lost 600,000 subscribers and the MLB Network lost 515,000 – and those kinds of networks have very limited appeal on a standalone basis. That is the real story behind the losses at ESPN: the vast majority of cable networks are bleeding customers right now.

Some of the content providers are not too worried about the drop of US cable customers since they are picking up far greater numbers of new customers worldwide right now. But networks that are US-centric – sports, news, weather – are in for a rough ride over the next few years as the industry settles out to a new and lower norm. I think we can expect to see a transformation of sports programming as the numerous sports networks bleed customers. This probably means more emphasis on live programming and fewer sports networks.

Technology and Telecom Jobs

In case you haven’t noticed, the big companies in the industry are cutting a lot of jobs – maybe the biggest job cuts ever in the industry. These cuts are due to a variety of reasons, but technology change is a big contributor.

There have been a number of announced staff cuts by the big telecom vendors. Cisco recently announced it would cut as many as 5,500 jobs, or about 7% of its global workforce. Cisco’s cuts are mostly due to the Open Compute Project, through which the big data center owners like Facebook, Amazon, Google and Microsoft have turned to developing and directly manufacturing their own routers, switches and data center gear. Cloud services are meanwhile wiping out the need for corporate data centers as companies move most of their computing to the much more efficient cloud. Even customers that are still buying Cisco boxes are buying fewer of them, since the newer technology provides a huge increase in capacity over the old and they need fewer routers and switches.

Ericsson has laid off around 3,000 employees due to falling business. The biggest culprit for them is SDN (Software Defined Networking). Most of the layoffs are related to cell site electronics. The big cellular companies are actively converting their cell sites to centralized control, with the brains in the core. That will let these companies make one change and have it instantly implemented across tens of thousands of cell sites. Today that process requires upgrading the brains at each cell site, with a horde of technicians traveling to update each one.

Nokia plans to lay off at least 3,000 employees, and maybe more. Part of these layoffs are due to the final integration of its purchase of Alcatel-Lucent, but they also stem from the technology changes that are affecting every vendor.

Cuts at operating carriers are likely to be a lot larger. A recent article in the New York Times reported that internal projections at AT&T had the company planning to eliminate as many as 30% of its jobs over the next few years – 80,000 people, which would be the biggest telco layoff ever. The company has never officially confirmed a number, but top AT&T officials have been warning all year that many job functions at the company are going to disappear and that only nimble employees willing to retrain have any hope of retaining a long-term job.

AT&T will be shedding jobs for several reasons. One is the big reduction in technicians needed to upgrade cell sites. But an even bigger reason is the company’s plan to decommission and walk away from huge amounts of its copper network. There is no way to know if the 80,000 number is valid, but even a reduction half that size would be gigantic.

And vendor and carrier cuts are only a small piece of the cuts that are going to be seen across the industry. Consider some of the following trends:

  • Corporate IT staffs are downsizing quickly as computing functions move to the cloud. A huge number of technicians with Cisco certifications, for example, are finding themselves out of work as their employers eliminate in-house data centers.
  • On the flip side, huge data centers are being built to take over these same IT functions with only a tiny handful of technicians. I’ve seen reports of cities and counties that gave big tax breaks to data centers expecting them to bring jobs, only to see those data centers operate with fewer than ten employees.
  • In addition to employees, there are fleets of contractor technicians who do things like update cell sites, and those opportunities are going to dry up over the next few years. There will always be work for technicians brave enough to climb cell towers, but that is not a giant source of demand.

It looks like, over the next few years, there are going to be a whole lot of unemployed technicians. Technology companies have always been cyclical, and it’s never been unusual for engineers and technicians to work for a number of different vendors or carriers over a career. But in the past, when one part of the industry downsized, the slack was usually picked up somewhere else. This time we might be looking at a permanent downsizing. Once SDN networks are in place, those jobs are not coming back. Once most IT functions are in the cloud, those jobs aren’t coming back. And once rural copper networks are replaced with 5G cellular, those jobs aren’t coming back.

The Urban Broadband Gap

It’s natural to think that all city-dwellers have great broadband options. But when you look closer you find that’s often not so. For various reasons there are sizable pockets of urban folks with gaping broadband needs.

Sometimes the broadband gap is only partial. I was talking yesterday to a guy from Connecticut who lives in a neighborhood that largely commutes to New York City for work. These are rich neighborhoods of investment bankers, stockbrokers and other white-collar households. They have cable modem service from Comcast and can get home broadband, but he tells me that cellphone coverage is largely non-existent – he can’t even use his cellphone outside of his house. There is a lot of talk about broadband migrating to wireless, but 5G broadband isn’t going to benefit people who can’t even get low-bandwidth cellular voice service.

I also have a good friend who lives in a multi-million dollar home in Potomac, Maryland – the wealthiest town in one of the wealthiest counties in the country. He has no landline broadband – no cable company, no Verizon FiOS, not even usable DSL. His part of town has winding roads and sprawling lots and was built up over time. I’m sure it never met the cable company’s franchise density requirement of at least 15 or 20 homes per street mile – so it never got built. Most of the city surely has broadband, but even the richest communities have homes without.
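The density screen that leaves neighborhoods like that one unserved is simple arithmetic: construction cost per mile divided by homes passed per mile. This sketch shows the mechanics; the cost-per-mile and threshold figures are illustrative assumptions, not actual numbers from any cable company.

```python
# A rough version of the buildout screen cable companies apply:
# a neighborhood gets built only above some homes-per-mile threshold.
# COST_PER_MILE and THRESHOLD are assumed values for illustration.

COST_PER_MILE = 30_000   # assumed construction cost per street mile
THRESHOLD = 15           # assumed minimum homes per street mile

def cost_per_home(homes: int, street_miles: float) -> float:
    """Construction cost spread across the homes passed."""
    return (street_miles * COST_PER_MILE) / homes

def passes_density_test(homes: int, street_miles: float) -> bool:
    """True if the neighborhood clears the homes-per-mile threshold."""
    return homes / street_miles >= THRESHOLD

# A winding-road neighborhood: 40 homes spread over 5 street miles
# works out to 8 homes per mile and $3,750 of plant per home passed,
# so it fails the screen no matter how wealthy the residents are.
print(cost_per_home(40, 5.0))
print(passes_density_test(40, 5.0))
```

That is why income alone doesn’t predict coverage – a street of mansions on large lots can fail the same test a dense working-class block passes easily.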

You often see this problem just outside of city boundaries. Cities generally have franchise agreements that require the cable company to serve everybody, or almost everybody. But since counties rarely have these agreements, the cable and phone companies are free to pick and choose who to serve outside of town. You will see one neighborhood outside a city with a cable network while a similar neighborhood nearby goes without; it’s easy to find these pockets by looking for satellite TV dishes. The difference between the two neighborhoods is often due to nothing more than the whim of the telco and cable companies at the time of original construction.

The fault for not having broadband can’t always be laid on the cable company. Apartment owners and real estate developers are often at fault. For example, there are many apartment buildings where the owner made a deal years ago with a satellite TV provider for bulk cable TV service on a revenue-sharing basis. In electing satellite TV the apartment owner excluded the cable company, and today the building has no broadband.

Real estate developers often make the same bad choices. For instance, some hoped to provide broadband themselves, but it never came to fruition. I’ve even seen developments that simply waited too long to invite in the cable company or telco, and the service providers declined to build after the streets were paved. The National Broadband Map is a great resource for understanding local broadband coverage. In my own area there are two neighborhoods on the map that show no broadband. When I first saw the map I assumed these were parks, but there are homes in both areas. I don’t know why they are sitting without broadband, but it’s as likely to be a developer issue as a cable company issue.

There have also been several articles written recently that accuse the large cable companies and telcos of economic redlining. These companies may use some of the above excuses for not building in the poorer parts of an urban area, but overlaying broadband coverage with incomes often paints a startling picture. Since deciding where a cable company expands is often at the discretion of local and regional staff, it’s not hard to imagine bias entering the process.

I’ve seen estimates that between 6 and 8 million urban people don’t have broadband available. These have to be a mixture of the above situations: the neighborhood is outside a franchise area, the developer or apartment owner didn’t let ISPs in, or the ISPs are engaging in economic redlining. Whatever the reasons, that is a lot of people, especially when added to the 14 million rural citizens without broadband.

I spend a lot of my time working on the rural broadband gap, but I don’t see much concentrated effort looking at the urban gap. That’s probably because the urban gap comes one subdivision, one apartment building or one street at a time, with the surrounding households having broadband. It’s hard to cobble together a constituency of these folks, and even harder to find an economic solution to the problem.

Wall Street and Programmers

In an intriguing development, analyst Michael Nathanson has downgraded Discovery Networks and Scripps Networks Interactive from ‘neutral’ to ‘sell’. His reason is that he sees a poor future for programmers that don’t carry live TV events like sports or news.

Discovery Networks produces the various Discovery channels along with Animal Planet, TLC, Science, Velocity, OWN and American Heroes Channel. Scripps produces HGTV, the Food Network, the DIY Network, the Cooking Channel, Great American Country, the Travel Channel and TVN.

Nathanson believes that advertising is starting to chase live content and abandon other content. There is a major trend of people skipping traditional broadcast ads using DVRs and video on demand. He further notes that all cable channels are losing viewers to OTT alternatives like Netflix. All of this adds up to a significant drop in advertising revenues for traditional cable networks that air shows paid for by advertising.

These networks are also feeling pressure on cable subscriptions. We know, for example, that ESPN has lost millions of customers since 2015, and one has to think the same thing is happening to all of the other networks. The ESPN losses seem to be due in part to cord cutting, but even more to cord shaving, where customers downsize their cable packages. I listen to a lot of radio and I constantly hear ads from DirecTV and others pushing their new skinny bundles. Each time somebody picks a skinny bundle or an alternative like Sling TV, a whole lot of channels lose a monthly subscription.

This might be the first crack in the programmers’ armor. For nearly two decades they have been able to raise rates to cable companies while also enjoying ever-increasing advertising revenues. And this ever-growing revenue made the programmers a favorite of Wall Street which rewards revenues that grow quarter after quarter. But we are starting to see advertising revenues abandoning cable and moving to online venues. This year is the first year when web advertising will eclipse TV advertising.

It seems these networks are facing a perfect storm. Advertising in general is leaving cable – and within that shift, if Nathanson is right, it will leave traditional cable channels much faster than those offering live programming. We are also seeing traditional cable subscriptions shifting to skinny bundles and OTT. There is no doubt that all of this is going to add up to smaller revenues for these networks. And since contracts between programmers and cable companies run 3 to 5 years, the programmers can’t raise subscription rates quickly enough to make up for these losses. Even if they tried to maintain growth through rate increases, it’s likely today that they would get a lot of pushback from the cable companies.

It’s hard to feel any sympathy for the programmers, because it is their greed that has made cable too expensive for many homes. Programming rates in recent years have increased nearly 10% per year – many times the rate of general inflation. Those rate increases were clearly done to please Wall Street, but it didn’t take a crystal ball to see that they were not sustainable.

The way we value large companies in the US is perverse. These networks make a lot of money, and even with all of these changes they are going to keep making a lot of money for a long time to come. But companies that fall out of favor with Wall Street generally have huge problems. These companies are going to be pressured to somehow fix the situation, but there doesn’t seem to be any way for them to do that. We are likely to see them start ditching unprofitable channels, and the companies might be sold or split into smaller pieces. Once Wall Street abandons a company, it’s unlikely to just sit still.

The programmers have held almost all of the power in the industry for a long time – but maybe we are starting to see a change. That can only be a good thing for the industry.