New Technology – October 2017

I’ve run across some amazing new technologies that hopefully will make it to market someday.

Molecular Data Storage. A team of scientists at the University of Manchester recently made a breakthrough with a technology that allows high volumes of data to be stored within individual molecules. They’ve shown the ability to create high-density storage that could hold 25,000 gigabits of data on something the size of a quarter.

They achieved the breakthrough using molecules containing the element dysprosium (that’s going to send you back to the periodic table) cooled to a temperature of -213 C. At that temperature the molecules retain magnetic alignment. Previously this required cooling molecules to -259 C. The group’s goal is to find a way to do this at -196 C, the temperature of affordable liquid nitrogen, which would make this a viable commercial technology.
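For context, it helps to convert those temperatures to Kelvin, since the jump from -259 C to -213 C is much bigger than it looks and the remaining gap to liquid nitrogen is comparatively small. A quick sketch:

```python
# Convert the temperatures cited above from Celsius to Kelvin to show
# how large the Manchester jump was, and how close the goal sits to
# the liquid nitrogen threshold.
def celsius_to_kelvin(c):
    return c + 273.15

milestones = {
    "previous molecular magnets": -259,   # about 14 K
    "Manchester dysprosium result": -213, # about 60 K
    "liquid nitrogen goal": -196,         # about 77 K
}

for label, c in milestones.items():
    print(f"{label}: {c} C = {celsius_to_kelvin(c):.0f} K")
```

Measured in Kelvin, the new result is roughly four times warmer than the old record, with only about 17 degrees left to reach cheap liquid-nitrogen cooling.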

The most promising use of this kind of dense storage would be in large data centers, since it is 100 times denser than existing technologies. This would make data centers far more energy efficient while also speeding up computing. That kind of improvement matters, since there are predictions that within 25 years data centers will be the largest users of electricity on the planet.

Bloodstream Electricity. Researchers at Fudan University in China have developed a way to generate electricity from a small device immersed in the bloodstream. The device uses stationary nanoscale carbon fibers that act like a tiny hydropower generator. They’ve named the device the ‘fiber-shaped fluidic nanogenerator’ (FFNG).

Obviously there will need to be a lot of testing to make sure that the devices don’t cause problems like blood clots. But the devices hold great promise. A person could use these devices to charge a cellphone or wearable device. They could be used to power pacemakers and other medical devices. They could be inserted to power chips in farm animals that could be used to monitor and track them, or used to monitor wildlife.

Light Data Storage. Today’s theme seems to be small, and researchers at Caltech have developed a small computer chip that is capable of temporarily storing data using individual photons. This is the first team that has been able to reliably capture photons in a readable state on a tiny device. This is an important step in developing quantum computers. Traditional computers store data as either a 1 or a 0, but quantum computers can also store data that is both a 1 and a 0 simultaneously. This has been shown to be possible with photons.

Quantum computing devices need to be small and operate at the nanoscale because they hold data only fleetingly until it can be processed, and nanochips allow rapid processing. The Caltech device is tiny, around the size of a red blood cell. The team was able to store a photon for 75 nanoseconds, and the ultimate goal is to store information for a full millisecond.

Photon Data Transmission. Researchers at the University of Ottawa have developed a technology to transmit a secure message using photons that are carrying more than one bit of information. This is a necessary step in developing data transmission using light, which would free the world from the many limitations of radio waves and spectrum.

Radio wave data transmission technologies send one bit of data at a time with each passing wavelength. Being able to send more than one bit of data with an individual photon creates the possibility of sending massive amounts of data through the open atmosphere. Scientists have achieved the ability to encode multiple bits on a photon in the lab, but this is the first time it’s been done through the atmosphere in a real-world application.

The scientists are now working on a trial between two locations that are almost three miles apart and that will use a technology they call adaptive optics that can compensate for atmospheric turbulence.

There are numerous potential uses for the technology in our industry. This could be used to create ultrahigh-speed connections between a satellite and earth. It could be used to transmit data without fiber between locations with a clear line-of-sight. It could be used as a secure method of communicating with airplanes, since narrow light beams are extremely hard to intercept or hack.

The other use of the technology is to leverage the ability of photons to carry more than one bit of data to create a new kind of encryption that should be nearly impossible to break. The photon data transmission allows for the use of 4D quantum encryption to carry the keys needed to encrypt and decrypt packets, meaning that every data packet could use a different encryption scheme.
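The capacity gain from multi-bit photons is easy to quantify: a photon prepared in one of d distinguishable states carries log2(d) bits, so the 4D states mentioned above carry twice as much as ordinary binary encoding. A minimal illustration:

```python
import math

# Bits carried by a single photon prepared in one of `dimensions`
# distinguishable states.
def bits_per_photon(dimensions):
    return math.log2(dimensions)

print(bits_per_photon(2))  # binary encoding: 1.0 bit per photon
print(bits_per_photon(4))  # 4D quantum states: 2.0 bits per photon
```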

Generations Matter

Nielsen recently published its Total Audience Report for Q1 2017. It’s the best evidence that I’ve seen yet that there is a huge difference between generations when it comes to video viewing habits. Compared to most surveys that look at a few thousand people, these statistics are based on almost 300,000 households.

The report examined in detail the viewing habits of the different US generations – Generation Z (ages 2 – 20), Millennials (ages 21 – 37), Generation X (ages 38 – 52), Baby Boomers (ages 53 – 70) and the Greatest Generation (ages 71+). What might surprise a lot of people is that Generation Z and the Millennials together now make up 48% of the US population – and that means their viewing habits are rapidly growing in importance to the cable TV industry.

The report outlines how the various generations own or use various devices or services. But note that these responses represent the entire household. So, for example, when Nielsen sought answers from somebody in Generation Z, it’s likely that the answers represent what is owned by their parents, who are likely Millennials or in Generation X. Here are a few interesting statistics:

  • The broadband penetration rate between generations is about the same, ranging from 82% to 85% of households. It wasn’t too many years ago when the baby boomer households lagged in broadband adoption.
  • There is a significant difference in the use of OTT services like Netflix. 73% of households representing Generation Z subscribe to an OTT service, compared to only 51% of baby-boomer-only households.
  • Baby boomers also lag in smartphone adoption at 86% with the younger generations all between 95% and 97% adoption.
  • Baby boomers also lag in the adoption of an enabled smart TV (meaning it’s connected to the web). 28% of baby boomers have an enabled smart TV while younger households are at about 39%.

The biggest difference highlighted in the report is the daily time spent using various entertainment media that includes such things as TV, radio, game consoles, and surfing the Internet.

The big concern to the cable industry is the time spent watching cable content. For example, the average monthly TV viewing for those over 65 is 231 hours of live TV and 34 hours of time-shifted TV. But for people aged 12-17 that is only 60 hours live and 10 hours time-shifted. For ages 18-24 it’s 72 hours live and 12 hours time-shifted. For ages 25-34 it’s 101 hours live and 19 hours time-shifted. This is probably the best proof I’ve seen of how much less younger generations are invested in traditional TV.
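Tabulating the numbers in that paragraph makes the size of the generational gap obvious:

```python
# Monthly TV viewing hours (live, time-shifted) by age group,
# from the Nielsen figures cited above.
viewing = {
    "12-17": (60, 10),
    "18-24": (72, 12),
    "25-34": (101, 19),
    "65+": (231, 34),
}

totals = {group: live + shifted for group, (live, shifted) in viewing.items()}
gap = totals["65+"] - totals["18-24"]

print(totals)  # {'12-17': 70, '18-24': 84, '25-34': 120, '65+': 265}
print(f"65+ viewers watch {gap} more hours per month than those 18-24")
```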

This drastic difference for TV stands out because for other kinds of media there is not such a stark difference. For example, those over 65 spend about 67 hours per month using apps on smartphones while those 18-24 use 77 hours and those 25-34 use 76 hours.

There wasn’t even a drastic difference in the number of hours spent monthly watching video on a smartphone, with those over 65 watching 2 hours per month compared to 7 hours for those 18-24 and 6 hours for those 25-34.

The only other media with a stark difference is video game consoles with those over 65 using 13 hours per month while those 18-24 use 49 hours per month. Other things like listening to the radio or using a multimedia device (like Roku or Apple TV) are similar across generations.

The drastic difference in TV viewing has serious repercussions for the industry. For example, TV is no longer a medium that can be used to reach those aged 18-24, since they watch over 180 hours less TV per month than those over 65. We’re seeing a big shift in advertising dollars, and during the last year the amount spent on web advertising surpassed TV advertising for the first time. When you trend this forward a decade it spells bad news for the broadcasting and cable industries. For many years there was a big hope that as people got older they would revert to the usage patterns of their parents. But the evidence shows that the opposite seems to be true – that kids keep their viewing habits as they grow older.

When you compare this report to earlier ones it’s obvious that the difference between generations is widening. Comparing to just 2016, those over 65 are watching more TV each month while the youngest generations are cutting back on TV over time – Generation Z watched 15 minutes less TV per day just since 2016.

The Future of AT&T and Verizon

The cellphone companies have done such a great job of getting everybody to purchase a smartphone that cellular service in the country is quickly turning into a commodity. And, as is typical with most commodity products, that means less brand loyalty from customers and lower market prices for the products.

We’ve recently seen the cellular market demonstrate the turn toward becoming a commodity. In the first quarter of this year the cellular companies had their worst performance since the industry began. Both AT&T and Verizon posted net losses of post-paid customers for the quarter. T-Mobile added fewer customers than expected and Sprint continued to lose money.

This is a huge turnaround for an industry where the big two cellular companies were each making over $1 billion per month in profits. The change in the industry comes from two things. First, people are now shopping for lower prices and are ready to change carriers to get lower monthly bills. The trend for lower prices was started by T-Mobile to gain market share, but low prices are also being pushed by cellular resellers – being fed by the big carriers. The cellular industry is only going to get more competitive when the cable companies soon enter the market. That will provide enough big players to make cellular minutes a true commodity. The cable companies have said they will be offering low prices as part of packages aimed at making customers stickier and will put real price pressure on the other cellular providers.

But the downturn in the first quarter was almost entirely due to the rush by all of the carriers to sell ‘unlimited’ data plans – which, as I’ve noted in some earlier blogs, are really not unlimited. But these plans offer lower prices for data and free consumers to use their smartphones without the fear of big overage fees. Again, this move was started by T-Mobile, but it was also driven heavily by public demand. AT&T and Verizon recognized that if they didn’t offer this product set they were going to start bleeding customers to T-Mobile.

It will be really interesting to watch what happens to AT&T and Verizon, who are now predominantly cellular companies that also happen to own networks. The vast majority of revenues for these companies comes from the cellular parts of their companies. When I looked at both of their annual reports last year I had a hard time finding evidence that these companies were even in the landline network business. Discussions of those business lines are buried deeply within the annual reports.

These companies obviously need to find new forms of revenues to stay strong. AT&T is tackling this for now by going in a big way after the Mexican market. But one only has to look down the road a few years to see that Mexico and any other cellular market will also trend towards commoditization.

Both companies have their eyes on the same potential growth plays:

  • Both are making the moves necessary to tackle the advertising business. They look at the huge revenues being made by Facebook and Google and realize that as ISPs they are sitting on customer data that could make them major players in the targeted marketing space. Ad revenues are the predominant revenue source at Google and if these companies can grab even a small slice of that business they will make a lot of money.
  • Both are also chasing content. AT&T’s bid for the purchase of Time Warner is still waiting for government approval. Verizon has made big moves with the purchases of AOL and Yahoo and is rumored to be looking at other opportunities.
  • Both companies have been telling stockholders that there are huge amounts of money to be made from the IoT. These companies want their cellular networks to be the default networks for collecting data from IoT devices. They certainly ought to win the business for things like smart cars, but there will be a real battle between cellular and WiFi/landline connections for most other IoT usage.
  • Both companies are making a lot of noise about 5G. They are mostly concentrating on high-speed wireless connections using millimeter wave spectrum that they hope will make them competitive with the cable companies in urban areas. But even that runs a risk because if we see true competition in urban areas then prices for urban broadband might also tumble. And that might start the process of making broadband into a commodity. On the cellular side it’s hard to think that 5G cellular won’t quickly become a commodity as well. Whoever introduces faster cellphone data speeds might get a bump upward for a few years, but the rest of the industry will certainly catch up to any technological innovations.

It’s hard to foresee any business line where AT&T and Verizon are going to get the same monopoly power that they held in the cellular space for the past few decades. Everything they might undertake is also going to be available to competitors, meaning they are unlikely to make the same kind of huge margins they have historically made with cellular. No doubt they are both going to be huge companies for many decades to come since they own the cellular networks and spectrum. But I don’t think we can expect them to be the cash cows they have been in the past.

White Space Spectrum for Rural Broadband – Part II

Word travels fast in this industry, and in the last few days I’ve already heard from a few local initiatives that have been working to get rural broadband. They’re telling me that the naysayers in their communities are now pushing them to stop working on a broadband solution since Microsoft is going to bring broadband to rural America using white space spectrum. Microsoft is not going to be doing that, but some of the headlines could make you think they are.

Yesterday I talked about some of the issues that must be overcome to make white space spectrum viable. It certainly is no slam dunk that the spectrum is going to be viable for unlicensed use under the FCC spectrum plan. And as we’ve seen in the past, it doesn’t take a lot of uncertainty for a spectrum launch to fall flat on its face – something I’ve seen a few times in recent decades.

With that in mind, let me discuss what Microsoft actually said in both their blog and whitepaper:

  • Microsoft will partner with telecom companies to bring broadband by 2022 to 2 million of the 23.4 million rural people who don’t have broadband today. I have to assume that these ‘partners’ are picking up a significant portion of the cost.
  • Microsoft hopes their effort will act as a catalyst for this to happen in the rest of the country. Microsoft is not themselves planning to fund or build to the remaining rural locations. They say that it’s going to take some combination of public grants and private money to make the numbers work. I just published a blog last Friday talking about the uncertainty of having a federal broadband grant program. Such funding may or may not ever materialize. I have to wonder where the commercial partners are going to be found who are willing to invest the $8 billion to $12 billion that Microsoft estimates this will cost.
  • Microsoft only thinks this is viable if the FCC follows their recommendation to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has been favoring creating just one channel of unlicensed spectrum per market. The cellular companies that just bought this spectrum are screaming loudly to keep this at one channel per market. The skeptic in me says that Microsoft’s white paper and announcement are a clever way for Microsoft to put pressure on the FCC to free up more spectrum. I wonder if Microsoft will do anything if the FCC sticks with one channel per market.
  • Microsoft admits that for this idea to work, manufacturers must mass produce the needed components. This is the classic chicken-and-egg dilemma that has killed other deployments of new spectrum. Manufacturers won’t commit to mass producing the needed gear until they know there is a market, and carriers are going to be leery about using the technology until there are standardized mass-market products available. This alone could kill the idea, just as the FCC’s plans for the LMDS and MMDS spectrum died in the late 1990s.
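Some back-of-the-envelope arithmetic with the figures above shows why the funding question matters so much. The per-person figures here are my own rough calculation, not Microsoft’s:

```python
# Rough cost per rural person for the locations Microsoft is not funding
# itself: the 23.4 million unserved minus the 2 million covered by the
# announced partnerships, against Microsoft's $8B-$12B estimate.
def cost_per_person(total_cost, people):
    return total_cost / people

remaining = 23.4e6 - 2.0e6  # 21.4 million people

for estimate in (8e9, 12e9):
    per_person = cost_per_person(estimate, remaining)
    print(f"${estimate / 1e9:.0f}B total -> ${per_person:,.0f} per person")
```

That works out to several hundred dollars per person before any revenue, which is why the question of who funds this looms so large.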

I think it’s also important to discuss a few important points that this whitepaper doesn’t talk about:

  • Microsoft never mentions the broadband data speeds that can be delivered with this technology. The whitepaper does talk about being able to deliver broadband to about 10 miles from a given tower. One channel of white space spectrum can deliver about 30 Mbps up to 19 miles in a point-to-point radio shot. From what I know of the existing trials, these radios can deliver speeds of around 40 Mbps at six miles in a point-to-multipoint network, with less speed as the distance increases. Microsoft wants multiple channels in a market because bonding multiple channels could greatly increase speeds, to perhaps 100 Mbps. Even with one channel this is great broadband for a rural home that’s never had broadband. But the laws of physics mean these radios will never get faster, and those will still be the speeds offered a decade or two from now, when they are going to feel like slow DSL does today. Too many broadband technology plans fail to recognize that our demand for broadband has been doubling every three years since 1980. What are pretty good speeds today can become inadequate in a surprisingly short period of time.
  • Microsoft wants to be the company that operates the wireless databases behind this and other spectrum. That gives them a profit motive to spur the use of these spectrum bands. There is nothing wrong with wanting to make money, but this is not a 100% altruistic offer on their part.
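The doubling claim in the first bullet compounds faster than intuition suggests; a quick sketch of the growth factor:

```python
# Growth in broadband demand if it doubles every three years,
# the historical trend cited above.
def demand_multiple(years, doubling_period=3):
    return 2 ** (years / doubling_period)

for years in (3, 6, 10, 20):
    print(f"after {years} years: demand is {demand_multiple(years):.1f}x today's")
```

At that pace a connection sized for today’s needs is roughly 10x short after a decade and about 100x short after two, which is the crux of the ‘slow DSL’ argument above.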

It’s hard to know what to conclude about this. Certainly Microsoft is not bringing broadband to all of rural America. But it sounds like they are willing to help make that happen. Still, we can’t ignore the huge hurdles that must be overcome to realize the vision painted in the Microsoft white paper.

  • First, the technology has to work and the interference issues I discussed in yesterday’s blog need to be solved for anybody to trust using this spectrum on an unlicensed basis. Nobody will use this spectrum if unlicensed users constantly get bumped off by licensed ones. The trials done for this spectrum to date were not done in a busy spectrum environment.
  • Second, somebody has to be willing to fund the $8B to $12B Microsoft estimates this will cost. There may or may not be any federal grants ever available for this technology, and there may never be commercial investors willing to spend that much on a new technology in rural America. The fact that Microsoft thinks this needs grant funding tells me that a business plan based upon this technology might not stand on its own.
  • Third, the chicken-and-egg issue of getting over the hurdle to have mass-produced gear for the spectrum must be overcome.
  • Finally, the FCC needs to adopt Microsoft’s view that there should be three unlicensed channels available everywhere – something that the license holders are strongly resisting. And from what I see from the current FCC, there is a good chance that they are going to side with the big cellular companies.

White Space Spectrum for Rural Broadband – Part I

Microsoft has announced that they want to use white space spectrum to bring broadband to rural America. In today’s and tomorrow’s blogs I’m going to discuss the latest thoughts on white space spectrum. Today I’ll discuss the hurdles that must be overcome to use the spectrum, and tomorrow I will discuss in more detail what I think Microsoft is really proposing.

The spectrum called white space has historically been used to broadcast television through the air. In the recent incentive auction the FCC got a lot of TV stations to migrate their signals elsewhere to free up this spectrum for broadband uses. And in very rural America much of this spectrum has sat unused for decades.

Before Microsoft or anybody can use this spectrum on a widespread basis the FCC needs to determine how much of the spectrum will be available for unlicensed use. The FCC has said for several years that they want to allocate at least one channel of the spectrum for unlicensed usage in every market. But Microsoft and others have been pushing the FCC to allocate at least three channels per market and argue that the white space spectrum, if used correctly, could become as valuable as WiFi. It’s certainly possible that the Microsoft announcement was aimed at putting pressure on the FCC to provide more than one channel of spectrum per market.

The biggest issue the FCC is wrestling with is interference. One of the best characteristics of white space spectrum is that it can travel great distances. The spectrum passes easily through things that kill higher frequencies. I remember as a kid being able to watch UHF TV stations in our basement that were broadcast from a tall tower in Baltimore 90 miles away. It is the ability to travel significant distances that makes the spectrum promising for rural broadband. Yet those great distances also exacerbate the interference issues.

Today the spectrum has numerous users. There are still some TV stations that did not abandon the spectrum. There are two bands used for wireless microphones. There was a huge swath of this spectrum just sold to various carriers in the incentive auction that will probably be used to provide cellular data. And the FCC wants to create the unlicensed bands. To confound things, the mix between the various users varies widely by market.

Perhaps the best way to understand white space interference issues is to compare them to WiFi. One of the best characteristics (and many would also say the worst) of WiFi is that it allows multiple users to share the bandwidth at the same time. These multiple uses cause interference, so no user gets full use of the spectrum, but this sharing philosophy is what made WiFi so popular: except in the most crowded environments, anybody can create an application using WiFi and know that in most cases the bandwidth will be adequate.

But licensed spectrum doesn’t work that way and the FCC is obligated to protect all spectrum license holders. The FCC has proposed to solve the interference issues by requiring that radios be equipped so that unlicensed users will first dynamically check to make sure there are no licensed uses of the spectrum in the area. If they sense interference they cannot broadcast, or, once broadcasting, if they sense a licensed use they must abandon the signal.

This would all be done by using a database that identifies the licensed users in any given area, along with radios that can search for licensed usage before making a connection. This sort of frequency scheme has never been tried before. Rather than sharing spectrum, like WiFi, the unlicensed user will only be allowed to use the spectrum when there is no interference. As you can imagine, the licensed cellular companies, which just spent billions for this spectrum, are worried about interference. But there are also concerns from churches, city halls and musicians who use wireless microphones.
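To make the check-before-transmit scheme concrete, here is a toy sketch of the database lookup an unlicensed radio would perform. The channel numbers, area names and data structure are purely illustrative assumptions, not the FCC’s actual database design:

```python
# Toy model of database-driven spectrum sharing: an unlicensed radio may
# use a channel only if no licensed user is registered for it in the
# local area, and must vacate if a licensed use appears.
licensed_uses = {
    # (channel, area) -> licensed user; entries are hypothetical examples
    (37, "county_seat"): "cellular_carrier",
    (38, "county_seat"): "wireless_microphones",
}

def may_transmit(channel, area):
    """True only if the database shows no licensed user on this channel here."""
    return (channel, area) not in licensed_uses

print(may_transmit(37, "county_seat"))  # False: a licensed carrier holds it
print(may_transmit(37, "rural_area"))   # True: the channel is clear here
```

The real scheme also requires radios to sense licensed signals on the fly and abandon the channel mid-transmission, which is exactly the behavior that worries prospective unlicensed users.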

It seems unlikely to me that unlicensed white space spectrum is going to be very attractive in an urban area where the spectrum is heavily used. If it’s hard to make or maintain an unlicensed connection then nobody is going to try to use the spectrum in a crowded-spectrum environment.

The question that has yet to be answered is whether this kind of frequency plan will work in rural environments. There have been a few trials of this spectrum over the past five years, but those tests really proved the viability of the spectrum for providing broadband and did not test the databases or the interference issues in a busy spectrum environment. We’ll have to see what happens in rural America once the cellular companies start using the spectrum they just purchased. Because of the great distances over which the spectrum is viable, I can imagine a scenario where the use of licensed white space in a county seat might make it hard to use the spectrum in adjoining rural areas.

And like any new spectrum, there is a chicken-and-egg situation with the wireless equipment manufacturers. They are not likely to commit to making huge amounts of equipment, which would make this affordable, until they know that this is really going to work in rural areas. And we might not know if this works in rural areas until there have been mass deployments. This same dilemma largely sank the LMDS and MMDS spectrum bands fifteen years ago.

The white space spectrum has huge potential. One channel can deliver 30 Mbps to the horizon on a point-to-point basis. But there is no guarantee that the unlicensed use of the spectrum is going to work well under the frequency plan the FCC is proposing.

AT&T’s Broadband Trials

John Donovan, the chief strategy officer for AT&T, spoke at the Mobile World Congress recently and said that the company was trying five different technologies for the last mile: WLL (wireless local loop), G.Fast, 5G, AirGig and fiber-to-the-premises. He said the company would be examining the economics of all of the different technologies. Let me look at each one in relation to AT&T.

Wireless Local Loop (WLL). The technology uses the company’s LTE bandwidth but utilizes a point-to-multipoint network configuration. By using a small dish on the house to receive the signal, the company gets better bandwidth than can be received from normal broadcast cellular. The company has been doing trials on various versions of the technology for many years. But there are a few recent trials of the newest technology, which AT&T will be using for much of its deployment in rural America as part of the CAF II plan. That plan requires the ISP to deliver at least 10/1 Mbps. AT&T says the technology is delivering speeds of 15 to 25 Mbps. The company says that even at the edge of a cellular network a customer can get 10 Mbps about 90% of the time.

G.Fast. This is a technology that uses high frequencies to put more bandwidth on telephone copper wire. Speeds are reported to be as high as 500 Mbps, but only for very short distances of under 200 feet. AT&T recently announced a G.Fast trial in an apartment building in Minneapolis. The technology is also being tested by CenturyLink and Windstream. All of these trials use the existing telephone copper inside apartment buildings to deliver broadband, so this is not really a last mile technology. AT&T brings fiber to the apartment complex and then uses G.Fast as an inside-wire technology. If they find it to be reliable this would be a great alternative to rewiring apartments with fiber.

5G. AT&T recently announced a few trials of early 5G technologies in Austin. They are looking at several technology ideas such as carrier aggregation (combining many frequencies). But these are just trials, and AT&T is one of the companies helping to test pre-5G ideas as part of the worldwide effort to define the 5G specifications. These are not tests of market-ready technologies, but are instead field trials of various concepts needed to make 5G work. There is no doubt that AT&T will eventually replace LTE wireless with 5G wireless, but that transition is still many years in the future. The company is claiming to be testing 5G for the press-release benefits – but these are not tests of a viable last mile technology, just early field trials of lab concepts.

AirGig. This one remains a mystery. AT&T says it will begin trialing the technology later this year with two power companies. There has been a little bit of clarification of the technology since the initial press release. This is not a broadband-over-powerline technology – it’s completely wireless and uses the open lines-of-sight along the tops of power poles to create a clear path for millimeter wave radios. The company has also said that it doesn’t yet know which wireless technology will be used to go from the poles into the home – the whole range of licensed spectrum is under consideration, including the LTE frequencies. If that’s the case, then AirGig is a fiber replacement, and the delivery to homes would be about the same as WLL.

FTTP. Donovan referred to fiber-to-the-home as a trial, but by now the company understands the economics of fiber. The company keeps stretching the truth a bit about their fiber deployments. They keep saying that they have deployed fiber to 4 million homes, with 8 million more coming in the next three years. But the fact is they have actually only passed the 4 million homes that they can market to, as disclosed on their own web site. The twelve-million-home target was dictated by the FCC as part of the settlement allowing the company to buy DirecTV.

We don’t know how many fiber customers AT&T has. They are mostly marketing this to apartment buildings, although there are residential customers around the country saying they have it. But they have not sold big piles of fiber connections like Verizon FiOS. This can be seen by looking at the steady drop in total AT&T data customers – 16.03 million in 2014, 15.78 million in 2015 and 15.62 million at the end of the third quarter of 2016. AT&T’s fiber is not really priced to be super-competitive, except in markets where they compete with Google Fiber. Their normal prices elsewhere on fiber are $70 for 100 Mbps, $80 for 300 Mbps and $99 for a gigabit.

The Resurgence of Rabbit Ears

There is perhaps no better way to understand the cord-cutting phenomenon than by looking at the booming sales of the home TV antennas known as ‘rabbit ears’ used to receive local television off the airwaves. A study released by Parks Associates shows that 15% of households now use rabbit ears, which is a pretty amazing statistic. That is up from 8% of households as recently as 2013. And I recall an earlier time when this had fallen below 5%.

For the longest time the TV-watching public was counted in three groups – those who had cable TV (including satellite), those that used rabbit ears to watch local TV only, and those with no TV. We now have a fourth category – those that only watch OTT programming such as Netflix.

I was once in the category of not watching TV at all. I remember twenty years ago I went to Circuit City (now gone) to consider buying a set of rabbit ears, and the clerks there weren’t even sure if the store carried them. After some asking around, they found a few units of one brand that had been gathering dust.

But today there is a resurgence in rabbit ears and there are easily a dozen major brands, with new options coming on the market all the time. For example, Sling TV just launched AirTV, a $99 box that integrates Sling TV, Netflix and high-quality rabbit ears together with a voice-activated remote control that makes it easy to cut the cord. This looks to be one of the better voice-activation systems around and lets you search programming by show name, actor name or genre.

Since most people have had cable TV for a long time, many have no idea what they can receive off-air for free. The FCC has an interesting map that shows the expected reception in your area. In my case the map shows that I can get a strong signal from every major network including CW and PBS, along with signals from MyTV, Univision and a few independent local stations.

The Parks study also looks at other industry statistics. A few of the most interesting ones include:

  • Penetration of pay-TV was down to 81% in 2016 and has fallen every year since 2014. Parks cites the normal reasons for the decline including the growth of OTT programming, the increasing cost of a cable TV subscription and growing consumer awareness that there are viable alternatives to cable TV.
  • Satisfaction with pay-TV keeps dropping and only one-third of households now say that they are very satisfied with their pay-TV service.
  • OTT viewing continues to rise and 63% of US households now subscribe to at least one OTT offering like Netflix while 31% of households subscribe to more than one.
  • In 2016 12% of households downgraded their pay-TV service (meaning dropped it or went to a less expensive option). This was double the percentage (6%) who upgraded their pay-TV service in 2016.
  • Very few cord nevers (those who have never had cable TV) are deciding to buy pay-TV, with only 2% of them doing so in 2016. This is the statistic that scares the cable companies because cord nevers include new Millennial households. This generation is apparently not interested in being saddled with a pay-TV subscription. In past generations the percentage of new homes that bought pay-TV closely matched the overall penetration of the market – buying TV was something you automatically did when you moved to a new place.

These statistics show how much choice the OTT phenomenon has brought to the marketplace. Ten years ago no industry expert would have predicted the resurgence of rabbit ears. In fact, rabbit ears were associated with obsolete technologies like buggy whips and were the butt of jokes about those who didn't like the modern world. That is no longer true – new rabbit ear homes are perhaps some of the most tech-savvy households, ones that know they can craft an entertainment platform without sending a big check to a cable company.


The FCC’s Cable Price Report

Once a year the FCC releases a Report on Cable Industry Prices, and this year's report came out a few weeks ago. The current report has some very odd findings that make me think this report is no longer needed.

The report looked at the prices charged for basic cable and expanded basic cable in 485 US communities, some where the FCC has granted a declaration of effective competition and others without one.

I think the results shown in the report are off because the findings show average rate increases that are far below what is reported everywhere else in the industry. The FCC says that the price of basic cable increased by only 2.3% over the last year to reach a price of $23.79. More surprisingly, the average price of expanded basic cable increased by only 2.7% to reach $69.03 which was slightly lower than the increase in inflation. This compares to the 10-year historical average of 4.8% increases per year from this same report.
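For reference, the report's percentages can be sanity-checked by backing out the implied prior-year prices (a rough back-of-the-envelope sketch; the FCC's exact underlying figures may differ slightly due to rounding):

```python
# Back out the implied prior-year prices from the FCC's reported
# new prices and percentage increases (all rounded, so approximate).

basic_new, basic_pct = 23.79, 0.023        # basic cable: +2.3%
expanded_new, expanded_pct = 69.03, 0.027  # expanded basic: +2.7%

basic_prior = basic_new / (1 + basic_pct)
expanded_prior = expanded_new / (1 + expanded_pct)

print(f"implied prior basic rate:    ${basic_prior:.2f}")     # ~$23.26
print(f"implied prior expanded rate: ${expanded_prior:.2f}")  # ~$67.22
```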

The increase in basic cable might be accurate because there are years when many companies don't increase this rate. But the expanded basic rate increase is baffling. I wrote a blog post at the beginning of the year showing much larger increases this year for all of the big cable companies except Charter, which held rates down due to its impending merger – and then caught up later in the year.

I think that perhaps the FCC is no longer asking the right questions. It’s certainly possible that the published prices for expanded basic cable increased as they have said – but that doesn’t tell us anything about what customers are really paying.

I suspect the FCC is not picking up the plethora of new ‘fees’ that are being used to disguise the price of cable. These might be called network programming fees to cover the cost of buying local programming. Or they might be called sports charges to cover the ever-rising cost of sports programming. Every big company labels these fees a little differently. But these fees are part of the cable bill that people pay each month and the primary purpose of the fees is to allow the cable companies to claim lower cable rates. These fees also confuse customers who often think they are taxes. My guess is that the FCC did not include these fees – and they must be included because they are nothing more than a small piece of the cable bill labeled differently.

Additionally, I’ve seen a number of estimates that say that around 70% of households buy cable as part of a bundle, and for these households the change in the list price of the components of the bundle doesn’t matter – customers only care about the overall increase in the price of the bundle. Customers don’t know or care which piece of the bundle increases since they are rarely shown the cost of bundle components.

And this leads to the fact that cable companies have recently begun increasing the prices of other products in order to keep cable rates looking lower. Rather than raise the price of cable they might instead raise the fees mentioned above, the price of the cable modem or settop box, or the price of broadband. All the cable companies care about – and all most customers see – is the increase in the total bill.

Finally, we know that there are now many different rates in every market. Cable companies sell specials or negotiate contract renewals with customers. At CCG we often gather customer bills to try to understand a market, and we often see customers with identical packages whose prices vary by as much as $10 or $15. None of this variation in actual rates makes it into the FCC report. I think this report only looks at the published list prices, and those are largely irrelevant since they don't reflect what customers really pay.

So I think the usefulness of this report is over. If I recall correctly, this report was mandated by Congress, so the FCC is probably obligated to keep producing it. But the results it now shows have almost nothing to do with the rates that customers actually pay for cable TV in the real world.

ESPN and the Cable Industry

I've been writing periodically about ESPN because they seem to be the poster child for what is happening to cable TV and to programmers in this country. It's been obvious over the last year or two that ESPN is bleeding customers, and most of the many articles about the network concentrate on that issue.

ESPN is a good bellwether for the industry because they are carried by practically every cable TV provider, and because their contracts require that the channel be carried in the expanded basic tier – the tier that generally has between 50 and 75 channels. Only a few tiny rural cable systems don’t carry ESPN since they carry only a small number of channels.

When ESPN loses customers it can only be for one of two reasons – people who cut the cord and drop cable altogether, or cord shavers who downsize to the smallest basic cable package. Basic cable is the small package of 10 – 15 channels that includes the local network affiliates, government channels and a few cheap throw-ins like shopping channels.

But it’s not easy to figure out the real number of cord cutters and cord shavers. The largest cable companies report total subscriber numbers each quarter but they don’t report on the packages that customers buy. Various analysts estimate the number of cord cutters each quarter, but they differ on these estimates – and I haven’t seen anybody try to estimate the number of cord shavers.

Nielsen tracks the number of customers of each cable network, and that tells us how the various networks are faring. The latest article on ESPN comes from Sports TV Ratings, a website that tracks subscribers to the various sports networks. That site shows that ESPN lost 621,000 subscribers in just the last month (October 2016). That is an astounding number since ESPN has roughly 89 million customers – a drop of seven-tenths of a percent in one month, which annualized would be over 8% of ESPN's customers.

But that number may not be a huge aberration. FierceCable reported earlier this year that ESPN had lost 2.2 million customers between February and August of this year, which is a clip of 440,000 lost customers per month. And the network has lost more than 11 million customers since its peak in 2013 when it had almost 100 million customers.
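The arithmetic behind these loss rates is easy to check (a quick sketch using the rounded subscriber counts cited above, not exact Nielsen data):

```python
# Rough arithmetic behind the ESPN subscriber-loss figures cited above.
# All counts are the rounded numbers from the article.

subscribers = 89_000_000   # approximate ESPN subscriber base
monthly_loss = 621_000     # reported loss for October 2016

monthly_rate = monthly_loss / subscribers
print(f"monthly loss rate: {monthly_rate:.2%}")  # ~0.70%

# Simple (non-compounding) annualization of that single month's pace
print(f"annualized: {monthly_rate * 12:.1%}")    # ~8.4%

# FierceCable: 2.2 million lost between February and August,
# treated here as roughly five monthly intervals
print(f"per month: {2_200_000 / 5:,.0f}")        # 440,000
```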

Trying to count cord shaving gets even more complicated because of OTT content. The cited drop of 621,000 ESPN customers comes from the Nielsen numbers for carriage on cable systems, which don't include online services that carry ESPN. For instance, the basic package on Sling TV includes ESPN, and Goldman Sachs estimated that Sling TV will have almost 2 million customers by the end of this year. There are a number of new OTT offerings just hitting the market that will include the network, but for now Sling TV has most of the online ESPN subscribers.

ESPN has an advantage over many other networks in that it probably can add back customers by selling to people directly on the web. And so perhaps the network can find an equilibrium number of customers at some lower threshold than today. But this is not going to be true for a lot of other content. As an example, in October the Golf Channel lost 600,000 subscribers and The Major League Baseball Channel lost 515,000 customers – and those kinds of networks have very limited appeal on a standalone basis. That is the real story behind the losses at ESPN – the vast majority of cable networks are bleeding customers right now.

Some of the content providers are not too worried about the drop of US cable customers since they are picking up far greater numbers of new customers worldwide right now. But networks that are US-centric – sports, news, weather – are in for a rough ride over the next few years as the industry settles out to a new and lower norm. I think we can expect to see a transformation of sports programming as the numerous sports networks bleed customers. This probably means more emphasis on live programming and fewer sports networks.

Technology and Telecom Jobs

In case you haven't noticed, the big companies in the industry are cutting a lot of jobs – maybe the biggest job cuts ever in the industry. These cuts are due to a variety of reasons, but technology change is a big contributor.

There have been a number of announced staff cuts by the big telecom vendors. Cisco recently announced it would cut as many as 5,500 jobs, or about 7% of its global workforce. Cisco's cuts are mostly due to the Open Compute Project, through which the big data center owners like Facebook, Amazon, Google and Microsoft have turned to developing and directly manufacturing their own routers, switches and data center gear. Cloud services are meanwhile wiping out the need for corporate data centers as companies move most of their computing to the much more efficient cloud. Even customers that are still buying Cisco boxes are buying fewer of them, since the newer technology provides a huge increase in capacity over older gear.

Ericsson has laid off around 3,000 employees due to falling business. The biggest culprit for them is SDN (Software Defined Networking). Most of the layoffs are related to cell site electronics. The big cellular companies are actively converting their cell sites to centralized control, with the brains in the network core. This lets a carrier make one change and have it instantly implemented in tens of thousands of cell sites; today that process requires upgrading the brains at each cell site, with a horde of technicians traveling to update each one.
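As a loose illustration of why centralized control eliminates so much labor, the SDN model turns thousands of per-site upgrades into a single push from the core (a toy sketch with made-up class and site names, not any vendor's actual API):

```python
# Toy illustration of the operational difference described above:
# legacy networks update the "brains" at every cell site individually,
# while an SDN controller holds the brains centrally and pushes one change.
# All names here are hypothetical, not a real vendor interface.

class CellSite:
    def __init__(self, name):
        self.name = name
        self.config_version = 1  # legacy: updated on-site, one at a time

class SdnController:
    """Central 'brains in the core' for a fleet of cell sites."""
    def __init__(self, sites):
        self.sites = sites

    def push_config(self, version):
        # One change, applied fleet-wide from the core - no truck rolls.
        for site in self.sites:
            site.config_version = version

sites = [CellSite(f"site-{i}") for i in range(10_000)]
SdnController(sites).push_config(version=2)
print(all(s.config_version == 2 for s in sites))  # True
```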

Nokia plans to lay off at least 3,000 employees and maybe more. Part of these layoffs is due to the final integration of its purchase of Alcatel-Lucent, but they also stem from the technology changes that are affecting every vendor.

Cuts at operating carriers are likely to be a lot larger. A recent article in the New York Times reported that internal AT&T projections had the company planning to eliminate as many as 30% of its jobs over the next few years, which would be 80,000 people and the biggest telco layoff ever. The company has never officially mentioned a number, but top AT&T officials have been warning all year that many job functions at the company are going to disappear and that only nimble employees willing to retrain have any hope of keeping a long-term job.

AT&T will be shedding jobs for several reasons. One is the big reduction in technicians needed to upgrade cell sites. But an even bigger reason is the company's plan to decommission and walk away from huge amounts of its copper network. There is no way to know if the 80,000 number is valid, but even a reduction half that size would be gigantic.

And vendor and carrier cuts are only a small piece of the cuts that are going to be seen across the industry. Consider some of the following trends:

  • Corporate IT staffs are downsizing quickly as computing functions move to the cloud. Huge numbers of technicians with Cisco certifications, for example, are finding themselves out of work as their companies eliminate their in-house data centers.
  • On the flip side of that, huge data centers are being built to take over these same IT functions with only a tiny handful of technicians. I’ve seen reports where cities and counties gave big tax breaks to data centers because they expected them to bring jobs, but instead a lot of huge data centers are operating with fewer than ten employees.
  • In addition to employees, there are fleets of contractor technicians who do things like updating cell sites, and those opportunities are going to dry up over the next few years. There will always be work for technicians brave enough to climb cell towers, but that is not a giant source of demand.

It looks like over the next few years there are going to be a whole lot of unemployed technicians. Technology companies have always been cyclical, and it's never been unusual for engineers and technicians to work for a number of different vendors or carriers during a career, but in the past when there was a downsizing in one part of the industry the slack was picked up somewhere else. This time we might be looking at a permanent downsizing. Once SDN networks are in place, those jobs are not coming back. Once most IT functions are in the cloud, those jobs aren't coming back. And once rural copper networks are replaced with 5G cellular, those jobs aren't coming back.