New Technology – October 2017

I’ve run across some amazing new technologies that hopefully will make it to market someday.

Molecular Data Storage. A team of scientists at the University of Manchester recently made a breakthrough with a technology that allows high volumes of data to be stored within individual molecules. They’ve shown the ability to create high-density storage that could store 25,000 gigabits of data on something the size of a quarter.

They achieved the breakthrough using molecules that contain the element dysprosium (that’s going to send you back to the periodic table) cooled to a temperature of -213 Celsius. At that temperature the molecules retain magnetic alignment. Previously this required cooling molecules to -259 C. The group’s goal is to find a way to do this at -196 C, the temperature of affordable liquid nitrogen, which would make this a viable commercial technology.
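To put those numbers in context, here’s a minimal sketch (plain Python, using only the figures from the article) that converts the temperatures to Kelvin, which makes the remaining gap to liquid nitrogen easier to see:

```python
# Temperatures from the article, converted to Kelvin (K = C + 273.15).
temps_c = {
    "previous molecular magnets": -259,   # ~14 K
    "Manchester breakthrough": -213,      # ~60 K
    "liquid nitrogen target": -196,       # ~77 K
}

for label, c in temps_c.items():
    print(f"{label}: {c} C = {c + 273.15:.0f} K")

# The team has already raised the working temperature from ~14 K to
# ~60 K; reaching the ~77 K of cheap liquid nitrogen is the last step.
```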

The most promising use of this kind of dense storage would be in large data centers, since this storage is 100 times denser than existing technologies. This would make data centers far more energy efficient while also speeding up computing. That kind of improvement matters, since there are predictions that within 25 years data centers will be the largest users of electricity on the planet.

Bloodstream Electricity. Researchers at Fudan University in China have developed a way to generate electricity from a small device immersed in the bloodstream. The device uses stationary nanoscale carbon fibers that act like a tiny hydropower generator. They’ve named the device the ‘fiber-shaped fluidic nanogenerator’ (FFNG).

Obviously there will need to be a lot of testing to make sure that the devices don’t cause problems like blood clots. But the devices hold great promise. A person could use these devices to charge a cellphone or wearable device. They could be used to power pacemakers and other medical devices. They could power implanted chips used to monitor and track farm animals, or be used to monitor wildlife.

Light Data Storage. Today’s theme seems to be small, and researchers at Caltech have developed a small computer chip that is capable of temporarily storing data using individual photons. This is the first team that has been able to reliably capture photons in a readable state on a tiny device. This is an important step in developing quantum computers. Traditional computers store data as either a 1 or a 0, but quantum computers can also store data that is both a 1 and a 0 simultaneously. This has been shown to be possible with photons.

Quantum computing devices need to be small and operate at the nanoscale because they hold data only fleetingly until it can be processed, and nanochips allow for rapid processing. The Caltech device is tiny, around the size of a red blood cell. The team was able to store a photon for 75 nanoseconds, and the ultimate goal is to store information for a full millisecond.

Photon Data Transmission. Researchers at the University of Ottawa have developed a technology to transmit a secure message using photons that carry more than one bit of information. This is a necessary step in developing data transmission using light, which would free the world from the many limitations of radio waves and spectrum.

Radio wave data transmission technologies send one bit of data at a time with each passing wavelength. Being able to send more than one bit of data with an individual photon creates the possibility of being able to send massive amounts of data through the open atmosphere. Scientists have achieved the ability to encode multiple bits on a photon in the lab, but this is the first time it’s been done through the atmosphere in a real-world application.
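The underlying math is simple to illustrate. This is not the Ottawa team’s actual encoding scheme, just the general principle: a photon prepared in one of d distinguishable states (a ‘qudit’) carries log2(d) bits, so a 4-dimensional state carries two bits instead of one:

```python
import math

# Illustration of the qudit principle (not the researchers' actual
# encoding): a photon with d distinguishable states carries log2(d) bits.
for d in (2, 4, 8, 16):
    print(f"{d}-state photon -> {math.log2(d):.0f} bits per photon")
```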

The scientists are now working on a trial between two locations that are almost three miles apart and that will use a technique called adaptive optics to compensate for atmospheric turbulence.

There are numerous potential uses for the technology in our industry. It could be used to create ultrahigh-speed connections between a satellite and earth. It could be used to transmit data without fiber between locations with a clear line-of-sight. It could be used as a secure method of communicating with airplanes, since narrow light beams are extremely difficult to intercept or hack.

Another use of the technology is to leverage the ability of photons to carry more than one bit of data to create a new kind of encryption that should be nearly impossible to break. The photon data transmission allows for the use of 4D quantum encryption to carry the keys needed to encrypt and decrypt packets, meaning that every data packet could use a different encryption scheme.

The Competition Dilemma

One of the most perplexing issues for fiber overbuilders is what I call the competition dilemma. That is where the big cable companies like Comcast will match the prices of any major competitor in their footprint, making it impossible for a competitor to ever get a price advantage.

A lot of fiber overbuilders enter the market and hope to gain customers by offering lower prices. You saw this when Google Fiber offered a gigabit broadband connection for $70, and I see the same thing from many smaller ISPs. But any price advantage disappears if the large incumbent cable company matches the lower prices.

This is an interesting dilemma for municipal cable systems. They often enter the market with a goal of lowering prices in their market. And when the incumbent provider matches their prices the municipality has achieved its goal, since everybody in the city then benefits from lower prices.

But this comes at a cost. Lower prices mean lower margins, and any ISP that lowers prices is hurting their own bottom line. You would think that lower prices also hurt the incumbent providers, but the big ISPs have the advantage of being able to charge more in surrounding communities to offset lower margins where there is competition. They factor in competition when setting their nationwide prices, so it can be argued that competition doesn’t really hurt big companies at all – they make up for competitive losses by charging a little more everywhere else.

There doesn’t seem to be any limit on how low an incumbent provider will go to match prices. Take the example of the cable TV product on the city-owned Click! Network in Tacoma, WA. For many years the city didn’t raise cable prices, and Comcast matched their low pricing. Over time the cable prices in Tacoma were over 30% lower than prices in the Tacoma suburbs and nearby cities like Seattle. The customers in the city benefitted from low cable rates, but the city was losing money on cable TV and over time raised their rates back to the market rates.

This issue is going to be in the news a lot more in the future. In a recent blog I talked about an analyst who believes that Comcast is going to double their broadband rates over the next few years. Even if their rate increases aren’t that drastic I think it’s obvious that they plan to raise rates. This is probably the number one reason they have been lobbying hard to get rid of Title II regulation, since that is the only tool that regulators could use to examine and react to broadband rate increases.

If Comcast and the other big ISPs undertake regular broadband price increases they will create an interesting dynamic in the industry. Anybody with a competing network is going to have to decide if they are going to raise rates to match them. It’s going to be tempting to do so because increases in broadband rates flow 100% straight to the bottom line. But if a competitor doesn’t raise rates, then it’s likely that the big ISPs will raise rates everywhere except where there is significant competition. And that would result in a big difference in broadband prices between markets with and without a competitor.

It’s also likely that as the big ISPs raise broadband rates they will be inviting competitors into the market. I create a lot of financial business plans and there are many markets where it’s hard to make a business case for building fiber at today’s broadband rates. But raise those rates and a lot more business plans become attractive.

The final issue raised by the competition dilemma is customer choice. Most cities desperately want competition in their markets because they can see the large cable companies becoming near-monopolies. One of the primary reasons why cities build fiber networks or lure ISPs to do so is to provide more choice. But you have to ask what kind of choice customers really get when there is no price difference between a competitor and the incumbent.

When Disaster Strikes

As many of you who read this blog know, I lived for nearly a decade on St. Croix in the US Virgin Islands. Since all three of the US Virgin Islands as well as Puerto Rico were devastated by the two recent hurricanes, I thought I’d talk a bit about how our industry responds to disasters.

Disasters have always been with us in the telecom industry. There have been other hurricanes that have knocked down wires and poles in the past. I have a number of clients who have experienced crippling ice storms. I have clients in the West who have lost networks to wildfires. And I’ve had clients all over the country who have suffered from massive flooding.

I witnessed the impact of a hurricane on St. Croix when category 3 hurricane Omar hit the island in 2008. The hurricane itself was bad enough. A wall of water came down the hill behind my house, burst through the French doors at the rear, and streamed through and out my front door. Then, at the very end of the storm, I took a direct hit from a tornado – one of the often-forgotten impacts of hurricanes.

As bad as these storms are, it’s the aftermath that is the most devastating. I was without power for over six weeks, meaning that my consulting work came to a screeching halt. But it took those whole six weeks anyway to clean the mud out of my house and to cut up the hundred dead trees around the property, including a magnificent hundred-year-old mahogany tree. And while there are always mosquitoes in the Caribbean, after the flooding from a hurricane they come in dense clouds, making it miserable to work outside. What I remember most about that period is that my world shrank and all of my energy was needed to deal with the effects of the storm. I also learned how much I rely on electricity, refrigeration and lights in a place that gets dark at 6:00 PM every day. It’s mind-boggling to think that there are millions of Americans who will be without power for months.

A category 3 hurricane is strong enough to send trees crashing through overhead wires, and so there were wires down all over the island. But only a minimal number of poles were broken, and so the task of restoring power and telephone wires just needed lots of crews with cherry-pickers. Our island was the only place hit by Omar, and crews from St. Thomas, Tortola and Puerto Rico came to help with the recovery. The island was so grateful that we threw a huge, well-deserved parade and party for the repair crews when they were finished.

It was the response from work crews from other islands that made all of the difference. We see the same thing here in the US all of the time. One of my clients got devastated by hurricane Katrina and work crews from all over the US rushed to help. We see this after every stateside disaster as telecom and power crews from elsewhere rush to aid a utility in trouble.

And that is the big problem right now in the Caribbean. St. Thomas and the British Virgin Islands were devastated by hurricane Irma. The storm was so strong that it snapped the majority of the utility poles in St. Thomas, meaning the work effort needed to restore the island is going to be massive. Since St. Croix got only minor damage in that storm it became the staging area for the work effort to help St. Thomas and St. John. But then two weeks later St. Croix and Puerto Rico were flattened by hurricane Maria.

We now have the unprecedented situation where all of the islands in the region lost their utility infrastructure at the same time. This presents an almost unsolvable logistical challenge of somehow getting the resources in place to get the islands back up and running. As bad as the Virgin Islands are right now, it’s almost impossible for the mind to grasp the amount of damage in Puerto Rico, with its rough terrain and 3 million people still without power.

No utility can shoulder the cost of the repair efforts from a bad natural disaster. In the US the federal government has always jumped in to fund some of the needed recovery. The crews that rush in to help don’t ask first about getting paid; they assume they will eventually be reimbursed for their costs. The FCC quickly approved $76.9 million towards the recovery effort for the Virgin Islands and Puerto Rico. But that’s just a start on the cost of fixing the damage – I have colleagues working on St. Thomas and their first quick estimate of the utility damage there was almost $60 million. I imagine the final number for all of the islands is going to be astronomical.

I know that if there were an easy way to get there, many of the telco and power companies in the US would be sending crews to help the islands. It’s going to be hard enough just getting the needed poles, cables and electronics to the islands. It’s frustrating to know that the logistics challenges mean that the repair will take a long time. It won’t be surprising to still see parts of Puerto Rico without electricity six months from now – and that is heartbreaking.

Cable Systems Aren’t All Alike

Big cable companies all over the country are upgrading their networks to DOCSIS 3.1 and announcing that they will soon have gigabit broadband available. Some networks have already been upgraded and we are seeing gigabit products and pricing springing up in various markets around the country. But this does not mean that all cable networks are going to be capable of gigabit speeds, or even that all cable networks are going to upgrade to DOCSIS 3.1. As the headline of this blog says, all cable systems aren’t alike. Today’s blog looks at what that means as it applies to available broadband bandwidth.

A DOCSIS cable network is effectively a radio network that operates only inside the coaxial cable. This is why you will hear cable network capacity described using megahertz, which is a measure of the frequency of a radio transmission. Historically cable networks came in various frequency sizes such as 350 MHz, 650 MHz or 1,000 MHz.

The size of the available frequency, in megahertz, describes the capacity of the network to carry cable TV channels or broadband. Historically one analog TV channel occupies about 6 MHz of frequency – meaning that a 1,000 MHz system can transmit roughly 167 channels of traditional analog TV.
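The arithmetic is easy to sketch (illustrative Python; real systems also reserve spectrum for upstream traffic and guard bands, so usable counts are lower):

```python
# How many 6 MHz analog channel slots fit in cable systems of
# various historical sizes.
CHANNEL_WIDTH_MHZ = 6

for system_mhz in (350, 650, 1000):
    slots = system_mhz / CHANNEL_WIDTH_MHZ
    print(f"{system_mhz} MHz system: ~{slots:.0f} analog channel slots")
# A 1,000 MHz system yields ~167 slots, matching the figure above.
```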

Obviously cable networks carry more channels than this, which is why you’ve seen cable companies upgrade to digital systems. The most commonly used digital compression scheme can squeeze six digital channels into the same frequency that carries one analog channel. There are newer compression techniques that can squeeze even more digital channels into one slot.

In a cable network each slice of available frequency can be used either to transmit TV channels or to carry broadband. If a cable company wants more broadband capacity it must create room for the broadband by reducing the number of slots used for TV.

It is the overall capacity of the cable network along with the number of ‘empty’ channel slots that determines how much broadband the network can deliver to customers. A cable system needs roughly 24 empty channel slots to offer gigabit broadband download speeds. It’s a lot harder to carve out enough empty channels on smaller-capacity networks. An older cable system operating at 650 MHz has significantly less capacity for broadband than a newer urban system operating at 1,000 MHz or greater.
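To see where the 24-slot figure comes from, here’s a rough sketch. The ~42 Mbps per slot is an assumption for illustration, in line with the throughput commonly cited for a 256-QAM 6 MHz channel:

```python
# Rough downstream capacity from bonded empty channel slots,
# assuming ~42 Mbps of usable throughput per 6 MHz slot (256-QAM).
MBPS_PER_SLOT = 42  # assumption for illustration

for empty_slots in (8, 16, 24):
    print(f"{empty_slots} empty slots -> ~{empty_slots * MBPS_PER_SLOT} Mbps")
# 24 slots * ~42 Mbps = 1,008 Mbps - just over a gigabit, which is
# why it's so hard to find the room on a 650 MHz system.
```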

One of the primary benefits of DOCSIS 3.1 is the ability to combine any number of empty channels into a single broadband stream. But the task of upgrading many older networks to DOCSIS 3.1 is not just a simple issue of upgrading the electronics. If a cable company wants the faster broadband speeds it also needs to upgrade the overall capacity of the network. And the upgrade from 350 MHz or 650 MHz to 1,000 MHz is often expensive.

The higher capacity network has different operating characteristics that affect the outside cable plant. For example, the placement and spacing of cable repeaters and power taps is different in a higher frequency network. In some cases the coaxial cable used in an older cable network can’t handle the higher frequencies and must be replaced. So upgrading an older cable network to get faster speeds often means making a lot of changes in the physical cable plant. To add to the cost, this kind of upgrade also usually means having to change out most or all of the cable set-top boxes and cable modems – an expensive undertaking when every customer has multiple devices.

The bottom line of all of this is that it’s not necessarily cheap or easy to upgrade older or lower-capacity cable networks to provide faster broadband. It takes a lot more than upgrading the electronics to get faster speeds, and often means upgrading the physical cable plant and replacing set-top boxes and cable modems. Cable operators with older networks have to do a cost/benefit analysis to see if it’s worth the upgrade cost to get faster broadband. Since most older cable systems are in rural small towns, this is one more hurdle that must be overcome to provide faster broadband in rural America.

TiVo’s Latest Video Trend Report

TiVo just released their Q2 2017 Online Video and Pay-TV Trends Report, and as usual it’s full of interesting statistics about the cable and video industries.

They looked in detail at those without a traditional pay-TV service – the cord cutters and cord nevers. They found the following:

  • One-fourth left cable TV during the last 12 months – proof that cord cutting is a real phenomenon.
  • 7% use an antenna to get over-the-air free TV.
  • Over 85% report that they don’t have cable TV because it costs too much.
  • Almost 46% use an OTT service like Netflix, Hulu or Amazon.

For people that still have a cable TV subscription:

  • 53% are satisfied with the value of their cable and another 31% are very satisfied. Those percentages are higher than past quarters, possibly due to the dissatisfied cord-cutters leaving cable.
  • In a statistic that might surprise many, only 15% are dissatisfied with the value of their cable subscription.
  • 47% of households now pay between $50 and $75 per month for cable. Over 36% of households spend more than $100 per month for cable, with 10% of households spending more than $150 per month.

TiVo looked at those who plan to change TV service in the next year:

  • 6% plan to cut the cord
  • 8% plan to change to another TV provider
  • 31% say that they have thought about cutting the cord.
  • 56% would change to an a la carte TV offering that let them buy only the channels they want.
  • 39% said they would be more willing to cut the cord if there was some easy way to navigate between OTT providers.

Many are still buying premium movie channels:

  • 27% of households are buying HBO, up from 22% just a year ago.
  • 17% are buying Showtime
  • 17% are buying the Movie Channel
  • 13% buy a premium sports package
  • 13% buy Starz
  • 12% buy Cinemax

They also looked at TV viewing habits:

  • 89% of households watch TV on a daily basis
  • 67% of homes watch recorded (DVR / DVD) content on a daily basis.
  • 63% watch OTT content on a daily basis

Households are largely loyal to a handful of channels and shows:

  • Over 80% of homes report that they watch 10 or fewer different channels of content.
  • 59% of households watch 5 or fewer different shows per week.
  • 83% of households watch 10 or fewer shows per week.

Households were asked what they like most about OTT services:

  • 59% like services where each family member can create their own profile.
  • 56% like the lower prices of the service
  • 46% like auto-play of episodes where the next show comes on automatically

TV Everywhere still doesn’t have universal acceptance:

  • Just over 50% of households are aware that their cable service offers TV Everywhere
  • Just over 1/3 of households actually use TV Everywhere to watch content on cell phones, tablets, etc.

CAF II and Wireless

Frontier Communications just announced that they are testing the use of wireless spectrum to complete the most rural portions of their CAF II build-out requirement. The company accepted $283 million per year for six years ($1.7 billion total) to upgrade broadband to 650,000 rural homes and businesses. That’s a little over $2,600 per location passed. The CAF II program requires that fund recipients increase broadband to speeds of at least 10 Mbps down and 1 Mbps up.
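For reference, the per-location arithmetic behind those numbers is simple:

```python
# CAF II support figures from the paragraph above.
annual_support = 283_000_000  # dollars per year
years = 6
locations = 650_000

total = annual_support * years
print(f"Total support: ${total:,}")                 # $1,698,000,000 (~$1.7B)
print(f"Per location:  ${total / locations:,.0f}")  # ~$2,612
```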

Frontier will be using point-to-multipoint radios where a transmitter is mounted on a tower, with the broadband signal then sent to a small antenna at each customer’s location. Frontier hasn’t said what spectrum they are using, but in today’s environment it’s probably a mix of 2.4 GHz and 5 GHz WiFi spectrum and perhaps also some 3.65 GHz licensed spectrum. Frontier, along with CenturyLink and Consolidated, told the FCC a year ago that they would be interested in using the spectrum in the ‘citizens’ radio band’ between 3.7 GHz and 4.2 GHz for this purpose. The FCC opened a docket looking into this spectrum in August, and comments in that docket were due to the FCC last week.

I have mixed feelings about using federal dollars to launch this technology. On the plus side, if this is done right the technology can deliver bandwidth up to 100 Mbps, and in a full deployment speeds can be engineered to deliver a consistent 25 Mbps download. But those kinds of speeds require an open line-of-sight to customers, tall towers that are relatively close to customers (within 3 – 4 miles), and towers that are fiber fed.

But when done poorly the technology delivers much slower broadband. There are WISPs using the technology to deliver speeds that don’t come close to the FCC’s 10/1 Mbps requirement. They often can’t get fiber to their towers and they will often serve customers that are much further than the ideal distance from a tower. Luckily there are many other WISPs using the technology to deliver great rural broadband.

The line-of-sight issue is a big one and this technology is a lot harder to make work in places with lots of trees and hills, making it a difficult delivery platform in Appalachia and much of the Rockies. But the technology is being used effectively in the plains and open desert parts of the country today.

I see downsides to funding this technology with federal dollars. The primary concern is that the technology is not long-lived. The electronics are not generally expected to last more than seven years and then the radios must be replaced. Frontier is using federal dollars to get this installed, and I am sure that the $2,600 per passing is enough to completely fund the deployment. But are they going to keep pouring capital into replacing radios regularly over time? If not, these deployments would be a sick joke to play on rural homes – giving them broadband for a few years until the technology degrades. It’s hard to think of a worse use of federal funds.

Plus, in many of the areas where the technology is useful there are already WISPs deploying point-to-multipoint radios. It seems unfair to use federal dollars to compete against firms who have made private investments to build the identical technology. The CAF money ought to be used to provide something better.

I understand Frontier’s dilemma. In the areas where they took CAF II money they are required to serve everybody who doesn’t have broadband today. My back-of-the-envelope calculations tell me that the CAF money was not enough for them to extend DSL into the most rural parts of the CAF areas, since extending DSL means building fiber to feed the DSLAMs.

As I have written many times, I find the whole CAF program to be largely a huge waste of federal dollars. Using up to $10 billion to expand DSL, point-to-multipoint, and in the case of AT&T cellular wireless is a poor use of our money. That same amount of money could have seeded matching grants that would be building a lot of fiber to these same customers. We only have to look at state initiatives like the DEED grants in Minnesota to see that government grant money induces significant private investment in fiber. And as much as the FCC doesn’t want to acknowledge it, building anything less than fiber is nothing more than a Band-aid. We can and should do better.

Local Government Funding for Fiber

There is an interesting new trend where local governments act as the banker for rural broadband projects. It’s a new twist on public / private partnerships and a model that more communities should consider.

Consider these rural broadband projects in Minnesota.

  • First is RS Fiber. This is a new broadband cooperative that serves most of Sibley County and some of Renville County in Minnesota. Bonds were approved to fund 25% of a broadband project and those bonds are backed by the counties, some small cities and also by townships that are getting the fiber. The expectation is that the project will make the bond payments.
  • Next is in Swift County Minnesota. Federated Telephone Cooperative, an existing telephone company, was awarded $4.95 million to build fiber to rural homes in the county. The county approved general obligation bonds of $7.8 million to complete the project, or 60% of the funding.

Both projects are classic examples of a public private partnership. In these particular cases the company that will own and operate the network is a cooperative, but these same agreements could have been made with a for-profit telco or some other telecom provider as well.

These kinds of projects make sense for a number of reasons:

  • The process of approving bond financing is far faster than securing traditional funding for these kinds of projects.
  • Bonds for fiber can be financed over a long period of time – 20 to 30 years – while terms for commercial loans are usually shorter. Just like with a home mortgage, borrowing for a longer period means lower annual debt payments, which is essential to making these projects financially feasible (see the sketch below).
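To see why the loan term matters so much, here’s a simple sketch using the standard level-payment amortization formula. The 4% rate and the $7.8 million principal are hypothetical numbers chosen for illustration:

```python
def annual_payment(principal: float, rate: float, years: int) -> float:
    """Standard level-payment amortization formula."""
    return principal * rate / (1 - (1 + rate) ** -years)

# Hypothetical: a $7.8M fiber project financed at an assumed 4% rate.
principal, rate = 7_800_000, 0.04
for years in (10, 20, 30):
    print(f"{years}-year term: ${annual_payment(principal, rate, years):,.0f}/year")
# A 30-year bond (~$451K/year) roughly halves the annual payment of a
# 10-year commercial loan (~$962K/year).
```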

In both cases the Counties and other local government entities have taken on the role of banker. The local governments will have no operational role in running the fiber business (a role they did not want). The Counties expect the bond payments to be covered by the fiber project. And since these networks are being built in rural areas with few other broadband alternatives, the new fiber ventures should get high customer penetration rates. But if the ventures fail then the local governments are on the hook to cover any shortfalls in the bond payments.

These are both cases of local governments deciding that the need for rural broadband was great enough to risk taxpayer money to get this done. They also decided that the risk of not getting paid is low. The business cases show that even in the worst case the revenues from the projects should cover almost all costs, meaning that the downside risk to the Counties is minimal. In the case of RS Fiber, as a start-up new cooperative, they would not have been able to get any traditional funding without the seed money from the local governments.

This is a model that the rest of rural America should consider. Small ISPs like these cooperatives stand ready to serve a lot of rural America, but they often don’t have the financial wherewithal to do so. In these cases, a public private partnership with local government as the banker seemed to be the only way to make this happen.

Everywhere I travel in rural America homeowners and farmers want good broadband. They understand that it’s costly to build fiber to farms and small rural towns. But they also seem willing to help pay to make this work. I think if more rural counties would listen to their constituents they would take a harder look at this model.

Of course, a county needs to do its homework up front and make sure it’s a sound project and that the estimated cost of building the broadband network is accurate. But assuming there is a solid business plan, perhaps the most valuable role a county can tackle is that of being the banker that helps new broadband builds get off the ground.

A Doubling of Broadband Prices?

In what is bad news for consumers but good news for ISPs, a report by analyst Jonathan Chaplin of New Street Research predicts big increases in broadband prices. He argues that broadband is underpriced. Prices haven’t increased much for a decade and he sees the value of broadband as greatly increased since it is now vital in people’s lives.

The report is bullish on cable company stock prices because they will be the immediate beneficiary of higher broadband prices. The business world has not really acknowledged the fact that in most US markets the cable companies are becoming a near-monopoly. Big telcos like AT&T have cut back on promoting DSL products and are largely ceding the broadband market to the big cable companies. We see hordes of customers dropping DSL each quarter and all of the growth in the broadband industry is happening in the biggest cable companies like Comcast and Charter.

I’ve been predicting for years that the cable companies will have to start raising broadband prices. The companies have been seeing cable revenues drop and voice revenues continue to decline, and they will have to make up for these losses. But I never expected the rapid and drastic increases predicted by this report. Chaplin sets the value of basic broadband at $90, which is close to double today’s prices.

The cable industry is experiencing a significant and accelerating decline in cable customers. It also faces significant declines in revenues from cord-shaving as customers elect smaller cable packages. But cable products have long been squeezed on margin because of programming price increases, so one has to wonder how much the declining cable revenue really hurts the bottom line.

Chaplin reports that the price of unbundled basic broadband at Comcast is now $90 including what they charge for a modem. It’s even higher than that for some customers. Before I left Comcast last year I was paying over $120 per month for broadband since the company forced me to buy a bundle that included basic cable if I wanted a broadband connection faster than 30 Mbps.

Chaplin believes that broadband prices at Comcast will be pushed up to the $90 level within a relatively short period of time. And he expects Charter to follow.

If Chaplin is right, one has to wonder what price increases of this magnitude will mean for the public. Today almost 20% of households still don’t have broadband, and nearly two-thirds of those say it’s because of the cost. It’s not hard to imagine that a drastic increase in broadband rates will drive a lot of people to use broadband alternatives like cellular data, even though it’s a far inferior substitute.

I also have to wonder what price increases of this magnitude might mean for competitors. I’ve created hundreds of business plans for markets of all sizes, and not all of them look promising. But the opportunities for a competitor improve dramatically if broadband is priced a lot higher. I would expect that higher prices are going to invite in more fiber overbuilders. And higher prices might finally drive cities to get into the broadband business just to fix what will be a widening digital divide as more homes won’t be able to afford the higher prices.

Comcast today matches the prices of any significant cable competitor. For instance, they match Google Fiber’s prices where the companies compete head-to-head. It’s not hard to foresee a market where competitive markets stay close to today’s prices while the rest have big rate increases. That also would invite in municipal overbuilders in places with the highest prices.

Broadband is already a high-margin product and any price increases will go straight to the bottom line. It’s impossible for any ISP to say that a broadband price increase is attributable to higher costs – as this report describes it, any price increases can only be justified by setting prices to ‘market’.

All of this is driven, of course, by the insatiable urge of Wall Street to see companies make more money every quarter. Companies like Comcast already make huge profits and in an ideal world would be happy with those profits. Comcast does have other ways to make money since they are also pursuing cellular service, smart home products and even now bundling solar panels. And while most of the other cable companies don’t have as many options as Comcast, they will gladly follow the trend of higher broadband prices.

Why Isn’t Everybody Cutting the Cord?

Last year at least two million households cut the cord. I’ve seen headlines predicting that as many as 5 million more will do so this year, although that seems too high to me. But both of these numbers are a lot lower than the number of people who say they are going to cut the cord in the coming year. For several years running, various national surveys have shown that 15 million or more households say they want to cut the cord. But year after year they don’t, and today’s blog looks at some of the reasons why.

I think one of the primary reasons people keep traditional cable is that they figure out that they won’t save as much money with cord cutting as they had hoped. The majority of cord cutters say that saving money is their primary motivation for cutting the cord, and once they look hard at the actual savings they decide it’s not worth the change.

One issue that surprises a lot of potential cord cutters is the impact of losing their bundling discount if they are buying programming from a cable company. Big cable companies penalize customers who break the bundle. As an example, consider a customer who buys a $50 broadband product and a $50 cable product, but for which the cable company charges $80 as a bundle. When the customer drops one of the two products the cable company will charge them the full $50 for the remaining one. That means there is effectively a $20 penalty for breaking the bundle, and thus not much savings from cutting the cord.
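Spelled out with the numbers from that example (a trivial sketch):

```python
# Bundle-penalty math from the example above.
standalone_price = 50   # broadband or cable bought alone
bundle_price = 80       # both products together

expected_savings = standalone_price                # cord cutter hopes to save $50
actual_savings = bundle_price - standalone_price   # but saves only $30
penalty = expected_savings - actual_savings

print(f"Expected savings: ${expected_savings}")
print(f"Actual savings:   ${actual_savings}")
print(f"Bundle penalty:   ${penalty}")  # $20
```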

Households also quickly realize that they need to subscribe to a number of OTT services if they want a wide array of programming choices. If you want to watch the most popular OTT shows that means a $10 subscription to Netflix, an $8.25 per month subscription to Amazon and a Hulu package that starts at $8. If you want to watch Game of Thrones you’ll spend $15 for HBO. And while these packages carry a lot of movies, if you really love movies you’ll find yourself buying them on an a la carte basis.

And OTT options are quickly proliferating. If you want to see the new Star Trek series that means another $5.99 per month for CBS All Access. If your household likes Disney programming that new service is rumored to cost at least another $5 per month.

And none of these options brings you all of the shows you might be used to watching on cable TV. One option to get many of these same networks is subscribing to Sling TV or PlayStation Vue, with packages that start at $20 per month but which can cost a lot more. If you don’t want to subscribe to these services, then buying a whole season of one specific show can easily cost $100.

And then there is sports. PlayStation Vue looks to have the best basic sports package, but that means buying the service plus add-on packages. A serious sports fan is also going to consider buying Fubo. And fans of specific sports can buy subscriptions to Major League baseball, NBA basketball or NHL hockey.

Then there are the other 100 OTT options. There is a whole range of specialty programmers that carry programming like foreign films, horror movies, British comedies and a wide range of other programming. Most of these range from $3 to $7 per month.

There are also hardware costs to consider. Most people who watch a range of OTT programming get a media streaming device like Roku, Amazon Fire, or Apple TV. Customers who want to record shows shell out a few hundred dollars for an OTT DVR. A good antenna to get local programming costs between $30 and $100.
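Adding up just the example subscription prices mentioned above shows how quickly the stack approaches a cable bill (illustrative only, before sports packages, a la carte movies, or hardware):

```python
# Monthly OTT prices cited in this post (illustrative stack).
monthly = {
    "Netflix": 10.00,
    "Amazon": 8.25,
    "Hulu": 8.00,
    "HBO": 15.00,
    "CBS All Access": 5.99,
    "Sling TV (base)": 20.00,
}

total = sum(monthly.values())
print(f"Monthly OTT total: ${total:.2f}")  # ~$67 before sports or hardware
```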

The other reason that I think people don’t cut the cord is that it’s not easy to navigate between the many OTT options. They all have different menus and log-ins and it can be a pain to navigate between platforms. And it’s not easy to find what you want to watch, particularly if you don’t have a specific show in mind. It’s hard to think that it’s going to get any easier to use the many OTT services since they are in competition with each other. It’s hard to ever see them agreeing on a common interface or easy navigation since each platform wants viewers to stay on their platform once logged in.

Finally, none of these combinations gets you everything that’s on cable TV today. For many people cutting the cord means giving up a favorite show or favorite network.

If anything, OTT watching is getting more complicated over time. And if a household isn’t careful they might spend more than their old cable subscription. I’m a cord cutter and I’m happy with the OTT services I buy. But I can see how this option is not for everybody.


When Customers Use Their Data

In a recent disturbing announcement, Verizon Wireless said it will be disconnecting service to 8,500 rural customers this month for using too much data on their cellphones. The customers are scattered around 13 states and are a mix of those with unlimited and limited data plans.

Verizon justifies this because these customers are using data where Verizon has no direct cell towers, meaning that these customers are roaming on cellular data networks owned by somebody else. Since Verizon pays for roaming, the company says that these customers are costing them more in roaming charges than what the company collects in monthly subscription fees.

Verizon may well have a good business case for discontinuing these particular data customers if they are losing money on each customer. But the act of disconnecting them opens up a lot of questions and ought to be a concern to cellular customers everywhere.

This immediately raises the question of ‘carrier of last resort’. This is a basic principle of utility regulation that says that utilities, such as traditional incumbent telephone companies, must reasonably connect to everybody within their service territory. Obviously cellular customers don’t fall under this umbrella since the industry is competitive and none of the cellular companies have assigned territories.

But the lines between cellular companies and telcos are blurring. As AT&T and Verizon take down rural copper they are offering customers a wireless alternative. But in doing so they are shifting these customers from being served by a regulated telco to a cellular company that doesn’t have any carrier of last resort obligations. And that means that once converted to cellular, Verizon or AT&T would be free to cut these customers loose at any time and for any reason. That should scare anybody who is losing their rural copper lines.

Secondly, this raises the whole issue of Title II regulation. In 2015 the FCC declared that broadband is a regulated service, and that includes cellular data. This ruling brought cable companies and wireless companies under the jurisdiction of the FCC as common carriers. And that means that customers in this situation might have grounds for fighting back against what Verizon is doing. The FCC has the jurisdiction to regulate and to intervene in these kinds of situations if they regulate the ISPs as common carriers. But the current FCC is working hard to reverse that ruling and it’s doubtful they would tackle this case even if it was brought before them.

Probably the most disturbing thing about this is that it’s scary for these folks being disconnected. Rural homes do not want to use cellular data as their only broadband connection because it’s some of the most expensive broadband in the world. But many rural homes have no choice since this is their only broadband alternative to do the things they need to do with broadband. While satellite data is available almost everywhere, the incredibly high latency on satellite data means that it can’t be used for things like maintaining a connection to a school server to do homework or to connect to a work server to work at home.

One only has to look at rural cellular networks to understand the dilemma many of these 8,500 households might face. The usable distance for a data connection from a cellular tower is only a few miles at best, much like the circles around a DSL hub. It is not hard to imagine that many of these customers actually live within range of a Verizon tower but still roam on other networks.

Cellular roaming is an interesting thing. Every time you pick up your cellphone to make a voice or data connection, your phone searches for the strongest signal available and grabs it. This means that the phones of rural customers that don’t live right next to a tower must choose between competing weaker signals. Customers in this situation might be connected to a non-Verizon tower without it being obvious to them. Most cellphones have a tiny symbol that warns when users are roaming, but since voice roaming stopped being an issue most of us ignore it. And it’s difficult or impossible on most phones to choose which tower to connect to. Many of these customers being disconnected might have always assumed they actually were using the Verizon network. But largely it’s not something that customers have much control over.

I just discussed yesterday how we are now in limbo when it comes to regulating the broadband practices of the big ISPs. This is a perfect example of that situation, because it’s doubtful that the customers being disconnected have any regulatory recourse against what is happening to them. And that bodes poorly for rural broadband customers in general – just one more reason why being a rural broadband customer is scary.