Will Costly Alternatives Slow Cord Cutting?

The primary reason households give for cutting the cord is price. Surveys have shown that most households regularly watch only around a dozen cable channels, and cord cutters still want to see their favorites. Not all cord cutters are willing to go cold turkey on the traditional cable networks, so many seek out an online alternative that carries the networks they want to watch.

For the last few years, there have been online alternatives that carry the most popular cable networks for prices between $35 and $45 per month. However, during the last year the cost of these alternatives has risen significantly. I doubt that the price increases will drive people back to the cable companies, where they had to pay hidden fees and set-top box charges, but the higher prices might make more households hesitate to make the switch. Following are the current prices of the major online alternatives to traditional cable TV:

Hulu Live TV. This service is owned 2/3 by Disney and 1/3 by Comcast. They recently announced a price increase, effective December 18, that moves the package from $44.99 to $54.99. Customers can also select an ad-free version for $60.99. At the beginning of 2019, the service was priced at $39.99, so the price increased by nearly 38% during the year.

AT&T TV Now (formerly DirecTV Now) raised the price of the service earlier this year from $50 to $65. The company also raised prices significantly for satellite-delivered DirecTV and lost millions of customers between the two services.

YouTube TV, owned by Google, raised prices in May from $40 to $50. Along with the price increase, the service added the Discovery Channel.

Sling TV is owned by Dish Network. It still has the lowest prices for somebody looking for a true skinny package. It offers two lineups, called Blue and Orange, that each cost $25 per month, or both for $40 per month. There are also add-on packages for $5 per month each: Kids (Nick channels, Disney Jr.), Lifestyle (VH1, BET, DIY, Hallmark), Heartland (outdoor channels), and Hollywood (TCM, Sundance, Reelz), along with News, Spanish, and International packages. One of the big things missing from Sling TV is local network channels, so the company provides an HD antenna with a subscription. Sling TV has spread the most popular channels across the packages in such a way that customers can easily spend $50 to $60 monthly to get their favorites.
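To make that arithmetic concrete, here’s a minimal sketch (in Python, purely for illustration) using the prices quoted above; the sample household’s package picks are hypothetical:

    # Illustrative only: base and add-on prices come from the Sling TV lineup
    # described above; the sample household's picks are hypothetical.
    BASE_PACKAGES = {"Blue": 25, "Orange": 25, "Blue + Orange": 40}
    ADD_ONS = {"Kids": 5, "Lifestyle": 5, "Heartland": 5, "Hollywood": 5, "News": 5}

    def monthly_bill(base, add_ons):
        """Total monthly cost for a base package plus selected add-on packages."""
        return BASE_PACKAGES[base] + sum(ADD_ONS[name] for name in add_ons)

    # A household that wants both lineups plus a few favorite add-ons:
    print(monthly_bill("Blue + Orange", ["Kids", "Hollywood", "News"]))  # $55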

Fubo TV is independent and not associated with another big media company. It offers 179 channels, including local network channels, for $54.99 per month. The service launched with an emphasis on sports coverage, particularly soccer.

TVision Home is owned by T-Mobile and was formerly known as Layer3 TV. The company has never tried to make this a low-cost alternative, and it’s the online service that most closely mimics traditional cable TV. The service is available today in only a few major markets. Customers can get an introductory price of $90 per month (which goes up to $100 after a year). The company charges $10 per extra TV and also bills taxes that range from 4% to 20% depending upon the market. This is cable TV delivered over broadband.

Playstation Vue is owned by Sony, which has announced that the service will shut down at the end of January 2020 and is no longer taking new customers. The price of the core package is $55 per month, after a $5 increase in July. The service carries more sports channels than most of the other services.

The channels offered by each service differ, so customers need to shop carefully and compare lineups. For example, I’m a sports fan, and neither Sling TV nor Fubo TV carries the Big Ten Network. There are similar gaps throughout the lineups of all of the providers.

All of these alternatives, except perhaps TVision Home, are still less expensive than most traditional cable TV packages. However, it looks like all of these services are going to routinely raise rates to cover rising programming fees. Couple that with the fact that customers dropping cable TV probably lose their bundling discounts, and a lot of households are probably still on the fence about cord cutting.

A Peek at AT&T’s Fixed LTE Broadband

Newspaper articles and customer reviews provide a glimpse into the AT&T wireless LTE product being used to satisfy the company’s original CAF II obligations. An article in the Monroe County Reporter describes AT&T wireless broadband in Monroe County, Georgia. This is a county where AT&T accepted over $2.6 million from the original CAF II program to bring broadband to 1,562 rural households in the county.

Monroe is a rural county southeast of Atlanta with Forsyth as the county seat. As you can see from the county map accompanying this blog, AT&T was required to cover a significant portion of the county (the areas shown in green) with broadband of at least 10/1 Mbps. In much of the US, AT&T elected to use the CAF II money to provide faster broadband from cellular towers using LTE technology.

The customer cited in the article is happy with the AT&T broadband product and is getting 30/20 Mbps service. AT&T is quoted in the article saying that the technology works best when serving customers within 2 miles of a cell tower, though coverage can sometimes extend to 3 miles. Unfortunately, 2 or even 3 miles isn’t very far in rural America, and there are going to be a lot of homes in the CAF II service area that are too far from an AT&T cell tower to get broadband.

From the AT&T website, the pricing for the LTE broadband is as follows: the standalone data product is $70 per month. Customers can get the product for $50 per month with a 1-year contract if they subscribe to DirecTV or to an AT&T cellular plan that includes at least 1 GB of cellular data allowance. The LTE data product has a tiny data cap of 215 GB of download per month. Customers who exceed the data cap pay $10 for each additional 50 GB of data, up to a maximum fee of $200 per month.

The average household broadband usage was recently reported by OpenVault as 275 GB per month. A household using that average amount would pay an additional $20 monthly. OpenVault also recently reported that the average cord-cutting household uses over 520 GB per month; a customer at that level would pay an additional $70 per month. The product is only affordably priced for households that don’t use much broadband.
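As a sanity check on those numbers, here’s a minimal sketch of the overage math, assuming AT&T bills each started 50 GB block (the $70 figure for 520 GB implies that kind of rounding):

    import math

    # Billing rule described above: a 215 GB monthly cap, $10 for each
    # additional 50 GB block (or fraction of one), capped at $200 per month.
    CAP_GB, BLOCK_GB, BLOCK_FEE, MAX_FEE = 215, 50, 10, 200

    def overage_fee(usage_gb):
        """Monthly overage charge for a given total download usage in GB."""
        over = max(0, usage_gb - CAP_GB)
        return min(MAX_FEE, math.ceil(over / BLOCK_GB) * BLOCK_FEE)

    print(overage_fee(275))  # average household per OpenVault: $20
    print(overage_fee(520))  # average cord-cutting household: $70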

The article raises a few questions. First, this customer had to call AT&T to get the service, which apparently was not being advertised in the area. He said it took a while to find somebody at AT&T who even knew about the LTE broadband product. The customer also said that the installer for the service came from Bainbridge, Georgia – a 3-hour drive south from the AT&T cell site mentioned in the article.

This highlights one of the major problems of rural broadband that doesn’t get talked about enough. The big telcos have all had massive layoffs over the last decade, particularly in the workforces supporting copper and rural networks. Even should one of these big telcos offer a rural broadband product, how good is that product without technician support? As I travel the country, I hear routine stories of rural folks who wait weeks to get broadband problems fixed.

When I heard that AT&T was going to use LTE to satisfy its CAF II requirements, my first thought was that the primary benefit to the company was using the federal funding to beef up its rural cellular networks rather than any new concern for rural broadband customers. In Monroe County, AT&T received almost $1,700 per CAF household, and I wonder whether those households will all see the benefits of that money.

I’ve always suspected that AT&T wouldn’t aggressively market the LTE broadband product. If the company were heavily marketing it by now, at the end of the fifth year of the CAF II buildout, there would be rural customers all over the country buying upgraded broadband. However, news about upgraded broadband is sparse for AT&T, as well as for CenturyLink and Frontier. I work with numerous rural counties where the local government has never heard of CAF II because the telcos have done so little marketing of improved rural broadband.

The article highlights a major aspect of the plight of rural broadband. We not only need to build new rural broadband infrastructure, but we need to replenish the rural workforce of technicians needed to take care of the broadband networks. The FCC needs to stop giving broadband money to the big telcos and instead distribute it to companies willing to staff up to support rural customers.

Counting Gigabit Households

I ran across a website called the Gigabit Monitor that is tracking the population worldwide that has access to gigabit broadband. The website is sponsored by VIAVI Solutions, a manufacturer of network test equipment.

The website claims that in the US over 68.5 million people have access to gigabit broadband, or 21% of the population. That number gets sketchy when you look at the details. The claimed 68.5 million people includes 40.3 million served by fiber, 27.2 million served by cable company HFC networks, 822,000 served by cellular and 233,000 served by WiFi.

Each of those numbers is highly suspect. For example, the fiber number doesn’t include Verizon FiOS or the FiOS properties sold to Frontier. Technically that’s correct, since most FiOS customers can buy maximum broadband speeds in the range of 800-900 Mbps. But there can’t be 40 million people outside of FiOS who can buy gigabit broadband from other fiber providers. I’m also puzzled by the cellular and WiFi categories and can’t imagine there is anybody who can buy a gigabit product of either type.

VIAVI makes similar odd claims for the rest of the world. For example, they say that China has 61.5 million people that can get gigabit service. But that number includes 12.3 million on cellular and 6.2 million on WiFi.

Finally, the website lists the carriers it believes offer gigabit speeds. I have numerous clients that own FTTH networks that are not on the list – I stopped counting after finding 15 of my clients missing.

It’s clear this website is flawed and doesn’t accurately count gigabit-capable people. However, it raises the question of how to count the number of people who have access to gigabit service. Unfortunately, the only way to do that today is to accept the claims made by ISPs. We’ve already seen from the FCC broadband maps how unreliable ISPs are when reporting broadband capabilities.

As I think about each broadband technology there are challenges in defining gigabit-capable customers. The Verizon situation is a great example. It’s not a gigabit product if an ISP caps broadband speeds at something lower than a gigabit – even if the technology can support a gigabit.

There are challenges in counting gigabit-capable customers on cable company networks as well. The cable companies are smart to market all of their products as ‘up to’ speeds because of the shared nature of their networks. The customers in a given neighborhood node share bandwidth and the speeds can drop when the network gets busy. Can you count a household as gigabit-capable if they can only get gigabit speeds at 4:00 AM but get something slower during the evening hours?

It’s going to get even harder to count gigabit capability once there are reliable cellular networks using millimeter-wave spectrum. That spectrum is only going to be able to achieve gigabit speeds outdoors, in direct line-of-sight from a nearby cell site. Can you count a technology as gigabit-capable when the service only works outdoors and drops when walking into a building or a few hundred feet away from a cell site?

It’s also hard to know how to count apartment buildings. There are a few technologies being used today in the US that bring gigabit speeds to the front of an apartment building. However, by the time that the broadband suffers packet losses due to inside wiring and is diluted by sharing among multiple apartments, nobody gets a true gigabit product. But ISPs routinely count them as gigabit customers.

There is also the issue of how to avoid double-counting households that can get gigabit speeds from multiple ISPs. There are urban markets with fiber providers like Google Fiber, Sonic, US Internet, EPB Chattanooga, and others where customers can buy gigabit broadband on fiber and also from the cable company. There are even a few lucky customers in places like Austin, Texas and the Research Triangle in North Carolina where some homes have three choices of gigabit networks after the telco (AT&T) also built fiber.

I’m not sure we need to put much energy into accurately counting gigabit-capable customers. I think everybody would agree that an 850 to 950 Mbps connection on Verizon FiOS is blazingly fast. Certainly, a customer getting over 800 Mbps from a cable company has tremendous broadband capability. Technically such connections are not gigabit connections, but the difference between a gigabit connection and a near-gigabit connection is so negligible for a household as to not practically matter.
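To put a number on how negligible, here’s a quick back-of-the-envelope sketch; the 50 GB file size is an arbitrary example:

    # Minutes to download a 50 GB file at various speeds, ignoring protocol
    # overhead: the gap between near-gigabit and gigabit is about a minute.
    def download_minutes(file_gb, mbps):
        return file_gb * 8_000 / mbps / 60  # GB -> megabits -> minutes

    for speed_mbps in (850, 950, 1000):
        print(f"{speed_mbps} Mbps: {download_minutes(50, speed_mbps):.1f} minutes")
    # 850 Mbps: 7.8 minutes
    # 950 Mbps: 7.0 minutes
    # 1000 Mbps: 6.7 minutes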

Starlink Making a Space Grab

SpaceNews recently reported that Elon Musk’s low-orbit space venture Starlink has filed with the International Telecommunication Union (ITU) to launch an additional 30,000 broadband satellites on top of the 11,927 already in the planning stages. This looks like a land grab, with Musk hoping to claim valuable orbital satellite paths to keep them away from competitors.

The new requests consist of 20 filings, each asking to deploy 1,500 satellites in a different orbital band around the earth. These filings throw down the gauntlet to other planned satellite providers like OneWeb, which plans 1,910 satellites; Kuiper (Jeff Bezos), with plans for 3,236 satellites; and Samsung, with plans for 4,600 satellites.

The Starlink announcements are likely aimed at stirring up regulators at the ITU, which meets at the end of this month to discuss spectrum regulations. The FCC has taken the lead in developing satellite regulations. Earlier this year the FCC established a rule that an operator must deploy satellites on a timely basis to retain the exclusive right to the spectrum needed to communicate with them. Under the current FCC rules, a given deployment must be 50% complete within six years and fully complete within nine years. In September, Starlink revised its launch plans with the FCC in a way that meets the new guidelines, as follows:

            Satellites   Altitude (km)   50% Completion   100% Completion
Phase 1          1,584             550       March 2024        March 2027
                 1,600           1,110
                   400           1,130
                   375           1,275
                   450           1,325
Phase 2          2,493             336         Nov 2024          Nov 2027
                 2,478             341
                 2,547             346
Total           11,927

This is an incredibly aggressive schedule that would require the company to launch 5,902 satellites by November 2024 – 120 satellites per month beginning in November 2019. To date, the company has launched 62 satellites. The company would then need to step launches up to 166 satellites per month to complete the second half on time.
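As a rough check on that cadence, here’s a simplified uniform-rate sketch of the milestone math from the table above; it ignores the earlier March 2024 deadline for the first half of Phase 1, so the true required rate is somewhat higher:

    from math import ceil

    # Phase totals come from the table above; the dates are the FCC 50%/100%
    # milestone deadlines. Launches are assumed to begin in November 2019.
    PHASE_1, PHASE_2 = 4_409, 7_518             # satellites (sum: 11,927)
    first_half = ceil((PHASE_1 + PHASE_2) / 2)  # due by November 2024
    second_half = (PHASE_1 + PHASE_2) - first_half

    print(first_half / 60)   # Nov 2019 -> Nov 2024: ~99 satellites per month
    print(second_half / 36)  # Nov 2024 -> Nov 2027: ~166 satellites per month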

I’m guessing that Starlink is already starting to play the regulatory game. For example, if the company can’t meet the launch dates over the US in that time frame, then some of the constellations might not work in the US. But if Starlink eventually launches all of the satellites it has announced, not every satellite would need to serve customers everywhere. If the ITU adopts a timeline similar to the US one, it’s likely that other countries won’t award spectrum to every one of the Starlink constellations. Starlink will be happy if each country gives it enough spectrum to be effective there. The strategy might be to flood the sky with so many satellites that Starlink can provide service anywhere as long as at least a few of its constellations are awarded spectrum in each country. There are likely to be countries, like North Korea and perhaps China, that won’t allow any connections with satellite constellations that bypass their national firewalls.

Starlink faces an additional challenge with many of the planned launches. Any satellite with an orbit below 340 kilometers (211 miles) is considered very low earth orbit (VLEO), since there is still enough of the earth’s atmosphere at that altitude to cause drag that eventually degrades a satellite’s orbit. Anything deployed at VLEO heights will have a shorter-than-normal life. The company has not explained how it plans to maintain satellites at VLEO altitudes.

At this early stage of satellite deployment, there is no way to know if Starlink is at all serious about wanting to launch 42,000 satellites. This may just be a strategy to get more favorable regulatory rules. If Starlink is serious about this, you can expect other providers to speed up plans to avoid being locked out of orbital paths. We’re about to see an interesting space race.

The Myth of 5G and Driverless Cars

A colleague sent me an article, published earlier this year in MachineDesign magazine, that predicts driverless cars can’t be realized until we have a ubiquitous 5G network. When looking for the original article on the web, I noticed numerous similar pieces, like this one in Forbes, that share the same opinion.

These articles and other similar articles predict that high-bandwidth, low-latency 5G networks are only a few years away. I’m not quite sure who these folks think will invest the $100 billion or more that would likely be required to build such a wireless network along all of the roads in the country. None of the cellular carriers have such grandiose plans, and if they did their stockholders would likely replace a management team that suggested such an investment.

It’s easy to understand how this myth got started. When 5G was first discussed, the cellular companies listed self-driving cars as one of the reasons the government should support 5G. However, over time they’ve all dropped this application from their 5G message and it’s no longer a cellular company talking point.

The idea that 5G is needed for self-driving cars is bolstered by the belief that the computing power of a data center is needed to process the massive amounts of data generated by a self-driving car. That very well may be true, and the current versions of self-driving cars are essentially data centers on wheels that contain several fast computers.

The belief that 5G will enable self-driving cars also comes from the promise of low latency, near that of a direct fiber connection. The folks who wrote these articles envision a massive two-way data transfer happening constantly over 5G for every self-driving car. I can’t imagine they have ever talked to a network engineer about the challenge of creating two-way wireless gigabit connections with hundreds of cars moving simultaneously along a freeway at rush hour. It’s hard to envision the small cell site and fiber infrastructure needed to handle that without hiccups. I also don’t know if the authors have driven down many rural roads recently to remind themselves of the huge challenge of implementing rural gigabit 5G.

The talk of using wireless for vehicles also ignores some fundamental issues. Wireless technologies are wonky in the real world. Radio waves do odd things in the wild and every wireless network has dead zones and places where the system mysteriously won’t work the way it’s supposed to. Worse, the dead spots and odd spots move around with changes in temperature, humidity, and precipitation.

Network engineers would also advise that, for a critical task like driving at high speed, every vehicle should have a redundant backup connection, meaning a second wireless connection in case the first one has a problem. Anybody who puts critical tasks on a fiber network invests in such redundancy. Hospitals that use broadband as part of a surgical procedure or factories that do precision manufacturing will have a second fiber connection to be safe. It’s hard to imagine a redundant connection for a moving car, since the only place it can come from is the same nearby cell sites that provide the primary connection.

I don’t know how others feel about this, but I’m not about to trust my life to a self-driving car that needs a connection to an external data center to be safe. I know too much about how broadband networks function to believe that 5G networks will somehow always make perfect connections when even fiber networks don’t.

One of the first things that came to my mind when I read these articles was to wonder what happens when there is a fiber outage on the network supporting the 5G cell sites. Do all of the self-driving cars just stop and wait for a broadband signal? I picture a city during an event like the 19-hour CenturyLink fiber outage a year ago and wonder if we are so stupid as to make our transportation systems reliant on external computing and external networks. I sure hope that we are not that dumb.

Looking Back at Looking Forward

I find it interesting to sometimes look backward a few years to see what predictions were made about the future of the telecom industry. Five years ago I went to an NTCA conference where several speakers made predictions about the industry, particularly as it would impact rural America. It’s interesting to look at what was predicted about today just a few years ago. Some predictions were dead on and others fizzled. Following are some of the more interesting misses.

Broadband Technologies. There were predictions that by 2020 we’d see upgrades to G.fast in rural copper networks and to next-generation PON equipment for fiber deployments. Neither happened, for various reasons. US telcos have never embraced G.fast, although there is widespread adoption in Europe, where copper networks are delivering 300 Mbps speeds to customers. The big telcos in the US are making no investments in rural copper unless the FCC fully funds them. Many smaller telcos have taken advantage of changes in the Universal Service Fund to upgrade from copper to fiber rather than upgrade DSL. Next-generation PON electronics are still waiting for one big ISP to buy enough gear to lower prices.

Widespread 5G. It’s not hard to understand why this would have been believed in 2014 since the big carriers were already in hype mode even then. One prediction was that as many as 60% of cellphones would be 5G by 2020. There were several predictions that 5G was going to enable autonomous vehicles and that building fiber along highways would be routine by 2020. There was a prediction that we’d see small cells everywhere, with deployments every 3,000 feet.

The timing of 5G is far behind those predictions. I see where Cisco recently estimated that only 3% of cellphones worldwide would be 5G enabled by 2022. Most experts today believe that the cellular networks will still predominantly rely on 4G LTE even a decade from today. The idea of building a cellular network for autonomous vehicles died – it was always hard to imagine the revenue stream that would have supported that network. We may still get to a dense small cell network someday, but calling for a small cell every 3,000 feet still sounds incredibly aggressive even decades from now.

IoT and LPWAN. There was a prediction that by 2020 we’d have deployed low-bandwidth networks using 900 MHz spectrum to connect huge numbers of outdoor IoT sensors. The prediction was that there’d be a huge revenue opportunity in charging $1 monthly per sensor. There are still those calling for these networks today, but they’re still not getting any widespread traction.

Widespread Adoption of Augmented and Virtual Reality. Those technologies were on everybody’s future list in 2014. Oculus Rift was the leader in developing virtual reality, and Magic Leap had raised several rounds of funding to develop augmented reality. There is now a sizable virtual reality gaming market, but the technology has not yet touched the average person or moved beyond gaming. Magic Leap finally started selling a developer headset at the end of last year.

We Should Be Overrun by Now with Robots and Drones. In 2014 there was a prediction of robots everywhere by 2020. New factories are manned today by robots, but robots are still news when they are used in a public-facing function. A few hotels are trying out a robot concierge. There are a few automated fast food restaurants. There are a few hospitals with robots that transport meals and medicines. Robots deliver take-out food in a few city centers and university towns.

Drones are quietly being used for functions like mapping and inspecting storm damage. Flying small drones is now a popular hobby. Amazon keeps experimenting with drone delivery of packages but it’s still in the trial stage. Commercial use of drones is still in its infancy.

Use of Data. My favorite prediction was that by 2020 we’d have software systems that can deliver data at the right place, at the right time, to the right person, on the right device. This harkens back to the old AT&T promise that someday we’d be able to watch any movie we wanted, the minute we wanted. To some degree that old promise came to pass, although it was implemented by somebody other than AT&T.

Some businesses are meeting parts of this prediction today. These are custom platforms that send trouble tickets to technicians, notify employees to connect a new customer, automate ordering of inventory, etc. However, nothing close to that promise has yet made it into our everyday lives. In fact, except for Candy Crush most of us probably still have the same apps on our smartphones we used in 2014. Many of us are still waiting for the digital assistant we were first promised a decade ago.

Got Some Things Right. It’s easy to pick on predictions that never came to pass, and I’ve made plenty of those myself. There were also some great predictions in 2014. One presenter said we’d continue to see explosive growth in residential data usage, which would keep growing at 24% per year – that’s still a dead-on prediction. There was a prediction that businesses would migrate employees to mobile devices, and today it’s routine to see employees in all sorts of businesses operating from a tablet. There was a prediction of explosive growth in machine-to-machine data traffic, and today that is one of the areas of fastest traffic growth.

Continued Lobbying for White Space Spectrum

In May, Microsoft submitted a petition to the FCC calling for specific changes that would improve the performance of the white space spectrum used to provide rural broadband. Microsoft has now taken part in eleven white space trials and makes these recommendations based upon the real-life performance of the spectrum. Not included in this filing is Microsoft’s long-standing request for the FCC to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has long favored creating just one channel of unlicensed white space spectrum per market – depending on what’s available.

A number of other parties have subsequently filed comments in support of the Microsoft proposals, including the Wireless Internet Service Providers Association (WISPA), Next Century Cities, New America’s Open Technology Institute, Tribal Digital Village, and the Gigabit Libraries Network. One of the primary entities opposed to earlier Microsoft proposals has been the National Association of Broadcasters (NAB), which worries about white space broadband interfering with TV stations. However, the group now says that it can support some of the new Microsoft proposals.

As a reminder, white space spectrum consists of the unused blocks of spectrum located between the frequencies assigned to television stations. Years ago, at the advent of broadcast television, the FCC provided wide buffers between channels to reflect the limits of the transmission technology of the day. Folks my age might remember back to the 1950s, when neighboring TV stations would bleed into each other as ghost signals. As radio technology has improved, those buffers are now wider than needed – wider than the buffers between other blocks of spectrum. White space spectrum uses those wide buffers.

Microsoft has proposed the following:

  • They are asking for higher power limits for transmissions in cases where the spectrum sits two or more channels away from a TV station signal. Higher power means greater transmission distances from a given transmitter.
  • They are asking for a small power increase for white space channels that sit next to an existing TV signal.
  • They are asking for white space transmitters to be placed as high as 500 meters above ground (1,640 feet). In the US there are only 71 existing towers taller than 1,000 feet.
  • Microsoft has shown that white space spectrum has a lot of promise for supporting agricultural IoT sensors. They are asking the FCC to change the white space rules to allow narrowband transmissions for this purpose.
  • Microsoft is asking that the spectrum be allowed to support portable broadband devices used for applications like school buses, agricultural equipment and IoT for tracking livestock.

The last two requests highlight the complexity of FCC spectrum rules. Most people would probably assume that a spectrum license allows for any possible use of the spectrum. Instead, the FCC specifically defines how spectrum can be used, and rural white space spectrum is currently only allowed for use as hot spots or for fixed point-to-point data using receiving antennas at a home or business. The FCC has to modify the rules to allow IoT uses like farm sensors, tractors, and livestock tracking.

The various parties are asking the FCC to issue a Notice of Proposed Rulemaking to get comments on the Microsoft proposal. That’s when we’ll learn if any other major parties disagree with the Microsoft proposals. We already know that the cellular companies oppose providing multiple white space bands for anything other than cellular data, but these particular proposals are to allow the existing white space spectrum to operate more efficiently.

Are Broadband Investments Increasing?

The largest ISPs and their lobbying arm USTelecom are still claiming that industry capital spending has improved as a direct result of the end of Title II regulation. In a recent blog they argue that capital spending was up in 2018 due to the end of regulation – something they describe as a “forward-looking regulatory framework”. In reality, the new regulatory regime is zero regulation, since the FCC stripped itself of the ability to change ISP behavior for broadband products and practices.

The big ISPs used this same argument for years leading up to deregulation. They claimed that ISPs held back on investments since they were hesitant to invest in a regulatory-heavy environment. This argument never held water for a few reasons. First, the FCC barely ever regulated broadband companies. Since the advent of DSL and cable modems in the late 1990s, each subsequent FCC has largely been hands-off with the ISP industry.

The one area where the last FCC added some regulation was net neutrality. According to USTelecom that was crippling regulation. In reality, the CEO of every big telco and cable company has publicly stated that they could live with the basic principles of net neutrality. The one area of regulation that has always worried the big ISPs is some kind of price regulation. That hasn’t really been needed in the past, but all of the big companies look into the future and realize the time will come when they will probably raise broadband rates every year. We are now seeing the beginnings of that trend, which is probably why USTelecom keeps beating this particular dead horse – the ISPs are petrified of rate regulation of any kind.

The argument that the big ISPs held back on investment due to heavy regulation has never borne any resemblance to reality. The fact is that the big ISPs make investments for the same reasons as any large corporation: to increase revenues, to reduce operating costs, or to protect markets.

As an example, AT&T has been required to build fiber past 12.5 million passings as part of the settlement that allowed it to buy DirecTV. AT&T grabbed that mandate with gusto and has been aggressively building fiber and selling fiber broadband for the past several years. Both AT&T and Verizon have also been building fiber to cut transport expense to cell sites – they are building where that transport is too costly, or where they know they want to install small cell sites. The large cable companies have all spent capital on DOCSIS 3.1 in the last few years to boost broadband speeds and to protect and nurture their growing monopoly on urban broadband. All of these investment decisions were made for strategic business reasons that didn’t consider the difference between light regulation and no regulation. Any big ISP that said it would forego a strategic investment due to regulation would probably see its stock price tumble.

As a numbers guy, I instantly become suspicious of deceptive graphs. Consider the graph included in the latest USTelecom blog, which shows the levels of industry capital investment made between 2014 and 2018. The graph makes the year-to-year swings in investment look big through the graphing trick of starting the bottom of the y-axis at $66 billion instead of zero. The fact is that 2018 capital investment was less than 3% higher than in 2014. This is an industry where the aggregate level of annual investment varies by only a few percent per year – the argument that the ISPs have been unleashed by the end of Title II regulation is laughable, and the numbers don’t support it.
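To see how much a truncated axis exaggerates small swings, here’s a small illustrative sketch; the capex values are hypothetical placeholders roughly consistent with the few-percent range described above, not USTelecom’s actual figures:

    import matplotlib.pyplot as plt

    # Hypothetical annual capex in $ billions, 2014-2018 (placeholders only).
    years = [2014, 2015, 2016, 2017, 2018]
    capex = [73, 74, 72, 71, 75]

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
    ax1.bar(years, capex)
    ax1.set_ylim(66, 76)  # truncated baseline: small swings look dramatic
    ax1.set_title("Y-axis starts at $66B")
    ax2.bar(years, capex)
    ax2.set_ylim(0, 80)   # zero baseline: the same data looks nearly flat
    ax2.set_title("Y-axis starts at $0")
    plt.show()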

There are always stories that can explain the annual fluctuations in industry spending. Here are just a few things that made a significant impact on aggregate spending in the past few years:

  • Sprint had a cash crunch a few years ago and drastically cut capital spending. One of the primary reasons for the higher 2018 spending is that Sprint spent almost $2 billion more in 2018 than the year before as they try to catch up on neglected projects.
  • AT&T spent $2 billion in 2018 for FirstNet, the nationwide public safety network. But AT&T is not spending their own money – that project is being funded by the federal government and ought to be removed from these charts.
  • Another $3 billion of AT&T’s spending in 2018 was to beef up the 4G network in Mexico. I’m not sure how including that spending in the numbers has any relevance to US regulation.
  • AT&T has been on a tear building fiber for the past four years – but it announced last month that the big construction push is over and that capital spending will be lower in future years. AT&T has the largest capital budget in the industry and accounted for 30% of the industry-wide $75 billion in 2018 – how will USTelecom paint the picture next year after a sizable decrease in AT&T spending?

The fact that USTelecom keeps harping on this talking point means they must fear some return to regulation. We are seeing Congress seriously considering new consumer privacy rules that would restrict the ability of ISPs to monetize customer data. We know it’s likely that if the Democrats take back the White House and the Senate that net neutrality and the regulation of broadband will be reinstated. For now, the big ISPs have clearly and completely won the regulatory battle and broadband is as close to deregulated as any industry can be. Sticking with this false narrative can only mean that the big ISPs think their win is temporary.

Disney Jumps into the OTT Market

Disney is jumping into the OTT fray, joining Netflix, Amazon Prime, and others that offer a unique library of content for online viewing. The online product will be marketed as Disney+ and will launch on November 12 in the US, with a worldwide rollout to follow. Disney seems to be taking the same low-price philosophy as other online start-ups, with initial pricing at $6.99 per month or $69.99 for the first year.

Disney is counting on a unique customer base of what it calls ‘true fans’ who adore everything Disney. I can attest that they exist; I have a wife and daughter in that category. Disney thinks these fans will bring it a lot more subscribers than competing services can attract.

The company is already sitting on one of the largest libraries of valuable content. Disney has made huge revenues over the years by periodically rolling out old Disney classics and then withdrawing them from the market. I’ve seen speculation that Disney plans to offer its full catalog of classics as part of the offering, but we won’t know for sure until the service is available. Disney has a lot of other popular content as well, such as the Star Wars and Marvel franchises.

Analysts say the $6.99 price is too low, and Disney seems to acknowledge it. I read that an analyst at BTIG Research said Disney doesn’t expect the service to be profitable until 2024, when it expects to have over 60 million customers. It’s hard to fathom needing that many customers to break even. But Disney is not quite the same as other programmers. Perhaps the company is willing to take a loss on the video library in order to drive revenues in other ways – there has to be a big upside to having over 60 million fans watching your content, seeing your advertising, and buying Disney merchandise.

The Disney launch enters an already crowded market and in doing so makes it that much harder to justify cord cutting. The OTT services that mimic cable TV – DirecTV Now, Hulu, Sling TV, Playstation Vue, FuboTV, and others – have increased prices to rival a subscription to an expanded basic lineup from a cable company. There are also dozens of add-on options for other programming, like Netflix, CBS All Access, HBO Now, and the upcoming Apple TV+, that lure viewers with unique content. It’s becoming clear that cord cutting is not cheaper unless a viewer has the discipline to restrict content to only one or two services.

Something else is becoming clear in the industry: customers who buy traditional cable TV are also subscribing to OTT services like Netflix. At the end of last year, PwC reported that the number of Netflix subscribers in the US had surpassed the number of traditional cable subscribers.

The PwC study set out to understand the viewers of OTT content and to see how viewers handle the huge array of programming options. One of its most interesting findings is that age is becoming less of a factor in understanding OTT usage. When PwC started watching the market four years ago, it was easy to identify different buying and viewing habits between younger and older viewers, but those differences seem to be converging.

For example, PwC found that 28% of older consumers had cut the cord, up from a much smaller percentage in earlier years. It found that 61% of older viewers now watch content on the Internet, up from less than 50% just a year earlier. It also found that a higher percentage of younger viewers ages 25-34 (22%) than viewers older than 50 (16%) claim loyalty to traditional cable TV.

One of the most interesting findings in the PwC study was the extent to which people dislike having video services suggest their viewing. Only 21% of viewers feel that the suggested viewing on sites like Netflix is better than what they can find themselves. The biggest complaint about all OTT services is the difficulty of finding content – only 12% say they can easily find the content they want to watch.

The End of the Bundle?

There are a few signs in the industry that we are edging away from the traditional triple-play bundle of telephone, cable TV, and broadband. The bundle was instrumental in the cable companies’ success. Back in the day, when DSL and cable modems had essentially the same download speeds, the cable companies introduced bundles to entice customers to use their broadband. The lure of a discount on cable TV for buying broadband was attractive and gave the cable companies an edge in the broadband marketing battle.

Over time the cable companies became secure in their market share and created mandatory bundles, meaning they would not sell standalone broadband. This split the broadband market in cities: the cable company got the customers who could afford bundles, and the telco with DSL got everybody else. Many of the cable companies became so smug about their bundles that they forced customers to buy cable TV just to get broadband. I’ve noticed over the last year that most of the mandatory bundles have died.

The bundle lost a little luster in February when Julia Laulis, the CEO of CableOne, told investors on the company’s 4Q 2018 earnings call that it no longer cares about the bundle. She said what I’m sure many other cable companies are discussing internally: the bundle doesn’t have any impact in attracting customers to buy broadband. On that call she said, “We don’t see bundling as the savior for churn. I know that we don’t put time and resources into pretty much anything having to do with video because of what it nets us and our shareholders in the long run. We pivoted to a data-centric model over five, six years ago, and we’ve seen nothing to derail us from that path.”

Her announcement raises two important issues that probably spell the eventual end of bundling. First, there is no real margin in cable TV. The fully loaded cost of the product has increased to the point where selling cable no longer improves a company’s bottom line. The only two big cable providers who might still see some margin from cable TV are Comcast and AT&T, since they own some of the programming; for everybody else, the margins on cable TV have shrunk to nothing or might even be negative.

I’ve had a number of clients take a stab at calculating the true cost of providing cable TV. The obvious big cost of the product is programming fees, but my clients tell me that a huge percentage of their operational costs also come from cable TV. They say most of the calls to customer service are about picture quality, and they do far more truck rolls for cable issues than for any other product. By the time you account for those extra costs, cable TV is likely a net loser for most small ISPs – as it obviously is for CableOne, the seventh-largest cable company.

The other issue is cable rates. High programming rates keep forcing cable providers to raise the price of the TV product every year. We know that high cable prices are the number one issue cited by cord cutters. Perhaps more importantly, it’s the number one issue driving customer dissatisfaction with the cable company.

I have to wonder how many other big cable companies have come to the same conclusion but just aren’t talking about it. Interestingly, one of the metrics used by analysts to track the cable industry is average revenue per user (ARPU). If cable companies bail on the bundle and lose cable customers, their ARPU will drop – yet margins might stay the same or even get a little better. If there is a new deemphasis on bundles and cable TV subscriptions, the industry will need to drop the ARPU comparison.

It’s not going to be easy for a big cable company to back out of the cable TV business. Today there is still a penalty for customers who drop a bundle – dropping cable TV raises the price for the remaining products. We’ll know that the cable companies are serious about deemphasizing cable TV when that penalty disappears.