G.Fast over Coax

There is yet another new technology available to carriers – G.Fast over coaxial cable. Early trials of the technology show it works better than G.Fast over telephone copper.

Calix recently tested the new coaxial technology and was able to deliver 500+ Mbps for up to 2,000 feet. This is far better than current G.Fast over telephone copper, which can deliver similar data speeds only up to about 800 feet. But telephone G.Fast is improving, and Calix just demonstrated copper-based G.Fast that can deliver 1 Gbps for about 750 feet.

But achieving the kinds of speeds demonstrated by Calix requires a high-quality copper network. We all know that the telephone and coaxial networks in existing buildings are usually anything but pristine. Many existing coaxial cables in places like apartment buildings have been cut and re-spliced numerous times over the years, which will significantly degrade G.Fast performance.

This new technology is definitely going to work best in niche applications, and there may be situations where it’s clearly the best technology for the price. There are a surprising number of coaxial networks in place in homes, apartment buildings, schools, factories and older office buildings that might be good candidates for the technology.

A number of telcos like CenturyLink and AT&T are starting to use G.Fast over telephone copper to distribute broadband to apartment buildings. As the incumbent telephone companies, they can make sure that those copper networks are available to them. But there might be many apartment buildings where the existing coaxial network could be used instead. The ability to reach up to 2,000 feet could make a big difference in larger apartment buildings.

Another potential use would be in schools. However, with the expanding demand for broadband in classrooms one has to wonder if 500 Mbps is enough bandwidth to serve and share among a typical string of classrooms – each with their own heavy broadband demand.

There are also a lot of places that have coaxial networks that you might not think about. For example, coaxial wiring was the historic wiring of choice for the early versions of video surveillance cameras in factories and other large businesses. It would not be hard to add WiFi modems to this kind of network. There are tons of older hotels with end-to-end coaxial networks. And any older office building is likely to have coaxial wiring throughout.

But there is one drawback to the technology: the coaxial network can’t carry a cable TV signal at the same time. Coaxial G.Fast operates at the same frequencies as a significant chunk of a traditional DOCSIS cable network. Using the technology in a place like an apartment building means the coaxial wiring can no longer be used for cable TV delivery. Alternatively, the cable TV signal could be converted to IPTV to travel over the G.Fast connection (but that wouldn’t leave much bandwidth for broadband). Still, there are probably many unused coaxial networks, and the technology could use them with very little rewiring.

It’s more likely that coaxial G.Fast could coexist with existing applications in places like factories. Those networks typically use MoCA to feed the video cameras, at frequencies higher than those used by DOCSIS cable networks.

But my guess is that the interference issue will be a big one for many potential applications. Most apartments and schools are going to still be using their networks to deliver traditional video. And many other coaxial networks will have been so chopped up and re-spliced over time that they will present a real challenge for the technology.

But this is one more technology to put into the toolbox, particularly for companies that bring broadband to a lot of older buildings. There are probably many cases where this could be the most cost effective solution.

More Pressure on WiFi

As if we really needed more pressure put onto our public WiFi spectrum, both Verizon and AT&T are now launching Licensed Assisted Access (LAA) broadband for smartphones. This is the technology that allows cellular carriers to mix LTE spectrum with the unlicensed 5 GHz spectrum for providing cellular broadband. The LAA technology allows for the creation of ‘fatter’ data pipes by combining multiple frequencies, and the wider the data pipe the more data that makes it to the end-user customer.

When carriers combine frequencies using LAA they can theoretically create a data pipe as large as a gigabit while only using 20 MHz of licensed frequency. The extra bandwidth for this application comes mostly from the unlicensed 5 GHz band and is similar to the fastest speeds that we can experience at home using this same frequency with 802.11ac. However, such high-speed bandwidth is only useful for a short distance of perhaps 150 feet, and the most practical use of LAA is to boost cellphone data signals for customers closest to a cell tower. That’s going to make LAA technology most beneficial in dense customer environments like busy downtown areas, stadiums, etc. LAA isn’t going to provide much benefit to rural cellphone towers or those along interstate highways.

Verizon recently did a demonstration of the LAA technology that achieved a data speed of 953 Mbps. They did this using three 5 GHz channels combined with one 20 MHz channel of AWS spectrum. Verizon used a 4X4 MIMO (multiple input / multiple output) antenna array and 256 QAM modulation to achieve this speed. The industry has coined the term four-carrier aggregation for the technology since it combines four separate bands of bandwidth into one data pipe. A customer would need a specialized MIMO antenna to receive the signal and also would need to be close to the transmitter to receive this kind of speed.
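As a sanity check on that 953 Mbps figure, a back-of-the-envelope LTE peak-rate estimate lands in the same range. The carrier count and channel widths come from the demo description above; the effective MIMO layer count and the 10% overhead figure are my assumptions, not anything Verizon published:

```python
# Back-of-the-envelope LTE peak rate for carrier aggregation.
# Assumptions (not from the article): 100 resource blocks per 20 MHz
# carrier, 12 subcarriers x 14 OFDM symbols per 1 ms subframe, two
# effective spatial layers averaged across carriers, and ~10% protocol
# overhead. Real LAA throughput depends on scheduling and interference.

def lte_peak_mbps(carriers, mhz_per_carrier=20, layers=2,
                  bits_per_symbol=8, overhead=0.10):
    resource_blocks = 100 * (mhz_per_carrier / 20)      # per carrier
    symbols_per_sec = resource_blocks * 12 * 14 * 1000  # per layer
    raw_bps = symbols_per_sec * bits_per_symbol * layers * carriers
    return raw_bps * (1 - overhead) / 1e6

# Four aggregated 20 MHz carriers (one licensed AWS + three 5 GHz):
print(round(lte_peak_mbps(4)))  # prints 968
```

Under those assumptions four aggregated carriers work out to roughly 968 Mbps, in the same ballpark as the demo; the real number depends on how many MIMO layers each carrier actually sustains.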

Verizon is starting to update selected cell sites with the technology this month. AT&T has announced that they will start introducing LAA technology with four-carrier aggregation by the end of this year. It’s important to note that there is a big difference between the Verizon test at 953 Mbps and what customers will actually achieve in the real world. Numerous factors will limit the benefits of the technology. First, there aren’t yet any handsets with the right antenna arrays, and it’s going to take a while to introduce them. These antennas look like they will be big power eaters, meaning that handsets that try to use this bandwidth all of the time will have short battery lives. There are also more practical limitations. One is distance: many customers will be out of range of the strongest LAA signals. A cellular company is also not going to try to make this full data connection using all four channels to one customer for several reasons, the primary one being the availability of the 5 GHz frequency.

And that’s where the real rub comes in with this technology. The FCC approved the use of this new technology last year, essentially giving the carriers access to the WiFi spectrum for free. The whole point of unlicensed spectrum is to provide data pipes for all of the many uses not served by the licensed wireless carriers. WiFi is clearly the most successful spectrum achievement of the FCC over the last few decades. Providing big data pipes for public use has spawned gigantic industries, and it’s hard to find a house these days without a WiFi router.

The cellular carriers have paid billions of dollars for spectrum that only they can use. The rest of the public uses a few bands of ‘free’ spectrum, and uses them very effectively. To allow the cellular carriers to dip into the WiFi spectrum runs the risk of killing that spectrum for all of the other uses. The FCC is supposedly requiring that the cellular carriers not grab the 5 GHz spectrum when it’s already busy. But to anybody that understands how WiFi works that seems like an inadequate protection, because any use of this spectrum causes interference by definition.

In practical use if a user can see three or more WiFi networks they experience interference, meaning that more than one network is trying to use the same channel at the same time. It is the nature of this interference that causes the most problems with WiFi performance. When two signals are both trying to use the same channel, the WiFi standard causes all competing devices to go quiet for a short period of time, and then both restart and try to grab an open channel. If the two signals continue to interfere with each other, the delay time between restarts increases exponentially in a phenomenon called backoff. As there are more and more collisions between competing networks, the backoff increases and the performance of all devices trying to use the spectrum decays. In a network experiencing backoff the data is transmitted in short bursts between the times that the connection starts and stops from the interference.
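The collision-and-backoff cycle described above can be sketched as a simple model. This is a simplification of the real 802.11 rules: the contention window sizes and the doubling-per-collision behavior are representative, and the slot counts here are abstract rather than actual microsecond slot times:

```python
import random

# Simplified sketch of 802.11-style binary exponential backoff.
# After each collision the contention window doubles (up to a cap),
# so a station's average wait before retrying grows exponentially
# as interference persists.

CW_MIN, CW_MAX = 16, 1024  # representative window sizes, in slots

def backoff_slots(collisions):
    """Pick a random wait from a window that doubles per collision."""
    cw = min(CW_MIN * (2 ** collisions), CW_MAX)
    return random.randint(0, cw - 1)

for n in range(6):
    window = min(CW_MIN * (2 ** n), CW_MAX)
    print(f"after {n} collision(s): window = {window} slots")
```

The doubling is why performance decays so quickly once networks start colliding: each failed attempt roughly doubles the expected idle time, producing exactly the bursty start-stop pattern described above.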

And this means that when the cellular companies use the 5 GHz spectrum they will be interfering with the other users of that frequency. That’s how WiFi was designed to work, and so the interference is unavoidable. Other WiFi users in the immediate area around an LAA transmitter will experience more interference, and the cellular users of the technology will also see a degraded WiFi signal, which is the reason they won’t get speeds even remotely close to Verizon’s demo speeds. But the spectrum is free for the cellular companies and they are going to use it, to the detriment of all of the other uses of the 5 GHz spectrum. With this decision the FCC might well have nullified the tremendous benefits that we’ve seen from the 5 GHz WiFi band.

Trends for Programmers

It’s always a good idea for anybody that offers a cable product to keep an eye on what is going on with the programmers. Probably the number one problem for small cable operators is the never-ending increase in the prices paid to buy programming. Here are some of the current trends that are going to impact the cost of buying cable programming over the next few years:

Subscriber Losses Continue. All of the major programmers are losing customers. Probably the most widely discussed is ESPN, which has lost 13 million customers since 2011. But it’s happening across the board to all of them, to a slightly lesser extent.

The loss of customers puts obvious earnings pressure on the programmers. They are now facing a classic Catch-22: if they try to make up for lost revenues by raising rates even faster, they are likely to lose customers even faster. It’s getting to be pretty clear that cable rate increases are the driving force behind a lot of cord-cutting. But probably even more important than cord-cutting is cord-shaving, where millions of customers are opting for smaller and less expensive channel line-ups. At this point cord-shaving is costing the programmers more losses than cord-cutting, but we don’t know the numbers since the big cable companies are not releasing statistics on cord-shaving.

Advertising Taking a Hit. We have also passed a crossover point: late last year we saw more advertising being done on the web than on TV. In the latest quarter we are finally starting to see real declines in TV advertising revenue, a far cry from the year-after-year growth in ad revenues the cable networks had come to expect. For years programmers were on a trajectory of healthy growth in both subscriber revenues and ad revenues, and both are starting to sink at the same time.

At this point the drop in advertising revenues is tiny, but it’s going to get worse as ad spending continues to shift online. And the ad dollars are not only dropping for the programmers; drops in advertising are also affecting local ad revenues for television stations and cable companies.

Ad revenues are sinking due to the shrinking number of ‘eyeballs’ watching cable programming. While cord-cutting is shaving the total number of cable subscribers, the more substantial issue is that people are spending more time watching Netflix and other OTT content at the expense of watching cable shows. This means that the ratings for most TV shows have been plummeting, taking with them the willingness of advertisers to pay premium rates for ad slots. There is also a big age shift, with younger viewers abandoning traditional cable programming at a much faster rate than older generations.

No Easy Shift to Streaming. A lot of programmers were counting on a shift to direct OTT content to help to reverse the shift in traditional TV viewers. For example, Disney / ESPN just announced that they will be offering online versions of their networks starting in early 2018.

But we also just saw NBC cancel its online offering Seeso. The service carried a significant amount of comedy programming, and NBC tried to lure customers to pay $3.99 per month for it. But they had very few takers, and that failure is probably scaring the rest of the industry. Recent surveys by Nielsen and others have shown that viewers care as much about the platform as they do about the content. That means they are only willing to buy a monthly subscription if they see value in staying on a given platform. The programmers are all hoping that people will be willing to pay a small fee to watch one or two favorite shows, but that doesn’t seem to be the case. People value a platform like Netflix where they can move from show to show without the hassle of logging in to multiple platforms. This doesn’t bode well for each programmer creating an individual platform consisting of the same content they show on traditional cable.

Pressure to Create More Content. Newcomers like Netflix, Amazon and Apple are spending billions annually to create new content. This is putting a lot of pressure on traditional cable networks to keep up, adding to their bottom-line costs. There is always the reward for the handful of hits that become must-watch shows, but most new content doesn’t generate enough revenue to cover the production costs.

What Does This Mean? All of these trends predict a poorer future for programmers. I think it means some of the following:

  • More mergers. We will probably see more mergers as a way to control costs. We are just now seeing the merger of Discovery and Scripps. But there were only seven major programmers before that merger, so there is only so much benefit that can be gained through mergers.
  • Faster rate increases. These are all publicly traded companies. They are going to try every avenue to maintain earnings, but in the face of dropping subscribers and flat ad revenues they will have little ultimate choice but to raise programming rates even faster. But they are also somewhat limited here, because most programming contracts with cable companies are signed on a three-year forward basis, and prices are already locked in for the next few years for most of their cable company customers.
  • Reduced expectations. Programmers have been some of the darlings of Wall Street for the last few decades. But as these new realities sink in there is going to have to be reduced stock prices for these companies as well as lowered expectations about their earnings potential. And in today’s stock-driven corporate world that is anathema. We may be seeing the first hints of an industry whose wheels are coming off.

Where’s the Top of the Broadband Market?

Last week I looked at the performance of the cable TV industry and today I’m taking a comparative look at broadband customers for all of the large ISPs in the country. Following are the results comparing the end of 2Q 2017 to 2Q 2016.

                    2Q 2017       2Q 2016       Change
Comcast          25,306,000    23,987,000    1,319,000     5.5%
Charter          23,318,000    21,815,000    1,503,000     6.9%
AT&T             15,686,000    15,641,000       45,000     0.3%
Verizon           6,988,000     7,014,000      (26,000)   -0.4%
CenturyLink       5,868,000     5,990,000     (122,000)   -2.0%
Cox               4,845,000     4,745,000      100,000     2.1%
Frontier          4,063,000     4,552,000     (489,000)  -10.7%
Altice            4,004,000     4,105,000     (101,000)   -2.5%
Mediacom          1,185,000     1,128,000       57,000     5.1%
Windstream        1,025,800     1,075,800      (50,000)   -4.6%
WOW                 727,600       725,700        1,900     0.3%
Cable ONE           521,724       508,317       13,407     2.6%
Fairpoint           307,100       311,440       (4,340)   -1.4%
Cincinnati Bell     304,193       296,700        7,493     2.5%
Total            94,149,417    91,894,957    2,254,460     2.5%
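The change and percentage columns are straightforward arithmetic on the two subscriber counts. A quick sketch using a few rows from the table above:

```python
# Recompute the change columns from the raw subscriber counts
# (2Q 2017 and 2Q 2016 figures taken from the table above).
subs = {
    "Comcast":  (25_306_000, 23_987_000),
    "Charter":  (23_318_000, 21_815_000),
    "Frontier": (4_063_000, 4_552_000),
}

for isp, (q2_2017, q2_2016) in subs.items():
    change = q2_2017 - q2_2016
    pct = 100 * change / q2_2016
    print(f"{isp:10} {change:+12,} {pct:+6.1f}%")
```

This reproduces the table’s figures, e.g. Frontier at -489,000 and -10.7%.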

All of these figures come from reports published each quarter by Leichtman Research Group. Just like with cable subscribers, these large companies control over 95% of the broadband market in the country, so looking at them provides a good picture of the whole broadband market. Not included in these numbers are the broadband customers of smaller ISPs, the subscribers of WISPs (wireless ISPs) and the customers of the various satellite services. It’s always been fuzzy how MDUs (multi-dwelling units) are counted in these numbers. The MDUs served by the major ISPs above are probably counted fairly well. But today there are numerous MDU owners who buy a large broadband pipe from a fiber provider and then give broadband to tenants. These customers are a growing demographic and are likely not counted accurately in these numbers.

One of the biggest stories here is that the overall market is still growing at a significant rate of almost 2.5% per year. A little over half of the growth is coming from sales of broadband to new housing units. In the last year, with a good economy the country added almost 1.5 million new living units. But there are obviously still other homes buying broadband for the first time.

There has been a debate for years in the country about where the broadband market will top out. Those that don’t have broadband today can be put into four basic categories: 1) those that can’t afford broadband, 2) those that don’t want it, 3) those that are happy with a substitute like cellular broadband, and 4) those who have zero broadband available, such as much of rural America.

It’s obvious that cable companies are outperforming the telcos: Comcast, Charter and Mediacom each gained more than 5% new broadband customers over the last year. Compared to recent years the telcos have largely held their own, except for Frontier, which had numerous problems during the year including a botched transition for the customers purchased from Verizon.

There are a number of industry trends that will be affecting broadband customers over the next few years:

  • We should start seeing rural customers getting broadband for the first time due to the FCC’s CAF II program. We are now in the third year of that program. The number of customers could be significant and CenturyLink estimates it will get at least a 60% penetration where it is expanding its DSL. I have seen reports from all over the country of fixed cellular wireless customers being connected by AT&T and Verizon.
  • The introduction of ‘unlimited’ cellular plans ought to make cellular broadband more attractive, at least to some demographics. While not really unlimited, the data caps of 20 GB or more per month are a huge increase over data caps from prior years.
  • There are almost a dozen companies that have filed requests with the FCC to launch new broadband satellites. The first major such launch was done recently by ViaSat, which will use the new satellite to beef up its Exede product. There’s no telling how many of the other FCC filings represent real satellites or just vaporware, but there should be more competition from satellites, particularly those that launch in low orbits to reduce the latency issue. The really big unknown is whether Elon Musk will be able to launch the massive satellite network he has promised.
  • Lifeline programs. Companies like Comcast and AT&T have quietly launched low-price broadband options for low-income homes. The companies don’t advertise the plans broadly, but there are communities where significant numbers of customers have been added to these programs.

Cable TV Numbers 2Q 2017


You can’t read an article about the cable industry without hearing about the erosion of customers due to cord cutting. So I thought I would take a look at the cable customers claimed by the largest cable companies at the end of the second quarters of 2016 and 2017.

                2Q 2016       2Q 2017        Change
Comcast      22,396,000    22,516,000       120,000     0.5%
DirecTV      20,454,000    20,856,000       402,000     2.0%
Charter      17,312,000    17,071,000      (241,000)   -1.4%
Dish         13,593,000    11,892,000    (1,701,000)  -12.5%
AT&T          4,869,000     4,666,000      (203,000)   -4.2%
Verizon       4,637,000     3,853,000      (784,000)  -16.9%
Cox           4,330,000     4,245,000       (85,000)   -2.0%
Altice        3,639,000     3,463,000      (176,000)   -4.8%
Frontier      1,340,000     1,007,000      (333,000)  -24.9%
Mediacom        842,000       829,000       (13,000)   -1.5%
WOW             524,300       458,200       (66,100)  -12.6%
Cable ONE       338,974       297,990       (40,984)  -12.1%
Total        94,275,274    91,154,190    (3,121,084)   -3.3%

These companies represent more than 95% of the whole TV market. According to Leichtman Research these companies together lost around 655,000 cable customers in the second quarter of this year.

What’s most striking about the above table is that the companies in aggregate lost 3.3%, or over 3.1 million customers, in the last year. One only has to go back two years to find the first instance of the industry losing customers, so these losses are recent. This is reminiscent of what happened to telephone landlines. The losses started very slowly, but then the rate of decline picked up year after year. There is no way to know if cable will take the same path or if the drop in customers will be slower. But I think everybody in the industry, from programmers to Wall Street, is concerned about losses of this magnitude.

Interestingly, for now the big cable companies are largely maintaining earnings due to rate increases for the remaining cable customers plus continued growth in broadband customers. I’ll have a blog next week looking at the state of broadband.

There are a few interesting things to note in these numbers:

  • The losses in the second quarter of 2017 are actually smaller than the losses from that same quarter of 2016. But the year-over-year losses are significantly more now than they were in the year ending with 2Q 2016.
  • Satellite TV is getting clobbered. While DirecTV is higher, that gain is offset to some extent by the loss of customers at parent AT&T, which is shifting customers to the satellite platform. Dish Network is the big loser. Much of its customer losses have been offset by Sling TV adding over a million customers during the last year. But it’s rumored in the industry that Sling TV is operating at almost no margin.
  • Comcast continues to buck the rest of the industry and saw a tiny gain of customers over the last year.
  • When looking at these numbers you must always remember that the industry lost customers while around 1.5 million new residential living units (homes and apartments) were built last year. The gains that these companies got from those new homes, probably at least 1 million new customers, are masked by the other losses, meaning that the industry lost over 4 million existing customers during the last year.
  • We know that the cable companies are continuing to take TV customers from the telcos, and there has to be some of that going on in these numbers.
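The ‘over 4 million’ figure in the bullets above is worth making explicit. The net loss comes from the table; the roughly 1 million customers gained from new construction is an estimate, so treat it as an assumption:

```python
# Rough arithmetic behind the masked-losses point above.
net_change = 91_154_190 - 94_275_274        # from the table: -3,121,084
new_home_gains = 1_000_000                  # assumed gains from new housing units
gross_losses = new_home_gains - net_change  # losses among pre-existing homes
print(f"net change: {net_change:,}")
print(f"estimated gross losses: {gross_losses:,}")
```

In other words, a net loss of about 3.1 million plus an assumed 1 million in masked gains implies roughly 4.1 million existing customers walked away.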

 

Merger Madness

The last year was a busy one for mergers in the industry. We saw Charter gobble up Time Warner Cable and Bright House Networks. We saw CenturyLink buy Level 3 Communications. But those mergers were nothing like we see on the horizon right now. I can barely read industry news these days without reading about some rumored gigantic acquisitions.

There have always been mergers in the industry, but I can’t remember a time when there was this level of merger talk. This might be due in part to an administration that says it won’t oppose megamergers. It’s also being driven by Wall Street, which makes a lot of money finding the financing for big mergers. Here are just a few of the mergers being discussed seriously in the financial press:

Crown Castle and Lightower. This merger is already underway with Crown Castle paying $7.1 billion for Lightower. It matches up two huge fiber networks along with tower assets to make the new company the major player in the small cell deployment space, particularly in the northeast.

Discovery and Scripps. Discovery Communications announced a deal to buy Scripps Networks for about $11.9 billion. This reduces the already-small number of major programmers and Discovery will be picking up networks like the Food Network, HGTV, Travel Channel, the Cooking Channel and Great American Country.

Comcast, Altice and Charter. Citigroup issued a report that speculates that Comcast and Altice would together buy Charter and split the assets. Comcast would gain the former Time Warner cable systems with the rest going to Altice. There is also talk of Altice trying to finance the purchase of Charter on its own. But with Charter valued at about $120 billion while also carrying around $63 billion in debt that seems like a huge number to finance. This would be an amazing merger with the ink not yet dry on Charter’s merger with Time Warner.

Amazon and Dish Network. This makes sense because Amazon could finally help Dish capitalize on its 700 MHz E-block and AWS-4 spectrum licenses. This network could be leveraged by Amazon to track trucks and packages, monitor IoT devices and control drones.

T-Mobile and Sprint. Deutsche Telekom currently owns 63% of T-Mobile and Softbank owns 82% of Sprint. A straight cashless merger would create an instantly larger company and gain major operational advantages. The FCC and the Justice Department nixed a merger between T-Mobile and AT&T a few years back, but in an environment where the cable companies are getting into the wireless business this might sail through a lot more easily today. Sprint has also been negotiating either a merger or some sort of partnership with Comcast and Charter.

Comcast and Verizon. There is also Wall Street speculation about Comcast buying Verizon. The big advantage would be to merge the Comcast networks with the Verizon Wireless assets. Comcast has a history of buying companies in distress and Verizon’s stock price has dipped 17% already this year. But this would still be a gigantic merger worth as much as $215 billion. There are also some major regulatory hurdles to overcome with the big overlap in the northeast between Comcast and the Verizon FiOS networks.

FCC Takes a New Look at 900 MHz

The FCC continues its examination of the best use of spectrum and released a Notice of Inquiry on August 4 looking at the 900 MHz band of spectrum. They want to know if there is some better way to use the spectrum block. They are specifically looking at the spectrum between 896-901 MHz and 935-940 MHz.

The FCC first looked at this frequency in 1986, and the world has changed drastically since then. The frequency is currently divided into 399 narrowband channels grouped into 10-channel blocks. The licensed use of the spectrum varies by MTA (Major Trading Area), with channels allocated according to local demand from commercial users.

One of the more common uses of the spectrum is SMR service (Specialized Mobile Radio), the frequency that’s been used in taxis and other vehicle fleets for many years. The other use is more commonly referred to as B/ILT (Business/Industrial Land Transportation). This supports radios in work fleets and is used widely to monitor and control equipment (such as monitoring water pumps in a municipal water system). The frequency was also widely used historically for public safety / police networks using push-to-talk walkie-talkies (although cellphones have largely taken over that function).

The FCC currently identifies 2,700 sites used by 500 licensees in the country that are still using B/ILT radios and technologies. These uses include security at nuclear power plants including public alert notifications, flood warning systems, smart grid monitoring for electric networks, and for monitoring petroleum refineries and natural gas distribution systems.

But we live in a bandwidth hungry world. One of the characteristics of this spectrum is that it’s largely local in nature (good for distances of up to a few miles, at most). When mapping the current uses of the frequency it’s clear that there are large portions of the country where the spectrum is not being used. And this has prompted the FCC to ask if there is a better use of the spectrum.

The FCC typically finds ways to accommodate existing users, and regardless of any changes made it’s unlikely that they are going to cut off use of the spectrum in nuclear plants, electric grids and water systems. But to a large degree the spectrum is being underutilized. Many of the older uses of the spectrum such as walkie-talkies and push-to-talk radios have been supplanted by newer technologies using other spectrum. With that said, there are still some places where old radios of this type are in use.

The FCC’s action was prompted by a joint proposal from the Enterprise Wireless Alliance (EWA) and Pacific DataVision (PDV). The petition asks for the frequency to be realigned into a pair of 3 MHz bands that can be used for wireless broadband and a pair of 2 MHz bands that would continue to support the current narrowband uses of the spectrum. They propose that the broadband channels be auctioned to a single user in each BTA but that the narrowband uses continue to be licensed upon request in the same manner as today.
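The band math in the proposal is easy to sanity-check. The 12.5 kHz channel width below is my assumption based on typical narrowband land-mobile channelization; it isn’t stated in the text:

```python
# Sanity-check the 900 MHz band plan described above.
band_khz = (901 - 896) * 1000   # 5 MHz in each direction (896-901 / 935-940)
narrowband_khz = 399 * 12.5     # 399 channels at an assumed 12.5 kHz each
realigned_khz = (3 + 2) * 1000  # proposed 3 MHz broadband + 2 MHz narrowband
print(band_khz, narrowband_khz, realigned_khz)  # 5000 4987.5 5000
```

The 399 narrowband channels almost exactly fill the 5 MHz available in each direction, which is why a 3 MHz broadband segment plus a 2 MHz narrowband segment accounts for the whole band.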

This docket is a perfect example of the complexities that the FCC always has to deal with in changing the way we use spectrum. The big question that always has to be addressed is what to do with existing users of the spectrum. Any new allocation plan is going to cause many existing users to relocate within the 900 MHz block or to spectrum elsewhere. It’s generally been the practice of the FCC to make new users of spectrum pay to relocate the older uses that must be moved. And so the FCC must make a judgment call about whether it makes monetary sense to force relocation.

The FCC also has to always deal with technical issues like interference. Changing the way the spectrum will be used from numerous narrowband channels to a few wideband channels is going to change the interference patterns with other nearby spectrum. And so the FCC must make a determination of the likelihood of a spectrum change not causing more problems than it solves.

This particular band is probably one of the simpler such tasks the FCC can tackle. While the users of the spectrum perform critical tasks with the current spectrum, there is not an unmanageable number of current users and there are also large swaths of the US that have no use at all. But still, the FCC does not want to interfere with the performance at nuclear plants, petroleum refineries or electric grids.

For anybody that wants to read more about how the FCC looks at spectrum, here is the FCC Docket 17-200. The first thing you will notice is that this document, like most FCC documents dealing with wireless spectrum, is among the most jargon-heavy the agency produces. But when talking about spectrum the jargon is useful, because the discussion must be precise. The docket is a good primer on the complications involved in changing the way we use spectrum. There has been a recent clamor from Congress to free up more spectrum for cellular broadband, but this docket is a good example of how complex an undertaking that can be.

Big ISPs Want to be Regulated

I’ve always contended that the big ISPs, regardless of their public howling, want to be regulated. It is the nature of any company that is regulated to complain about regulation. For the last decade as AT&T and Verizon made the biggest telecom profits ever they have released press release after press release decrying how regulation was breaking their backs. The big telcos and cable companies spent the last few years declaring loudly that Title II regulation was killing incentives to make investments, while spending record money on capital.

A few months ago Comcast, Charter, and Cox filed an amicus brief in a lawsuit making its way through the U.S. Court of Appeals for the Ninth Circuit. In that brief they asked the federal appeals court to restore the Federal Trade Commission’s jurisdiction over AT&T. The case under review involved deceptive AT&T marketing practices from when the company originally offered unlimited cellular data plans. It turns out that AT&T throttled customer speeds once customers passed the meager threshold of 3 – 5 GB per month.

In 2014 the FTC sued AT&T over the practice, and that’s the case now under appeal. It’s a bit extraordinary to see big ISPs siding with the government against another ISP, and the only plausible explanation for the brief is that these companies want a stable regulatory environment. In the brief the cable companies expressed the desire to “reinstate a predictable, uniform, and technology-neutral regulatory framework that will best serve consumers and businesses alike.”

That one sentence sums up very well the real benefit of regulation to big companies. As much as they might hate to be regulated, they absolutely hate making huge investments in new product lines in an uncertain regulatory environment. When a big ISP knows the rules, they can plan accordingly.

One scenario that scares the big ISPs is living in an environment where regulations can easily change. That’s where we find ourselves today. It’s clear that the current FCC and Congress are planning on drastically reducing the ‘regulatory burden’ for the big ISPs. That sounds like an ideal situation for the ISPs, but it’s not. It’s clear that a lot of the regulations are being changed for political purposes and big companies well understand that the political pendulum swings back and forth. They dread having regulations that change with each new administration.

We only have to go back a few decades to see this in action. The FCC got into and then back out of the business of regulating cable TV rates several times in the late 1970s and the 1980s. This created massive havoc for the cable industry. It created uncertainty, which hurt their stock prices and made it harder for them to raise money to expand. The cable industry didn’t become stable and successful until Congress finally passed several pieces of cable legislation to stop these regulatory swings.

Big companies are also not fond of being totally deregulated. That is the basis for the amicus brief in the AT&T case. The big ISPs would rather be regulated by the FTC than be unregulated. The FTC might occasionally slap them with big fines, but the big companies are smart enough to know they have more exposure without regulation. If the FTC punishes AT&T for its marketing practices, that’s the end of the story. The alternative is for AT&T to fend off huge class action lawsuits seeking damages far larger than anything the FTC would impose. There is an underlying safety net in being regulated, and the big ISPs understand and can quantify the risk of engaging in bad business practices.

In effect, as much as they say they hate being regulated, big companies like the safety of hiding behind regulators who protect them as much as they protect the public. It’s that safety net that allows a big ISP to invest billions of capital dollars.

I really don’t think the FCC is doing the big ISPs any favors if it eliminates Title II regulation. Almost every big ISP has said publicly that they are not particularly bothered by the general principles of net neutrality – and I largely believe them. Once those rules were put into place the big companies made plans based upon them. The big ISPs did fear that some future FCC might use Title II rules to impose rate regulation – much like the past disaster with the cable companies. But overall the regulation gives them a framework to safely invest in the future.

I have no doubt that the political pendulum will eventually swing the other way – because it always does. And when we next get a Democratic administration and Congress, we are likely to see many of the regulations being killed by the current FCC put back into place by a future one. That’s the nightmare scenario for a big ISP – to find that they have invested in a business line that might be frowned upon by future regulators.

FirstNet – A Boon or Boondoggle?

The federal program FirstNet was born out of the tragedy of the 9/11 terrorist attacks. At the time there was a lot of negative press when it was realized that first responders from New Jersey were unable to communicate with those from New York. And the idea was born to create a nationwide platform so that all first responders could easily communicate with each other.

FirstNet first tackled interoperability. There were a number of jurisdictions where interoperability was an issue then, but since 9/11 most metropolitan areas have solved the problem on their own. The fire and police departments in regions got together in the years after 9/11 and made sure they could communicate with each other. One of the easiest fixes was for first responders to add cellphones to complement the radios that were the major method of communications in 2001. So the concept morphed into a discussion of finding cellular bandwidth for first responders. We’ve seen repeatedly that local cellular networks instantly get clogged during any kind of major emergency, which means that first responders have trouble making cellphone connections just like everybody else.

Congress stepped into the discussion in 2012 and created FirstNet (First Responder Network Authority). As part of that action Congress set aside Band 14 of the 700 MHz spectrum for the exclusive use of first responders nationwide. After several reboots of the RFP process the new agency finally chose AT&T to provide a nationwide LTE network for first responders. The company was given $7 billion as the first payment towards creating the nationwide cellular network. The GAO had estimated that the final network could cost as much as $47 billion.

States were given the right to opt in to FirstNet at zero cost to the states. In the last month or so thirteen states have chosen to be part of the effort. That means that AT&T will provide the network in those states using federal dollars.

But there is a huge question, mostly technical, of whether this network makes any sense. A lot of engineers say that FirstNet is overkill and that there are now other ways to solve the same problem. A hint of how easily this can be done came from a press release from Kansas, which just bought into FirstNet. In that release AT&T said that until FirstNet is built in the state, first responders would immediately get priority access to cell towers, and by the end of this year would have preemptive access – meaning that a call attempt made by a first responder would shove somebody else off the cellular network. Providing preemptive access is a far less costly way of solving the problem. If first responders can be given preemptive access that easily, then there is really no longer a need for FirstNet.
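Preemptive access boils down to a simple admission policy: when a tower is at capacity, a first-responder call bumps an ordinary call instead of being blocked. The sketch below is a toy model of that idea – the class and method names are invented for illustration and don’t reflect any carrier’s actual admission-control logic.

```python
import heapq

class TowerChannelPool:
    """Toy model of channel assignment with preemptive access.

    Priority 0 = first responder, priority 1 = ordinary user.
    Calls are kept in a min-heap keyed on -priority, so the
    lowest-priority active call sits at the root, ready to be bumped.
    """

    def __init__(self, channels):
        self.channels = channels
        self.active = []  # heap of (-priority, call_id)

    def request(self, call_id, priority):
        """Try to admit a call. Returns (admitted, preempted_call_id)."""
        if len(self.active) < self.channels:
            heapq.heappush(self.active, (-priority, call_id))
            return True, None
        # Tower is full: a higher-priority caller bumps the worst active call
        worst_neg_priority, worst_id = self.active[0]
        if priority < -worst_neg_priority:
            heapq.heapreplace(self.active, (-priority, call_id))
            return True, worst_id   # this call was shoved off the network
        return False, None          # ordinary call is simply blocked
```

With a two-channel tower full of ordinary calls, a first-responder request is admitted by dropping one of them, while a further ordinary request is blocked – which is the behavior AT&T described for Kansas.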

To add fuel to the fire, Verizon announced at the end of last week that it would offer these same services to first responders everywhere – with zero federal dollars. Verizon will immediately offer preemptive access to cell towers to all first responders.

Any topic having to do with first responders is always an emotional one and much of the first responder community has bought into the concept of having interference-free spectrum. But the Verizon announcement shows that the FirstNet solution is obsolete before the first piece of network is constructed.

And the FirstNet implementation comes with a big national cost. It’s clear that we need a huge amount of bandwidth to satisfy customer demands for cellular data. It seems wasteful to use a slice of prime spectrum in Band 14 of 700 MHz when it’s not needed. That spectrum is worth more to the country for providing cellular data than for handling calls from first responders. This would not be true if first responders really needed this spectrum to communicate – but the cellular companies can give them preemptive access using existing cellular spectrum. For the vast majority of time the FirstNet spectrum will sit virtually unused – at any given time in a city it might be handling hundreds of transmissions from first responders when it could instead be handling hundreds of thousands of transmissions for everybody.

There is also the rural issue to deal with. FirstNet is supposed to provide nationwide first responder access. But as somebody who travels widely in rural America, I can tell you that a lot of the AT&T LTE coverage map is bosh. There is a whole lot of rural America where cell coverage is either spotty or non-existent. When you get to a rural place you quickly come to understand the short distance that a cell signal travels from any given cellular tower. There are gaps everywhere in rural America between widely-spaced cell towers.

First responders in rural America are not going to rely on the FirstNet spectrum even if it’s freely available to them. They are more likely to keep their current radio networks that work today, using spectrum that travels farther than the 700 MHz spectrum. I can’t help but picture a rural tragedy, such as a downed plane, where first responders from outside the area will have no communication ability if the FirstNet signal in the needed area is weak or nonexistent.
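The propagation point can be made concrete with the standard free-space path loss formula, which shows why lower-band radio carries farther than 700 MHz. The 150 MHz figure below is just an illustrative VHF public-safety band, not a claim about any specific rural system.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (standard formula: d in km, f in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare loss at 10 km: 700 MHz LTE vs an illustrative 150 MHz VHF band
delta = fspl_db(10, 700) - fspl_db(10, 150)
```

At any distance the 700 MHz signal suffers about 13 dB more free-space loss than a 150 MHz signal – before even accounting for terrain and foliage, which also penalize higher frequencies.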

I see this as another giant government handout to the huge carriers. You can be assured that a lot of the money going to AT&T will go to their bottom line. I hope, at least, that some of the money they are getting for FirstNet will improve normal cellular coverage in rural America – but I’m not holding my breath. To me this seems like another big federal program being spent to fix a problem that no longer exists. Local jurisdictions solved the interoperability problem in the first few years after 9/11. And the ability of cellular companies to give preemptive access to first responders means there is no reason to set aside a huge, valuable slice of spectrum.


OTT News – August 2017


It’s been a busy time in the OTT market with players coming and going and the choices available to customers growing more complicated and confusing.  Here are some of the bigger recent events in the industry.

Continued Cord Cutting. The major cable providers lost 946,000 cable customers in the second quarter – the worst quarterly loss ever. This puts cord cutting at an annual loss rate of 2.7% of customers, up from only 1% a year ago. It’s obvious that cord cutting is picking up momentum, and the wide variety of OTT viewing has to be a contributor. Nielsen recently reported that 62% of homes now watch OTT content at least occasionally.
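For context, here is the simple arithmetic behind annualizing a quarterly subscriber loss. The 140-million subscriber base below is a hypothetical figure chosen only to illustrate the calculation, not a reported industry number.

```python
def annualized_loss_rate(quarterly_loss, subscriber_base):
    """Annualize a quarterly subscriber loss as a fraction of the base.

    Simple linear annualization: four quarters at the same absolute
    loss, divided by the current subscriber base.
    """
    return (quarterly_loss * 4) / subscriber_base

# 946,000 lost in one quarter against a hypothetical 140M-subscriber base
rate = annualized_loss_rate(946_000, 140_000_000)  # a rate of roughly 2.7%
```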

It’s getting harder for analysts to count cable customers. For example, Dish Network is not reporting the specific performance of its satellite service versus SlingTV. The losses for the quarter were also eased a bit by the fact that Charter began counting seasonal customers even when they go dormant, such as snowbirds in Florida who subscribe only in the winter but keep their accounts active.

ESPN / Disney OTT Offering. Disney announced that it would be launching two new OTT offerings in 2019 – a standalone ESPN offering and a standalone Disney offering. Along with this, the company announced it will be withdrawing Disney content from Netflix. The ESPN offering will not duplicate the cable version of the network and will not include things like the NFL and NBA. But it will include Major League Baseball, the NHL, Major League Soccer, Grand Slam tennis events and college sports. Analysts think this offering is mandatory since ESPN has lost 13 million subscribers since 2011 and advertising revenues dropped 8% last quarter.

The standalone Disney offering is also interesting in that the company has decided to take Netflix on head-to-head. Because of contractual arrangements Netflix will still have access to content produced by Disney such as the numerous shows produced by Disney’s Marvel Studios. But starting in 2019 Disney is going to make new content only available on their own platform. This prompted Netflix to purchase Millarworld, a major comics producer.

NBC Closing Seeso. NBCUniversal says that it will be ending the Seeso OTT offering later this year. This is an offering that consisted largely of NBC comedy and related entertainment such as Saturday Night Live and The Tonight Show Starring Jimmy Fallon.

This failure is a big warning to the many cable networks that have been contemplating the strategy of shoving existing content online. Industry analysts say that simply taking linear content online is not a recipe for success. The platform seems to be just as important as the content; the bigger platforms like Netflix keep customers engaged, enabling them to move from show to show without leaving the platform. It’s too easy for a customer to leave a limited-offering platform, which diminishes the perceived value of a subscription.

Facebook OTT Offering. Facebook has announced the launch of Watch, an OTT service that will include content from A&E, Univision, Major League Baseball and other content such as worldwide soccer. For now the new service is being launched overseas with some limited US trials, but is expected to hit the whole US market later this year.

The offering is being structured like YouTube to enable content creators to launch their own channels. Facebook is currently funding some content providers to seed content on the new service. They are hoping that within time the platform becomes self-sustaining and can be an alternative to the wildly popular YouTube. Facebook is counting on their ability to lure enough of their billion plus users to the new platform to make it a success. The company’s goal is to keep people on their platform for more than just social networking.

Apple. Apple will be entering the OTT world and has announced it will spend $1 billion on creating programming content over the next year. This puts it in rarefied company with Netflix, which is spending $6 billion, Amazon at $4.5 billion and HBO at $2 billion. There is no news yet on the nature or timing of an Apple OTT offering.