The Trajectory of Cord Cutting

2017 was the year that cord cutting became a real phenomenon. The industry has been talking about cord cutting for around five years. In the beginning the phenomenon manifested as a slowing and stalling of the growth of cable subscribers. A few years ago many industry pundits opined that cord cutting was a minor phenomenon because they believed that people couldn’t walk away from their favorite programming.

But in 2016 the industry as a whole lost a million customers. That sounds like a lot, but in an industry with roughly 90 million customers the hope was that cord cutting would take decades to have any major bottom-line impact. 2017 then saw a loss of 2.4 million customers, and the whole industry now agrees that cord cutting is real and that it is accelerating.

The big question now is the future trajectory for cord cutting – how is this going to affect the industry over the next five years? We have past experience from watching another major telecom product take a nose dive. Back in the mid-1990s almost 99% of US homes had landlines. Today that number is down to just under 44% according to surveys done annually by the Centers for Disease Control and Prevention (CDC). The agency has been asking about landline penetration as part of a much broader survey for several decades.

I would venture to say that hardly anybody in the industry can easily tell you how fast we have been losing landlines. It’s something we all know about, but I know I had no idea about the rate of decline of landlines since the 1990s.

Just like with cable TV, in the early years the rate of landline loss was relatively slow. I remember being asked about landline losses in 1997, the year I started CCG Consulting. At that time the industry was losing around 1 million customers per year. But a lot of prognosticators predicted that sales of landlines would collapse since everybody was going to change to cellphones.

But the CDC statistics tell a different story. Those statistics show that by 2004 the industry still had a 93% market penetration. Since then there has been a steady decline of landlines that makes an almost straight-line graph down to today’s penetration rate of 44%. I doubt that there were any industry experts in 2004 who would have predicted that there would still be a 44% penetration of landlines in 2017. During the 13-year period from 2004 to 2017 roughly 4.5 million households dropped landlines each year. The rate of loss is neither accelerating nor slowing.
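Those numbers hang together on a back-of-the-envelope check. The household count below is my own rough assumption (roughly 120 million US households), not a figure from the CDC survey:

```python
# Sanity check: annual landline losses implied by the CDC penetration figures.
HOUSEHOLDS = 120_000_000          # assumed rough count of US households
START_PEN, END_PEN = 0.93, 0.44   # penetration in 2004 and 2017
YEARS = 2017 - 2004               # 13 years

lines_lost = HOUSEHOLDS * (START_PEN - END_PEN)
per_year = lines_lost / YEARS
print(f"{per_year / 1e6:.1f} million households dropped landlines per year")
# ≈ 4.5 million per year, matching the figure cited above
```

The striking part is that the loss is linear, not exponential, which is exactly what makes straight-line projections of cord cutting so uncertain.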

There is no reason to think that the decline of cable TV will happen in the identical fashion. But for the first five years of customer losses the two industries have nearly the same story. Losses started slowly, and even after five years the rate of loss of cable customers is still half of the annual loss of landlines.

The industries are also different. In telecom the two biggest phone companies at the beginning of the decline of landlines were Verizon and AT&T and they also have been the biggest beneficiaries of the growth of the cellphones that replace landlines. Both companies are larger and far more profitable now than they were in the mid-90s. We are unlikely to see the same thing happening in cable. It appears that cord cutters are fleeing to a wide array of programming alternatives – and most of those alternatives are not owned by the same companies that have been profiting from cable TV.

The cable companies are clearly losing customers and revenues. The two satellite TV companies alone lost 1.7 million customers just in 2017. Continued losses of that magnitude are going to quickly affect some of the biggest cable providers. The programmers are also losing paying customers at a rapid clip. When households flee to online video providers they replace traditional 200-channel lineups with much smaller ones, meaning that a lot of individual cable networks are bleeding customers.

What might make the difference between cable and landline industries is the way the industry is reacting to the losses. In the landline world we saw the emergence of lower-cost alternatives to telco landlines as the cable companies got into the business. Even today cable landlines mostly cost less than telco landlines. I would have to think that the ability for customers to cut costs helped to stave off landline losses.

But the cable industry seems to be reacting by raising rates even faster than historically. It looks like the programmers want to get as much money as possible out of the industry before it disappears. That mentality is pushing up cable rates faster than ever, and high prices seem to be the major motivation behind cord cutting. My guess is that if the cable industry stays on its current trajectory it’s going to lose customers far faster than the historic drop in landlines. But my crystal ball is no better than anybody else’s, so like everybody else I’ll keep watching the statistics.

Is the FCC Disguising the Rural Broadband Problem?

Buried within the FCC’s February Broadband Deployment Report are some tables that imply that over 95% of American homes can now get broadband at speeds of at least 25/3 Mbps. That is drastically higher than the report from just a year earlier. The big change is that the FCC is now counting fixed wireless and satellite broadband when compiling the numbers. This leads me to ask: is the FCC purposefully disguising the miserable condition of rural broadband?

I want to start with some examples from this FCC map that derives from the data supporting the FCC’s annual report. I started with some counties in Minnesota that I’m familiar with. The FCC database and map claims that Chippewa, Lyon, Mille Lacs and Pope Counties in Minnesota all have 100% coverage of 25/3 broadband. They also claim that Yellow Medicine County has 99.59% coverage of 25/3 Mbps broadband and the folks there must be wondering who is in that tiny percentage without broadband.

The facts on the ground tell a different story. In real life, the areas of these counties served by the incumbent telcos CenturyLink and Frontier have little or no broadband outside of towns. Within a short distance from each town and throughout the rural areas of the county there is no good broadband to speak of – certainly not anything that approaches 25/3 Mbps. I’d love to hear from others who look at this map to see if it tells the truth about where you live.

Let me start with the FCC’s decision to include satellite broadband in the numbers. When you go to the rural areas in these counties practically nobody buys satellite broadband. Many tried it years ago and found using it to be a miserable experience. There are a few satellite plans that offer speeds as fast as 25/3 Mbps. But satellite broadband today has terrible latency, as high as 900 milliseconds, and anything over 100 milliseconds makes it hard or impossible to do any real-time computing. That means that on satellite broadband you can’t stream video. You can’t have a Skype call. You can’t connect to a corporate WAN and work from home or connect to online classes. You will have problems staying on many web shopping sites. You can’t even make a VoIP call.
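That latency isn’t an engineering shortcoming that better satellites will fix; it’s physics. A geostationary satellite sits roughly 35,786 km above the equator, and a round trip traverses that distance four times (up and down in each direction). A short sketch of the math:

```python
# Minimum round-trip latency to a geostationary satellite, from physics alone.
C_KM_PER_S = 299_792.458   # speed of light in a vacuum, km/s
GEO_ALTITUDE_KM = 35_786   # altitude of geostationary orbit

# user -> satellite -> ground gateway -> satellite -> user = 4 legs
round_trip_km = 4 * GEO_ALTITUDE_KM
latency_ms = round_trip_km / C_KM_PER_S * 1000
print(f"propagation delay alone: {latency_ms:.0f} ms")  # ~477 ms
```

Add network processing, queuing and error correction on top of that 477-millisecond floor and you reach the 900-millisecond figures customers actually experience.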

Satellite broadband also has stingy data caps that make it impossible to use as a home broadband connection. Most of the plans come with monthly data caps of 10 GB to 20 GB, and unlike cellular plans where you can buy additional data, the satellite plans cut you off for the rest of the month when you hit your cap. And even with all of these problems, satellite broadband is expensive and is priced higher than landline broadband. Rural customers have voted with their pocketbooks: satellite is not broadband that many people are willing to tolerate.
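To see just how stingy those caps are, assume HD video streaming consumes roughly 3 GB per hour (a commonly cited rule of thumb, not a figure from any satellite provider):

```python
# How far a satellite data cap stretches for a streaming household.
CAP_GB = 10            # low end of the typical satellite caps cited above
HD_GB_PER_HOUR = 3     # assumed HD streaming consumption per hour

hours_per_month = CAP_GB / HD_GB_PER_HOUR
print(f"a {CAP_GB} GB cap buys about {hours_per_month:.1f} hours of HD video a month")
```

That works out to one movie and change per month, which is why nobody seriously treats satellite as a substitute for home broadband.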

Fixed wireless is a more mixed bag. There are high-quality fixed wireless providers who are delivering speeds as fast as 100 Mbps. But as I’ve written about, most rural fixed broadband delivers speeds far below this and the more typical fixed wireless connection is somewhere between 2 Mbps and 6 Mbps.

There are a number of factors needed to make a quality fixed wireless connection. First, the technology must be only a few years old, because older radios were not capable of reaching 25/3 speeds. Customers also need a clear line-of-sight back to the transmitter and must be within a reasonable distance from a tower. This means that there is usually a significant number of homes in wireless service areas that can’t get any coverage due to trees or being behind a hill. Finally, and probably most importantly, the wireless provider needs a properly designed network and a solid backhaul data pipe. Many WISPs pack too many customers onto a tower and dilute the broadband. Many wireless towers are fed by multi-hop wireless backhaul, meaning the tower doesn’t have enough raw bandwidth to deliver a robust customer product.

In the FCC’s defense, most of the data about fixed wireless that feeds the database and map is self-reported by the WISPs. I am personally a big fan of fixed wireless when it’s done right and I was a WISP customer for nine years. But there are a lot of WISPs who exaggerate in their marketing literature and tell customers they sell broadband up to 25/3 Mbps when their actual product might only be a tiny fraction of those speeds. I have no doubt that these WISPs also report those marketing speeds to the FCC, which leads to the errors in the maps.

The FCC should know better. In the counties listed above I would venture to say that practically no households can get a 25/3 fixed wireless connection, though there are undoubtedly a few. I know people in these counties gave up on satellite broadband many years ago. My conclusion from the new data is that this FCC has elected to disguise the facts by claiming that households have broadband when they don’t. This is how the FCC lets itself off the hook for fixing the rural broadband shortages that exist in most of rural America. We can’t fix a problem that we won’t even officially acknowledge, and this FCC, for some reason, is masking the truth.

SDN Finally Comes to Telecom

For years we’ve heard that Software Defined Networking (SDN) is coming to telecom. There has been some movement in routing on long-haul fiber routes, but mostly this network concept has not been used in telecom networks.

AT&T just announced the first major telecom deployment of SDN. They will be introducing more than 60,000 ‘white box’ routers into their cellular networks. White box means that the routers are essentially blank generic hardware that comes with no software or operating system. This differs from the normal routers from companies like Cisco that come with a full suite of software that defines how the box will function. In fact, in a traditional router the software accounts for far more of the cost than the hardware.

AT&T will now be buying low-cost hardware and will load their own software onto the boxes. This is not a new concept, and the big data center companies like Facebook and Google have been doing this for several years. SDN lets a provider load only the software needed to support just the functions they use. The data center providers say that simplifying the software saves them a fortune in power and air conditioning costs since the routers are far more efficient.

AT&T is a little late to the game compared to the big web companies, and it’s probably taken them a lot longer to develop their own proprietary suite of cell site software since it’s a lot more complicated than switches in a big data center. They wouldn’t want to hand their cell sites over to new software until it’s been tested hard in a variety of environments.

This move will save AT&T a lot of money over time. There’s the obvious savings on the white box routers. But the real savings is in efficiency. AT&T has a fleet of employees and contractors whose sole function is to upgrade cell sites. If you’ve followed the company you’ve seen that it takes them a while to introduce upgrades into their networks, since technicians often have to visit every cell site, each with different generations of hardware and software.

The company will still need to visit cell sites to make hardware changes, but the promise of SDN is that software changes can be implemented across the whole network in a short period of time. This means they can fix security flaws or introduce new features quickly. They will have a far more homogeneous network where cell sites use the same generations of hardware and software, which should reduce glitches and local problems. The company will save a lot on labor and contractor costs.

This isn’t good news for the rest of the industry. This means that Cisco and other router makers are going to sell far fewer telecom-specific routers. The smaller companies in the country have always ridden the coattails of AT&T and Verizon, whose purchase of switches and routers pulled down the cost of these boxes for everybody else. These big companies also pushed the switch manufacturers to constantly improve their equipment, and the volume of boxes sold justified the router manufacturers to do the needed R&D.

You might think that smaller carriers could also buy white box routers to save money. This looks particularly attractive since AT&T is developing some of the software collaboratively with other carriers and making the generic software available to everybody. But the generic base software is not the same software that will run AT&T’s new boxes. They’ve undoubtedly sunk tens of millions into customizing the software further. Smaller carriers won’t have the resources to customize this software to make it fully functional.

This change will ripple through the industry in other ways. For years companies often hired technicians who had Cisco certification on various types of equipment, knowing that they understood the basics of how the software could be operated. But as Cisco and other routers are edged out of the industry there are going to be far fewer jobs for those who are Cisco certified. I saw an article a few years ago that predicted that SDN would decimate the technician work force by eliminating a huge percentage of jobs over time. AT&T will need surprisingly few engineers and techs at a central hub now to update their whole network.

We’ve known this change has been coming for five years, but now the first wave of it is here. SDN will be one of the biggest transformational technologies we’ve seen in years – it will make the big carriers nimble, something they have never been. And they are going to make it harder over time for all of the smaller carriers that compete with them – something AT&T doesn’t mind in the least.

The Demand for Upload Speeds

I was recently at a public meeting about broadband in Davis, California and got a good reminder of why upload speeds are as important to a community as download speeds. One of the people making public comments talked about how uploading was essential to his household and how the current broadband products on the market were not sufficient for his family.

This man needed good upload speeds for several reasons. First, he works as a photographer, taking pictures and shooting videos. He says that it takes hours to upload and send raw, uncompressed video to a customer and that the experience still feels like the dial-up days. Second, his full-time job is as a network security consultant for a company that specializes in big data. As such he needs to send and receive large files, and his home upload bandwidth is also inadequate for that, forcing him to go to an office for work that could otherwise be done from home. Finally, his daughter creates YouTube content and has the same problem uploading, which is particularly painful when her content deals with time-sensitive current events and waiting four hours to get it to YouTube kills its timeliness.

This family is not unusual any more. A decade ago, a photographer led the community effort to get faster broadband in a city I was working with. But he was the only one asking for faster upload speeds and most homes didn’t care about it.

Today a lot of homes need faster upload speeds. This particular family had numerous reasons including working from home, sending large data files and posting original content to the web. But these aren’t the only uses for faster upload speeds. Gamers now need faster upload speeds. Anybody who wants to remotely check their home security cameras cares about upload speeds. And more and more people are migrating to 2-way video communications, which requires those at both ends to have decent uploading. We are just now seeing the early trials of virtual presence where communications will be by big-bandwidth virtual holograms at each end of the communications.

Davis is like many urban areas in that the broadband products available have slow upload speeds. Comcast is the cable incumbent, and while they recently introduced a gigabit download product, their upload speeds are still paltry. DSL is offered by AT&T which has even slower upload speeds.

Technologies differ in their ability to offer upload speeds. For instance, DSL is technically capable of sending data at the same speed upstream and downstream. But DSL providers have elected to stress download speed, which is what most people value, so DSL products are set with a small upload and a lot of download. It would be possible to give customers the choice to vary the mix between upload and download speeds, but I’ve never heard of an ISP that offers this as an option.

Cable modems are a different story. Historically the small upload speeds were baked directly into the DOCSIS standard. When CableLabs created DOCSIS they made upload speeds small in response to what cable companies asked of them. Until recently, cable companies had no option to increase upload speeds beyond the DOCSIS constraints. But CableLabs recently amended the DOCSIS 3.1 standard to allow for much faster upload speeds of nearly a gigabit. The first release of DOCSIS 3.1 didn’t include this, but it’s now available.

However, a cable company has to make sacrifices in their network if they want to offer faster uploads. It takes about 24 empty channels (meaning no TV signal) on a cable system to provide gigabit download speeds. A cable company would need to vacate many more channels of programming to also offer faster uploads and I don’t think many of them will elect to do so. Programming is still king and cable owners need to balance the demand for more channels compared to demand for faster uploads.
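The 24-channel figure is easy to sanity-check. A bonded 6 MHz DOCSIS channel using 256-QAM delivers roughly 38 Mbps; that per-channel figure is my assumption for illustration, and the newer DOCSIS 3.1 OFDM channels work somewhat differently:

```python
# Rough check of the claim that ~24 empty channels yield a gigabit of download.
MBPS_PER_CHANNEL = 38   # assumed throughput of one bonded 6 MHz 256-QAM channel
CHANNELS = 24

total_mbps = CHANNELS * MBPS_PER_CHANNEL
print(f"{CHANNELS} channels x {MBPS_PER_CHANNEL} Mbps = {total_mbps} Mbps")  # 912 Mbps
```

Matching that on the upstream means clearing a comparable swath of spectrum that carries revenue-producing programming today, which is exactly the sacrifice few cable companies want to make.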

Fiber has no real constraints on upload speeds up to the capability of the lasers. The common technologies being used for residential fiber all allow for gigabit upload speeds. Many fiber providers set speeds to symmetrical, but others have elected to limit upload speeds. The reason I’ve heard for that is to limit the attractiveness of their network for spammers and others who would steal the use of fast uploading. But even these networks offer upload speeds that are far faster than the cable company products.

As more households want to use uploading we are going to hear more demands for a faster upload option. But for now, if you want super-fast upload speeds you have to be lucky enough to live in a neighborhood with fiber-to-the-home.

The Looming Backhaul Crisis

I look forward a few years and I think we are headed towards a backhaul crisis. Demand for bandwidth is exploding and we are developing last-mile technologies to deliver the needed bandwidth, but we are largely ignoring the backhaul network needed to feed customer demand. I foresee two kinds of backhaul becoming a big issue in the next few years.

First is intercity backhaul. I’ve read several predictions that we are already using most of the available bandwidth on the fibers that connect major cities and the major internet POPs. It’s not hard to understand why. Most of the fiber between major cities was built in the late 1990s or even earlier, and much of that construction was funded by the telecom craze of the 90s where huge money was dumped into the sector.

But there has been very little new fiber construction on major routes since then, and I don’t see any carriers with business plans to build more. You’d think that we could get a lot more bandwidth out of the existing routes by upgrading the electronics on those fibers, but that’s not how the long-haul fiber network operates. Almost all of the fiber pairs on existing routes have been leased out to various entities for their own private uses. The reality is that nobody really ‘owns’ these fiber routes, since each route is full of carriers that each have a long-term contract to use a few of the fibers. As long as any of these entities has enough bandwidth for its own network purposes it is not going to sink big money into upgrading to terabit lasers, which are still very expensive.

Underlying that is a problem that nobody wants to talk about. Many of those fibers are aging and deteriorating. Over time fiber runs into problems and gets opaque. This can come from having too many splices in the fiber, or from accumulated microscopic damage from stress during fiber construction or due to temperature fluctuations. Fiber technology has improved tremendously since the 1990s – contractors are more aware of how to handle fiber during the construction period and the glass itself has improved significantly through improvements by the manufacturers.

But older fiber routes are slowly getting into physical trouble. Fibers go bad or lose capacity over time. This is readily apparent when looking at smaller markets. I was helping a client look at fiber going to Harrisburg, PA, and the fiber routes into the city are all old, built in the early 90s, and experiencing regular outages. I’m not pointing out Harrisburg as a unique case, because the same is true for a huge number of secondary communities.

We are going to see a second backhaul shortage that is related to the intercity bandwidth shortage. All of the big carriers are talking about building fiber-to-the-home and 5G networks that are capable of delivering gigabit speeds to customers. But nobody is talking about how to get the bandwidth to these neighborhoods. You are not going to be able to feed hundreds of 5G fixed wireless transmitters using the existing bandwidth that is available in most places.

Today the cellular companies are paying a lot of money to get gigabit pipes to the big cell towers. Most recent contracts include the ability for these connections to burst to 5 or 10 gigabits. Getting these connections is already a challenge. Picture multiplying that demand by hundreds or thousands of new cell sites. To use the earlier example of Harrisburg, PA, picture somebody trying to build a 100-node 5G network there, each node with gigabit connections to customers. This kind of network might initially work with a 10 gigabit backhaul connection, but as bandwidth demand keeps growing (doubling every three years), it won’t take long until this 5G network needs multiple 10 gigabit connections, up to perhaps 100 gigabits.
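The doubling math is unforgiving. Using the growth rate cited above (the starting and target numbers are the illustrative ones from the Harrisburg example, not measured figures), a tenfold growth in demand takes only about a decade:

```python
import math

# Years until backhaul demand grows from 10 Gbps to 100 Gbps,
# assuming demand doubles every 3 years.
DOUBLING_YEARS = 3
start_gbps, target_gbps = 10, 100

years = DOUBLING_YEARS * math.log2(target_gbps / start_gbps)
print(f"{years:.1f} years to go from {start_gbps} to {target_gbps} Gbps")  # ~10 years
```

Ten years is well within the planning horizon of any network being built today, which is why ignoring backhaul while building 5G nodes is so short-sighted.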

Today’s backhaul network is not ready to supply this kind of bandwidth. You could build all of the fiber you want locally in Harrisburg to feed the 5G nodes, but that won’t make any difference if you can’t feed that whole network with sufficient bandwidth to get back to an Internet POP.

Perhaps a few carriers will step up and build the needed backhaul network. But I don’t see that multi-billion dollar per year investment listed in anybody’s business plans today – all I hear about are plans to rush to capture the residential market with 5G. Even if carriers step up and bolster the major intercity routes (and somebody probably will), that is only a tiny portion of the backhaul network that stretches to all of the Harrisburg markets in the country.

The whole backhaul network is already getting swamped due to the continued geometric growth of broadband demand. Local networks and backhaul networks that were robust just a few years ago can get overwhelmed by a continuous doubling of traffic volume. If you look at any portion of our existing backhaul network you can already see the stress today, and that stress will turn into backhaul bottlenecks in the near future.

Broadband and Rural Population

Rural counties in the US gained 33,000 people in total in 2017, the first overall increase in rural population since 2011. But the gains came entirely in rural counties adjacent to metropolitan areas, which highlights that remote rural counties continue a long-term trend of losing population. Nationwide, counties not next to metropolitan areas lost 24,000 people in 2017.

Further, within that overall number there are differences based on the primary economic driver in each county. Farming counties lost the most population, rural counties with manufacturing lost only a little, and counties that rely on recreation and tourism gained population, attributed mostly to in-migration of retirees.

The loss of population is one of the primary reasons that rural counties are looking for better broadband. Rural counties don’t have to peer too far into the future to see a bleak picture. These counties are often rapidly aging as young people move away to find employment. They are already seeing houses abandoned and foresee a shrinking property tax base. Farm counties continue to see the consolidation of farms and the loss of the family farmer.

Rural counties have different goals for broadband than more urban areas. Their foremost hope is that better broadband can mean more jobs and more employment. Many rural counties are no longer naïve enough to think they are going to attract factories or other businesses. Some get lucky occasionally, but the country as a whole has lost an astonishing 60,000 factories since 2001. And even when new factories arise, they are largely automated and support far fewer jobs than factories in the past.

Rural counties care more about finding ways for residents to supplement incomes, and better broadband brings the chance to work from home. I was working with a rural county in Minnesota last year where I noticed that almost every farm in the county was operating a non-farming business alongside farming as a way to supplement income – and this was in a county where the farms had practically no broadband at all and many homes relied on extremely expensive cellular data for their broadband connections. These counties want broadband so that the people living there can make enough money to stay.

Rural counties also need broadband for education. I rarely visit a rural county where there are not a few places where rural residents drive their kids a few nights each week to use WiFi to do their homework. It’s disheartening to see a library parking lot full of cars after closing, with kids doing homework on the slow WiFi leaking out through the library walls.

Earlier I mentioned counties that rely on recreation for their economy. These counties desperately want broadband because they are finding that urban tourists don’t want to visit places that don’t have good internet connectivity and cellphone coverage. And businesses in these counties who rely on tourism have a hard time making it without broadband. Something as simple as offering on-line reservations requires a decent broadband connection, and lack of broadband puts counties at a disadvantage compared to similar areas with broadband.

Finally, rural counties want broadband to help keep the elderly in their homes longer. As hard as it is to be elderly in an urban area, it’s much harder in a rural area where something as simple as seeing a doctor can mean a long trip. Rural areas also have a hard time attracting doctors, and as rural clinics and hospitals continue to close it will become even harder to get basic healthcare. So rural counties are relying more and more on telemedicine, and providers like the Mayo Clinic are embracing the technology to help rural America – but telemedicine requires decent broadband, including good upload speeds.

The skeptics of any government push to improve broadband often use the excuse that people just want broadband to watch TV, and that broadband is just for entertainment. And when people get broadband they take advantage of the wonderful world of entertainment now available on the Internet. But I can’t recall ever having seen entertainment on the list of reasons why rural counties want better broadband – they want it so that their counties survive as viable communities.

Technical Support as a Product

Verizon announced a new line of products this week that provide three different tiers of technical support to help customers better handle the wide array of devices and issues they face in the ever-more-confusing digital world.

The first product is the most intriguing. For $10 per month a customer gets technical support for virtually any device connected to their home network, such as printers, smart locks, smart lighting, security systems and smart thermostats. Unlike the Comcast smart home product, where customers only get Comcast-approved IoT devices, a Verizon customer can buy devices anywhere and still get support from Verizon in connecting or troubleshooting them.

The technical support also comes with other useful features. Verizon will help you isolate and remove computer viruses and malware. They’ll help you with complicated tasks such as transferring photos, files or personal settings between computers. They will also provide up to 5 PC tune-ups per year that run diagnostic tests and maximize computer performance. Maybe most important of all, joining the service puts customers at the top of the queue for calls to technical support (which I guess means longer wait times for customers who don’t buy the service).

A second tier at $15 per month adds LifeLock Select, the identity protection service, as well as McAfee Security and Safe Family – a suite of virus protection and parental control software. The last tier, priced at $30 per month, adds an insurance plan for smart devices in the home other than smartphones or computers. It will cover smart TVs, tablets, smart thermostats and other IoT devices, and Verizon will repair or replace broken devices. This top plan also adds LastPass, a password management tool. The product doesn’t include any in-home assistance, although that is available through Verizon partners.

This is not a new product category, and there are numerous companies offering online technical support such as 24/7 Techies, AskPCExperts, Bask and Best Buy’s Geek Squad. Several of these services also offer a first-tier plan for $9.99 per month.

I also know a number of smaller US telcos and ISPs that offer something similar. Many of these companies operate in geographically isolated areas where there is no Best Buy or other outlet for technical services and support. Some of my clients sell monthly subscriptions similar to Verizon’s base fee. Others operate a physical store location where customers can get computer repair services.

Nearly every small company I’ve talked to that offers something like this tells me they are lucky if the operation breaks even. But they all maintain the service because it distinguishes them from competitors and builds tremendous customer loyalty.

I’m sure Verizon has done the math and believes they can make money at this – and they probably can. A company of their size can keep call queues full so that technicians have little or no down time. I’m sure Verizon is also counting on the fact that most customers will buy this service and only use it sporadically. A customer who really uses the service will cost Verizon more than the $10 fee.

The biggest issue that anybody who offers this kind of service faces is keeping a competent technical staff to support customer calls. Many employees who take this kind of job view it as a stepping stone to a better paying technical career, so there is usually high turnover of staff.

It's an intriguing business line for ISPs to consider, particularly those in more remote markets. Our home networks are getting increasingly complex. Sometimes just getting a printer to work right can eat up hours, and figuring out more complicated devices can be maddening. As long as you recognize that you won't get rich doing this, it's an interesting way to create personal relationships with customers while providing an invaluable service that will differentiate you from the competition – unless your competition is Verizon.

FCC to Tackle Rural Call Completion

The FCC announced it is taking another shot at solving the rural call completion problem. Those not living in rural America might be unaware of this issue, but for a number of years calls placed to rural locations have been dropped, making it difficult at times to contact someone in a rural area.

We know why it’s happening. There are long-distance providers who don’t want to pay the higher access charges associated with calling high-cost rural areas. Since 1984 there has been a system where long-distance companies pay to get ‘access’ to local networks in order to complete calls. Originally these access rates were high, over 5 cents per minute even for the large Bell telephone companies. But over the years the access rates have been drastically trimmed through FCC actions.

Today it costs long distance companies a fraction of a penny to call urban locations or rural communities that have telephone service from AT&T, Verizon or CenturyLink. But the access rates are still more expensive for calls placed to areas served by smaller telephone companies, with the rates as much as a penny a minute. But even those rates are being trimmed.

That may not sound like a lot. But some wholesale long-distance carriers charge a flat rate per minute to other retail telephone providers, like cellular or cable companies, to complete their calls. These carriers don't want to pay the higher access fees, and many of them simply abandon calls made to places with the higher access charge rates. Not all long-distance carriers do this, but there are definitely some bad actors in the industry.

The last time the FCC addressed this it simply banned the practice of abandoning calls and hoped that would stop it. But it didn't. The agency is now trying a new approach in WC Docket 13-39, proposing to make the companies that sell long distance to customers responsible for making sure that calls go through.

Here’s how this might work in practice. If you buy telephone service from a cable company there is a high likelihood that the company is not a long-distance provider, but rather buys long distance from somebody else – let’s call them Company A. It’s also likely that Company A uses a number of other long-distance carriers to complete calls. Most wholesale long distance providers like Company A use what they call least-cost routing, meaning they pick a different underlying long-distance carrier to reach each location in the country according to cost at the time the call is placed.

It's the least-cost routing that is causing the problem. If a carrier selling to Company A offers the same price per minute everywhere in the country, it is likely to be the lowest-cost provider to rural places. Other carriers will charge more to route to a rural place because of the higher access charges there. Because of the automated nature of least-cost routing, Company A, wittingly or not, will choose some sub-carriers who are dropping calls.
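The routing decision itself is mechanical, which is how bad actors slip into the mix. Here is a minimal sketch of least-cost routing in Python – the carrier names and per-minute rates are invented purely for illustration, not actual industry figures:

```python
# Minimal sketch of least-cost routing. A flat-rate carrier that absorbs
# (or skips paying) rural access charges will win the rural routes in an
# automated selection, even if it drops some of those calls.

def least_cost_route(rates, destination):
    """Pick the carrier offering the lowest per-minute rate to a destination."""
    return min(rates, key=lambda carrier: rates[carrier][destination])

# Hypothetical wholesale rates in dollars per minute.
rates = {
    "FlatRateCarrier": {"urban": 0.005, "rural": 0.005},   # same price everywhere
    "CostBasedCarrier": {"urban": 0.004, "rural": 0.015},  # passes on rural access fees
}

print(least_cost_route(rates, "urban"))   # CostBasedCarrier
print(least_cost_route(rates, "rural"))   # FlatRateCarrier
```

Because the selection looks only at price, Company A never sees which sub-carrier is quietly abandoning the rural calls it wins.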

The FCC wants to make Company A responsible for the quality of its calling. In the docket the FCC calls Company A the 'entity that selects the initial long-distance route'. The FCC wants these companies to keep detailed records of how and to whom calls are routed. I assume the FCC will then start asking to trace specific calls that were dropped in order to identify the bad actors in the industry. This docket also likely will allow the FCC to levy fines against any carriers involved in the practice, including Company A, the first router of calls.

Let's hope this new approach works. The practice of purposefully dropping calls goes against the century-long industry goal of having universal voice connectivity. It's bad for business, bad for rural residents and bad for the country as a whole when calls can't be made to rural communities. There are examples of people making multiple attempts to call a rural place and being unable to get through, and this practice needs to end. For this to be effective the FCC will have to play detective and sift through call records to identify carriers who drop calls, and hopefully they will do just that.

Prices are Driving Cord Cutting

It’s been general wisdom for several years that high cable TV prices are one of the predominant factors behind cord cutting. TiVo’s recently released Q4 2017 Online Video and Pay-TV Trends Report says that prices are even more important than we thought. TiVo talked to a number of cord cutters in 2017 and found that 86.7% of those who dropped TV in 2017 list high prices as the number one reason for abandoning traditional cable TV. This is up from 80.1% a year earlier.

It's not hard to understand why price is becoming such a big factor. Over 50% of households now say their cable bill is more than $75 per month. And only 15% pay less than $50, down from 18% a year earlier. Annual rate increases that are far greater than general inflation are pushing cable prices out of the affordability zone for many households.

Interestingly, we see this same trend manifesting in another way. In a recent survey Parks Associates reports that a little over 20% of households now use digital antennas to receive over-the-air networks like ABC, CBS, NBC, FOX and PBS. That's up from 15% in 2015 – a huge swing that means over 6 million homes have started using antennas in just the last two years.
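The back-of-the-envelope math behind that 6 million figure checks out, assuming roughly 126 million US households (my assumption, based on census estimates, not a number from the survey):

```python
# Rough arithmetic behind the antenna swing. The household count is an
# assumption (roughly 126 million US households per recent census estimates).
us_households = 126_000_000

antenna_2015 = 0.15 * us_households
antenna_2017 = 0.20 * us_households

new_antenna_homes = antenna_2017 - antenna_2015
print(f"{new_antenna_homes / 1_000_000:.1f} million homes")  # 6.3 million homes
```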

Nationwide more than 3 million households (2.4% of all households) dropped cable in 2017 – as witnessed by the subscriber counts of the largest cable TV companies. Just about every one of my clients will tell you that they lost a larger percentage than the average. The nationwide average is held down by the fact that Comcast lost only 0.7% of its cable customers and Charter lost 1.4%. But telcos did far worse, with AT&T losing 14.6% and Frontier losing 16.1%.
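A quick weighted average shows how a giant with a small loss rate pulls the nationwide number down. The subscriber counts below are hypothetical round numbers chosen for illustration, not the companies' actual totals – only the loss percentages come from the reported figures:

```python
# Back-of-the-envelope weighted average of subscriber losses.
# Subscriber counts (in millions) are hypothetical; loss percentages
# are the reported 2017 figures.
providers = {
    # name: (subscribers_millions, pct_lost)
    "Comcast":  (22.0, 0.7),
    "Charter":  (16.0, 1.4),
    "AT&T":     (4.0, 14.6),
    "Frontier": (1.0, 16.1),
}

total_subs = sum(subs for subs, _ in providers.values())
total_lost = sum(subs * pct / 100 for subs, pct in providers.values())

print(f"weighted average loss: {100 * total_lost / total_subs:.1f}%")
```

Even with the telcos hemorrhaging customers at double-digit rates, the blended figure lands in the low single digits because the biggest cable companies dominate the denominator.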

Higher prices are almost entirely due to increased programming costs. Small companies have seen programming costs grow over 10% per year, and the rate of annual growth is increasing. Some have reported annual increases to me as high as 15% in the most recent two years.
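Double-digit increases compound quickly. A quick calculation, using a hypothetical starting programming cost of $20 per subscriber per month (the growth rates are the ones reported to me; the starting cost is invented for illustration):

```python
# How 10% and 15% annual programming-cost increases compound over five years.
# The $20 starting cost is a hypothetical per-subscriber monthly figure.
start_cost = 20.00

for rate in (0.10, 0.15):
    cost = start_cost
    for _ in range(5):
        cost *= 1 + rate
    print(f"{rate:.0%} annual growth: ${start_cost:.2f} -> ${cost:.2f} in 5 years")
    # 10%: $20.00 -> $32.21; 15%: $20.00 -> $40.23
```

At 15% annual growth the cost roughly doubles in five years – which is why small cable operators feel these increases so acutely.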

One of the biggest drivers of high programming costs is the retransmission charges that local affiliates of the major networks charge cable companies to carry the over-the-air networks. In large cities a lot of the local TV stations are owned directly by the major networks like NBC or ABC, but in smaller markets they are generally owned by others. The independent local stations have no recourse but to raise retransmission rates each year, since the networks increase what it costs them to remain an affiliate. In the end, all of the extra revenue from retransmission fees flows up to the major networks, which now see this as a major source of revenue growth.

It’s not just the major networks that are increasing rates. Practically every cable network is increasing rates at a faster pace than a decade ago. It’s a really odd economic phenomenon to see big price increases occurring in an industry that is losing customers at this pace. Any economics 101 book would suggest that the laws of supply and demand would drive prices for programming in the other direction.

But the cable industry is perverse due to regulations. The cable rules require cable systems to carry the local network stations within their range. I know a number of cable companies who would gladly provide rabbit ears to customers rather than continue to raise rates every year – but they are required by laws passed by Congress to carry these stations. The same laws also force cable companies to carry the large lineups in the basic and expanded basic tiers that we are all familiar with.

These laws mean that cable providers have few options on what networks to carry. I don't know any cable providers who wouldn't like to try something different and perhaps offer smaller packages of the most-watched networks that people could afford to buy. But the legal requirements for cable lineups embolden the programmers to charge exorbitantly because cable operators have no power to push back. The most a cable provider can do is take all of the networks from a given programmer off the air – and even this is impractical, since each of the few major programmers owns a lot of networks.

You can’t really fault the programmers since they are all publicly-traded companies which are responding to Wall Street demands that they increase profits quarter after quarter. The whole cable ecosystem is polluted by the need to increase profits. It’s sad, because without the price increases each year all of the companies involved could continue to make high margins and big profits.

Even with all of this turmoil in the industry I don’t hear of any discussion in Congress about tackling the issue and relaxing the current rules that are breaking the industry. Instead we see people fleeing traditional cable TV and buying smaller packages of the same programming online from Sling TV, DirecTV Now and Playstation Vue.

The rate of cord cutting is clearly accelerating and it’s not going to take many more years until these issues can’t be fixed – because by then the majority of households will be getting programming online rather than from cable companies.

Convergence

Even a decade ago it was apparent that the telecom industry was headed towards convergence. By that, I mean that the various cable companies, telcos and wireless companies are expanding service lines and are starting to compete with each other in areas that were unimaginable even a few years ago.

Comcast is the best example of this. Their CEO Brian Roberts was quoted last year as saying that the company was now in all of the business lines available to it. Compare today’s Comcast with the company a decade ago. Then they were just becoming a triple play provider by adding voice to their product line-up. Since then they have added a lot more business lines.

A decade ago Comcast barely went after business customers and didn't even own networks in many business districts and industrial parks – and now they are a major provider of business services. The company recently added cellular service and it appears they are adding customers at a furious pace. They are becoming a major player in home security. The company has a thriving product line selling residential smart home services. They even started bundling home solar panels with their residential product line recently.

The company has even stepped up its traditional cable service to do better against the satellite providers. They've developed their own settop box that is said to be the best in the industry. And they bought programmers like NBC, giving them a margin advantage over any competitor for video.

It seems to me like everybody else wants to be Comcast. Consider AT&T. A decade ago they were a traditional telco. They operated a huge copper network for residential broadband and telephone service and owned the country’s largest fiber network for providing wholesale transport and business services. They were also one of the two largest cellular companies, and with Verizon controlled the vast majority of that business.

AT&T not only added cable TV service to their product line, but they bought DirecTV and became a major video provider. They are trying hard to buy programming and content by merging with Time Warner. The company has been aggressive building fiber to large apartment complexes and has become a major player in the MDU market that used to be almost exclusively controlled by the cable incumbents. The company has also been building a lot of fiber to better compete head-to-head with Comcast and other cable companies that have faster residential broadband.

Verizon took a different path and competed head-to-head with Comcast in the northeast even a decade ago with its FiOS fiber network. The company continues to buy smaller regional fiber providers like XO to beef up its business and fiber networks. Verizon has announced that it intends to roar back into the residential market by use of small cell 5G over the next decade. And Verizon continues to thrive as a cellular carrier.

Even smaller companies like CenturyLink are looking a lot like their bigger competitors. The company has added cable TV to its bundle. They built fiber past almost a million passings last year to provide more robust competition for broadband speeds. And they bought Level 3 to become a major player for transport and business services.

But these big ISPs are not the only ones crossing into new product lines. Consider T-Mobile. In a move that was unthinkable even a few years ago they are making a major play to bundle video content with their cellular service – making them a direct competitor of all of the ISPs for the market segment of folks who are happy with mobile video rather than a landline connection. T-Mobile is pushing the other cellular providers to do the same.

And there are other national competitors on the horizon. For example, there are several satellite companies like SpaceX and OneWeb that are likely to compete nationally with bundles similar to the other ISPs. I also think we’ll see new competitors spring up and compete with 5G last-mile networks as that technology matures.

It's going to be interesting to see the winners and losers over the next decade. Right now the cable companies are approaching a near monopoly in many markets for broadband. The only way these other competitors are going to survive and thrive is to chop away at Comcast and the other large cable companies. But at the same time the cable companies will be carving away cellular customers. For those like me who follow the industry it's going to be interesting to watch.