Our Digital Illiteracy

Pew Research Center recently surveyed 4,272 adults and tested their knowledge of basic computer topics. The results showed a general lack of knowledge about several terms that are central to how people use the Internet.

For example, the survey showed that only 30% of survey takers knew that a website address starting with https:// means that information exchanged with that site is encrypted.
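The signal behind that question is simple: the URL scheme. A minimal sketch (with hypothetical URLs) of checking whether an address promises an encrypted connection:

```python
from urllib.parse import urlparse

def is_encrypted(url):
    """True when the URL uses HTTPS, meaning traffic to the site is encrypted in transit."""
    return urlparse(url).scheme == "https"

# Hypothetical addresses, for illustration only.
print(is_encrypted("https://example.com/login"))  # True
print(is_encrypted("http://example.com/login"))   # False
```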

Only 28% of respondents understood the concept of two-factor authentication – something that Google and Microsoft say can block nearly 100% of automated attempts to hack an account.
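One common form of the second factor is a time-based one-time password (TOTP, RFC 6238) – the six-digit codes generated by authenticator apps. A sketch using only the Python standard library (the secret shown is a made-up example):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, when=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if when is None else when) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Phone app and server derive the same short-lived code from a shared secret,
# so a stolen password alone isn't enough to log in.
print(totp("JBSWY3DPEHPK3PXP"))
```

The point of the design choice: the code changes every 30 seconds and never travels over the network ahead of time, so intercepting one login doesn’t help an attacker with the next.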

Only 24% understood the purpose of private browsing.

The respondents fared better on a few topics. For example, two-thirds of respondents understood the danger of phishing – though it’s a bit scary that one out of three users didn’t. 63% understood that cookies allow websites to track user visits and other activity.
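The cookie mechanic behind that 63% question is simple: a site hands the browser an identifier, and the browser sends it back on every later visit. A sketch with Python’s standard library (the ID value is made up):

```python
from http.cookies import SimpleCookie

# First visit: the site issues a persistent identifier via a Set-Cookie header.
response = SimpleCookie()
response["visitor_id"] = "a1b2c3d4"                     # hypothetical opaque ID
response["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # remember for a year
print(response.output())  # the Set-Cookie header sent to the browser

# Later visits: the browser echoes the cookie back, re-identifying the visitor.
request = SimpleCookie("visitor_id=a1b2c3d4")
print("returning visitor:", request["visitor_id"].value)
```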

48% of respondents understood the concept of net neutrality – the technology topic that has gotten the most press over the last four years.

A few of the questions were a bit smug. Only 15% of people could identify a picture of Jack Dorsey, a co-founder of Twitter. I have to admit that this is a question I would also have failed because I don’t much care about the personalities of the people behind web companies – even though I follow the issues involving these companies closely.

It’s probably not surprising that younger users did better on the survey questions than older users. It’s still a bit shocking, though, that only 1% of survey takers got every question right.

The bottom line of this survey is that the general public probably has a much lower knowledge of the Internet than many web companies and ISPs assume. I think this survey highlights an opportunity for small ISPs to educate customers by passing on safety tips or important knowledge about the web.

ISPs communicate with users on log-in pages, on bills, and on their websites. It wouldn’t be hard to add recurring messages such as: “Did you know that websites that start with https use an encrypted connection and are safer to use?” Experienced web users will blow past such messages, but we know that repeated messages eventually make an impression on most people.
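Such a rotation is trivial to automate – for example, keying the message off the calendar date so each log-in page shows the next tip in turn. A hypothetical sketch:

```python
import datetime

SAFETY_TIPS = [
    "Did you know? Websites that start with https:// use an encrypted connection.",
    "Turn on two-factor authentication so a stolen password alone can't unlock your account.",
    "Private browsing hides history on your device, but not from websites or your ISP.",
]

def tip_of_the_day(today=None):
    """Pick a tip deterministically from the date so the messages rotate daily."""
    today = today or datetime.date.today()
    return SAFETY_TIPS[today.toordinal() % len(SAFETY_TIPS)]

print(tip_of_the_day())
```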

It’s easy for technical folks to assume that the public understands basic concepts about the web – but surveys like this one remind us that’s not necessarily true.

Why I am Thankful – 2019

It’s Thanksgiving again and I pause every year to look at the positive events and trends for the small ISP industry. I found a number of things to be thankful for at the end of 2019.

FCC Finally Admits Its Maps Suck. The FCC has begrudgingly admitted that its broadband mapping sucks and is considering several proposals for improving the mapping. It looks like the proposals will fix the ‘edge’ problem, where today rural customers that live close to cities and towns are lumped in with the broadband available in those places. Sadly, I don’t believe there will ever be a good way to measure and map rural DSL and fixed wireless. But fixing the edge problem will be a great improvement.

FCC Released the CBRS Spectrum. The 3.5 GHz CBRS (Citizens Broadband Radio Service) spectrum should provide a boost to rural fixed broadband. There are some restrictions where there is existing government use, and there will be frequency-sharing rules, so the band is not fully unrestricted. Still, the 80 MHz of free spectrum should prove powerful in many parts of the country. The FCC is also considering other frequencies like white space, C-Band, and 6 GHz that would benefit rural broadband.

States Are Reversing a Few Draconian Laws. Several states have removed barriers for electric cooperatives to get into the broadband business. Arkansas softened a prohibition against municipal broadband. Local politicians are now telling state legislators that broadband is the top priority in communities that don’t have access to good broadband. It’s been obvious for a long time that the best solutions to fix rural broadband are local – it makes no sense to restrict any entity that wants to invest in rural broadband.

The FCC Has Made it Easier for Indian Tribes to Provide Broadband. Various rule changes have streamlined the process of building and owning broadband infrastructure on tribal lands. Many tribes are exploring their options.

Local Broadband Activists Make a Difference. It seems like every community I visit now has a local broadband committee or group that is pushing local politicians to find a solution for poor broadband coverage. These folks make a difference and are prodding local governments to get serious about finding broadband solutions.

The FCC Announces a Monstrous Grant Program. I hope the RDOF grants that will award over $16 billion next year will make a real dent in the rural digital divide. Ideally, a lot of the grants will fund rural fiber, since any community with fiber has achieved a long-term broadband solution. However, I worry that much of the funding could go to slower technologies, or even to the satellite companies – so we’ll have to wait and see what happens in a massive reverse auction.

States Take the Lead on Net Neutrality. When the US Appeals Court ruled that the FCC had the authority to undo net neutrality, the court also ruled that states have the authority to step into that regulatory void. Numerous states have enacted some version of net neutrality, but only California and Washington have enacted laws as comprehensive as the old FCC rules. My guess is that at some point the big ISPs will decide that they would rather have one set of federal net neutrality rules than a host of different state ones.

The Proliferation of Online Programming. The wealth of programming available online is amazing. I’m a Maryland sports fan, and there are only three basketball or football games I can’t watch this season even though I don’t live in the Maryland market. I don’t understand why there aren’t more cord cutters, because there is far more entertainment available online than anybody can possibly watch. A decade ago, I didn’t even own a TV because there was nothing worth watching – today I keep a wish list of programming to watch later.

NC Broadband Matters. Finally, I’m thankful for NC Broadband Matters. This is a non-profit in North Carolina that is working to bring broadband to communities that don’t have it today. The group invited me to join their Board this year and I look forward to working with this talented group of dedicated folks to help find rural broadband solutions in the state.

T-Mobile Offering Broadband Solutions

As part of the push to get approval for the proposed merger with Sprint, T-Mobile pledged that it will offer low-cost data plans, give free 5G to first responders and provide free broadband access to underserved households with school students. These offers are all dependent upon regulators and the states approving the merger.

The low-price broadband plans might be attractive to those who don’t use a lot of cellular data. The lowest-price plan offers 2 GB of data for $15 monthly. The price is guaranteed for 5 years, and the data cap grows by 500 MB per year to reach 4 GB in the fifth year. The second plan offers 5 GB for $25 and also grows by 500 MB per year to reach 7 GB by the fifth year. I assume adding voice and texting is extra.
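The cap arithmetic is easy to check – start with the base allowance and add 500 MB each year:

```python
def cap_schedule(start_gb, step_gb=0.5, years=5):
    """Monthly data cap (GB) in each plan year, growing by a fixed yearly step."""
    return [start_gb + step_gb * year for year in range(years)]

# The two pledged plans: $15 for 2 GB and $25 for 5 GB, each growing 500 MB/year.
print(cap_schedule(2))  # 4 GB by year five
print(cap_schedule(5))  # 7 GB by year five
```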

The offer of free phone service for first responders is just that: T-Mobile will offer free voice, texting, and data to first responders for 10 years. There will be no throttling of data, and first-responder data will always get priority. The company estimates that this would save first responders $7.7 billion nationwide over the ten years if they all switch to T-Mobile. Not surprisingly, the other carriers are already unhappy with this offer, particularly AT&T, which is busy building the nationwide FirstNet first responder network. This may be a somewhat hollow offer, since the FirstNet network has major advantages such as automatically interconnecting responders from different jurisdictions. But at least some local governments are going to be attracted to free cellular service.

The offer for school students is intriguing. For the next five years, the company is offering 100 GB per month of downloaded data to eligible student households. The company will also provide a free WiFi hotspot that converts the cellular data into WiFi for home use. T-Mobile estimates that roughly 10 million households would be eligible. Studies have shown that cost is the reason that many homes with students don’t have home broadband. In urban areas, the T-Mobile effort could largely eliminate the homework gap, at least for five years. That would give the country five years to find a more permanent solution. While T-Mobile would also help in rural America, many rural homes are not in range of a T-Mobile tower capable of delivering enough broadband to be meaningful. However, in many cases, this offer would be bringing broadband for homework to homes with no other broadband alternatives.

If the merger goes through, T-Mobile plans to mobilize the big inventory of 2.5 GHz spectrum owned by Sprint as well as activating 600 MHz spectrum. These are interesting bands of spectrum, particularly the 600 MHz. That spectrum is great at penetrating buildings and can reach deep into most buildings. The spectrum also carries far – up to 10 miles from a transmitter. However, compared to higher frequencies, 600 MHz won’t carry as much data. Further, data speeds decrease with distance from a cell site, and speeds past a few miles are likely to be pretty slow.
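The physics is visible in the free-space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. A rough sketch (ignoring terrain and building losses) of why 600 MHz reaches farther than the 2.5 GHz band:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB for distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# At any given distance, 600 MHz arrives about 12 dB stronger than 2.5 GHz,
# one reason low-band spectrum penetrates buildings and carries for miles.
for f_mhz in (600, 2500):
    print(f"{f_mhz} MHz at 10 km: {fspl_db(10, f_mhz):.1f} dB loss")
```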

This plan makes me wonder how allowing millions of students onto the cellular network for homework will affect cell sites. Will some cell sites bog down when kids are all connected to the school networks to do homework?

I further wonder if the promise to offer free broadband to students also comes with a promise to supply enough backhaul bandwidth to poor neighborhoods to support the busy networks. Without good backhaul, the free bandwidth might be unusable at peak hours. I don’t mean to denigrate an offer that might mean a broadband solution for millions of kids – but I’ve also learned over the years that free doesn’t always mean good.

I’ve seen where a few states like New York are still against the merger, so there is no guarantee it’s going to happen. It sounds like the courts will have to decide. I suspect these offers will be withdrawn if the decision is made by courts rather than by the states.

C-Band Announcement Silent on Rural Wireless

On November 18, FCC Chairman Ajit Pai told several members of Congress that he had decided there should be a public auction for the C-Band spectrum that sits between 3.7 GHz and 4.2 GHz. The spectrum has historically been used by satellite companies for communication between satellites and earth stations. This is prime spectrum for 5G cellular broadband, but also could provide a huge benefit to fixed wireless providers in rural America. Chairman Pai will be asking the rest of the FCC commissioners to approve an order sometime after the first of next year. Making an early announcement is a bit unusual since major orders like this are usually announced by releasing a written order that comes after a vote of the Commission.

The letters from Chairman Pai describe four reasons behind the decision: first, we must make available a significant amount of C-Band spectrum for 5G; second, we must make spectrum available for 5G quickly; third, we must generate revenue for the federal government; and fourth, we must protect the services that are currently delivered using the C-Band so that they can continue to be delivered to the American people.

Missing from Chairman Pai’s letter was any mention of making the C-Band spectrum available for rural fixed wireless. WISPA and other rural proponents have been lobbying for sharing the spectrum so that the C-Band could be used for urban 5G while also benefitting faster rural broadband.

This has been an unusual docket from the start because the satellite providers, under the name of the C-Band Alliance (CBA), offered to relocate to the higher part of the spectrum if they could hold a private auction to sell the vacated spectrum to the cellular carriers. There were several problems with that offer. First, the satellite providers would make billions of dollars of windfall profits by selling spectrum that they don’t own. Federal law makes it clear that the FCC has the right to award or take back spectrum, and it would have set a major precedent for license holders to be able to sell spectrum for a huge profit. There were also obvious concerns about transparency, and it was feared that backroom deals would be struck to give spectrum to the big cellular carriers at bargain prices while still benefitting the satellite companies.

There was also a political nuance. The CBA proposed to give some of the proceeds of the private auction to the federal government, similar to what happens in an FCC auction. However, money given that way would go towards paying off the federal deficit. Proceeds of FCC auctions can be earmarked for specific uses and legislators all wanted to see the spectrum sold by FCC auction so that they could use some of the money.

The rural spectrum-sharing idea might not be dead, since the announcement was made by short letter. However, the Chairman could easily have mentioned rural broadband in the letters to legislators and didn’t. The Chairman has made numerous speeches saying that solving the rural digital divide is his primary goal. It’s clear from his actions over the last few years, though, that deregulation and giveaways to the big carriers under the guise of promoting 5G are the real priorities of this FCC.

The C-Band spectrum sits next to the recently released CBRS spectrum at 3.5 GHz. Just as additional spectrum benefits 5G, fixed wireless technology improves significantly by combining multiple bands of frequency. Rural carriers have been arguing for years that the FCC should allow for the sharing of spectrum. Proponents of rural broadband argue that urban and rural use of spectrum can coexist since most 5G spectrum is only going to be needed in urban areas. They believe that such spectrum can be used in a point-to-point or point-to-multipoint configuration in rural America without interfering with urban 5G. The big cellular carriers are reluctant to share spectrum because it causes them extra effort, so only the FCC can make it happen.

If the final order doesn’t require frequency sharing, it will be another slap in the face for rural broadband. Since there is not yet a written order, proponents of rural broadband still have an opportunity to be heard at the FCC on the topic. However, I fear that the issue has already been decided and that rural broadband will again be ignored by the FCC.

Broadband Still Growing – 3Q 2019

Leichtman Research Group recently released the broadband customer statistics for the third quarter of 2019 for the largest cable and telephone companies. Leichtman compiles most of these numbers from the statistics the companies provide to stockholders, except for Cox, whose numbers are estimated.

The numbers provided to investors are lower than the broadband customer counts these same companies report to the FCC, and I think that most of the difference is due to the way many of these companies count broadband to apartment buildings. If they provide a gigabit pipe to serve an apartment building, they might count that as one customer, whereas for FCC reporting they likely count the number of apartment units served.

Following are the broadband customer counts for the third quarter and a comparison to the second quarter of this year.

                        3Q 2019      Added   % Change
Comcast              28,186,000    379,000       1.4%
Charter              26,325,000    380,000       1.5%
AT&T                 15,575,000   (123,000)     -0.8%
Verizon               6,961,000     (7,000)     -0.1%
Cox                   5,145,000     25,000       0.5%
CenturyLink           4,714,000    (36,000)     -0.8%
Altice                4,180,300     14,900       0.4%
Frontier              3,555,000    (71,000)     -2.0%
Mediacom              1,316,000     13,000       1.0%
Windstream            1,040,000      5,700       0.6%
Consolidated            784,151      1,143       0.1%
WOW                     773,900     10,420       1.3%
Cable ONE               689,138      7,376       1.1%
Atlantic Broadband      446,137      2,441       0.6%
TDS                     437,700      4,300       1.0%
Cincinnati Bell         425,100       (400)     -0.1%
Total               100,553,426    605,660       0.6%

Leichtman says this group of companies represents 96% of all US broadband customers. I’m not sure how they calculated that percentage. That implies that there are only about 4 million broadband customers for companies not on this list, and that feels a little low to me.

For the quarter, these companies collectively saw growth that annualizes to 2.4%. This is a significant uptick over the second quarter of 2019 that saw an annualized growth rate of 1.7%.
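The annualization here is the simple kind – multiply the quarterly rate by four (compounding gives a slightly higher number). A quick check against the table’s totals:

```python
def annualized(quarterly_rate, compound=False):
    """Convert a quarterly growth rate to an annual one."""
    return (1 + quarterly_rate) ** 4 - 1 if compound else 4 * quarterly_rate

# 3Q 2019: 605,660 net adds on a starting base of roughly 99.9 million customers.
q = 605_660 / (100_553_426 - 605_660)
print(f"quarterly {q:.2%} -> simple annualized {annualized(q):.2%}")
```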

On an annualized basis, the third quarter of 2019 added about the same number of customers as the calendar year of 2018. However, the cable companies are performing better this year while the losses continue to accelerate for the big telcos. The big telco losers for the quarter are Frontier, which lost 2% of its customer base, and AT&T and CenturyLink, which each lost 0.8% of their customer bases. Following are the annualized changes in customers in 2018 and 2019:

                        2018         2019
Cable Companies    2,987,721    3,317,904
Telcos              (472,124)    (895,564)
Total              2,425,597    2,422,640

Both Comcast and Charter had spectacular quarters and continue to account for most of the growth in broadband, as each company added around 380,000 customers for the quarter. It would be interesting to understand what is driving that growth. Some of it comes from providing broadband to new homes. Some comes from customers converting away from DSL. And some comes from expansion – I know of examples where both companies are building new networks around the fringes of their service areas.

Auditing the Universal Service Fund

I recently heard FCC Commissioner Geoffrey Starks speak to the Broadband Communities meeting in Alexandria, Virginia. He expressed support for finding broadband solutions and cited several examples of communities that don’t have good broadband access today – both due to lack of connectivity and due to the lack of affordable broadband.

One of his more interesting comments is that he wants the FCC to undertake a ‘data-driven’ analysis of the effectiveness of the Universal Service Fund over the last ten years. He wants to understand where the fund has succeeded and where it has failed. Trying to somehow measure the effectiveness of the USF sounds challenging. I can think of numerous successes and failures of USF funding, but I also know of a lot of situations that I would have a hard time classifying as a success or failure.

Consider some of the challenges of looking backward. Over the last decade, the definition of broadband has changed from 4/1 Mbps to 25/3 Mbps. Any USF funds that supported the older speeds will look obsolete and inadequate today. Was using USF funding nine years ago to support slow broadband by today’s standards a success or a failure?

One of the biggest challenges of undertaking a data-driven analysis is that the FCC didn’t gather the needed data over time. For example, the FCC has done only a limited amount of speed testing on the networks built with USF funding. A more rigorous testing regime starts over the next few years, but I think even the new testing won’t tell the FCC what it needs to know. For example, the FCC just let the big telcos off the hook by deciding that USF recipients can help choose which customers get tested. The big telcos aren’t going to test where they didn’t build upgrades or where they know they can’t meet the FCC speed requirements.

The FCC will find many successes from USF funding. I’m aware of many rural communities that have gotten fiber that was partially funded by the ACAM program. These communities will have world-class broadband for the rest of this century. But ACAM money was also used in other places to build 25/3 DSL. I’m sure the rural homes that got this DSL are thankful because it’s far better than what they had before. But will they be happy in a decade or two as their copper networks approach being a century old? Are the areas that got the DSL a success or a failure?

Unfortunately, there are obvious failures with USF funding. Many of the failures come from the inadequate mapping that influenced USF funding decisions. Millions of households have been denied USF-funded construction because they were improperly classified as having broadband when they do not. Commissioner Starks said he was worried about using these same maps for the upcoming RDOF grants – and he should be.

Possibly the biggest failures come from what I call lack of vision by the FCC. The biggest example of this is when they awarded $11 billion to fund the CAF II program for the big telcos, requiring 10/1 Mbps speeds at a time when the FCC had already declared broadband to be 25/3 Mbps. That program was such a failure that the CAF II areas will be eligible for overbuilding using the RDOF grants, barely after the upgrades are slated to be completed. The Universal Service Fund should only support building broadband to meet future speed needs and not today’s needs. This FCC is likely to repeat this mistake if they award the coming RDOF grants to provide 25/3 Mbps speeds – a speed that’s arguably inadequate today and that clearly will be inadequate by the time the RDOF networks are completed seven years from now.

I hope the data-driven analysis asks the right questions. Again, consider CAF II. I think there are huge numbers of homes in the CAF II service areas where the big telcos made no upgrades, or upgraded to speeds far below 10/1 Mbps. I know that some of the big telcos didn’t even spend much of their CAF II funding and pocketed it as revenue. Is the audit going to look deep at such failures and take an honest look at what went wrong?

Commissioner Starks also mentioned the Lifeline program as a failure due to massive fraud. I’ve followed the Lifeline topic closely for years, and the fraud has been nowhere near the magnitude claimed by some politicians. Much of the blame for the program’s problems rests with the FCC because there was never an easy way for telcos to check whether customers remained eligible. The FCC is in the process of launching such a database – something that should have been done twenty years ago. The real travesty of the Lifeline program is that the big telcos have walked away. For example, AT&T has stopped offering Lifeline in much of its footprint. The FCC has also made it exceedingly difficult for new ISPs to join the program, and I know of numerous ISPs that would love to participate.

I try not to be cynical, and I hope an ‘audit’ isn’t just another way to try to kill the Lifeline program but is instead an honest effort to understand what has worked and not worked in the past. An honest evaluation of the fund’s problems will assign the blame for many of the fund’s problems to the FCC, and ideally, that would stop the current FCC from repeating the mistakes of the past.

Another Farming Broadband Survey

I’ve seen several surveys this year that are looking at the impact of broadband for farmers. This is relatively new and highlights the degree to which broadband has moved near the top of the list of many farmers’ concerns. This blog looks at a survey conducted by the United Soybean Board. This is an organization of soybean farmers that concentrates on research and soybean market development (as opposed to policy and lobbying).

The survey was conducted across a wide cross-section of 2,000 farmers and ranchers from across the country. This survey included farmers of field and row crops like soybeans and corn, livestock, and specialty crops like fruits and vegetables.

Here were some of the key findings of the survey:

  • Almost 60% of farmers said they don’t have adequate broadband to run their business.
  • 60% of farmers said the primary problem with their broadband is slow speeds. Other issues identified include the cost and reliability of broadband connections.
  • 78% of farmers said they have only one option for choosing an ISP.
  • The survey showed that 59% of farmers want to incorporate the use of more data in their business and another 28% are considering it.
  • The survey looked at two aspects of broadband – in the office and in the fields. Only 32% of farmers found broadband in their office to be reliable. Over 77% don’t think they have a good broadband solution in their fields. Only 26% say that cellular coverage is reliable in their fields.
  • 67% of farmers want the ability to transfer data wirelessly from their fields.
  • 90% of farmers are using a cellphone for Internet access in their fields. A few farmers surveyed constructed their own wireless networks to reach their fields.
  • Most farmers now use 2 or 3 different wireless devices (laptops, tablets, smartphones, desktops, and smart farm machinery).
  • 33% of farmers say lack of broadband has affected their equipment purchases – they are not yet buying smart machinery.

Farmers in the survey could also tell their story about how they use or would like to use broadband. Some of the technologies reported include:

  • Precision agriculture where field data provides the ability of farm equipment to apply different amounts of nutrients and insecticide only where it’s needed.
  • Soil monitoring to better understand the condition of the soil – with a goal to improve the soil year after year.
  • Precision irrigation that provides water only where it’s needed.
  • Drones to quickly survey the fields to gather data.

The executive summary of the survey expresses the results well:

American farmers feel the impact of poor connectivity, including limitations on improving farm economic and environmental sustainability and reinvesting in their businesses. They want to do the best things to preserve and improve their farms and natural resources, but the lack of clear data for making decisions hampers their continuous improvement. And farmers’ needs for internet access are projected to grow. The value they bring to the U.S. economy could multiply significantly with fast, reliable internet.

 

The Problem with FTC Regulation

As part of the decision to kill Title II regulation, the FCC largely ceded its regulatory authority over broadband to the Federal Trade Commission. FTC regulation is exceedingly weak, meaning that broadband is largely unregulated.

A great example of this is the recent $60 million fine levied on AT&T by the FTC. This case stretched back to 2014 when the company advertised and charged a premium price for an unlimited cellular data plan. It turns out the plan was far from unlimited and once a customer reached an arbitrary amount of monthly usage, AT&T throttled download speeds to the point where the broadband was largely unusable.
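Mechanically, the practice the FTC targeted is a simple cap-then-throttle rule: full speed until an undisclosed threshold, then a crawl. All the numbers below are hypothetical, for illustration only:

```python
FULL_SPEED_MBPS = 30.0   # hypothetical LTE rate
THROTTLED_MBPS = 0.128   # hypothetical post-cap rate, too slow for video
SOFT_CAP_GB = 5.0        # hypothetical undisclosed cap on an "unlimited" plan

def delivered_speed(monthly_usage_gb):
    """Speed an 'unlimited' subscriber actually receives under cap-then-throttle."""
    return FULL_SPEED_MBPS if monthly_usage_gb < SOFT_CAP_GB else THROTTLED_MBPS

print(delivered_speed(2.0))  # below the cap: full speed
print(delivered_speed(6.0))  # past the cap: throttled
```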

This is clearly an unfair consumer practice, and the FTC should be applauded for fining AT&T. Unfortunately, the authority to levy fines for bad behavior is the practical extent of the FTC’s regulatory power.

A strong regulator would not have taken five years to resolve this issue. In today’s world, five years is forever, and AT&T has moved far past the network, the products, and the practices they used in 2014. In 2014 most of the cellular network was still 3G, moving towards 4G. It didn’t take a lot of cellular data usage to stress the network. It was a real crisis for the cellular networks when people started watching video on their phones, and the cellular companies tamped down on usage by enforcing small monthly data caps, and apparently by capping unlimited users as well.

A strong regulator would have ordered AT&T to stop the bad practice in 2014. The FTC doesn’t have that authority. The regulatory process at the FTC is to bring suit against a corporation for bad behavior. Often companies will stop the bad behavior immediately to soften the size of potential fines – but they are not required to do so. The FTC suit is like any other lawsuit, with discovery and testimony. Once the FTC finds the corporation guilty of bad behavior, the parties often negotiate a settlement, and it’s routine for corporations to agree never to undertake the same bad practices again.

A strong regulator would have ordered the whole cellular industry to stop throttling unlimited data customers. The FTC fine applied strictly to AT&T and not to any other cellular carrier. T-Mobile has advertised unlimited data plans for years that get throttled at some point, but this FTC action and the fine against AT&T have no impact on T-Mobile and the other wireless carriers. AT&T got its wrist slapped, but the FTC doesn’t have the authority to tell other cellular companies not to engage in the same bad behavior. The FTC regulates by punishing bad corporate actors and hoping that similar companies will modify their behavior.

A strong regulator would develop forward-thinking policies to head off bad behavior before it happens. One of the bulwarks of regulation is establishing policies that prohibit bad behavior and that reward corporations for good behavior. The FTC has no authority to create policy – only to police bad behavior.

Even if it wanted to regulate broadband more, the FTC doesn’t have the staffing needed to monitor all broadband companies. The agency is responsible for policing bad corporate behavior across all industries, so it only tackles the worst cases of corporate abuse, and more often than not it goes after the largest corporations.

At some point, Congress will have to re-regulate broadband. Unregulated corporations inevitably abuse the public. Without regulation, broadband prices are going to go sky-high. Without regulation there will be ISP policies that unfairly punish customers. Without regulation the big ISPs will eventually engage in all of the practices that net neutrality tried to stop. Having the FTC occasionally levy a big fine against a few big ISPs will not deter bad behavior across the whole ISP sector.

What we really need is an FCC that does what it’s supposed to do. If the FCC refuses to regulate broadband – the primary product under its umbrella – then the agency is reduced to babysitting spectrum auctions and not much else of consequence.

Our Degrading Cellular Networks

I don’t know about the rest of you, but I’ve noticed a lot of degradation in the cellular voice network over the last year or two, and the situation is noticeably worsening over time. For a decade or more the cellular network has been a bastion of strength and reliability. I rely heavily on my cellphone all day for work and for years I haven’t given the cellular network a thought because calls worked. Occasionally I’d get a bad voice connection that could be easily remedied by reinitiating a call. But that happened so infrequently that I barely noticed it – it was never something I considered as a problem.

Over the last year, this all changed. I’ve often had a problem making a call and have had to try the same number half a dozen times to make a connection. Calls mysteriously drop in mid-call, or even stranger, half of the call goes silent and only one party can be heard. Possibly the worst problem is that there are a lot more calls with poor voice quality – something I thought was a decade behind us.

I happen to work in a small city and it’s not hard to understand why my cell site would be stressed. Half of the homes in my neighborhood have at least one person working from home, and most spend a lot of time on the phone. Our street is only one block from a busy traffic corridor and is also full of businesses. We also have a significant number of teenagers. I would not be surprised to find that the busy hour on our local cellular network is during the afternoon.

However, this is not just a problem with urban cell sites. I’ve lately been asking others about their cellular calling, and at least half of the people I’ve asked tell me that the quality of the cellular network in their own neighborhood has gotten worse. Many of these folks live in small rural towns.

It’s not hard to understand why this is happening. The cellular companies have embraced ‘unlimited’ data plans, which, while not truly unlimited, have encouraged folks to lean on their cellular data. According to Cisco and OpenVault, the amount of data on cellular networks is now doubling every two years – a scorching growth rate that compounds to a 32-fold increase in data usage on the cellular networks in a decade. No network can sustain that kind of traffic growth for very long without first becoming congested and eventually collapsing under the load.
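To see how quickly that doubling compounds, here is a minimal sketch. The two-year doubling period comes from the Cisco/OpenVault figure above; the function name and everything else is illustrative:

```python
# Compound growth of network traffic that doubles every fixed period.
def growth_factor(years, doubling_period_years=2):
    """Return the multiple by which traffic grows after `years`."""
    return 2 ** (years / doubling_period_years)

# Doubling every 2 years over a decade is five doublings: 2^5 = 32x.
print(growth_factor(10))   # 32.0
```

Even a modest-sounding doubling period turns into an enormous multiple over ten years, which is why capacity fixes that only buy a year or two of headroom are stop-gaps.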

The cellular companies don’t want to talk openly about this crisis. I guess that the first cellular company to use the word ‘crisis’ will see its stock tank, so none of them are talking about why cellular performance is degrading. Instead, the cellular carriers have adopted the tactic of saying that we need to remove barriers to 5G and that we need to win the 5G race – but what they really want are solutions that fix the 4G networks before they crash.

The cellular companies have a three-pronged approach to fixing the problem. First, they are deploying small cell sites to relieve the pressure on the big cellular towers. One small cell site in my neighborhood would likely eliminate most of the problems I’ve been having, at least for a little while. Unfortunately, in a network where traffic is doubling every two years, this is a temporary solution.

The cellular companies have also been screaming for new mid-range spectrum, because adding spectrum to cell sites and cellphones expands the data capacity at each cell site. Unfortunately, working new spectrum into the cellular networks takes time. The FCC continues to slog through the approval process for new cellular spectrum, with the best example being the mess happening with C-Band spectrum. Even when new spectrum is approved, there is a significant market delay between approval and the time the new spectrum works its way into cell sites and phones.

Finally, the cellular carriers are counting on 5G. There are a few aspects of 5G that will significantly improve cellular service. The most important is network slicing, which will right-size the data path to each customer and get rid of today’s practice of dedicating a full channel to a customer who is doing some minor broadband task. 5G will also allow a customer to connect to a different cell site if the closest site is full. Finally, the 5G specifications call for a major expansion of the number of customers that can be served simultaneously from a cell site. Unfortunately for the cellular carriers, most of the major 5G improvements are still five years in the future. And as with new spectrum, there will be a market delay with each 5G breakthrough as updates make it into enough smartphones to make a difference.

There is a fourth issue that is likely a component of the degrading cellular networks. With expanding broadband needs, it’s likely that the backhaul links to cell sites are overloaded and under stress at peak times. It doesn’t matter if all of the above changes are implemented if the backhaul is inadequate – poor backhaul will degrade any broadband network. The big cellular carriers have been working furiously to build fiber to cell sites to eliminate leased backhaul. But much of the backhaul to cell sites is still leased, and the lease costs are one of the major expenses for cellular companies. The cellular companies are reluctant to pay a lot more for bandwidth, so it’s likely that at the busiest times of the day many backhaul routes are now overloaded.

The cellular companies need all of these fixes just to keep up with cellular demand growth. They need many more small cell sites, more spectrum, 5G upgrades, and robust backhaul. What I find scary is that all of these fixes might not be enough if cellular demand continues to grow at the same torrid pace. I’ve been thinking about buying a landline for my office – something I got rid of 20 years ago – because I don’t know if I can wait for the cellular companies to solve their crisis.

The Onslaught of New Content

As if cord cutting isn’t bad enough for cable companies, online OTT programming is exploding with numerous new options. One has to think that these new options will lure a lot more homes to ditch traditional cable TV.

Disney+. This service is hitting the streets with huge fanfare. It’s priced at $6.99 per month or $5.83 per month with an annual subscription. Disney+ will contain the content provided by Disney, Marvel, Lucasfilm, Pixar, and National Geographic. Disney owns the Star Wars franchise and is planning a lot of new Star Wars content. There will be new content created only for the Disney+ service like a series produced by the Jim Henson Company. Disney also owns most of Hulu and will be offering a bundled package of Disney+, Hulu, and ESPN+ for $12.99 per month.

Apple TV+. The service launched November 1 with a monthly fee of $4.99. It’s being offered for free to customers who buy an expensive Apple product like an iPhone, iPad, Mac, or Apple TV. The company has set a goal of 100 million customers within 3-4 years and will launch in over 100 countries. Apple is also offering new content created just for the service. It has announced partnerships for content from Oprah Winfrey, from Reese Witherspoon’s Hello Sunshine production company, and from Steven Spielberg’s Amblin TV. While not yet announced, Wall Street expects Apple to accumulate a library of older content. For now, the service doesn’t work on Amazon Fire and Roku devices, but should in the future.

HBO Max. This is being offered by AT&T and is slated to launch sometime in the spring of 2020. The company is offering it at $14.99 per month, the same price as HBO Now – the current online HBO offering that only carries the library of HBO content. Customers subscribing to HBO on a cable system might get the new service for free, and the company will likely migrate HBO Now customers to the new service. HBO Max brings in the vast library of content owned by WarnerMedia. There will be a curated, revolving list of classic movies. They’ve also bought the rights to shows like Friends. The company hopes to have 50 million paying customers by 2025. This is the only one of these services that doesn’t care whether customers buy its premium content online or from a cable company.

Peacock. This is owned by Comcast and is scheduled to launch in April 2020. The service is named for the NBC peacock logo. It will provide new content, including shows from Alec Baldwin and Demi Moore, and will carry the vast library of NBC programming. The new offering will also tie into Olympic coverage. For now, Comcast is thinking of giving the service free to every Comcast customer and may eventually make it free to everybody.

Quibi. This is a new service created by Jeffrey Katzenberg of DreamWorks. It will launch in early 2020 with a lot of new content. The unique thing about the service is that it will consist of short-duration content and will only be available on smartphones. The company is working with over 30 partners to create content aimed at younger viewers. The typical content will be 7-10 minutes in length. It has attracted big names like Steven Spielberg, Kevin Hart, Tyra Banks, and Jennifer Lopez. There are plans for vignettes from traditional titles like Punk’d, Varsity Blues, Vikings, and How to Lose a Guy in 10 Days.

Bloomberg. Just to show that not all new content is entertainment related, Bloomberg is also planning a new online offering. It will be subscription-based and will offer all of Bloomberg’s current business content plus new content. For example, there are plans for a series, Moon Shot, that looks at major scientific breakthroughs. Accelerate will look at test-driving the cars of the future. Prognosis will look at cutting-edge medicine.

The question faced by customers of traditional cable TV is whether they want to continue paying big monthly bills for traditional TV while also subscribing to some of this new content. There are a lot of households that are going to want to watch the Disney catalog of programming or see the new content on Apple TV+ or HBO Max. It seems likely that this flood of new content is going to convince more homes to cut the cord.