The Definition of Broadband

When the FCC set the definition of broadband at 25/3 Mbps in January of 2015, I thought it was a reasonable definition. At the time the FCC said that 25/3 Mbps was the minimum speed that defined broadband: anything at that speed or faster was broadband, and anything slower wasn’t.

2015 was forever ago in terms of broadband usage and there have been speed increases across the industry since then. All of the big cable companies have unilaterally increased their base broadband speeds to between 100 Mbps and 200 Mbps. Numerous small telcos have upgraded their copper networks to fiber. Even the big telcos have raised speeds in rural America through CAF II upgrades that brought speeds to 10/1 Mbps – and the telcos all claim they did much better in some places.

The easiest way to look at the right definition of broadband today is to begin with the 25/3 Mbps level set at the beginning of 2015. If that was a reasonable definition at the beginning of 2015, what’s a reasonable definition today? Both Cisco and Ookla track actual speeds achieved by households, and both say that actual broadband speeds have been increasing nationally by about 21% annually. Applying a 21% annual growth rate to the 25 Mbps download speed set in 2015 predicts that the definition of broadband today should be 54 Mbps:

2015: 25 Mbps
2016: 30 Mbps
2017: 37 Mbps
2018: 44 Mbps
2019: 54 Mbps
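
For anyone who wants to check the math, here is a minimal sketch of that compounding, using the 21% Cisco/Ookla growth rate cited above:

```python
# A minimal sketch of the compounding behind the table above: the 25 Mbps
# download threshold set in 2015, grown at a 21% annual rate.
speed = 25.0  # Mbps, the FCC's 2015 download threshold
for year in range(2015, 2020):
    print(f"{year}: {round(speed)} Mbps")
    speed *= 1.21  # 21% annual growth in household broadband demand
```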

We also have a lot of anecdotal evidence that households want faster speeds. Households have been regularly bailing on urban DSL and moving to faster cable company broadband. A lot of urban DSL can be delivered at speeds between 25 and 50 Mbps, and many homes are finding that to be inadequate. Unfortunately, the big telcos aren’t going to provide the detail needed to understand this phenomenon, but it’s clearly been happening on a big scale.

It’s a little sketchier to apply this same logic to upload speeds. There was a lot of disagreement about the 3 Mbps upload speed standard established in 2015. It seems to have been set to mollify the cable companies that wanted to assign most of their bandwidth to download. However, since 2015 most of the big cable companies have upgraded to DOCSIS 3.1 and they can now provide significantly faster uploads. My home broadband was upgraded by Charter in 2018 from 60/6 Mbps to 135/20 Mbps. It seems ridiculous to keep upload speed goals low, and if I were magically put onto the FCC, I wouldn’t support an upload speed goal of less than 20 Mbps.

You may recall that the FCC justified the 25/3 Mbps definition of broadband by looking at the various download functions that could be done by a family of four. The FCC examined numerous scenarios that considered uses like video streaming, surfing the web, and gaming. Those scenarios were naive because they didn’t account for the fact that the vast majority of homes use WiFi. Most people don’t realize that WiFi networks generate a lot of overhead due to collisions of data streams – particularly when a household is trying to run multiple big bandwidth applications at the same time. When I made my judgment about the 25/3 Mbps definition back in 2015, I accounted for WiFi overheads and I still thought that 25/3 Mbps was a reasonable definition for the minimum speed of broadband.

Unfortunately, this FCC is never going to unilaterally increase the definition of broadband, because by doing so they would reclassify millions of homes as not having broadband. The FCC’s broadband maps are dreadful, but even with the bad data, it’s obvious that if the definition of broadband was 50/20 Mbps today that a huge number of homes would fall below that target.

The big problem with the failure to recognize the realities of household broadband demand is that the FCC is using the already-obsolete definition of 25/3 Mbps to make policy decisions. I have a follow-up blog to this one that will argue that using that speed as the standard for the upcoming $20.4 billion RDOF grants will be as big a disaster as the prior FCC decision to hand out billions to upgrade to 10/1 Mbps DSL in the CAF II program.

The fact that household broadband demand grows over time is not news. We have been on roughly the same demand growth curve since the advent of dial-up. It’s massively frustrating to see politics interfere with what is a straight engineering issue. As homes use more broadband, particularly when they want to do multiple broadband tasks at the same time, their demand for faster broadband grows. I can understand that no administration wants to admit that things are worse than they would like them to be – so they don’t want to set the definition of broadband at the right speed. But it’s disappointing, because the function of the FCC is supposed to be making sure that America gets the broadband infrastructure it needs. If the agency were run by technologists instead of political appointees we wouldn’t even be having this debate.

Why Aren’t We Talking about Technology Disruption?

One of the most interesting aspects of modern society is how rapidly we adapt to new technology. Perhaps the best illustration of this is the smartphone. In the short period of a decade, we went from a new invention to the point where the large majority of the American public has a smartphone. Today the smartphone is so pervasive that recent statistics from Pew show that 96% of those between 18 and 29 have a smartphone.

Innovation is exploding in nearly every field of technology, and the public has gotten so used to change that we barely notice announcements that would have made worldwide headlines a few decades ago. I remember as a kid when Life Magazine had an issue largely dedicated to nylon and polymers that had the world talking about something that wouldn’t even get noticed today. People seem to accept miracle materials, gene splicing, and self-driving cars as normal technical advances. People now give DNA test kits as Christmas presents. Nobody blinks an eye when big data is used to profile and track us all. We accept cloud computing as just another computer technology. In our little broadband corner of the technology world, the general public has learned that fiber and gigabit speeds are the desired broadband technology.

What I find perhaps particularly interesting is that we don’t talk much about upcoming technologies that will completely change the world. A few technologies get talked to death such as 5G and self-driving cars. But technologists now understand that 5G is, in itself, not a disruptive technology – although it might unleash other disruptive technologies such as ubiquitous sensors throughout our environment. The idea of self-driving cars no longer seems disruptive since I can already achieve the same outcome by calling an Uber. The advent of self-driving semi trucks will be far more disruptive and will lower the cost of the nationwide supply chain when we use fleets of self-driving electric trucks.

I’ve always been intrigued by those who peer into the future and I read everything I can find about upcoming technologies. From the things I read, there are a few truly disruptive technologies on the horizon. Consider the following innovations that aren’t too far in the future:

Talking to Computers. This will be the most important breakthrough in history in terms of the interface between humans and technology. In a few short generations, we’ve gone from typing on keyboards, to using a mouse, to using cellphones – but the end game will be talking directly to our computers using natural conversational language. We’ve already seen significant progress with natural language processing and are on a path to be able to converse with computers in the same way we communicate with other people. That will trigger a huge transition in society. Computers will fade into the background since we’ll have the full power of the cloud anywhere we’re connected. Today we get a tiny inkling by seeing how people use Apple Siri or Amazon Alexa – but these are rudimentary voice recognition systems. It’s nearly impossible to predict how mankind will react to having the full power of the web with us all of the time.

Space Elevator. In 2012 the Japanese announced a nationwide goal of building a space elevator by 2050. That goal has now been pulled forward to 2045. A space elevator will be transformational since it will free mankind from the confines of planet Earth. With a space elevator we can cheaply and safely move people and materials to and from space. We can drag up the raw materials needed to build huge space factories that can then take advantage of the mineral riches in the asteroid belt. From there we can colonize the Moon and Mars, build huge space cities, and build spaceships to explore nearby stars. The cost of the space elevator is still estimated to be only around $90 billion, about the same as the cost of the high-speed rail system between Osaka and Tokyo.

Alternate Energy. We are in the process of weaning mankind from fossil fuel energy sources. While there is a long way to go, several countries in Europe have the goal to be off carbon fuels within the coming decade. The EU already gets 30% of its electricity from alternate energy sources. The big breakthrough might finally come from fusion power. This is something that has been 30 years away my whole adult life, but scientists at MIT and elsewhere have developed the magnets needed to contain the plasma necessary for a fusion reaction, and some scientists are now predicting fusion power is only 15 years away. Fusion power would supply unlimited non-polluting energy, which would transform the whole world, particularly the third world.

An argument can be made that there are other equally disruptive technologies on the horizon like artificial intelligence, robotics, gene-editing, virtual reality, battery storage, and big data processing. Nothing on the list would be as significant as a self-aware computer – but many scientists still think that’s likely to be far into the future. What we can be sure of is that breakthroughs in technology and science will continue to come at us rapidly from all directions. I wonder if the general public will even notice the most important breakthroughs or if change has gotten so ho-hum that it’s just an expected part of life.

Worldwide Broadband Trends

Hootsuite is the premier tracker of social media usage around the world. They publish numerous reports annually that track broadband statistics and social media statistics from around the world.

They report the following statistics for the end of 2018. The world has been seeing one million new users online every day since January 2018. That means there are 11 new users on the web every second. There are now 5.11 billion mobile subscribers in the world, 67% of the world’s population. 4.39 billion people have access of some sort to the internet, about 57% of the people in the world. There are 3.48 billion people who use social media.

Mobile subscribers increased by 2% in 2018. Internet users increased by 9.1% and active social media users increased by 9%.

The US and northern Europe both lead the world in Internet access with 95% of the population using the internet from a landline or cellular connection. The rest of the world is still far behind. While we talk about the great connectivity in parts of the Far East, that region has only a 60% penetration of people who use the Internet. That’s lower than the 63% penetration in Central America and 74% in South America. The areas with the worst broadband coverage are middle Africa at only 12%, eastern Africa at 32% and western Africa at 41%.

The largest growth of Internet users is in India, which saw almost 100 million new Internet users in 2018, a 21% increase. That represents 25% of all new Internet users in the world for last year. Some other countries are growing faster, such as Afghanistan at 156%, Cote D’Ivoire at 69%, Cambodia at 56%, Iran at 29%, and Italy at 27%.  Hootsuite has been tracking Internet users since 2014 and has seen more than 1.9 billion people added to the Internet since then.

The World Wide Web turns 30 this year (that’s hard for many to believe!). It took 16 years to add the first billion users, 6 more years to add the second billion. The internet is now adding a billion users every 2.7 years.

The importance of cellular broadband has grown over time. In 2014, 26% of users connected to the web using a cellular phone. Today that has grown to 48%. The average Internet user worldwide is online 6 hours and 42 minutes per day. The biggest daily users of the web are in the Philippines, with daily usage over 10 hours per day. In the US the average is 6.5 hours per day.

Google has the world’s two most popular web sites with Google search at number 1 and YouTube at number 2. Facebook is in third, with the top ten rounded out by Baidu, Wikipedia, Yahoo, Twitter, Pornhub, Yandex, and Instagram.

GlobalWebIndex reports that 92% of Internet users (about 4 billion) now watch video each month. To put that into perspective, an estimated 6 billion people around the world have access to a television.

It’s estimated that more than 1 billion users now stream games, with Fortnite being the number one game in the world. There are also a billion people who watch other people play games, including 700 million who watch e-sports.

About 40% of Internet users now interface with the web using voice. In China and India over half of users do so.

Social media grew by 288 million new users last year. The US still leads in social media, with 70% of American internet users connected to at least one social media site. China also has a 70% social media penetration, followed by 67% in northern Europe and 66% in South America. China added 95 million users to social media in 2018, followed by India at 60 million and Indonesia at 20 million. Worldwide, the average social media usage is 2 hours and 16 minutes per day. The Philippines again leads in this category, with daily usage of 4 hours and 12 minutes. In the US it’s a little over 2 hours per day.

While there are still billions with no access to the web, the web keeps growing at a rapid pace around the world. There are efforts by companies like Google, Facebook, and the satellite broadband providers to bring better broadband to the parts of the world with no connections.

Providing Local Content to Rural America

This fall we can look forward to a big battle in Congress over the rules regulating cable TV. The rules that govern the rights of satellite TV providers to carry local network affiliates (ABC, CBS, NBC, and FOX) will be expiring. If Congress takes no action it’s possible that local networks could disappear from satellite cable lineups.

In 2014 Congress passed the STELAR Act (Satellite Television Extension and Localism Act Reauthorization). This legislation allowed satellite providers to deliver distant network stations into rural markets rather than having to negotiate with individual stations in every market. This has acted to hold down satellite TV costs since it gives local stations an incentive to negotiate reasonable fees with the satellite providers.

This is a big contrast to the way that landline cable networks have to pay for local programming. In the 1992 Cable Act, Congress enacted the idea of retransmission consent. These rules were intended to protect local network affiliates since many cable companies at the time were electing not to add local stations’ channels to their line-ups. The 1992 Act made it mandatory for cable companies to carry local networks, and for the most part, they did so for free.

However, over the last decade, as local stations were losing advertising revenues, they have stepped up charges to cable companies to carry their signal. The fees for access to local network affiliates have skyrocketed and contribute $10 – $12 per month to cable TV bills in most markets (a lot more in a few).

There is a lot of pressure on Congress to look at the whole retransmission issue while they are considering the STELAR renewal for satellite companies. Congress hasn’t made any significant regulatory changes for the cable industry since the 1992 Act and the industry has changed drastically in the last few years.

Just looking back to 2014 when the STELAR Act was passed, online content providers like Netflix represented only 2% of the industry. Today there is a slew of online content providers and there are now more households buying online content than subscribing to traditional cable packages. With cord-cutting, the numbers are shifting drastically, with the latest figures trending towards traditional cable losing as much as 5% of total market share this year.

We are also seeing escalating battles over carriage of content. There were 213 blackouts in the industry as of the end of July, compared to 165 blackouts in all of last year. Last month a battle between AT&T and CBS caused those channels to go dark. There has been a running battle between Dish Network and Univision this year. It’s becoming obvious that the cable companies are no longer willing to automatically accept huge rate increases from local network affiliates.

It’s a classic battle of huge companies. Cable companies are pushing to eliminate the mandatory nature of the retransmission consent rules that require them to carry local stations – those rules give cable companies almost no negotiating power. Meanwhile, the big networks like ABC and CBS have been benefitting from the retransmission revenues. While these fees are theoretically paid to local stations, the parent networks sweep most of this money into their own coffers. The network owners are pushing hard to keep the retransmission consent rules intact.

Most local stations now charge cable companies between $2 and $3 monthly for every customer that receives their signal. It’s an interesting dynamic because a majority of people could instead get this content for free through the use of rabbit ears. Additionally, most of the national content from the big networks is available online – it’s not hard to find ways to watch the shows from CBS or NBC. What the big monthly retransmission fees really buy cable subscribers is local programming like news and local sports.

The cord-cutting phenomenon tells us that many households are willing to walk away from local programming if it saves them money. I was in a meeting last week with ten people, and not one of them watches local news and local programming. The big question facing Congress is how relevant local content is to most households. There are many people who still love local news and local sports, but that universe keeps shrinking as households are deluged with content alternatives. Expect to hear lots of rhetoric this fall as both sides ratchet up arguments for Congress.


GAO Supports Broadband to Students

In an interesting move, the Government Accountability Office released a paper that suggests that the FCC should expand the use of E-Rate funding to allow wireless connections to student homes. The E-Rate program is one of the major components of the Universal Service Fund. Today the E-Rate program can be used to construct fiber assets to reach schools or to subsidize fast broadband connections to schools that demonstrate financial need.

Specifically, the GAO “recommended that the FCC assess and report on the potential benefits, costs, and challenges of making wireless access off school grounds eligible for E-rate.” The GAO recommendation went on to discuss the impact of the homework gap that puts students without home broadband at risk of falling behind their peers.

I’m sure that every rural broadband advocate understands this issue and applauds the GAO for making this suggestion. However, this seems extraordinary coming from the GAO and seems outside of their stated scope and function. The GAO is the internal auditor of the US government and is often called the congressional watchdog agency that examines how taxpayer dollars are spent. The GAO routinely provides reports to Congress with ideas on how the government can save money and work more efficiently.

I may have missed it, but I can’t remember the agency ever making policy suggestions of this magnitude to the FCC in the past. This seems like a particularly puzzling recommendation because it’s a suggestion on how to spend existing E-Rate money in a different way rather than suggesting ways for the FCC to save money.

The GAO report does a great job of describing the homework gap. The report discusses the challenges that school-age children face in doing homework without a computer. The GAO found that many homes without home broadband rely on cellular connectivity, which is largely inadequate for doing homework. The report also catalogs the shortcomings and challenges of the alternatives to home broadband, such as libraries, community centers, coffee shops, and outdoor WiFi around schools.

The GAO looked into six pilot projects at schools that tried to alleviate the homework gap by using wireless technology to connect students. The GAO pointed out that in 2016 two schools asked for FCC waivers in the E-Rate program to provide wireless connectivity to students, but the FCC never reacted to the waiver requests.

The GAO concludes the report by recommending that the FCC take steps to assess and publish the potential benefits, costs, and challenges of making off-premises wireless access eligible for E-rate support. The FCC has already agreed with the GAO’s recommendation, so I’m sure there will be an assessment coming in the near future.

Some Problems with the RDOF

The FCC recently published a set of proposed rules for conducting the $20.4 billion broadband grant program it has labeled as the Rural Digital Opportunity Fund (RDOF). While the FCC is to be applauded for redirecting the funding that formerly supported the CAF II program, there are still some problems I foresee in the grant program as proposed.

Reverse Auction. Many of my problems come because of the use of a reverse auction. I understand why politicians and policymakers like this idea. The general concept that those willing to take the least amount of subsidy get the funding somehow sounds fair, but a reverse auction is not going to result in the best use of these funds to bring permanent broadband solutions to rural America:

  • Favors Those Who Don’t Need the Money. We saw this in the CAF II reverse auction where satellite broadband won a significant amount of funding. This time around there’s a good chance that a large amount of grant money might go to Elon Musk’s Starlink and the other low orbit satellite providers. By definition, for satellite technology to work the satellites have to cover everywhere – and so they are going to be launched anyway without subsidy. These companies can easily be the low bidders because getting anything out of the grant fund is still a great result for them. Are we going to be happy if the result of the reverse auction is billions of dollars handed to Elon Musk?
  • Favors Lowest Cost Technology. By definition, those planning to spend less per customer to bring broadband can accept less grant money and still be happy. This means the grants will favor solutions like the big telcos again tweaking DSL over ancient copper, if they choose to participate. It could also allow AT&T and Verizon to grab a lot of money to support rural cellular upgrades. While the FCC is planning to weight the bidding to promote faster technologies like fiber, if the weighting isn’t done right, then the funding will automatically favor lower-cost yet slower technologies. Maybe that’s what the FCC wants – to bring some broadband solution to the largest number of people – but the better policy is to bring a permanent broadband solution, even if to a smaller subset of areas.
  • Discriminates Against High Cost Areas. The areas that need broadband the most are where it costs the most to build any infrastructure. Areas like Appalachia and Alaska are high cost because of topography, and anybody applying for grants in these areas likely can’t afford to reduce the percentage of grant funding they receive. The entire concept of a reverse auction, by definition, favors parts of the country with the lowest construction costs. Applicants in the wide-open plains of the Midwest have a built-in advantage.

The Sheer Size of the One-Time Award. The grant awards are likely to be about a year away, and I wonder if there will be enough ISPs ready to bid in that short time frame. Bidders need to develop an engineering estimate and a business plan of sufficient quality to attract financing. If there are not enough ISPs ready for the auction in that time frame, even more of the money is likely to flow to big companies like the satellite providers who would be glad to take the whole pot of funding. A better plan would have been to break this into several grant years and award some 10-year grants, some 9-year grants, and some 8-year grants.

No Real Penalties for Cheating. Companies don’t get penalized much for lying about the speeds they can deliver. We saw a few wireless providers in the CAF II reverse auction claim they could deliver 100 Mbps broadband to everybody. Unless somebody develops that technology in the next 2-3 years they are going to deliver something less, at least to a large percentage of their coverage area. If a company gets a bidding credit by making a false claim, they should lose all of their funding and have to repay the FCC. The proposed penalties are not much more than a slap on the wrist and encourage companies to claim faster speeds than they can deliver.

Likely Excludes Some Bidders. The rules still seem to exclude entities that can’t get Eligible Telecommunications Carrier (ETC) status – a regulatory designation required to get money from the Universal Service Fund – a status only available to entities that own the network and are also the retail ISP. This would preclude entities like the PUDs, the public utility districts in Washington that are required by law to operate open access networks. It also could preclude certain kinds of partnerships where the retail ISP is different than the network owner – an arrangement we’re seeing a lot in partnerships between telcos and electric cooperatives. Anybody willing to invest in rural broadband should be eligible to participate.

Are You Paying to Spy on Yourself?

Geoffrey A. Fowler of the Washington Post recently engaged a data expert to track everything going on behind the scenes with his iPhone. What he found was surprising since Apple touts itself as a company that doesn’t invade user privacy. The various apps on his phone were routinely handing out his personal data on a scale that shocked him.

Fowler’s information was being gathered by trackers. This is software built directly into apps and is different than ad tracking cookies that we pick up from web sites. App makers deliberately build trackers into apps and a user can’t get rid of them without getting rid of the app.

Most apps on his phone had these trackers. That included apps from Microsoft OneDrive, Intuit’s Mint, Nike, Spotify, The Washington Post, and the Weather Channel. Some apps came with numerous trackers. He had a food delivery app called DoorDash that included nine separate trackers. Third parties must be paying to share app space because the DoorDash app included trackers for Facebook and Google – those two companies know every time that app is used to order food.

Almost none of these apps disclosed the nature of what they were tracking. When first loaded, most apps ask for somewhat generic permission to track certain user data but don’t disclose the frequency and the extent to which they will gather data from a user.

This issue has relevance beyond privacy concerns because the apps on Fowler’s phone could collectively use as much as 1.5 gigabytes of data per month on his phone. Industry statistics show that the fastest-growing segment of Internet traffic is machine-to-machine communication, and these app trackers make a significant contribution to that traffic. Put bluntly, a lot of machine-to-machine traffic is either being used to back up files or to spy on us.

This has to be concerning to people who are still on measured cellular data plans. This unintended usage can cost real money and a user can end up paying to have trackers spy on them. Our cellphones are generating broadband usage without our knowledge, and mostly without our explicit permission. I’ve had months where I’ve barely roamed with my cellphone and still have seen more than a gigabyte of usage – I now understand where it’s probably coming from.

PCs and tablets have the same problems, with the data tracking coming more from marketing cookies that are loaded when we visit web sites. I scrub these cookies from my computer routinely. My desktop is only used for work and I still find 40 – 100 cookies every week. One of my blogs last year mentioned a guy who had gone on vacation for a month and was shocked when he returned and discovered that his home network had used several gigabytes of data in his absence.

There are ways to block the trackers on your phone, but this mostly involves deleting apps or turning off permissions in your privacy settings, and that largely means the apps won’t work. You can also take steps to disguise your data by passing everything through a VPN, but that doesn’t stop the data from being transmitted.

The phone manufacturers are complicit in this tracking. I just got a new Samsung Galaxy and my new phone came with over 300 apps – most for services I don’t use like Facebook, Spotify, and a ton of others. These various companies must have paid Samsung (or perhaps AT&T) to include their apps and their trackers. I’ll be spending a few days deleting or disabling most of these apps. I find it creepy that Facebook follows me even though I stopped using the site several years ago. And unlike when I download a new app, I didn’t have the opportunity to allow or deny permission for the many apps on my new phone – I assume AT&T gave that permission.

It might be a generational thing, but it bothers me to have companies reaping my personal data without my permission and without disclosing what they are gathering and how they are using it. I know young people who are not bothered by tracking and assume that this is just a part of being connected.

The other big concern is that the tracking apps are contributing to the capacity problems on cellular networks. I just saw last week that the average US cellphone now uses about 6 GB of data per month. If trackers are pushing out even half a gigabyte per month in usage, that is a significant contributor to swamped cellular networks. Cellphone companies are working furiously to keep ahead of the demand, and it must be maddening to cellular network engineers to know that 15% – 20% of network usage is being created behind the scenes by app trackers and not by actions taken by users.

In an ideal world, this is something regulators would be investigating in order to establish rules. Apps like DoorDash shouldn’t be allowed to install a Facebook tracker on your phone without asking for specific and explicit permission. All trackers should have to disclose the exact information they gather about a user and the frequency of that tracking. Unfortunately, this FCC has walked away from any regulatory role in this area. Congress could address the issue – something that European regulators are considering – but this doesn’t seem to be high on anybody’s radar.

Court Chips Away at 5G Deployment Rules

The US Court of Appeals for the D.C. Circuit ruled last week that the FCC had gone too far when it ruled that 5G cell site placement could bypass environmental and historic preservation review. The specific ruling looked at whether the FCC has the authority to bypass these kinds of reviews for sites of religious and cultural importance to federally recognized Indian Tribes. But the ruling has a far larger significance and applies to these kinds of reviews everywhere.

This type of court ruling seemed inevitable because of the brashness of the original FCC order. That order declared that the deployment of 5G is so important that the rules that apply to the deployment of any other new infrastructure in the country don’t apply to it. For the courts to buy that argument, they must be convinced that 5G deployment is so important that it is indeed a national emergency.

I think everybody who understands the benefits of 5G understands that it is an important new technology – one that will create huge benefits for the country. But it’s hard to make an argument that 5G deployment is an emergency.

The biggest benefits of 5G are only going to manifest with the introduction of frequency slicing into the cellular network, and that looks to be 3 – 4 years away. The deployments that the cellular carriers are labeling as 5G today are mostly marketing gimmicks, and customers are not yet seeing any of the real benefits of 5G.

I blame the original FCC 5G order on a poorly chosen strategy by the cellular carriers, abetted by the FCC. We are facing a cellular emergency in the country, but it’s a crisis of 4G and not 5G. Our existing 4G network is in serious trouble and it seems that the cellular carriers don’t want to admit it. Cellular data networks are swamped because customer data usage is now doubling every two years. I have seen big problems in my local AT&T network. There have been many days when it’s hard to make or hold a call – something that never happened before last year.

The explosive growth of cellular traffic is partially the fault of the cellular carriers – it comes as a result of ‘unlimited’ data plans that encourage people to watch video and use cellphone data. It wasn’t that long ago when it cost a lot to buy a data plan that exceeded 1 or 2 gigabytes of usage per month. The average customer with an unlimited plan now uses 6 GB per month, and that number is growing rapidly.

The other cause of the increased demand on cellular networks comes from the industry’s success in convincing everybody to use a smartphone. A recent Pew poll showed that 95% of teens and young adults now have a smartphone. The sheer number of customers is swamping the networks.

There is a path out of the current data crisis for cellular networks. It’s a three-pronged approach that involves building more cell sites, adding more bands of frequency onto cellphones, and finally layering on the frequency slicing capabilities of 5G.

It takes 3 – 5 years to introduce a new frequency into the cellular network. That involves upgrading cell sites, but more importantly, it means building the capability into handsets and then getting the new phones into the hands of enough people to make a difference.

With real 5G benefits still a few years off, the only immediate way to relieve pressure on the cellular network is to add small cell sites. Each small cell site grabs local callers and keeps them off the big tall cell towers. All of the hectic small cell site construction we see is not being done for 5G – it’s being done to take the pressure off the 4G network.

The big cellular companies seem unwilling to admit that their networks are hurting and are in danger of overload – the first company brave enough to say that probably loses customers. Instead, the cellular industry elected to push the 5G narrative as the reason for bypassing the normal way that we build infrastructure. In this case, the courts didn’t buy that 5G is an emergency, and the court is right because 5G isn’t even here yet. If the cellular carriers and the FCC had declared a 4G emergency I think everybody would have gotten it. We all want our cellphones to work.

The courts are still reviewing the appeal of an issue with potentially even more dire consequences for the cellular carriers. Probably the most important aspect of the FCC’s 5G ruling is that cities have little say about the placement of small cell sites and must also expedite permitting for new small cell sites. That ruling was challenged by numerous cities and is being reviewed by the US Court of Appeals for the Ninth Circuit. That issue also boils down to the question of whether deploying 5G is an emergency. I wonder if it’s too late for the cellular carriers to fess up and admit that the emergency is really for 4G – even appeals court judges would likely understand that.

How Smart are Promotional Rates?

I think the big ISPs are recognizing the impact that special promotional rates have on their bottom line. Promotional pricing is the practice of offering low rates to new customers to pry them away from the competition. Over the years promotional rates have also become the tool that cable companies use to retain customers. Most customers understand that they have to call the cable company periodically to renegotiate rates – and the big ISPs have routinely given customers a discount to keep them happy.

We’re finally seeing some changes with this practice. When Charter bought Time Warner Cable they found that Time Warner had over 90,000 ‘special’ pricing plans – they routinely negotiated separately with customers when they bought new service or renegotiated prices. Charter decided to end the practice and told most former Time Warner customers that they had to pay the full price at the end of their current contract period.

We’ve seen the same thing with AT&T and DirecTV. The company decided last year to eliminate the special discounts on DirecTV and DirecTV Now. When the discount period ends for those products the company moves rates to the full list price and refuses to renegotiate. The practice cost AT&T almost a million customers just in the first quarter of this year, but AT&T says that they are glad to be rid of customers who are not contributing to the bottom line of the company. I’ve seen where the CEOs of other big ISPs like Comcast have said that they are considering changes to these practices.

At CCG we routinely examine customer bills from incumbent ISPs as part of the market research we do to help ISPs enter new markets. While our examination of customer bills has never amounted to a statistically valid sample, I can report that the vast majority of bills we see have at least some level of discount. In some markets it’s rare to find a customer bill with no discount.

The discounts must accumulate to a huge loss of revenue for the big ISPs. The big ISPs all know that one of the only ways they are going to be profitable in the future is to raise broadband rates every year. The growth of broadband customers overall is slowing nationwide since most homes have broadband, although Charter and Comcast are still enjoying the migration of customers off DSL. The ISPs are continuing to lose revenues and margins as they lose cable and landline voice customers. Most US markets are seeing increased competition in broadband services for businesses and large MDUs. There’s not much left other than to raise residential broadband rates if the big ISPs want to satisfy the revenue growth expected by Wall Street.

If the big ISPs phased out promotional discounts it would probably equate to a 5% to 10% revenue increase. This is something that is becoming easier for a cable company to do. Many of them have already come to grips with cord cutting, and many are no longer fighting to keep cable customers. Cable companies are also less worried over time about customers leaving them to go back to DSL – a choice that is harder for consumers to make as the household need for broadband continues to climb.

Most ISPs won’t make a loud splash about killing discounts but will just quietly change policies. After a few years, I would expect customer expectations will reset after they realize that they can no longer extract discounts by threatening to drop service.

I’ve always advised my fiber overbuilder clients not to play this game. I ask clients if they really want to fight hard to win that slice of the market made up of customers who will change ISPs for a discount. Such customers flop back and forth between ISPs every two years, and in my opinion, companies are better off without them. Churn is expensive, and it’s even more expensive if an ISP provides a substantial discount to stop a customer from churning. Not all of my clients agree with this philosophy, but if the big ISPs stop providing promotional discounts, then over time the need for competitors to do this will lessen.

This is certainly a practice I’d love to see slip into history. I’ve never liked it as a customer because I despise having to play the game of renegotiating with an ISP every few years. I’ve also hated this as a consultant. Too many times I’ve seen clients give away a huge amount of margin through these practices, sacrificing revenue that is needed to meet their forecasts and budgets. It’s dangerous to let marketing folks determine the bottom line because they’ve never met a discount they don’t like – particularly if they can make a bonus for selling or retaining customers.

FCC Proposes Rules for $20.4 Billion Broadband Grants

On August 2 the FCC released a Notice of Proposed Rulemaking (NPRM) that proposes rules for the upcoming grant program that will award $20.4 billion for rural broadband. Since every FCC program needs a name, this grant program is now designated as the Rural Digital Opportunity Fund (RDOF). An NPRM is theoretically only a list of suggestions by the FCC, and there is a comment period that will commence 30 days after the NPRM is posted in the Federal Register. However, realistically, the rules that are proposed in the NPRM are likely to be the rules of the grant program. Here are a few of the highlights:

Timing of Award. The FCC proposes awarding the money in two phases. The Phase I awards will be made late next year and will total over $16 billion. Phase II will follow and award the remaining $4.4 billion. I know a lot of folks were hoping for a $2 billion annual grant award – but most of the money will be awarded next year. Anybody interested in this program should already be creating a network design and a financial business plan, because the industry resources that create business plans are soon going to be too busy to help.

The money will be paid out to grant recipients over 10 years, similar to the ACAM program for small telcos. Grant recipients need to understand the time value of money. If an ISP wins a $1 million grant and borrows money at a rate of 5.5% interest, then the actual value of the grant in today’s dollars is a little more than $750,000.
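
As an illustration of that point, here is a minimal sketch of the discounting, assuming the grant arrives as ten equal annual payments and a 5.5% cost of money:

```python
# A minimal sketch of the time value of money on a 10-year grant payout,
# assuming ten equal annual payments and a 5.5% cost of money.
def npv_of_grant(total, years, rate):
    payment = total / years  # e.g. $100,000 per year on a $1 million grant
    return sum(payment / (1 + rate) ** y for y in range(1, years + 1))

print(f"${npv_of_grant(1_000_000, 10, 0.055):,.0f}")  # a bit over $750,000
```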

Areas Eligible for Award. Phase I funding will only be awarded in areas that are wholly unserved using the definition of broadband as 25/3 Mbps or faster. The areas covered can’t have anybody capable of getting broadband faster than that. The FCC is likely to publish a list of areas eligible for the Phase I grants. Unfortunately, the FCC will use its flawed mapping program to make this determination. This is likely to mean that many parts of the country that ought to be eligible for these grants might not be part of the program.

Phase II is likely to be targeted at areas that did not see awards in Phase I. One of the open questions in the NPRM is the size of award areas. The NPRM asks if the minimum coverage area should be a census block or a county. It also asks if applicants can bundle multiple areas into one grant request.

The FCC is considering prioritizing areas it thinks are particularly needy. For example, it may give extra grant weighting to areas that don’t yet have 10/1 Mbps broadband. The FCC is also planning on giving extra weighting to some tribal areas.

Weighting for Technology. Like with the CAF II reverse auction, the grant program is going to try to give priority to faster broadband technologies. The FCC is proposing extra weighting for technologies that can deliver at least 100 Mbps and even more weighting for technologies that can deliver gigabit speeds. They are also proposing a grant disincentive for technologies with a latency greater than 100 milliseconds.
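
As a rough illustration of why the weighting details matter – using invented weights, not the FCC’s proposed values – here is how penalty points added to slower or higher-latency bids can change an auction outcome:

```python
# A hypothetical illustration of technology weighting in a reverse auction.
# The weights below are invented for this example, not the FCC's values.
# Bids are expressed as a percentage of the reserve price for an area, and
# penalty points are added before bids are compared - lowest score wins.
TIER_WEIGHT = {"gigabit": 0, "100 Mbps": 15, "25/3 Mbps": 50}  # assumed
LATENCY_WEIGHT = {"low": 0, "high": 25}                        # assumed

def score(bid_pct, tier, latency):
    return bid_pct + TIER_WEIGHT[tier] + LATENCY_WEIGHT[latency]

# With enough weighting, a fiber bid at 70% of the reserve price beats a
# high-latency 25/3 Mbps bid at only 40% of the reserve price:
print(score(70, "gigabit", "low"))     # 70
print(score(40, "25/3 Mbps", "high"))  # 115
```

If the weights were much smaller, the 40% bid would win instead – which is exactly the concern about weighting that isn’t done right.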

Use of Funds. Recipients will be expected to complete construction to 40% of the grant-eligible households by the end of the third year, with 20% more expected annually and the whole buildout finished by the end of the sixth year.

Reverse Auction. The FCC is proposing a multi-round, descending clock reverse auction so that bidders who are willing to accept the lowest amount of subsidy per passing will win the awards. This is the same process used in the CAF II reverse auctions.
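
For readers unfamiliar with the mechanism, here is a simplified sketch of a descending clock auction for a single area, with hypothetical bidders and subsidy floors – illustrative only, not the FCC’s actual auction rules:

```python
# A simplified, illustrative sketch of a descending clock reverse auction
# for one area - not the FCC's actual rules. The clock starts at the
# reserve price and ticks down; bidders exit once the subsidy on offer
# falls below their minimum, and the last bidder standing wins.
def descending_clock(reserve, floors, step=10):
    clock = reserve
    while clock > 0:
        active = {bidder: floor for bidder, floor in floors.items() if floor <= clock}
        if len(active) <= 1:
            return active, clock
        floors = active
        clock -= step
    return {}, 0

# Hypothetical per-passing subsidy floors for three bidders:
winners, price = descending_clock(1000, {"fiber": 800, "wisp": 600, "satellite": 300})
print(winners, price)  # the satellite bidder wins at a clock price of 590
```

Note how the bidder with the lowest cost structure wins even at a price well above its floor – which is why low-cost technologies have a built-in edge in this format.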

Overall Eligibility. It looks like the same rules for eligibility will apply as with previous grants. Applicants must be able to obtain Eligible Telecommunications Carrier (ETC) status to apply, meaning they must be a facilities-based retail ISP. This will exclude entities such as open access networks where the network owner is a different entity than the ISP. Applicants will also need to have a financial track record, meaning start-up companies need not apply. Applicants must also provide proof of financing.

Measurement Requirements. Grant winners will be subject to controlled speed tests to see if they are delivering what was promised. The FCC is asking if they should keep the current test – where only 70% of customers must meet the speed requirements for an applicant to keep full funding.

I see problems with a few of these requirements that I’ll cover in upcoming blogs.