The Frontier Bankruptcy

Bloomberg reported that Frontier Communications is hoping to file a structured bankruptcy in March. A structured bankruptcy is one where existing creditors agree to cut debt owed to them to help a company survive. There is no guarantee that the existing creditors will go along with Frontier’s plan, and if not, the bankruptcy would be handed to a bankruptcy court to resolve.

It’s been obvious for a long time that Frontier is in trouble. Three years ago, the stock sat at over $51 per share. By January 2018 it had fallen to $8.26 per share, and to $2 per share a year ago. As I write this blog the stock sits at 59 cents per share.

Frontier has been losing customers rapidly. In the year ending September 30, 2019, the company lost 6% of its broadband customers (247,000), with 71,000 of those losses occurring during the third quarter of last year.

For those not familiar with the history of Frontier, the company started as Citizens Telephone Company, a typical small independent telco. The company grew by buying telephone customers from GTE, Contel, and Alltel. The company became Frontier when they bought the remains of the Rochester Telephone Company from Global Crossing. Since then, Frontier has gone on a buying spree and purchased large numbers of customers from Verizon.

Frontier’s woes intensified in 2016 when they bungled the takeover of Verizon FiOS customers while taking on huge debt. There were outages in some major markets that drove customers to switch to the cable company competitor. However, Frontier’s biggest problem comes from operating a lot of rural copper networks. The copper networks they purchased had been poorly maintained before Frontier acquired them. For example, Frontier bought all of the Verizon customers in West Virginia, a market Verizon had been ignoring and trying to sell for over fifteen years.

Frontier got a small boost when the FCC gave them $1.7 billion to upgrade rural DSL to speeds of at least 10/1 Mbps. This month Frontier reported that it has not fully met that requirement in parts of thirteen states. Customers in many places where Frontier has supposedly made the upgrades say that speeds are still not at the required 10/1 Mbps.

Frontier’s real problem is that their rural properties are being overbuilt by other ISPs. For example, Frontier properties are the targets of funding for many state broadband grants. Most of the rural Frontier network is going to be targeted in the upcoming $16 billion RDOF grants this year. It would not be surprising to see the company quietly disappear from rural America as others build better broadband.

Meanwhile, outside of the properties that were formerly Verizon FiOS on fiber, the company’s networks in towns also rely on DSL. We’ve seen every telco that offers urban DSL, including AT&T and CenturyLink, lose a lot of customers year after year to the cable companies. It’s increasingly difficult for DSL to keep customers at speeds between 10 Mbps and 50 Mbps when competing against cable products of 100 Mbps and faster.

Last May, Frontier announced the sale of its properties in Washington, Oregon, Idaho, and Montana to WaveDivision Capital. That sale was for $1.35 billion, which doesn’t make a big dent in the company’s $16.3 billion in long-term debt. Frontier has also shed 10% of its workforce in an attempt to control costs.

Frontier may get the structured bankruptcy they are seeking or may have to give up more to survive the current bankruptcy. However, restructuring their debt is not going to make up for the huge portions of their network that sit on dying copper. Frontier is not the only company facing this issue; CenturyLink has even more rural copper. However, CenturyLink has a thriving business in big cities and would be stronger if regulators ever allow it to walk away from rural copper.

The harder question to answer is whether there is a viable company remaining after Frontier finally sheds or loses its rural customer base. I don’t know enough to make any prediction on that, but I can predict that the company’s problems will not be over even after making it through this bankruptcy.

Letters of Credit

One of the dumbest rules suggested by the FCC for the new $16.4 billion RDOF grants is that an ISP must provide a letter of credit (LOC) to guarantee that it will be able to meet its obligation to provide the matching funds for the RDOF grants. Years ago, the FCC had a number of grant winners in the stimulus broadband grant program that never found financing, and the FCC is clearly trying to avoid a repeat of that situation. A coalition of major industry associations – NTCA, INCOMPAS, USTelecom, NRECA, WTA, and WISPA – recently wrote a letter to the FCC asking it to remove the LOC requirement.

There may be no better example of how out of touch Washington DC is with the real world, because whoever at the FCC came up with that requirement has no idea what a letter of credit is. A letter of credit is a formal negotiable instrument – a promissory note, like a check. It is a promise that a bank will honor the obligation of the buyer of the letter of credit should that buyer fail to meet a specific obligation. The most common use of LOCs is in international trade or in transactions between companies that don’t know or trust each other. An example might be a company that agrees to buy $100,000 of bananas from a wholesaler in Costa Rica, payable upon delivery of the bananas to the US. The US buyer of the bananas obtains a letter of credit, giving assurance to the wholesaler that they’ll get paid. When the bananas are received in the US, the bank is obligated to pay for the bananas if the buyer fails to do so.

Banks consider letters of credit to be the equivalent of loans. The bank must set aside the amount of pledged money in case it is required to disburse the funds. Most letters of credit are only active for a short, defined period of time. It’s highly unusual for a bank to issue a letter of credit that would last as long as the six years required by the RDOF grant process.

Letters of credit are expensive. A bank holds the pledged cash in escrow for the active life of the LOC and expects to be compensated for the interest it could otherwise have earned. There are also big upfront fees to establish an LOC because the bank has to evaluate an LOC holder in the same way it would evaluate a borrower. Banks also require significant collateral that they can seize should the letter of credit ever get used and the bank has to pay out the cash.
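To see how these fees add up over a multi-year LOC, here is a rough sketch in Python. The fee rates and dollar amounts are hypothetical assumptions for illustration; actual bank pricing varies widely.

```python
def letter_of_credit_cost(face_value, upfront_fee_rate, annual_fee_rate, years):
    """Estimate the total cost of holding a letter of credit.

    face_value: the amount the bank pledges (the LOC amount)
    upfront_fee_rate: one-time origination fee, as a fraction of face value
    annual_fee_rate: yearly carrying fee, as a fraction of face value
    years: how long the LOC must remain active
    """
    upfront = face_value * upfront_fee_rate      # bank evaluates the holder like a borrower
    carrying = face_value * annual_fee_rate * years  # compensates the bank for idle capital
    return upfront + carrying

# Hypothetical example: a $10 million LOC held for the six-year RDOF
# build-out, at an assumed 1% upfront fee and 2% annual fee.
cost = letter_of_credit_cost(10_000_000, 0.01, 0.02, 6)
print(f"${cost:,.0f}")  # → $1,300,000
```

Even at these modest assumed rates, a single grant winner could pay over a million dollars just to hold the instrument, which is consistent with the coalition’s estimate of an aggregate cost approaching $1 billion.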

I’m having trouble understanding who the letter of credit would benefit in this situation. When the FCC makes an annual grant payment to an ISP, it expects that ISP to be building network – 40% of the RDOF network must be completed by the end of year three, with 20% more to be completed in each of the next three years. The ISP would be expected each year to have the cash available to pay for fiber, work crews, electronics, engineers, etc. You can’t buy a letter of credit that would be payable to those future undefined parties. I think the FCC believes the letter of credit would be used to fund the ISP so it could construct the network. No bank is going to provide a letter of credit where the payee is also the purchaser of the LOC – in banking terms, that would be an ISP paying an upfront fee for a guaranteed loan to be delivered later should the ISP not find a loan elsewhere. It’s absurd to think banks would issue such a financial instrument. An ISP that defaults on an LOC is likely in financial straits, so having an LOC in place would have the opposite effect of what the FCC wants – rather than guaranteeing future funds, the bank would likely seize the assets of the ISP when the LOC is exercised.
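The build-out schedule described above can be sketched as a simple milestone function. Treating years one and two as having no interim milestone is my simplification; the FCC’s rules define the interim requirements in more detail.

```python
def rdof_milestone(year):
    """Minimum cumulative share of RDOF locations that must be built
    by the end of a given year of the grant term: 40% by the end of
    year 3, then 20% more in each of years 4 through 6."""
    if year < 3:
        return 0.0  # simplification: no interim milestone before year 3
    return min(0.40 + 0.20 * (year - 3), 1.0)

for year in range(1, 7):
    print(year, f"{rdof_milestone(year):.0%}")
```

Running the loop shows the cumulative requirement climbing from 40% at year three to 100% at year six – the ISP needs real construction cash in hand every year, which is exactly what an LOC does not provide.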

A letter of credit has significant implications for the ISP that buys it. Any bank considering lending to the ISP will consider an LOC to be the same as outstanding debt – thus reducing the amount of other money the ISP can borrow. A long-term LOC would tie up a company’s borrowing capacity for the length of the LOC, making it that much harder to finance the RDOF project.

The coalition writing the letter to the FCC claims, correctly, that requiring letters of credit would stop a lot of ISPs from applying for the grants. Any ISP that can’t easily borrow large amounts of money from a commercial bank is not going to get an LOC. Even ISPs that can get a letter of credit might decide it makes accepting the grant too costly. The coalition petitioning the FCC estimates that the aggregate cost of obtaining letters of credit for RDOF could be as much as $1 billion for the grant recipients – my guess is that the estimate is conservatively low.

One of the groups this requirement might cause problems for is ISPs that obtain their funding from the federal RUS program. These entities – mostly telcos and electric cooperatives – would have to go to a commercial bank to get an LOC. If their only debt is with the RUS, banks might not be willing to issue an LOC, regardless of the strength of their balance sheets, since the banks have no easy way to secure collateral for the LOC.

Hopefully, the FCC comes to its senses, or the RDOF grant program might be a bust before it even gets started. I’m picturing ISPs going to banks and explaining the FCC requirements and seeing blank stares from bankers who are mystified by the request.

AT&T’s Fiber Play

AT&T has quietly become a major player in the fiber-to-the-home market. It’s reported that AT&T added 1.1 million customers on fiber in 2019, bringing its base of homes on fiber to 3.1 million. This puts the company in clear second place for residential fiber behind Verizon’s FiOS deployment.

AT&T was prompted to build fiber by an agreement with the government as part of the approval of its merger with DirecTV. The company agreed in the summer of 2015 to build fiber to pass 12.5 million homes within four years.

AT&T has been in the fiber business for many years. Like all of the big telcos, AT&T has built fiber to large businesses over the last couple of decades. AT&T got dragged into the FTTH business in a few markets when it reacted to the Google Fiber overbuilds in markets like Atlanta and the North Carolina Research Triangle. AT&T has also been selectively bringing fiber to large apartment complexes for much of the last decade.

In the first few years of the mandated buildout, AT&T seemed to be only halfheartedly going along with the required expansion. The company claimed to have passed millions of homes with fiber, but there was no press or chatter from customers who had received AT&T fiber service. For the first few years after the mandate, AT&T met its obligation by counting passed apartment complexes – many of which were likely already within range of AT&T fiber.

But it looks like everything changed at AT&T a few years ago, and fiber suddenly appeared in pockets of the many cities where AT&T is the incumbent telephone provider. Several changes in the industry likely prompted this turnaround. First, AT&T won the FirstNet contract to provide modern connectivity to first responders nationwide. In many cases this requires building new fiber – financed by the federal government. Second, AT&T needs to connect huge numbers of small cell sites – something that was not predicted in 2015.

It seems that AT&T management looked at those two opportunities and decided that the company could best capitalize on the new fiber by adding residential and small-business customers to the fiber network. That was a big change at AT&T. They had long refused to follow in the wake of Verizon and their FiOS network. They instead took the path of beefing up urban DSL with their U-verse business, where they paired two copper wires to offer DSL speeds as fast as 48 Mbps. I think the company was likely surprised by how quickly that offering became obsolete, as cable companies now routinely offer two to four times that speed.

For the past several years, AT&T has been losing DSL customers in droves to the cable companies. For example, in the year ending in the third quarter of 2019, AT&T lost a net of 123,000 broadband customers, even with the big gains in fiber during that period. The company will likely continue to lose DSL customers as copper networks age and the speeds fall further behind cable company offerings. AT&T has been petitioning the FCC to tear down copper wires, particularly in rural areas, which will further kill the DSL business.

AT&T’s new strategy for building fiber is interesting. The company is only building FTTH in small pockets where it already has fiber – fiber that might be there to serve a large business, a school, or a cell tower. AT&T extends fiber for two to four blocks around these fiber hubs, and only where construction costs look reasonable. AT&T can build cheaply in areas where the company already has copper wires on poles, because the new fiber is overlashed to the existing copper wires.

Late last year, AT&T announced it had met the government mandate and was taking a pause in building new fiber in neighborhoods. The company is instead focused on selling where it has fiber, with a goal of a 50% market share in those areas. That’s an aggressive goal considering that Comcast and Charter are likely its most common competitors.

AT&T fiber must be considered by anybody building a new fiber network. If AT&T is already in a market, it will likely have sewn up small pockets of the community. It also wouldn’t be hard for AT&T to expand those small pockets, making the company a real competitor to a fiber overbuilder. This will be an odd kind of competition where AT&T is on some blocks and not others – almost forcing an overbuilder to have two marketing plans, one for neighborhoods with AT&T fiber and one for neighborhoods without it.

The State of Broadband in North Carolina

Recently I was asked to compare broadband in my state of North Carolina to the rest of the country. It’s an interesting question that folks in many states have been asking.

Parts of North Carolina have the best broadband in the country. There are neighborhoods in the Research Triangle area with fiber from both Google Fiber and AT&T, where the cable company also offers an affordable gigabit product. The lucky homeowners in those neighborhoods live in one of only a handful of places with a competitive choice among three gigabit ISPs – something that’s extremely rare.

Like in other states, the cities and larger towns in the state all have a cable provider that offers basic speeds of more than 100 Mbps, in some cities now 200 Mbps. Since this was traditional AT&T territory, the company has built fiber in small neighborhoods around the state – so there are small pockets of fiber in many communities. However, for most urban residents the cable companies are the only option faster than 50 Mbps and are slowly moving toward becoming the monopoly broadband provider.

Like in most states, North Carolina has huge rural areas with little or no broadband. And like in most places you don’t have to go far outside of any city to find places with dreadful or non-existent broadband. The big telcos have ignored rural copper here like they have done almost everywhere in the country. There are many rural counties with virtually no broadband outside the county seat, and no county that has good broadband everywhere.

Also like most states, there are ISPs providing fiber in rural communities. The state has telephone cooperatives and independent telephone companies that have built, or will soon finish building, fiber in their traditional service areas. A few of these companies are building outside their traditional footprints, with Riverstreet Networks, owned by Wilkes Telephone Cooperative, the most aggressive – it is building in numerous counties around the state.

There is also a vibrant WISP industry in the state, with small ISPs bringing wireless broadband to rural areas. Like in most states, the big issue for the WISPs is finding fiber backhaul. In the western half of the state, the impediments to wireless broadband are the mountainous terrain and the thick forests.

The state voted just last year to allow electric cooperatives to enter the broadband business, and a few of them are already considering building fiber. However, unlike in many states, a lot of the rural areas in the state are served by commercial power companies rather than cooperatives.

Like many other states, North Carolina has an effective ban on municipalities entering the broadband business. Back when fiber-to-the-home was a new technology, the communities of Wilson and Salisbury built fiber networks in their communities and the incumbent providers quickly pushed through legislation stopping municipalities from being ISPs. Municipalities can build and lease broadband facilities to ISPs, but there are enough strings in the law to make it hard to achieve, and difficult to financially justify.

The state has a new broadband grant program set at $15 million this year. While it would probably take more than a century to solve the broadband gaps in the state at $15 million per clip, the funding is still welcome and is bringing solutions to pockets of homes that have had no broadband.
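The "more than a century" arithmetic is easy to check. Here is a back-of-the-envelope sketch in Python; the $2 billion statewide cost is my own assumption for illustration, not a figure from this post, though any estimate in the billions makes the same point.

```python
# How long would it take to close the state's broadband gaps at
# $15 million per year? The total-need figure is a hypothetical
# assumption used only for illustration.
annual_grant_pool = 15_000_000            # NC grant budget per year
assumed_statewide_need = 2_000_000_000    # assumed cost to reach all unserved homes

years_needed = assumed_statewide_need / annual_grant_pool
print(f"about {years_needed:.0f} years")  # → about 133 years
```

Even if the real need is half the assumed figure, the grant program as funded would take generations to finish the job.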

The state takes pride in being a broadband pioneer: it had the first statewide government broadband network and was the first state to get fast broadband to all of its schools.

Like most states, the broadband gap is widening in some places. Like most states, there have been rural hospital closings, mostly in areas with poor broadband, where it’s difficult to provide telemedicine services. There are plenty of stories in the state of families sitting outside WiFi hotspots so that kids can do homework each night. There are big stretches of the state with little or no cellular service to go along with no broadband. And like everywhere in the country, broadband is priced out of reach of poorer households.

There are a few things unique about broadband in North Carolina, but overall, we look a lot like much of the rest of America. We have homes with great broadband and homes with no broadband. Broadband is creeping into the rural parts of the state but at a glacial pace. Like most states, we still have a long way to go to provide everybody with decent broadband.

5G and Rural America

FCC Chairman Ajit Pai recently told the crowd at CES that 5G would be a huge benefit to rural America and would help to close the rural broadband divide. I have to imagine he’s saying this to keep rural legislators on board with the FCC’s emphasis on promoting 5G. I’ve thought hard about the topic, and I have a hard time seeing how 5G will make much difference in rural America – particularly with broadband.

There is more than one use of 5G, and I’ve thought through each of them. Let me start with 5G cellular service. The major benefit of 5G cellular is that a cell site will be able to handle up to 100,000 simultaneous connections. 5G also promises slightly faster cellular data speeds. The specification calls for speeds up to 100 Mbps using the normal cellular frequencies – which happens to also have been the specification for 4G, although it was never realized.

I can’t picture a scenario where a rural cell site might need 100,000 simultaneous connections within a circle of a few miles. There aren’t many urban places that need that many connections today other than stadiums and other crowded locations where a lot of people want connectivity at the same time. I’ve heard farm sensors mentioned as a reason for needing 5G, but I don’t buy it. The normal crop sensor might dribble out tiny amounts of data a few times per day. These sensors cost close to $1,000 today, but even if they somehow drop to a cost of pennies, it’s hard to imagine a situation where any given rural cell site is going to need more capacity than is available with 4G.

It’s great if rural cell sites get upgraded, but there can’t be many rural cell sites that are overloaded enough to demand 5G. There are also the economics. It’s hard to imagine the cellular carriers being willing to invest in a rural cell site that might support only a few farmers – and it’s hard to think the farmers are willing to pay enough to justify their own cell site.

There has also been talk of lower frequencies benefitting rural America, and there is some validity to that. For example, T-Mobile’s 600 MHz frequency travels farther and penetrates obstacles better than higher frequencies. Using this frequency might extend good cellular data coverage as much as an extra mile and might support voice for several additional miles from a cell site. However, low frequencies don’t require 5G to operate. There is nothing stopping these carriers from introducing low frequencies with 4G (and in fact, that’s what they have done in the first-generation cellphones capable of using the lower frequencies). The cellular carriers are loudly claiming that their introduction of new frequencies is the same thing as 5G – it’s not.

5G can also be used to provide faster data using millimeter wave spectrum. The big carriers are all deploying 5G hot spots with millimeter wave technology in dense urban centers. The technology broadcasts super-fast broadband for up to 1,000 feet. The spectrum is also super-squirrely in that it doesn’t pass through anything, even a pane of glass. Try as I might, I can’t find a profitable application for this technology in the suburbs, let alone rural places. If a farmer wants fast broadband in the barnyard, I suspect we’re only a few years away from people being able to buy a 5G/WiFi 6 hot spot that could satisfy that purpose without paying a monthly fee to a cellular company.

Finally, 5G can be used to provide gigabit wireless loops from a fiber network. This is the technology trialed by Verizon in a few cities like Sacramento. In that trial, speeds were about 300 Mbps, but there is no reason speeds can’t climb to a gigabit. For this technology to work, there has to be a transmitter on fiber within 1,000 feet of a customer. It seems unlikely to me that somebody spending the money to get fiber close to farms would use electronics for the last few hundred feet instead of a fiber drop. The electronics are always going to have problems and require truck rolls, and they will likely have to be replaced at least once per decade. The small telcos and electric coops I know would scoff at the idea of adding another set of electronics to a rural fiber network.

I expect some of the 5G benefits to find uses in larger county seats – but those towns have the same characteristics as suburbia. It’s hard to think that rural America outside of county seats will ever need 5G.

I’m at a total loss as to why Chairman Pai and many politicians keep extolling the virtues of rural 5G. I have no doubt that rural cell sites will be updated to 5G over time, but the carriers will be in no hurry to do so. It’s hard to find situations in rural America that demand a 5G solution that can’t be handled with 4G – and it’s even harder to justify the cost of 5G upgrades that benefit only a few customers. I can’t find a business case, or even an engineering case, for pushing 5G into rural America. I most definitely can’t foresee a 5G application that will solve the rural broadband divide.


Is 5G Radiation Safe?

There is a lot of public sentiment against placing small cell sites on residential streets. There is particular fear of broadcasting the higher millimeter wave frequencies near homes, since these frequencies have never been in widespread use before. In the public’s mind, higher frequencies mean a higher danger of health problems related to exposure to radiofrequency emissions. The public’s fears are further stoked when people hear that Switzerland and Belgium are limiting the deployment of millimeter wave radios until there is better proof that they are safe.

The FCC released a report and order on December 4 that is likely to add fuel to the fire. The agency rejected all claims that there is any public danger from radiofrequency emissions and affirmed the existing frequency exposure rules. The FCC said that none of the thousands of filings made in the docket provided any scientific evidence that millimeter wave and other 5G frequencies are dangerous.

The FCC is right in their assertion that there are no definitive scientific studies linking cellular frequencies to cancer or other health issues. However, the FCC misses the point that most of those asking for caution, including scientists, agree with that. The public has several specific fears about the new frequencies being used:

  • First is the overall range of new frequencies. In the recent past, the public was widely exposed to relatively low frequencies from radio and TV stations, to a fairly narrow range of cellular frequencies, and two bands of WiFi. The FCC is in the process of approving dozens of new bands of frequency that will be widely used where people live and work. The fear is not so much about any given frequency being dangerous, but rather a fear that being bombarded by a large range of frequencies will create unforeseen problems.
  • People are also concerned that cellular transmitters are moving from tall towers, which normally have been located away from housing, to small cell sites on poles located on residential streets. The fear is that these transmitters generate a lot of radiation close to the transmitter – which is true, although the amount of radiated energy that strikes a given area decreases rapidly with distance from the transmitter. The anecdote I’ve seen repeated on social media is of a cell site placed fifteen feet from the bedroom of a child. I have no idea if there is a real small cell site that is the genesis of this claim – but there could be. In dense urban neighborhoods, there are plenty of streets where telephone poles are within a few feet of homes. I admit that I would be leery about having a small cell site directly outside one of my windows.
  • The public worries when they learn that there will always be devices that don’t meet the FCC guidelines. As an example, the Chicago Tribune tested eleven smartphones in August and found that a few of them were emitting radiation at twice the FCC’s maximum allowable limit. The public understands that vendors play loose with regulatory rules and that the FCC largely ignores such violations.

The public has no particular reason to trust this FCC. The FCC under Chairman Pai has sided with the large carriers on practically every issue in front of the Commission. This is not to say that the FCC didn’t give this docket the full consideration that should be given to all dockets – but the public perception is that this FCC would side with the cellular carriers even if there was a public health danger.

The FCC order is also not particularly helped by citing the buy-in from the Food and Drug Administration on the safety of radiation. That agency has licensed dozens of medicines that later proved to be harmful, so that agency also doesn’t garner a lot of public trust.

The FCC made a few changes with this order. They have mandated a new set of warning signs to be posted around transmitters. It’s doubtful that anybody outside of the industry will understand the meaning of the color-coded warnings. The FCC is also seeking comments on whether exposure standards should be changed for frequencies below 100 kHz and above 6 GHz. The agency is also going to exempt certain kinds of transmitters from FCC testing.

I’ve read extensively on both sides of the issue and it’s impossible to know the full story. For example, a majority of scientists in the field signed a petition to the United Nations warning against using higher frequencies without more testing. But it’s also easy to be persuaded by other scientists who say that higher frequencies don’t even penetrate the skin. I’ve not heard of any studies that look at exposing people to a huge range of different low-power frequencies.

This FCC is in a no-win position. The public understandably perceives the agency as being pro-carrier, and anything the FCC says is not going to persuade those worried about radiation risks. I tend to side with the likelihood that the radiation is not a big danger, but I also have to wonder if there will be any impact from expanding tenfold the range of frequencies we’re exposed to. The fact is that we’re not likely to know until after we’ve all been exposed for a decade.

A Decade of Statistics

Now that we’ve started a new decade, I thought it would be interesting to look back to see what progress has been made with broadband in the last ten years. My first realization in doing so was that I’ve been writing this blog about broadband for most of that decade, having started writing in early 2013, so I’ve tracked many of the changes in the industry.

I first looked at statistics on broadband subscribers and on the various ways that we use the Internet. The following statistics are for US adults:

  • 90% of Americans now say they use the Internet, up from 78% at the beginning of the last decade. Nearly 100% of Millennials say they use the Internet.
  • 85% of homes pay for a broadband connection at home. Surprising to me was that almost 80% of homes purchased Internet access in 2010. We now know there are two primary reasons why homes don’t buy broadband – price, and lack of broadband access in rural areas.
  • We are spending more time online. The average US adult now spends 3.7 hours per day online, up from 2.2 hours at the start of the decade.
  • 81% of Americans now use a smartphone, up from 35% at the start of the decade. 93% of Millennials own a smartphone. 96% of all adults own a cellphone.
  • 72% of Americans use social media, up from 43% at the start of the last decade. The number of people who say they get their news from social media (20%) now surpasses those that get news from print media (including online newspapers).
  • The use of tablets exploded in the past decade, growing from 3% of adults to 52% in 2019.
  • The use of desktops and laptops has declined slightly from 78% to 74%.

Most ISPs still care about telephone and cable TV service.

  • The total number of telephone line subscriptions was 150 million in 2010 and was down to 112 million in 2019. This number includes business telephone lines.
  • 39% of US homes still had a landline connection in 2019, down from 68% in 2010. A decade earlier this was at 96%.
  • The US had 104.6 million cable households in 2010 (59.8 million by cable, 6.9 million by telco and 33.9 million by satellite). By the end of the third quarter of 2019, paid CATV subscriptions dropped to 83.3 million (48 million by cable, 8.9 million by telcos, and 26.3 million by satellite). Cable subscribers at telcos surged at the start of the decade, but all categories are now dropping.
  • 50% of homes with Internet access now watch streaming video daily, up from 16% in 2010.

There are other statistics that should be of interest to ISPs:

  • The number of people that move each year has been cut in half over the last 35 years. In 2019 only 1.5% of Americans moved to a different state, and 5.9% moved but stayed in the same county. Many of my clients have reported lower churn over time due to fewer households moving.
  • Rural populations continue to decline slowly. The last decade saw average declines in rural population of about 0.3% per year. That rate slowed by the end of the decade, but overall, rural populations are still slightly declining.
  • Rural populations are aging. A report by the Census bureau in 2019 says that more than 22.9% of Americans over 65 live in rural America. There are 13 states where the percent of rural elderly exceeds 40% (VT, ME, MS, WV, AR, MT, SD, ND, AL, KY, NH, IA). This foretells significant declines in rural populations over the next several decades.
  • In 2019 Millennials surpassed Gen Xers as the largest generation in the workforce. In 2019 the workforce consisted of 57 million Millennials, 53 million Gen Xers, and 38 million baby boomers.
  • The last decade was the first decade in 160 years to see an increase in the size of the average household. The average household grew from 2.58 people in 2010 to 2.63 people in 2019. The number has been declining steadily since 1790 when a household averaged 5.79 people.

Nationwide statistics are always interesting, but few of my clients see the same numbers locally. One of the important pieces of the puzzle when looking for a broadband solution is understanding how your community fits into these national trends. As an example, one of the most disparate statistics we see when doing surveys is the penetration rate of traditional TV. We still find communities where it’s above 80% and others where it’s lower than the national average.

Killing 3G

I have bad news for anybody still clinging to their flip phones. All of the big cellular carriers have announced plans to end 3G cellular service, and each has a different timeline in mind:

  • Verizon previously said they would stop supporting 3G at the end of 2019, but now says it will end service at the end of 2020.
  • AT&T has announced the end of 3G to be coming in early 2022.
  • Sprint and T-Mobile have not announced specific dates but are both expected to stop 3G service sometime in 2020 or 2021.

The amount of usage on 3G networks is still significant. GSMA reported that at the end of 2018 as many as 17% of US cellular customers still made 3G connections, accounting for as much as 19% of all cellular connections.

The primary reason cited for ending 3G is that the technology is far less efficient than 4G. A 3G connection to a cell site chews up the same amount of frequency resources as a 4G connection yet delivers far less data to customers. The carriers are also anxious to free up mid-range spectrum for upcoming 5G deployment.

Opensignal measures actual speed performance for millions of cellular connections and recently reported the following statistics for the average 3G and 4G download speeds as of July 2019:

Carrier      4G 2019      3G 2019
AT&T         22.5 Mbps    3.3 Mbps
Sprint       19.2 Mbps    1.3 Mbps
T-Mobile     23.6 Mbps    4.2 Mbps
Verizon      22.9 Mbps    0.9 Mbps
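The efficiency gap is easy to see from the table above. As a rough illustration (these are just ratios of average reported speeds – actual spectral efficiency also depends on spectrum holdings and cell loading), a short script can compute how much faster each carrier's 4G connections were:

```python
# Average download speeds (Mbps) reported by Opensignal, July 2019.
# Each entry is (4G speed, 3G speed).
speeds = {
    "AT&T":     (22.5, 3.3),
    "Sprint":   (19.2, 1.3),
    "T-Mobile": (23.6, 4.2),
    "Verizon":  (22.9, 0.9),
}

for carrier, (lte, umts) in speeds.items():
    ratio = lte / umts
    print(f"{carrier}: 4G averaged {ratio:.0f}x the 3G speed")
```

The ratios range from roughly 6x at T-Mobile to over 25x at Verizon, which helps explain why the carriers are so eager to refarm 3G spectrum for 4G and 5G.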

The carriers have hesitated to end 3G because a significant number of rural cell sites still don’t offer 4G. The cellular carriers were counting on funding from the FCC’s Mobility Fund Phase II to upgrade rural cell sites. However, that funding program was derailed and delayed when the FCC found massive errors in the data provided for distributing the fund. Many accused the big carriers of rigging the data to steer more funding to themselves instead of to smaller rural cellular providers.

The FCC staff conducted significant testing of the reported speed and coverage data and released a report of their findings in December 2019. The testing showed that the carriers had significantly overreported 4G coverage and speeds across the country. The report is worth reading for anybody who needs to be convinced of the garbage data that has been used to create the FCC broadband maps. I wish the FCC staff would put the same effort into investigating the landline broadband data provided to the FCC. The staff recommended that the agency release a formal Enforcement Advisory including ‘a detailing of the penalties associated with carrier filings that violate federal law’.

The carriers are also hesitant to end 3G since a lot of customers still use the technology. Opensignal cites several reasons for the continued use of 3G. First, 12.7% of 3G users live in rural areas where 3G is the only cellular technology available. Another 4.1% of 3G users still own old flip phones that are not capable of receiving 4G. The biggest category of 3G users is customers who own a 4G-capable phone but still subscribe to a 3G data plan. AT&T is the largest provider of such plans and has not forced customers to upgrade to 4G plans.

The carriers need to upgrade rural cell sites to 4G before they can shut 3G down for good. In doing so they need to migrate customers to 4G data plans and notify customers who still use 3G-only flip phones that it’s finally time to upgrade.

One aspect of the 3G issue that nobody is talking about is that AT&T says it is using fixed wireless connections to meet its CAF II buildout requirements. Since the CAF II areas include some of the most remote landline customers, it stands to reason that these are the same areas likely to still be served by 3G cell towers. AT&T can’t deliver 10/1 Mbps or faster speeds using 3G technology, which makes me wonder what AT&T has been telling the FCC about meeting its CAF II buildout requirements.

Broadband and Presidential Politics

For the first time in my memory, broadband has entered into presidential politics. This is an important milestone for rural broadband – not because of the proposals being made by candidates, but because it indicates that the voices of those without rural broadband have reached upward to the top of the political system.

I’m sure that when the presidential candidates go to rural areas that they are asked if they can help find a solution for the lack of broadband in many rural counties. For years I’ve heard from county Boards and Councils that broadband has bubbled up to the top of the list of issues in many rural counties. Rural residents are tired of having to make an extraordinary effort for their kids to do homework, tired of not being able to work from home, and tired of not being able to engage in things the rest of us take for granted.

Candidate proposals are big on rhetoric, but short on details. Some of the stated broadband policies are as follows:

  • The current administration is spending $16.4 billion this year for the largest federal broadband grant program ever. They are also spending $9 billion to expand rural cellular coverage.
  • Senator Bernie Sanders would provide $150 billion in grants and technical assistance for cities and municipalities to build publicly-owned fiber networks as part of a larger Green New Deal infrastructure initiative. That plan obviously extends far beyond a solution for rural broadband, and when cities are thrown into the mix, $150 billion is not going to bring fiber broadband everywhere. He further would regulate broadband as a utility and require that all ISPs offer a low-price ‘basic internet plan’ to make sure that the Internet is available to everybody.
  • Senator Elizabeth Warren has proposed $85 billion for public broadband as part of a larger infrastructure plan.
  • Mayor Pete Buttigieg has proposed an $80 billion Internet-for-All plan that would bring broadband to unserved communities.
  • Former Vice-president Joe Biden supports a $20 billion grant program for rural broadband.
  • Senator Amy Klobuchar proposes perhaps the most workable plan: grants to service providers willing to serve rural America. She has likely based this plan on the successful Border-to-Border grant program in Minnesota.

All of these plans must be taken with a grain of salt because we know that many proposals made on the campaign trail are often forgotten by January after an election. We further have to be skeptical of presidential candidate promises for spending, because Presidents don’t get to spend the big dollar amounts being thrown around – Congress holds those purse strings. It’s possible that none of these candidates gets elected. It’s also possible that one of them gets elected and still would be unable to make headway on the rural broadband issue. For example, there might still be a split House and Senate, making it a challenge to agree on spending priorities. The federal government might get pulled in other directions for a wide variety of reasons and never get around to the rural broadband issue.

As somebody who understands what it takes to run an ISP, some of these ideas scare me. For example, the idea of handing broadband networks to municipalities scares me because I know that the majority of local governments have zero interest in taking on that role. If this responsibility were thrust upon them, many would do a lousy job. Even if networks were handed to governments for free, many are ill-equipped or unwilling to administer and maintain a network. The idea that we could legislate the creation of well-run government-owned ISPs everywhere is out of touch with the realities of the expertise required to own and operate a network. On the flip side, I hate the idea of giving any money to the big ISPs to provide better broadband. We’ve seen how poorly that can go in the CAF II program.

I also always cringe whenever I hear the idea of regulating broadband as a utility. I am not against the idea of regulation, but the chances are that the federal government and politicians would goof it up and create an absolute disaster. Regulating something as complex as broadband is a complicated endeavor and would be hard to get right at the federal level – if done poorly we could end up undoing the good that many ISPs have already done.

As an example of the challenge of regulating the industry, I can’t think of any easy mechanism to drag all of the existing communities, telcos, cable companies, and fiber overbuilders that provide broadband into a regulated regime. Most of the entities that have built fiber have taken on significant debt to do so. Short of the government paying off their existing loans, it’s hard to see how these companies could begin offering low regulated prices and still meet their debt obligations. I can easily list a hundred other issues that could go awry when regulating the industry. I am highly skeptical that Washington DC can figure out all of the nuances of how to do this the right way. I’m a lot more comfortable with the way we originally regulated telephone service – the federal government established broad policies and state regulatory bodies filled in the details.

I am just happy to see broadband being discussed during the election cycle. The same thing is happening at the state and local level, which is one of the main reasons that we’ve seen so many state broadband grant programs being formed. All of the lobbying being done by folks without broadband is finally seeing results – at least in promises being made by politicians. We just need to keep up the pressure until the political talk turns into broadband networks.

The Greed of the Programmers

If you use social media you may have noticed a flurry of activity at the end of December warning that small cable TV providers across the country could lose the Fox channels on January 1. That includes Fox News, Fox Business, FX, National Geographic, FS1, FS2, and the Big Ten Network. The dispute was with NCTC, a cooperative that negotiates rates for most of the smaller cable companies in the country.

Fox was asking for what has been described as a 20% rate hike on programming. Fox was seeking a big rate increase to recognize that they have the number one network on cable TV with 1.5 million daily viewers. NCTC finally struck a deal with Fox on December 31 and the channels didn’t go dark – but the cost of buying the Fox networks went up substantially. Back in September, the Fox channels went dark for ten days on Dish Networks when the satellite company refused to accept the same big rate increase.

This is not the first big rate increase from Fox. ALLO Communications, a sizable fiber overbuilder, says that Fox has raised rates 800% since 2004. To put that into perspective, the cost of living in the US has increased by 36% since 2004.
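Those two cumulative numbers are easier to compare as compound annual growth rates. A minimal sketch, assuming both increases span the 15 years from 2004 through 2019 (the exact endpoint of the ALLO figure isn’t stated):

```python
def annualized(total_increase_pct: float, years: int) -> float:
    """Convert a cumulative percentage increase into a compound annual rate."""
    return ((1 + total_increase_pct / 100) ** (1 / years) - 1) * 100

# An 800% increase means the price is 9x what it was in 2004.
print(f"Fox programming: {annualized(800, 15):.1f}% per year")  # ~15.8%
print(f"Cost of living:  {annualized(36, 15):.1f}% per year")   # ~2.1%
```

In other words, Fox’s rates have compounded at roughly seven to eight times the rate of inflation, year after year.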

The Fox rate increase is the perfect metaphor for the woes of the cable industry. Fox is not unique; during the 2000s most cable programmers raised rates much faster than inflation. Cable companies have had little choice but to pass those increases along to customers, leading to steady annual rate increases for consumers. The soaring price of cable has fueled the cord-cutting trend, and customers are bailing from traditional cable TV by the millions, at an increasing pace.

As a whole, traditional cable TV has probably now entered what economists call a death spiral. Most programming contracts are for 3 – 5 years and the cable TV companies already know of the big programming cost increases coming for the next few years. As cable companies keep raising rates they will lose more customers. The programmers will likely try to compensate by raising their rates even higher, and within a short number of years, cable TV will cost more than what most homes are willing to pay.

A company like Fox can weather the storm of disappearing cable subscribers since they know that all of the online alternative networks like Sling TV, YouTube TV, and others will carry their major networks like Fox News, Fox Business, and the sports networks. The chances are that the primary Fox channels will be solid and steady earners for the company far into the future. However, the same can’t be said for many cable networks.

The online cable products have far smaller channel lineups than traditional cable. There are more than 100 traditional cable channels that are losing subscribers from cable companies and not replacing them with online programming. It’s only a matter of time until many of these networks go dark, as programming revenues won’t cover the cost of operating the network.

It’s easy for people to hate cable companies since that’s who people pay every month. Providers like Comcast and AT&T share in the blame since they are the two largest cable providers and also own content. All cable companies share some blame for not yelling bloody murder to the American public for the last decade – and for not fighting back. The cable companies instead slid the programming rate increases into hidden fees. However, the fault ultimately lies with the greed of the programmers – mostly big publicly traded companies that raise rates every year to please stockholders.

It’s no longer good enough for corporations to make money; they are expected to grow the bottom line quarter after quarter, year after year. We’ve only been talking about cord cutting for a few years, but the industry has been declining for over a decade. In 2010 there were nearly 105 million subscribers to traditional cable TV, and that number dropped to just over 83 million by the third quarter of 2019. It’s easy to think of cord cutting as a recent phenomenon, but the industry has been quietly bleeding customers for years. Sadly, the programmers are still denying the reality that they operate in a dying industry and are likely to keep raising rates like Fox just did.

The supply and demand sides of any sane industry would have gotten together years ago and figured out a way to make the industry sustainable. However, the combined greed of the programmers and the big cable companies has produced the runaway rate increases that will doom traditional cable. It’s hard to know where the tipping point is, but we’ll know we’re there when cable networks start going dark – it’s just a matter of time.