Is 5G Radiation Safe?

There is a lot of public sentiment against placing small cell sites on residential streets. There is a particular fear of broadcasting higher millimeter wave frequencies near to homes since these frequencies have never been in widespread use before. In the public’s mind, higher frequencies mean a higher danger of health problems related to exposure to radiofrequency emissions. The public’s fears are further stoked when they hear that Switzerland and Belgium are limiting the deployment of millimeter wave radios until there is better proof that they are safe.

The FCC released a report and order on December 4 that is likely to add fuel to the fire. The agency rejected all claims that there is any public danger from radiofrequency emissions and affirmed the existing frequency exposure rules. The FCC said that none of the thousand filings made in the docket provided any scientific evidence that millimeter wave and other 5G frequencies are dangerous.

The FCC is right in their assertion that there are no definitive scientific studies linking cellular frequencies to cancer or other health issues. However, the FCC misses the point that most of those asking for caution, including scientists, agree with that assessment. The public has several specific fears about the new frequencies being used:

  • First is the overall range of new frequencies. Until recently, the public was widely exposed to relatively low frequencies from radio and TV stations, to a fairly narrow range of cellular frequencies, and to two bands of WiFi. The FCC is in the process of approving dozens of new frequency bands that will be widely used where people live and work. The fear is not so much about any given frequency being dangerous, but rather a fear that being bombarded by a large range of frequencies will create unforeseen problems.
  • People are also concerned that cellular transmitters are moving from tall towers, which have normally been located away from housing, to small cell sites on poles located on residential streets. The fear is that these transmitters generate a lot of radiation close by – which is true. However, the amount of radiated energy that strikes a given area decreases rapidly with distance from a transmitter. The anecdote that I’ve seen repeated on social media is of a cell site placed fifteen feet from the bedroom of a child. I have no idea if there is a real small cell site that is the genesis of this claim – but there could be. In dense urban neighborhoods, there are plenty of streets where telephone poles are within a few feet of homes. I admit that I would be leery about having a small cell site directly outside one of my windows.
  • The public worries when they learn that there will always be devices that don’t meet the FCC guidelines. As an example, the Chicago Tribune tested eleven smartphones in August and found that a few of them were emitting radiation at twice the FCC’s maximum allowable limit. The public understands that vendors play fast and loose with regulatory rules and that the FCC largely ignores such violations.
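The distance effect described in the second bullet follows the free-space inverse-square law: power density S = P / (4πd²). Below is a minimal Python sketch; the transmit power and distances are purely illustrative assumptions, not figures for any real small cell site.

```python
import math

def power_density(watts: float, meters: float) -> float:
    """Free-space power density (W/m^2) at a distance from an isotropic source."""
    return watts / (4 * math.pi * meters ** 2)

tx_power = 10.0  # watts - a hypothetical small cell transmit power

near = power_density(tx_power, 5.0)   # a pole a few meters from a window
far = power_density(tx_power, 50.0)   # the same pole 50 meters down the street

# Ten times the distance means one-hundredth the power density.
print(f"Exposure at 5 m is {near / far:.0f}x the exposure at 50 m")
```

Real antennas are directional rather than isotropic, so this overstates exposure in most directions, but the steep fall-off with distance holds either way.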

The public has no particular reason to trust this FCC. The FCC under Chairman Pai has sided with the large carriers on practically every issue in front of the Commission. This is not to say that the FCC didn’t give this docket the full consideration that should be given to all dockets – but the public perception is that this FCC would side with the cellular carriers even if there was a public health danger.

The FCC order is also not particularly helped by citing the buy-in of the Food and Drug Administration on the safety of radiofrequency radiation. That agency has licensed dozens of medicines that later proved to be harmful, so it also doesn’t garner a lot of public trust.

The FCC made a few changes with this order. They have mandated a new set of warning signs to be posted around transmitters. It’s doubtful that anybody outside of the industry will understand the meaning of the color-coded warnings. The FCC is also seeking comments on whether exposure standards should be changed for frequencies below 100 kHz and above 6 GHz. The agency is also going to exempt certain kinds of transmitters from FCC testing.

I’ve read extensively on both sides of the issue and it’s impossible to know the full story. For example, a majority of scientists in the field signed a petition to the United Nations warning against using higher frequencies without more testing. But it’s also easy to be persuaded by other scientists who say that higher frequencies don’t even penetrate the skin. I’ve not heard of any studies that look at exposing people to a huge range of different low-power frequencies.

This FCC is in a no-win position. The public properly perceives the agency as being pro-carrier, and anything the FCC says is not going to persuade those worried about radiation risks. I tend to side with the likelihood that the radiation is not a big danger, but I also have to wonder if there will be any impact after expanding tenfold the range of frequencies we’re exposed to. The fact is that we’re not likely to know until after we’ve all been exposed for a decade.

A Decade of Statistics

Now that we’ve started a new decade, I thought it would be interesting to look back to see what progress has been made with broadband in the last ten years. My first realization in doing so was that I’ve been writing this blog about broadband for most of that decade, having started writing in early 2013, so I’ve tracked many of the changes in the industry.

I first looked at statistics on broadband subscribers and on the various ways that we use the Internet. The following statistics are for US adults:

  • 90% of Americans now say they use the Internet, up from 78% at the beginning of the last decade. Nearly 100% of Millennials say they use the Internet.
  • 85% of homes pay for a broadband connection at home. Surprising to me was that almost 80% of homes purchased Internet access in 2010. We now know there are two primary reasons why homes don’t buy broadband: price and the lack of broadband access in rural areas.
  • We are spending more time online. The average US adult now spends 3.7 hours per day online, up from 2.2 hours at the start of the decade.
  • 81% of Americans now use a smartphone, up from 35% at the start of the decade. 93% of Millennials own a smartphone. 96% of all adults own a cellphone.
  • 72% of Americans use social media, up from 43% at the start of the last decade. The number of people who say they get their news from social media (20%) now surpasses those that get news from print media (including online newspapers).
  • The use of tablets exploded in the past decade, growing from 3% of adults to 52% in 2019.
  • The use of desktops and laptops has declined slightly from 78% to 74%.

Most ISPs still care about telephone and cable TV service.

  • The total number of telephone line subscriptions was 150 million in 2010 and was down to 112 million in 2019. This number includes business telephone lines.
  • 39% of US homes still had a landline connection in 2019, down from 68% in 2010. A decade earlier this was at 96%.
  • The US had 104.6 million cable households in 2010 (59.8 million by cable, 6.9 million by telco and 33.9 million by satellite). By the end of the third quarter of 2019, paid CATV subscriptions dropped to 83.3 million (48 million by cable, 8.9 million by telcos, and 26.3 million by satellite). Cable subscribers at telcos surged at the start of the decade, but all categories are now dropping.
  • 50% of homes with Internet access now watch streaming video daily, up from 16% in 2010.

There are other statistics that should be of interest to ISPs:

  • The number of people who move has been cut in half over the last 35 years. By 2019 only 1.5% of Americans moved to a different state, and 5.9% of people moved but stayed in the same county. Many of my clients have reported that churn from households moving has declined over time.
  • Rural populations continue to decline slowly. The last decade saw average declines in rural population of about 0.3% per year. That decline had slowed by the end of the decade, but overall, rural populations are still slightly shrinking.
  • Rural populations are aging. A 2019 report by the Census Bureau says that more than 22.9% of Americans over 65 live in rural America. There are 13 states where the percentage of the elderly living in rural areas exceeds 40% (VT, ME, MS, WV, AR, MT, SD, ND, AL, KY, NH, IA). This foretells significant declines in rural populations over the next several decades.
  • In 2019 Millennials surpassed Gen Xers as the largest generation in the workforce. In 2019 the workforce consisted of 57 million Millennials, 53 million Gen Xers, and 38 million baby boomers.
  • The last decade was the first decade in 160 years to see an increase in the size of the average household. The average household grew from 2.58 people in 2010 to 2.63 people in 2019. The number has been declining steadily since 1790 when a household averaged 5.79 people.

Nationwide statistics are always interesting, but few of my clients see the same numbers locally. One of the important pieces of the puzzle when looking for a broadband solution is understanding how your community fits into these national trends. As an example, one of the most disparate statistics we see when doing surveys is the penetration rate of traditional TV. We still find communities where it’s above 80% and others where it’s lower than the national average.

Killing 3G

I have bad news for anybody still clinging to their flip phones. All of the big cellular carriers have announced plans to end 3G cellular service, and each has a different timeline in mind:

  • Verizon previously said they would stop supporting 3G at the end of 2019, but now says it will end service at the end of 2020.
  • AT&T has announced the end of 3G to be coming in early 2022.
  • Sprint and T-Mobile have not expressed a specific date but are both expected to stop 3G service sometime in 2020 or 2021.

The amount of usage on 3G networks is still significant. GSMA reported that at the end of 2018 as many as 17% of US cellular customers still made 3G connections, which accounted for as much as 19% of all cellular connections.

The primary reason cited for ending 3G is that the technology is far less efficient than 4G. A 3G connection to a cell site chews up the same amount of frequency resources as a 4G connection yet delivers far less data to customers. The carriers are also anxious to free up mid-range spectrum for upcoming 5G deployment.

Opensignal measures actual speed performance for millions of cellular connections and reported the following average 3G and 4G download speeds as of July 2019:

Carrier     4G (2019)    3G (2019)
AT&T        22.5 Mbps    3.3 Mbps
Sprint      19.2 Mbps    1.3 Mbps
T-Mobile    23.6 Mbps    4.2 Mbps
Verizon     22.9 Mbps    0.9 Mbps

The carriers have hesitated to end 3G because a significant number of rural cell sites still don’t offer 4G. The cellular carriers were counting on funding from the FCC’s Mobility Fund Phase II to upgrade rural cell sites. However, that funding program was derailed and delayed when the FCC found massive errors in the data provided for distributing the fund. Many accused the big carriers of rigging the data in a way that would steer more funding to themselves instead of to smaller rural cellular providers.

The FCC staff conducted significant testing of the reported speed and coverage data and released a report of their findings in December 2019. The testing showed that the carriers have significantly overreported 4G coverage and speeds across the country. This report is worth reading for anybody who needs to be convinced of the garbage data that has been used to create the FCC broadband maps. I wish the FCC staff would put the same effort into investigating the landline broadband data provided to the FCC. The staff recommended that the agency release a formal Enforcement Advisory including ‘a detailing of the penalties associated with carrier filings that violate federal law’.

The carriers are also hesitant to end 3G because a lot of customers still use the technology. Opensignal cites several reasons for the continued use of 3G. First, 12.7% of 3G users live in rural areas where 3G is the only cellular technology available. Another 4.1% of 3G users still own old flip phones that are not capable of receiving 4G. The biggest category of 3G users is customers who own a 4G-capable phone but still subscribe to a 3G data plan. AT&T is the largest provider of such plans and has not forced customers to upgrade to 4G plans.

The carriers need to upgrade rural cell sites to 4G before they can shut 3G down for good. In doing so, they need to migrate customers to 4G data plans and notify customers who still use 3G-only flip phones that it’s finally time to upgrade.

One aspect of the 3G issue that nobody is talking about is that AT&T says it is using fixed wireless connections to meet its CAF II buildout requirements. Since the CAF II areas include some of the most remote landline customers, it stands to reason that these are the same areas that are likely to still be served with 3G cell towers. AT&T can’t deliver 10/1 Mbps or faster speeds using 3G technology. This makes me wonder what AT&T has been telling the FCC in terms of meeting their CAF II build-out requirements.

Broadband and Presidential Politics

For the first time in my memory, broadband has entered into presidential politics. This is an important milestone for rural broadband – not because of the proposals being made by candidates, but because it indicates that the voices of those without rural broadband have reached upward to the top of the political system.

I’m sure that when the presidential candidates go to rural areas that they are asked if they can help find a solution for the lack of broadband in many rural counties. For years I’ve heard from county Boards and Councils that broadband has bubbled up to the top of the list of issues in many rural counties. Rural residents are tired of having to make an extraordinary effort for their kids to do homework, tired of not being able to work from home, and tired of not being able to engage in things the rest of us take for granted.

Candidate proposals are big on rhetoric, but short on details. Some of the stated broadband policies are as follows:

  • The current administration is spending $16.4 billion this year for the largest federal broadband grant program ever. They are also spending $9 billion to expand rural cellular coverage.
  • Senator Bernie Sanders would provide $150 billion in grants and technical assistance for cities and municipalities to build publicly-owned fiber networks as part of a larger Green New Deal infrastructure initiative. That plan obviously extends far beyond a solution for rural broadband, and when cities are thrown into the mix, $150 billion is not going to bring fiber broadband everywhere. He further would regulate broadband as a utility and require that all ISPs offer a low-price ‘basic internet plan’ to make sure that the Internet is available to everybody.
  • Senator Elizabeth Warren has proposed $85 billion for public broadband as part of a larger infrastructure plan.
  • Mayor Pete Buttigieg has proposed an $80 billion Internet-for-All plan that would bring broadband to unserved communities.
  • Former Vice-president Joe Biden supports a $20 billion grant program for rural broadband.
  • Senator Amy Klobuchar proposes perhaps the most workable plan that would provide grants to service providers willing to serve rural America. She has likely based this plan on the successful Border-to-Border grant program in Minnesota.

All of these plans must be taken with a grain of salt because we know that many proposals made on the campaign trail are often forgotten by January after an election. We further have to be skeptical of presidential candidate promises for spending, because Presidents don’t get to spend the big dollar amounts being thrown around – Congress holds those purse strings. It’s possible that none of these candidates gets elected. It’s also possible that one of them gets elected and still would be unable to make headway on the rural broadband issue. For example, there might still be a split House and Senate, making it a challenge to agree on spending priorities. The federal government might get pulled in other directions for a wide variety of reasons and never get around to the rural broadband issue.

As somebody who understands what it takes to run an ISP, some of these ideas scare me. For example, the idea of handing broadband networks to municipalities scares me because I know that the majority of local governments have zero interest in taking on that role. If this responsibility were thrust upon them, many would do a lousy job. Even if networks were handed to governments for free, many are ill-equipped or unwilling to administer and maintain a network. The idea that we could legislate the creation of well-run government-owned ISPs everywhere is out of touch with the realities of the expertise required to own and operate a network. On the flip side, I hate the idea of giving any money to big ISPs to provide better broadband. We’ve seen how poorly that can go in the CAF II program.

I also always cringe whenever I hear the idea of regulating broadband as a utility. I am not against the idea of regulation, but the chances are that the federal government and politicians would goof it up and create an absolute disaster. Regulating something as complex as broadband is a complicated endeavor and would be hard to get right at the federal level – if done poorly, we could end up undoing the good that many ISPs have already done.

As an example of the challenge of regulating the industry, I can’t think of any easy mechanism to somehow drag all of the existing communities, telcos, cable companies, and fiber overbuilders that provide broadband into a regulated regime. Most of the entities that have built fiber have already taken on significant debt to build fiber networks. Short of the government paying off their existing loans, it’s hard to think how these companies could begin offering low regulated prices and still meet their existing debt obligations. I can easily list a hundred other issues that could go awry when regulating the industry. I am highly skeptical that Washington DC can figure out all of the nuances of how to do this the right way. I’m a lot more comfortable with the way we originally regulated telephone service – the federal government established broad policies and state regulatory bodies filled in the details.

I am just happy to see broadband being discussed during the election cycle. The same thing is happening at the state and local level, which is one of the main reasons that we’ve seen so many state broadband grant programs being formed. All of the lobbying being done by folks without broadband is finally seeing results – at least in promises being made by politicians. We just need to keep up the pressure until the political talk turns into broadband networks.

The Greed of the Programmers

If you use social media you may have noticed a flurry of activity at the end of December warning that small cable TV providers across the country could lose the Fox channels on January 1. That includes Fox News, Fox Business, FX, National Geographic, FS1, FS2, and the Big Ten Network. The dispute was with NCTC, a cooperative that negotiates rates for most of the smaller cable companies in the country.

Fox was asking for what has been described as a 20% rate hike on programming. Fox was seeking a big rate increase to recognize that they have the number one network on cable TV with 1.5 million daily viewers. NCTC finally struck a deal with Fox on December 31 and the channels didn’t go dark – but the cost of buying the Fox networks went up substantially. Back in September, the Fox channels went dark for ten days on Dish Networks when the satellite company refused to accept the same big rate increase.

This is not the first big rate increase from Fox. ALLO Communications, a sizable fiber overbuilder, says that Fox has raised rates 800% since 2004. To put that into perspective, the cost of living in the US has increased by 36% since 2004.
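Those two percentages are easier to compare when annualized. A quick sketch, assuming the increases span 2004 to 2019 (about 15 years) and reading ‘raised rates 800%’ as prices ending at 9x their 2004 level:

```python
def annualized(total_multiple: float, years: float) -> float:
    """Compound annual growth rate implied by a total growth multiple."""
    return total_multiple ** (1 / years) - 1

fox = annualized(9.0, 15)         # an 800% increase means prices are 9x
inflation = annualized(1.36, 15)  # a 36% increase means prices are 1.36x

print(f"Fox: roughly {fox:.1%} per year vs inflation: roughly {inflation:.1%} per year")
```

Under those assumptions, the Fox increases work out to something on the order of 16% per year against inflation of about 2% per year.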

The Fox rate increase is the perfect metaphor for the woes of the cable industry. Fox is not unique; during the 2000s most cable programmers raised rates much faster than inflation. Cable companies have had little choice but to pass those increases along to customers, leading to steady annual rate increases for consumers. The soaring price of cable has fueled the cord-cutting trend, and customers are bailing out of traditional cable TV by the millions, at an increasing pace.

As a whole, traditional cable TV has probably now entered what economists call a death spiral. Most programming contracts are for 3 – 5 years and the cable TV companies already know of the big programming cost increases coming for the next few years. As cable companies keep raising rates they will lose more customers. The programmers will likely try to compensate by raising their rates even higher, and within a short number of years, cable TV will cost more than what most homes are willing to pay.

A company like Fox can weather the storm of disappearing cable subscribers since they know that all of the online alternative networks like Sling TV, YouTube TV, and others will carry their major networks like Fox News, Fox Business, and the sports networks. The chances are that the primary Fox channels will be solid and steady earners for the company far into the future. However, the same can’t be said for many cable networks.

The online cable products have far smaller channel lineups than traditional cable. There are more than 100 traditional cable channels that are losing subscribers from cable companies and not replacing them with online programming. It’s only a matter of time until many of these networks go dark, as programming revenues won’t cover the cost of operating the network.

It’s easy for people to hate cable companies since that’s who people pay every month. Cable providers like Comcast and AT&T share in the blame since they are among the largest cable providers and are also owners of content. All cable companies share some blame for not yelling bloody murder to the American public for the last decade – and for not fighting back. The cable companies instead started sliding the programming rate increases into hidden fees. However, the fault ultimately lies with the greed of the programmers. These are mostly big publicly traded companies that raise rates every year to please stockholders.

It’s no longer good enough for corporations to make money; they are expected to increase the bottom line quarter after quarter, year after year. We’ve only been talking about cord cutting for a few years, but the industry has been quietly bleeding customers for over a decade. In 2010 there were nearly 105 million subscribers to traditional cable TV, and that number dropped to just over 83 million by the third quarter of 2019. Sadly, the programmers are still denying the reality that they are in a dying industry and are likely to continue raising rates like Fox just did.

The supply and demand side of any sane industry would have gotten together years ago and figured out a way for the industry to be sustainable. However, the combined greed of the programmers and the big cable companies has resulted in the runaway rate increases that will doom traditional cable. It’s hard to know where the tipping point will be, but we’ll be there when cable networks start going dark – it’s just a matter of time.

Is it Time to Sell?

A lot of ISPs hope to someday cash in on their sweat equity by selling the business. There have been some surprisingly high recent valuations in parts of the industry, which raises the question of whether this is a good time to sell an ISP.

Anybody who has considered selling in the last decade knows that valuation multiples have been stagnant and somewhat low by historic standards. A lot of properties have changed hands during that time at multiples in the range of 4.5 to 6.5 times EBITDA (earnings before interest, taxes, depreciation, and amortization). Some ISP properties have sold outside of that range based upon the unique factors of a given sale.

In November, Jeff Johnston of CoBank posted a long blog talking about how valuations might be on the rise – particularly for companies with a lot of fiber or with other upsides. He pointed to three transactions that had valuations higher than historic multiples for the sector.

  • Zayo sold their network of 130,000 route miles of fiber transport for a multiple of 11.1 times EBITDA.
  • Bluebird Network in Missouri and nearby states sold a 6,500-mile fiber transport network for a multiple of 10.4 times EBITDA.
  • Fidelity Communications of Missouri sold an ISP with nearly 135,000 customers for a multiple of 11.7 times EBITDA.

Johnston doesn’t say that these high multiples are the new standard for other ISPs. However, he does surmise that the high multiples probably indicate an uptick in valuation for the whole sector. That can only be proven over time by seeing higher valuations in multiple, smaller transactions – but the cited transactions raise the possibility that we’re seeing an increase in valuation for fiber-based businesses.

It’s important to ask why any buyer would pay 10 or 11 times EBITDA. A buyer paying that much will take a decade to recoup their investment if the purchased business continues to perform at historic levels. Nobody would pay that much for a business unless they expect the margins of the acquired business to improve after acquisition – that’s the key to higher valuations. The buyers of these three businesses are likely expecting significant upsides from the purchased properties.
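The arithmetic behind that point can be sketched quickly. The example below normalizes current EBITDA to 1.0 and counts the years of cumulative earnings needed to recoup an 11x purchase; the 10% growth rate is an illustrative assumption, not a figure from the cited transactions.

```python
def payback_years(multiple: float, annual_growth: float) -> int:
    """Years of cumulative EBITDA needed to recoup a purchase at multiple x EBITDA."""
    price = multiple * 1.0  # normalize current EBITDA to 1.0
    total, ebitda, years = 0.0, 1.0, 0
    while total < price:
        ebitda *= 1 + annual_growth  # next year's earnings
        total += ebitda
        years += 1
    return years

print(payback_years(11.0, 0.0))   # flat earnings: 11 years to recoup
print(payback_years(11.0, 0.10))  # 10% annual growth: 8 years
```

Even modest growth shortens the payback meaningfully, which is why a buyer paying a high multiple is really buying an expectation of growth.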

Buyers often see a one-time bump in margin from the increased efficiency of adding an acquisition to their existing business. This is often referred to as an economy of scale improvement – overheads generally become more affordable as a business gets larger. However, buyers rarely will reward a seller for the economy of scale improvements, so this is rarely built into valuation multiples.

A buyer is usually only willing to pay a high multiple if they foresee the possibility of significant growth from the purchased entity. The purchased company needs to be operating in a footprint with upside potential, or else needs to demonstrate that it knows how to grow. A buyer must believe they can grow the acquired business enough to recoup the purchase price and also make a good return. For a fiber ISP to get a high valuation, it has to convince a buyer that the business has huge upside potential: the ISP needs to already be growing and be able to demonstrate that the growth can continue into the future.

One of the more interesting aspects of getting a high valuation multiple is that a buyer might expect the core management team to remain intact after a sale. That often means that part of the compensation from the sale might be incentive-based and paid in the future based upon post-sale performance.

To summarize, an ISP can get a higher valuation if they can convince a buyer that there is future upside to the business. ISPs that don’t have growth potential will not see the higher valuation multiples cited above – although many potential sellers will think these multiples apply to them. The bottom line is that if your ISP is growing and can keep growing, and you can paint that picture to a buyer, your business might be worth more than you expected.

Is Your Home Listening to You?

When I was a teenager, science fiction books envisioned a future where people talked to their home to take care of mundane tasks. For somebody willing to spend the money on new appliances and devices that future is here today.

Just consider the Amazon Alexa voice assistant, which is installed in the largest number of devices. GE has built Alexa into its new stoves, refrigerators, wall ovens, dishwashers, washers and dryers, and air conditioners. Samsung has built Alexa into refrigerators, washers, dryers, air conditioners, and vacuums. Alexa is built into smart light bulbs, smart wall plugs, televisions, thermostats, smart door locks, security cameras, speakers, and numerous other devices. The chips and/or software to add Alexa to devices are getting cheap and it shouldn’t be long until the app is built into most electronics you might buy.

The convenience of talking to home devices is not without a cost, and companies like Amazon, Apple, and Google are listening to you through the devices. Like other voice assistants, Alexa listens all of the time waiting for a ‘wake word’ that activates the app. There are major privacy and security concerns related to the constant listening. We have to trust the company controlling the device not to listen to us all of the time because there is nothing stopping them from doing so.

Amazon swears they don’t listen or record except for a short period of time after the wake word is spoken. They also swear that they only preserve those recordings in an effort to improve Alexa’s responses to questions. If you are going to use Alexa in your home, you are trusting that Amazon is telling the truth. Back in 2017 Samsung got a huge black eye when they were unable to make that promise concerning their smart TVs.

The other big concern is hacking. There is zero chance that all of the companies making devices that include a voice assistant have iron-clad security. While Amazon really might not be listening to you, a hacker will surely be willing to do so.

To make matters even more uncomfortable, a lot of lawyers and privacy experts believe that if a person knowingly installs a device that listens and transmits information to a third party, that person has waived their Fourth Amendment privacy rights and any rights granted by the Electronic Communications Privacy Act. The concept has not yet been challenged in a court, but if it’s true, then people have no recourse against Amazon or anybody else using the information gathered from a voice assistant device.

My house has four Amazon Echos that we bought when the devices first hit the market. They are convenient and I use them to listen to music, check the weather or news, check the hours at stores or restaurants, and to make the occasional reminder in the middle of the night. My family has gotten uncomfortable with being listened to all of the time and we now unplug the devices when we aren’t using them. This kills all of the spontaneous uses of the devices, but for now, that feels safer than being listened to.

I’m going to be leery about buying any new household appliance that can listen to me. If I can’t disable the listening function, I’m not going to buy the device. It’s impossible to feel secure with these devices right now, and it’s impossible to take the word of a big company that such devices are safe. You only have to look at the current experiences with the hacking of Ring cameras to know that smart home devices are currently anything but safe.

Small ISPs have never worried much about the devices that people hang off their networks. ISPs provide the bandwidth pipe, and how people use data has not been a concern for the ISP. However, that is slowly changing. I have a lot of clients that are now offering smart thermostats, smart security systems, and other smart devices as a way to boost revenue. ISPs need to be careful of any claims they make to customers. Somebody advertising safety for a smart security system might have liability if that system is hacked and the customer exploited.

Maybe I’m being overly cautious, but the idea of somebody I don’t know being able to listen to everything said in my house makes me uncomfortable. As an industry person who has been following the history of IoT devices, I’m even more uncomfortable since it’s now obvious that most smart home devices have lousy security. If you don’t think Amazon is listening to you, I challenge you to activate Alexa and say something vile about Jeff Bezos, then see how much longer it takes to get your next Amazon shipment. Go ahead, I dare you!

The End of Free Conference Calling

Like many of you reading this blog, I have been using the service FreeConferenceCall.com for many years. I got an email from them last week warning that their service will likely go dark, and they wanted users of the service to call Congress to help keep them in business.

Their issue stems back to an FCC order issued in September of last year that seeks to stop the practice of access arbitrage. This FCC summary of the order describes the situation well. Some small telcos have been making money by billing access on ‘free’ minutes generated by services like free conference calling. The process of making money from free calling services has been known in the industry as access arbitrage.

The FCC tried to stop access arbitrage in 2011. At that time, small rural telcos billed a rate of as much as a penny or two per minute to originate or terminate a long-distance call. Some telcos that were allowed to bill the high rates were making a lot of money by originating calls for free outgoing call center services or by terminating calls from 800 numbers, conference calling services, or free chat lines.
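To see why arbitrage was attractive, it helps to run the numbers on a hypothetical free chat line, using the penny-or-two per-minute access rates described above (the traffic volume here is invented purely for illustration):

```python
# Illustrative arbitrage economics, using the penny-or-two per-minute
# access rates cited above. The traffic volume is a made-up example,
# not a figure from any FCC filing.
minutes_per_month = 10_000_000   # hypothetical free chat-line traffic
access_rate = 0.015              # $0.015/minute, mid-range of "a penny or two"

monthly_access_revenue = minutes_per_month * access_rate
print(f"${monthly_access_revenue:,.0f} per month")  # $150,000 per month
```

Since the calls cost the telco almost nothing to terminate, nearly all of that billed access was profit, which is why "free" services were worth subsidizing.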

In the 2011 order, the FCC eliminated the access fees associated with terminating a call, migrating to what it called ‘bill and keep’, hoping that eliminating the access revenues would kill the arbitrage practices. The order was largely effective, and chat lines and other free arbitrage services quickly disappeared.

However, the 2011 order didn’t kill all access charges, and over time the folks who make money with arbitrage found another way to make money with free calling. One of the few access charges left untouched in 2011 was transport, which compensates telcos for the use of the fiber networks connecting telcos to the outside world. I’ve noticed that the caller ID for FreeConferenceCall.com numbers is mostly from Iowa and South Dakota, and I have to assume those calls are being terminated at switches that are remote and can still bill significant miles of transport.

The access fees billed to terminate calls are paid by the carrier that originates the call. This means that most remaining terminating access is paid today by long-distance carriers like AT&T, Sprint, and CenturyLink, which together still sell the bulk of long-distance telephone services. The dollar magnitude of access arbitrage is much smaller than a decade ago – the FCC estimates arbitrage is currently a $40–$60 million problem, down from hundreds of millions of dollars before the 2011 order. But those fees are being billed to long-distance companies that get no benefit from the transaction (thus the term arbitrage – the companies bill the fees because the rules leave a loophole that allows it).

FreeConferenceCall.com is not the only company doing this, and it’s likely that many conference calling services rely wholly or partially on the arbitrage. It’s worth noting that conference call services that use the Internet to place calls will not be affected by this change – because those calls don’t invoke access charges. The carriers billing for the access on the conference calling may or may not be sharing the revenues with companies like FreeConferenceCall.com – in either case, those carriers no longer have any financial reason to continue the practice.

Companies like FreeConferenceCall.com don’t automatically have to go out of business, but the FCC order means a drastic change to the way they do business. For instance, the company could start charging a monthly fee for conference calling – likely forcing this particular company to change its name. They might sell advertisements to those waiting for a conference call to start. They could charge for services like recording calls.

It’s more likely that companies like FreeConferenceCall.com will quietly die or fade away. I tried using the service yesterday and it already seems to be broken. This latest FCC order probably puts the final nail into the coffin of access arbitrage – although I’ve learned to never say never. As long as there are any fees for calling based upon regulatory orders, there is a chance that somebody will find a way to generate lots of calls that fit the circumstance and get enriched by the arbitrage.

The RDOF Grants – The Good and Bad News

The FCC recently approved a Notice of Proposed Rulemaking that proposes how it will administer the $16 billion in RDOF grants that are going to be awarded later this year. As you might imagine, there is both good news and bad news coming from the grant program.

It’s good news that this grant program ought to go a long way towards finally killing off large chunks of big telco rural copper. Almost every area covered by these grants is poorly served today by inadequate rural DSL.

The related bad news is that this grant award points out the huge failure of the FCC’s original CAF II program where the big telcos were given $11 billion to upgrade DSL to at least 10/1 speeds. The FCC is still funding this final year of construction of CAF II upgrades. The new grant money will cover much of the same geographic areas as the original CAF II deployment, meaning the FCC will spend over $27 billion to bring broadband to these rural areas. Even after the RDOF grants are built, many of these areas won’t have adequate broadband. Had the FCC administered both grant programs smartly, most of these areas could be getting fiber.

Perhaps the best good news is that a lot of rural households will get faster broadband. Ironically, since the grants cover rural areas, there will be cases where the RDOF grant brings faster broadband to farms than will be available in the county seat, where no grant money is available.

There is bad news on broadband speeds, since the new grant program only requires speeds of 25/3 Mbps. This means the FCC is repeating the same huge mistake it made with CAF II by allowing federal money to be spent on broadband that will be obsolete before it’s even built. The grants will be paid out over ten years and require deployment within six years – anybody paying attention to broadband understands that six years from now, a 25/3 Mbps broadband connection will feel glacial. There is grant weighting to promote faster data speeds, but due to the vagaries of a reverse auction, there will be plenty of funding given to networks with performance close to 25/3 Mbps.
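For readers unfamiliar with how a weighted reverse auction works, here is a sketch. The tier and latency weights below are the ones proposed for RDOF at the time (50 for the minimum 25/3 tier, 35 for 50/5, 20 for 100/20, 0 for gigabit, plus 40 for high-latency service) – treat them as illustrative, since the final auction rules could differ:

```python
# Reverse-auction weighting sketch. Weights follow the values proposed
# for RDOF (Minimum 25/3: 50, Baseline 50/5: 35, Above Baseline 100/20: 20,
# Gigabit: 0; +40 for high-latency service) -- illustrative, not final rules.
TIER_WEIGHT = {"25/3": 50, "50/5": 35, "100/20": 20, "gigabit": 0}
HIGH_LATENCY_WEIGHT = 40

def auction_score(bid_pct_of_reserve, tier, high_latency=False):
    """Lower score wins the area; weights penalize slower or laggier bids."""
    penalty = HIGH_LATENCY_WEIGHT if high_latency else 0
    return bid_pct_of_reserve + TIER_WEIGHT[tier] + penalty

# A gigabit bidder asking 70% of the reserve price still beats a 25/3
# bidder asking only 40% of the reserve:
print(auction_score(70, "gigabit"))     # 70
print(auction_score(40, "25/3"))        # 90
print(auction_score(40, "25/3", True))  # 130
```

The weighting helps faster bidders, but a minimum-tier bidder willing to take a deep discount – or facing no competing bids at all – can still walk away with the funding.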

There is further bad news since the FCC is basing the grants upon faulty broadband maps. Funding will only be made available to areas that don’t show 25/3 Mbps capability on the FCC maps. Everybody in the industry, including individual FCC Commissioners, agrees that the current maps, based upon Form 477 data provided by ISPs, are dreadful. In the last few months, I’ve worked with half a dozen counties where the FCC maps falsely show large swaths of 25/3 broadband coverage that isn’t there. It’s definitely bad news that the grant money won’t be made available in areas where the maps overstate broadband coverage – folks in those areas will pay the penalty for inadequate broadband maps.

There is a glimmer of good news with mapping since the FCC will require the big ISPs to report broadband mapping data using polygons later this year. Theoretically, polygons will solve some of the mapping errors around the edges of towns served by cable TV companies. But there will only be time for one trial run of the new maps before the grants, and the big telcos have every incentive to exaggerate speeds in this first round of polygon mapping if it will keep this big pot of money from overbuilding their copper. I don’t expect the big telco mapping to be any better with the polygons.

Another area of good news is that there will be a lot of good done with these grants. There will be rural electric cooperatives, rural telcos, and fiber overbuilders that will use these grants as a down-payment to build rural fiber. These grants are not nearly large enough to pay for the full cost of rural fiber deployment, but these companies will borrow the rest with the faith that they can create a sustainable broadband business using fiber.

The bad news is that there will be plenty of grant money that will be used unwisely. Any money given to the traditional satellite providers might as well just be burned. Anybody living in an area where a satellite provider wins the grant funding won’t be getting better broadband or a new option. There is nothing to stop the big telcos from joining the auction and promising to upgrade to 25/3 Mbps on DSL – something they’ll promise but won’t deliver. There are likely to be a few grant recipients who will use the money to slap together a barely adequate network that won’t be fast and won’t be sustainable – there is a lot of lure in $16 billion of free federal money.

It’s dismaying that there should be so many potential downsides. A grant program of this magnitude could be a huge boost to rural broadband. Many areas will be helped and there will be big success stories – but there is also likely to be a lot of bad news about grant money spent unwisely.

Federal Subsidies for Satellite Broadband

In December, the FCC awarded $87 million from the CAF II Reverse auction held last summer for satellite broadband. The bulk of the satellite awards went to Viasat, which will supposedly use the money to bring broadband to 123,000 homes in seventeen states. The grant awards are meant to bring 25/3 Mbps broadband to areas that don’t have it today.

I have several problems with this award. First is that the satellite companies already cover these areas today and have been free to sell and market in these areas. The federal grant money doesn’t bring a new broadband alternative to anybody in rural America.

Second, the satellite companies aren’t required to connect any specific number of new customers as a result of the grant awards. They are largely free to just pocket the grants directly as profits. Even when they do connect a new customer, they don’t build any lasting broadband infrastructure – they only install an antenna at each new customer’s home.

Third, rural residents don’t seem to want satellite broadband. In a large 2017 Census Bureau survey, 21% of people in the US described their neighborhood as rural (52% chose suburban and 27% said urban). In the quarter ending in June 2019, Viasat claimed 587,000 rural customers in the US. Measured against the roughly 27 million rural households implied by that survey (21% of the country’s 128 million households), that works out to only about a 2% market penetration, even if every one of those customers is in rural America.
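The penetration arithmetic can be checked directly from the survey share and the subscriber count quoted above:

```python
# Checking the rural market-penetration math from the figures above.
total_households = 128_000_000   # US households cited in the text
rural_share = 0.21               # share describing their neighborhood as rural
viasat_rural_subs = 587_000      # Viasat's claimed rural customers, Q2 2019

rural_households = total_households * rural_share   # ~26.9 million
penetration = viasat_rural_subs / rural_households
print(f"{penetration:.1%} of rural households")     # 2.2% of rural households
```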

CCG has been doing broadband surveys for twenty years, and I don’t know that we’ve ever talked to a satellite customer who was happy with their broadband. In every survey, we seem to encounter more people who dropped satellite service than those who still have it. Customers complain that satellite costs too much – Viasat claimed in its most recent financial report that the average residential broadband bill is $84.26. Customers also hate the high latency, which can be 10 to 15 times higher than terrestrial broadband. The latency is due to the satellites, which are parked almost 22,200 miles above the earth – a round-trip communication over that distance takes almost half a second.
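The latency follows directly from the geometry: a request has to traverse the earth-to-satellite leg four times (up and down to reach the ground station, then up and down to return). A quick back-of-the-envelope check, using the ~22,200-mile altitude figure from the text:

```python
# Back-of-the-envelope geostationary latency, using the ~22,200-mile
# altitude quoted in the text. Ignores terrestrial hops and processing delay.
SPEED_OF_LIGHT_M_PER_S = 299_792_458
METERS_PER_MILE = 1_609.344
altitude_m = 22_200 * METERS_PER_MILE

# Four traversals of the earth-to-satellite leg per round trip:
# customer -> satellite -> ground station, then back the same way.
round_trip_s = 4 * altitude_m / SPEED_OF_LIGHT_M_PER_S
print(f"{round_trip_s * 1000:.0f} ms")  # ~477 ms before any other delay
```

For comparison, terrestrial broadband latency is typically a few tens of milliseconds, which is where the 10-to-15-times figure comes from.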

The other primary complaint about satellite broadband is the tiny monthly data caps. The company’s products that would satisfy the FCC grant speed requirements start with the Unlimited Silver 25 plan at $70, with speeds up to 25 Mbps and a monthly data cap of 60 gigabytes. The fastest plan is the Unlimited Platinum 100 plan at $150, with speeds up to 100 Mbps and a data cap of 150 gigabytes. Unlike cellular plans, where a customer can buy more data, the Viasat plans throttle customers to speeds reported to be less than 1 Mbps once they hit the cap. To put those caps into perspective, OpenVault recently reported that the average US home uses 274 gigabytes of data per month, and the average cord-cutting home uses 520 gigabytes per month. Satellite broadband is impractical for any home with school students or even a modest amount of video streaming.
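To translate those caps into everyday terms, here is a rough streaming budget. The ~3 GB per hour of HD video is a common rule of thumb, not a Viasat figure:

```python
# Rough HD-streaming budget under the Viasat caps quoted above,
# assuming ~3 GB per hour for HD video (a common rule of thumb,
# not a figure from Viasat).
GB_PER_HD_HOUR = 3
plans = {"Unlimited Silver 25": 60, "Unlimited Platinum 100": 150}

for name, cap_gb in plans.items():
    hours = cap_gb / GB_PER_HD_HOUR
    print(f"{name}: ~{hours:.0f} hours of HD video per month")  # 20 and 50 hours
```

Twenty hours a month is well under an hour of video a day for the whole household, before counting any other use of the connection.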

Viasat won the grant funding due to a loophole in the grant program. The program funding was available to anybody that offers broadband of at least 25/3 Mbps. The grant program was intended to deliver a new broadband alternative to rural households – something that satellite broadband does not do. The funding was provided under a reverse auction, and the satellite companies likely placed bids for every eligible rural market – they would have been the default winner for any area that had no other bidder. Even where there was another bidder, a reverse auction goes to the lowest bidder, and there is no amount too small for the satellite companies to accept, since they don’t have to make capital expenditures to satisfy the grants.

Giving money to satellite providers makes no sense as broadband policy. They don’t bring new broadband to anybody since the satellite plans are already available. The plans are expensive, have high latency and low monthly data caps.

The much larger RDOF grant program will award $16.4 billion in 2020 for rural broadband and the satellite companies must be ecstatic. If the FCC doesn’t find a way to keep the satellite companies out of this coming auction, the satellite companies could score a billion-dollar windfall. They can do so without offering any products that are not already available today.

To put these grants into perspective, the $87 million award is roughly the same size as the money that has been awarded over several years in the Minnesota Border-to-Border grant program. The Minnesota grants have helped fund dozens of projects, many of which built fiber in the state. There is no comparison between the benefits of the state grant program and the nearly total absence of benefit from handing federal money to the satellite companies.