Is the FCC Killing State Matching Grants?

In a bizarre last-minute change to the rules approved for the upcoming $16.4 billion RDOF grant program, the FCC inserted language that would seem to bar grant applicants from accepting matching state grants for projects funded by RDOF.

The new language specifically says that the RDOF grant program now excludes any geographic area that the Commission “know[s] to be awarded funding through the U.S. Department of Agriculture’s ReConnect Program or other similar federal or state broadband subsidy programs or those subject to enforceable broadband deployment obligations.”

It’s fully understandable that the FCC doesn’t want to award money from multiple federal grant programs for the same project – that was a sensible loophole to close. I think most industry folks understood this to be true even if it wasn’t in writing.

But the idea of blocking states from making grants to supplement RDOF is counterintuitive. More than half of the states now have state broadband grant programs. It makes no sense for the FCC to tell states how they can spend (or in this case how they cannot spend) their state grant monies.

The whole concept of blocking state matching grants goes against the tradition of federal funding. The vast majority of federal funding programs for infrastructure encourage state matching funds and many programs require it. Matching state grants are used along with federal grants for building infrastructure such as roads, bridges, water and sewer systems, airports, etc. Why would the FCC want to block this for broadband?

The state grant programs that I’m most familiar with were planning to provide matching grants for some RDOF grants. Broadband offices at the state level understand that building broadband networks can be expensive and they know that in some cases the extra funding is needed to make broadband projects viable.

It’s important to remember that the RDOF grants are aimed at the most remote customers in the country – customers that, by definition, will require the largest investment per customer to bring broadband. This is due almost entirely to the lower household densities in the RDOF grant areas. Costs can also be driven up by local conditions like rocky soil or rough terrain. Federal funding that provides enough money to build broadband in the plains states is likely not going to be enough to induce somebody to build in the remote parts of Appalachia where the RDOF grants are most needed.

State grant programs often also have other agendas. For example, the Border-to-Border grants in Minnesota won’t fund broadband projects that can’t achieve at least 100 Mbps download speeds. This was a deliberate decision so that government funding wouldn’t be wasted building broadband infrastructure that will be too slow and obsolete soon after it’s constructed. By contrast, the FCC RDOF program is allowing applicants to propose speeds as slow as 25 Mbps. It’s not hard to argue that that speed is already obsolete.

I know ISPs that were already hoping for a combination of federal and state grants to build rural infrastructure. If the FCC kills matching grants, it will be killing the plans of ISPs that wanted to use the combined grants to build fiber networks – a permanent broadband solution. Even with both state and federal grants, these ISPs were planning to take on a huge debt burden to make it work.

If the matching grants are killed, I have no doubt that the RDOF money will still be awarded to somebody. However, instead of going to a rural telco or electric coop that wants to build fiber, the grants will go to the big incumbent telephone companies, which will waste the money pretending to goose rural DSL up to 25 Mbps. Even worse, much of the funding might go to the satellite companies, which offer nothing new and sell a product that people hate. I hate to engage in conspiracy theories, but one of the few justifications I can see for killing matching grants is to make it easier for the big incumbent telcos to win, and waste, another round of federal grant funding.

Letters of Credit

One of the dumbest rules suggested by the FCC for the new $16.4 billion RDOF grants is that an ISP must provide a letter of credit (LOC) to guarantee that it will be able to meet its obligation to provide the matching funds for the grants. The FCC had a number of grant winners years ago in the stimulus broadband grant program that never found financing, and the FCC is clearly trying to avoid a repeat of that situation. A coalition of major industry associations – NTCA, INCOMPAS, USTelecom, NRECA, WTA, and WISPA – recently wrote a letter to the FCC asking it to remove the LOC requirement.

There may be no better example of how out of touch Washington DC is with the real world – whoever at the FCC came up with this requirement seems to have no idea what a letter of credit is. A letter of credit is a formal negotiable instrument – a promissory note, like a check – in which a bank promises to honor the obligation of the buyer of the letter of credit should that buyer fail to meet a specific obligation. The most common use of LOCs is in international trade or in transactions between companies that don’t know or trust each other. An example might be a company that agrees to buy $100,000 of bananas from a wholesaler in Costa Rica, payable upon delivery of the bananas to the US. The US buyer of the bananas will obtain a letter of credit, giving assurance to the wholesaler that they’ll get paid. When the bananas are received in the US, the bank is obligated to pay for the bananas if the buyer fails to do so.

Banks consider letters of credit to be the equivalent of loans. The banks must set aside the amount of pledged money in case they are required to disburse the funds. Most letters of credit are only active for a short, defined period of time. It’s highly unusual for a bank to issue a letter of credit that would last as long as the six years required by the RDOF grant process.

Letters of credit are expensive. A bank holds the pledged cash in escrow for the active life of the LOC and expects to be compensated for the interest it could otherwise have earned. There are also big upfront fees to establish an LOC because the bank has to evaluate an LOC holder the same way it would evaluate a borrower. Banks also require significant collateral that they can seize should the letter of credit ever be drawn and the bank have to pay out the cash.

I’m having trouble understanding who the letter of credit would benefit in this situation. When the FCC makes an annual grant payment to an ISP, it expects that ISP to be building network – 40% of the RDOF network must be completed by the end of year 3, with 20% more to be completed in each of the next three years. The ISP would be expected each year to have the cash available to pay for fiber, work crews, electronics, engineers, etc. You can’t buy a letter of credit that would be payable to those future, undefined parties.

I think the FCC believes the letter of credit would be used to fund the ISP so it could construct the network. No bank is going to provide a letter of credit where the payee is also the purchaser of the LOC – in banking terms that would be an ISP paying an upfront fee for a guaranteed loan to be delivered later should that ISP not find a loan elsewhere. It’s absurd to think banks would issue such a financial instrument. An ISP that defaults on an LOC is likely in financial straits, so having an LOC in place would have the opposite effect of what the FCC wants – rather than guarantee future funds, a bank would likely seize the assets of the ISP when the LOC is exercised.

A letter of credit has significant implications for the ISP that buys it. Any bank considering lending to the ISP will consider an LOC to be the same as outstanding debt – thus reducing the amount of other money the ISP can borrow. A long-term LOC would tie up a company’s borrowing capacity for the length of the LOC, making it that much harder to finance the RDOF project.

The coalition writing the letter to the FCC claims, correctly, that requiring letters of credit would stop a lot of ISPs from applying for the grants. Any ISP that can’t easily borrow large amounts of money from a commercial bank is not going to get an LOC. Even ISPs that can get a letter of credit might decide it makes accepting the grant too costly. The coalition estimates that the aggregate cost of obtaining letters of credit for RDOF could be as much as $1 billion for the grant recipients – my guess is that the estimate is conservatively low.

One of the groups this requirement might cause problems for is ISPs that obtain their funding from the federal RUS program. These entities – mostly telcos and electric cooperatives – would have to go to a commercial bank to get an LOC. If their only debt is with the RUS, banks might not be willing to issue an LOC, regardless of the strength of their balance sheet, since the banks would have no easy way to secure collateral for the LOC.

Hopefully, the FCC comes to its senses, or the RDOF grant program might be a bust before it even gets started. I’m picturing ISPs going to banks and explaining the FCC requirements and seeing blank stares from bankers who are mystified by the request.

Is 5G Radiation Safe?

There is a lot of public sentiment against placing small cell sites on residential streets. There is a particular fear of broadcasting higher millimeter-wave frequencies near homes since these frequencies have never been in widespread use before. In the public’s mind, higher frequencies mean a higher danger of health problems related to exposure to radiofrequency emissions. The public’s fears are further stoked when they hear that Switzerland and Belgium are limiting the deployment of millimeter-wave radios until there is better proof that they are safe.

The FCC released a report and order on December 4 that is likely to add fuel to the fire. The agency rejected all claims that there is any public danger from radiofrequency emissions and affirmed the existing exposure rules. The FCC said that none of the thousand filings made in the docket provided any scientific evidence that millimeter-wave and other 5G frequencies are dangerous.

The FCC is right in its assertion that there are no definitive scientific studies linking cellular frequencies to cancer or other health issues. However, the FCC misses the point – most of those asking for caution, including scientists, agree with that assessment. The public has several specific fears about the new frequencies being used:

  • First is the overall range of new frequencies. In the recent past, the public was widely exposed to relatively low frequencies from radio and TV stations, a fairly narrow range of cellular frequencies, and two bands of WiFi. The FCC is in the process of approving dozens of new frequency bands that will be widely used where people live and work. The fear is not so much that any given frequency is dangerous, but rather that being bombarded by a large range of frequencies will create unforeseen problems.
  • People are also concerned that cellular transmitters are moving from tall towers, which have normally been located away from housing, to small cell sites on poles located on residential streets. The fear is that these transmitters generate a lot of radiation close to the transmitter – which is true. The amount of radiated energy that strikes a given area decreases rapidly with distance from the transmitter. The anecdote that I’ve seen repeated on social media is of a cell site placed fifteen feet from the bedroom of a child. I have no idea if there is a real small cell site that is the genesis of this claim – but there could be. In dense urban neighborhoods, there are plenty of streets where telephone poles are within a few feet of homes. I admit that I would be leery about having a small cell site directly outside one of my windows.
  • The public worries because they know that there will always be devices that don’t meet the FCC guidelines. As an example, the Chicago Tribune tested eleven smartphones in August and found that a few of them were emitting radiation at twice the FCC’s maximum allowable limit. The public understands that vendors play loose with regulatory rules and that the FCC largely ignores such violations.

The public has no particular reason to trust this FCC. The FCC under Chairman Pai has sided with the large carriers on practically every issue in front of the Commission. This is not to say that the FCC didn’t give this docket the full consideration that should be given to all dockets – but the public perception is that this FCC would side with the cellular carriers even if there was a public health danger.

The FCC order is also not particularly helped by citing the buy-in of the Food and Drug Administration on the safety of the radiation. The FDA has licensed dozens of medicines that later proved to be harmful, so it also doesn’t garner a lot of public trust.

The FCC made a few changes with this order. They have mandated a new set of warning signs to be posted around transmitters. It’s doubtful that anybody outside of the industry will understand the meaning of the color-coded warnings. The FCC is also seeking comments on whether exposure standards should be changed for frequencies below 100 kHz and above 6 GHz. The agency is also going to exempt certain kinds of transmitters from FCC testing.

I’ve read extensively on both sides of the issue and it’s impossible to know the full story. For example, a majority of scientists in the field signed a petition to the United Nations warning against using higher frequencies without more testing. But it’s also easy to be persuaded by other scientists who say that higher frequencies don’t even penetrate the skin. I’ve not heard of any studies that look at exposing people to a huge range of different low-power frequencies.

This FCC is in a no-win position. The public rightly perceives the agency as being pro-carrier, and anything the FCC says is not going to persuade those worried about radiation risks. I tend to side with the likelihood that the radiation is not a big danger, but I also have to wonder if there will be any impact after a tenfold expansion of the range of frequencies we’re exposed to. The fact is that we’re not likely to know until after we’ve all been exposed for a decade.

Killing 3G

I have bad news for anybody still clinging to their flip phones. All of the big cellular carriers have announced plans to end 3G cellular service, and each has a different timeline in mind:

  • Verizon previously said they would stop supporting 3G at the end of 2019, but now says it will end service at the end of 2020.
  • AT&T has announced the end of 3G to be coming in early 2022.
  • Sprint and T-Mobile have not announced a specific date but are both expected to stop 3G service sometime in 2020 or 2021.

The amount of usage on 3G networks is still significant. GSMA reported that at the end of 2018 as many as 17% of US cellular customers still made 3G connections, accounting for as much as 19% of all cellular connections.

The primary reason cited for ending 3G is that the technology is far less efficient than 4G. A 3G connection to a cell site chews up the same amount of frequency resources as a 4G connection yet delivers far less data to customers. The carriers are also anxious to free up mid-range spectrum for upcoming 5G deployment.

Opensignal measures actual speed performance for millions of cellular connections and recently reported the following statistics for the average 3G and 4G download speeds as of July 2019:

Carrier     4G (2019)    3G (2019)
AT&T        22.5 Mbps    3.3 Mbps
Sprint      19.2 Mbps    1.3 Mbps
T-Mobile    23.6 Mbps    4.2 Mbps
Verizon     22.9 Mbps    0.9 Mbps
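
A quick calculation from those Opensignal numbers shows how lopsided the per-connection efficiency is – a minimal sketch using only the figures in the table above:

```python
# Ratio of 4G to 3G download speeds, from the Opensignal table above.
speeds_mbps = {              # carrier: (4G, 3G), July 2019
    "AT&T": (22.5, 3.3),
    "Sprint": (19.2, 1.3),
    "T-Mobile": (23.6, 4.2),
    "Verizon": (22.9, 0.9),
}

for carrier, (lte, umts) in speeds_mbps.items():
    print(f"{carrier}: 4G delivers roughly {lte / umts:.0f}x the speed of 3G")
```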

The carriers have been hesitant to end 3G because a significant number of rural cell sites still don’t offer 4G. The cellular carriers were counting on funding from the FCC’s Mobility Fund Phase II to upgrade rural cell sites. However, that funding program got derailed and delayed when the FCC found there were massive errors in the data provided for distributing that fund. The big carriers were accused by many of rigging the data in a way that would give more funding to themselves instead of to smaller rural cellular providers.

The FCC staff conducted significant testing of the reported speed and coverage data and released a report of their findings in December 2019. The testing showed that the carriers have significantly overreported 4G coverage and speeds across the country. This report is worth reading for anybody that needs to be convinced of the garbage data that has been used for the creation of FCC broadband maps. I wish the FCC Staff would put the same effort into investigating landline broadband data provided to the FCC. The FCC Staff recommended that the agency should release a formal Enforcement Advisory including ‘a detailing of the penalties associated with carrier filings that violate federal law’.

The carriers are also hesitant to end 3G since a lot of customers still use the technology. Opensignal cites several reasons for the continued use of 3G. First, 12.7% of 3G users live in rural areas where 3G is the only cellular technology available. Another 4.1% of 3G users still own old flip phones that are not capable of receiving 4G. The biggest category of 3G users is customers who own a 4G-capable phone but still subscribe to a 3G data plan. AT&T is the largest provider of such plans and has not forced customers to upgrade to 4G plans.

The carriers need to upgrade rural cell sites to 4G before they can be allowed to kill 3G for good. In doing so, they need to migrate customers to 4G data plans and notify customers who still use 3G-only flip phones that it’s finally time to upgrade.

One aspect of the 3G issue that nobody is talking about is that AT&T says it is using fixed wireless connections to meet its CAF II buildout requirements. Since the CAF II areas include some of the most remote landline customers, it stands to reason that these are the same areas that are likely to still be served with 3G cell towers. AT&T can’t deliver 10/1 Mbps or faster speeds using 3G technology. This makes me wonder what AT&T has been telling the FCC in terms of meeting their CAF II build-out requirements.

The End of Free Conference Calling

Like many of you reading this blog, I have been using FreeConferenceCall.com for many years. I got an email from them last week warning that their service will likely go dark, and they want users of the service to call Congress to help keep them in business.

Their issue stems back to an FCC order issued in September of last year that seeks to stop the practice of access arbitrage. This FCC summary of the order describes the situation well. Some small telcos have been making money by billing access on ‘free’ minutes generated by services like free conference calling. The process of making money from free calling services has been known in the industry as access arbitrage.

The FCC tried to stop access arbitrage in 2011. At that time, small rural telcos billed a rate of as much as a penny or two per minute to originate or terminate a long-distance call. Some telcos that were allowed to bill the high rates were making a lot of money by originating calls for free outgoing call center services or by terminating calls from 800 numbers, conference calling services, or free chat lines.

In the 2011 order, the FCC eliminated the access fees associated with terminating a call, migrating to what the FCC called ‘bill and keep’, and they hoped that eliminating the access revenues would kill the arbitrage practices. The FCC order was largely effective and chat lines and other free arbitrage services quickly disappeared.

However, the 2011 order didn’t kill all access charges, and over time the folks who make money with arbitrage found another way to make money with free calling. One of the few access charges left untouched in 2011 was transport, which compensates telcos for the use of the fiber networks connecting them to the outside world. I’ve noticed that the caller ID for FreeConferenceCall.com numbers is mostly from Iowa and South Dakota, and I have to assume those calls are being terminated at switches that are remote and can still bill significant miles of transport.

The access fees billed to terminate calls are paid by the carrier that originates the call. This means that most remaining terminating access is paid today by long-distance carriers like AT&T, Sprint and CenturyLink, which together still sell the bulk of long-distance telephone services. The dollar magnitude of access arbitrage is much smaller than a decade ago. The FCC estimates arbitrage is currently a $40 – $60 million problem, whereas it was hundreds of millions before the FCC’s 2011 order. But those fees are being billed to the long-distance companies that get no benefit from the transaction (thus the term arbitrage – the companies are billing the fees because the rules allow a loophole to do so).
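
To see why the long-distance carriers care, here is a purely illustrative sketch – the per-minute transport rate and the calling volume below are made-up round numbers for the sake of the arithmetic, not actual tariff figures:

```python
# Hypothetical illustration of the access arbitrage economics described above.
# Both inputs are assumptions chosen for round math, not real tariff rates.
transport_rate_per_minute = 0.01   # assumed $0.01 per minute of billable transport
minutes_per_month = 5_000_000      # assumed conference-calling minutes terminated

monthly_billing = transport_rate_per_minute * minutes_per_month
print(f"Access billed to long-distance carriers: ${monthly_billing:,.0f} per month")
# The callers pay nothing, so the whole amount lands on the carriers that
# originate the calls and get no benefit from the transaction.
```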

FreeConferenceCall.com is not the only company doing this, and it’s likely that many conference calling services rely wholly or partially on the arbitrage. It’s worth noting that conference call services that use the Internet to place calls will not be affected by this change – those calls don’t invoke access charges. The carriers billing for the access on the conference calling may or may not be sharing the revenues with companies like FreeConferenceCall.com – in either case, those carriers no longer have any financial reason to continue the practice.

Companies like FreeConferenceCall.com don’t automatically have to go out of business, but the FCC order means a drastic change to the way they do business. For instance, the company could start charging a monthly fee for conference calling – likely forcing this particular company to change its name. It might sell advertisements to those sitting waiting for a conference call to start. It could charge for services like recording calls.

It’s more likely that companies like FreeConferenceCall.com will quietly die or fade away. I tried using the service yesterday and it already seems to be broken. This latest FCC order probably puts the final nail in the coffin of access arbitrage – although I’ve learned to never say never. As long as there are any fees for calling based upon regulatory orders, there is a chance that somebody will find a way to generate lots of calls that fit the circumstance and get enriched by the arbitrage.

The RDOF Grants – The Good and Bad News

The FCC recently approved a Notice of Proposed Rulemaking that proposes how it will administer the $16 billion in RDOF grants that are going to be awarded later this year. As you might imagine, there is both good news and bad news coming from the grant program.

It’s good news that this grant program ought to go a long way towards finally killing off large chunks of big telco rural copper. Almost every area covered by these grants is poorly served today by inadequate rural DSL.

The related bad news is that this grant award points out the huge failure of the FCC’s original CAF II program where the big telcos were given $11 billion to upgrade DSL to at least 10/1 speeds. The FCC is still funding this final year of construction of CAF II upgrades. The new grant money will cover much of the same geographic areas as the original CAF II deployment, meaning the FCC will spend over $27 billion to bring broadband to these rural areas. Even after the RDOF grants are built, many of these areas won’t have adequate broadband. Had the FCC administered both grant programs smartly, most of these areas could be getting fiber.

Perhaps the best good news is that a lot of rural households will get faster broadband. Ironically, since the grants cover rural areas, there will be cases where the RDOF grant brings faster broadband to farms than will be available in the county seat, where no grant money is available.

There is bad news on broadband speeds since the new grant program only requires speeds of 25/3 Mbps. This means the FCC is repeating the same huge mistake it made with CAF II by allowing federal money to be spent on broadband that will be obsolete before it’s even built. This grant program will be paid out over ten years and requires deployment over six years – anybody paying attention to broadband understands that six years from now a 25/3 Mbps broadband connection will feel glacial. There is grant weighting to promote faster data speeds, but due to the vagaries of a reverse auction, there will be plenty of funding given to networks that will perform at little better than 25/3 Mbps.
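
To put the six-year deployment window in perspective, here is a rough sketch of how quickly a 25 Mbps target falls behind. The roughly 20% annual growth rate and the starting assumption that 25 Mbps loosely matches an average home today are my own assumptions, not figures from the FCC order:

```python
# How a fixed 25 Mbps target compares to household needs that keep growing.
# The ~20% annual growth rate is an assumption, not an FCC figure.
annual_growth = 0.20
need_mbps = 25.0   # assume 25 Mbps loosely matches an average home today

for year in range(1, 7):
    need_mbps *= 1 + annual_growth
    print(f"Year {year}: an equivalent household would want ~{need_mbps:.0f} Mbps")
```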

There is further bad news since the FCC is basing the grants upon faulty broadband maps. Funding will only be made available to areas that don’t show 25/3 Mbps capability on the FCC maps. Everybody in the industry, including individual FCC Commissioners, agrees that the current maps based upon 477 data provided by ISPs are dreadful. In the last few months, I’ve worked with half a dozen counties where the FCC maps falsely show large swaths of 25/3 broadband coverage that isn’t there. It’s definitely bad news that the grant money won’t be made available in those areas where the maps overstate broadband coverage – folks in such areas will pay the penalty for inadequate broadband maps.

There is a glimmer of good news with mapping since the FCC will require the big ISPs to report broadband mapping data using polygons later this year. Theoretically, polygons will solve some of the mapping errors around the edges of towns served by cable TV companies. But there will only be time for one trial run of the new maps before the grants, and the big telcos have every incentive to exaggerate speeds in this first round of polygon mapping if it will keep this big pot of money from overbuilding their copper. I don’t expect the big telco mapping to be any better with the polygons.

Another piece of good news is that a lot of good will be done with these grants. There will be rural electric cooperatives, rural telcos, and fiber overbuilders that will use these grants as a down payment on building rural fiber. These grants are not nearly large enough to pay for the full cost of rural fiber deployment, but these companies will borrow the rest with the faith that they can create a sustainable broadband business using fiber.

The bad news is that there will be plenty of grant money that will be used unwisely. Any money given to the traditional satellite providers might as well just be burned. Anybody living in an area where a satellite provider wins the grant funding won’t be getting better broadband or a new option. There is nothing to stop the big telcos from joining the auction and promising to upgrade to 25/3 Mbps on DSL – something they’ll promise but won’t deliver. There are likely to be a few grant recipients who will use the money to slap together a barely adequate network that won’t be fast and won’t be sustainable – there is a lot of lure in $16 billion of free federal money.

It’s dismaying that there should be so many potential downsides. A grant program of this magnitude could be a huge boost to rural broadband. Many areas will be helped and there will be big success stories – but there is also likely to be a lot of bad news about grant money spent unwisely.

Federal Subsidies for Satellite Broadband

In December, the FCC awarded $87 million for satellite broadband out of the CAF II reverse auction held last summer. The bulk of the satellite awards went to Viasat, which will supposedly use the money to bring broadband to 123,000 homes in seventeen states. The grant awards are meant to bring 25/3 Mbps broadband to areas that don’t have it today.

I have several problems with this award. First is that the satellite companies already cover these areas today and have been free to sell and market in these areas. The federal grant money doesn’t bring a new broadband alternative to anybody in rural America.

Second, the satellite companies aren’t required to connect any specific number of new customers as a result of the grant awards. They are largely free to just pocket the grants directly as profits. Even when they do connect a new customer, they don’t build any lasting broadband infrastructure – they only install an antenna at each new customer’s home.

Third, rural residents don’t seem to want satellite broadband. In a large survey by the Census Bureau in 2017, 21% of people in the US described their neighborhood as rural (52% chose suburban and 27% said urban). In the quarter ending in June 2019, Viasat claimed 587,000 rural broadband customers in the US. With roughly 128 million households in the country, about 27 million of which are rural, that works out to a rural market penetration of only a little over 2%.
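
The arithmetic behind that penetration estimate, using only the Census and Viasat figures above:

```python
# Rural market penetration implied by the figures cited above.
total_households = 128_000_000
rural_share = 0.21                  # Census: 21% describe their area as rural
viasat_rural_customers = 587_000

rural_households = total_households * rural_share
penetration = viasat_rural_customers / rural_households
print(f"Rural households: ~{rural_households / 1e6:.0f} million")
print(f"Viasat penetration of rural America: ~{penetration:.1%}")
```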

CCG has been doing broadband surveys for twenty years and I don’t know that we’ve ever talked to a satellite customer who was happy with their broadband. In every survey, we seem to encounter more people who dropped satellite service than people who still have it. Customers complain that satellite costs too much – Viasat said in its most recent financial report that the average residential broadband bill is $84.26. Customers also hate the high latency, which can be 10 to 15 times higher than for terrestrial broadband. The latency is due to the satellite, which is parked almost 22,200 miles above the earth – it takes a while for a round-trip communication over that distance.
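
The latency is easy to sanity-check from the geometry alone – a minimal sketch using the roughly 22,200-mile orbital altitude mentioned above and the speed of light (the 30–50 ms terrestrial comparison figures are assumed typical values, not measurements):

```python
# Back-of-the-envelope minimum latency for geostationary satellite broadband.
SPEED_OF_LIGHT_MILES_PER_SEC = 186_282
ALTITUDE_MILES = 22_200

# A request travels user -> satellite -> ground station, and the reply makes
# the same trip in reverse, so the signal covers four times the altitude.
round_trip_ms = 4 * ALTITUDE_MILES / SPEED_OF_LIGHT_MILES_PER_SEC * 1000
print(f"Minimum round-trip latency: {round_trip_ms:.0f} ms")   # ~477 ms

# Compared to assumed typical terrestrial latencies of 30-50 ms:
for terrestrial_ms in (30, 50):
    ratio = round_trip_ms / terrestrial_ms
    print(f"vs {terrestrial_ms} ms terrestrial: ~{ratio:.0f}x higher")
```

That is the physical floor before any network processing delays are added.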

The biggest complaint about satellite broadband is the tiny monthly data caps. The company’s products that would satisfy the FCC grant speed requirements start with the Unlimited Silver 25 plan at $70, with speeds up to 25 Mbps and a monthly data cap of 60 gigabytes. The fastest plan is the Unlimited Platinum 100 plan at $150, with speeds up to 100 Mbps and a data cap of 150 gigabytes. Unlike cellular plans where a customer can buy more data, the Viasat plans throttle customers to speeds reported to be less than 1 Mbps once a customer reaches the data cap. To put those plans into perspective, OpenVault announced recently that the average US home uses 274 gigabytes of data per month, and the average cord-cutting home uses 520 gigabytes per month. Satellite broadband is impractical for anybody with school students in the home or for anybody who does even a modest amount of video streaming.
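
To translate those caps into everyday terms, here is a rough sketch – the assumption that HD video streaming consumes about 3 gigabytes per hour is a commonly cited approximation, not a Viasat figure:

```python
# Rough illustration of how far the Viasat data caps stretch for video,
# assuming ~3 GB per hour of HD streaming (an approximation, not a Viasat figure).
HD_GB_PER_HOUR = 3

for plan, cap_gb in (("Unlimited Silver 25", 60), ("Unlimited Platinum 100", 150)):
    hours = cap_gb / HD_GB_PER_HOUR
    print(f"{plan}: {cap_gb} GB cap = ~{hours:.0f} hours of HD video per month")

# For comparison, the OpenVault monthly averages cited above:
for label, usage_gb in (("Average US home", 274), ("Average cord-cutting home", 520)):
    print(f"{label}: {usage_gb} GB/month, {usage_gb / 60:.1f}x the 60 GB cap")
```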

Viasat won the grant funding due to a loophole in the grant program. The program funding was available to anybody offering broadband of at least 25 Mbps. The grant program was intended to deliver a new broadband alternative to rural households – something that satellite broadband does not do. The funding was awarded through a reverse auction, and the satellite companies likely placed bids for every eligible rural market – they would have been the default winner in any area that had no other bidder. Even where there was another bidder, a reverse auction goes to the lowest bidder, and there is no amount that is too small for the satellite companies to accept. The satellite companies don’t have to make capital expenditures to satisfy the grants.

Giving money to satellite providers makes no sense as broadband policy. They don’t bring new broadband to anybody since the satellite plans are already available. The plans are expensive, have high latency and low monthly data caps.

The much larger RDOF grant program will award $16.4 billion in 2020 for rural broadband and the satellite companies must be ecstatic. If the FCC doesn’t find a way to keep the satellite companies out of this coming auction, the satellite companies could score a billion-dollar windfall. They can do so without offering any products that are not already available today.

To put these grants into perspective, the $87 million award is roughly the same size as the money that has been awarded over several years in the Minnesota Border-to-Border grant program. The Minnesota grants have helped fund dozens of projects, many of which built fiber in the state. There is no comparison between the benefits of the state grant program and the nearly total absence of benefit from handing federal money to the satellite companies.

Taking Advantage of the $9B 5G Fund

The FCC will be moving forward with the $9 billion 5G Fund – a new use of the Universal Service Fund – that will provide money to expand cellular coverage to the many remote places in the US where 4G cell coverage is still spotty or nonexistent. There is a bit of urgency to this effort since the big cellular companies all want to shut down 3G within a year or two. The money will be made available to cellular carriers, but the funding still opens up possible benefits for other carriers and ISPs.

Some of this funding is likely to go towards extending fiber into rural places to reach cell towers, and that opens up the idea of fiber sharing. There are still a lot of places in the country that don’t have adequate fiber backhaul – the data pipes that bring traffic to and from the big hubs for the Internet. In the last six months alone I’ve worked with three different rural projects where lack of backhaul was a major issue. Nobody can consider building broadband networks in rural communities if the new networks can’t be connected to the web.

By definition, the 5G Fund is going to extend into rural places. If the FCC were maximizing the use of federal funds, it would demand that any fiber built with this new fund be available to others at reasonable rates. This was one of the major provisions for the middle-mile networks built a decade ago with stimulus funding, and I know of many examples where those middle-mile routes provide backhaul today for rural fixed wireless and fiber networks. Unfortunately, no such provision seems to be part of the 5G Fund plan – which is not surprising. I’m sure the big cellular companies have told the FCC that making them share fiber with others would be an inconvenience.

I think there is a window of opportunity to partner with wireless carriers to build new fiber jointly. The cellular carriers can get their portion of new fiber funded from the 5G Fund and a partner can pick up new fiber at a fraction of the cost of building the route alone. This could be the simplest form of partnership where each party owns some pairs in a joint fiber.
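
As a simple illustration of why the economics work – the route length and per-mile construction cost below are placeholder assumptions, since real rural fiber costs vary widely:

```python
# Hypothetical cost split for a jointly built rural fiber route.
# Route length and cost per mile are placeholder assumptions for illustration.
cost_per_mile = 30_000        # assumed rural construction cost per mile
route_miles = 20

total_cost = cost_per_mile * route_miles
partner_share = total_cost / 2   # assume a simple 50/50 split of the build
print(f"Building the route alone:      ${total_cost:,.0f}")
print(f"Each partner's share at 50/50: ${partner_share:,.0f}")
```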

This is worth considering for anybody already thinking about building rural fiber. The new routes don’t have to be backhaul fiber and could instead be a rural route that is part of a county-wide build-out or fiber being built by an electric cooperative. If somebody is considering building fiber into an area that has poor cellular coverage, the chances are that there will be 5G Fund money coming to that same area.

It has always been challenging to create these kinds of partnerships with AT&T and Verizon, although I am aware of some such partnerships. Both Sprint and T-Mobile have less rural coverage than the other carriers and might be more amenable to considering partnerships – but they might be preoccupied with their pending merger.

There are a lot of other cellular carriers. The CTIA, the trade association for the larger cellular carriers, has thirty members that are facility-based cellular providers. The Competitive Carriers Association (CCA) has over one hundred members.

Ideally, a deal can be made to share fiber before the reverse auction for the 5G Fund. Any carrier that has a partner for a given route will have a bidding advantage since cost-sharing with a partner will lower the cost of building new fiber. It might be possible to find partnerships after the auction, but there could be restrictions on the newly built assets as part of the grants – we don’t know yet.

My recommendation: if you are already planning to build rural fiber, look around to see if one of the cellular carriers might be interested in serving the same area. Both parties can benefit through a cost-sharing partnership – but the real winners are rural customers who gain access to better cellular service and better broadband.

Spectrum and Weather Forecasting

There is a brewing controversy over the allocation of various radio frequencies for 5G that could have a negative impact on weather forecasting. Weather forecasting has become extremely sophisticated and relies on masses of data gathered from weather satellites and other data-gathering devices. That data, along with modern supercomputers and data center computing, has significantly improved the ability to predict future weather.

There are numerous bands of spectrum used in weather forecasting. For an in-depth look at the complexity of the spectrum needs, see this guide to spectrum used for meteorology from the World Meteorological Organization and the ITU (warning: highly technical document). It describes the various frequency bands that are used for different weather data-gathering purposes.

The current controversy involves the use of spectrum at 23.8 GHz. It turns out this frequency is absorbed by water vapor, which makes it valuable for meteorological purposes – it can be used by satellite devices called sounders to measure the levels of water vapor in the air. This is one of the most valuable tools in the weather data-gathering system, particularly over oceans where there are few other measuring devices.

The sounders are passive sensors – rather than transmitting like a radar, they measure the faint natural microwave signal that water vapor gives off at 23.8 GHz. The measurement is extremely sensitive to interference because the signals the sounders are reading are extremely faint. The weather community is worried that even a little bit of interference will kill the utility of this valuable tool.

In May 2019 the FCC raised over $2.7 billion through the auction of spectrum in the 24 GHz and 28 GHz bands, including spectrum sitting directly adjacent to the 23.8 GHz band. Before the auction, the administrator of NASA warned the FCC that leakage from the newly auctioned spectrum could degrade the use of the 23.8 GHz spectrum. NOAA (the National Oceanic and Atmospheric Administration) told Congress the same thing. NOAA said that a 30% degradation in the accuracy of the sounders could worsen the ability to predict where hurricanes will land by two or three days – something that would have a huge negative dollar cost.

It’s a convenient fiction in the wireless world that radios stay within the exact frequency bands they are supposed to use. In real life, radios often stray out-of-band for various reasons and cause interference in adjacent frequency bands. This happens up and down the radio spectrum, but in this case, scientists say that even a little interference could make it difficult or impossible to read the faint signals the sounders rely on to measure water vapor.

Both NASA and NOAA have proposed that the FCC lower the chances of interference by tightening the limit on the ‘noise’ that can leak in from nearby spectrum bands. They asked for a limit of -42 decibel watts of out-of-band noise, while the FCC is recommending -20 decibel watts. The lower the decibel-watt number, the less the allowed interference. The World Radiocommunication Conference currently recommends -33 decibel watts, scheduled to tighten to -39 decibel watts in 2027.
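
Because decibel watts are a logarithmic unit, those seemingly small differences are large power ratios. A quick conversion, using only the limits listed above:

```python
# Convert each proposed out-of-band noise limit into a power ratio relative
# to the -42 dBW level that NASA and NOAA requested (dBW is logarithmic).
def times_more_power(limit_dbw, reference_dbw=-42):
    return 10 ** ((limit_dbw - reference_dbw) / 10)

for label, limit in (("FCC proposal", -20),
                     ("Current WRC recommendation", -33),
                     ("WRC recommendation for 2027", -39)):
    print(f"{label} ({limit} dBW): ~{times_more_power(limit):.0f}x the power of -42 dBW")
```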

The carriers that bought the spectrum, through filings made by the CTIA, say that the frequencies would be a lot less valuable to them if they have to lower power to meet the noise levels recommended by NASA and NOAA, and the FCC is siding with the carriers.

This is just the first of many frequency battles we’re going to see as the thirst for more 5G spectrum spills into spectrum that has been used for scientific or military purposes. The FCC often tries to mitigate interference by moving existing spectrum users to a different frequency band. However, in this case, the weather satellites must use 23.8 GHz because that is the frequency at which water vapor interacts with radio waves – nature picked it, not the FCC.

It’s hard not to side with the weather scientists. Everybody, including the carriers, will suffer great harm if the ability to predict hurricanes is degraded. When it comes to something as vital as being able to predict hurricanes, we need to use common sense and caution rather than give the 5G companies every possible slice of available spectrum. It’s not hard to predict that the carriers will fight hard to keep this spectrum even if there is too much interference. Unfortunately, the current FCC is granting the carriers everything on their wish list – expect more of this in 2020.

Congress Mandates Cable TV Pricing Disclosure

In a surprise move by Congress, the recent appropriations bill that funds the government through September 2020 includes a new law that mandates that cable companies tell their customers the truth about cable pricing. Labeled the Television Viewer Protection Act of 2019, the bipartisan law places new requirements on companies selling cable TV.

The bill was originally sponsored by Representative Mike Doyle (D-PA). In the original version of the bill, cable providers had to advertise the full monthly cost of service – including such things as hidden fees, equipment charges, and any taxes or surcharges added to a cable bill. The bill also required disclosure of the details of any promotional pricing, including making it clear when a promotion ends.

The final version of the law softened the disclosure requirement, and cable companies can still promote deceptive special pricing. However, a cable provider must notify customers buying a new plan within 24 hours, “by phone, in person, online, or by other reasonable means,” of the full cost of the service. Customers then have 24 hours from the time the cable company sends the notice to cancel service with no penalty. The cynic in me believes that cable companies will find ways to meet the letter of the law and still be deceptive – such as putting the pricing notice at the end of a long email message that customers aren’t likely to read. However, if cable providers follow the spirit of the law, it should end the practice of customers seeing bills that are much higher than what they expected. Another provision of the new law is that cable providers can no longer charge for equipment they don’t provide – something Frontier was accused of doing during the last year.

Interestingly, the law only affects cable TV pricing and not pricing for broadband or telephone service. I hope the cable companies don’t somehow shift hidden fees to these other services. The law also seems to ignore the fact that a majority of traditional cable customers buy a bundle of multiple services. The cable companies have never come clean with customers about how bundling discounts work, leaving the companies with the flexibility to penalize customers for withdrawing any one of the bundled services. I suspect the cable companies will somehow not come clean about bundling prices for cable TV, even with this new law.

The bill gives cable companies six months to implement the new practices. Oddly, the bill also allows the FCC to extend the starting date up to six additional months. It’s hard to picture any reason for the FCC to extend the deadline other than kowtowing to the cable companies.

From a consumer perspective, this law is long overdue. For the last five years, the cable companies have disguised much of their rate increases by folding them into hidden fees rather than into advertised rates. A few months ago, Consumer Reports reported that the hidden fees for the big cable companies range from $22.96 monthly for AT&T U-verse to $43.79 for Verizon FiOS.

The timing of this new law is interesting from a market perspective. We’re now seeing cord-cutting at a record pace, and forcing the cable companies to be honest with customers is likely to accelerate cord-cutting even more.

Smaller cable providers that compete against the big companies have always been torn about how to advertise their prices. Some match the practices of the big cable companies and have hidden fees and advertise deceptively low prices. Others have taken the high road and advertise the full price of service while pointing out that their competitor’s pricing is deceptive. These new rules make it easier for smaller cable companies to disclose their full prices and to challenge the big cable companies to do the same.

The new law also includes several other changes for the cable industry. The law allows the sunset of the 5-year provision that let satellite TV providers import network stations from distant markets for rural customers. The satellite companies have always argued that the cost of negotiating with every local station across the country is astronomical, and that they would let network channels go dark rather than seek deals with every local network affiliate in the country. I guess we’ll soon find out if that’s true now that the satellite providers can no longer bring in network stations from outside a market. I would hope that a satellite provider that decides not to deliver network affiliates like ABC, CBS, FOX, or NBC will lower the price of the cable package to reflect the undelivered channels.

Finally, the bill includes a requirement that local stations and programmers negotiate programming contracts in good faith. That’s an idea that has been bouncing around for a while in response to local stations negotiating in large groups instead of individually. In the last year, we have seen programming go dark at a record pace when stations and programmers deadlock in negotiations with cable providers. We’ll have to wait a while to see if this stronger language gives the FCC any real leverage to end retransmission disputes.