California’s New Privacy Law

If you use the web much, you noticed a flurry of new privacy notices at the end of last year, either through pop-up notifications when you visited a website or by email. These notifications were all due to the California Consumer Privacy Act, the new privacy law that went into effect on January 1.

The law applies to companies that use the web and have annual revenues over $25 million, companies that buy, sell, or collect data on 50,000 or more consumers, and companies of any size that make more than 50% of their revenue by selling customers' personal information.

The new law imposes a lot of requirements on web companies operating in California. Web companies must provide California consumers:

  • The ability to opt out of having their personal information sold to others.
  • The option to have their data deleted.
  • The opportunity to view the data collected about them.
  • The identity of third parties that have purchased their data.

The new law defines personal data broadly to include things like name, address, online identifiers, IP addresses, email addresses, purchasing history, geolocation data, audio/video data, biometric data, or any effort made to classify customers by personality type or trends.

The penalties for violating the law are severe. Companies face penalties of up to $2,500 per violation if they didn't offer these options by January 1, and up to $7,500 per violation if a company intentionally violates the law. It's not hard to anticipate the class action lawsuits that will result from this law.
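To see why class actions loom so large, it helps to do the arithmetic on per-violation penalties. A minimal sketch follows – the $2,500 and $7,500 figures come from the law, but the class size and the assumption that each affected consumer counts as a separate violation are hypothetical:

```python
# Rough illustration of how per-violation penalties scale.
# Penalty amounts are from the CCPA; the class size is hypothetical.
STATUTORY_PENALTY = 2_500      # per violation
INTENTIONAL_PENALTY = 7_500    # per violation, if intentional

def total_exposure(consumers_affected: int, intentional: bool = False) -> int:
    """Assumes each affected consumer counts as a separate violation."""
    per_violation = INTENTIONAL_PENALTY if intentional else STATUTORY_PENALTY
    return consumers_affected * per_violation

# A violation touching just 10,000 California consumers:
print(total_exposure(10_000))        # 25,000,000
print(total_exposure(10_000, True))  # 75,000,000
```

Even a modest-sized customer database translates into eight-figure exposure, which is exactly the math that attracts class action lawyers.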

While these new rules only apply to how web companies interact with California consumers, many websites have taken the safe approach of applying the new rules to everybody. That's prudent because it's difficult for web companies to always know where a visitor is from, especially when people use VPNs to hide their location.

California isn’t the only state with new privacy rules. Washington has new rules that are not as severe as the California ones but that still layer a lot of new requirements onto ISPs. New York is working on a privacy law that is said to be even tougher than the California one.

These state laws are only in place because Congress seems unable to pass a set of federal privacy rules. The issue has been debated over the last two years, and draft bills have been written, but no proposed law has come before the Senate for a vote, so the issue has gone nowhere. People are rightfully concerned about how their data is being used, and many want the government to set guidelines to protect them. The states are filling the void left by federal inaction.

Web companies will face dilemmas with a proliferation of state privacy laws. Do they try to comply with each state's law only for customers in that state? What's most concerning for web companies is that, as more states pass privacy laws, some of the laws will inevitably conflict. There is also a big question about how these laws apply to foreign companies. The California law is written to apply to every company interfacing with California consumers. To complicate matters further, European Union privacy rules are also tough and will inevitably conflict with parts of the California rules.

Like all new laws, this one will be tested in court. The more interesting challenges will involve how the law impacts companies from outside California. The $25 million revenue threshold is low, and there are numerous companies across the country at that size that have likely done nothing in response to this law. If a company keeps even the most rudimentary database of customer information, then theoretically it violates this law if anybody in the database resides in California. There are going to be lawyers trying to make a living by chasing companies that violate the law, and I doubt it will take long for the first lawsuits to surface.

Will the Big Telcos Pursue RDOF Grants?

One of the most intriguing questions concerning the upcoming $16.4 billion RDOF grant program is whether the big telcos are going to participate. I've asked the question around the industry and I've talked to folks who think the big telcos will fully wade into the reverse auctions, while others think they'll barely play. We're not likely to know until the auctions begin.

The big telcos were the full beneficiaries of the original CAF II program when the FCC surprisingly decided to unilaterally award them the full $9 billion in funding. In that grant program, CenturyLink received over $3 billion, AT&T almost $2.6 billion, Frontier nearly $2 billion, and Windstream over $1 billion. The telcos were supposed to upgrade much of their most rural properties to deliver broadband speeds of at least 10/1 Mbps.

CenturyLink and Frontier both recently told the FCC that they are behind in the CAF II build out and didn’t meet their obligation at the end of 2019 to be 80% finished with the upgrades. From what I hear from rural communities, I think the problem is a lot more severe than just the telcos being late. Communities across the country have been telling me that their residents aren’t seeing faster speeds and I think we’re going to eventually find out that a lot of the upgrades aren’t being made.

Regardless of the problems with the original CAF II, the FCC is now offering the $16.4 billion RDOF grant program to cover much of the same areas covered by CAF II. The big telcos are faced with several dilemmas. If they don’t participate, then others are going to get federal assistance to overbuild the traditional big telco service territories. If the big telcos do participate, they have to promise to upgrade to meet the minimum speed obligations of the RDOF of 25/3 Mbps.

Interestingly, the upgrades needed to raise DSL speeds on copper to 25/3 Mbps are not drastically different than the upgrades needed to reach 10/1 Mbps. The upgrades require building fiber deeper into last-mile networks and installing DSL transmitters (DSLAMs) in the field to be within a few miles of subscribers. Fiber must be a little closer to the customer to achieve a speed of 25/3 Mbps rather than 10/1 Mbps – but not drastically closer.

I think the big telcos encountered two problems with the CAF II DSL upgrades. First, they needed to build a lot more fiber than was being funded by CAF II to get fiber within a few miles of every customer. Second, the condition of their rural copper is dreadful and much of it probably won’t support DSL speeds. The big telcos have ignored their rural copper for decades and found themselves unable to coax faster DSL speeds from the old and mistreated copper.

This raises the question of what it even means if the big telcos decide to chase RDOF funding. Throwing more money at their lousy copper is not going to make it perform any better. If they were unable to get 10/1 speeds out of their network, then they are surely going to be unable to deliver 25/3 Mbps.

We can’t ignore that the big telcos have a natural advantage in the RDOF auction. They can file for the money everywhere, and any place where a faster competitor isn’t vying for the money, the big telcos will have a good chance of winning the reverse auction. There are bound to be plenty of places where nobody else bids on RDOF funding, particularly in places like Appalachia where the cost is so high to build, even with grant funding.

It would be a travesty to see any more federal grant money spent to upgrade rural DSL particularly since the FCC already spent $9 billion trying to upgrade the same copper networks. The copper networks everywhere are past their expected useful lives, and the networks operated by the big telcos are in the worst shape. I’ve known many smaller telcos that tried in the past to upgrade to 25/3 on rural DSL and failed – and those companies had networks that were well-maintained and in good condition. It would be impossible to believe the big telcos if they say they can upgrade the most remote homes in the country to 25/3 Mbps speeds. Unfortunately, with the way I read the RDOF rules, there is nothing to stop the big telcos from joining the auction and from taking big chunks of the grant money and then failing again like they did with the original CAF II.

Kari’s Law

ISPs should be aware of two new laws that went into effect in January. The first is Kari's Law. This law requires that all phone systems sold, leased, or installed after February 16, 2020 be pre-configured so that a user can directly dial 911 with no other dialing steps required. The law is aimed at bringing the 911 system into buildings served by PBXs, key systems, Centrex, and cloud-based VoIP. The intent is to improve 911 for places like hotels, universities, hospitals, and businesses. The law puts the responsibility to comply not only on phone system manufacturers but also on anybody who installs, manages, or operates a voice system.

The law also creates new requirements, effective January 6, for existing phone systems. Phone systems must be configured so that any call placed to 911 will immediately notify a 'central location' that a call has been placed to 911. This must be implemented immediately for any existing phone system that can provide the notification without needing a software or hardware upgrade. The FCC believes that a large percentage of phone systems are capable of making such notifications, so those notifications must be activated. It's worth noting that there is no exemption for small businesses – anybody operating a private phone system is expected to comply with the law. Interestingly, the law even applies to outbound-only locations like an outbound call center that can't receive calls.

The FCC leaves a lot of interpretive room in defining a ‘central location’ for delivering the notification. Their goal is that a notification of a call to 911 must be sent to a location where there is a high likelihood that somebody will see it. The FCC wants 911 centers to be able to contact somebody at a business to gain entrance and to hopefully locate the person that made the 911 call.

Notifications can be in any format, including emails, text messages, pop-up notifications, alarms, etc. The new rules also require some ancillary information to be included in the notification, where technically feasible. This includes information like a callback number and as much information as possible about the location of the 911 caller (room number, wing of the building, etc.).

To the extent possible this also applies to ‘mobile’ devices that are controlled by a phone system. This might include cordless phones used inside of a business or desksets that can be moved anywhere within the business. Companies are not expected to track commercial cellphones that aren’t on their system or the location of devices that are carried off-site.

The second law that went into effect in January is Ray Baum's Act. One of the many provisions of this law requires that 911 centers be provided with 'dispatchable location' information. In plain English, that means that first responders want to know 'the right door to kick down' when responding to a 911 call. This goes into effect concurrently with Kari's Law and means that businesses must provide more information to 911 centers about how to respond to calls made from their location.

This new law is also aimed at the same kinds of buildings as Kari's Law – places like hotels or businesses where a 911 responder doesn't know how to locate the person who called 911. At a minimum, every call to 911 must convey a validated 911 street address. That's routine information for calls made from single-family homes, but not necessarily so for a complex business like a hospital or high-rise office complex. If a validated 911 address can be conveyed today, it must be. Businesses are given one year to implement this change and are expected to coordinate with 911 centers if they want to provide more complicated information on room numbers, building layouts, etc.

The law also requires better reporting for mobile devices that are controlled by a phone system. The rules expect the notification to 911 to include the best information possible about the location of a caller with a mobile device such as a cordless phone. This could be as detailed as a room number or something less accurate such as the location of the nearest WiFi hotspot. Companies have two years to implement this change.

The changes that come from Ray Baum's Act are intended to be coordinated with the nearest 911 PSAP so that dispatchers understand the nature and quality of the location data when they get a call. Businesses are expected to notify PSAPs, and test as needed, to make sure that PSAPs know how to locate callers at a business. The FCC has the ability to fine parties that don't comply with the law, so expect a few test cases within the next year when businesses fail to implement the new rules or fail to convey information to their 911 center.

Is the FCC Killing State Matching Grants?

In a bizarre last-minute change to the language approved for the upcoming $16.4 billion RDOF grants, the FCC inserted new language into the rules that would seem to prevent grant applicants from accepting matching state grants for projects funded by RDOF.

The new language specifically says that the RDOF grant program now excludes any geographic area that the Commission “know[s] to be awarded funding through the U.S. Department of Agriculture’s ReConnect Program or other similar federal or state broadband subsidy programs or those subject to enforceable broadband deployment obligations.”

It’s fully understandable that the FCC doesn’t want to award grant money from multiple federal grant programs for the same project, and that was a loophole that is sensible to close. I think most industry folks understood this to be true even if it wasn’t in writing.

But the idea of blocking states from making grants to supplement RDOF is counterintuitive. More than half of the states now have state broadband grant programs. It makes no sense for the FCC to tell states how they can spend (or in this case how they cannot spend) their state grant monies.

The whole concept of blocking state matching grants goes against the tradition of federal funding. The vast majority of federal funding programs for infrastructure encourage state matching funds and many programs require it. Matching state grants are used along with federal grants for building infrastructure such as roads, bridges, water and sewer systems, airports, etc. Why would the FCC want to block this for broadband?

The state grant programs that I’m most familiar with were planning to provide matching grants for some RDOF grants. Broadband offices at the state level understand that building broadband networks can be expensive and they know that in some cases the extra funding is needed to make broadband projects viable.

It's important to remember that the RDOF grants are aimed at the most remote customers in the country – customers that, by definition, will require the largest investment per customer to bring broadband. This is due almost entirely to the lower household densities in the RDOF grant areas. Costs can also be driven up by local conditions like rocky soil or rough terrain. Federal funding that provides enough money to build broadband in the plains states is likely not going to be enough to induce somebody to build in the remote parts of Appalachia where the RDOF grants are most needed.

State grant programs often also have other agendas. For example, the Border-to-Border grants in Minnesota won't fund broadband projects that can't achieve at least 100 Mbps download speeds. This was a deliberate decision so that government funding wouldn't be wasted building broadband infrastructure that will be too slow and obsolete soon after it's constructed. By contrast, the FCC RDOF program is allowing applicants proposing speeds as slow as 25 Mbps. It's not hard to argue that 25 Mbps is already obsolete.

I know ISPs that were already hoping for a combination of federal and state grants to build rural infrastructure. If the FCC kills matching grants, then they will be killing the plans for such ISPs that wanted to use the grants to build fiber networks – a permanent broadband solution. Even with both state and federal grants, these ISPs were planning to take on a huge debt burden to make it work.

If the matching grants are killed, I have no doubt that the RDOF money will still be awarded to somebody. However, instead of going to a rural telco or electric coop that wants to build fiber, the grants will go to the big incumbent telephone companies to waste money by pretending to goose rural DSL up to 25 Mbps. Even worse, much of the funding might go to the satellite companies that offer nothing new and a product that people hate. I hate to engage in conspiracy theories, but one of the few justifications I can see for killing matching grants is to make it easier for the big incumbent telcos to win, and waste, another round of federal grant funding.

Letters of Credit

One of the dumbest rules suggested by the FCC for the new $16.4 billion RDOF grants is that an ISP must provide a letter of credit (LOC) to guarantee that the ISP will be able to meet its obligation to provide the matching funds for the RDOF grants. The FCC had a number of grant winners years ago in the stimulus broadband grant program that never found financing, and the FCC is clearly trying to avoid a repeat of that situation. A coalition of major industry associations – NTCA, INCOMPAS, USTelecom, NRECA, WTA, and WISPA – recently wrote a letter to the FCC asking it to remove the LOC requirement.

There may be no better example of how out of touch Washington DC is with the real world, because whoever at the FCC came up with that requirement has no idea what a letter of credit is. A letter of credit is a formal negotiable instrument – a promissory note like a check. A letter of credit is a promise that a bank will honor the obligation of the buyer of a letter of credit should that buyer fail to meet a specific obligation. The most common use of LOCs is in international trade or transactions between companies that don't know or trust each other. An example might be a company that agrees to buy $100,000 of bananas from a wholesaler in Costa Rica, payable upon delivery of the bananas to the US. The US buyer of the bananas will obtain a letter of credit, giving assurance to the wholesaler that they'll get paid. When the bananas are received in the US, the bank is obligated to pay for the bananas if the buyer fails to do so.

Banks consider letters of credit to be the equivalent of loans. The banks must set aside the amount of pledged money in case they are required to disburse the funds. Most letters of credit are only active for a short, defined period of time. It's highly unusual for a bank to issue a letter of credit that would last as long as the six years required by the RDOF grant process.

Letters of credit are expensive. A bank holds the pledged cash in escrow for the active life of the LOC and expects to be compensated for the interest it could otherwise have earned. There are also big upfront fees to establish an LOC because the bank has to evaluate an LOC holder in the same way it would evaluate a borrower. Banks also require significant collateral that they can seize should the letter of credit ever get used and the bank must pay out the cash.

I'm having trouble understanding who the letter of credit would benefit in this situation. When the FCC makes an annual grant payment to an ISP, it expects that ISP to be building network – 40% of the RDOF network must be completed by the end of year 3, with 20% more completed in each of the next three years. The ISP would be expected each year to have the cash available to pay for fiber, work crews, electronics, engineers, etc. You can't buy a letter of credit payable to those future, undefined parties.

I think the FCC believes the letter of credit would be used to fund the ISP so it could construct the network. No bank is going to provide a letter of credit where the payee is also the purchaser of the LOC – in banking terms that would be an ISP paying an upfront fee for a guaranteed loan to be delivered later should that ISP not find a loan elsewhere. It's absurd to think banks would issue such a financial instrument. An ISP that defaults on an LOC is likely in financial straits, so having an LOC in place would have the opposite effect of what the FCC wants – rather than guarantee future funds, a bank would likely seize the assets of the ISP when the LOC is exercised.
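The build-out cadence described above – 40% by the end of year 3, then 20% more in each of the next three years – can be sketched as a simple milestone function. This is just an illustration of the schedule as I understand it, not an official FCC formula:

```python
# Cumulative RDOF build-out milestones: 40% done by the end of year 3,
# then 20% more in each of years 4, 5, and 6.
def cumulative_milestone(year: int) -> int:
    """Required cumulative completion (percent) at the end of each grant year."""
    if year < 3:
        return 0  # no interim milestone specified before year 3
    return min(40 + 20 * (year - 3), 100)

print([cumulative_milestone(y) for y in range(1, 7)])  # [0, 0, 40, 60, 80, 100]
```

The point is that the ISP needs real construction cash in hand every year of that schedule – which is exactly what an LOC payable to the ISP itself cannot provide.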

A letter of credit has significant implications for the ISP that buys it. Any bank considering lending to the ISP will consider an LOC to be the same as outstanding debt – thus reducing the amount of other money the ISP can borrow. A long-term LOC would tie up a company’s borrowing capacity for the length of the LOC, making it that much harder to finance the RDOF project.

The coalition writing the letter to the FCC claims, correctly, that requiring letters of credit would stop a lot of ISPs from applying for the grants. Any ISP that can't easily borrow large amounts of money from a commercial bank is not going to get an LOC. Even ISPs that can get the letter of credit might decide it's too costly to accept the grant. The coalition petitioning the FCC estimates that the aggregate cost to obtain letters of credit for RDOF could be as much as $1 billion for the grant recipients – my guess is that the estimate is conservatively low.
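As a sanity check on the coalition's estimate, here's a back-of-the-envelope calculation. The 1.5% annual fee is an assumed figure (LOC pricing varies by bank and borrower), and sizing the LOCs to the full grant pool is a simplification, so treat this only as a rough order-of-magnitude check:

```python
# Back-of-the-envelope check on the coalition's ~$1 billion estimate.
# The fee rate is an assumption; actual LOC pricing varies by bank and borrower.
GRANT_POOL = 16.4e9   # total RDOF funds
ANNUAL_FEE = 0.015    # assumed 1.5%/year LOC fee
YEARS = 6             # LOCs must stay open for the build-out period

estimated_cost = GRANT_POOL * ANNUAL_FEE * YEARS
print(f"${estimated_cost / 1e9:.2f} billion")  # $1.48 billion
```

Even with a modest assumed fee rate, the aggregate cost lands in the same neighborhood as the coalition's figure – money that buys no fiber and serves no customer.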

One of the groups this requirement might cause problems for is ISPs that obtain their funding from the federal RUS program. These entities – mostly telcos and electric cooperatives – would have to go to a commercial bank to get an LOC. If their only debt is with the RUS, banks might not be willing to issue an LOC, regardless of the strength of their balance sheet, since the banks have no easy way to secure collateral for the LOC.

Hopefully, the FCC comes to its senses, or the RDOF grant program might be a bust before it even gets started. I’m picturing ISPs going to banks and explaining the FCC requirements and seeing blank stares from bankers who are mystified by the request.

5G and Rural America

FCC Chairman Ajit Pai recently told the crowd at CES that 5G would be a huge benefit to rural America and would help to close the rural broadband divide. I have to imagine he's saying this to keep rural legislators on board with the FCC's emphasis on promoting 5G. I've thought hard about the topic and I have a hard time seeing how 5G will make much difference in rural America – particularly with broadband.

There is more than one use of 5G, and I've thought through each one of them. Let me start with 5G cellular service. The major benefit of 5G cellular is that a cell site will be able to handle up to 100,000 simultaneous connections. 5G also promises slightly faster cellular data speeds. The specification calls for speeds up to 100 Mbps using normal cellular frequencies – which also happens to have been the specification for 4G, although it was never realized.

I can't picture a scenario where a rural cell site might need 100,000 simultaneous connections within a circle of a few miles. There aren't many urban places that need that many connections today other than stadiums and other crowded locations where a lot of people want connectivity at the same time. I've heard farm sensors mentioned as a reason for needing 5G, but I don't buy it. The normal crop sensor might dribble out tiny amounts of data a few times per day. These sensors cost close to $1,000 today, but even if they somehow drop to a cost of pennies, it's hard to imagine a situation where any given rural cell site is going to need more capacity than is available with 4G.

It's great if rural cell sites get upgraded, but there can't be many rural cell sites that are overloaded enough to demand 5G. Then there are the economics. It's hard to imagine the cellular carriers being willing to invest in a rural cell site that might support only a few farmers – and it's hard to think the farmers are willing to pay enough to justify their own cell site.

There has also been talk of lower frequencies benefitting rural America, and there is some validity to that. For example, T-Mobile’s 600 MHz frequency travels farther and penetrates obstacles better than higher frequencies. Using this frequency might extend good cellular data coverage as much as an extra mile and might support voice for several additional miles from a cell site. However, low frequencies don’t require 5G to operate. There is nothing stopping these carriers from introducing low frequencies with 4G (and in fact, that’s what they have done in the first-generation cellphones capable of using the lower frequencies). The cellular carriers are loudly claiming that their introduction of new frequencies is the same thing as 5G – it’s not.

5G can also be used to provide faster data using millimeter wave spectrum. The big carriers are all deploying 5G hot spots with millimeter wave technology in dense urban centers. This technology broadcasts super-fast broadband for up to 1,000 feet.  The spectrum is also super-squirrely in that it doesn’t pass through anything, even a pane of glass. Try as I might, I can’t find a profitable application for this technology in suburbs, let alone rural places. If a farmer wants fast broadband in the barnyard I suspect we’re only a few years away from people being able to buy a 5G/WiFi 6 hot spot that could satisfy this purpose without paying a monthly fee to a cellular company.
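A quick bit of geometry shows why the 1,000-foot reach dooms rural millimeter wave. The sketch below assumes ideal circular coverage with no obstructions – a generous assumption for spectrum that won't pass through a pane of glass:

```python
import math

# How many millimeter-wave transmitters would it take to blanket open ground,
# assuming the ~1,000-foot reach mentioned above and ideal circular coverage?
REACH_FT = 1_000
SQ_FT_PER_SQ_MILE = 5_280 ** 2

coverage_sq_ft = math.pi * REACH_FT ** 2
sites_per_sq_mile = SQ_FT_PER_SQ_MILE / coverage_sq_ft
print(round(sites_per_sq_mile))  # ~9 transmitters per square mile, best case
```

Nine fiber-fed transmitters per square mile might pencil out in a dense city block; in farm country, where a square mile might hold a handful of households, the economics are hopeless.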

Finally, 5G can be used to provide gigabit wireless loops from a fiber network. This is the technology trialed by Verizon in a few cities like Sacramento. In that trial, speeds were about 300 Mbps, but there is no reason speeds can't climb to a gigabit. For this technology to work there has to be a transmitter on fiber within 1,000 feet of a customer. It seems unlikely to me that somebody spending the money to get fiber close to farms would use electronics for the last few hundred feet instead of a fiber drop. The electronics are always going to have problems and require truck rolls, and the electronics will likely have to be replaced at least once per decade. The small telcos and electric coops I know would scoff at the idea of adding another set of electronics into a rural fiber network.

I expect some of the 5G benefits to find uses in larger county seats – but those towns have the same characteristics as suburbia. It’s hard to think that rural America outside of county seats will ever need 5G.

I'm at a total loss as to why Chairman Pai and many politicians keep extolling the virtues of rural 5G. I have no doubt that rural cell sites will be updated to 5G over time, but the carriers will be in no hurry to do so. It's hard to find situations in rural America that demand a 5G solution that can't be handled with 4G – and it's even harder to justify the cost of 5G upgrades that benefit only a few customers. I can't find a business case, or even an engineering case, for pushing 5G into rural America. I most definitely can't foresee a 5G application that will solve the rural broadband divide.


Is 5G Radiation Safe?

There is a lot of public sentiment against placing small cell sites on residential streets. There is a particular fear of broadcasting higher millimeter wave frequencies near to homes since these frequencies have never been in widespread use before. In the public’s mind, higher frequencies mean a higher danger of health problems related to exposure to radiofrequency emissions. The public’s fears are further stoked when they hear that Switzerland and Belgium are limiting the deployment of millimeter wave radios until there is better proof that they are safe.

The FCC released a report and order on December 4 that is likely to add fuel to the fire. The agency rejected all claims that there is any public danger from radiofrequency emissions and affirmed the existing frequency exposure rules. The FCC said that none of the thousands of filings made in the docket provided any scientific evidence that millimeter wave and other 5G frequencies are dangerous.

The FCC is right in their assertion that there are no definitive scientific studies linking cellular frequencies to cancer or other health issues. However, the FCC misses the point that most of those asking for caution, including scientists, agree with that. The public has several specific fears about the new frequencies being used:

  • First is the overall range of new frequencies. In the recent past, the public was widely exposed to relatively low frequencies from radio and TV stations, to a fairly narrow range of cellular frequencies, and two bands of WiFi. The FCC is in the process of approving dozens of new bands of frequency that will be widely used where people live and work. The fear is not so much about any given frequency being dangerous, but rather a fear that being bombarded by a large range of frequencies will create unforeseen problems.
  • People are also concerned that cellular transmitters are moving from tall towers, which normally have been located away from housing, to small cell sites on poles located on residential streets. The fear is that these transmitters are generating a lot of radiation close to the transmitter – which is true. The amount of radiation that strikes a given area decreases rapidly with distance from the transmitter. The anecdote that I've seen repeated on social media is of placing a cell site fifteen feet from the bedroom of a child. I have no idea if there is a real small cell site that is the genesis of this claim – but there could be. In dense urban neighborhoods, there are plenty of streets where telephone poles are within a few feet of homes. I admit that I would be leery about having a small cell site directly outside one of my windows.
  • The public worries when they know that there will always be devices that don’t meet the FCC guidelines. As an example, the Chicago Tribune tested eleven smartphones in August and found that a few of them were issuing radiation at twice the FCC maximum-allowable limit. The public understands that vendors play loose with regulatory rules and that the FCC largely ignores such violations.

The public has no particular reason to trust this FCC. The FCC under Chairman Pai has sided with the large carriers on practically every issue in front of the Commission. This is not to say that the FCC didn’t give this docket the full consideration that should be given to all dockets – but the public perception is that this FCC would side with the cellular carriers even if there was a public health danger.

The FCC order is also not particularly helped by citing the buy-in from the Food and Drug Administration on the safety of radiation. That agency has licensed dozens of medicines that later proved to be harmful, so that agency also doesn’t garner a lot of public trust.

The FCC made a few changes with this order. They have mandated a new set of warning signs to be posted around transmitters. It’s doubtful that anybody outside of the industry will understand the meaning of the color-coded warnings. The FCC is also seeking comments on whether exposure standards should be changed for frequencies below 100 kHz and above 6 GHz. The agency is also going to exempt certain kinds of transmitters from FCC testing.

I’ve read extensively on both sides of the issue and it’s impossible to know the full story. For example, a majority of scientists in the field signed a petition to the United Nations warning against using higher frequencies without more testing. But it’s also easy to be persuaded by other scientists who say that higher frequencies don’t even penetrate the skin. I’ve not heard of any studies that look at exposing people to a huge range of different low-power frequencies.

This FCC is in a no-win position. The public properly perceives the agency as pro-carrier, and anything the FCC says is not going to persuade those worried about radiation risks. I tend to side with the likelihood that the radiation is not a big danger, but I also have to wonder if there will be any impact from expanding tenfold the range of frequencies we’re exposed to. The fact is that we’re not likely to know until after we’ve all been exposed for a decade.

The End of Free Conference Calling

Like many of you reading this blog, I have been using the FreeConferenceCall.com service for many years. I got an email from them last week warning that their service will likely go dark, and they wanted users of the service to call Congress to help keep them in business.

Their issue stems back to an FCC order issued in September of last year that seeks to stop the practice of access arbitrage. This FCC summary of the order describes the situation well. Some small telcos have been making money by billing access on ‘free’ minutes generated by services like free conference calling. The process of making money from free calling services has been known in the industry as access arbitrage.

The FCC tried to stop access arbitrage in 2011. At that time, small rural telcos billed a rate of as much as a penny or two per minute to originate or terminate a long-distance call. Some telcos that were allowed to bill the high rates were making a lot of money by originating calls for free outgoing call center services or by terminating calls from 800 numbers, conference calling services, or free chat lines.

In the 2011 order, the FCC eliminated the access fees associated with terminating a call, migrating to what the FCC called ‘bill and keep’, and they hoped that eliminating the access revenues would kill the arbitrage practices. The FCC order was largely effective and chat lines and other free arbitrage services quickly disappeared.

However, the 2011 order didn’t kill all access charges, and over time the folks who make money with arbitrage found another way to profit from free calling. One of the few access charges left untouched in 2011 was transport, which compensates telcos for the use of the fiber networks connecting them to the outside world. I’ve noticed that the caller ID for FreeConferenceCall.com numbers is mostly from Iowa and South Dakota, and I have to assume those calls are being terminated at switches that are remote and can still bill for significant miles of transport.

The access fees billed to terminate calls are paid by the carrier that originates the call. This means that most remaining terminating access is paid today by long-distance carriers like AT&T, Sprint, and CenturyLink, which together still sell the bulk of long-distance telephone services. The dollar magnitude of access arbitrage is much smaller than a decade ago – the FCC estimates arbitrage is currently a $40 – $60 million problem, whereas it was hundreds of millions before the FCC’s 2011 order. But those fees are being billed to long-distance companies that get no benefit from the transaction (thus the term arbitrage – the companies bill the fees because the rules allow a loophole to do so).
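To see why even tiny per-minute charges are worth chasing, consider a back-of-the-envelope sketch. All of the numbers here are hypothetical – actual transport rates vary by tariff and mileage – but the shape of the math is the point:

```python
# Hypothetical access-arbitrage economics: a free conference service
# that terminates its calls at a remote rural switch
minutes_per_month = 20_000_000   # hypothetical conference-call volume
transport_rate = 0.002           # hypothetical transport charge, $ per minute

monthly_access_billing = minutes_per_month * transport_rate

# Two-tenths of a cent per minute on 20 million minutes is $40,000
# a month, billed to the long-distance carriers originating the calls
```

The free service drives the call volume, the rural telco bills the transport, and the long-distance carrier pays – which is exactly the loop the FCC order is trying to break.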

FreeConferenceCall.com is not the only company doing this, and it’s likely that many conference calling services rely wholly or partially on the arbitrage. It’s worth noting that conference call services that use the Internet to place calls will not be affected by this change – those calls don’t invoke access charges. The carriers billing for the access on the conference calling may or may not be sharing the revenues with companies like FreeConferenceCall.com – in either case, those carriers no longer have any financial reason to continue the practice.

Companies like FreeConferenceCall.com don’t automatically have to go out of business, but the FCC order means a drastic change to the way they do business. For instance, the company could start charging a monthly fee for conference calling – likely forcing this particular company to change its name. They might sell advertisements to those sitting waiting for a conference call. They could charge for services like recording calls.

It’s more likely that companies like FreeConferenceCall.com will quietly die or fade away. I tried using the service yesterday and it already seems to be broken. This latest FCC order probably puts the final nail in the coffin of access arbitrage – although I’ve learned to never say never. As long as there are regulated fees for calling, there is a chance that somebody will find a way to generate lots of calls that fit the circumstances and get enriched by the arbitrage.

The RDOF Grants – The Good and Bad News

The FCC recently approved a Notice of Proposed Rulemaking that proposes how they will administer the $16 billion in RDOF grants that are going to be awarded later this year. As you might imagine, there is both good news and bad news coming from the grant program.

It’s good news that this grant program ought to go a long way towards finally killing off large chunks of big telco rural copper. Almost every area covered by these grants is poorly served today by inadequate rural DSL.

The related bad news is that this grant award highlights the huge failure of the FCC’s original CAF II program, where the big telcos were given $11 billion to upgrade rural DSL to speeds of at least 10/1 Mbps. The FCC is still funding the final year of CAF II construction. The new grant money will cover many of the same geographic areas as the original CAF II deployment, meaning the FCC will spend over $27 billion to bring broadband to these rural areas. Even after the RDOF grants are built out, many of these areas won’t have adequate broadband. Had the FCC administered both grant programs smartly, most of these areas could be getting fiber.

Perhaps the best good news is that a lot of rural households will get faster broadband. Ironically, since the grants cover rural areas, there will be cases where the RDOF grant brings faster broadband to farms than will be available in the county seat, where no grant money is available.

There is bad news on broadband speeds, since the new grant program only requires speeds of 25/3 Mbps. This means the FCC is repeating the same huge mistake they made with CAF II by allowing federal money to be spent on broadband that will be obsolete before it’s even built. The grants will be paid out over ten years and require deployment within six years – anybody paying attention to broadband understands that six years from now a 25/3 Mbps broadband connection will feel glacial. There is grant weighting to promote faster data speeds, but due to the vagaries of a reverse auction, there will be plenty of funding given to networks with speeds close to 25/3 Mbps.

There is further bad news since the FCC is basing the grants upon faulty broadband maps. Funding will only be made available to areas that don’t show 25/3 Mbps capability on the FCC maps. Everybody in the industry, including individual FCC Commissioners, agrees that the current maps based upon 477 data provided by ISPs are dreadful. In the last few months, I’ve worked with half a dozen counties where the FCC maps falsely show large swaths of 25/3 broadband coverage that isn’t there. It’s definitely bad news that the grant money won’t be made available in those areas where the maps overstate broadband coverage – folks in such areas will pay the penalty for inadequate broadband maps.

There is a glimmer of good news with mapping since the FCC will require the big ISPs to report broadband mapping data using polygons later this year. Theoretically, polygons will solve some of the mapping errors around the edges of towns served by cable TV companies. But there will only be time for one trial run of the new maps before the grants, and the big telcos have every incentive to exaggerate speeds in this first round of polygon mapping if it will keep this big pot of money from overbuilding their copper. I don’t expect the big telco mapping to be any better with the polygons.

Another area of good news is that there will be a lot of good done with these grants. There will be rural electric cooperatives, rural telcos, and fiber overbuilders that will use these grants as a down-payment to build rural fiber. These grants are not nearly large enough to pay for the full cost of rural fiber deployment, but these companies will borrow the rest with the faith that they can create a sustainable broadband business using fiber.

The bad news is that there will be plenty of grant money that will be used unwisely. Any money given to the traditional satellite providers might as well just be burned. Anybody living in an area where a satellite provider wins the grant funding won’t be getting better broadband or a new option. There is nothing to stop the big telcos from joining the auction and promising to upgrade to 25/3 Mbps on DSL – something they’ll promise but won’t deliver. There are likely to be a few grant recipients who will use the money to slap together a barely adequate network that won’t be fast and won’t be sustainable – there is a lot of lure in $16 billion of free federal money.

It’s dismaying that there are so many potential downsides. A grant program of this magnitude could be a huge boost to rural broadband. Many areas will be helped and there will be big success stories – but there is also likely to be a lot of bad news about grant money spent unwisely.

Federal Subsidies for Satellite Broadband

In December, the FCC awarded $87 million from the CAF II Reverse auction held last summer for satellite broadband. The bulk of the satellite awards went to Viasat, which will supposedly use the money to bring broadband to 123,000 homes in seventeen states. The grant awards are meant to bring 25/3 Mbps broadband to areas that don’t have it today.

I have several problems with this award. First is that the satellite companies already cover these areas today and have been free to sell and market in these areas. The federal grant money doesn’t bring a new broadband alternative to anybody in rural America.

Second, the satellite companies aren’t required to connect any specific number of new customers as a result of the grant awards. They are largely free to just pocket the grants as profit. Even when they do connect a new customer, they don’t build any lasting broadband infrastructure – they only install an antenna at each new customer’s home.

Third, rural residents don’t seem to want satellite broadband. In a large survey by the Census Bureau in 2017, 21% of people in the US described their neighborhood as rural (52% chose suburban and 27% said urban). In the quarter ending in June 2019, Viasat claimed 587,000 rural customers in the US – less than 0.5% of the 128 million households in the country. Even if every one of those customers is in rural America, the company has only about a 2% penetration of the roughly 27 million rural households.
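The penetration math works out roughly like this, using the Census survey share and the Viasat subscriber count above:

```python
total_households = 128_000_000
rural_share = 0.21            # Census survey: share describing their area as rural
viasat_customers = 587_000

rural_households = total_households * rural_share        # ~26.9 million homes
national_share = viasat_customers / total_households     # under 0.5% of all homes
rural_penetration = viasat_customers / rural_households  # roughly 2% of rural homes
```

Even giving Viasat the benefit of the doubt that every subscriber is rural, that’s a small penetration for a product that has been marketed in these areas for years.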

CCG has been doing broadband surveys for twenty years, and I don’t know that we’ve ever talked to a satellite customer who was happy with their broadband. In every survey, we seem to encounter more people who dropped satellite service than people who still have it. Customers complain that satellite costs too much – Viasat reported in its most recent financial report that the average residential broadband bill is $84.26. Customers also hate the high latency, which can be 10 to 15 times higher than terrestrial broadband. The latency is due to the satellite being parked almost 22,200 miles above the earth – a round-trip communication over that distance takes a while.
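The physics behind that latency is straightforward: a signal travels roughly 22,200 miles up to the satellite and the same distance back down, and a full request-and-response makes that trip twice. A quick sketch of the minimum propagation delay – the speed-of-light limit alone, ignoring all processing and queuing time:

```python
SPEED_OF_LIGHT_M_S = 299_792_458
METERS_PER_MILE = 1609.344

altitude_m = 22_200 * METERS_PER_MILE        # geostationary orbit altitude
one_leg_s = altitude_m / SPEED_OF_LIGHT_M_S  # ~119 ms per hop

# A request goes up and down, and the response goes up and down: 4 legs
round_trip_ms = 4 * one_leg_s * 1000         # ~477 ms before any processing
```

Nearly half a second of delay is baked in by orbital geometry before the network does any work at all, which is why no amount of engineering can make geostationary satellite latency competitive with terrestrial broadband.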

The primary complaint about satellite broadband is the tiny monthly data caps. The company’s products that would satisfy the FCC grant speed requirements start with the Unlimited Silver 25 plan at $70, with speeds up to 25 Mbps and a monthly data cap of 60 gigabytes. The fastest plan is the Unlimited Platinum 100 plan at $150, with speeds up to 100 Mbps and a data cap of 150 gigabytes. Unlike cellular plans, where a customer can buy more data, the Viasat plans throttle customers to speeds reported to be less than 1 Mbps once they reach the data cap. To put those plans into perspective, OpenVault recently announced that the average US home uses 274 gigabytes of data per month, and the average cord-cutting home uses 520 gigabytes per month. Satellite broadband is impractical for anybody with school students in the home or anybody who does even a modest amount of video streaming.
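Comparing the caps to OpenVault’s usage figures makes the problem concrete:

```python
silver_cap_gb = 60             # Unlimited Silver 25 monthly data cap
avg_household_gb_month = 274   # OpenVault average monthly usage

# Days until a household with average usage hits the cap (30-day month)
days_to_cap = silver_cap_gb / (avg_household_gb_month / 30)

# About a week into the month, the connection gets throttled
```

A cord-cutting home at 520 gigabytes per month would burn through the same cap in under four days.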

Viasat won the grant funding due to a loophole in the grant program. The funding was available to anybody that offers broadband of at least 25 Mbps. The grant program was intended to deliver a new broadband alternative to rural households – something satellite broadband does not do. The funding was awarded in a reverse auction, and the satellite companies likely placed bids for every eligible rural market – they would have been the default winner in any area that had no other bidder. Even where there was another bidder, a reverse auction goes to the lowest bidder, and there is no amount too small for the satellite companies to accept, since they don’t have to make capital expenditures to satisfy the grants.

Giving money to satellite providers makes no sense as broadband policy. They don’t bring new broadband to anybody since the satellite plans are already available. The plans are expensive, have high latency and low monthly data caps.

The much larger RDOF grant program will award $16.4 billion in 2020 for rural broadband and the satellite companies must be ecstatic. If the FCC doesn’t find a way to keep the satellite companies out of this coming auction, the satellite companies could score a billion-dollar windfall. They can do so without offering any products that are not already available today.

To put these grants into perspective, the $87 million award is roughly the same size as the money that has been awarded over several years in the Minnesota Border-to-Border grant program. The Minnesota grants have helped fund dozens of projects, many of which built fiber in the state. There is no comparison between the benefits of the state grant program and the nearly total absence of benefit from handing federal money to the satellite companies.