Protesting 5G

There were over 90 protests nationwide recently against the coming 5G technology, mostly related to health concerns. The protesters have some of the facts wrong about 5G, and that makes it easier for policymakers to ignore them. It’s hard to fault anybody for getting the facts wrong about 5G since the carriers have purposefully filled the press with misleading 5G rhetoric. I would venture to say a lot of people in our industry have the same misunderstandings.

I watched a few news reports of the various protests, and protesters cited the following concerns about 5G. They say that it’s already being installed and will be active in most cities by next year. They say that in the near future cellular speeds will be 100 times faster than today. They say that the FCC has blessed 5G as safe when it’s not. Let me address each of these issues:

What is 5G? Many of the protesters don’t realize that 5G is the marketing name for several different technologies. 5G can mean improved cellular service. 5G can mean high-speed wireless broadband loops like the one being tested by Verizon in Sacramento. And 5G can mean gigabit radio connections made between two points, similar to traditional microwave backhaul. Protesters have conflated the claims for each technology and assume they all apply to 5G cellular service.

Is 5G Being Installed Today? Cities everywhere are seeing permit requests for small cell sites and often believe these are requests to install 5G – I just talked to a fairly large city the other day that made this assumption. For now, the requests for small cell sites are to bolster the 4G cellular network. The cellular companies aren’t talking about it, but their 4G data networks are in trouble. People are using so much data on their phones that cell sites are getting overwhelmed. The amount of data being used by cellphone users is currently doubling every two years – and no data network can handle that kind of growth for very long. The cellular carriers are quietly beefing up the 4G networks in order to avoid the embarrassment of major network crashes in a few years. They are hoping that within 3 – 5 years 5G can relieve some of the pressure from cellular networks.
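To see why that doubling rate is so alarming, here is a quick sketch. The starting load (40% of capacity) and the two-year doubling period are my assumptions for illustration, not carrier data:

```python
# A rough sketch of cell-site load growth. The 40% starting load and
# the two-year doubling period are assumptions, not carrier data.
def years_until_overloaded(start_load=0.40, doubling_years=2.0):
    """Return (years, load) when load first exceeds 100% of capacity."""
    years = 0.0
    load = start_load
    while load <= 1.0:
        years += 0.5
        load = start_load * 2 ** (years / doubling_years)
    return years, load

years, load = years_until_overloaded()
print(f"A site at 40% load today saturates in about {years:.1f} years")
```

Under those assumptions a half-empty cell site is overloaded in about three years, which is why the carriers are scrambling now.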

Will 5G Be Here Next Year? It might be a decade until we see a full 5G cellular installation. There are 13 major specifications for improvements between 4G and 5G, and those will get implemented over the next decade. This won’t stop the marketing departments of the cellular carriers from loudly claiming 5G networks after one or two of these improvements have been partially implemented. What the cellular companies never bothered to tell the public is that the first fully-compliant 4G cell site was only implemented last year – 5G is going to require the same slow, steady introduction of changes until full 5G gets here. Starting a year or two from now we might see some 5G improvements, with more 5G upgrades introduced each year thereafter. The carriers will loudly announce every 5G trial and will make the world think the improvements will be immediately installed everywhere.

Will Cellular Speeds be 100 Times Faster? The 5G specification calls for cellular speeds to be improved over time to 100 Mbps, about 6 times faster than 4G cellular speeds today. Speeds won’t improve overnight and this certainly isn’t going to be here in a year or two.

The public thinks that we’ll see gigabit cellular speeds for several reasons. First, Verizon recently introduced a trial for fast cellular using millimeter wave spectrum in small portions of a few downtown areas. Millimeter wave cellular is not going to make sense for wide deployment because the fast data speeds only carry perhaps 200 feet from the transmitter. Millimeter wave spectrum in this application is blocked by almost everything in the environment. This trial was mostly to grab headlines, not to portend a real product. Confusion also came when AT&T recently announced a 2 Gbps connection made to an outdoor hot spot. This is using point-to-point technology that can never apply to cellphones – but the AT&T announcement made this fuzzy on purpose.

What About the Health Impacts? Most 5G cellular service will use the same spectrum as today, or new bands that are similar to today’s cellular spectrum. The primary concern for 5G cellular (and 4G) is the introduction of small cell sites into neighborhoods. It’s concerning to citizens when a cell site sits on a pole at their curb instead of at the top of a tall tower outside the neighborhood. The neighborhood cell sites will broadcast at a lower power level than the current big cell sites, so theoretically the amount of cellular radiation ought to be similar to today’s. But to give credit to the protesters, we’ll only know whether that’s really true after small cell sites have been installed.

The real health concern that is troublesome is not related to 5G cellular using the same frequencies as today, but rather to the use of millimeter wave spectrum. A significant percentage of the world’s scientists who work in this area recently warned the United Nations that some past research into millimeter wave spectrum shows negative impacts for plant and animal life. The scientists admit that much more research is needed, and they pleaded with the UN not to use the general public as guinea pigs. Belgium recently banned millimeter wave spectrum deployment until the health risks are understood. The FCC joins almost every other country in allowing the deployment of millimeter wave spectrum and is in the process of licensing more of the spectrum.

As mentioned earlier, Verizon recently did a few trials of sending millimeter wave spectrum to cellphones. This was viewed mostly as a gimmick because it doesn’t seem to have real-life market potential due to the limitations of the spectrum and cellphones. I just saw an estimate that it would take over 300,000 small cell sites to blanket Los Angeles with cells close enough together to deploy millimeter wave spectrum – that doesn’t sound like a plausible or profitable business plan.

The technology where the protesters should be focused is millimeter wave spectrum wireless loops. Verizon deployed this to a few hundred homes in Sacramento and a few other cities, delivering about 300 Mbps broadband to homes. Verizon says they have plans to deploy this widely. This is the spectrum use that the scientists warned about. A deployment of millimeter wave loops means constantly bombarding residential neighborhoods with millimeter wave spectrum from poles on the curb. The other planned use of millimeter wave spectrum is for indoor routers that will transmit gigabit bandwidth inside of a room. People can clearly decide to not use millimeter wave routers, but have no say about a carrier introducing it into the neighborhood. Protesters have a valid concern for this technology.

Reverse Auctions for Broadband Grants

In April, FCC Chairman Ajit Pai announced a new rural broadband initiative that will provide $20.4 billion of new funding. We don’t know many details, but one of the most likely parameters of that funding is that the money will be awarded by reverse auction. Today I ask whether a reverse auction is really the right tool for awarding this money.

In a government reverse auction the winner is the entity willing to take the least amount of money to perform a given task. A reverse auction is much akin to awarding money to the low-cost vendor in government contracting. The big question to ask is whether we really want to award grant money to the low-cost bidder. By definition that will reward certain behaviors:

Favors Slower and Lower-cost Technologies. If the criterion for award is the percentage of grant matching, it’s far easier for an applicant to accept a lower match if they are deploying a lower-cost technology. Fixed wireless has a big cost advantage over fiber. Satellite has a huge advantage over every other technology since any award for them is 100% gravy. For a reverse auction to work it has to find an equitable weighting process to bring technologies into some sort of parity. The recent CAF II reverse auction is a good example. While some of the money went for fiber, a huge amount went to fixed wireless and satellite broadband – and fiber only got funded in areas where it wasn’t competing against the lower-cost technologies. If there is a reverse auction for the whole country, then the lower-cost technologies will win almost all of the grant funding.

Favors Lower-Cost Regions of the Country. Some parts of the country like Appalachia have a higher cost of construction for every technology, and a reverse auction is going to benefit lower-cost places like the Midwest plains. A reverse auction will also favor grant applications with higher density that include towns versus requests that are 100% rural.

Favors Upgrades over New Construction. A reverse auction will favor applicants that are seeking funds to upgrade existing facilities rather than build new ones. For example, it would promote upgrading DSL over building new fiber.

Formulaic and Allows for No Policy Awards. The FCC and Congress are going to want to see the awards spread across the country to every state. A reverse auction might favor a specific region of the country or even favor a single technology – all of this is outside of the control of the FCC once the auction begins. A reverse auction is self-selecting, and once the process is started those willing to take the smallest percentage grant will win all of the money. I think the whole country is going to be furious if most of this huge grant favors only one region or one technology. Most states have elected not to use a reverse auction for state grants because they want some say in making sure that grants are awarded to all corners of a state.

There’s No Fix for Problem Grants. I have clients who think that fixed wireless companies that claimed they could deploy ubiquitous 100 Mbps broadband cheated in the CAF II reverse auctions. They claim the technology can’t deliver that speed to all customers. We’ll find out when these networks are deployed. This was relevant in that particular auction since bidders got extra bid credits for promising faster speeds. This is a cautionary tale about bidders who will manipulate the bidding rules to get an advantage.

Another issue we often see in grant programs is that some of those who are awarded grants find themselves ineligible to take the grants. This happened with the stimulus grants and the returned money was awarded to the next companies in the grant grading process. This is not possible in a reverse auction. By the time of the final award everybody else has dropped out of the process.
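The self-selecting dynamic described above can be shown with a toy model. Every applicant, technology, and subsidy percentage below is invented for illustration; a real auction has many rounds and weighting rules:

```python
# A toy reverse auction. All bidders and numbers here are invented
# for illustration; a real auction is far more complicated.
bids = [
    ("Co-op A", "fiber", 70),       # high-cost build needs a big subsidy
    ("WISP B", "fixed wireless", 35),
    ("Telco C", "DSL upgrade", 30),
    ("SatCo D", "satellite", 10),   # marginal cost is nearly zero
]
budget_slots = 2  # pretend the budget only covers two awards

# The auction self-selects: whoever asks for the least subsidy wins.
winners = sorted(bids, key=lambda b: b[2])[:budget_slots]
for name, tech, pct in winners:
    print(f"{name} wins with {tech} at a {pct}% subsidy")
```

Even in this crude model, the fiber application is shut out unless the weighting rules deliberately put technologies into some sort of parity.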

The bottom line is that a reverse auction is a terrible process for this grant program. No matter how carefully the FCC sets the eligibility rules, a reverse auction is always going to favor certain technologies or certain parts of the country over others – it’s inevitable in a nationwide reverse auction. A $20.4 billion grant program can bring great broadband to a lot of households. A reverse auction will be a disaster if it pushes money towards upgrading DSL or gives the funding to satellite providers rather than awarding all of the money to build permanent broadband infrastructure.

I know that taking the time to review and rank grant applications is hard work. A reverse auction simplifies this process by simply declaring whether a grant application is eligible for the grant. If you want proof that slogging through grants and choosing the best ones works, look at the successful state grant programs. A reverse auction is inevitably going to allocate funds in ways that the FCC is not going to be proud of.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Is There a Business Case for 5G Cellular?

Readers might think I spent too much time writing about 5G. However, I’m asked about 5G almost every day. Existing ISPs want to know if 5G is a real threat. Potential ISPs want to know if they should pause their business plans until they understand 5G’s impact. Cities want to know what to expect. The cellular companies have made such a huge deal about 5G that they’ve spooked the rest of the industry.

Today I ask perhaps the most fundamental question of all – is there a business case for 5G cellular? I’m not talking about 5G wireless loops to homes – I’m just asking if there is a financial justification for the cellular companies to upgrade their cell sites to 5G?

Before answering that question, it’s good to remember that the cellular companies badly need to implement 5G because their 4G networks are facing a crisis. After years of training customers to be stingy in using cellphone data, they are now encouraging users to stream video. The result of this shift is that total cellular data usage is now doubling every two years. Any network engineer will tell you that that is deadly growth, particularly for a cellular network. The existing 4G network can’t handle this growth for more than a few more years. While some of this growth can be temporarily mitigated by inserting small cell sites into the network, that looks like little more than a band-aid fix if broadband usage keeps growing at this fast pace. Small cell sites will be overwhelmed almost as quickly as they are built.

The carriers need 5G because it will expand the capacity of each cell site by allowing many more customers to use a cell site simultaneously. By taking advantage of network slicing and the ability to mix and match multiple frequencies, a 5G cell site will be a huge step up in efficiency. The cellular carriers have not publicly admitted that they need 5G just to keep their networks running – but they really don’t have a choice.

The question, though, is if there is a new revenue stream to help pay for the 5G upgrades? To be honest, I can’t find any early 5G cellular application that will generate much revenue in the near future. The obvious new revenue source would be to charge a premium price to use 5G data on a cellphone. There might be some people willing to pay extra in the early stages of the 5G roll-out, but as 4G morphs over time into 5G, any willingness to pay more for data will melt away.

I also wonder if customers will really value faster cellular data speeds. First, we aren’t talking about a ton of extra speed. Forget the recent trials of millimeter wave 5G – that’s a gimmick for now that will not be available anywhere other than in dense urban centers. The 5G specification that matters to the real world is the goal for 5G speeds to increase over a decade to 100 Mbps.

Good 4G data speeds today are in the range of 15 Mbps and that is more than enough speed to stream data while performing any functions we want from a cellphone. Faster speeds will not stream video any faster. Over time perhaps our cellphones will be able to create augmented reality universes, but that technology won’t be here for a while. Faster data speeds are vitally important in a home where we run multiple large data streams simultaneously – but a cellphone is, by definition, one device for one user.

The real advantage of 5G is the ability to make large numbers of connections from a single cell site. It’s also hard to see an immediate path to monetize that. I talk to a friend many mornings as he commutes and he always gets disconnected at the Eisenhower bridge on the DC beltway – there are not enough cellular connections there to allow for handoffs between Maryland and Virginia. 5G will finally fix that problem, but I can’t see anybody paying extra to not be cut off on the bridge – they will finally be getting what they’ve always expected.

Eventually 5G will have big potential as the connection for outdoor sensors, IoT devices, smart cars, smart streetlights, etc. There is also likely to eventually be a huge market for wearables that might include fitness monitors, medical monitors, smart glasses, and even smart clothes. However, all of these applications will take time to come to market – there is a bit of chicken and egg in that these technologies will likely never take off until there is universal 5G coverage. There is very little revenue likely in the next few years for outdoor applications – although this might eventually be the primary new source of 5G revenue.

I look back to last fall when Ronan Dunne, an EVP of Verizon Wireless, made his case to investors for the potential of 5G. He included the outdoor sensors I mention above. He also cited applications like retail, where holograms might spring up near merchandise in stores. He talked about stock trading that takes advantage of the low latency of 5G. He mentioned gaming, which would benefit from lower latency. Most of these applications offer eventual potential for 5G, but it’s hard to imagine any of them producing significant revenue over the next three or four years.

Predicting technology is always a crap shoot and perhaps new applications will arise that need 5G that even Verizon hasn’t imagined. The list of applications that Verizon gave to investors is underwhelming and reflects the fact that there is likely no 5G application that will significantly add to the bottom line of the cellular carriers in the immediate future.

This really brings home the idea that as a nation we are not in a worldwide 5G competition. The carriers need 5G soon to stop the collapse of the 4G data networks in busy neighborhoods. I have a hard time thinking they need it immediately for anything else – although eventually we will be surrounded by 5G applications.

Who Should be Eligible for Broadband Grants?

This is the next blog in a series looking at the upcoming $20.4 billion broadband grant recently announced by FCC Chairman Ajit Pai. Today I look at the question of grant eligibility. This will be, by far, the largest broadband grant program to date. What can the FCC do to make sure this money is available to the widest set of constituencies possible?

FCC programs in the past have usually stated that they are available to a wide range of potential grant recipients. However, after digging into the detailed rules of how grants are awarded, it has generally turned out that it’s harder for some classes of grant recipients to win a grant. I hope the FCC can take some of the following into consideration this time around.

Municipalities. Cities and counties have often had a hard time meeting the eligibility rules for broadband grants. That’s interesting because local governments receive a lot of federal funding for other purposes, but there are aspects of broadband grants that have been a challenge for local government.

Consider the current Re-Connect grant program that is awarding $600 million for rural broadband. The program is a combination of grants and RUS loans. I worked with several county governments that concluded that they could never accept the RUS loans. The rules governing RUS loans are set by legislation and cast in concrete, so the loan terms are largely non-negotiable. There are aspects of RUS loans that don’t work for government entities, like the surety requirement of giving the RUS a first lien on property funded by the loans. Most municipalities are legally not allowed to pledge assets in this manner.

Another past telecom grant practice has stopped municipalities from applying for grants. Some FCC grant programs require that grant applicants show proof of already having the matching funds available to fulfill a project before they can apply for a grant. That requirement makes some sense for commercial applicants because the FCC wants to avoid giving grants to entities that won’t be able to accept the money once awarded. This is a chicken and egg dilemma for local governments. Normally, municipalities win a grant first and then raise the matching bond funds. Bonds that have not yet been sold can’t be used to guarantee a grant – until a bond is sold it doesn’t exist. There is no such thing as a line of credit based upon a future bond issue. Governments can’t raise the bond money first until they know the size of the grant funding – a real example of a Catch-22.

Open Access. Past FCC grants that utilize the Universal Service Fund have required that grant recipients become Eligible Telecommunications Carriers (ETCs) – and this might apply to the new $20.4 billion grant program. This is a regulatory designation that basically shows a blessing by a state commission that the entity is a recognized carrier. Only entities that provide retail services are eligible to become an ETC. This requirement makes it impossible for government entities that own open access networks to get grant funding. In an open access structure the network owner – the entity that would use the grant funding to build rural broadband networks – is unable to become an ETC because it doesn’t provide any retail services to customers; that function is provided by the ISPs that operate on the open access network.

This requirement discriminates against states that require or favor open access networks, like Washington, Utah, Colorado, and Virginia. A good example is the PUDs, the municipal electric companies that operate open access fiber networks in rural Washington. The PUDs are building rural fiber, which is a goal the FCC clearly favors, yet they are ineligible for federal grants that require ETC status. This is a case of a regulatory requirement created for a different purpose that will deny federal grant money to large rural areas of the country. The PUDs would love funding to build more rural fiber, and the commercial ISPs operating on their networks are ready to sell services.

Partnerships. One of the biggest industry trends is partnerships that are created to build rural broadband networks. Some of the most common partnerships are between county governments and commercial ISPs or between electric cooperatives and rural telcos. Poorly conceived grant rules can cause problems for partnerships even though the FCC and the federal government supposedly love such partnerships.

Any partnership where the network owner doesn’t provide retail broadband can have the same problem as just described for open access networks if the network owner is unable to become an ETC. If the network owner in the partnership is a government entity they will also have problems accepting government loans as part of an award.

Partnerships face other hurdles as well. The current Re-Connect grant program requires providing several years of strong financials to show that a company has a history and a strong balance sheet. New partnerships have no historical financials, yet both entities that own the partnership likely have a long financial history. Poorly written grant rules could block newly formed partnerships from receiving funding, even if the partnership will eventually have a huge cash infusion from the partners.

Other grant rules can also be a problem. For example, a typical partnership might be one where an electric cooperative builds a fiber network and a telco provides services. In this case the grant funding will go to the electric cooperative that is building the network, but the cooperative will be unable to make any pledges or guarantees about customer performance such as promising fast speeds, offering low prices, or serving anchor institutions.

When I read grant rules one of the first things that is always obvious to me is that the people writing the grant rules don’t understand the wide range of business structures that currently operate broadband networks. There are numerous partnership and other arrangements that differ significantly from the traditional model of a single ISP that owns the network and sells retail products to customers. If grant eligibility rules aren’t updated to recognize the way networks are being built and operated, then large numbers of potential grant recipients are going to be ineligible for the new grants. While that hurts the new entities, it really hurts the rural residents that are served by these new kinds of ISPs.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

The Impact of Satellite Broadband

Recently I’ve had several people ask me about the expected impact of low-orbit satellite broadband. While significant competition from satellites is probably a number of years away, several major initiatives – Starlink (Elon Musk), Project Kuiper (Amazon), and OneWeb – have announced plans to launch swarms of satellites to provide broadband.

At this early stage, it’s nearly impossible to know what impact these companies might have. We don’t know anything about their download speeds and capacity, their pricing strategy, or their targeted markets, so it’s impossible to begin to predict their impact. We don’t even know how long it’s going to take to get these satellites into space, since these three companies alone have plans to launch over 10,000 new satellites – a tall task when compared to the 1,100 satellites currently active in space.

Even without knowing any of these key facts, BroadbandNow recently grabbed headlines around the industry by predicting that low-orbit satellites will bring an annual savings of $30 billion for US broadband customers. Being a numbers guy, I never let this kind of headline pass without doing some quick math.

They explain their method of calculation on their web site. They make several major assumptions about the satellite industry. First, they assume the satellite providers will compete on price and will compete in every market in the country. Since the vast majority of Americans live in metro areas, BroadbandNow is assuming the satellite providers will become a major competitor in every city. They also assume that the satellites will be able to connect to a huge number of customers in the US, which will force other ISPs to lower prices.

Those assumptions would have to be true to support the $30 billion in projected annual consumer savings. That is an extraordinary number and works out to be a savings of almost $20 per month for every household in the US. If you spread the $30 billion over only those households that buy broadband today, that would be a savings of over $23 per month. If you further factor out the folks who live in large apartments and don’t get a choice of their ISP, the savings jumps to $27 per household per month. The only way to realize savings of that magnitude would be a no-holds-barred broadband price war where the satellite providers are chewing into market penetrations everywhere.
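Those per-household figures are easy to check with back-of-the-envelope arithmetic. The household counts below are my rough approximations, not census figures:

```python
# Back-of-the-envelope check of the per-household savings. The
# household counts are rough approximations, not official figures.
annual_savings = 30e9            # BroadbandNow's projected annual savings
all_households = 128e6           # approx. total US households
broadband_households = 109e6     # approx. households buying broadband
households_with_choice = 92e6    # approx., excluding big apartment dwellers

def monthly_savings(households):
    return annual_savings / 12 / households

print(f"${monthly_savings(all_households):.2f}/month across all households")
print(f"${monthly_savings(broadband_households):.2f}/month across subscribers")
print(f"${monthly_savings(households_with_choice):.2f}/month where there's a choice")
```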

I foresee a different future for the satellite industry. Let’s start with a few facts we know. While 10,000 satellites is an impressive number, that’s a worldwide number and there will be fewer than 1,000 satellites over the US. Most of the satellites are tiny – these are not the same as the huge satellites launched by HughesNet. Starlink has described their satellites as varying in size between a football and a small dorm refrigerator. At those small sizes these satellites are probably the electronic equivalent of the OLT cabinets used as neighborhood nodes in a FTTH network – each satellite will likely support some limited and defined number of customers. OneWeb recently told the FCC in a spectrum docket that they are envisioning needing one million radio links, meaning their US satellites would be able to serve one million households. Let’s say that all of the satellite providers together will serve 3 – 5 million homes in the US – that’s an impressive number, but it’s not going to drive other ISPs into a pricing panic.

I also guess that the satellite providers will not offer cheap prices – they don’t need to. In fact, I expect them to charge more than urban ISPs. The satellite providers will have one huge market advantage – the ability to bring broadband where there isn’t landline competition. The satellite providers can likely use all of their capacity selling only in rural America at a premium price.

We still have no real idea about the speeds that will be available with low-orbit satellite broadband. We can ignore Elon Musk who claims he’ll be offering gigabit speeds. The engineering specs show that a satellite can probably make a gigabit connection, but each satellite is an ISP hub and will have a limited bandwidth capacity. Like with any ISP network, the operator can use that capacity to make a few connections at a high bandwidth speed or many more connections at slower speeds. Engineering common sense would predict against using the limited satellite bandwidth to sell gigabit residential products.
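That capacity tradeoff can be sketched with simple arithmetic. The per-satellite capacity and the oversubscription ratio below are pure assumptions used to illustrate the point, not published specs:

```python
# The capacity tradeoff on a single satellite. The 20 Gbps capacity
# and 10:1 oversubscription ratio are assumptions, not published specs.
satellite_capacity_gbps = 20
oversubscription = 10  # subscribers don't all use peak bandwidth at once

def subscribers_at_speed(speed_mbps):
    sellable_mbps = satellite_capacity_gbps * 1000 * oversubscription
    return int(sellable_mbps / speed_mbps)

for speed in (1000, 100, 25):
    print(f"{speed:>4} Mbps plans -> ~{subscribers_at_speed(speed):,} subscribers")
```

Whatever the real numbers turn out to be, the shape of the tradeoff is the same: every gigabit subscription consumes capacity that could otherwise serve dozens of slower connections.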

That doesn’t mean the satellite providers won’t be lured by big bandwidth customers. They might make more money selling gigabit links at a premium price to small cell sites and ignoring the residential market completely. It’s a much easier business plan, with drastically lower operating costs to sell their capacity to a handful of big cellular companies instead of selling to millions of households. That is going to be a really tempting market alternative.

I could be wrong and maybe the satellite guys will find a way to sell many tens of millions of residential links and compete in every market, in which case they would have an impact on urban broadband prices. But unless the satellites have the capacity to sell to almost everybody, and unless they decide to compete on price, I still can’t see a way to ever see a $30 billion national savings. I instead see them making good margins by selling where there’s no competition.

Speed Goals for FCC Grants

I literally grimaced when I first read about the 25/3 Mbps speed test that will likely be part of the new $20.4 billion grant program recently announced by the FCC. My first thought was that the 25/3 Mbps goal would provide an excuse for the FCC to give the grant money to the big telcos again. Those companies could then take another ten years to bring rural DSL up to the speeds they should have achieved on their own a decade ago. With the history of the FCC pandering to the big telcos I instantly feared this possibility.

But let’s assume that the upcoming grants will be available to all comers. Why would the FCC choose the 25/3 Mbps speed target? It’s a terrible goal for many reasons.

  • While this FCC will not admit it, 25/3 Mbps is already obsolete as the definition of adequate broadband. It’s been five years since 25/3 Mbps was adopted and households are using a lot more data than five years ago. It’s pretty easy to make the case that the definition of broadband today probably ought to be at least 50 Mbps download.
  • If the 25/3 Mbps speed is already outdated today, then it’s a lousy goal for a decade from now. This FCC should not repeat the same blunder as the last FCC did with the original CAF II program. They should set a forward-looking speed goal that reflects the likely speed requirements at the time the grant networks will be constructed. Any network engineer who tracks customer usage will tell you that the minimum speed requirement for eight years from now should be at least 100 Mbps.
  • The 25/3 Mbps goal just feels ‘puny’. I got the same letdown when I read that a new NASA goal is to put a man on the moon again. Considering the huge leaps we’ve made in technology since 1969, striving for a moon landing again feels like a small national goal and a waste of our national resources – and so does setting a broadband speed goal of 25/3 Mbps.
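The 100 Mbps figure mentioned above can be sanity-checked with a quick extrapolation. The 21% annual growth rate (usage roughly doubling every 3.5 years) is an assumed industry rule of thumb, not an official FCC number:

```python
# Extrapolating the broadband definition forward. The 21% annual
# growth rate is an assumed rule of thumb, not an official figure.
current_definition_mbps = 25
annual_growth = 0.21
years_out = 8  # roughly when grant-funded networks would be in service

future_need = current_definition_mbps * (1 + annual_growth) ** years_out
print(f"Implied broadband definition in {years_out} years: ~{future_need:.0f} Mbps")
```

Even under this modest growth assumption, today’s 25 Mbps definition grows past 100 Mbps before the grant-funded networks would be finished.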

One of the goals that Congress gave the FCC is to strive to bring rural broadband into parity with urban broadband. In setting a goal of 25/3 Mbps the FCC is ignoring the broadband trend in cities. The big cable companies have increased minimum download speeds for new customers to between 100 and 200 Mbps and have unilaterally increased speeds for existing customers. 25/3 Mbps is a DSL speed, and we see the biggest telcos finally starting to walk away from copper. Verizon has gotten out of the copper business in nearly 200 exchanges in the northeast. AT&T has been losing DSL customers and replacing them with fiber customers. It’s almost unthinkable that the FCC would establish a new forward-looking grant program and not expect broadband speeds any faster than DSL.

In my mind, the FCC betrayed rural communities when they adopted the 10/1 Mbps speed goal for CAF II. That told rural communities that they had to settle for second-rate broadband that was far slower than the rest of the country. From what I hear, most rural communities don’t even consider the CAF II upgrades as real broadband. Rural communities want fiber. They view anything slower than fiber as nothing more than a stepping-stone towards eventually getting fiber.

The FCC needs to listen to what rural America wants. If this giant new grant program will make rural communities wait for years to get 25/3 Mbps then rural America will largely ignore it. Communities will continue to plan for something better. Households might begrudgingly buy 25/3 broadband, but the people in rural America know that is not the same as broadband elsewhere and they will continue to clamor for the same broadband that they see in cities.

I hope the FCC understands this. Even if they allow technologies in these grants that can only deliver 25/3 Mbps, the FCC can still use the grant ranking process to favor faster broadband. If the grant grading process emphasizes speed, then the $20 billion could probably be used to bring fiber to 4 or 5 million rural homes. In my mind that would be the ideal use of these grants, because those homes would be brought to parity with the rest of the country. Those homes could be taken off the FCC’s worry list and the universe of underserved homes would be significantly reduced. If the grants give money to anything less than fiber, the FCC will have to keep on dumping grant money into the same communities over and over until they finally finance fiber.
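As a sanity check on the "4 or 5 million homes" figure, here is a rough back-of-the-envelope calculation. The cost-per-passing and grant-share figures below are my own illustrative assumptions, not numbers from the FCC or from this blog:

```python
# Back-of-the-envelope: how many rural homes could $20.4 billion bring fiber to?
# Both inputs below are assumed figures for illustration only.
grant_pool = 20.4e9        # total announced FCC funding
cost_per_passing = 9_000   # assumed rural fiber cost per home passed
grant_share = 0.50         # assumed grant covers half; the ISP funds the rest

homes_reached = grant_pool / (cost_per_passing * grant_share)
print(f"{homes_reached / 1e6:.1f} million homes")  # ≈ 4.5 million
```

Under those assumptions the funding lands squarely in the 4-5 million home range; a lower cost per passing or a larger grant share moves the number up or down accordingly.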

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Surveys for Grants and Loans

Many of the federal and state grant programs and many broadband lenders want applicants to undertake a survey to quantify the likely success of a new broadband venture. Unfortunately, there are far too many broadband projects being launched that are unable to answer the basic question, “How many customers are likely to buy service from the new network?” There are only two ways to get a reliable answer to that question – a canvass or a statistically valid survey.

A canvass is the easiest to understand and it involves knocking on the doors or calling every potential customer in a market. I’ve seen many clients have good luck with this when overbuilding a small town or a subdivision. A canvass will be most successful when an ISP has all of the facts needed by potential customers such as specific products and prices. Many companies would label the canvass process as pre-selling – getting potential customers to tentatively commit before construction.

The alternative to a canvass is a ‘statistically valid’ survey. Any survey that doesn’t meet the statistically valid test isn’t worth the paper it’s printed on. There are a few key aspects of doing a statistically valid survey:

Must be Random. This is the most important aspect of a valid survey and is where many surveys fail. Random means that you are sampling the whole community, not just a subset of respondents. A survey that is mailed to people or put online for anybody to take is not random.

The problem with a non-random survey is that the respondents self-select. For example, if you mail a survey to potential customers, then people who are interested in broadband are the most likely to respond and to return the completed survey. It can feel good to get back a lot of positive responses, but it’s far more important to hear from those who don’t support fiber.

The whole purpose of doing a broadband survey is to quantify the amount of support – and that also means quantifying those who won’t buy fiber. I’ve seen results from mailed surveys where almost every response was pro-broadband, and of course, that is unlikely. That result just means that the people who aren’t interested in broadband didn’t bother to complete or return the survey. The only way you can put any faith in a mailed survey is if you get so many responses that it approaches being a canvass. A good analogy for the problems with a mail survey would be to stand in front of a grocery store and ask customers if they like to shop there. While there may be a few customers with complaints, such a survey would not tell you anything about how the community feels about that store, since the question was never asked of those who don’t shop there.

This blog is too short to describe survey methods – but there are specific acceptable techniques for conducting a random survey either by telephone or by knocking on doors. It’s possible to do those tasks non-randomly, so you should seek advice before conducting a phone or door-knocking survey.
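The blog doesn’t walk through the math, but the standard sample-size formula behind a statistically valid survey is easy to sketch. This is the textbook formula for estimating a proportion, with a finite-population correction; the town sizes in the example are hypothetical:

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Minimum completed random responses for a statistically valid survey.

    z is the z-score for the confidence level (1.96 ~ 95%), margin is the
    acceptable error, and p=0.5 is the most conservative assumed split.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# A town of 2,000 households needs ~323 random completions for ±5% at 95%;
# a larger market of 10,000 households needs only slightly more.
print(sample_size(2_000))   # 323
print(sample_size(10_000))  # 370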

Non-biased Questions. Survey questions must be non-biased, meaning that they can’t lead a respondent towards a certain answer. A question like, “Do you want to save money on broadband?” is worthless because it’s hard to imagine anybody answering no to that question. It’s a lot harder to write non-biased questions than you might think, and bias can be a lot more subtle than that question.

Respondent Bias. People who conduct surveys know that there are some kinds of questions that many respondents won’t answer truthfully. For example, I’ve read that nearly half of applicants lie about their annual income when applying for a credit card. For various reasons people want others to think they earn more than they actually do.

Respondent bias can apply to a broadband survey as well. I’ve learned that you can’t rely on responses having to do with spending. For example, many respondents will under-report what they pay each month for broadband. Perhaps people don’t want the survey taker to think they spend too much.

Respondent bias is one of the reasons that political surveys are less reliable than surveys on more factual topics – respondents may not tell the truth about who they will vote for or how they feel about political issues. Luckily, most people are truthful when asked about non-emotional topics and factual questions, and we’ve found residential broadband surveys to be a great predictor of market interest in broadband.

Survey Fatigue. Respondents have a natural tendency to give up if a survey takes too long. They will hang up on a phone survey or start giving quick and inaccurate answers to get rid of somebody at their door. A survey ought to last no longer than 10 minutes, and the ideal length is closer to five minutes.

The big takeaway from this discussion is that doing a survey the wrong way will likely give you the wrong answer to the basic question of likely market penetration. You’re better off not doing a survey than doing one that is not statistically valid. I don’t know of anything more deadly in launching a new broadband market than false expectations about the number of customers who will buy broadband.

What’s the Future for CenturyLink?

I don’t know how many of you watch industry stock prices. I’m certainly not a stock analyst, but I’ve always tracked the stock prices of the big ISPs as another way to try to understand the industry. The stock prices for big ISPs are hard to compare because every big ISP operates multiple lines of business these days. AT&T and Verizon are judged more as cellular companies than as ISPs. AT&T and Comcast stock prices reflect that both are major media companies.

With that said, the stock price for CenturyLink has performed far worse than other big ISPs over the last year. A year ago a share of CenturyLink stock was at $19.24. By the end of the year the stock price was down to $15.44. As I wrote this blog the price was down to $10.89. That’s a 43% drop in share price over the last year and a 30% drop since the first of the year. For comparison, following are the stock prices of the other big ISPs and also trends in broadband customers:

              Stock Price    Stock Price               2018 Change in
              1 Year Ago     Now            % Change   Broadband Customers
CenturyLink   $19.24         $10.89         -43.4%     -262,000
Comcast       $32.14         $43.15         34.3%      1,353,000
Charter       $272.84        $377.89        38.5%      1,271,000
AT&T          $32.19         $30.62         -4.9%      -18,000
Verizon       $48.49         $56.91         17.4%      2,000

As a point of comparison to the overall market, the Dow Jones Industrial Average was up 4% over this same 1-year period. The above table is not meant to suggest a correlation between stock prices and broadband customers – customer gains are just one of dozens of factors that affect the performance of these companies.
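The percentage changes in the table are straightforward to reproduce – each is just the change in price divided by the year-ago price:

```python
# Recompute the % change column from the year-ago and current share prices.
prices = {
    "CenturyLink": (19.24, 10.89),
    "Comcast":     (32.14, 43.15),
    "Charter":     (272.84, 377.89),
    "AT&T":        (32.19, 30.62),
    "Verizon":     (48.49, 56.91),
}

for isp, (year_ago, now) in prices.items():
    change = (now - year_ago) / year_ago * 100
    print(f"{isp:12s} {change:6.1f}%")
```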

Again, I’ve never fully understood how Wall Street values any given company. In reading analyst reports on CenturyLink it seems that the primary reason for the drop in stock price is that all of the company’s business units are trending downward. In the recently released 1Q 2019 results the company showed a year-over-year drop in results for the international, enterprise, small and medium business, wholesale, and consumer business units. It seems that analysts had hoped that the merger with Level 3 would reverse some of the downward trends. Stock prices also dropped when the company surprised the market by cutting its dividend payment in half in February.

CenturyLink faces the same trends as all big ISPs – traditional business lines like landline telephone and cable TV are in decline. Perhaps the most important trend affecting the company is the continued migration of broadband customers from copper-based DSL to cable company broadband. CenturyLink is not replacing the DSL broadband customers it’s losing. In 2018 CenturyLink lost a lot of broadband customers with speeds under 20 Mbps, but had a net gain of customers using more than 20 Mbps. CenturyLink undertook a big fiber-to-the-home expansion in 2017 and built fiber to pass 900,000 homes and businesses – but currently almost all expansion of last-mile networks is on hold.

It’s interesting to compare CenturyLink as an ISP with the big cable companies. The obvious big difference is the trend in broadband customers and revenues. Where CenturyLink lost 262,000 broadband customers in 2018, the two biggest cable companies each added more than a million new broadband customers for the year. CenturyLink and other telcos are losing the battle of DSL versus cable modems with customers migrating to cable companies as they seek faster speeds.

It’s also interesting to compare CenturyLink to the other big telcos. From the perspective of being an ISP, AT&T and Verizon are hanging on to total broadband customers. Both companies are also losing the DSL battle with the cable companies, but each is adding fiber customers to compensate for those losses. Both big telcos are building a lot of new fiber, mostly to provide direct connectivity to their own cell sites, but secondarily to then take advantage of other fiber opportunities around each fiber node.

Verizon has converted over a hundred telephone exchanges in the northeast to fiber-only and is getting out of the copper business in urban areas. Verizon has been quietly filling in its FiOS fiber network to cover the copper it’s abandoning. While nobody knows yet if it’s real, Verizon has also been declaring big plans to expand into new broadband markets using 5G wireless loops.

AT&T was late to the fiber game but has been quietly yet steadily adding residential and business fiber customers over the last few years. They have adopted a strategy of chasing pockets of customers anywhere they own fiber.

CenturyLink had started down the path to replace DSL customers when they built a lot of fiber-to-the-home in 2017. Continuing with fiber construction would have positioned the company to take back a lot of the broadband market in the many large cities it serves. It’s clear that the new CenturyLink CEO doesn’t like the slow returns from investing in last-mile infrastructure and it appears that any hopes to grow the telco part of the business are off the table.

Everything I read says that CenturyLink is facing a corporate crisis. Diving stock prices always put a strain on a company. CenturyLink faces extra pressure since the activist investor group Southeastern Asset Management holds more than a 6% stake in the company and made an SEC filing arguing that the company’s fiber assets are undervalued.

The company has underperformed its peers ever since it was spun off from AT&T as US West. It then went through what turned out to be a disastrous merger with Qwest. There was hope a few years back that the merger with CenturyLink would help to right the company. Most recent was the merger with Level 3, and at least for now that hasn’t made a big difference. It’s been reported that CenturyLink has hired advisors to consider whether to sell or spin off the telco business unit. That analysis has just begun, but it won’t be surprising to hear about a major restructuring of the company.

Setting the Right Goals for Grants

Most past federal broadband grant programs had very specific goals. For example, the USDA Community Connect grants that have been around for many years target grants to the poorest parts of the country – the awards are weighted towards communities with the highest levels of poverty. For any grant program to be effective the goals of the program need to be clearly defined, and then the award process needs to be aligned with those goals.

The FCC needs to define the goals of the upcoming $20.4 billion grant program. If the goals are poorly defined, then the resulting grant awards are likely to be all over the board in terms of effectiveness. What are the ideal goals for a grant program of this magnitude?

The first goal to be decided is the scope of the coverage – will the goal be to bring somewhat better broadband to as many households as possible, or will it be to bring a long-term broadband solution to a smaller number of households? If the goal is to serve the most households possible, then the grants are going to favor lower-cost technologies and the grants will likely go to the wireless providers and satellite providers – as we saw happen in the recent CAF II reverse auction.

If the grants are aimed at a more permanent solution, then the grants will favor fiber. Perhaps the grants could also go towards anybody willing to extend a cable hybrid fiber-coaxial network into rural areas – but no other technology can be considered a permanent solution.

There are huge consequences for choosing the first option of serving as many households as possible. These new grants are mostly going to be awarded in the geographic areas covered by the original CAF II program. That program awarded over $11 billion to the big telcos to beef up broadband to speeds of at least 10/1 Mbps. Now, before that program is even finished, the FCC is talking about overbuilding those same areas with another $20 billion grant program. If this grant program is used to upgrade homes to fixed wireless, it doesn’t take a crystal ball to understand that ten years from now we’ll be talking about overbuilding these areas again with fiber. It would be incredibly wasteful to use multiple rounds of grants to upgrade the same geographic areas several times.

The other big issue for these grants to deal with is defining which parts of the country are eligible for the grants. What should be the criteria to decide which homes can be upgraded?

If the test is going to be related to existing speeds, the FCC is going to have to deal with the existing broadband coverage maps that everybody in the industry knows to be badly flawed. The FCC is talking about tackling a new mapping effort – but it’s highly likely that the new maps will just swap old mapping errors for new mapping errors. The reality on the ground is that it’s virtually impossible to map the real speeds on copper or fixed wireless networks. In real life, two rural neighbors can have drastically different speeds due to something as simple as being on different copper pairs. It’s impossible to accurately map DSL or wireless broadband coverage.

To make matters even worse, the current Re-Connect grants are saddled with a rule that says that no more than 10% of grant-covered homes can have existing broadband of more than 10/1 Mbps. Layering that kind of rule on top of terrible maps creates an environment where an ISP is largely unable to define a believable grant footprint.

The FCC must figure out some way to rectify the mapping problem. One of the easiest ways is what I call the technology test – anybody that wants to overbuild copper with fiber should automatically be eligible without trying to figure out the current speeds on the copper. Perhaps the easiest rule could be that any place where there is telco copper and no cable company network should be grant-eligible for fiber overbuilders.

Assuming the grants won’t all go to fiber, then there has to be an alternate way for an ISP or a community to challenge poor maps. Perhaps the FCC needs to provide a realistic time frame to allow local governments to demonstrate the actual speeds in an area, much like what was done in the recent Mobility II grant process.

This blog is part of a series on Designing the Ideal Federal Grant Program.

Designing the Ideal Federal Broadband Grant Program

In April, FCC Chairman Ajit Pai announced a new rural broadband initiative that will provide $20.4 billion of new funding. We don’t know many details yet, but here are a few things that will likely be involved in awarding the funding:

  • The FCC is leaning towards a reverse auction.
  • The program will likely require technologies that can deliver at least 25/3 Mbps broadband speeds.
  • The program will be funded within the existing Universal Service Fund, mostly by repositioning the original CAF II plan.
  • The grants might all be awarded at once, similar to A-CAM and CAF II awards, meaning that there might be only one chance to apply, with the awards to be paid out over a longer time period.

I’m writing a series of blogs that will examine the ideal way to design and administer a grant program of this size. We’ve seen both good and disastrously bad federal broadband programs before, and I’m hoping the FCC will take some time to make this grant program one of the effective ones. I’m sure the details of this new program are not yet set in stone, and folks in rural America need to make their voices heard now if they want some of this money to benefit their communities.

I’m going to look at the following topics, and perhaps more as I write this. At the end of this process I’ll post a whitepaper on my website that consolidates all of these discussions into one document.

A well-designed broadband grant program of this magnitude should consider the following:

What is the End Goal?

It’s important up-front for the FCC to determine how the grant moneys are to be used. The best grant programs have a specific goal, and then the application and award process is designed to best meet the goals. The goal can’t be something as simple as ‘promote rural broadband’, because a goal that simplistic is bound to create a hodgepodge of grant awards.

What Broadband Speeds Should be Supported?

This is an area where the FCC failed miserably in the past. The agency awarded over $11 billion in the CAF II program to upgrade broadband to speeds of only 10/1 Mbps. When the FCC set the 10/1 Mbps requirement, it didn’t even meet the agency’s own definition of broadband. How should the FCC determine eligible speeds this time to avoid a repeat of the CAF II debacle?

Who Should be Eligible?

FCC programs in the past have usually made the monies available to a wide range of recipients. However, the specific details of the grant programs have often made it hard for whole classes of entities like cities or counties to accept the awards. As an example, there are many entities electing to not participate in the current Re-Connect grant program because they can’t accept any part of the awards that include RUS loans.

Is a Reverse Auction the Right Mechanism?

The FCC and numerous politicians currently favor reverse auctions. Like any mechanism, there are situations where reverse auctions are a great tool and others where they distort the award process. Are reverse auctions a good tool for this grant program?

Other Issues

There are two drastically different ways to hand out these grants. One is to follow the CAF II mechanism and award all of the $20 billion in one huge auction and then pay it out over 6 or 8 years. The other would be to divide the award money into even tranches and have a new grant award for each of those years.

In the recent Re-Connect grants the FCC decided to blend grants and loans. I know the loan component stopped most of my clients from pursuing these grants. Should there be a loan component of the grants?

There are also technical issues to consider. I had clients who were outbid in the recent CAF II reverse auction by wireless companies that gained bidding preference by promising that their fixed wireless networks could achieve across-the-board 100 Mbps broadband. I still don’t know of a wireless technology that can do that over a large footprint. How should the FCC make sure that technologies deliver what’s promised?

What’s the Role of States in this Process?

What should states be doing to maximize the chance for federal grant money to be awarded to their state?

This blog is part of a series:

Setting the Right Goals for Grants

Speed Goals for FCC Grants

Who Should be Eligible for Grants?

Are Reverse Auctions the Right Mechanism?

What Technology Should be Covered?

State’s Role in Broadband Grants

Summary and Conclusions