A Surprise

I think my biggest industry surprise of the last year happened recently when I opened the front door and found that a new yellow pages directory had been placed on my porch. I haven’t received a yellow pages directory in the last seven years living in the US or in the decade before that living in the Virgin Islands. I hadn’t given it much thought, but I assumed the yellow pages were dead.

The yellow pages used to be a big deal. Salespeople would canvass every business in a community and sell ads for the annually produced book. I remember that when I lived in Maryland, the Yellow Pages was at least three inches thick just for the Maryland suburbs of DC, and that there were similar volumes for other parts of the DC metropolitan area.

Wikipedia tells me that the yellow pages were started by accident in Cheyenne, Wyoming in 1883 when a printer ran out of white paper and used yellow in printing a directory. The idea caught on quickly and Reuben H. Donnelley printed the first official Yellow Pages directory in 1886.

Yellow pages directories became important to telephone companies as a significant source of revenue. The biggest phone companies produced their directories internally through a subsidiary. For smaller telcos, outside vendors like Donnelley sold the ads and printed the directories and shared the ad revenues with the phone company. The revenue stream became so lucrative in the 1970s and 1980s that many medium-sized telephone companies took the directory function in-house – only to find out how hard it was to sell ads to every business in a market. The market for yellow pages got so crazy that competing books were created for major metropolitan markets.

Yellow pages were a booming business until the rise of the Internet. The Internet was supposed to replace the yellow pages. The original yellow pages vendors moved the entire yellow page directories online, but this was never a big hit with the public. It was so much easier to leaf through a directory, circle numbers of interest, and take notes in a paper copy of the directory than it was to scroll through pages of listings online.

Merchants always swore that yellow page ads were effective. A merchant that was creative in getting listed in the right categories would get calls from all over a metropolitan area if they sold something unique.

Of course, there was also a downside to yellow pages. The yellow paper and the glue used to bind the thick books meant that the paper wasn’t recyclable, so a huge pile of books ended up in landfills every year when the new books were delivered. After the directories lost some of their importance, many cities required that directories be delivered only to homes that asked for them in order to reduce the pile of paper going into landfills.

Yellow pages are just another aspect of telephony that has largely faded away. There was a time when you saw the yellow pages sitting somewhere near the main telephone in every home you visited. It’s something that we all had in common – and something that consumers found invaluable. A new business knew it had made it when it saw its name listed in the yellow pages for the first time.

The Accessible, Affordable Internet Act for All – Part 2

This is the second look at the Accessible, Affordable Internet Act for All sponsored by Rep. James E. Clyburn from South Carolina and Sen. Amy Klobuchar from Minnesota. The first blog looked at the problems I perceive from awarding most of the funding in a giant reverse auction.

In a nutshell, the bill provides $94 billion for broadband expansion. A huge chunk of the money would be spent in 2022, with 20% of the biggest funds deferred for four years. There are other aspects of the legislation worth highlighting.

One of the interesting things about the bill is the requirements that are missing. I was surprised to see no ‘buy American’ requirement. While this is a broadband bill, it’s also an infrastructure bill and we should make sure that infrastructure funding is spent as much as possible on American components and American work crews.

While the bill has feel-good language about hoping that ISPs offer good prices, there is no prohibition that I can find against practices like data caps imposed in grant-funded areas that can significantly increase monthly costs for a growing percentage of households.

The most dismaying omission from the bill is any accountability for anybody accepting the various federal grant funds. Many state grant programs come with significant accountability. ISPs must often submit proof of construction costs to get paid. State grant agencies routinely visit grant projects to verify that ISPs are building the technology they promised. There is no such accountability in the grants awarded by this bill, just as there was no accountability in the recent RDOF grants or the recently completed CAF II grants. In the original CAF II, the carriers self-certified that the upgrades had been made and provided no backup other than the certification itself. There is a widespread belief that much of the CAF II upgrades were never done, but we’ll likely never know since the telcos that accepted the grants don’t have any reporting requirements to show that the grant money was spent as intended.

There is also no requirement to report the market success of broadband grants. Any ISP building last-mile infrastructure should have to report the number of households and businesses that use the network for at least five years after construction is complete. Do we really want to spend over $90 billion on grants without asking the basic question of whether the grants actually helped residents and businesses?

This legislation continues a trend I find bothersome. It will require all networks built with grant funding to offer a low-income broadband product – which is great. But it then sets the speed of the low-income service at 50/50 Mbps while ISPs will be required to provide 100/100 Mbps or faster to everybody else. While it’s hard to fault a 50/50 Mbps product today, that’s not always going to be the case as homes continue to need more broadband. I hate the concept that low-income homes get slower broadband than everybody else just because they are poor. We can provide a lower price without cutting speeds. ISPs will all tell legislators that there is no difference in cost in a fiber network between a 50/50 Mbps and a 100/100 Mbps service. This requirement is nothing more than a backhanded way to remind folks that they are poor – there is no other reason for it that I can imagine.

One of the interesting requirements of this legislation is that the FCC gather consumer prices for broadband. I’m really curious how this will work. I studied a market last year where I gathered hundreds of customer bills, and I found almost no two homes being charged the same rate for the same broadband product. Because of special promotional rates, negotiated rates, bundled discounts, and hidden fees, I wonder how ISPs will honestly answer this question and how the FCC will interpret the results.

The bill allocates a lot of money for ongoing studies and reports. For example, there is a new biennial report that quantifies the number of households where cost is a barrier to buying broadband. I’m curious how that will be done in any meaningful way that will differ from the mountains of demographic data that show that broadband adoption has almost a straight-line relationship to household income. I’m not a big fan of creating permanent report requirements for the government that will never go away.

Taking the Short View

We need to talk about the insidious carryover impact of having a national definition of broadband speed of 25/3 Mbps. You might think that the FCC’s definition of broadband doesn’t matter – but it’s going to have a huge impact in 2021 on how we spend the sudden flood of broadband funding coming from the federal government.

First, a quick reminder about the history of the 25/3 definition of broadband. The FCC under Tom Wheeler increased the definition of broadband in 2015 from the paltry former definition of 4/1 Mbps – a sorely overdue upgrade. At the time that the new definition was set it seemed like a fair definition. The vast majority of US homes could comfortably function with a 25/3 Mbps broadband connection.

But we live in a world where household usage has been madly compounding at a rate of over 20% per year. More importantly, since 2015 we’ve changed the way we use broadband. Homes routinely use simultaneous broadband streams and a large and growing percentage of homes now find 25 Mbps download to be a major constraint on how they want to use broadband. The cable companies understood this, and to keep customers happy have upgraded their minimum download speeds from 100 Mbps to 200 Mbps.
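A quick back-of-the-envelope check shows why that compounding matters. The sketch below assumes a steady 20% annual growth rate (the real rate bounces around from year to year) and simply compounds a 2015 baseline forward:

```python
# Back-of-the-envelope sketch: how 20%-per-year growth in household broadband
# usage compounds. The steady 20% rate is an assumption for illustration only.
growth_rate = 0.20

usage = 1.0  # relative household usage in the 2015 baseline year
for year in range(2015, 2022):
    print(f"{year}: {usage:.2f}x the 2015 baseline")
    usage *= 1 + growth_rate
```

Six years of 20% growth works out to roughly triple the usage of 2015 – which is a big part of why a definition that seemed fair in 2015 feels badly outdated today.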

Then came the pandemic, which made the whole country focus on upload speeds. Suddenly, every student and every adult who tried to work at home learned that the upload stream on most broadband connections will just barely support one person working from home and is completely inadequate for homes where multiple people are trying to function online at the same time.

Meanwhile, the FCC under Chairman Ajit Pai ignored the reality of the big changes in the way that Americans use broadband. The FCC had multiple opportunities to increase the definition of broadband – including after the evident impact of the pandemic – but Chairman Pai stubbornly stuck with the outdated 25/3 definition. He did not want a legacy of suddenly declaring that many millions of homes didn’t have adequate broadband.

We now have an FCC that is likely to increase the definition of broadband, but the FCC is still waiting for a fifth Commissioner before holding a vote on the issue. Meanwhile, we are poised to start handing out billions of dollars of broadband subsidies from the $1.9 trillion American Rescue Plan Act. This includes $10 billion that is directly approved as state block grants for broadband, plus some portion of the larger $350 billion that is approved for more general infrastructure that can include broadband. I can promise you that this money is going to come encumbered in some form or fashion by the old definition of broadband. I can’t predict exactly how this will come into play, but there is no way that the Treasury Department, which is administering these funds, can ignore the official definition of broadband.

As much as federal officials might want to do the right thing, 25/3 Mbps is the current law of the land. The new federal monies are likely to emphasize serving areas that don’t have speeds that meet that 25/3 Mbps definition. Let me rephrase that to be more precise – federal broadband money will be prioritized to give funding to areas where ISPs have told the FCC that households can’t buy a 25/3 broadband product. Unfortunately, there are huge parts of the US where homes don’t get speeds anywhere close to 25/3 Mbps, but where ISPs are safe in reporting marketing speeds to the FCC rather than actual speeds. States like Georgia and North Carolina have estimated that the number of households that can’t buy a 25/3 Mbps broadband product is twice what is reported to the FCC.

What this all means is that we are going to decide where to spend billions in funding from the American Rescue Plan Act based upon the 25/3 Mbps definition of broadband – a definition that will not long survive a fully staffed FCC. The intransigence of Chairman Pai and the big ISPs that strongly supported him will carry over and have a huge impact even after he is gone. The broadband that will be built with the current funding will last for many decades – but unfortunately, some of this funding will be misdirected due to the government taking the short view that we must keep pretending that 25/3 Mbps is a meaningful measurement of broadband.

T-Mobile to Expand Rural Broadband Coverage

T-Mobile will be launching a marketing effort this month for a fixed LTE broadband product that it brands as 5G. This launch was a requirement of the merger with Sprint. In November 2019, T-Mobile agreed that within three years it would provide fixed cellular broadband covering 97% of the US population, with the goal increasing to 99% within six years. T-Mobile’s announcement of the new product says the company plans to extend the product to 97% of US households by the end of 2022, mirroring the agreement it made with the FCC for the merger.

The latest announcement doesn’t mention broadband speeds. In the 2019 deal with the FCC, T-Mobile promised that it would provide 100 Mbps cellular broadband speeds to 90% of households in the same three-year period, with the rest of T-Mobile’s commitment at 50 Mbps.

That speed commitment is going to be hard for T-Mobile to achieve. The 100 Mbps cellular speed is probably reachable in large cities. PC Magazine conducted its annual test of cellular speeds in summer 2020 in 26 cities and found the following average urban cellular speeds:

  • AT&T averaged 103/19 Mbps
  • Verizon averaged 105/22 Mbps
  • T-Mobile averaged 74/26 Mbps

The AT&T and Verizon average speeds were boosted by having a millimeter-wave service available in some downtowns. The average LTE speed for those carriers likely looks a lot more like T-Mobile’s after backing out this outdoor-only gimmick product that most people can’t receive. The PC Magazine speed tests verified what has been widely reported – that 5G speeds are slower on average than 4G LTE speeds.

The above speeds apply only in large metropolitan areas where the majority of cellular customers are within a mile of a cell site. It’s going to be a different story when measuring cellular speeds in rural America, where the average customer is likely to live several miles from a cellular tower.

There are still huge tracts of rural America that have little or no cellular coverage – and I guess the FCC counts them as the 3% that don’t have to be covered by T-Mobile. My experience in driving through rural America is that there are probably more than 3% of homes that have either no cellular, or extremely weak cellular, and who are not going to benefit from a cellular data product from any of the carriers.

My consulting firm has been conducting rural speed tests in communities around the country, and I rarely see a fixed-cellular customer on any of the three carriers with speeds over 20 Mbps – and speeds are often much slower. T-Mobile can make promises to the FCC about delivering 100 Mbps, but none of the cellular carriers are going to build a robust cellular network in rural America that can achieve the same speeds as in cities. They would be crazy to even think about it.

This doesn’t mean that rural homes shouldn’t start inquiring about the T-Mobile fixed cellular product – because it might be the fastest broadband in your neighborhood. T-Mobile has announced a $60 monthly price for customers willing to use autopay. The best news of the product is that data usage will be unlimited – a big relief to customers who have been paying a fortune for capped cellular broadband on a Verizon or AT&T hotspot.

T-Mobile says it currently has 100,000 customers nationwide on the fixed cellular product. It’s hoping to get 500,000 customers by the end of 2021, with a longer-term goal of 7 or 8 million customers within five years.

Theoretically, the broadband options for rural America should be getting better. We’re seeing beta trials with low-orbit satellite and announcements like this one from T-Mobile that promise something better than the telco DSL that barely works in rural places. I’d love to hear from folks about what is available where you live – because I’m not going to believe that T-Mobile will cover 97% of households without some proof. I hope my skepticism is misplaced.

I’m Looking to Hire a New Consultant

I’m looking to hire an Associate Consultant. This is a starting consulting position that will work directly with me. I spend interesting days on a wide variety of projects. My primary work is helping communities and ISPs look at opening new broadband markets. I also help ISPs find funding. I work with states and foundations in developing broadband policies. And I work on a number of interesting projects each year that are hard to categorize – I help clients solve problems. I also spend a lot of time responding to RFPs or writing proposals to provide consulting services.

This is a position with big growth potential in both knowledge and earnings. I’m looking for somebody who is willing to learn the intricacies of the broadband industry – you’ll be working with an industry insider and pro who knows a lot about almost all aspects of the industry.

The two traits I value the most are the ability to write clearly and the ability to tackle complex spreadsheets. I realize that’s an uncommon pair of talents for one person, but I run an uncommon business. I promise the right individual an interesting workday.

You’ll find a more detailed job description here.

State versus Federal Broadband Grants

The recently enacted $1.9 trillion American Rescue Plan Act includes some interesting broadband funding. For example, there is $500 million aimed at infrastructure for Tribes, supplementing the $1 billion of other grants also earmarked for this purpose.

But the biggest broadband news from this Act is the “Coronavirus Capital Projects Fund” to be distributed by the Department of the Treasury. This is $10 billion in funding that is largely earmarked for broadband. The money will be awarded in the form of state block grants – each state will get a block grant of between $100 million and perhaps $500 million.

This award by Congress contrasts sharply with the massive broadband money proposed in the Accessible, Affordable Internet Act for All sponsored by Rep. James E. Clyburn from South Carolina and Sen. Amy Klobuchar from Minnesota. That bill proposes spending an astonishing $79.5 billion on broadband infrastructure. Over $60 billion would be used for one gigantic federal reverse auction. There is also funding in this bill that would give another $100 million to each state for broadband expansion – but this money would be constrained, and states would have to follow rules established by Congress. My reading of the bill’s language is that it would require states to also hold a reverse auction, which would take away any state authority over how the money is spent.

The current American Rescue Plan Act funding is a true block grant to the states. The Act doesn’t prescribe a lot of rules on how states can use the money, unlike many other federal broadband funding mechanisms. It’s likely that, within reason, states can decide how best to spend the money. The federal legislation does say that the money needs to be spent on capital projects. The money could go to build new broadband infrastructure. The funding could instead go to connect low-income homes or student homes to existing broadband networks. Each state is free to use the money in the way it finds most useful.

The difference between these two sources of federal funding raises a huge policy question. Should we trust states or the federal government to spend broadband grant money? I’ve been giving this a lot of thought for the last month and I come down on the side of state block grants for broadband, for a lot of reasons:

  • States know where the pain points are in terms of areas that most need a broadband solution. The federal government will instead make decisions on where to spend grant money based upon the egregiously erroneous FCC mapping data.
  • We have already seen that there is a lot of accountability for ISPs that accept state grants. ISPs generally must furnish proof, such as construction cost receipts, that the broadband facilities were constructed, and states often visit and inspect grant projects. Contrast this with FCC grants that have almost no accountability and that hand money to ISPs and hope the ISPs do the right thing.
  • State grant programs will likely make a judgment call about the ability of an ISP to complete a grant project. States will weigh the solvency and balance sheet of a grant recipient and won’t give large sums of money to a company that has never tackled a large-dollar project before. A state grant program might be leery of funding a start-up that doesn’t have customers or of funding a technology that a grant recipient hasn’t built and operated before. States also make judgment calls on technology – for example, some states only fund wired gigabit technologies.
  • State grants often don’t insist that grant projects pay prevailing wages – city labor rates – for projects where numerous contractors are willing to accept lower market labor rates. States want grant dollars to spread as far as possible and to help as many households as possible.

I could go on with this list and show many other ways where states are likely to be more responsible with grant dollars than the FCC. State grants are less likely to waste dollars, and states are a lot less likely to give grants to bad actors or bad technologies.

This is not to say that state grants will be perfect – because governments never are. There will be states where politics will be the primary driver behind who gets funded. We don’t have to look any further than the $100 million in CASF grants in California in 2016 to see a program where all of the state grant money was handed to AT&T and Frontier. Some states will make poor funding decisions – but most will not.

This leads to the most fundamental reason I favor state block grants. If a state screws up a grant funding program, voters who want better broadband will have somebody to blame and vote out of office. What’s the accountability to the public if the FCC duplicates the failures of the RDOF grants in a bigger reverse auction? Nobody gets to vote against the FCC.

It’s really hard for me to come up with even a single reason that would argue in favor of federal broadband grants over state grants. I hope the folks in Congress who are likely to decide this issue take a little time and think through the issues I’ve discussed in this blog. Federal legislators tend to like federal solutions – and like having their names attached to grant funding bills. But the chances are high that the coming big burst of infrastructure dollars will be the last money given to solve rural broadband for the foreseeable future, so let’s please not screw this up.

Cord Cutting Continues 4Q 2020

The largest traditional cable providers collectively lost over 1.3 million customers in the fourth quarter of 2020 – an overall loss of 1.7% of customers. To put the quarter’s loss into perspective, the big cable providers lost 14,158 cable customers per day throughout the quarter.

The numbers below come from Leichtman Research Group which compiles these numbers from reports made to investors, except for Cox which is estimated. The numbers reported are for the largest cable providers, and Leichtman estimates that these companies represent 95% of all cable customers in the country.

Following is a comparison of the fourth quarter subscriber numbers compared to the end of the third quarter of 2020:

                       4Q 2020     3Q 2020       Change   % Change
Comcast             19,846,000  20,094,000    (248,000)      -1.2%
Charter             16,200,000  16,235,000     (35,000)      -0.2%
DirecTV             13,000,000  13,600,000    (600,000)      -4.4%
Dish TV              8,816,000   8,965,000    (149,000)      -1.7%
Verizon              3,927,000   4,000,000     (73,000)      -1.8%
Cox                  3,650,000   3,710,000     (60,000)      -1.6%
AT&T TV              3,505,000   3,500,000        5,000       0.1%
Altice               2,961,000   3,035,100     (74,100)      -2.4%
Mediacom               643,000     663,000     (20,000)      -3.0%
Frontier               485,000     518,000     (33,000)      -6.4%
Atlantic Broadband     318,387     317,787          600       0.2%
Cable One              261,000     277,000     (16,000)      -5.8%

Total               73,612,387  74,914,887  (1,302,500)      -1.7%
Total Cable         43,879,387  44,331,887    (452,500)      -1.0%
Total Satellite     21,816,000  22,565,000    (749,000)      -3.3%
Total Telco          7,917,000   8,018,000    (101,000)      -1.3%
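The per-day figure cited at the top of this blog falls straight out of these totals. Here is the simple arithmetic, assuming the 92 calendar days of the fourth quarter:

```python
# Sketch of the arithmetic behind the quarterly loss figures above.
# Assumes the 92 calendar days of 4Q 2020 (October 1 - December 31).
subs_4q = 73_612_387
subs_3q = 74_914_887
days_in_quarter = 92

net_loss = subs_3q - subs_4q
print(f"Net loss for the quarter: {net_loss:,}")            # 1,302,500
print(f"Loss per day: {net_loss / days_in_quarter:,.0f}")   # ~14,158
print(f"Percent change: {-net_loss / subs_3q * 100:.1f}%")  # ~-1.7%
```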

Some observations about the numbers:

  • The big loser continued to be AT&T, which lost a net of 595,000 traditional video customers between DirecTV and AT&T TV (relabeled from AT&T U-verse).
  • The big percentage loser continues to be Frontier which lost 6.4% of its cable customers in the quarter, followed closely by Cable One.
  • This is the eighth consecutive quarter that the industry lost over one million cable subscribers.

To put these losses into perspective, the industry has seen huge customer losses over the last two years. At the end of 2018, these same companies had over 85.4 million customers. That dropped to 79.5 million by the end of 2019 and to 73.6 million by the end of 2020 – a loss of 11.8 million customers over two years.

The big losses in cable subscribers happened at the same time that the biggest ISPs in the country were adding a lot of customers. The biggest ISPs added almost 900,000 new subscribers in the fourth quarter of 2020 and added 4.6 million new broadband customers during 2020.

In earlier quarters of 2020, we saw that a lot of customers dropping traditional video were switching to online versions of the full cable line-up. That didn’t carry into the fourth quarter of 2020 where the combination of Hulu plus Live TV, Sling TV, AT&T TV, and FuboTV collectively lost 18,000 customers.

Demystifying Fiber Terminology

It’s common when a community is getting fiber to have the engineers tossing around technical terms that a layperson is not going to understand. Today’s blog will try to demystify the more common fiber terminology that you’ll likely hear.

Absorption: This is the phenomenon where the natural impurities in glass absorb some of the light signal inside of a fiber path.

ADSS (all-dielectric self-supporting): This is a hardened fiber cable that can be hung directly without a supporting messenger wire. It is primarily used for hanging fiber near electric wires (it avoids having a metal wire close to the power lines).

Attenuation: This is a term that defines the amount of light that is lost during a fiber transmission due to absorption and scattering. Attenuation is usually measured in decibels (dB) per kilometer.

Attenuator: This is a device that is used to purposefully reduce signal power in a fiber optic link.

Back Reflection (BR): This refers to any situation that causes the light signal inside a fiber to change direction. The most common form of back reflection happens at an interface between a lit fiber and air.

Buffer: This is the protective outer layer of material that is in direct contact with the fiber. Manufacturers offer a wide range of different buffering materials.

Insertion Loss: This describes the phenomenon where the light signal gets interrupted and diminished when the light signal encounters a fiber splice point or some electronic component in the network.

Messenger: This refers to a galvanized steel cable that is strung between poles and which is used to support fiber cable.

Multimode: This is a fiber with a core large enough to carry multiple modes (paths) of light at the same time. Multimode fibers have larger cores than single-mode fiber and come in two typical sizes – 50 µm (microns) or 62.5 µm, compared to the 8-10 µm core of single-mode fiber. Multimode fiber is most commonly used for short transmission distances.

Return Loss: This measures how much of the light signal is reflected back toward the source, expressed in decibels relative to the incident power. The higher the return loss, the less light is being reflected – so higher is better.

Scattering: This is the other primary reason for signal loss in a fiber (along with absorption). Scattering occurs when light collides with small particles in the fiber path.

Single Mode: This is a fiber with a small core size of 8-10 µm (microns) that carries only a single mode (path) of light. This fiber is used to transmit signals over long distances at high speeds.

Wavelength: This describes the color of light (the inverse of its frequency), expressed in microns or nanometers. The most typical wavelengths used in fiber optics are 850 nm, 1310 nm, and 1550 nm.
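Several of these terms come together when an engineer works up a simple link loss budget. The sketch below is purely illustrative – the launch power, attenuation figure, and splice and connector losses are assumed values for the example, not a standard – but it shows how attenuation (in dB/km) and insertion loss combine to determine how much light arrives at the far end of a fiber:

```python
# Illustrative fiber link loss budget. All values below are assumptions
# chosen for the example, not specifications for any particular network.
launch_power_dbm = 3.0        # transmitter output power
attenuation_db_per_km = 0.35  # assumed single-mode loss near 1310 nm
distance_km = 20.0
splice_loss_db = 0.1          # assumed insertion loss per fusion splice
num_splices = 8
connector_loss_db = 0.5       # assumed insertion loss per connector pair
num_connectors = 2

total_loss_db = (attenuation_db_per_km * distance_km
                 + splice_loss_db * num_splices
                 + connector_loss_db * num_connectors)

received_power_dbm = launch_power_dbm - total_loss_db
print(f"Total loss: {total_loss_db:.1f} dB")            # 8.8 dB
print(f"Received power: {received_power_dbm:.1f} dBm")  # -5.8 dBm

# A receiver with, say, a -28 dBm sensitivity would leave a comfortable
# margin on this link; the same math flags links that are too long or
# have too many splices before any fiber is ever buried.
```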

High Precision GPS

Sometimes important changes for our industry come from outside the industry. We’re getting close to seeing affordable GPS devices that are accurate to within a centimeter. Higher-precision GPS will be invaluable for ISPs and broadband technologies.

Normal GPS isn’t highly accurate. For example, a GPS-enabled smartphone is only accurate to within 4.9 meters (16 feet) under the open sky. Accuracy is even worse around tall buildings, trees, bridges, or other obstacles that can block or deflect signals from satellites. This is plenty of accuracy for providing driving directions, which is the use of GPS that most people are familiar with – although occasionally you’ll get bad driving directions in a major city center when your mapping software thinks you’re on a different street.

Applications like driving directions use a single frequency, with the GPS device (smartphone or car) connecting to a single GPS satellite. The GPS satellites operated by the government can theoretically provide accuracy to within 2.3 feet. But accuracy is reduced by local factors such as atmospheric conditions, signal blockage, the quality of the receiver, and the position of the satellite in relation to the user. All of these factors contribute to the lessened accuracy of the normal cellphone or car GPS unit.

High-precision GPS has been around for a while, but the current generation of high-precision technology has not been suitable for applications like driving. High-precision GPS devices work by using multiple frequencies and connecting to two GPS satellites. The technology uses complex mathematical models to calculate precise locations. This would normally require three signals (triangulation), but the devices do a good job of determining position based upon two signals. High-precision devices are also expensive to operate, with an annual subscription as high as $1,000 per device.

The new generation of GPS devices will overcome several of these major shortfalls. The new devices do away with the need for two frequencies. That limitation meant that high-precision devices wouldn’t work while moving in a car – the needed mathematical calculations couldn’t keep up in a moving receiver.

The new devices instead use a clever solution. Each device can create a real-time model of the environment that incorporates all of the known factors in the region that affect GPS accuracy. In essence, a single cellphone is preloaded with a simulation of the GPS environment, and the cellphone can then correct for expected distortions in the GPS measurement – meaning much higher accuracy in locations. These regional models are updated during the day to account for changes in temperature and weather and are beamed to any device trying to use GPS.
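Conceptually, the correction step looks something like the sketch below. This is a deliberately simplified illustration – the correction grid, the numbers in it, and the raw fix are all invented for the example, and real correction services adjust the underlying satellite signal measurements rather than a finished latitude and longitude – but it shows the basic idea of a device applying a preloaded regional error model to its own position:

```python
import math

# Simplified sketch of applying a preloaded regional correction model to a
# raw GPS fix. The grid values and the raw fix are invented for the example.

# Regional model: expected position error (meters east, meters north) at a
# few reference points, beamed to the device and refreshed during the day.
correction_grid = {
    (35.0, -80.0): (1.8, -2.3),
    (35.0, -79.0): (1.5, -2.1),
    (36.0, -80.0): (2.0, -2.6),
    (36.0, -79.0): (1.7, -2.4),
}

METERS_PER_DEGREE_LAT = 111_320  # rough conversion, good enough for a sketch

def nearest_correction(lat, lon):
    """Pick the correction at the closest grid point (a real model would
    interpolate between points)."""
    point = min(correction_grid, key=lambda p: (p[0] - lat) ** 2 + (p[1] - lon) ** 2)
    return correction_grid[point]

def apply_correction(lat, lon):
    """Subtract the modeled error from a raw GPS fix."""
    east_err_m, north_err_m = nearest_correction(lat, lon)
    corrected_lat = lat - north_err_m / METERS_PER_DEGREE_LAT
    corrected_lon = lon - east_err_m / (METERS_PER_DEGREE_LAT * math.cos(math.radians(lat)))
    return corrected_lat, corrected_lon

print(apply_correction(35.4, -79.6))  # invented raw fix
```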

Higher-precision GPS opens up applications that were unattainable in the past. The simplest application will be precision locations for things like handholes. Technicians will no longer need to search through tall grass along a rural road to find a handhole because, with the new GPS, they’ll know the exact location.

Better GPS will be invaluable in locating existing utilities and in siting new buried construction. An engineer who first walks a route can define exactly where to dig or where to place a buried fiber.

Better GPS will be invaluable to rural broadband services like precision agriculture. Once a farmer precisely maps a field, they can tell a self-driving tractor or harvester exactly where to drive without needing a driver in the cab of each device.

And better GPS will help daily work functions in numerous ways we will discover once it’s routinely available. Somebody will be able to survey a site for a new hut and precisely define the location of a concrete pad or a fence without having to return to oversee the construction process. Companies will be able to precisely tell a homeowner where to find a buried conduit without having to always wait for a locator. We’ll quickly get used to more precise field measurements just like we all quickly adapted to trusting GPS driving directions.

Grant Accountability

I was listening in on a webinar the other day and heard the comment that the RDOF grants don’t include any requirement to serve customers. Winners of the grants are required to build networks according to a specific timeline, but there is no requirement that they market and sell to anybody. I went back and read the grant requirements, and this is absolutely true. It turns out that this was also the case for many other federal grants in the last decade. The BTOP grants and a few others had requirements to serve anchor institutions, but most federal grants don’t have any specific requirements for serving the public.

You might ask why this matters – after all, doesn’t a grant recipient want to use the money to attract new customers and gain new revenues? Unfortunately, I can think of examples where this was not the case. Consider AT&T and the CAF II grants. AT&T claimed to meet most of its CAF II requirements by asserting that rural DSL customers could switch to a wireless broadband product supplied from AT&T cellular towers. But for much of rural America, this wireless product was a fiction. I wrote a blog a few years ago about a guy in Georgia who called AT&T continuously for nearly a year until he finally found somebody who had even heard of the product. Even then, the installer who showed up to install the product was from hundreds of miles away. AT&T met its CAF II requirements with a product that it didn’t even bother to tell its customer service reps about.

I now look back and wonder why the FCC didn’t include a requirement to advertise and notify customers in the CAF II grants. This would have made it a lot harder for telcos like Frontier and CenturyLink to fake the CAF II upgrades. The FCC could have required that grant recipients notify each existing DSL customer when faster DSL speeds were available and to also advertise in local newspapers, with maps, as the CAF II upgrades were completed. Can you imagine the public uproar had these telcos been forced to make public claims that the grant upgrades were complete, if they weren’t? A requirement to advertise the completion of the CAF II upgrades would have provided real-time feedback to the FCC from the public about whether DSL speeds were actually improved.

I can foresee this same situation with the RDOF grants. As an example, Starlink has no obligation to serve anybody in the areas where the company will receive nearly a billion dollars of grant money. They aren’t required to spend some of the grant money to advertise in these areas and they aren’t required to give customers in the grant areas any better priority for satellite broadband than customers that live outside the grant areas. I bet that years from now we’ll find out that satellite penetrations are no higher where Starlink got the grants than elsewhere in the country – and maybe even lower since some of the grant areas in the Appalachians don’t look friendly for satellite reception.

Without a requirement to market and sell and to notify the public, there is no reason that other RDOF grant recipients can’t take shortcuts. A few grant recipients might make little or no upgrades like the telcos did in CAF II. It wouldn’t be hard for a grant recipient to build only a portion of a grant award area as a way to shave costs – and hope that nobody notices.

As bad as it is for a grant recipient not to have to notify the public when grant construction is completed, there is also no requirement for a grant recipient to give the FCC any feedback on how many households actually purchase the improved broadband. The FCC grabs a lot of glory when it announces a grant – but then has no requirement for feedback showing that the public benefited from the grant.

These are all reporting shortcomings that the FCC can still rectify with the RDOF grants so that we don’t have a repeat of the CAF II fiasco. One of the unique features of the RDOF grants is that the grants are geographically specific. The grant areas are mapped down to the street level. Grant recipients should easily be able to notify households when grant work is completed and then count customers that benefit after grant completion.

Interestingly, most state grants have accountability. It’s not unusual for state grant offices to want to see construction receipts and to also send somebody out to verify that the construction was done. The FCC has completely ignored grant accountability, which is the primary reason that the CAF II grant work went largely undone. It’s time for the FCC to build some basic accountability into federal grants so that we stop handing out checks to carriers and then hoping they’ll do the right thing.