Gaining Access to Multi-tenant Buildings

In 2007 the FCC banned certain kinds of exclusivity arrangements between ISPs and owners of multi-tenant buildings. At the time of the order, the big cable companies had signed contracts with apartment owners giving them exclusive access to buildings. The 2007 order got rid of the most egregious types of contracts; in many cases, cable company contracts were so convoluted that building owners didn't even understand the agreements were exclusive.

However, the FCC order was still a far cry from mandating open access to buildings, and many landlords still won't allow in competitors. The arrangements most liked by landlords are revenue-share deals, where the building owner makes money from an arrangement with an ISP. While such arrangements aren't legally exclusive, they can be lucrative enough to make a landlord favor one ISP and give it exclusive access in practice.

WISPA, the industry association for wireless ISPs, has asked the FCC to force apartment owners to allow access to multiple ISPs. WISPA surveyed its members and found that wireless companies are routinely denied access to apartment buildings. Some of the reasons for denying access include:

  • Existing arrangements with ISPs that make the landlord not want to grant access to an additional ISP.
  • Apartment owners often deny access because wireless ISPs (WISPs) are often not considered to be telephone or cable companies – many WISPs offer only broadband and have no official regulatory status.
  • Building owners often say that an existing ISP serving the building has exclusive rights to the existing wiring, including conduits that might be used to string new wiring to reach units. This is often the case if the original cable or telephone company paid for the inside wiring when the building was first constructed.
  • Many landlords say that they already have an existing marketing arrangement with an ISP, meaning they get rewarded for sending tenants to that ISP.
  • Many landlords will only consider revenue-sharing arrangements since that's what they have with an existing ISP. Some landlords have even insisted that a WISP sign a revenue-sharing arrangement before even discussing pricing and logistics.

These objections by landlords fall into two categories. One is compensation-based, where a landlord is happy with the financial status quo with an existing ISP. The other is some contractual relationship with an existing ISP that is hard or impossible for a landlord to preempt.

The concerns of WISPs are all valid, and in fact, the same list can be made by companies that want to build fiber to apartment buildings. However, landlords seem more open to fiber-based ISPs since saying that their building has fiber adds cachet and is valued by many tenants.

WISPs sometimes have unusual issues not faced by other ISP overbuilders. For example, one common wireless model is to beam broadband to a roof of an apartment building. That presents a challenge for gaining access to apartments since inside wiring generally begins in a communications space at the base of a building.

The issue is further clouded by the long history of FCC regulation of inside wiring. The topic of ownership and rights for inside wiring has been debated in various dockets since the 1990s and there are regulatory rulings that can give ammunition to both sides of wiring arguments.

The WISPs are facing an antagonistic FCC on this issue. The agency recently preempted a San Francisco ordinance that would have made all apartment buildings open access – meaning available to any ISP. This FCC has been siding with large incumbent cable and telephone companies on most issues and is not likely to go against them by allowing open access to all apartment buildings.

Unlicensed Millimeter Wave Spectrum

I haven’t seen it talked about a lot, but the FCC has set aside millimeter wave spectrum that can be used by anybody to provide broadband. That means that entities will be able to use the spectrum in rural America in areas that the big cellphone companies are likely to ignore.

The FCC set aside the V band (60 GHz) as unlicensed spectrum. This band provides 14 GHz of contiguous spectrum available for anybody to use. The band is interesting, but it comes with a notable drawback: 60 GHz coincides with a natural resonance of oxygen molecules, so the signal is absorbed in the open air more readily than other bands of millimeter wave spectrum. In practice, this will somewhat shorten delivery distances for the V band.
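As a rough illustration of why the oxygen absorption matters, here is a simple link-budget comparison. The 15 dB/km absorption figure for 60 GHz and the choice of 28 GHz as a comparison band are illustrative assumptions, not engineering values for any particular radio.

```python
import math

def path_loss_db(distance_km, freq_ghz, oxygen_db_per_km=0.0):
    """Free-space path loss plus a flat atmospheric absorption term."""
    fspl = 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz * 1000) + 32.45
    return fspl + oxygen_db_per_km * distance_km

# Compare 60 GHz (roughly 15 dB/km of extra loss near the oxygen
# absorption peak, an assumed round number) against 28 GHz (where
# oxygen absorption is negligible) over a 1 km link.
loss_60 = path_loss_db(1.0, 60, oxygen_db_per_km=15)
loss_28 = path_loss_db(1.0, 28)
print(f"60 GHz loss over 1 km: {loss_60:.1f} dB")
print(f"28 GHz loss over 1 km: {loss_28:.1f} dB")
```

The extra loss grows linearly with distance, which is why the V band penalty shows up mostly on longer outdoor paths and barely matters indoors.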

The FCC also established the E band (70/80 GHz) for public use. This spectrum will have a few more rules than the 60 GHz spectrum and there are light licensing requirements for the spectrum. These licenses are fairly easy to get for carriers, but it’s not so obvious that anybody else can get the spectrum. The FCC will get involved with interference issues with the spectrum – but the short carriage distances of the spectrum make interference somewhat theoretical.

There are several possible uses for the millimeter wave spectrum. First, it can be focused in a beam and used to deliver 1-2 gigabits of broadband for up to a few miles. There have been 60 GHz radios on the market for several years that operate for point-to-point connections. These are mostly used to beam gigabit broadband in places where that’s cheaper than building fiber, like on college campuses or in downtown highrises.

This spectrum can also be used for hotspots, as Verizon is doing in cities. In the Verizon application, the millimeter wave spectrum is put on pole-mounted transmitters in downtown areas to deliver data to cellphones as fast as 1 Gbps. This can also be deployed in more traditional hotspot locations like coffee shops. The problem with using 60 GHz spectrum this way is that there are almost no devices yet that can receive the signal. This isn't going to get widespread acceptance until somebody builds the capability into laptops or develops a cheap dongle. My guess is that cellphone makers will ignore 60 GHz in favor of the licensed bands owned by the cellular providers.

The spectrum could also be used to create wireless fiber-to-the-curb like was demonstrated by Verizon in a few neighborhoods in Sacramento and a few other cities earlier this year. The company is delivering residential broadband at speeds of around 300 Mbps. These two frequency bands are higher than what Verizon is using and so won’t carry as far from the curb to homes, so we’ll have to wait until somebody tests this to see if it’s feasible. The big cost of this business plan will still be the cost of building the fiber to feed the transmitters.

The really interesting use of the spectrum is for indoor hotspots. The spectrum can easily deliver multiple gigabits of speed within a room, and unlike WiFi spectrum it won't go through walls and interfere with neighboring rooms. This spectrum would eliminate many of the problems with WiFi in homes and in apartment buildings, but again, this first needs to be built into laptops, smart TVs, and other devices.

Unfortunately, the vendors in the industry are currently focused on developing equipment for the licensed spectrum that the big cellular companies will be using. You can't blame the vendors for concentrating their efforts in the 24, 28, and 39 GHz ranges before looking at these alternate bands. There is always a bit of a catch-22 when introducing any new spectrum: a vendor needs to make equipment available before anybody can try it, and vendors won't make the equipment until they have a proven market.

Electronics for millimeter wave spectrum are not as easily created as equipment for lower frequency bands. For instance, in the lower bands, software-defined radios can easily shift between nearby frequencies with no modification of hardware. However, each band of millimeter wave spectrum has different operating characteristics and specific antenna requirements, and it's not nearly as easy to shift between a 39 GHz radio and a 60 GHz radio; the requirements are different for each band.

And that means equipment vendors will need to enter the market if these spectrum bands are ever going to find widespread public use. Hopefully, vendors will find this worth their while, because this is a new WiFi-like opportunity. Wireless vendors have made their living in the WiFi space, and they need to be convinced that the same opportunity exists with these widely available spectrum bands. I believe that if some vendor builds indoor multi-gigabit routers and receivers, the users will come.

Testing the FCC Maps

USTelecom has been advocating the use of geocoding to make broadband maps more accurate. As part of that advocacy, the association tested their idea by looking at the FCC mapping in parts of Virginia and Missouri.

What they found was not surprising, but still shocking. In those two states, as many as 38% of households in rural census blocks were classified as served when in fact they were unserved. In FCC-speak, a served home has broadband available at 25/3 Mbps or faster. An unserved home either has no broadband available or can only buy broadband slower than 10/1 Mbps.

This distinction has huge significance for the industry. First, it’s been clear that the FCC has been overcounting the number of homes that have broadband. But far worse, the FCC has been awarding grants to provide faster broadband in unserved areas and all of the places that have been misclassified have not been eligible for grants. We’re about to enter the biggest grant program ever that will award $20.4 billion, but only to places that don’t have 25/3 Mbps speeds – meaning these misclassified homes will be left out again if the maps aren’t fixed soon.

The USTelecom effort is not even complete, since several cable companies in the two states did not participate in the trial, which might mean the percentage of misclassified homes is even larger. The misclassified homes are likely to be in census blocks that also contain at least some homes with fast broadband. Homes just past where a cable company's network ends might be listed as capable of buying a gigabit, yet have no broadband option at all.

The existing FCC maps use data that is reported by ISPs using the Form 477 process. In that process, ISPs report speed availability by census block. There are two huge flaws with this reporting method. First, if even one customer in the census block can get fast broadband, then the whole census block is assumed to have fast broadband. Second, many ISPs have been reporting marketing speeds instead of actual speeds, and so there are whole census blocks counted as served when nobody can get real broadband.
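The first flaw in Form 477 reporting is easy to sketch in code. The per-home speeds below are hypothetical, chosen to show how one fast home drags a whole census block into the served column.

```python
# Hypothetical census block: one home near the cable plant can get
# 100 Mbps, while the rest can only get 4 Mbps DSL.
homes = [100, 4, 4, 4, 4, 4, 4, 4]  # max available download speed per home, Mbps

SERVED_THRESHOLD = 25  # the FCC definition of served: 25/3 Mbps

# Form 477 logic: if ANY home in the block meets the threshold,
# the ENTIRE block is reported as served.
block_served = any(speed >= SERVED_THRESHOLD for speed in homes)

# Per-home accounting tells a very different story.
actually_served = sum(1 for speed in homes if speed >= SERVED_THRESHOLD)

print(f"Form 477 reports block as served: {block_served}")
print(f"Homes actually served: {actually_served} of {len(homes)}")
```

In this made-up block, the FCC map would show eight served homes when only one can actually buy broadband, which is exactly the boundary effect the USTelecom trial uncovered.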

The trial also uncovered other problems. The ISPs have not been accurate in counting homes by census block. Many ISPs have never accurately mapped their customers, and so the test found numerous examples of customers reported in the wrong census blocks. Additionally, the counts of buildings by census block are often far off, due in part to the confusing nature of rural addresses.

The bottom line is that the FCC has been collecting and reporting highly inaccurate data concerning rural broadband. We’ve known this for a long time because there have been numerous efforts to test the maps in smaller geographic areas that have highlighted these same mistakes. We also have evidence from Microsoft that shows that a huge number of homes are not connected to the Internet at speeds of at least 25/3 Mbps. That’s not just a rural issue, and for the Microsoft numbers to be true there must be a massive number of urban homes that are getting speeds slower than what is being reported to the FCC.

As dramatic as this finding is from USTelecom, it doesn't tell the whole story. Unfortunately, no mapping strategy is going to be able to truthfully report the broadband speeds for DSL and fixed wireless. The speed of these products varies by home. Further, there is no way to know if a given home can utilize these technologies until somebody tries to connect them. Perhaps this isn't important for DSL, since there is almost no rural DSL capable of delivering 25/3 Mbps broadband. But any mapping of the coverage area of fixed wireless is going to be suspect, since many homes are blocked from seeing a transmitting antenna, or receive slower speeds than their neighbors due to impediments. The USTelecom effort is mostly fixing the boundary issues where homes are assumed to have broadband today but don't. The 38% misreporting would be much higher if we could somehow magically know the real capabilities of DSL and fixed wireless.

The current FCC didn't create this problem; it goes back several FCCs ago to the start of the 477 reporting system. However, I have to wonder if this FCC will change its mind about the status of rural broadband in the country even with better maps. The current FCC released broadband data for 2016 that included a huge error. A new ISP, Barrier Free, had reported serving 25/3 broadband in census blocks covering 62 million people, when in June of that year the company didn't yet have any customers. The FCC gleefully reported that the number of homes without broadband had dropped by 25%, mostly due to this reporting error. Even after correcting the error, the FCC still declared that broadband in rural America was on the right trajectory and didn't need any extraordinary effort from the FCC. I'm sure they will decide that rural broadband is fine, even if the number of unserved homes jumps significantly due to better mapping.

Comparing FCC Broadband Programs

I think it's finally dawning on the big telcos that the days of milking revenues from rural America while ignoring rural copper networks are ending. This becomes apparent when comparing the two most recent subsidy programs.

The original CAF II program was a huge boon to the big telcos. Companies like AT&T, CenturyLink, and Frontier collected $11 billion of subsidy to boost their rural copper networks up to speeds of at least 10/1 Mbps. This was a ridiculous program from the start, since the FCC had established the definition of broadband as at least 25/3 Mbps even before awarding this money. Perhaps the craziest thing about CAF II is that the telcos are still making the upgrades – they were required to be 60% complete with the required CAF II upgrades by the end of 2018 and to be 100% complete by the end of 2020.

The big telcos report broadband customers to both the FCC and to stockholders, but the reporting is not in enough detail to know if the CAF II money has made any difference in rural America. All of the big telcos are losing broadband customers, but it’s hard to look under the hood to know if they are making any significant customer gains in the CAF II areas. We see little hints from time to time. For example, in the second quarter of this year, CenturyLink lost 56,000 net broadband customers but reports that it lost 78,000 customers with speeds below 20 Mbps and added 22,000 customers with speeds faster than that. That’s the first time they provided any color about their gains and losses. But even that extra detail doesn’t tell us how CenturyLink is doing in the CAF II areas. It’s obvious by looking at the customer losses that telcos aren’t adding the hundreds of thousands of new customers one would expect to see as the result of an $11 billion capital expenditure program. If CAF II is delivering broadband to areas that didn’t have it before, there should be a flood of new rural customers buying better broadband by now. I could be wrong, but when looking at the aggregate customers for each big telco I don’t think that flood of new customers is happening. If it was I think the telcos would be bragging about it.

The CAF II reverse auction took a different approach and awarded funding in those areas where the big telcos didn’t take the original CAF II funds. These subsidies were auctioned off in a reverse auction where the company willing to take the lowest amount of subsidy per customer got the funding. In the auction, most bidders offered to deploy broadband of 100 Mbps speeds or faster – a big contrast to the 10/1 Mbps speeds for CAF II. Some of the grant winners in the reverse auction like electric cooperatives are using the money to build fiber and offer gigabit speeds.

The original CAF II subsidy awards are probably the dumbest decision I’ve ever seen an FCC make (rivaling the recent decision to stop regulating broadband). If the original CAF II awards had been open to all applicants instead of being handed to the big telcos, then many of the homes that have been upgraded to 10/1 Mbps would have instead gotten fiber. Maybe even worse, CAF II basically put huge swaths of rural America on hold for seven years while the big telcos invested in minor tweaks to DSL.

The FCC will soon be handing out $20.4 billion for the new RDOF program to build better rural broadband. It should be press headlines that this money is going to many of the same areas that got the original $11 billion CAF II subsidies – the FCC is paying twice to upgrade the same areas.

Dan McCarthy, the CEO of Frontier Communications, recently complained about the new RDOF grant program. He realizes that Frontier has little chance of winning the grants in a reverse auction. Frontier doesn't want to invest any of its own cash in rural broadband, and in an auction it would be competing against ISPs willing to invest significant equity to match the RDOF grants. Frontier also recognizes that anything it might propose as upgrades can't compete with technologies that deliver speeds of 100 Mbps or faster.

At least the FCC is not handing the RDOF money directly to the big telcos again. It's been five years since the start of CAF II and I'm still perplexed by the last FCC's decision to hand $11 billion to the big telcos. Unfortunately, this FCC is still repeating the mistake of awarding grant money to support obsolete speeds. The FCC is proposing that RDOF money can be used to build broadband capable of delivering 25/3 Mbps. In a recent blog, I predicted that this will go into the books as another short-sighted decision by the FCC and that they'll again be funding broadband that will be obsolete before it's completed eight years from now. Hopefully most of the RDOF money will go towards building real broadband. Otherwise, in eight years we might see another giant FCC grant program to improve broadband for a third time in the same rural areas.

Robocalls and Small Carriers

In July, NTCA filed comments in the FCC docket that is looking at an industry-wide solution to fight against robocalls. The comments outline some major concerns about the ability of small carriers to participate in the process.

The industry solution to stop robocalls, which I have blogged about before, is being referred to as SHAKEN/STIR. This new technology will create an encrypted token that verifies that a given call really originated with the phone number listed in the caller ID. Robocalls can’t get this verification token. Today, robocallers spoof telephone numbers, meaning that they insert a calling number into the caller ID that is not real. These bad actors can make a call look like it’s coming from any number – even your own!

On phones with visual caller ID, like cellphones, a small token will appear to verify that the calling party is really calling from the number shown. Once the technology has been in place for a while, people will learn to ignore calls that don't come with the token. If the industry does this right, it will become easier to spot robocalls, and I imagine a lot of people will use apps that automatically block calls without a token.
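Conceptually, the token is just a signed attestation from the originating carrier. The sketch below is a deliberately simplified stand-in: it uses a shared-secret HMAC where the actual SHAKEN/STIR standard uses certificate-based signatures on a structured token, and the carrier key and phone numbers are made up.

```python
import base64
import hashlib
import hmac
import json

# In the real protocol, carriers hold certificates issued by an industry
# governance authority; a shared HMAC key stands in for that here.
CARRIER_KEY = b"hypothetical-originating-carrier-key"

def sign_call(orig_number, dest_number):
    """Originating carrier attests that the caller ID is genuine."""
    claims = json.dumps({"orig": orig_number, "dest": dest_number}, sort_keys=True)
    sig = hmac.new(CARRIER_KEY, claims.encode(), hashlib.sha256).digest()
    return claims, base64.b64encode(sig).decode()

def verify_call(claims, token):
    """Terminating carrier checks the token before displaying 'verified'."""
    expected = hmac.new(CARRIER_KEY, claims.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(base64.b64encode(expected).decode(), token)

claims, token = sign_call("+13035551234", "+12025556789")
print(verify_call(claims, token))           # a legitimate call verifies
print(verify_call(claims, "forged-token"))  # a spoofed token fails
```

The point of the sketch is the asymmetry: a spoofer can insert any number into the caller ID field, but without the originating carrier's key it cannot produce a token that verifies.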

NTCA is concerned that small carriers will be shut out of this system, causing huge harm to them and their customers. Several network prerequisites must be in place to handle the SHAKEN/STIR token process. First, the originating telephone switch must be digital. Most, but not all small carriers now use digital switches. Any telco or CLEC using any older non-digital switch will be shut out of the process, and to participate they’d have to buy a new digital switch. After the many-year decline in telephone customers, such a purchase might be hard to cost-justify. I’m picturing that this might also be a problem for older PBXs – the switches operated by private businesses. The world is full of large legacy PBXs operated by universities, cities, hospitals and large businesses.

Second, the SHAKEN/STIR solution is likely to require an expensive software upgrade for the carriers using digital switches. Again, due to the shrinking demand for selling voice, many small carriers are going to have a hard time justifying the cost of a software upgrade. Anybody using an off-brand digital switch (several switch vendors folded over the last decade) might not have a workable software solution.

The third requirement to participate in SHAKEN/STIR is that the entire path connecting a switch to the public switched telephone network (PSTN) must be end-to-end digital. This is a huge problem because most small telcos, CLECs, cable companies, and other carriers still connect to the PSTN using the older TDM technology (based upon multiples of T1s).

You might recall a decade ago there was a big stir about what the FCC termed a ‘digital transition’. The FCC at the time wanted to migrate the whole PSTN to a digital platform largely based upon SIP trunking. While there was a huge industry effort at the time to figure out how to implement the transition, the effort quietly died and the PSTN is still largely based on TDM technology.

I have clients who have asked for digital trunking (the connections between networks) for years, but almost none of them have succeeded. The large telcos like AT&T, Verizon, and CenturyLink don't want to spend the money at their end to put in new technology for this purpose. A request to go all-digital is either flatly refused, or the small carrier is told it must pay to transport its network traffic to some distant major switching point in a place like Chicago or Denver, an expensive proposition.

What happens to a company that doesn’t participate in SHAKEN/STIR? It won’t be pretty because all of the calls originating from such a carrier won’t get a token verifying that the calls are legitimate. This could be devastating to rural America. Once SHAKEN/STIR is in place for a while a lot of people will refuse to accept unverified calls – and that means calls coming from small carriers won’t be answered. This will also affect a lot of cellular calls because in rural America those calls often originate behind TDM trunking.

We already have a problem with rural call completion, meaning that there are often problems trying to place calls to rural places. If small carriers can’t participate in SHAKEN/STIR, after a time their callers will have real problems placing calls because a lot of the world won’t accept calls that are not verified with a token.

The big telcos have assured the FCC that this can be made to work. It’s my understanding that the big telcos have mistakenly told the FCC that the PSTN in the country is mostly all-digital. I can understand why the big telcos might do this because they are under tremendous pressure from the FCC and Congress to tackle the robocall issue. These big companies are only looking out for themselves and not the rest of the industry.

I already had my doubts about the SHAKEN/STIR solution because my guess is that bad actors will find a way to fake the tokens. One has to only look back at the decades-old battles against spam email and against hackers to understand that it’s going to require a back-and-forth battle for a long time to solve robocalling – the first stab of SHAKEN/STIR is not going to fix the problem. The process is even more unlikely to work if it doesn’t function for large parts of the country and for whole rural communities. The FCC needs to listen to NTCA and other rural voices and not create another disaster for rural America.

The Census Bureau and the Digital Divide

John Horrigan recently wrote an interesting article in The Daily Yonder that cited the results of a survey done by the Census Bureau. The agency conducts an annual survey called the American Community Survey (ACS) of 3.5 million households. In recent years the survey has included a few questions about broadband. The most recent ACS survey included questions about the digital divide. The results are at first glance a bit surprising.

The survey shows that more than 20.4 million homes have no broadband subscription at home. Of the homes with no broadband connection, 5.1 million are rural and 15.3 million are non-rural. Anybody who tracks rural broadband will instantly think those numbers can't be right. However, the Census Bureau uses its own definition of rural, which differs from the way most of the world thinks of rural versus urban.

According to the Census Bureau definition, rural is everything that is not urban. The Census Bureau looks at the country in terms of regional clusters of population. It counts two kinds of urban areas: urbanized areas (UAs), which are clusters of 50,000 or more people, and urban clusters (UCs), which have between 2,500 and 50,000 people. Most of us would consider many of the UCs to be rural, because this category includes a lot of rural county seats and their immediately surrounding areas. The Census statistics count many people who live just outside of towns as urban, while our industry considers homes past the last cable company connection to be rural.
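The Census Bureau's bucketing can be sketched as a simple threshold function; the example populations are hypothetical, but the 2,500 and 50,000 cutoffs come from the definitions above.

```python
def census_classification(cluster_population):
    """Census Bureau buckets: everything that is not urban is rural."""
    if cluster_population >= 50000:
        return "urbanized area (UA)"
    if cluster_population >= 2500:
        return "urban cluster (UC)"
    return "rural"

# A 4,000-person county seat counts as urban to the Census Bureau,
# even though the broadband industry would call it rural.
print(census_classification(4000))   # urban cluster (UC)
print(census_classification(1200))   # rural
```

This is why the ACS can report three times as many non-rural unconnected homes as rural ones: the classification boundary sits far below where the broadband industry draws the rural line.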

Horrigan interprets the results of the Census Bureau survey to mean that affordability is now a bigger reason than lack of connectivity for why people don't have broadband. He reached that conclusion by considering a recent Pew Research poll on the same topic, which shows that more homes cite reasons other than availability for not having broadband.

The Pew Research survey asked households why they don’t have broadband. Respondents could supply more than one response.

  • 50% claimed that price was a major factor and 21% cited this as the primary reason.
  • 45% said that their smartphone could do everything they need.
  • 43% said they had good access to the Internet outside the home.
  • 31% said they couldn’t afford a computer.
  • Only 22% said that they couldn’t order a broadband connection, and only 7% said that was the primary reason they didn’t have broadband.

The Census Bureau also correlated their results with household income, and it’s not surprising that low-income households have a much lower broadband connection rate. The Census Bureau survey showed that only 59% of homes that make less than $20,000 per year have broadband. The subscription rate for all households making more than $20,000 is 88%.

Interestingly, the FCC doesn't ask why people don't have broadband. The agency interprets its mission as measuring broadband availability, and it counts homes with and without broadband connections. This raises a few questions. What exactly is the FCC's mandate from Congress: to make sure every American has a connection available to reach the Internet, or to make sure Americans actually make those broadband connections? I read the FCC's mandate from Congress as including some of both goals. If availability is not the primary reason why homes don't have broadband, the FCC might get more bang for its buck by putting some effort into digital inclusion programs. According to the Horrigan article, there are now more homes that can't afford broadband than homes that have no connectivity option.

This implies the need for a much-improved Lifeline Fund. The current Lifeline program is likely not making a big difference in digital inclusion. It provides a small monthly subsidy of $9.25 per month for qualifying households to save money on either their telephone bill or their broadband bill. It’s becoming increasingly hard to qualify for Lifeline because the big telcos like AT&T are backing out of the program. Some cable companies provide low-cost cable lines to homes with school students, but to nobody else – and cable companies don’t operate outside of towns.

In addition to a more effective Lifeline program, digital inclusion also means getting computers into homes that can’t afford them. I’ve written before about the non-profit group E2D that provides computers to school students in Charlotte, NC. Perhaps some of the Universal Service Fund could be used to assist effective groups like E2D to get more computers to more households.

My firm CCG conducts surveys, and in a few recent surveys in poor rural counties we've seen anecdotal evidence that many homes don't buy the slow DSL option available to them because of price. These homes tell us that price mattered more than connectivity. I don't have any easy answer for the best way to promote digital inclusion. But there are folks in the country who have made amazing progress in this area, and perhaps the FCC should consider giving such groups some help. At a minimum, the FCC needs to recognize that, now that most homes have a broadband connection available, price is a major barrier for the majority of those who are not connected.

Setting the Definition of Broadband

One of the commenters on my blog asked a good question – can’t we set the definition of broadband by looking at the broadband applications used by the typical household? That sounds like a commonsense approach to the issue and is exactly what the FCC did when they set the definition of broadband to 25/3 Mbps in 2015. They looked at combinations of applications that a typical family of four might use in an evening, with the goal that a household ought to have enough broadband to comfortably do those functions at the same time. This might best be described as a technical approach to defining broadband – look at what households are really using and make sure that the definition of broadband is large enough to cover the expected usage for a typical household.
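The FCC's approach amounts to adding up the bandwidth of the applications a household runs at once. A sketch of that arithmetic might look like this; the per-application speeds and the overhead factor are illustrative assumptions, not the FCC's actual figures.

```python
# Hypothetical evening usage for a family of four; per-application
# speeds are illustrative, not the FCC's actual inputs.
apps_mbps = {
    "4K video stream": 18,
    "HD video stream": 6,
    "video call": 3,
    "online gaming": 4,
    "web browsing / email": 2,
}

raw_total = sum(apps_mbps.values())

# Concurrent WiFi sessions contend with each other, so the connection
# needs headroom beyond the raw sum; 30% is an assumed overhead factor.
WIFI_OVERHEAD = 0.30
needed = raw_total * (1 + WIFI_OVERHEAD)

print(f"Raw application total: {raw_total} Mbps")
print(f"With contention headroom: {needed:.1f} Mbps")
```

Even this toy version shows how sensitive the answer is to the assumptions: swap the HD stream for a second 4K stream or raise the overhead factor, and the "needed" number jumps well past 25 Mbps.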

Taking this approach raises the bigger question – what should the policy be for setting the definition of broadband? I don’t know that I have any answers, but I ask the following questions:

  • The FCC largely conducted a thought experiment when setting the 25/3 definition of broadband – they didn’t try to measure the bandwidth used in the scenarios they considered. If the FCC had measured real homes doing those functions they likely would have found that bandwidth needs were different than they had estimated. Some functions use less bandwidth than they had supposed. But usage also would have been larger than they had calculated, because the FCC didn’t compensate for WiFi overheads and machine-to-machine traffic. As a household makes use of multiple simultaneous broadband functions, the WiFi networks we all use bog down when those applications collide with each other inside the home network. The busy-hour behavior of our home networks needs to be part of a mathematical approach to measuring broadband.
  • The FCC could have gotten a better answer had they hired somebody to measure evening broadband usage in a million homes. We know that broadband usage is like anything else: some households barely use broadband and others use it intensely. Pinpointing the usage of a typical family is a quaint idea when what's needed is to understand the curve of broadband usage – the percentages of homes that are light, average, and heavy users. I'm sure one of the big companies that track broadband usage could measure this somehow. But even after making such measurements we'd still need a policy. Should the definition of broadband be set to satisfy the biggest broadband users, or something else, like the median speed used by households? Analytics can only go so far, and at some point there has to be a policy. It's not an easy policy to establish – if the definition of broadband is set anywhere below the fastest speeds used by households, then policymakers are telling some households that they use too much broadband.
  • If we are going to use measurements to determine the definition of broadband, then this also has to be an ongoing effort. If 25/3 was the right definition of broadband in 2015, how should that definition have changed when homes routinely started watching 4K video? I don’t think anybody can deny that households use more broadband each year, and homes use applications that are more data intensive. The household need for speed definitely increases over time, so any policy for setting a definition of broadband needs to recognize that the definition must grow over time.
  • One fact that is easy to forget is that the big cable companies now serve two-thirds of the broadband customers in the country, and any discussion we have about a definition of broadband is only considering how to handle the remaining one-third of broadband users. There is a good argument to be made that the cable companies already define the ‘market’ speed of broadband. The big cable companies all have minimum broadband speeds for new customers in urban markets today between 100 Mbps and 200 Mbps. The companies didn’t set these speeds in a vacuum. The cable companies have unilaterally increased speeds every 3-4 years in response to demands from their customers for faster speeds. I think there is a valid argument to be made that the market speeds used to serve two-thirds of the customers in the country should be the target broadband speed for everybody else. Any policymaker arguing that 25/3 Mbps should still be the definition of broadband is arguing that one-third of the country should settle for second-class broadband.
  • In a related argument I harken back to a policy discussion the FCC used to have when talking about broadband speeds. I can remember a decade or more ago when the FCC generally believed that rural broadband customers deserved to have access to the same speeds as urban customers. That policy was easy to support when cable networks and telco copper networks both delivered similar speeds. However, as cable broadband technology leaped ahead of copper and DSL, these discussions disappeared from the public discourse.
  • When looking at grant programs like the upcoming RDOF program, where the funded networks won’t be completed until 2027, any definition of broadband for the grants needs to look ahead to what the speeds might be like in 2027. Unfortunately, since we can’t agree on how to set the definition of broadband today, we have no context for talking about future speeds.

These are not easy questions. If the FCC were doing its job we would be having vigorous discussions on the topic. Sadly, I don’t foresee any real discussions at the FCC about the policy for setting the definition of broadband. The FCC has hunkered down and continues to support the 25/3 definition of broadband even when it’s clear that it’s grown obsolete. This FCC is unlikely to increase the definition of broadband, because in doing so they would be declaring that millions of homes have something less than broadband. It seems that our policy for setting the definition of broadband is to keep it where it is today because that’s politically expedient.

FCC – Please Don’t Fund 25/3 Broadband

The current FCC recognizes the disaster that was created when the original CAF II grant program subsidized the construction of broadband that supports speeds of only 10/1 Mbps. Several FCC commissioners have said that they don’t want to repeat that disaster. Had the CAF II grant money been open to companies other than the big telcos, much of it would have gone to fiber ISPs and we’d see a lot more areas covered with good broadband today (meaning fewer headaches for the FCC).

Today I ask the question: what speeds should the new $20.4 billion RDOF grant fund support? In the NPRM for the RDOF grant program, the FCC suggests that the minimum speed they will fund is 25/3 Mbps. It looks like the funding for these grants will start in 2021, and like the CAF II program, anybody taking the money will have six years to complete the broadband construction. I think the right way to think about the speeds for these grants is to look at likely broadband speeds at the end of the construction period in 2027, not at where the world is two years before the RDOF even starts. If the FCC bases the program on broadband speeds today, they will be making the same error as with the original CAF II – they will use federal money to build broadband that is obsolete before it’s even constructed.

I start by referring to a recent blog where I challenge the idea that 25/3 should be the definition of broadband today. To quickly summarize that blog, we know that broadband demand has been growing constantly since the days of dial-up – and the growth in broadband demand applies to speeds as well as volume of monthly downloading. Both Cisco and Ookla have shown that broadband demand has been growing at a rate of about 21% annually for many years.

At a bare minimum, the definition of broadband today ought to be 50 Mbps download – and that definition is a minimum speed, not a goal that should be used for building tomorrow’s broadband. As I said earlier, in a world where demand continues to grow, today’s definition of broadband shouldn’t matter – what matters is the likely demand for broadband in 2027 when the RDOF networks are operational.

Trending the demand curve for download speeds forward presents a story that the FCC doesn’t want to hear. The need for speed is going to continue to increase. If the growth trend holds (and these trends have been steady since the days of dial-up), then the definition of broadband by 2027 ought to be 250 Mbps – meaning by then nobody should build a network that can’t meet that speed.

Year   2019  2020  2021  2022  2023  2024  2025  2026  2027
Mbps     54    65    78    95   115   139   168   204   246
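The chart above can be reproduced with simple compound growth. Here is a quick illustrative sketch (my own calculation, not an FCC formula), assuming the 25 Mbps definition set in 2015 and the roughly 21% annual growth rate reported by Cisco and Ookla:

```python
def projected_broadband_speed(year, base_year=2015, base_mbps=25, growth=0.21):
    """Project the broadband speed definition forward by compounding
    the 2015 baseline of 25 Mbps at ~21% per year (the growth rate
    reported by Cisco and Ookla)."""
    return base_mbps * (1 + growth) ** (year - base_year)

# Reproduce the chart: about 54 Mbps by 2019, about 246 Mbps by 2027
for year in range(2019, 2028):
    print(year, round(projected_broadband_speed(year)))
```

Compounding from the 2015 baseline lands on the same numbers as the chart – about 54 Mbps in 2019 and about 246 Mbps in 2027.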

The big cable companies already recognize what the FCC won’t acknowledge. The minimum speed offered to new customers on urban cable networks today is at least 100 Mbps, and most users can order a gigabit. The cable companies know that if they provide fast speeds they get a lot fewer complaints from customers. In my city of Asheville, NC, Charter unilaterally increased the speed of broadband in 2018 from 60/6 Mbps to 135/20 Mbps. Anybody who has watched the history of cable company broadband knows that they will increase speeds at least once before 2027 to stay ahead of the demand curve. It wouldn’t be surprising by 2027 if cable company minimum speeds are 300 – 500 Mbps. Do we really want to be funding 25/3 rural broadband when speeds in cities will be fifteen times faster?

Will the world behave exactly like this chart – not likely. But will homes in 2027 be happy with 25/3 Mbps broadband – most definitely not. Given a choice, homes don’t even want 25/3 Mbps broadband today. We are already seeing hordes of urban customers abandoning urban DSL that delivers speeds between 25 Mbps and 50 Mbps.

If the FCC funds 25/3 Mbps broadband in the RDOF grant they will be duplicating one of the dumbest FCC decisions ever made – when CAF II funded 10/1 Mbps broadband. The FCC will be funding networks that are massively obsolete before they are even built, and they will be spending scarce federal dollars to again not solve the rural digital divide. There will continue to be cries from rural America to bring real broadband that works and by 2027 we’ll probably be talking about CAF IV grants to try this all over again.

The Definition of Broadband

When the FCC set the definition of broadband at 25/3 Mbps in January of 2015, I thought it was a reasonable definition. At the time the FCC said that 25/3 Mbps was the minimum speed that defined broadband – anything faster qualified as broadband, and anything slower didn’t.

2015 was forever ago in terms of broadband usage and there have been speed increases across the industry since then. All of the big cable companies have unilaterally increased their base broadband speeds to between 100 Mbps and 200 Mbps. Numerous small telcos have upgraded their copper networks to fiber. Even the big telcos have increased speeds in rural America through CAF II upgrades that increased speeds to 10/1 Mbps – and the telcos all say they did much better in some places.

The easiest way to look at the right definition of broadband today is to begin with the 25/3 Mbps level set at the beginning of 2015. If that was a reasonable definition at the beginning of 2015, what’s a reasonable definition today? Both Cisco and Ookla track actual speeds achieved by households and both say that actual broadband speeds have been increasing nationally about 21% annually. Applying a 21% annual growth rate to the 25 Mbps download speed set in 2015 predicts that the definition of broadband today should be 54 Mbps:

Year   2015  2016  2017  2018  2019
Mbps     25    30    37    44    54

We also have a lot of anecdotal evidence that households want faster speeds. Households have been regularly bailing on urban DSL and moving to faster cable company broadband. A lot of urban DSL can be delivered at speeds between 25 and 50 Mbps, and many homes are finding that to be inadequate. Unfortunately, the big telcos aren’t going to provide the detail needed to understand this phenomenon, but it’s clearly been happening on a big scale.

It’s a little sketchier to apply this same logic to upload speeds. There was a lot of disagreement about the 3 Mbps upload speed standard established in 2015. It seems to have been set to mollify the cable companies that wanted to assign most of their bandwidth to download. However, since 2015 most of the big cable companies have upgraded to DOCSIS 3.1 and they can now provide significantly faster uploads. My home broadband was upgraded by Charter in 2018 from 60/6 Mbps to 135/20 Mbps. It seems ridiculous to keep upload speed goals low, and if I were magically put onto the FCC, I wouldn’t support an upload speed goal of less than 20 Mbps.

You may recall that the FCC justified the 25/3 Mbps definition of broadband by looking at the various download functions that could be done by a family of four. The FCC examined numerous scenarios that considered uses like video streaming, surfing the web, and gaming. The FCC scenario was naive because they didn’t account for the fact that the vast majority of homes use WiFi. Most people don’t realize that WiFi networks generate a lot of overhead due to collisions of data streams – particularly when a household is trying to do multiple big bandwidth applications at the same time. When I made my judgment about the 25/3 Mbps definition back in 2015, I accounted for WiFi overheads and I still thought that 25/3 Mbps was a reasonable definition for the minimum speed of broadband.

Unfortunately, this FCC is never going to unilaterally increase the definition of broadband, because by doing so they would reclassify millions of homes as not having broadband. The FCC’s broadband maps are dreadful, but even with the bad data, it’s obvious that if the definition of broadband was 50/20 Mbps today that a huge number of homes would fall below that target.

The big problem with the failure to recognize the realities of household broadband demand is that the FCC is using the already-obsolete definition of 25/3 Mbps to make policy decisions. I have a follow-up blog to this one that will argue that using that speed as the definition for the upcoming $20.4 billion RDOF grants will be as big a disaster as the prior FCC decision to hand out billions to upgrade to 10/1 Mbps DSL in the CAF II program.

The fact that household broadband demand grows over time is not news. We have been on roughly the same demand curve growth since the advent of dial-up. It’s massively frustrating to see politics interfere with what is a straight engineering issue. As homes use more broadband, particularly when they want to do multiple broadband tasks at the same time, their demand for faster broadband grows. I can understand that no administration wants to recognize that things are worse than they want them to be – so they don’t want to set the definition of broadband at the right speed. But it’s disappointing to see when the function of the FCC is supposed to be to make sure that America gets the broadband infrastructure it needs. If the agency was operated by technologists instead of political appointees we wouldn’t even be having this debate.

FCC Proposes Rules for $20.4 Billion Broadband Grants

On August 2 the FCC released a Notice of Proposed Rulemaking (NPRM) that proposes rules for the upcoming grant program that will award $20.4 billion for rural broadband. Since every FCC program needs a name, this grant program is now designated as the Rural Digital Opportunity Fund (RDOF). An NPRM is theoretically only a list of suggestions by the FCC, and there is a comment period that will commence 30 days after the NPRM is posted in the Federal Register. However, realistically, the rules that are proposed in the NPRM are likely to be the rules of the grant program. Here are a few of the highlights:

Timing of Award. The FCC proposes awarding the money in two phases. Phase I will be awarded late next year and will distribute over $16 billion. Phase II will follow and distribute the remaining $4.4 billion. I know a lot of folks were hoping for a $2 billion annual grant award – but most of the money will be awarded next year. Anybody interested in this program should already be creating a network design and a financial business plan because the industry resources to create business plans are soon going to be too busy to help.

The money will be paid out to grant recipients over 10 years, similar to the ACAM program for small telcos. Grant recipients need to understand the time value of money. If an ISP wins a $1 million grant and borrows money at a rate of 5.5% interest, then the actual value of the grant in today’s dollars is a little more than $750,000.
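The time-value arithmetic behind that example can be sketched as follows – a minimal illustration assuming the grant arrives in ten equal annual installments discounted at the 5.5% borrowing rate (the function name is mine, not anything from the FCC rules):

```python
def grant_present_value(total_grant, years=10, rate=0.055):
    """Present value of a grant paid in equal annual installments,
    discounted at the recipient's borrowing rate (payments assumed
    to arrive at the end of each year)."""
    annual_payment = total_grant / years
    return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

# A $1 million award paid over 10 years is worth a little over $750,000 today
print(round(grant_present_value(1_000_000)))
```

Discounting each $100,000 installment back to today at 5.5% yields roughly $754,000, which is the “a little more than $750,000” figure above.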

Areas Eligible for Award. The Phase I auction will only be awarded in areas that are wholly unserved, using the definition of broadband as 25/3 Mbps. The areas covered can’t have anybody capable of getting broadband at 25/3 Mbps or faster. The FCC is likely to publish a list of areas eligible for the Phase I grants. Unfortunately, the FCC will use its flawed mapping program to make this determination. This is likely to mean that many parts of the country that ought to be eligible for these grants might not be part of the program.

Phase II is likely to be targeted at areas that did not see awards in Phase I. One open question in the NPRM is the size of award areas. The NPRM asks if the minimum coverage area should be a census block or a county. It also asks if applicants can bundle multiple areas into one grant request.

The FCC is considering prioritizing areas it thinks are particularly needy. For example, it may give extra grant weighting to areas that don’t yet have 10/1 Mbps broadband. The FCC is also planning on giving extra weighting to some tribal areas.

Weighting for Technology. Like with the CAF II reverse auction, the grant program is going to try to give priority to faster broadband technologies. The FCC is proposing extra weighting for technologies that can deliver at least 100 Mbps and even more weighting for technologies that can deliver gigabit speeds. They are also proposing a grant disincentive for technologies with a latency greater than 100 milliseconds.

Use of Funds. Recipients will be expected to complete construction to 40% of the grant eligible households by the end of the third year, with 20% more expected annually and the whole buildout to be finished by the end of the sixth year.

Reverse Auction. The FCC is proposing a multi-round, descending clock reverse auction so that bidders who are willing to accept the lowest amount of subsidy per passing will win the awards. This is the same process used in the CAF II reverse auctions.
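The descending clock mechanism can be illustrated with a heavily simplified, single-area sketch. Everything here is hypothetical – the real CAF II/RDOF mechanics also apply the technology and latency weighting described above and clear the auction against the overall budget:

```python
def descending_clock(bidders, start_pct=100, decrement=5):
    """Simplified single-area descending clock reverse auction.

    `bidders` maps a bidder name to the minimum support level (as a
    percentage of the area's reserve price) that the bidder will accept.
    Each round the clock ticks down; bidders exit once the offered
    support falls below their minimum. When one bidder remains, it
    wins at the current clock level.
    """
    clock = start_pct
    active = dict(bidders)
    while clock > 0:
        still_in = {name: floor for name, floor in active.items() if floor <= clock}
        if not still_in:
            # Nobody will serve the area at this support level (a real
            # auction would have tie-breaking rules for this case).
            return None, clock
        if len(still_in) == 1:
            winner = next(iter(still_in))
            return winner, clock  # last bidder standing wins
        active = still_in
        clock -= decrement

print(descending_clock({"A": 60, "B": 40}))
```

With these hypothetical bids, B is willing to accept less subsidy, so it outlasts A and wins at a clock of 55% of the reserve price – the first level at which A has dropped out.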

Overall Eligibility. It looks like the same rules for eligibility will apply as with previous grants. Applicants must be able to obtain Eligible Telecommunications Carrier (ETC) status to apply, meaning they must be a facilities-based retail ISP. This will exclude entities such as open access networks where the network owner is a different entity than the ISP. Applicants will also need to have a financial track record, meaning start-up companies need not apply. Applicants must also provide proof of financing.

Measurement Requirements. Grant winners will be subject to controlled speed tests to see if they are delivering what was promised. The FCC is asking if they should keep the current test – where only 70% of customers must meet the speed requirements for an applicant to keep full funding.

I see problems with a few of these requirements that I’ll cover in upcoming blogs.