Gaining Access to Multi-tenant Buildings

In 2007 the FCC banned certain kinds of exclusivity arrangements between ISPs and owners of multi-tenant buildings. At the time of the order, the big cable companies had signed contracts with apartment owners giving them exclusive access to buildings. The 2007 order eliminated the most egregious types of contracts – in many cases, cable company contracts were so convoluted that building owners didn’t even understand the agreements were exclusive.

However, the FCC order was still a far cry from mandating open access to buildings, and many landlords today still won’t allow in competitors. The arrangements landlords like most are revenue shares, where the building owner makes money from an agreement with an ISP. While such arrangements aren’t legally exclusive, they can be lucrative enough that a landlord favors one ISP and gives it exclusive access in practice.

WISPA, the industry association for wireless ISPs, has asked the FCC to force apartment owners to allow access to multiple ISPs. WISPA conducted a survey of its members and found that wireless companies are routinely denied access to apartment buildings. Some of the reasons for denying access include:

  • Existing arrangements with ISPs that make the landlord not want to grant access to an additional ISP.
  • Apartment owners often deny access because wireless ISPs (WISPs) aren’t considered to be telephone or cable companies – many WISPs offer only broadband and have no official regulatory status.
  • Building owners often say that an existing ISP serving the building has exclusive rights to the existing wiring, including conduits that might be used to string new wiring to reach units. This is often the case if the original cable or telephone company paid for the inside wiring when the building was first constructed.
  • Many landlords say that they already have an existing marketing arrangement with an ISP, meaning they get rewarded for sending tenants to that ISP.
  • Many landlords will only consider revenue-sharing arrangements since that’s what they have with an existing ISP. Some landlords have even insisted that a WISP sign a revenue-sharing arrangement before negotiating pricing and logistics.

These objections by landlords fall into two categories. One is compensation-based, where a landlord is happy with the financial status quo with an existing ISP. The other is some contractual relationship with an existing ISP that is hard or impossible for a landlord to preempt.

The concerns of WISPs are all valid, and in fact, the same list can be made by companies that want to build fiber to apartment buildings. However, landlords seem more open to fiber-based ISPs since saying that their building has fiber adds cachet and is valued by many tenants.

WISPs sometimes have unusual issues not faced by other ISP overbuilders. For example, one common wireless model is to beam broadband to the roof of an apartment building. That presents a challenge for reaching individual apartments, since inside wiring generally begins in a communications space at the base of a building.

The issue is further clouded by the long history of FCC regulation of inside wiring. The topic of ownership and rights for inside wiring has been debated in various dockets since the 1990s and there are regulatory rulings that can give ammunition to both sides of wiring arguments.

The WISPs are facing an antagonistic FCC on this issue. The agency recently preempted a San Francisco ordinance that would have made all apartment buildings open access – meaning available to any ISP. This FCC has been siding with large incumbent cable and telephone companies on most issues and is not likely to go against them by allowing open access to all apartment buildings.

The Digital Redlining of Dallas

In 2018 Dr. Brian Whitacre, an economist from Oklahoma State University, looked in detail at the broadband offered by AT&T in Dallas County, Texas. It’s an interesting county in that it includes all of the City of Dallas as well as wealthy suburban areas. Dr. Whitacre concluded that AT&T has engaged for years in digital redlining – providing faster broadband only in the more affluent parts of the area.

Dr. Whitacre looked in detail at the Form 477 data AT&T provided to the FCC at the end of 2017. AT&T reports the technology used in each census block as well as the ‘up-to’ maximum speed offered in each census block.

AT&T offers three technologies in Dallas county:

  • Fiber-to-the-home, with marketed speeds up to 1 Gbps download. AT&T offers fiber in 6,287 out of 23,463 census blocks (26.8% of the county). The average maximum speed offered in these census blocks in late 2017, according to the 477 data, was 300 Mbps.
  • VDSL, which brings fiber deep into neighborhoods, and which in Dallas offers speeds as fast as 75 Mbps download. AT&T offers this in 10,399 census blocks in Dallas (44.3% of the county). AT&T lists census blocks with maximum speeds of 18, 24, 45, and 75 Mbps. The average maximum speed listed in the 477 data is 56 Mbps.
  • ADSL2 or ADSL2+, an older form of DSL that is mostly deployed from central offices. The technology theoretically delivers speeds up to 24 Mbps, but speeds decrease rapidly for customers more than a mile from a central office. AT&T still uses ADSL2 in 6,777 census blocks (28.9% of the county). They list the maximum speeds of various census blocks at 3, 6, 12, and 18 Mbps. The average speed across all ADSL2 census blocks is 7.26 Mbps.

It’s worth noting before going further that the above speed differences, while dramatic, don’t tell the whole story. The older ADSL technology sees a dramatic drop in customer speeds with distance, and speeds are also influenced by the quality of the copper wires. Dr. Whitacre noted that he had anecdotal evidence that some of the homes listed at 3 Mbps or 6 Mbps might have actual speeds under 1 Mbps.
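To illustrate the distance problem, here is a toy model of how quickly an ‘up-to’ ADSL2 speed can decay with loop length. The decay rate is my own illustrative assumption, not AT&T’s engineering data:

```python
# Toy model of ADSL2 speed vs. loop length (illustrative only -- real
# attenuation depends on wire gauge, bridged taps, and line noise).
def adsl2_speed_mbps(miles: float, max_speed: float = 24.0) -> float:
    """Assume full speed for the first mile from the central office,
    then halve the speed for every additional half mile."""
    if miles <= 1.0:
        return max_speed
    return max_speed * 0.5 ** ((miles - 1.0) / 0.5)

for d in (0.5, 1.0, 1.5, 2.0, 3.0):
    print(f"{d:3.1f} miles: ~{adsl2_speed_mbps(d):5.1f} Mbps")
```

Under these assumed parameters, a home three miles out sees under 2 Mbps even though its census block may be listed as 24 Mbps in the 477 data.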

Dr. Whitacre then overlaid the broadband availability against poverty levels in the county. His analysis started by looking at census blocks where at least 35% of households are below the poverty level. In Dallas County, 6,777 census blocks have poverty rates of 35% or higher.

The findings were as follows:

  • Areas with high poverty were twice as likely to be served by ADSL – 56% of high-poverty areas versus 24% of other parts of the city.
  • VDSL coverage was also roughly 2:1 with 25% of areas with high poverty served by VDSL while 48% of the rest of the city had VDSL.
  • Surprisingly, 19% of census blocks with high poverty were served with fiber. I’m going to conjecture that this might include large apartment complexes where AT&T delivers one fiber to the whole complex – which is not the same product as fiber-to-the-home.
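The overlay itself is a simple cross-tabulation of technology against poverty level. A sketch, using hypothetical census-block records rather than the real Dallas data:

```python
from collections import Counter

# Hypothetical (technology, poverty_rate) records per census block --
# not the actual Dallas data, just the shape of the analysis.
blocks = [
    ("ADSL2", 0.40), ("ADSL2", 0.50), ("VDSL", 0.10), ("FTTH", 0.05),
    ("ADSL2", 0.38), ("VDSL", 0.45), ("FTTH", 0.42), ("VDSL", 0.08),
]

HIGH_POVERTY = 0.35  # threshold used in the study

def tech_shares(records, high: bool) -> dict:
    """Share of census blocks served by each technology, split by
    whether the block is at or above the high-poverty threshold."""
    subset = [t for t, p in records if (p >= HIGH_POVERTY) == high]
    counts = Counter(subset)
    return {t: counts[t] / len(subset) for t in counts}

print("high-poverty blocks:", tech_shares(blocks, True))
print("other blocks:      ", tech_shares(blocks, False))
```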

It’s worth noting that the findings are somewhat dated and rely upon 477 data from November 2017. AT&T has not likely upgraded any DSL since then, but they have been installing fiber in more neighborhoods over the last two years in a construction effort that recently concluded. It would be interesting to see if the newer fiber also went to more affluent neighborhoods.

I don’t know that I can write a better conclusion of the findings than the one written by Dr. Whitacre: “The analysis for Dallas demonstrates that AT&T has withheld fiber-enhanced broadband improvements from most Dallas neighborhoods with high poverty rates, relegating them to Internet access services which are vastly inferior to the services enjoyed by their counterparts nearby in the higher-income Dallas suburbs…”

This study was done as a follow-up to work done earlier in Cleveland, Ohio, and the same situation can likely be found in almost every large city in the country. It’s not hard to understand why ISPs like AT&T do this – they want to maximize the return on their investment. But this kind of redlining is not in the public interest and is possibly the best argument that can be made for regulating broadband networks. We regulated telephone companies starting in 1934, and that regulation resulted in the US having the best telephone networks in the world. But we’ve decided not to regulate broadband in the same way, and until we change that decision we’re going to have patchwork networks that create side-by-side haves and have-nots.

Unlicensed Millimeter Wave Spectrum

I haven’t seen it talked about a lot, but the FCC has set aside millimeter wave spectrum that can be used by anybody to provide broadband. That means entities will be able to use the spectrum in the parts of rural America that the big cellphone companies are likely to ignore.

The FCC set aside the V band (60 GHz) as unlicensed spectrum. This band provides 14 GHz of contiguous spectrum available for anybody to use. The band comes with a drawback: signals near 60 GHz are strongly absorbed by atmospheric oxygen, more so than other bands of millimeter wave spectrum. In practice, this shortens bandwidth delivery distances a bit for the V band.
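The effect of oxygen absorption can be seen in a simple link-budget sketch. The 15 dB/km absorption figure for 60 GHz is an assumed round number for illustration; absorption at 39 GHz is treated as negligible:

```python
import math

def path_loss_db(distance_km: float, freq_ghz: float,
                 o2_absorption_db_per_km: float = 0.0) -> float:
    """Free-space path loss plus atmospheric oxygen absorption.
    FSPL(dB) = 20*log10(d_km) + 20*log10(f_GHz) + 92.45."""
    fspl = 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45
    return fspl + o2_absorption_db_per_km * distance_km

# Compare a 60 GHz link (with assumed 15 dB/km oxygen absorption)
# to a 39 GHz link over the same distances.
for d in (0.5, 1.0, 2.0):
    loss_60 = path_loss_db(d, 60, o2_absorption_db_per_km=15)
    loss_39 = path_loss_db(d, 39)
    print(f"{d} km: 60 GHz ~{loss_60:.0f} dB vs 39 GHz ~{loss_39:.0f} dB")
```

The absorption term grows linearly with distance, which is why the V band loses range faster than its frequency alone would suggest.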

The FCC also established the E band (70/80 GHz) for public use. This spectrum has a few more rules than the 60 GHz band, including light licensing requirements. These licenses are fairly easy for carriers to get, but it’s not so obvious that anybody else can get the spectrum. The FCC will get involved in interference issues in this band – but the short carriage distances of the spectrum make interference somewhat theoretical.

There are several possible uses for the millimeter wave spectrum. First, it can be focused in a beam and used to deliver 1-2 gigabits of broadband for up to a few miles. There have been 60 GHz radios on the market for several years that operate for point-to-point connections. These are mostly used to beam gigabit broadband in places where that’s cheaper than building fiber, like on college campuses or in downtown highrises.

This spectrum can also be used for hotspots, as is being done by Verizon in cities. In the Verizon application, the millimeter wave spectrum is put on pole-mounted transmitters in downtown areas to deliver data to cellphones as fast as 1 Gbps. This could also be deployed in more traditional hot spots like coffee shops. The problem with using 60 GHz spectrum this way is that there are almost no devices yet that can receive the signal. This isn’t going to get widespread acceptance until somebody builds the capability into laptops or develops a cheap dongle. My guess is that cellphone makers will ignore 60 GHz in favor of the licensed bands owned by the cellular providers.

The spectrum could also be used to create wireless fiber-to-the-curb, as Verizon demonstrated in a few neighborhoods in Sacramento and a few other cities earlier this year. The company is delivering residential broadband at speeds of around 300 Mbps. These two frequency bands are higher than what Verizon is using and so won’t carry as far from the curb to homes, so we’ll have to wait until somebody tests this to see if it’s feasible. The big cost of this business plan will still be the cost of building the fiber to feed the transmitters.

The really interesting use of the spectrum is for indoor hot spots. The spectrum can easily deliver multiple gigabits of speed within a room, and unlike WiFi spectrum it won’t pass through walls and interfere with neighboring rooms. This spectrum would eliminate many of the problems with WiFi in homes and in apartment buildings – but again, this needs to first be built into laptops, smart TVs, and other devices.

Unfortunately, the vendors in the industry are currently focused on developing equipment for the licensed spectrum that the big cellular companies will be using. You can’t blame the vendors for concentrating their efforts in the 24, 28, and 39 GHz ranges before looking at these alternate bands. There is always a bit of a catch-22 when introducing any new spectrum – a vendor needs to make the equipment available before anybody can try it, and vendors won’t make the equipment until they have a proven market.

Electronics for millimeter wave spectrum are not as easily created as equipment for lower frequency bands. For instance, in the lower spectrum bands, software-defined radios can easily change between nearby frequencies with no modification of hardware. However, each band of millimeter wave spectrum has different operating characteristics and specific antenna requirements, and it’s not nearly as easy to shift between a 39 GHz radio and a 60 GHz radio – the requirements are different for each band.

And that means that equipment vendors will need to enter the market if these spectrum bands are ever going to find widespread public use. Hopefully, vendors will find this worth their while, because this is a new WiFi-like opportunity. Wireless vendors have made their living in the WiFi space, and they need to be convinced that they have the same opportunity with these widely available spectrum bands. I believe that if some vendor builds indoor multi-gigabit routers and receivers, the users will come.

Broadband Price Increases

Back in late 2017 Wall Street analyst Jonathan Chaplin of New Street predicted that ISPs would begin flexing their market power and within three or four years would raise broadband rates to $100. His prediction was a little aggressive, but not by much. He also predicted that we’re going to start seeing perpetual annual broadband rate increases.

Stop the Cap! reports that Charter will be raising rates in September, only ten months after their last rate increase in November 2018. The company will be increasing the price of unbundled broadband by $4 per month, from $65.99 to $69.99. Charter is also increasing the cost of its WiFi modem from $5.00 to $7.99. This brings the total cost of standalone broadband for their base product (100–200 Mbps) with WiFi to $77.98, up from $70.99. Charter also announced substantial price increases for cable TV.

Even with this rate increase Charter still has the lowest prices for standalone broadband among the major cable companies. Stop the Cap! reports that the base standalone broadband product plus WiFi costs $93 with Comcast, $95 with Cox and $106.50 with Mediacom.
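The Charter totals are just the sum of the advertised broadband price and the modem fee:

```python
# Standalone broadband + WiFi modem, before and after the September increase
old_total = 65.99 + 5.00   # broadband + modem before the increase
new_total = 69.99 + 7.99   # broadband + modem after the increase
print(f"old: ${old_total:.2f}  new: ${new_total:.2f}  "
      f"increase: ${new_total - old_total:.2f}/month")
```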

Of course, not everybody pays those full standalone prices. In most markets we’ve studied, around 70% of customers bundle products and get bundling discounts. However, the latest industry statistics show that millions of customers are now cutting the cord annually, and they will lose those discounts and face the standalone broadband prices.

MoffettNathanson LLC, the leading analyst firm in the industry, recently compared the average revenue per user (ARPU) for four large cable companies – Comcast, Charter, Altice, and Cable ONE. The most recent ARPUs for the four companies are: Comcast ($60.86), Charter ($56.57), Altice ($64.58), and Cable ONE ($71.80). You might wonder why the ARPU is so much lower than the price of standalone broadband. Some of the difference comes from bundling and promotional discounts. There are also customers on older, slower, and cheaper broadband products who are hanging on to their old bargain prices.

The four companies have seen broadband revenue growth over the last two years between 8.1% and 12%. The reason for the revenue growth varies by company. A lot of the revenue growth at Comcast and Charter still comes from broadband customer growth and both companies added over 200,000 new customers in the second quarter of this year. In the second quarter, Comcast grew at an annualized rate of 3.2% and Charter grew at 4%. This contrasts with the smaller growth at Altice (1.2%) and Cable ONE (2%), and the rest of the cable industry.

ARPU for these companies increased for several reasons. Each of the four companies has had at least one rate increase during the last two years, and some of the ARPU growth comes from cord cutters who lose their bundling discount.

For the four cable companies:

  • Comcast revenues grew by 9.4% over the two years and that came from a 4.4% growth in ARPU and 5% due to subscriber growth.
  • Charter broadband revenues grew by 8.1% over two years. That came from a 3.2% increase in ARPU and 4.9% due to subscriber growth.
  • Altice saw a 12% growth in broadband revenues over two years that comes from a 9.8% growth in ARPU and 2.2% due to customer growth.
  • Cable ONE saw a 9.7% growth in broadband revenues over two years due to a 7.5% growth in ARPU and 2.2% increase due to customer growth.
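These decompositions work because revenue is ARPU times subscribers, so the two growth rates roughly add. A quick check using the figures above:

```python
# Revenue = ARPU x subscribers, so over a period:
#   (1 + rev_growth) = (1 + arpu_growth) * (1 + sub_growth)
# For small rates this is approximately rev ~= arpu + subs, which is
# why the bullet figures sum so neatly (the product is slightly higher).
def decompose(arpu_growth: float, sub_growth: float) -> float:
    return (1 + arpu_growth) * (1 + sub_growth) - 1

companies = {
    "Comcast":   (0.044, 0.050),
    "Charter":   (0.032, 0.049),
    "Altice":    (0.098, 0.022),
    "Cable ONE": (0.075, 0.022),
}
for name, (arpu, subs) in companies.items():
    print(f"{name:9s}: revenue growth ~{decompose(arpu, subs) * 100:.1f}%")
```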

Altice’s story is perhaps the most interesting and offers a lesson for the rest of the industry. The company says that it persuades 80% of new cord cutters to upgrade to a faster broadband product. This tells us that homes cutting the cord believe they’ll use more broadband and are open to the idea of buying a more robust broadband product. This is something I hope all of my clients reading this blog will notice.

Cable ONE took a different approach. They have been purposefully raising cable prices for the last few years and do nothing to try to stop customers from dropping the cable product. The company benefits largely from the ARPU increase that comes when those customers give up their bundling discount.

MoffettNathanson also interprets these numbers to indicate that we will be seeing more rate increases in the future. Broadband growth is slowing for the whole industry, including Comcast and Charter. This means that for most cable companies, the only way to continue to grow revenues and margins will be by broadband rate increases. After seeing this analysis, I expect more companies will put effort into upselling cord cutters to faster broadband, but ultimately these large companies will have to raise broadband rates annually to meet Wall Street earnings expectations.

Testing the FCC Maps

USTelecom has been advocating the use of geocoding to make broadband maps more accurate. As part of that advocacy, the association tested their idea by looking at the FCC mapping in parts of Virginia and Missouri.

What they found was not surprising, but still shocking. In those two states, as many as 38% of households in rural census blocks were classified as served when in fact they were unserved. In FCC-speak, a served home has broadband available at 25/3 Mbps or faster. An unserved home either has no broadband available or can only buy broadband slower than 10/1 Mbps.
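Those definitions translate directly into code. A minimal sketch – the label for the in-between band is my own shorthand, since the definitions above only pin down the two endpoints:

```python
# "Served" requires 25/3 Mbps; "unserved" means below 10/1 Mbps.
# Homes between the two thresholds fall into neither bucket.
def classify(down_mbps: float, up_mbps: float) -> str:
    if down_mbps >= 25 and up_mbps >= 3:
        return "served"
    if down_mbps < 10 or up_mbps < 1:
        return "unserved"
    return "in-between"

print(classify(100, 10))  # served
print(classify(6, 0.5))   # unserved
print(classify(15, 2))    # in-between
```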

This distinction has huge significance for the industry. First, it’s been clear that the FCC has been overcounting the number of homes that have broadband. But far worse, the FCC has been awarding grants to provide faster broadband in unserved areas and all of the places that have been misclassified have not been eligible for grants. We’re about to enter the biggest grant program ever that will award $20.4 billion, but only to places that don’t have 25/3 Mbps speeds – meaning these misclassified homes will be left out again if the maps aren’t fixed soon.

The USTelecom effort is not even complete, since several cable companies in the two states did not participate in the trial – which might mean that the percentage of homes that are misclassified is even larger. The misclassified homes are likely to be those in census blocks that also contain at least some homes with fast broadband. Homes just past where the cable company networks end might be listed as capable of buying a gigabit, and yet have no broadband option.

The existing FCC maps use data that is reported by ISPs using the Form 477 process. In that process, ISPs report speed availability by census block. There are two huge flaws with this reporting method. First, if even one customer in the census block can get fast broadband, then the whole census block is assumed to have fast broadband. Second, many ISPs have been reporting marketing speeds instead of actual speeds, and so there are whole census blocks counted as served when nobody can get real broadband.
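The first flaw is easy to see in miniature: Form 477 reporting effectively takes the maximum speed found anywhere in a census block. A sketch with made-up locations:

```python
from collections import defaultdict

# Hypothetical locations: (census_block, available download Mbps).
# One gigabit home in block_A; its neighbors have no broadband at all.
locations = [
    ("block_A", 1000), ("block_A", 0), ("block_A", 0),
    ("block_B", 6), ("block_B", 3),
]

# Form 477-style aggregation: the block is credited with its best speed.
block_max = defaultdict(float)
for block, speed in locations:
    block_max[block] = max(block_max[block], speed)

for block, speed in sorted(block_max.items()):
    print(f"{block}: reported max {speed} Mbps -> served={speed >= 25}")
# block_A counts as served even though two of its three locations
# have no broadband at all.
```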

The trial also uncovered other problems. The ISPs have not been accurate in counting homes by census block. Many ISPs have never accurately mapped their customers, and so the test found numerous examples of customers reported in the wrong census blocks. Additionally, the counts of buildings by census block are often far off, due in part to the confusing nature of rural addresses.

The bottom line is that the FCC has been collecting and reporting highly inaccurate data concerning rural broadband. We’ve known this for a long time because there have been numerous efforts to test the maps in smaller geographic areas that have highlighted these same mistakes. We also have evidence from Microsoft that shows that a huge number of homes are not connected to the Internet at speeds of at least 25/3 Mbps. That’s not just a rural issue, and for the Microsoft numbers to be true there must be a massive number of urban homes that are getting speeds slower than what is being reported to the FCC.

As dramatic as this finding is from USTelecom, it doesn’t tell the whole story. Unfortunately, no mapping strategy is going to be able to truthfully report the broadband speeds for DSL and fixed wireless. The speed of these products varies by home. Further, there is no way to know if a given home can utilize these technologies until somebody tries to connect them. Perhaps this isn’t important for DSL, since there is almost no rural DSL capable of delivering 25/3 Mbps broadband. But any mapping of the coverage area of fixed wireless is going to be suspect, since many homes can’t see a transmitting antenna or else receive slower speeds than their neighbors due to impediments. The USTelecom effort is mostly fixing the boundary issues where homes are assumed to have broadband today but don’t. The 38% misreporting would be much higher if we could somehow magically know the real capabilities of DSL and fixed wireless.

The current FCC didn’t create this problem – it goes back several FCCs, to the start of the 477 reporting system. However, I have to wonder if this FCC will change its mind about the status of rural broadband in the country even with better maps. The current FCC released broadband data for 2016 that included a huge error. A new ISP, Barrier Free, had reported serving 25/3 broadband in census blocks covering 62 million people, when in June of that year the company didn’t yet have any customers. The FCC gleefully reported that the number of homes without broadband had dropped by 25%, mostly due to this reporting error. Even after correcting the error, the FCC still declared that broadband in rural America was on the right trajectory and didn’t need any extraordinary effort from the FCC. I’m sure they will decide that rural broadband is fine, even if the number of unserved homes jumps significantly due to better mapping.

Comparing FCC Broadband Programs

I think it’s finally dawning on the big telcos that the days of milking revenues from rural America while ignoring rural copper networks are finally ending. This becomes apparent when looking at the two most recent subsidy programs.

The original CAF II program was a huge boon to the big telcos. Companies like AT&T, CenturyLink, and Frontier collected $11 billion of subsidy to boost their rural copper networks to speeds of at least 10/1 Mbps. This was a ridiculous program from the start, since the FCC had established the definition of broadband as at least 25/3 Mbps even before awarding this money. Perhaps the craziest thing about CAF II is that the telcos are still making the upgrades – they were required to be 60% complete with the required CAF II upgrades by the end of 2018 and to be 100% complete by the end of 2020.

The big telcos report broadband customers to both the FCC and to stockholders, but the reporting is not detailed enough to know if the CAF II money has made any difference in rural America. All of the big telcos are losing broadband customers, but it’s hard to look under the hood to know if they are making any significant customer gains in the CAF II areas.

We see little hints from time to time. For example, in the second quarter of this year, CenturyLink lost 56,000 net broadband customers but reported that it lost 78,000 customers with speeds below 20 Mbps and added 22,000 customers with speeds faster than that. That’s the first time they provided any color about their gains and losses, but even that extra detail doesn’t tell us how CenturyLink is doing in the CAF II areas.

It’s obvious from the customer losses that the telcos aren’t adding the hundreds of thousands of new customers one would expect to see as the result of an $11 billion capital expenditure program. If CAF II were delivering broadband to areas that didn’t have it before, there would be a flood of new rural customers buying better broadband by now. I could be wrong, but when looking at the aggregate customers for each big telco, I don’t think that flood of new customers is happening. If it were, I think the telcos would be bragging about it.

The CAF II reverse auction took a different approach and awarded funding in those areas where the big telcos didn’t take the original CAF II funds. These subsidies were auctioned off in a reverse auction where the company willing to take the lowest amount of subsidy per customer got the funding. In the auction, most bidders offered to deploy broadband of 100 Mbps speeds or faster – a big contrast to the 10/1 Mbps speeds for CAF II. Some of the grant winners in the reverse auction like electric cooperatives are using the money to build fiber and offer gigabit speeds.

The original CAF II subsidy awards are probably the dumbest decision I’ve ever seen an FCC make (rivaling the recent decision to stop regulating broadband). If the original CAF II awards had been open to all applicants instead of being handed to the big telcos, then many of the homes that have been upgraded to 10/1 Mbps would have instead gotten fiber. Maybe even worse, CAF II basically put huge swaths of rural America on hold for seven years while the big telcos invested in minor tweaks to DSL.

The FCC will soon be handing out $20.4 billion for the new RDOF program to build better rural broadband. It should be press headlines that this money is going to many of the same areas that got the original $11 billion CAF II subsidies – the FCC is paying twice to upgrade the same areas.

Dan McCarthy, the CEO of Frontier Communications, recently complained about the new RDOF grant program. He realizes that Frontier has little chance of winning the grants in a reverse auction. Frontier doesn’t want to invest any of its own cash in rural broadband and in an auction would be competing against ISPs willing to invest significant equity to match the RDOF grants. Frontier also recognizes that anything it might propose as upgrades can’t compete with technologies that will deliver speeds of 100 Mbps or faster.

At least the FCC is not handing the RDOF money directly to the big telcos again. It’s been five years since the start of CAF II and I’m still perplexed by the last FCC’s decision to hand $11 billion to the big telcos. Unfortunately, this FCC is still repeating the mistake of awarding grant money to support obsolete speeds. The FCC is proposing that RDOF money can be used to build broadband capable of delivering only 25/3 Mbps. In a recent blog, I predicted that this will go into the books as another short-sighted decision by the FCC and that they’ll again be funding broadband that will be obsolete before it’s completed eight years from now. Hopefully most of the RDOF money will go towards building real broadband. Otherwise, in eight years we might see another giant FCC grant program to improve broadband for a third time in the same rural areas.

Cognitive Bias when Selling Broadband

One of the most successful ways to sell broadband on a newly constructed network is door-to-door sales. I know of numerous fiber overbuilders who have had great success with this sales method. Companies that sell this way all say that some salespeople do better than others.

I’m not going to cover the giant topic of sales training in a short blog, but I’ve been reading about one aspect of the sales process recently. Social scientists have been doing a lot of research into cognitive bias – the ways that brains take shortcuts to avoid having to do hard thinking all of the time. When somebody answers the door for a sales call, they often automatically react to the salesperson with various kinds of cognitive bias that have to be overcome to make a sale. We’ve all heard that some people are natural-born salespeople, which means they have a talent for recognizing cognitive bias. Luckily, this is also something that can be learned.

Here are some examples of the most common kinds of cognitive bias encountered in door-to-door sales.

Attentional Bias. This is the bias where somebody’s actions are affected by the memory of experiences they’ve had in similar circumstances. A salesperson must react quickly if they hear, “I don’t buy from door-to-door salesmen” before the door is slammed in their face.

Confirmation Bias. This is when a person embraces their existing beliefs. Somebody who thinks they already did a great job in picking their current broadband product and provider might not want to admit that they could have done better.

Status Quo Bias. This is a natural desire to keep things the same. This might be the most common objection to buying faster broadband – “If it ain’t broke, don’t fix it.”

Reactance. Reactance is the natural impulse to do the opposite of what you’re told to maintain a sense of independence. In the sales process, this manifests as somebody who perceives the salesperson as pushy and who then reacts to the salesperson rather than to the sales presentation.

Loss Aversion. This is the brain’s natural tendency to fear losses more than it values gains. This manifests in a fiber sale if the expected installation and conversion process is perceived to be a bigger hassle than the resulting benefit of better broadband.

Bandwagon Effect. This is a bias that can work in a salesperson’s benefit. People tend to be influenced by what their neighbors do, so being able to tell them that their neighbors have already purchased can overcome reluctance. But this can work against a new market entrant if people perceive that sticking with the brand name incumbent is the consensus choice.

Ambiguity Effect. This is the tendency for the brain to avoid scenarios where the outcome is uncertain. Even if customers accept the benefits of faster broadband, they might worry about network outages or the responsiveness of the new provider in responding to customer service calls. Uncertainty can stop them from making a change.

Selection Bias. This is the natural tendency of the brain to notice more of something once it has been brought to our attention. In the sales process, this might mean that a potential customer will assume that your product has many of the same flaws and problems as their current product.

Mere Exposure Effect. This is the tendency of the brain to accept something it’s already familiar with. This is why companies do brand advertising in a market along with door-to-door sales. A lot of potential customers are much more comfortable considering a new broadband product if they are already familiar with the new provider. This is also why ISPs often do well when moving into neighboring communities where many people already know their name and have heard about them.

The Busy Skies

I was looking over the stated goals of the broadband satellite companies and was struck by the sheer numbers of satellites that are being planned. The table further down in the blog shows plans for nearly 15,000 new satellites.

To put this into perspective, consider the number of satellites ever shot into space. The United Nations Office for Outer Space Affairs (UNOOSA) has been tracking space launches for decades. They report that there have been 8,378 objects put into space since the first Sputnik in 1957. As of the beginning of 2019, there were 4,987 satellites still in orbit, although only 1,957 were still operational.

There was an average of 131 satellites launched per year between 1964 and 2012. Since 2012 we’ve seen 1,731 new satellites, with 2017 (453) and 2018 (382) seeing the most satellites put into space.

The logistics of getting this many new satellites into space are daunting. We’ve already seen OneWeb fall behind schedule. In addition to these satellites, there will continue to be numerous satellites launched for other purposes. I note that a few hundred of these are already in orbit. In the following table, “Current” means satellites that are planned for the next 3-4 years.

| Company | Current | Future | Total |
| --- | ---: | ---: | ---: |
| Starlink | 4,425 | 7,528 | 11,953 |
| OneWeb | 650 | 1,260 | 1,910 |
| Telesat | 117 | 512 | 629 |
| Samsung | | 4,600 | 4,600 |
| Kuiper | | 3,326 | 3,326 |
| Boeing | | 147 | 147 |
| Kepler | 140 | | 140 |
| LeoSat | 78 | 30 | 108 |
| Iridium Next | 66 | | 66 |
| SES O3b | 27 | | 27 |
| Facebook | 1 | | 1 |
| Total | 5,192 | 9,300 | 14,492 |

While space is a big place, there are some interesting challenges from having this many new objects in orbit. One of the biggest concerns is space debris. Low-earth-orbit satellites travel at a speed of about 17,500 miles per hour to maintain orbit. When satellites collide at that speed, they create a large number of new pieces of space junk, also traveling at high speed. NASA estimates there are currently over 128 million pieces of orbiting debris smaller than 1 centimeter and 900,000 objects between 1 and 10 centimeters.
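That 17,500 mph figure falls out of basic orbital mechanics. As a rough sanity check, the speed of a circular orbit is v = sqrt(mu / r), where mu is Earth’s gravitational parameter and r is Earth’s radius plus the satellite’s altitude; the 550 km altitude used below is just an illustrative low-earth orbit, not a figure from the text:

```python
import math

# Standard gravitational parameter of Earth and mean Earth radius.
MU_EARTH_KM3_S2 = 398_600.4418   # km^3/s^2
EARTH_RADIUS_KM = 6_371.0

def orbital_speed_mph(altitude_km: float) -> float:
    """Speed needed to maintain a circular orbit at the given altitude."""
    r = EARTH_RADIUS_KM + altitude_km          # orbital radius from Earth's center
    v_km_s = math.sqrt(MU_EARTH_KM3_S2 / r)    # circular-orbit velocity, km/s
    return v_km_s * 3600 / 1.609344            # convert km/s to mph

# An example 550 km orbit works out to roughly 17,000 mph, consistent
# with the ~17,500 mph figure quoted for low-earth satellites.
print(round(orbital_speed_mph(550)))
```

Note that lower orbits require higher speeds, which is part of why debris in low orbits is so destructive on impact.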

NASA scientist Donald Kessler described the dangers of space debris in 1978 in what’s now called the Kessler syndrome. Every space collision creates more debris, and eventually there will be a cloud of circling debris that will make it nearly impossible to maintain satellites in space. While scientists think that such a cloud is almost inevitable, some worry that a major collision between two large satellites, or malicious destruction by a bad-actor government, could accelerate the process and quickly knock out all of the satellites in a given orbit. It would be ironic if the world solves the rural broadband problem using satellites, only to see those satellites disappear into a cloud of debris.

Having so many satellites in orbit also concerns another group of scientists. The International Dark Sky Association has been fighting against light pollution that makes it hard to use earth-based telescopes. The group now also warns that a large number of new satellites will forever change our night sky. From any given spot on Earth, the human eye can see roughly 1,300 stars. These satellites are all visible, and once they are launched, mankind will never again see a natural sky that doesn’t contain numerous satellites at any given moment.

Satellite broadband is an exciting idea. The concept of bringing good broadband to remote people, to ships, and to airplanes is enticing. For example, the company Kepler listed above is today connecting to monitors for scientific purposes in places like the rims of volcanoes and on ocean buoys, helping us to better understand our world. However, in launching huge numbers of satellites for broadband we’re possibly polluting space in a way that could make it unusable for future generations.

Robocalls and Small Carriers

In July, NTCA filed comments in the FCC docket that is looking at an industry-wide solution to fight against robocalls. The comments outline some major concerns about the ability of small carriers to participate in the process.

The industry solution to stop robocalls, which I have blogged about before, is being referred to as SHAKEN/STIR. This new technology will create a cryptographically signed token that verifies that a given call really originated with the phone number listed in the caller ID. Robocalls can’t get this verification token. Today, robocallers spoof telephone numbers, meaning that they insert a calling number into the caller ID that is not real. These bad actors can make a call look like it’s coming from any number – even your own!

On phones with visual caller ID, like cellphones, a small token will appear to verify that the calling party is really calling from the number shown. Once the technology has been in place for a while, people will learn to ignore calls that don’t come with the token. If the industry does this right, it will become easier to spot robocalls, and I imagine a lot of people will use apps that automatically block calls without a token.
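The sign-and-verify idea behind the token can be sketched roughly as follows. The real system uses PASSporT tokens – JWTs signed with the originating carrier’s certificate – so the shared HMAC secret, function names, and claim layout below are simplified stand-ins for illustration, not the actual SHAKEN/STIR wire format:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical signing key; real SHAKEN uses per-carrier certificates,
# not a shared secret.
SECRET = b"demo-signing-key"

def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as used in JWT-style tokens."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_call(orig: str, dest: str, attest: str = "A") -> str:
    """Originating carrier attests that the caller really owns `orig`."""
    claims = {"orig": orig, "dest": dest, "attest": attest,
              "iat": int(time.time())}
    payload = _b64(json.dumps(claims, sort_keys=True).encode())
    sig = _b64(hmac.new(SECRET, payload.encode(), hashlib.sha256).digest())
    return f"{payload}.{sig}"

def verify_call(token: str) -> bool:
    """Terminating carrier checks the signature before showing 'verified'."""
    try:
        payload, sig = token.split(".")
    except ValueError:
        return False  # malformed token
    expected = _b64(hmac.new(SECRET, payload.encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

A spoofed call fails verification because the robocaller can’t produce a valid signature over the claimed originating number – which is exactly why participation requires the originating carrier’s switch to be part of the system.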

NTCA is concerned that small carriers will be shut out of this system, causing huge harm to them and their customers. Several network prerequisites must be in place to handle the SHAKEN/STIR token process. First, the originating telephone switch must be digital. Most, but not all, small carriers now use digital switches. Any telco or CLEC using an older non-digital switch will be shut out of the process, and to participate they’d have to buy a new digital switch. After the many-year decline in telephone customers, such a purchase might be hard to cost-justify. I suspect this will also be a problem for older PBXs – the switches operated by private businesses. The world is full of large legacy PBXs operated by universities, cities, hospitals, and large businesses.

Second, the SHAKEN/STIR solution is likely to require an expensive software upgrade for the carriers using digital switches. Again, due to the shrinking demand for selling voice, many small carriers are going to have a hard time justifying the cost of a software upgrade. Anybody using an off-brand digital switch (several switch vendors folded over the last decade) might not have a workable software solution.

The third requirement to participate in SHAKEN/STIR is that the entire path connecting a switch to the public switched telephone network (PSTN) must be end-to-end digital. This is a huge problem because most small telcos, CLECs, cable companies, and other carriers connect to the PSTN using older TDM technology (based upon multiples of T1s).

You might recall a decade ago there was a big stir about what the FCC termed a ‘digital transition’. The FCC at the time wanted to migrate the whole PSTN to a digital platform largely based upon SIP trunking. While there was a huge industry effort at the time to figure out how to implement the transition, the effort quietly died and the PSTN is still largely based on TDM technology.

I have clients who have asked for digital trunking (the connection between networks) for years, but almost none of them have succeeded. The large telcos like AT&T, Verizon, and CenturyLink don’t want to spend the money at their end to put in new technology for this purpose. A request to go all-digital is either flatly refused, or else a small carrier is told that they must pay to transport their network traffic to some distant major switching point in places like Chicago or Denver – an expensive proposition.

What happens to a company that doesn’t participate in SHAKEN/STIR? It won’t be pretty because all of the calls originating from such a carrier won’t get a token verifying that the calls are legitimate. This could be devastating to rural America. Once SHAKEN/STIR is in place for a while a lot of people will refuse to accept unverified calls – and that means calls coming from small carriers won’t be answered. This will also affect a lot of cellular calls because in rural America those calls often originate behind TDM trunking.

We already have a problem with rural call completion, meaning that there are often problems trying to place calls to rural places. If small carriers can’t participate in SHAKEN/STIR, after a time their callers will have real problems placing calls because a lot of the world won’t accept calls that are not verified with a token.

The big telcos have assured the FCC that this can be made to work. It’s my understanding that the big telcos have mistakenly told the FCC that the PSTN in the country is mostly all-digital. I can understand why the big telcos might do this because they are under tremendous pressure from the FCC and Congress to tackle the robocall issue. These big companies are only looking out for themselves and not the rest of the industry.

I already had my doubts about the SHAKEN/STIR solution because my guess is that bad actors will find a way to fake the tokens. One only has to look back at the decades-old battles against spam email and against hackers to understand that solving robocalling will require a long back-and-forth battle – the first stab at SHAKEN/STIR is not going to fix the problem. The process is even more unlikely to work if it doesn’t function for large parts of the country and for whole rural communities. The FCC needs to listen to NTCA and other rural voices and not create another disaster for rural America.

The Census Bureau and the Digital Divide

John Horrigan recently wrote an interesting article in The Daily Yonder that cited the results of a survey done by the Census Bureau. The agency conducts an annual survey called the American Community Survey (ACS) of 3.5 million households. In recent years the survey has included a few questions about broadband. The most recent ACS survey included questions about the digital divide. The results are at first glance a bit surprising.

The survey shows that more than 20.4 million homes have no broadband subscription at home. The survey shows that 5.1 million homes with no broadband connection are rural and 15.3 million homes are non-rural. Anybody who tracks rural broadband instantly doubts those numbers can be right. However, the Census Bureau uses its own definition of rural, which is different from the way most of the world thinks of rural versus urban.

According to the Census Bureau definition, rural is everything that is not urban. The Census Bureau looks at the country by regional clusters of population. They count two kinds of urban areas – urbanized areas (UAs), which are clusters with 50,000 or more people, and urban clusters (UCs), which have between 2,500 and 50,000 people. Most of us would consider many of the UCs to be rural, because within this category are a lot of rural county seats and the immediately surrounding areas. The Census statistics count a lot of people who live just outside of towns as urban, while our industry considers homes past the last cable company connection to be rural.

Horrigan interprets the results of the Census Bureau survey to mean that affordability is a bigger reason today than connectivity for why people don’t have broadband. He reached that conclusion by considering a recent Pew Research poll on the same topic that shows that more homes cite reasons other than availability for not having broadband.

The Pew Research survey asked households why they don’t have broadband. Respondents could supply more than one response.

  • 50% claimed that price was a major factor and 21% cited this as the primary reason.
  • 45% said that their smartphone could do everything they need.
  • 43% said they had good access to the Internet outside the home.
  • 31% said they couldn’t afford a computer.
  • Only 22% said that they couldn’t order a broadband connection, and only 7% said that was the primary reason they didn’t have broadband.

The Census Bureau also correlated their results with household income, and it’s not surprising that low-income households have a much lower broadband connection rate. The Census Bureau survey showed that only 59% of homes that make less than $20,000 per year have broadband. The subscription rate for all households making more than $20,000 is 88%.

Interestingly, the FCC doesn’t ask why people don’t have broadband. They interpret their mission as measuring broadband availability, and they count homes with or without broadband connections. This raises a few questions. What exactly is the FCC’s mandate from Congress – to make sure that Americans have a connection available to reach the Internet, or to make sure that Americans actually make those broadband connections? I read the FCC’s mandate from Congress as including some of both goals. If availability is not the primary reason why homes don’t have broadband, the FCC might get more bang for the buck by putting some effort into digital inclusion programs. According to the Horrigan article, there are now more homes that can’t afford broadband than homes that don’t have a connectivity option.

This implies the need for a much-improved Lifeline Fund. The current Lifeline program is likely not making a big difference in digital inclusion. It provides a small monthly subsidy of $9.25 per month for qualifying households to save money on either their telephone bill or their broadband bill. It’s also becoming increasingly hard to use Lifeline because the big telcos like AT&T are backing out of the program. Some cable companies provide low-cost broadband to homes with school students, but to nobody else – and cable companies don’t operate outside of towns.

In addition to a more effective Lifeline program, digital inclusion also means getting computers into homes that can’t afford them. I’ve written before about the non-profit group E2D that provides computers to school students in Charlotte, NC. Perhaps some of the Universal Service Fund could be used to assist effective groups like E2D to get more computers to more households.

My firm CCG conducts surveys, and in a few recent surveys in poor rural counties we’ve seen anecdotal evidence that a lot of homes don’t buy the slow DSL option available to them because of the price. These homes tell us that price matters more than connectivity. I don’t have an easy answer for the best way to promote digital inclusion. But there are folks in the country who have made amazing progress in this area, and perhaps the FCC should consider giving such groups some help. At a minimum, the FCC needs to recognize that, now that most homes have a broadband connection, price is a major barrier for the majority of those who are not connected.