Amazon Joins the Broadband Space Race

I wrote a blog just a few weeks ago about how OneWeb had fully leaped into the race to launch broadband satellites by putting a few test satellites into orbit and by raising a few billion more dollars to fund the venture.

It’s been rumored for several years that Amazon was also interested in the idea, but their plans have been under wraps. It just came to light that Amazon has taken the first public step and had the FCC file paperwork with the International Telecommunication Union giving notice of Amazon’s intent to launch satellites.

Amazon filed with the FCC under the name of Kuiper Systems LLC. Space fans will recognize the corporate name as a reference to the Kuiper belt, which is the area of the solar system past Neptune that is believed to contain numerous comets, asteroids and other small objects made largely of ice.

Amazon has big plans, and the ITU filing says the company wants to launch a constellation of 3,236 satellites in low earth orbit. That’s 784 satellites in orbit at 367 miles above the earth, 1,296 in orbit at 379 miles, and 1,156 in orbit at 391 miles. Added to the plans of the other companies talking about getting into the business, that brings the total to more than 10,000 planned satellites.
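For anyone who wants to check the arithmetic, the three orbital shells in the filing do sum to the announced constellation size – a trivial sketch in Python (the altitude-to-count mapping simply restates the figures above):

    # Quick check of the Kuiper constellation figures from the ITU filing:
    # three orbital shells, keyed by altitude in miles.
    shells = {367: 784, 379: 1296, 391: 1156}
    total = sum(shells.values())
    print(total)  # 3236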

We know that Jeff Bezos is serious about space. He owns a rocket business, Blue Origin, that is developing an orbital-class rocket called the New Glenn. That company already has contracts for future private launches for OneWeb and Telesat. Amazon also recently launched a cloud computing service known as AWS Ground Station that is intended to provide communications data links between earth and objects in space. We also found out recently that Bezos kept 100% control of Blue Origin as part of his divorce settlement.

None of the low-orbit satellite ventures have talked about broadband speeds, prices or customer penetration goals. The only one making any announcement was SpaceX, with Elon Musk saying that the Starlink satellites would be capable of making a gigabit connection to earth. But that’s a far cry from a realistic estimate of a broadband product and is the satellite version of the Sprint cellphone test that showed that millimeter wave spectrum could deliver gigabit speeds to a cellphone. It can be done, but it’s incredibly hard and would involve synchronizing big data pipes from multiple satellites to a single customer.

We got another clue recently when OneWeb asked the FCC for permission to eventually create 1 million links to earth-based receivers, meaning customers. That puts the satellite plans in perspective and shows that they are not trying to bring broadband to every rural customer. Still, one million satellite connections would represent about 10% of the rural homes in the US that don’t have broadband today. If that’s their US goal, it automatically tells me that prices will likely be high.

NASA and others in charge of space policy have also started talking recently about the potential dangers of having so many objects in orbit. We don’t know the size of the Amazon satellites yet, but Elon Musk has said his satellites would range in size from a refrigerator down to some no larger than a football. NASA is worried about collisions between manned spacecraft and satellites or space debris.

Amazon is still early in the process. They haven’t yet filed a formal proposal to the FCC discussing their technology and plans. They are several years behind OneWeb and Starlink in terms of getting a test satellite into orbit. But an Amazon space venture has the built-in advantage of being able to advertise a satellite broadband product on the Amazon website where the vast majority of Americans routinely shop. I can envision Amazon measuring the broadband speed of a customer connected to the Amazon website and popping up an offer to buy faster broadband.

It’s absolutely impossible to predict the impact these various satellite companies will have on US broadband. A lot of their impact is going to depend upon the speeds and prices they offer. A lot of rural America is starting to see some decent speeds offered by WISPs with newer radios. Every year some pockets of rural America are getting fiber and gigabit speeds. Where might the satellites fall into that mix? We can’t forget that the need for broadband is still doubling every three years, and one has to consider the speeds that homes will want a decade from now – not the speeds households want today. We’re at least a few years from seeing any low-orbit broadband connections and many years away from seeing the swarm of over 10,000 satellites that are planned for broadband delivery.

Why Offer Fast Data Speeds?

A commenter on an earlier blog asked a great question. They observed that most ISPs say that customer usage doesn’t climb when customers are upgraded to speeds faster than 50 Mbps – so why does the industry push for faster speeds? The question was prompted by the observation that the big cable companies have unilaterally increased speeds in most markets to between 100 Mbps and 200 Mbps. There are a lot of different answers to that question.

First, I agree with that observation and I’ve heard the same thing. The majority of households today are happy with a speed of 50 Mbps, and when a customer who already has enough bandwidth is upgraded, they don’t immediately change their downloading habits.

I’ve lately been thinking that 50 Mbps ought to become the new FCC definition of broadband, for exactly the reasons included in the question. This seems to be the speed today where most households can use the Internet in the way they want. I would bet that many households that are happy at 50 Mbps would no longer be happy with 25 Mbps broadband. It’s important to remember that just three or four years ago the same thing could have been said about 25 Mbps, and three or four years before that the same was true of 10 Mbps. One reason to offer faster speeds is to stay ahead of that growth curve. Household bandwidth and speed demand has been doubling every three years or so since 1980. While 50 Mbps is a comfortable level of home bandwidth for many today, in just a few years it won’t be.
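To put a rough number on that growth curve, here’s a minimal sketch of the projection, assuming demand keeps doubling every three years (the function name and the projection years are mine, for illustration):

    # Back-of-the-envelope projection assuming household bandwidth demand
    # keeps doubling every three years, starting from 50 Mbps today.
    def projected_demand(base_mbps, years, doubling_period=3.0):
        return base_mbps * 2 ** (years / doubling_period)

    for years in (3, 6, 9):
        print(f"{years} years out: {projected_demand(50, years):.0f} Mbps")
    # 3 years out: 100 Mbps
    # 6 years out: 200 Mbps
    # 9 years out: 400 Mbps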

It’s also worth noting that there are some households who need more than 50 Mbps because of the way they use the Internet. Households with multiple family members who all want to stream at the same time are the first to bump against the limitations of a data product. If ISPs never increase speeds above 50 Mbps, then every year more customers will bump against that ceiling and begin feeling frustrated with that speed. We have good evidence this is true from watching customers leave AT&T U-verse, which tops out at 50 Mbps, for faster cable modem broadband.

Another reason that cable companies have unilaterally increased speeds is to help overcome customer WiFi issues. Customers often don’t care about the speed in the room with the WiFi modem, but they do care about what they can receive in the living room or a bedroom several rooms away from the modem. Faster download speeds provide enough headroom that the weakened WiFi signal that makes it through internal walls still delivers acceptable speeds. The big cable companies know that increasing speeds cuts down on customer calls complaining about speed issues, and I’m pretty sure the cable companies would say that increasing speeds saves them money due to fewer customer complaints.

Another important factor is customer perception. I always tell people that if they have the opportunity, they should try a computer connected at gigabit speeds. A gigabit product ‘feels’ faster, particularly if the gigabit connection is on fiber with low latency. Many of us are old enough to remember the day we got our first 1 Mbps DSL or cable modem and got off dial-up. The increase in speed felt liberating, which makes sense because a 1 Mbps DSL line is roughly twenty times faster than dial-up and also has lower latency. A gigabit connection is twenty times faster than a 50 Mbps connection, and seeing it for the first time has that same wow factor – things appear on the screen almost instantaneously as you hit enter. The human eye is really discerning, and it can see a big difference between loading the same web site at 25 Mbps and at 1 Gbps. The actual time difference isn’t very much, but the eye tells the brain that it is. I think the cable companies have figured this out – why not give faster speeds if it doesn’t cost anything and makes customers happy?
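To put rough numbers on that, here’s a minimal sketch. The 3 MB page size is my assumption for illustration – real pages vary widely – and the math ignores latency, which is a big part of what the eye perceives:

    # Time to pull a single web page at different line speeds.
    # The 3 MB page size is assumed; latency is ignored.
    def transfer_ms(size_mb, speed_mbps):
        return size_mb * 8 / speed_mbps * 1000  # bytes -> bits, seconds -> ms

    for speed in (25, 50, 1000):
        print(f"{speed:>4} Mbps: {transfer_ms(3, speed):5.0f} ms")
    #   25 Mbps:   960 ms
    #   50 Mbps:   480 ms
    # 1000 Mbps:    24 ms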

While customers might not immediately use more broadband, I think increasing the speed invites them to do so over time. I’ve talked to a lot of people who have lived with inadequate broadband connections and they become adept at limiting their usage, just like we’ve all done for many years with cellular data usage. Rural families all know exactly what they can and can’t do on their broadband connection. For example, if they can’t stream video and do schoolwork at the same time, they change their behavior to fit what’s available to them. Even non-rural homes learn to do this to a degree. If trying to stream multiple video streams causes problems, customers quickly learn not to do it.

Households with fast and reliable broadband don’t give a second thought to adding another broadband application. It’s not a problem to add a new broadband device or to install a video camera at the front door. It’s a bit of a chicken-and-egg question – do fast broadband speeds promote greater broadband usage, or does the desire to use more applications drive the desire for faster speeds? It’s hard to know anymore, since so many homes have broadband speeds from cable companies or fiber providers that are set faster than what they need today.

An IoT Bill of Rights

Parks Associates recently released a report saying that the average broadband home now has 10 connected IoT devices. This includes desktops, laptops, tablets, and smartphones, but today it also includes a wide array of other devices such as smart TVs, smart speakers, and a host of smart home devices.

I remember back in 2013 when IoT was first being discussed that there was a lot of talk about creating an IoT Bill of Rights that would define the ethics that ought to be required for any smart device placed into people’s homes. The discussion then was that the benefits of smart devices could be outweighed by the harm that could come from IoT manufacturers secretly spying on us and collecting and selling personal data. There was also a lot of concern that IoT devices could provide entry points for hackers into home networks. That discussion largely died, and here we are six years later introducing IoT devices into our homes without any policies or standards defining the rights of smart device users or the obligations of manufacturers to protect privacy.

There were numerous concerns voiced in 2013 that are still valid concerns today, and unfortunately, are issues that most buyers of smart devices don’t think about:

Software Updates. We are used to routinely getting security patches and other software updates for our laptops and smartphones to keep us safe. However, few smart devices come with any mechanism for updates, and over time they become more vulnerable to hacking. You’ve probably heard the story of the casino that got hacked through a connection into a smart water pump in a fish tank. Hackers used that unprotected connection to gain access to the casino network. There ought to be a requirement that IoT software and firmware be kept updated, which would include figuring out how to deal with cases where a device manufacturer goes out of business. Unfortunately, most of our smart devices are never updated after we buy them.

Full Disclosure. There have been well-publicized cases where the public found out that IoT devices were listening in without their knowledge. There were big headlines when it was revealed that Samsung TVs could both listen and see into the living room. Parents panicked when it was revealed that Cayla dolls were listening to kids and sending conversations to an unspecified data center. It’s nearly impossible today to know if a smart device includes a hidden microphone or camera since those components are so small. Sellers of IoT devices should be required to clearly disclose when devices can watch or listen to buyers. They should also be required to provide clear instructions on how to disable unwanted surveillance.

The Sharing and Sale of IoT Data. Sellers of IoT devices ought to be required, before purchase, to provide full disclosure of what they do with the data they collect from users. These disclosures should be prominent and not buried in fine-print legalese in a terms-of-service document. I read late last year that as many as 1,500 data points are now gathered on the average connected adult every day. A lot of that comes from location data on our smartphones, but much of it also comes from IoT devices in the home. Manufacturers that violate privacy promises made to customers should be fined heavily.

Data Retention. IoT device manufacturers also ought to disclose how long they keep our data. It’s always an eye-opener to do a Google search on yourself and see things from fifteen and twenty years ago. In the early days of search engines there was talk about purging non-headline data after six months – that obviously never happened. We are just now seeing large companies figure out how to make sense of mountains of data. It’s dismaying to think that years of old data about us, probably never used until now, can be sold to create a personal profile on each of us.

User Control of Devices. In a perfect world, the user would have complete control over the IoT devices in the home. We ought to be able to decide what data is and is not shared. We ought to be able to disable surveillance. We ought to be able to encrypt and locally store the data we want to keep for ourselves.

We’ve come a long way with IoT since 2013. Back then there were only a handful of IoT devices like the Nest smart thermostat. If you believe the Parks Associates numbers, most of us have since brought numerous smart devices into our homes. I’d personally bet the Parks number of ten devices is low, because many of us own devices capable of connecting to our WiFi that we don’t even think about.

We’re allowing all of these devices in our homes without full disclosure from the manufacturers, with no mechanism for keeping device security up-to-date, and with no idea what data is collected on us and how it’s being used.

As consumers we ought to be able to trust that the manufacturers of IoT devices are protecting our data and privacy. It’s more likely, though, that many IoT device makers are hoping to monetize our data, and there’s no part of the government that I’m aware of that is working on the side of the consumer on these issues. We need an IoT bill of rights a lot more now than we did in 2013.

$20.4 Billion in Broadband Funding?

Chairman Ajit Pai and the White House announced a new rural broadband initiative that will provide $20.4 billion over ten years to expand and upgrade rural broadband. There were only a few details in the announcement, and even some of them sound tentative. A few things are probably solid:

  • The money would be used to provide broadband in the price-cap service areas – these are the areas served by the giant telcos.
  • The FCC is leaning towards a reverse auction.
  • The funding will support projects that deliver at least 25/3 Mbps broadband.
  • The program will be funded from the Universal Service Fund and will ‘repurpose’ existing funds.
  • The announcement alludes to awarding the money later this year, which would be incredibly aggressive.
  • This was announced in conjunction with the auction of millimeter wave spectrum – however this is not funded from the proceeds of that auction.

What might it mean to repurpose this from the Universal Service Fund? The fund disbursed $8.7 billion in 2018. We know of two major upcoming changes to USF disbursements. First, the new Mobility II fund to bring rural 4G service adds $453 million per year to the USF. Second, the original CAF II program that pays $1.544 billion annually to the big telcos ends after 2020.

The FCC recently increased the cap on the USF to $11.4 billion. Everybody was scratching their heads over that cap since it is so much higher than current spending. But now the number makes sense. If the FCC were to award $2.04 billion in 2020 for the new broadband spending, the fund would expand almost to that new cap. Then, in 2021 the fund would come back down to $9.6 billion after the end of CAF II. We also know that the Lifeline support subsidies have been shrinking every year and the FCC has been eyeing further cuts in that program. We might well end up with a fund by 2021 that isn’t much larger than the fund in 2018.
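Here’s a quick sketch of that arithmetic, with all figures in billions of dollars. The even $2.04 billion per year is my assumption – the FCC hasn’t said how the $20.4 billion would be spread:

    # USF arithmetic using the figures cited above (all $ billions).
    usf_2018      = 8.7        # total disbursed in 2018
    mobility_ii   = 0.453      # new annual rural 4G support
    caf_ii        = 1.544      # annual big-telco support, ends after 2020
    new_broadband = 20.4 / 10  # assumes even spending over ten years

    fund_2020 = usf_2018 + mobility_ii + new_broadband
    fund_2021 = fund_2020 - caf_ii
    print(f"2020: ${fund_2020:.1f}B vs. the $11.4B cap")  # 2020: $11.2B
    print(f"2021: ${fund_2021:.1f}B")                     # 2021: $9.6B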

There are some obviously big things we don’t know. The biggest is the timing of the awards. Will this be a one-time auction for the whole $20.4 billion or a new $2 billion auction in each of the next ten years? This is a vital question. If there is an auction every year, then every rural county will have a decent shot at the funding. That would give counties time to develop business plans and create any needed public-private partnerships to pursue the funding.

However, if the funding is awarded later this year in one big auction and then disbursed over ten years, then I predict that most of the money will go again to the big telcos – this would be a repeat of the original CAF II. That is my big fear. There was great excitement in rural America for the original CAF II program, but in the end that money was all given to the big telcos. The big telcos could easily promise to improve rural DSL to 25/3 Mbps given this kind of funding. They’d then have ten years to fulfill that promise. I find it worrisome that the announcement said that the funding could benefit around 4 million households – that’s exactly the number of households covered by the big telcos in CAF II.

What will be the study areas? The CAF II program awarded funding by county. Big study areas benefit the big telcos, since anybody else chasing the money would have to agree to serve the same large areas. Big study areas mean big projects, which will make it hard for many ISPs to raise any needed matching funds – large study areas would make it impossible for many ISPs to bid.

My last concern is how the funds will be administered. For example, the current ReConnect funding is being administered by the RUS which is part of the Department of Agriculture. That funding is being awarded as part grants and part loans. As I’ve written many times, there are numerous entities that are unable to accept RUS loans. There are features of those loans that are difficult for government entities to accept. It’s also hard for a commercial ISP to accept RUS funding if they already carry debt from some other source. The $20.4 billion is going to be a lot less impressive if a big chunk of it is loans. It’s going to be disastrous if loans follow the RUS lending rules.

We obviously need to hear a lot more. This could be a huge shot in the arm to rural broadband if done properly – exactly the kind of boost that we need. It could instead be another huge giveaway to the big telcos – or it could be something in between. I know I tend to be cynical, but I can’t ignore that some of the largest federal broadband funding programs have been a bust. Let’s all hope my worries are unfounded.

The National Broadband Penetration Rate

My firm CCG Consulting recently completed residential surveys in three cities where I found broadband penetration rates between 92% and 93%. Those are the highest broadband take rates I’ve ever seen. If I had encountered only one city with a penetration rate that high, I would assume that there is some reason why more people in that city have broadband. But now that I’ve seen three cities with the same high penetration rate I started to ask myself different questions. How unusual is it for cities to have penetration rates at that level? What penetration rates should I expect to see today in cities?

I first thought through the survey process. I’ve always found a well-designed survey to produce reliable results for questions like quantifying the market share of the major ISPs. I’ve worked with a few cities that had detailed customer penetration data from franchise fee reporting and in those cities our surveys closely matched that data. I’ve also worked in a few cities where we’ve done several surveys in a relatively short period of time and got nearly the same results from multiple surveys. I’ve come to trust survey results – as long as you follow good practices to make sure the survey is conducted randomly the results seem to be reliable.

I then turned to published industry statistics on the number of broadband customers to see what those told me. The two most cited statistics come from USTelecom and Leichtman Research Group (LRG). As of the end of 2017 USTelecom claimed that 79% of homes had a wired broadband connection, defined as any connection faster than 200 kbps, which eliminates dial-up. Leichtman Research Group claimed that 84% of homes had a wired broadband connection at the end of 2017, based upon a nationwide survey. Those numbers are significantly different. Luckily both groups also publish counts of national broadband subscribers, providing a second way to compare the two.

In the USTelecom Industry Metrics and Trends report from March 2018, USTelecom said there were 100 million residential broadband ‘connections’ at the end of 2017. They claim total broadband connections of 109 million when businesses are added.

Leichtman Research Group counts broadband ‘subscribers’ every quarter by gathering statistics from the financial reports of the largest ISPs. LRG includes all of the big ISPs, from Comcast down to Cincinnati Bell with its 300,000 broadband customers. LRG claims these large companies represent about 95% of the whole broadband market. LRG counted 95.8 million total broadband customers at the end of 2017 – a count that includes businesses. Grossing that up to add the remaining 5% of the market, LRG shows 100.8 million total broadband subscribers, including businesses – over 8 million fewer than what USTelecom counts.

That’s an astounding difference, and it’s obvious the two groups aren’t counting broadband customers the same way. There must be a difference between ‘subscribers’ and ‘connections’.

I’ve only come up with one reason why the counts would be that different. A lot of apartment complexes and business high-rises today are served by a big data pipe provided to the landlord, who then provides broadband to tenants. I’m guessing that the LRG numbers consider the big data pipe to be one broadband customer. In most cases the LRG numbers come from quarterly financial reports to shareholders, and my guess is that ISPs consider a subscriber to be an entity that receives a bill for broadband service.

I further postulate that USTelecom counts the number of tenants in those same buildings as ‘connections’. We know that big ISPs often do that. For example, AT&T agreed with regulators to pass 12.5 million new residences and businesses with fiber as part of its merger with DirecTV. It’s been clear that one of the big components of those new passings comes from units in apartment complexes. If AT&T built fiber past an apartment complex, it could count the units as passings to satisfy the FCC without ever having gotten the tenants as customers.

The other component of the penetration rate equation is the number of US households. That number is just as confusing, and I found a lot of different estimates. For example, the US Census says there were 137.4 million total living units at the end of 2017, with 118.8 million occupied living units. Statista estimates 127.6 million households at the end of 2018. YCharts shows 122.6 million households at the end of 2018. That’s a wide range of ways to count potential residential customers in the country.

Finally, when trying to estimate the broadband penetration rate to be expected in cities, you have to back the rural homes that can’t get broadband out of the equation. That’s also a difficult number to pin down, and I can find estimates that range from 6 million to 12 million homes with no broadband alternative.
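Pulling these pieces together shows just how slippery an ‘expected’ urban penetration rate is. Here’s a minimal sketch of the math using the ranges above – the combination of estimates you pick largely dictates the answer you get:

    # Urban penetration = subscribers / (households - unserved rural homes).
    # All figures in millions, from the estimates cited above.
    subscribers = 100.0      # USTelecom residential connections, end of 2017
    lrg_total = 95.8 / 0.95  # LRG count grossed up to 100% of market
    print(f"LRG grossed-up total: {lrg_total:.1f}M (includes business)")

    households = {"Census occupied": 118.8, "YCharts": 122.6, "Statista": 127.6}
    for label, homes in households.items():
        for unserved_rural in (6.0, 12.0):
            rate = subscribers / (homes - unserved_rural) * 100
            print(f"{label}, {unserved_rural:.0f}M unserved: {rate:.0f}%")
    # Answers range from roughly 82% to 94% urban penetration.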

The bottom line is that I don’t really know what I should expect as an urban broadband penetration rate. I can do math that supports a typical urban penetration rate of 92%. Mostly what I learned from this exercise is how careful I need to be when citing national broadband statistics – if you play it loose you can get almost any answer you want.

Setting Broadband Rates

One of the more interesting things about being a consultant is that I often get to work with new ISPs. One of the topics that invariably arises is how to set rates. There is no right or wrong answer, and I’ve seen different pricing structures work in the marketplace. Most rate structures fit into one of these categories:

  • Simple rates with no discounts or bundling;
  • Rates that mimic the incumbent providers;
  • High rates, but with the expectation of having discounts and promotions;
  • Complex rates that cover every imaginable option.

Over the years I’ve become a fan of simple rate structures for several reasons:

  • Simple rates make it easy for customer service reps and other employees.
  • It’s easy to advertise simple rates: “Our rates are the same for everybody – no gimmicks, no tricks, no hidden fees”.
  • It’s easy to bill simple rates. Nobody has to keep track of when special promotions are ending. Simple rates largely eliminate billing errors.
  • It eliminates the process of having to negotiate prices annually with customers. That’s an uncomfortable task for customer service reps. There are customers in every market who chase the cheapest rates and the latest special. Many facility-based ISPs have come to understand that such customers are not profitable if they only stay with the ISP for a year before chasing a cheaper rate elsewhere.
  • It’s easier for customers. Customers appreciate simple, understandable bills. Customers who don’t like to negotiate rates don’t get annoyed when their neighbors pay less than them. Simple rates make it easy to place online orders.

As a consumer I like simple rates. When Sling TV first entered the market they had two similar channel line-ups to choose from, with several additional options on top of each basic package. Since they were the only online provider at the time, I waded through the process of comparing the packages. But I was really annoyed that they made me do so much work to buy their product, and when a simpler provider came along I jumped ship. To this day I can’t figure out what Sling TV gained from making it so hard to compare their options.

ISPs can be just as confusing. I was looking online the other day at the packages offered by Cox. They must have fifty or sixty different triple and double play packages online and it’s virtually impossible for a customer to wade through the choices unless they know exactly what they want.

There are fiber overbuilders who are just as confusing. I remember looking at the pricing list of one of the earliest municipal providers. They had at least a hundred different speed combinations of upload and download speeds. I understand the concept of giving customers what they want, but are there really customers in the world who care about the difference between speed combinations like 35/5 Mbps, 38/5 Mbps, or 35/10 Mbps? I know several smaller ISPs who have as many options as Cox and have a different product name for each unique combination of broadband, video, and voice.

There is such a thing as being too simple. Google Fiber launched in Kansas City with a single product, $70 gigabit broadband. They were surprised to find that a lot of customers wouldn’t consider them since they didn’t offer video or telephone service. Over a few years Google Fiber introduced simple versions of those products and now also offers a 100 Mbps broadband product for $50. Even with these additions they still have one of the simplest product lineups in the industry – and they are now attractive to a lot more homes.

I know ISPs with complicated rates that have achieved good market penetration. But I have to wonder if they would have done even better had they used simpler rates and made it easier on their staffs and the public.

How We Use More Bandwidth

We’ve known for decades that the demand for broadband has been doubling every three years since 1980. At any point along that growth curve, there are those who look at the statistics and think that we are nearing the end of the curve. It’s hard for a lot of people to accept that bandwidth demand keeps growing along that steep curve.

But the growth is continuing. OpenVault, a company that measures broadband usage for big ISPs, recently reported that the average monthly data use for households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. What is astounding is the magnitude of the growth – an increase of 67.1 gigabytes in just one year. You don’t have to go back very many years to find a time when that number couldn’t have been imagined.

That kind of growth means that households are finding applications that use more bandwidth. Just in the last few weeks I saw several announcements that highlight how bandwidth consumption keeps growing. I wrote a blog last week describing how Google and Microsoft are migrating gaming to the cloud. Interactive gaming already uses a significant amount of bandwidth, but that usage is going to explode upwards when the machine operating the game is in a data center rather than on a local computer or game console. Google says most of its games will operate using 4K video, meaning a download speed of at least 25 Mbps for one stream plus an hourly download usage of 7.2 GB.

I also saw an announcement from Apple that users of the Apple TV stick or box can now use it with PlayStation Vue to watch up to four separate video streams simultaneously. That’s intended for the serious sports fan, and there are plenty of households that would love to keep track of four sporting events at the same time. If the four separate streams are broadcast in HD, that would mean downloading 12 GB per hour. If the broadcasts are in 4K, that would be an astounding 29 GB per hour.

The announcement that really caught my eye is that Samsung is now selling an 8K-capable TV. It takes a screen of over 80 inches for the human eye to perceive any benefit from 8K video. There are no immediate plans for anybody to broadcast in 8K, but the same was true when the first 4K TVs were sold – when people buy these TVs, somebody will film and stream content in the format. I’m sure 8K video will benefit from improved compression techniques, but without a new compression scheme an 8K video stream is 16 times larger than an HD stream – meaning a theoretical download of 48 GB per hour.
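Here’s a sketch of the video math behind those numbers. The 6.7 Mbps HD rate is my assumption, back-calculated from the roughly 3 GB per hour that a single HD stream consumes in the figures above:

    # Converting a steady stream bit rate to gigabytes downloaded per hour,
    # then scaling HD up to 8K by raw pixel count (16x, assuming no new
    # compression scheme).
    def gb_per_hour(mbps):
        return mbps * 3600 / 8 / 1000  # seconds/hour, bits -> bytes, MB -> GB

    hd_mbps = 6.7  # assumed HD stream rate, ~3 GB/hour
    print(f"One HD stream:   {gb_per_hour(hd_mbps):.0f} GB/hour")      # ~3
    print(f"Four HD streams: {4 * gb_per_hour(hd_mbps):.0f} GB/hour")  # ~12
    print(f"One 4K stream:   {gb_per_hour(16):.1f} GB/hour")           # 7.2
    print(f"One 8K stream:   {16 * gb_per_hour(hd_mbps):.0f} GB/hour") # ~48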

Even without these new gadgets and toys, video usage is certainly the primary driver of the growth in household broadband usage. In 2014 only 1% of homes had a 4K-capable TV – the industry projects that to pass 50% by the end of this year. As recently as two years ago you had to search to find 4K programming. Today almost all original programming from Netflix, Amazon, and others is shot in 4K, and the web services automatically feed 4K video to any customer connection able to accept it. User-generated 4K video, often uncompressed, is all over YouTube. There are now 4K security cameras on the market, just as HD cameras have completely replaced older analog cameras.

Broadband usage is growing in other ways. Cisco projects that machine-to-machine connections will represent 51% of all online connections by 2022, up from 40% today. Parks Associates just reported that the average broadband home now has ten connected devices, and those devices all make internet connections on their own. Our computers and cellphones automatically update software over our broadband connections. Many of us set our devices to automatically back up our hard drives, pictures, and videos to the cloud. Smart home devices constantly report back to the alarm monitoring service. None of these connections sounds large, but in aggregate they really add up.

And sadly, we’re also growing more inefficient. As households download multiple streams of music, video, and file downloads we overload our WiFi connection and/or our broadband connection and thus request significant retransmission of missing or incomplete packets. I’ve seen estimates that this overhead can easily average 20% of the bandwidth used when households try to do multiple things at the same time.

I also know that when we look up a few years from now and see that broadband usage is still growing, there will be a new list of reasons for the growth. It may seem obvious, but when handed enough bandwidth, households find a way to use it.

Please, Not Another Mapping Debacle

There are numerous parties making proposals to the FCC on how to fix the broken broadband mapping program. Today I want to look at the proposal made by USTelecom. On the surface the USTelecom proposal sounds reasonable. They want to geocode every home and business in the US to create a giant database and map of potential broadband customers. ISPs will then overlay speeds on the detailed maps, by address. USTelecom suggests that defining broadband by address will eliminate the problems of reporting broadband by Census block.

Their idea should work well for customers of fiber ISPs and cable companies. Customer addresses are either covered by those technologies or they’re not. But the proposed new maps won’t do much better than current maps for the other technologies used in rural America for a number of reasons:

  • Telcos that provide rural DSL aren’t going to tell the truth about the speeds being delivered. Does anybody honestly believe that, after taking billions of dollars to improve rural DSL, Frontier and CenturyLink are going to admit on these maps that customers in areas covered by CAF II are getting less than 10 Mbps?
  • In the telcos’ favor, it’s not easy for them to define DSL speeds. We know that DSL speeds drop with distance from the DSLAM transmitting point, so the speed is different for each customer, even over ideal copper.
  • Rural copper is far from ideal, and DSL speeds vary widely by customer due to local conditions. The quality can vary between wires in the same sheath due to damage or corrosion over time. The quality of the drop wire from the street to the house can drastically impact DSL speeds. Even the inside copper wiring at a home can have a big influence. We also know that in many networks DSL bogs down in the evenings due to inadequate backhaul, so time of day impacts the speed.
  • What is never mentioned when talking about rural DSL is how many customers are simply told by a telco that DSL won’t work at their home because of one of these reasons. Telcos aren’t reporting these customers as unservable today and it’s unlikely that they’ll be properly reported in the future.
  • Rural fixed wireless has similar issues. The ideal wireless connection has an unimpeded line-of-sight, but many customers have less than an ideal situation. Even a little foliage can slow a connection. Further, every wireless coverage area has dead spots where customers are blocked from receiving service. Like DSL, wireless speeds also weaken with distance – something a WISP is unlikely or unwilling to disclose customer by customer. And while WISPs can report what they are delivering to current customers, they have no way of knowing about other homes until they climb on the roof and test the line-of-sight.
  • It’s also going to be interesting to see if urban ISPs admit on maps to the redlining and other practices that have supposedly left millions of urban homes without broadband. Current maps ignore this issue.

USTelecom also wants to test-drive the idea of allowing individuals to provide feedback to the maps. Again, this sounds like a good idea. But in real life this is full of problems:

  • Homeowners often don’t know what speeds they are supposed to get, and ISPs often don’t list the speed on bills. The broadband map is supposed to measure the fastest speed available, and the feedback process will be a mess if customers purchasing slower products interject themselves into the process.
  • There are also a lot of home broadband problems caused by the customer. ISPs operating fiber networks say that customers claiming slow speeds usually have a WiFi problem. Customers might be operating ancient WiFi routers or measuring speed after the signal has passed through multiple interior walls.

I still like the idea of feedback. My preference would be to allow local governments to be the conduit for feedback to the maps. We saw that work well recently when communities intervened to fix the maps as part of the Mobility Fund Phase II grants that were intended to expand rural 4G coverage.

My real fear is that the effort to rework the maps is nothing more than a delaying tactic. If we start on a new mapping effort now, the FCC can throw up its hands for the next three years and take no action on rural broadband. They’ll have the excuse that they shouldn’t make decisions based on faulty maps. Sadly, my bet is that after those three years the new maps will be just as bad as the current ones – at least in rural America.

I’m not busting on USTelecom’s proposal as much as I’m busting on all proposals. We should not be using maps to decide the allocation of subsidies and grants. It would be so much easier to apply a technology test – we don’t need maps to know that fiber is always better than DSL. The FCC can’t go wrong with a goal of supplanting big telco copper.

Capping the Universal Service Fund

FCC Chairman Ajit Pai recently suggested capping the size of the total Universal Service Fund at $11.4 billion annually, adjusted going forward for inflation. The chairman has taken a lot of flack on this proposal from advocates of rural broadband. Readers of this blog know that I have been a big critic of this FCC on a whole host of issues. However, this idea doesn’t give me much heartburn.

Critics of the idea are claiming that this proves the FCC isn’t serious about fixing the rural broadband problem. I totally agree with that sentiment – this FCC has done very little to fix rural broadband. In fact, they’ve gone out of their way to try to hide the magnitude of the rural problem by fiddling with broadband statistics and by hiding behind the faulty carrier data that comes out of the FCC’s broadband mapping effort. My personal guess is that there are millions more homes without broadband than are being counted by the FCC.

With that said, the Universal Service Fund shouldn’t be the sole funding source for fixing rural broadband. The fund was never intended for that. It was created originally to promote the expansion of rural telephone service. Over time it became the mechanism to help rural telcos survive as other sources of subsidies like access charges were reduced. Only in recent years was it repositioned to fund rural broadband.

Although I’m a big proponent of better rural broadband, I am not bothered by capping the Universal Service Fund. First, the biggest components of the fund have been capped for years. The monies available for the rural high-cost program, the schools and libraries fund, and rural healthcare are already capped. Second, the proposed cap is a little larger than what’s being spent today and what has been spent historically. This doesn’t look to be a move by the FCC to take away funding from any existing program.

Consumers today fund the Universal Service Fund through fees levied against landline telephone and cellphones. Opponents of capping the fund apparently would like to see the FCC hike those fees to help close the rural broadband gap. As a taxpayer I’m personally not nuts about the idea of letting federal agencies like the FCC print money by raising taxes that we all pay. For the FCC to make any meaningful dent in the rural broadband issue they’d probably have to triple or quadruple the USF fees.

I don’t think there is a chance in hell that Congress would ever let the FCC do that – and not just this Congress, but any Congress. Opponents of Pai’s plan might not recall that past FCCs have had this same deliberation and decided that they didn’t have the authority to unilaterally increase the size of the USF fund.

If we want the federal government to help fix the rural broadband problem, unfortunately the only realistic solution is for Congress to appropriate real money to the effort. This particular Congress is clearly in the pocket of the big telcos, evidenced by the $600 million awarded for rural broadband in last year’s budget reconciliation process. The use of those funds was crippled by language inserted by the big telcos to make it hard to use the money to compete against them.

And that’s the real issue with federal funding. We all decry that we have a huge rural broadband crisis, but what we really have is a big telco crisis. Every rural area that has crappy broadband is served by one of the big telcos. The big telcos stopped making investments to modernize rural networks decades ago. And yet they still have the political clout to block federal money from being used to compete against their outdated and dying networks.

The FCC does have an upcoming opportunity to fund a new broadband program from the Universal Service Fund. After 2020, nearly $2 billion annually will be freed up in the fund at the end of the original CAF II program. If this FCC is at all serious about rural broadband, it should start talking this year about what to do with those funds. This is a chance for Chairman Pai to put his (USF) money where his mouth is.

Verizon to Retire Copper

Verizon is asking the FCC for permission to retire copper networks throughout its service territory in New York, Massachusetts, Maryland, Virginia, Rhode Island and Pennsylvania. In recent months the company has asked to kill copper in hundreds of exchanges in those states. These range from urban exchanges in New York City to exchanges scattered all over the outer suburbs of Washington DC and Baltimore. Some of these filings can be found at this site.

The filings ask to retire the copper wires. Verizon will no longer support copper in these exchanges and will stop doing any maintenance on it. The company intends to move people still served by copper over to fiber and is not waiting for the FCC notice period to make such conversions. Verizon is also retiring the older DMS telephone switches, purchased years ago from the long-defunct Northern Telecom. Telephone service will be moved to the more modern softswitches that Verizon uses for fiber customers.

The FCC process requires Verizon to notify the public about plans to retire copper, and if no objections are filed in a given exchange the retirement takes place 90 days after the FCC’s release of the public notice. Verizon has been announcing copper retirements since February 2017 and was forced to respond to intervention in some locations, but eventually refiled most retirement notices a second time.

Interestingly, much of the FiOS fiber network was built by overlashing fiber onto the copper wires, so the copper wires on poles are likely to remain in place for a long time to come.

From a technical perspective, these changes were inevitable. Verizon is the only big telco to have widely built fiber plant in residential neighborhoods, and it makes no sense to ask them to maintain two technologies in neighborhoods with fiber.

I have to wonder what took them so long to get around to retiring the copper. Perhaps we have that answer in the language in each FCC request where Verizon says it “has deployed or plans to deploy fiber-to-the-premises in these areas”. When Verizon first deployed FiOS, they did it in a helter-skelter manner, mostly sticking to neighborhoods with the lowest deployment cost, usually where they could overlash on aerial copper. At the time they bypassed places where other utilities were buried unless the neighborhood already had empty conduit in place. Perhaps Verizon has quietly added fiber to fill in these gaps or is now prepared to finally do so.

That is the one area of concern raised by these notices. What happens to customers who still only have a copper alternative? If they have a maintenance issue will Verizon refuse to fix it? While Verizon says they are prepared to deploy fiber everywhere, what happens to customers until the fiber is in front of their home or business? What happens to their telephone service if their voice switch is suddenly turned off?

I have to hope that Verizon has considered these situations and that they won’t let customers go dead. While many of the affected exchanges are mostly urban, many of them include rural areas that are not covered by a cable company competitor, so if customers lose Verizon service, they could find themselves with no communications alternative. Is Verizon really going to build FiOS fiber in all of the rural areas around the cities they serve?

AT&T is also working towards eliminating copper and offers fixed cellular as the alternative to copper in rural places. Is that being considered by Verizon but not mentioned in these filings?

I also wonder what happens to new customers. Will Verizon build a fiber drop to a customer who only wants to buy a single telephone line? Will Verizon build fiber to new houses, particularly those in rural areas? In many states the level of telephone regulation has been reduced or eliminated and I have to wonder if Verizon still sees themselves as the carrier of last resort that is required to provide telephone service upon request.

Verizon probably has an answer to all of these questions, but the FCC request to retire copper doesn’t force the company to get specific. All of the questions I’ve asked wouldn’t exist if Verizon built fiber everywhere in an exchange before exiting the copper business. As somebody who has seen the big telcos fail to meet promises many times, I’d be nervous if I was a Verizon customer still served by copper and had to rely on Verizon’s assurance that they have ‘plans’ to bring fiber.