Some Problems with the RDOF

The FCC recently published a set of proposed rules for conducting the $20.4 billion broadband grant program it has labeled the Rural Digital Opportunity Fund (RDOF). While the FCC is to be applauded for redirecting the funding that formerly supported the CAF II program, there are still some problems I foresee in the grant program as proposed.

Reverse Auction. Many of my concerns stem from the use of a reverse auction. I understand why politicians and policymakers like the idea – the general concept that those willing to take the least amount of subsidy get the funding somehow sounds fair – but a reverse auction is not going to result in the best use of these funds to bring permanent broadband solutions to rural America:

  • Favors Those Who Don’t Need the Money. We saw this in the CAF II reverse auction, where satellite broadband won a significant amount of funding. This time around there’s a good chance that a large amount of grant money will go to Elon Musk’s Starlink and the other low-orbit satellite providers. By definition, satellite technology has to cover everywhere to work – so the companies are going to launch the satellites anyway, with or without a subsidy. These companies can easily be the low bidders because getting anything out of the grant fund is still a great result for them. Are we going to be happy if the reverse auction hands billions of dollars to Elon Musk?
  • Favors the Lowest-Cost Technology. By definition, those planning to spend less per customer to bring broadband can accept less grant money and still be happy. This means the grants could again favor the big telcos tweaking DSL on ancient copper, should they choose to participate, and it would allow AT&T and Verizon to grab a lot of money to support rural cellular upgrades. While the FCC is planning to weight the bidding to promote faster technologies like fiber, if the weighting isn’t done right the funding will automatically favor lower-cost yet slower technologies. Maybe that’s what the FCC wants – to bring some broadband solution to the largest number of people – but in my view the better policy is to bring a permanent broadband solution, even if to a smaller subset of areas.
  • Discriminates Against High-Cost Areas. The areas that need broadband the most are the ones where it costs the most to build any infrastructure. Areas like Appalachia and Alaska are high cost because of topography, and anybody applying for grants in these areas likely can’t afford to reduce the percentage of grant funding they receive. The entire concept of a reverse auction, by definition, favors the parts of the country with the lowest construction costs. Applicants in the wide-open plains of the Midwest have a built-in advantage.

The Sheer Size of the One-Time Award. The grant awards are likely to be about a year away, and I wonder if there will be enough ISPs ready to bid in that short time frame. Bidders need to develop an engineering estimate and a business plan of sufficient quality to attract financing. If there are not enough ISPs ready for the auction in that time frame, even more of the money is likely to flow to big companies like the satellite providers, who would be glad to take the whole pot of funding. A better plan would have been to break this into several grant years and award some 10-year grants, some 9-year grants, and some 8-year grants.

No Real Penalties for Cheating. Companies don’t get penalized much for lying about the speeds they can deliver. We saw a few wireless providers in the CAF II reverse auction claim they could deliver 100 Mbps broadband to everybody. Unless somebody develops that technology in the next 2-3 years, they are going to deliver something less, at least to a large percentage of their coverage area. If a company gets a bidding credit by making a false claim, it should lose all of its funding and have to repay the FCC. The proposed penalties are not much more than a slap on the wrist and encourage companies to claim faster speeds than they can deliver.

Likely Excludes Some Bidders. The rules still seem to exclude entities that can’t get Eligible Telecommunications Carrier (ETC) status – a regulatory designation required to receive money from the Universal Service Fund – a status only available to entities that own the network and also act as the retail ISP. This would preclude entities like the PUDs, the public utility districts in Washington State that are required by law to operate open-access networks. It could also preclude certain kinds of partnerships where the retail ISP is different from the network owner – an arrangement we’re seeing a lot in partnerships between telcos and electric cooperatives. Anybody willing to invest in rural broadband should be eligible to participate.

Are You Paying to Spy on Yourself?

Geoffrey A. Fowler of the Washington Post recently engaged a data expert to track everything going on behind the scenes with his iPhone. What he found was surprising since Apple touts itself as a company that doesn’t invade user privacy. The various apps on his phone were routinely handing out his personal data on a scale that shocked him.

Fowler’s information was being gathered by trackers – software built directly into apps, which is different from the ad-tracking cookies we pick up from websites. App makers deliberately build trackers into their apps, and a user can’t get rid of them without getting rid of the app.

Most apps on his phone had these trackers, including Microsoft OneDrive, Intuit’s Mint, Nike, Spotify, The Washington Post, and the Weather Channel. Some apps came with numerous trackers. A food delivery app called DoorDash included nine separate trackers. Third parties must be paying to share app space, because the DoorDash app included trackers for Facebook and Google – those two companies know every time that app is used to order food.

Almost none of these apps disclosed the nature of what they were tracking. When first loaded, most apps ask for somewhat generic permission to track certain user data but don’t disclose the frequency or the extent to which they will gather data from a user.

This issue has relevance beyond privacy concerns because the apps on Fowler’s phone could collectively use as much as 1.5 gigabytes of data per month. Industry statistics show that the fastest-growing segment of Internet traffic is machine-to-machine communication, and these app trackers make a significant contribution to that traffic. Put bluntly, a lot of machine-to-machine traffic is either being used to back up files or to spy on us.

This has to be concerning to people who are still on measured cellular data plans. This unintended usage can cost real money and a user can end up paying to have trackers spy on them. Our cellphones are generating broadband usage without our knowledge, and mostly without our explicit permission. I’ve had months where I’ve barely roamed with my cellphone and still have seen more than a gigabyte of usage – I now understand where it’s probably coming from.

PCs and tablets have the same problem, although there the tracking comes more from marketing cookies that are loaded when we visit websites. I scrub these cookies from my computer routinely; my desktop is only used for work, and I still find 40 – 100 cookies every week. One of my blogs last year mentioned a guy who had gone on vacation for a month and was shocked when he returned to discover that his home network had used several gigabytes of data in his absence.

There are ways to block the trackers on your phone, but doing so mostly involves deleting apps or turning off permissions in your privacy settings, which largely means the apps won’t work. You can also take steps to disguise your data by passing everything through a VPN, but that doesn’t stop the data from being transmitted.

The phone manufacturers are complicit in this tracking. I just got a new Samsung Galaxy, and my new phone came with over 300 apps – most for services I don’t use, like Facebook, Spotify, and a ton of others. These various companies must have paid Samsung (or perhaps AT&T) to include their apps and their trackers. I’ll be spending a few days deleting or disabling most of these apps. I find it creepy that Facebook follows me even though I stopped using the site several years ago. And unlike when I download a new app, I didn’t have the opportunity to allow or deny permissions for the many apps on my new phone – I assume AT&T gave that permission.

It might be a generational thing, but it bothers me to have companies reaping my personal data without my permission and without disclosing what they are gathering or how they are using it. I know young people who are not bothered by tracking and assume that this is just a part of being connected.

The other big concern is that the tracking apps are contributing to the capacity problems on cellular networks. I just saw last week that the average US cellphone now uses about 6 GB of data per month. If trackers are pushing out even half a gigabyte per month, that is a significant contributor to swamped cellular networks. Cellphone companies are working furiously to keep ahead of the demand, and it must be maddening to cellular network engineers to know that 15% – 20% of network usage is being created behind the scenes by app trackers rather than by actions taken by users.
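As a rough sanity check on that share, here is a tiny back-of-the-envelope calculation in Python. It uses the 6 GB average cited above and assumes, purely for illustration, that background tracker traffic runs somewhere between 0.5 and 1.5 GB per month per phone (the high end echoing the estimate from Fowler’s phone earlier in this piece):

```python
average_monthly_usage_gb = 6.0        # average US cellphone usage cited above
tracker_usage_range_gb = (0.5, 1.5)   # assumed range of background tracker traffic

for tracker_gb in tracker_usage_range_gb:
    share = tracker_gb / average_monthly_usage_gb
    print(f"{tracker_gb} GB of tracker traffic is {share:.0%} of monthly usage")
```

That works out to roughly 8% to 25% of the average bill’s usage – a range that brackets the 15% – 20% figure above.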

In an ideal world, this is something regulators would investigate and establish rules for. Apps like DoorDash shouldn’t be allowed to install a Facebook tracker on your phone without asking for specific and explicit permission. All trackers should have to disclose the exact information they gather about a user and the frequency of that tracking. Unfortunately, this FCC has walked away from any regulatory role in this area. Congress could address the issue – something that European regulators are considering – but this doesn’t seem to be high on anybody’s radar.

Court Chips Away at 5G Deployment Rules

The US Court of Appeals for the D.C. Circuit ruled last week that the FCC had gone too far when it ruled that 5G cell site placement could bypass environmental and historic preservation review. The specific ruling looked at whether the FCC has the authority to bypass these kinds of reviews for sites of religious and cultural importance to federally recognized Indian Tribes. But the ruling has a far larger significance and applies to these kinds of reviews everywhere.

This type of court ruling seemed inevitable because of the brashness of the original FCC order. That order declared that the deployment of 5G is so important that the rules that apply everywhere else to the deployment of new infrastructure shouldn’t apply. For courts to buy that argument, they must be convinced that 5G deployment is so important that it is indeed a national emergency.

I think everybody who understands the benefits of 5G agrees that it is an important new technology – one that will create huge benefits for the country. But it’s hard to make an argument that 5G deployment is an emergency.

The biggest benefits of 5G will only manifest with the introduction of frequency slicing into the cellular network, and that looks to be 3 – 4 years away. The deployments that the cellular carriers are labeling as 5G today are mostly marketing gimmicks, and customers are not yet seeing any of the real benefits of 5G.

I blame the original FCC 5G order on a poorly chosen strategy by the cellular carriers, abetted by the FCC. We are facing a cellular emergency in this country, but it’s a crisis of 4G, not 5G. Our existing 4G network is in serious trouble, and it seems the cellular carriers don’t want to admit it. Cellular data networks are swamped because customer data usage is doubling roughly every two years. I have seen big problems in my local AT&T network – there have been many days when it’s hard to make or hold a call, something that never happened before last year.

The explosive growth of cellular traffic is partially the fault of the cellular carriers – it’s a result of ‘unlimited’ data plans that encourage people to watch video and use cellular data. It wasn’t that long ago that it cost a lot to buy a data plan that exceeded 1 or 2 gigabytes of usage per month. The average customer with an unlimited plan now uses 6 GB per month, and that number is growing rapidly.

The other cause of the increased demand on cellular networks is the industry’s success in convincing nearly everybody to use a smartphone. A recent Pew poll showed that 95% of teens and young adults now have a smartphone. The sheer number of customers is swamping the networks.

There is a path out of the current data crisis for cellular networks. It’s a three-pronged approach: building more cell sites, adding more bands of frequency to cellphones, and finally layering on the frequency-slicing capabilities of 5G.

It takes 3 – 5 years to introduce a new frequency into the cellular network. That involves upgrading cell sites, but more importantly, it means building the capability into handsets and then getting the new phones into the hands of enough people to make a difference.

With real 5G benefits still a few years off, the only immediate way to relieve pressure on the cellular network is to add small cell sites. Each small cell site grabs local callers and keeps them off the big tall cell towers. All of the hectic small cell site construction we see is not being done for 5G – it’s being done to take the pressure off the 4G network.

The big cellular companies seem unwilling to admit that their networks are hurting and in danger of overload – the first company brave enough to say so would probably lose customers. Instead, the cellular industry elected to push the 5G narrative as the reason for bypassing the normal way we build infrastructure. In this case, the courts didn’t buy that 5G is an emergency, and the court is right, because 5G isn’t even here yet. If the cellular carriers and the FCC had declared a 4G emergency, I think everybody would have gotten it. We all want our cellphones to work.

The courts are still reviewing an appeal on an issue with even more dire potential consequences for the cellular carriers. Probably the most important aspect of the FCC’s 5G ruling is that cities have little say about the placement of small cell sites and must also expedite permitting for new small cell sites. That ruling was challenged by numerous cities and is being reviewed by the US Court of Appeals for the Ninth Circuit. That issue also boils down to the question of whether deploying 5G is an emergency. I wonder if it’s too late for the cellular carriers to fess up and admit that the emergency is really for 4G – even appeals court judges would likely understand that.

How Smart are Promotional Rates?

I think the big ISPs are recognizing the impact that promotional rates have on their bottom line. Promotional pricing is the low rates that cable companies offer to new customers to pry them away from the competition. Over the years promotional rates have also become the tool that cable companies use to retain customers. Most customers understand that they have to call the cable company periodically to renegotiate rates – and the big ISPs have routinely given customers a discount to keep them happy.

We’re finally seeing some changes in this practice. When Charter bought Time Warner Cable, it found that Time Warner had over 90,000 ‘special’ pricing plans – the company routinely negotiated separately with customers when they bought new service or renegotiated prices. Charter decided to end the practice and told most former Time Warner customers they had to pay full price at the end of their current contract period.

We’ve seen the same thing with AT&T and DirecTV. The company decided last year to eliminate special discounts on DirecTV and DirecTV Now. When the discount period ends for those products, the company moves rates to the full list price and refuses to renegotiate. The practice cost AT&T almost a million customers in the first quarter of this year alone, but AT&T says it is glad to be rid of customers who are not contributing to the bottom line. I’ve seen where the CEOs of other big ISPs like Comcast have said they are considering changes to these practices.

At CCG we routinely examine customer bills from incumbent ISPs as part of the market research we do to help ISPs enter new markets. While our examination of customer bills has never amounted to a statistically valid sample, I can report that the vast majority of bills we see carry at least some level of discount. In some markets it’s rare to find a customer bill with no discount.

The discounts must add up to a huge loss of revenue for the big ISPs. The big ISPs all know that one of the only ways they are going to stay profitable in the future is to raise broadband rates every year. The growth of broadband customers is slowing nationwide since most homes already have broadband, although Charter and Comcast are still enjoying the migration of customers off DSL. The ISPs continue to lose revenue and margin as they shed cable and landline voice customers. Most US markets are seeing increased competition for broadband service to businesses and large MDUs. There’s not much left other than raising residential broadband rates if the big ISPs want to satisfy the revenue growth expected by Wall Street.

If the big ISPs phased out promotional discounts it would probably equate to a 5% to 10% revenue increase. This is becoming easier for a cable company to do. Many of them have already come to grips with cord cutting, and many are no longer fighting to keep cable customers. Cable companies are also less worried over time about customers leaving to go back to DSL – a choice that is harder for consumers to make as the household need for broadband continues to climb.
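Here is a minimal back-of-the-envelope sketch of where a number in that range could come from. The discount penetration and average discount size are hypothetical figures chosen for illustration, not drawn from any ISP’s actual data:

```python
# Hypothetical inputs: share of customers on a promotional discount and the
# average size of that discount as a fraction of the list-price bill
discount_penetration = 0.40   # 40% of customers currently get some discount
average_discount = 0.20       # averaging 20% off the list price

revenue_lift = discount_penetration * average_discount
print(f"Phasing out discounts would lift revenue by about {revenue_lift:.0%}")  # ~8%
```

Tweak either assumption a little and the result moves around within that 5% to 10% range.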

Most ISPs won’t make a loud splash about killing discounts but will just quietly change policies. After a few years, I expect customer expectations will reset once people realize they can no longer extract discounts by threatening to drop service.

I’ve always advised my fiber overbuilder clients not to play this game. I ask clients if they really want to fight hard to win the slice of the market made up of customers who will change ISPs for a discount. Such customers flop back and forth between ISPs every two years, and in my opinion, companies are better off without them. Churn is expensive, and it’s even more expensive if an ISP provides a substantial discount to stop a customer from churning. Not all of my clients agree with this philosophy, but if the big ISPs stop providing promotional discounts, then over time the need for competitors to do so will lessen.

This is certainly a practice I’d love to see slip into history. I’ve never liked it as a customer because I despise having to play the game of renegotiating with an ISP every few years. I’ve also hated it as a consultant. Too many times I’ve seen clients give away a huge amount of margin through these practices – revenue that is needed to meet their forecasts and budgets. It’s dangerous to let marketing folks determine the bottom line because they’ve never met a discount they didn’t like – particularly if they can make a bonus for selling or retaining customers.

FCC Proposes Rules for $20.4 Billion Broadband Grants

On August 2 the FCC released a Notice of Proposed Rulemaking (NPRM) that proposes rules for the upcoming grant program that will award $20.4 billion for rural broadband. Since every FCC program needs a name, this grant program is now designated as the Rural Digital Opportunity Fund (RDOF). An NPRM is theoretically only a list of suggestions by the FCC, and there is a comment period that will commence 30 days after the NPRM is posted in the Federal Register. However, realistically, the rules that are proposed in the NPRM are likely to be the rules of the grant program. Here are a few of the highlights:

Timing of Award. The FCC proposes awarding the money in two phases. The Phase I auction will be held late next year and will award over $16 billion. Phase II will follow and award the remaining $4.4 billion. I know a lot of folks were hoping for a $2 billion annual grant award – but most of the money will be awarded next year. Anybody interested in this program should already be creating a network design and a financial business plan, because the industry resources that create business plans are soon going to be too busy to help.

The money will be paid out to grant recipients over 10 years, similar to the ACAM program for small telcos. Grant recipients need to understand the time value of money. If an ISP wins a $1 million grant and borrows money at 5.5% interest, the actual value of the grant in today’s dollars is a little more than $750,000.
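Here is a minimal sketch of that present-value math in Python. It assumes the grant is paid in ten equal annual installments and discounted at the 5.5% borrowing rate mentioned above – the even payment schedule is a simplifying assumption for illustration, not something spelled out in the NPRM:

```python
def present_value(total_grant, years=10, discount_rate=0.055):
    """Discount equal annual grant payments back to today's dollars."""
    annual_payment = total_grant / years
    return sum(annual_payment / (1 + discount_rate) ** year
               for year in range(1, years + 1))

# A $1 million award paid out over 10 years, discounted at 5.5%
print(round(present_value(1_000_000)))  # roughly $754,000 in today's dollars
```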

Areas Eligible for Award. The Phase I auction will only be held in areas that are wholly unserved, using the definition of broadband as 25/3 Mbps – areas where nobody can get broadband at that speed or faster. The FCC is likely to publish a list of areas eligible for the Phase I grants. Unfortunately, the FCC will use its flawed mapping program to make this determination, which likely means that many parts of the country that ought to be eligible for these grants won’t be part of the program.

Phase II is likely to be targeted at areas that did not see awards in Phase I. One of the open questions in the NPRM is the size of the award areas. The NPRM asks whether the minimum coverage area should be a census block or a county. It also asks whether applicants can bundle multiple areas into one grant request.

The FCC is considering prioritizing areas it thinks are particularly needy. For example, it may give extra grant weighting to areas that don’t yet have 10/1 Mbps broadband. The FCC is also planning on giving extra weighting to some tribal areas.

Weighting for Technology. As with the CAF II reverse auction, the grant program is going to try to give priority to faster broadband technologies. The FCC is proposing extra weighting for technologies that can deliver at least 100 Mbps and even more weighting for technologies that can deliver gigabit speeds. It is also proposing a bidding disincentive for technologies with latency greater than 100 milliseconds.

Use of Funds. Recipients will be expected to complete construction to 40% of the grant-eligible households by the end of the third year, with 20% more expected annually and the whole buildout finished by the end of the sixth year.

Reverse Auction. The FCC is proposing a multi-round, descending clock reverse auction in which the bidders willing to accept the lowest amount of subsidy per passing win the awards. This is the same process used in the CAF II reverse auction.
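As a rough illustration of how a weighted descending clock plays out, here is a toy Python sketch. The clock step, the technology weights, and the three hypothetical bidders (each for a separate eligible area, sharing one budget) are invented numbers for illustration only – the NPRM does not lock in specific weights – but the mechanics follow the description above: the offered support falls each round, a bidder drops out once the weighted offer no longer covers the minimum subsidy it needs, and the auction stops when the remaining demand fits the budget.

```python
def run_clock(bidders, budget, step=5):
    """Toy descending-clock auction. Support is offered as a percent of each
    area's reserve price, reduced by a technology weight, and the clock keeps
    falling until the bids still standing fit within the budget."""
    clock = 100  # percent of reserve price
    while clock > 0:
        active = [b for b in bidders
                  if (clock - b["weight"]) / 100 * b["reserve"] >= b["min_support"]]
        cost = sum((clock - b["weight"]) / 100 * b["reserve"] for b in active)
        if cost <= budget:
            return clock, active
        clock -= step
    return 0, []

bidders = [  # hypothetical bidders, one per eligible area
    {"name": "fiber ISP",      "reserve": 1_000_000, "min_support": 600_000, "weight": 0},
    {"name": "fixed wireless", "reserve": 1_000_000, "min_support": 450_000, "weight": 15},
    {"name": "satellite",      "reserve": 1_000_000, "min_support": 200_000, "weight": 30},
]
clock, winners = run_clock(bidders, budget=1_200_000)
print(clock, [b["name"] for b in winners])
```

In this contrived example the clock has to fall so far to fit the budget that only the bidder needing the least subsidy survives – which is exactly the concern about reverse auctions raised in the first blog above.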

Overall Eligibility. It looks like the same eligibility rules will apply as with previous grants. Applicants must be able to obtain Eligible Telecommunications Carrier (ETC) status, meaning they must be a facilities-based retail ISP. This will exclude entities such as open access networks where the network owner is a different entity from the ISP. Applicants will also need to have a financial track record, meaning start-up companies need not apply, and they must provide proof of financing.

Measurement Requirements. Grant winners will be subject to controlled speed tests to see if they are delivering what was promised. The FCC is asking whether it should keep the current test – where only 70% of customers must meet the speed requirements for an applicant to keep full funding.

I see problems with a few of these requirements that I’ll cover in upcoming blogs.

Looking Back at Looking Forward

I find it interesting to sometimes look back a few years to see what predictions were made about the future of the telecom industry. Five years ago I went to an NTCA conference where several speakers made predictions about the industry, particularly as it would impact rural America. It’s interesting to look at what was predicted about today just a few years ago. Some predictions were dead-on and others fizzled. Following are some of the more interesting misses.

Broadband Technologies. There were predictions that by 2020 we’d see upgrades to G.fast in rural copper networks and to next-generation PON equipment for fiber deployments. Neither happened, for various reasons. US telcos have never embraced G.fast, although there is widespread adoption in Europe, where copper networks are delivering 300 Mbps speeds to customers. The big telcos in the US are making no investments in rural copper unless the FCC fully funds them. Many smaller telcos have taken advantage of changes in the Universal Service Fund to upgrade from copper to fiber rather than upgrade DSL. Next-generation PON electronics are still waiting for one big ISP to buy enough gear to lower prices.

Widespread 5G. It’s not hard to understand why this was believed in 2014, since the big carriers were already in hype mode even then. One prediction was that as many as 60% of cellphones would be 5G by 2020. There were several predictions that 5G was going to enable autonomous vehicles and that building fiber along highways would be routine by 2020. There was a prediction that we’d see small cells everywhere, with deployments every 3,000 feet.

The timing of 5G is far behind those predictions. Cisco recently estimated that only 3% of cellphones worldwide will be 5G-enabled by 2022. Most experts today believe that cellular networks will still rely predominantly on 4G LTE even a decade from now. The idea of building a cellular network for autonomous vehicles died – it was always hard to imagine the revenue stream that would have supported that network. We may still get to a dense small cell network someday, but a small cell every 3,000 feet still sounds incredibly aggressive even decades from now.

IoT and LPWAN. There was a prediction that by 2020 we’d have deployed low-bandwidth networks using 900 MHz spectrum to connect huge numbers of outdoor IoT sensors. The prediction was that there was a huge revenue opportunity in charging $1 per month for each sensor. There are still those calling for these networks today, but the idea still hasn’t gotten any widespread traction.

Widespread Adoption of Augmented and Virtual Reality. These technologies were on everybody’s future list in 2014. Oculus Rift was the leader in developing virtual reality, and Magic Leap had raised several rounds of funding to develop augmented reality. There is now a sizable gaming market for virtual reality, but the technology has not yet touched the average person or moved beyond gaming. Magic Leap finally started selling a developer headset at the end of last year.

We Should Be Overrun by Now with Robots and Drones. In 2014 there were predictions of robots everywhere by 2020. New factories are staffed today by robots, but robots are still news when they are used in a public-facing function. A few hotels are trying out a robot concierge. There are a few automated fast-food restaurants. There are a few hospitals with robots that transport meals and medicines. Robots deliver take-out food in a few city centers and university towns.

Drones are quietly being used for functions like mapping and inspecting storm damage. Flying small drones is now a popular hobby. Amazon keeps experimenting with drone delivery of packages, but it’s still in the trial stage. Commercial use of drones is still in its infancy.

Use of Data. My favorite prediction was that by 2020 we’d have software systems that could deliver data to the right place, at the right time, to the right person, on the right device. This harkens back to the old AT&T promise that someday we’d be able to watch any movie we wanted, the minute we wanted. To some degree that old promise came to pass, although it was implemented by somebody other than AT&T.

Some businesses are meeting parts of this prediction today with custom platforms that send trouble tickets to technicians, notify employees to connect a new customer, automate the ordering of inventory, and so on. However, nothing close to that promise has yet made it into our everyday lives. In fact, except for Candy Crush, most of us probably still have the same apps on our smartphones we used in 2014. Many of us are still waiting for the digital assistant we were first promised a decade ago.

Got Some Things Right. It’s easy to pick on predictions that never came to pass, and I’ve made plenty of those myself. There were some great predictions in 2014. One presenter said we’d continue to see explosive growth of residential data usage, which would keep growing at 24% per year – that’s still a dead-on prediction. There was a prediction that businesses would migrate employees to mobile devices, and it is routine today to see employees in all sorts of businesses working from a tablet. There was a prediction of explosive growth in machine-to-machine data traffic, and today this is one of the areas of fastest traffic growth.

Is AT&T the 800-pound Gorilla?

For years it’s been understood in the industry that Comcast is the hardest incumbent to compete against. Comcast is still a cable company, and many people dislike cable companies – but it has been the most formidable competitor. The company is reported to have the highest gross margins on cable TV and might be one of the few companies still making a significant profit on cable. Much of that is due to its extensive programming holdings – it’s easier to make money on cable when you own your own programming. Comcast has also been the best in the industry at creating bundles to lock in customers – bundling things like smart home service and, more recently, cellular service.

But the new 800-pound gorilla in the industry might be AT&T. The company seems to be finally shaking off the transition period of integrating its purchase of Time Warner. It can be argued that the programming that came from that merger – things like HBO, CNN, and blockbuster movies – will make AT&T an even more formidable competitor than Comcast.

AT&T will be launching its new streaming service, AT&T TV, next month. The company already has one of the largest streaming services with DirecTV Now. It’s been rumored that the new streaming service will start at a price around $18 per month – an amazingly low price considering that HBO alone retails for $15 per month online today. The company is trying to coax more money out of the millions of current HBO subscribers, and this pricing will also lure customers to drop HBO bought from cable companies and instead purchase it online.

AT&T has also been building fiber for the last four years and says that it now passes 20 million homes and businesses. The company recently announced the end of the big fiber push and will likely now concentrate on selling to customers within that big footprint. AT&T is one of the more aggressive marketers and has sent somebody to my door several times in the last year – a sign of a company that is working hard to gain broadband subscribers.

The one area where AT&T is still missing the boat is in not bundling broadband and cellular service. AT&T is still number one in the country in cellular customers, with almost 160 million at the end of the recently completed second quarter. For some reason, the company has never tried to create bundles for that large customer base.

AT&T has most recently been conducting a customer purge at DirecTV. For years that business bought market share by offering prices significantly below landline cable TV. Over the last year, the company has been refusing to renew promotional pricing deals and is willing to let customers walk. In the first quarter of this year alone the company lost nearly one million customers. The company says it is not unhappy to see these customers leave since they weren’t contributing to the bottom line. This is a sign of a company that is strengthening its position by shedding the cost of serving unprofitable customers.

AT&T has also pushed a few net neutrality issues further than the other incumbents. As a whole, the industry seems to be keeping a low profile on practices that would be identified as net neutrality violations. There is speculation that the industry doesn’t want to stir up public ire on the topic and invite a regulatory backlash if there is a change of administration.

AT&T widely advertised to its cellular customers earlier this year that it would not count DirecTV Now usage against cellular or landline data caps, and the same will likely be true for AT&T TV. Favoring one’s own service over the competition is clearly one of the things net neutrality was intended to stop. Since there are data caps on both cellular and AT&T landline products, the move puts Netflix and other streaming services at a competitive disadvantage. That disadvantage will grow over time as more landline customers hit the AT&T data caps.

AT&T has made big mistakes in the past. For instance, the company poured a fortune into promoting 50 Mbps DSL instead of pushing for fiber a decade sooner. It launched its cable TV product just as that market peaked. The company seemed to lose sight of all landline and fiber-based products for a decade when everything it did was for cellular – I remember a decade ago having trouble even finding mention of the broadband business in the AT&T annual report.

We’ll have to wait a few years to see if a company like AT&T can reinvent itself as a media giant. For now, it looks like they are making all of the right moves to take advantage of their huge resources. But the company is still managed by the same folks who were managing it a decade ago, so we’ll have to see if they can change enough to make a difference.

Broadband and the Census

The US government is gearing up for the 2020 Census, which will be administered starting next April. For the first time, the Census is going to rely heavily on people answering the census questions online. Live census takers will then follow up with those who don’t submit an online response.

This seems like an odd decision since there are still a lot of people who don’t have home broadband. It looks like a poorly conceived idea to those of us who understand the FCC’s dirty little secret – the FCC has no idea how many homes don’t have broadband.

As a country, we care a lot about an accurate Census. The census data is used for multiple government purposes. The decennial Census is used to redraw both federal and state political boundaries. The Census determines the number of US House representatives allotted to each state. The government uses the Census to allocate funding for the numerous federal programs that distribute money by population. If an area of the country is undercounted, it loses both political representation and federal funding for a wide variety of purposes.

This all means there is a significant downside risk for any part of the country that is undercounted in the Census. The Census Bureau is hiring 175,000 fewer door-to-door census takers nationwide to follow up on those who don’t answer the first wave of the Census, and one has to wonder if it will be equipped to cope when huge portions of rural America don’t respond to the online census request.

As I said earlier, we have no idea as a country how many people don’t have home broadband. According to the FCC maps, there are still 21 million people in rural America with no access to broadband. However, everybody understands that this number is understated due to the idiotic rules the FCC uses to count broadband customers. We use a self-reporting system where ISPs tell the FCC about their broadband coverage, and we know that many ISPs have overstated the speeds they can deliver along with the areas of their coverage. That’s bad enough, but the FCC then compounds the error by assuming that if a census block has at least one broadband customer then the whole block has broadband. Anybody living in rural America understands how large an area a single census block can cover.
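To make that overcounting mechanic concrete, here is a minimal Python sketch of the census-block rule described above, using made-up locations – if even one location in a block is reported as served, the whole block gets counted as served:

```python
from collections import defaultdict

# Hypothetical (block_id, location_id, served) records of the kind ISPs report
records = [
    ("block_A", 1, True),    # one served farmhouse near town...
    ("block_A", 2, False),   # ...while the rest of the block has nothing
    ("block_A", 3, False),
    ("block_B", 4, False),
    ("block_B", 5, False),
]

by_block = defaultdict(list)
for block, _loc, served in records:
    by_block[block].append(served)

# FCC-style count: every location in a block counts as served if any one is
fcc_served = sum(len(flags) for flags in by_block.values() if any(flags))
# Reality: count only the locations that actually have service
actual_served = sum(sum(flags) for flags in by_block.values())

print(fcc_served, "locations counted as served vs", actual_served, "actually served")
```

With these made-up numbers the official count triples the number of served locations, and the distortion only gets worse as rural census blocks get geographically larger.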

Others who count broadband paint a very different picture than the FCC. The one with the widest reach and most credibility is Microsoft. The company is able to measure the speed of downloaded software upgrades – a method that reveals the real broadband situation at a home. Microsoft estimates that 162 million people in the US aren’t using the Internet at speeds that meet the FCC’s definition of broadband, 25/3 Mbps. But Microsoft has no way of counting homes with no broadband at all.

This is not just a rural problem. It’s always been suspected that there are millions of homes in older urban areas that don’t have access to broadband. There are apartments and little pockets of neighborhoods everywhere that were bypassed by the cable companies when they built their networks in the 1970s and 80s. Folks who study this issue estimate that there could be as many as 10 million people in urban areas without broadband access.

Even more importantly, there are millions of people who elect not to buy broadband or who access the Internet only from a cellphone. There are homes everywhere that either can’t afford the Internet or refuse to go online. Even among houses with broadband, there are going to be many people who don’t have good enough computer skills or the language skills to find and complete the Census questions online.

My guess is that the Census Bureau is going to be totally overwhelmed by the level of non-response from households that don’t take the Census online. There will be huge rural areas where few people respond online. There will be people everywhere who don’t have access to broadband or are unable to navigate the online questionnaire.

The US Census Bureau believes it got a pretty high response rate in the past – households completed the paper census forms at a decent rate, and an army of census takers tracked down the houses that didn’t respond. If the completion ratio for the Census slips even a few percent, then areas without good broadband are likely to be disadvantaged in the many ways that Census data affects states.

The Census was moved online to save money. I think that the decision to go online is probably ten years premature and that the Census Bureau is probably totally unprepared for what’s going to happen next April. I hope I’m wrong.

More Details on Starlink

A few months ago Starlink, the satellite broadband company founded by Elon Musk, launched 60 broadband satellites. Since that launch, we’ve learned a few more things about the secretive venture.

We now know more details about the satellites. Each one weighs about 500 pounds. They are thin rectangular boxes like a flat-panel TV. Much of the surface is a solar panel, and each satellite also extends a second solar panel.

Each satellite has a krypton-powered ion thruster used to navigate into its initial orbit and to avoid future debris when necessary. This may sound like a cutting-edge propulsion system, but the technology has been around for many years – the tiny engines create a small amount of thrust by shooting out charged ions of the noble gas, and not a lot of thrust is needed to move a 500-pound satellite.

It seems the satellites can’t detect nearby space debris, so Starlink instead connects to the Air Force’s Combined Space Operations Center, which tracks the trajectories of all known space debris. The company will direct satellites to avoid known debris.

Probably the most important announcement for readers of this blog is that the company is likely to compete only in rural areas where there are few other broadband alternatives – something Musk finally admitted. There had been hopeful speculation in some parts of the industry that the low-orbit satellites would provide a broadband alternative everywhere, thus supplying a new competitor for cable companies. Since widespread competition generally results in lower prices, there was hope that satellite broadband would make the whole broadband market more competitive.

We already had an inkling that satellite broadband was going to be rural-only when OneWeb, one of the competitors to Starlink, told the FCC that it would likely need about 1 million wireless licenses for receivers. While that might sound like a huge number, one million satellite connections spread across the US does not create a major competitor. We heard the same message when several of the satellite companies talked about eventually having tens of millions of customers worldwide at maturity. Even with multiple satellite companies competing for customers, there probably won’t be more than 3 – 4 million satellite broadband customers in the US – that would make a dent but wouldn’t fix the rural broadband gap. This strategy makes sense for the satellite companies since they’ll be able to charge a premium price to rural customers who have no broadband alternative instead of cutting prices to compete with cable companies.

There has still been no discussion from Starlink or the other competitors about broadband speeds or pricing. It’s nearly impossible to predict the impact of the satellites without understanding data speeds and total download capacity. The physics suggests that backhaul to the satellites will be the critical limiting factor, so it’s possible there will be monthly data caps or some other way to control consumption.

One of the most interesting unanswered questions is how the satellites will handle backhaul. Landline ISPs of any size today control costs and data volumes by directly peering with the largest sources of broadband traffic – mostly Netflix, Google, Amazon, and Microsoft. As much as 70% of the traffic headed to an ISP comes from this handful of destinations. Engineers are wondering how Starlink will handle peering. Will there be backhaul between satellites, or will each satellite have a dedicated link to the ground for all data usage? This is a key question when a satellite is passing over a remote area – will it try to find a place within sight to connect to the Internet, or will data instead be passed from satellite to satellite, with connections to the ground only at major hubs?

Answering that question is harder than you might imagine because these satellites are not stationary. Each satellite continuously orbits the earth, so a given customer will be handed off from one satellite to the next as satellites pass over the horizon. The company says the receivers are about the size of a pizza box and are not aimed at a given satellite the way a satellite TV dish is – instead, each receiver just has to be aimed generally skyward. It’s hard to imagine there won’t be issues for homes in heavily wooded areas.

One last interesting tidbit is that the satellites are visible to the naked eye. When the recent launch was completed, it was easy to spot the string of 60 satellites before they dispersed. Astronomers are wondering what this will mean when there are ten thousand satellites from the various providers filling the sky. Elon Musk says he’s working to reduce the albedo (the reflection of sunlight) to limit any problems this might cause for land-based astronomy. But for stargazers, this means there will always be multiple visible satellites crossing the sky.

FCC Modifies Broadband Mapping Parameters

Last week the FCC decided to change the method of collecting data to support its broadband maps. It’s widely understood that the current mapping system badly misstates broadband coverage. That’s a big problem since the FCC uses the faulty broadband mapping data to make decisions like determining eligibility for broadband grants.

The most important new change is that ISPs have to produce mapping ‘polygons’ to show where they have existing customers. The ISP polygons can cover areas without current customers only where an ISP “has a current broadband connection or it could provide such a connection within ten business days of a customer request and without an extraordinary commitment of resources or construction costs exceeding an ordinary service activation fee.”

The new polygons fix one of the big flaws in the current broadband map. The polygons will make a noticeable difference in showing the coverage of a cable company or a fiber-to-the-home network. Those networks have hard boundaries – there is always a last home served at the edge of the service area, after which nobody else is covered. Today’s mapping by census block doesn’t recognize the hard boundaries of these networks and often counts customers outside them as having access to fast data speeds. This is particularly a problem in rural areas, where a large area outside a small town might be counted as having 100 Mbps or faster broadband when there is no broadband at all.
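As a minimal illustration of why a polygon creates a hard service boundary, here is a short Python sketch using the shapely library with made-up coordinates. A location is either inside the reported coverage polygon or it isn’t – unlike the census-block rule, where one served location marks the whole block as covered:

```python
from shapely.geometry import Point, Polygon

# Hypothetical coverage polygon reported by a cable or FTTH provider
coverage = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])

# Hypothetical locations: one inside town, one a few miles outside
locations = {"in_town": Point(2, 1), "outside_town": Point(9, 1)}

for name, point in locations.items():
    status = "served" if coverage.contains(point) else "unserved"
    print(name, status)
```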

Unfortunately, I don’t see the new maps making a big difference for the rest of rural America unless the ISPs providing DSL and fixed wireless service get scrupulously honest with their reporting. I contend that it is difficult, and perhaps impossible, to accurately map these technologies – particularly when it comes to disclosing the broadband speed available at a given customer location.

Consider DSL. Several factors affect the speed of a DSL product. The one everybody knows is that the amount of delivered bandwidth decreases with distance from the DSLAM (the DSL core modem). However, DSL performance also depends upon the gauge of the copper serving a customer (there are different sizes of copper in a network), the quality of that copper (copper deteriorates over time), issues with the drop wire (drops can suffer from a variety of problems separate from issues in the network), the age and type of DSL electronics (there is still plenty of DSL gear from the 1990s in service), and the technology the telco uses on a given copper route to boost or extend signals. There are also customers who can’t get DSL for the simple reason that the telco has no spare copper pairs with which to serve them.

It is not unusual for two customers who are side by side to have drastically different DSL experiences – one might get a decent speed and the other might not be able to get DSL service at all. There is no way for a telco to reflect these highly local conditions on a broadband map. I’m doubtful that the big telcos even track the speeds available to existing customers, and they can’t know anything about homes that don’t have their service today.

The same goes for fixed wireless. Broadband speeds decrease with distance from the tower. Wireless broadband speeds can vary with temperature and humidity, and there is a definite fall-off in speed during precipitation. Wireless broadband using unlicensed spectrum is subject to interference, which can mysteriously come and go. The biggest obstacle for many wireless customers is foliage and other obstructions between the customer and the tower. Just like with DSL, wireless companies don’t know what speed they can deliver to a customer who is not on their network – they usually only know what’s available after climbing onto a roof to investigate a connection.

Another big issue the FCC didn’t address is the reporting of actual speeds. Our examination of the FCC mapping data for both DSL and fixed wireless shows that many ISPs don’t try to report actual broadband speeds. Instead, we see marketing speeds or some other speed standard being reported. Even if these providers map the polygons correctly, we won’t have a good picture of rural broadband coverage unless the ISPs try hard to report actual speeds. We hear all the time from customers who were sold a rural broadband product marketed as delivering 10 Mbps, 15 Mbps, or 25 Mbps but which delivers only a few Mbps. If the maps don’t reflect actual speeds, they will still be largely worthless.

One last issue is a head-scratcher. Many rural networks are oversubscribed, meaning there are more customers than can comfortably be accommodated at the busiest usage times. How do you report the broadband speed for a customer who can get 20 Mbps downloads at 4:00 AM but only 3 Mbps in the evening?
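Here is a minimal sketch of why both of those numbers can be true at once, using hypothetical figures for a shared node or wireless sector – the speed a customer sees is roughly the shared capacity divided among whoever is active at that moment:

```python
def per_user_speed(shared_capacity_mbps, active_users):
    """Rough per-customer throughput when capacity is split among active users."""
    return shared_capacity_mbps / max(active_users, 1)

shared_capacity = 150  # Mbps available on a hypothetical oversubscribed node

print(per_user_speed(shared_capacity, active_users=7))   # 4:00 AM: ~21 Mbps each
print(per_user_speed(shared_capacity, active_users=50))  # evening peak: 3 Mbps each
```

Which of those numbers belongs on the map is exactly the question the FCC has not answered.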

I applaud the FCC for finally getting rid of the census blocks. But we can’t pretend that this fix is going to make much of a difference for most of rural America. The rural broadband gap is mostly due to the neglected copper networks of the largest telcos, and I can’t imagine any way to ever accurately map DSL and fixed wireless technologies – which means the maps are still going to be terrible in the places we most care about. The FCC will still be harming rural America if it uses the new maps to make decisions about important things like awarding grant money. The only real fix is to throw the maps away for those purposes and do something more sensible. For example, grant money ought to always be available to somebody who wants to build fiber to replace big telco copper – we don’t need a map to know that is good policy.