The Impact of Satellite Broadband

Recently I’ve had several people ask me about the expected impact of low-orbit satellite broadband. While significant competition from satellites is probably a number of years away, several major initiatives like Starlink (SpaceX), Project Kuiper (Amazon), and OneWeb have announced plans to launch swarms of satellites to provide broadband.

At this early stage, it’s nearly impossible to know what impact these companies might have. We don’t know anything about their download speeds and network capacity, their pricing strategy, or their target market, so it’s impossible to begin to predict their impact. We don’t even know how long it’s going to take to get these satellites into space, since these three companies alone have plans to launch over 10,000 new satellites – a tall task compared to the 1,100 satellites currently active in space.

Even without knowing any of these key facts, BroadbandNow recently grabbed headlines around the industry by predicting that low-orbit satellites will bring an annual savings of $30 billion for US broadband customers. Being a numbers guy, I never let this kind of headline pass without doing some quick math.

They explain their method of calculation on their website, and they make several major assumptions about the satellite industry. First, they assume the satellite providers will compete on price and will compete in every market in the country. Since the vast majority of Americans live in metro areas, BroadbandNow is assuming the satellite providers will become a major competitor in every city. They also assume that the satellites will be able to connect to a huge number of customers in the US, which will force other ISPs to lower prices.

Those assumptions would have to be true to support the $30 billion in projected annual consumer savings. That is an extraordinary number and works out to a savings of almost $20 per month for every household in the US. If you spread the $30 billion over only those households that buy broadband today, that would be a savings of over $23 per month. If you further factor out the folks who live in large apartments and don’t get a choice of their ISP, the savings jumps to $27 per household per month. The only way to realize savings of that magnitude would be a no-holds-barred broadband price war where the satellite providers are chewing into market penetration everywhere.
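
Here’s the quick math. The household counts below are my own rough assumptions, chosen to line up with commonly cited US totals – they are not BroadbandNow’s inputs:

```python
# Back-of-envelope check on the $30 billion annual savings claim.
# Household counts are rough assumptions, not BroadbandNow's inputs.
ANNUAL_SAVINGS = 30e9  # claimed annual US consumer savings ($)

scenarios = {
    "every US household": 128e6,
    "households that buy broadband": 107e6,
    "excluding apartments with no ISP choice": 92e6,
}

for label, households in scenarios.items():
    monthly = ANNUAL_SAVINGS / households / 12
    print(f"{label}: ${monthly:.2f}/month")

# prints roughly $19.53, $23.36, and $27.17 per month respectively
```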

I foresee a different future for the satellite industry. Let’s start with a few facts we know. While 10,000 satellites is an impressive number, that’s a worldwide number and there will be fewer than 1,000 satellites over the US. Most of the satellites are tiny – these are not the same as the huge satellites launched by HughesNet. Starlink has described their satellites as varying in size between a football and a small dorm refrigerator. At those small sizes these satellites are probably the electronic equivalent of the OLT cabinets used as neighborhood nodes in a FTTH network – each satellite will likely support some limited and defined number of customers. OneWeb recently told the FCC in a spectrum docket that they are envisioning needing one million radio links, meaning their US satellites would be able to serve one million households. Let’s say that all of the satellite providers together will serve 3 – 5 million homes in the US – that’s an impressive number, but it’s not going to drive other ISPs into a pricing panic.

I also guess that the satellite providers will not offer cheap prices – they don’t need to. In fact, I expect them to charge more than urban ISPs. The satellite providers will have one huge market advantage – the ability to bring broadband where there isn’t landline competition. The satellite providers can likely use all of their capacity selling only in rural America at a premium price.

We still have no real idea of the speeds that will be available with low-orbit satellite broadband. We can discount Elon Musk’s claim that he’ll be offering gigabit speeds. The engineering specs show that a satellite can probably make a gigabit connection, but each satellite is effectively an ISP hub with a limited bandwidth capacity. Like any ISP network, the operator can use that capacity to make a few connections at high speeds or many more connections at slower speeds. Engineering common sense argues against using the limited satellite bandwidth to sell gigabit residential products.
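
To illustrate the trade-off, here’s a minimal sketch. The 20 Gbps per-satellite capacity and the 10:1 oversubscription ratio are pure assumptions on my part, since none of these companies have published specs:

```python
# Illustrative trade-off between speed tiers and subscriber counts for
# a single satellite. Capacity and oversubscription are assumed values,
# not published specs from any of the satellite companies.
satellite_capacity_mbps = 20_000  # assume ~20 Gbps per satellite
oversubscription = 10             # assume a typical ISP 10:1 ratio

for tier_mbps in (1000, 100, 25):
    subscribers = satellite_capacity_mbps / tier_mbps * oversubscription
    print(f"{tier_mbps:>4} Mbps tier: ~{subscribers:,.0f} subscribers")

# 1000 Mbps tier: ~200 subscribers
#  100 Mbps tier: ~2,000 subscribers
#   25 Mbps tier: ~8,000 subscribers
```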

That doesn’t mean the satellite providers won’t be lured by big bandwidth customers. They might make more money selling gigabit links at a premium price to small cell sites and ignoring the residential market completely. Selling capacity to a handful of big cellular companies instead of millions of households is a much easier business plan with drastically lower operating costs. That is going to be a really tempting market alternative.

I could be wrong, and maybe the satellite guys will find a way to sell many tens of millions of residential links and compete in every market, in which case they would have an impact on urban broadband prices. But unless the satellites have the capacity to sell to almost everybody, and unless they decide to compete on price, I still can’t see a path to a $30 billion national savings. I instead see them making good margins by selling where there’s no competition.

Speed Goals for FCC Grants

I literally grimaced when I first read about the 25/3 Mbps speed threshold that will likely be part of the new $20.4 billion grant program recently announced by the FCC. My first thought was that the 25/3 Mbps goal would provide an excuse for the FCC to give the grant money to the big telcos again. Those companies could then take another ten years to bring rural DSL up to the speeds they should have achieved on their own a decade ago. Given the FCC’s history of pandering to the big telcos, I instantly feared this possibility.

But let’s assume that the upcoming grants will be available to all comers. Why would the FCC choose the 25/3 Mbps speed target? It’s a terrible goal for many reasons.

  • While this FCC will not admit it, 25/3 Mbps is already obsolete as the definition of adequate broadband. It’s been five years since 25/3 Mbps was adopted and households are using a lot more data than five years ago. It’s pretty easy to make the case that the definition of broadband today probably ought to be at least 50 Mbps download.
  • If the 25/3 Mbps speed is already outdated today, then it’s a lousy goal for a decade from now. This FCC should not repeat the blunder the last FCC made with the original CAF II program. They should set a forward-looking speed goal that reflects the likely speed requirements at the time the grant networks will be constructed. Any network engineer who tracks customer usage will tell you that the minimum speed requirement for eight years from now should be at least 100 Mbps (a quick projection following this list shows why).
  • A 25/3 Mbps goal just feels puny. I got the same letdown when I read that a new NASA goal is to put a man on the moon again. Considering the huge leaps we’ve made in technology since 1969, striving for a moon landing again feels like a small national goal and a waste of our national resources – and so does setting a broadband speed goal of 25/3 Mbps.
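
Here’s the projection I have in mind, as a minimal sketch. The 20% annual growth rate is my assumption, roughly in line with the historical growth in household bandwidth demand:

```python
# Project today's 25 Mbps benchmark forward, assuming household
# bandwidth demand keeps compounding at ~20% per year (an assumed
# rate, roughly in line with historical trends).
speed_mbps = 25
annual_growth = 0.20

for _ in range(8):
    speed_mbps *= 1 + annual_growth

print(f"25 Mbps today implies ~{speed_mbps:.0f} Mbps in 8 years")
# prints: 25 Mbps today implies ~107 Mbps in 8 years
```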

One of the goals that Congress gave the FCC is to strive to bring rural broadband into parity with urban broadband. In setting a goal of 25/3 Mbps the FCC is ignoring the broadband trend in cities. The big cable companies have increased minimum download speeds for new customers to between 100 and 200 Mbps and have unilaterally increased speeds for existing customers. 25/3 Mbps is a DSL speed, and we see the biggest telcos finally starting to walk away from copper. Verizon has gotten out of the copper business in nearly 200 exchanges in the northeast. AT&T has been losing DSL customers and replacing them with fiber customers. It’s almost unthinkable that the FCC would establish a new forward-looking grant program and not require broadband speeds any faster than DSL.

In my mind, the FCC betrayed rural communities when they adopted the 10/1 Mbps speed goal for CAF II. That told rural communities that they had to settle for second-rate broadband that was far slower than the rest of the country. From what I hear, most rural communities don’t even consider the CAF II upgrades as real broadband. Rural communities want fiber. They view anything slower than fiber as nothing more than a stepping-stone towards eventually getting fiber.

The FCC needs to listen to what rural America wants. If this giant new grant program will make rural communities wait for years to get 25/3 Mbps then rural America will largely ignore it. Communities will continue to plan for something better. Households might begrudgingly buy 25/3 broadband, but the people in rural America know that is not the same as broadband elsewhere and they will continue to clamor for the same broadband that they see in cities.

I hope the FCC understands this. Even if they allow technologies in these grants that can only deliver 25/3 Mbps, the FCC can still use the grant ranking process to favor faster broadband. If the grant grading process emphasizes speed, then the $20 billion could probably be used to bring fiber to 4 or 5 million rural homes. In my mind that would be the ideal use of these grants, because those homes would be brought to parity with the rest of the country. Those homes could be taken off of the FCC’s worry list and the universe of underserved homes would be significantly reduced. If the grants fund anything less than fiber, the FCC will have to keep dumping grant money into the same communities over and over until they finally finance fiber.
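
The math behind that 4 or 5 million figure is straightforward. The per-home costs below are my assumptions; rural fiber construction is commonly quoted in the $4,000 to $8,000 per passing range, and grants often fund only part of each build:

```python
# How many rural homes could $20.4B reach with fiber? The per-home
# costs are assumptions; rural FTTH is commonly quoted in the
# $4,000 - $8,000 per passing range.
grant_pool = 20.4e9

for cost_per_home in (4_000, 5_000, 8_000):
    homes = grant_pool / cost_per_home
    print(f"at ${cost_per_home:,} per home: ~{homes / 1e6:.1f} million homes")

# at $4,000 per home: ~5.1 million homes
# at $5,000 per home: ~4.1 million homes
# at $8,000 per home: ~2.5 million homes
```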

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Surveys for Grants and Loans

Many of the federal and state grant programs and many broadband lenders want applicants to undertake a survey to quantify the likely success of a new broadband venture. Unfortunately, there are far too many broadband projects being launched that are unable to answer the basic question, “How many customers are likely to buy service from the new network?” There are only two ways to get a reliable answer to that question – a canvass or a statistically valid survey.

A canvass is the easiest to understand – it involves knocking on the door of or calling every potential customer in a market. I’ve seen many clients have good luck with this when overbuilding a small town or a subdivision. A canvass will be most successful when an ISP has all of the facts needed by potential customers, such as specific products and prices. Many companies would label the canvass process as pre-selling – getting potential customers to tentatively commit before construction.

The alternative to a canvass is a ‘statistically valid’ survey. Any survey that doesn’t meet the statistically valid test isn’t worth the paper it’s printed on. There are a few key aspects of doing a statistically valid survey:

Must be Random. This is the most important aspect of a valid survey and is where many surveys fail. Random means that you are sampling the whole community, not just a subset of respondents. A survey that is mailed to people or put online for anybody to take is not random.

The problem with a non-random survey is that the respondents self-select. For example, if you mail a survey to potential customers, then people who are interested in broadband are the most likely to respond and to return the completed survey. It can feel good to get back a lot of positive responses, but it’s far more important to hear from those who don’t support fiber.

The whole purpose of doing a broadband survey is to quantify the amount of support – and that also means quantifying those who won’t buy fiber. I’ve seen results from mailed surveys where almost every response was pro-broadband, and of course, that is unlikely. That result just means that the people who aren’t interested in broadband didn’t bother to complete or return the survey. The only way you can put any faith in a mailed survey is if you get so many responses that it approaches being a canvass. A good analogy for the problems with a mailed survey is standing in front of a grocery store and asking customers if they like to shop there. While there may be a few customers with complaints, such a survey would tell you nothing about how the community feels about that store, since the question was never asked of those who don’t shop there.

This blog is too short to describe survey methods – but there are specific acceptable techniques for conducting a random survey either by telephone or by knocking on doors. It’s possible to do those tasks non-randomly, so you should seek advice before conducting a phone or door-knocking survey.
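
That said, the math that determines the needed sample size is easy to sketch. The 95% confidence level and ±5% margin of error below are typical choices for this kind of work, not hard requirements:

```python
import math

# Standard sample size for estimating a proportion:
#   n0 = z^2 * p * (1 - p) / e^2, then a finite-population correction.
def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# A town of 3,000 households needs ~341 truly random responses for
# a +/-5% margin of error at 95% confidence.
print(sample_size(3_000))   # 341
print(sample_size(30_000))  # 380
```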

Non-biased Questions. Survey questions must be non-biased, meaning that they can’t lead a respondent towards a certain answer. A question like, “Do you want to save money on broadband?” is worthless because it’s hard to imagine anybody answering no. It’s a lot harder to write non-biased questions than you might think, and bias can be a lot more subtle than that question.

Respondent Bias. People who conduct surveys know that there are some kinds of questions that many respondents won’t answer truthfully. For example, I’ve read that nearly half of applicants lie about their annual income when applying for a credit card. For various reasons people want others to think they earn more than they actually do.

Respondent bias can apply to a broadband survey as well. I’ve learned that you can’t rely on responses having to do with spending. For example, many respondents will under-report what they pay each month for broadband. Perhaps people don’t want the survey taker to think they spend too much.

Respondent bias is one of the reasons that political surveys are less reliable than surveys on more factual topics – respondents may not tell the truth about who they will vote for or how they feel about political issues. Luckily, most people are truthful when asked about non-emotional topics and factual questions, and we’ve found residential broadband surveys to be a great predictor of market interest in broadband.

Survey Fatigue. Respondents have a natural tendency to give up if a survey takes too long. They will hang up on a phone survey or start giving quick and inaccurate answers to get rid of somebody at their door. A survey ought to last no longer than 10 minutes, and the ideal length is closer to five minutes.

The big takeaway from this discussion is that doing a survey the wrong way will likely give you the wrong answer to the basic question of likely market penetration. You’re better off not doing a survey at all than doing one that is not statistically valid. I don’t know of anything more deadly in launching a new broadband market than having a false expectation of the number of customers that will buy broadband.

What’s the Future for CenturyLink?

I don’t know how many of you watch industry stock prices. I’m certainly not a stock analyst, but I’ve always tracked the stock prices of the big ISPs as another way to try to understand the industry. The stock prices for big ISPs are hard to compare because every big ISP operates multiple lines of business these days. AT&T and Verizon are judged more as cellular companies than as ISPs. AT&T and Comcast stock prices reflect that both are major media companies.

With that said, the stock price for CenturyLink has performed far worse than other big ISPs over the last year. A year ago a share of CenturyLink stock was at $19.24. By the end of the year the stock price was down to $15.44. As I wrote this blog the price was down to $10.89. That’s a 43% drop in share price over the last year and a 30% drop since the first of the year. For comparison, following are the stock prices of the other big ISPs and also trends in broadband customers:

Company       Stock Price 1 Year Ago    Stock Price Now    % Change    2018 Change in Broadband Customers
CenturyLink   $19.24                    $10.89             -43.4%      -262,000
Comcast       $32.14                    $43.15             +34.3%      +1,353,000
Charter       $272.84                   $377.89            +38.5%      +1,271,000
AT&T          $32.19                    $30.62             -4.9%       -18,000
Verizon       $48.49                    $56.91             +17.4%      +2,000

As a point of comparison to the overall market, the Dow Jones Industrial Average was up 4% over this same one-year period. The chart above is not trying to draw a correlation between stock prices and broadband customers, since customer counts are just one of dozens of factors that affect the performance of these companies.
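
For anybody who wants to check my math, the percentage changes are simple arithmetic:

```python
# Verify the share-price changes quoted above.
def pct_change(old, new):
    return (new - old) / old * 100

print(f"{pct_change(19.24, 10.89):.1f}%")  # -43.4% over the last year
print(f"{pct_change(15.44, 10.89):.1f}%")  # -29.5%, ~30% since January 1
```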

Again, I’ve never fully understood how Wall Street values any given company. In reading analyst reports on CenturyLink it seems that the primary reason for the drop in stock price is that all of the company’s business units are trending downward. In the recently released 1Q 2019 results the company showed a year-over-year drop in results for the international, enterprise, small and medium business, wholesale, and consumer business units. It seems that analysts had hoped that the merger with Level 3 would reverse some of the downward trends. Stock prices also dropped when the company surprised the market by cutting its dividend payment in half in February.

CenturyLink faces the same trends as all big ISPs – traditional business lines like landline telephone and cable TV are in decline. Perhaps the most important trend affecting the company is the continued migration of broadband customers from copper-based DSL to cable company broadband. CenturyLink is not replacing the DSL broadband customers it’s losing. In 2018 CenturyLink lost a lot of broadband customers with speeds under 20 Mbps, but had a net gain of customers using more than 20 Mbps. CenturyLink undertook a big fiber-to-the-home expansion in 2017 and built fiber to pass 900,000 homes and businesses – but currently almost all expansion of last-mile networks is on hold.

It’s interesting to compare CenturyLink as an ISP with the big cable companies. The obvious big difference is the trend in broadband customers and revenues. Where CenturyLink lost 262,000 broadband customers in 2018, the two biggest cable companies each added more than a million new broadband customers for the year. CenturyLink and other telcos are losing the battle of DSL versus cable modems with customers migrating to cable companies as they seek faster speeds.

It’s also interesting to compare CenturyLink to the other big telcos. From the perspective of being an ISP, AT&T and Verizon are hanging on to total broadband customers. Both companies are also losing the DSL battle with the cable companies, but each is adding fiber customers to compensate for those losses. Both big telcos are building a lot of new fiber, mostly to provide direct connectivity to their own cell sites, but secondarily to then take advantage of other fiber opportunities around each fiber node.

Verizon has converted over a hundred telephone exchanges in the northeast to fiber-only and is getting out of the copper business in urban areas. Verizon has been quietly filling in its FiOS fiber network to cover the copper it’s abandoning. While nobody knows yet if it’s real, Verizon also has been declaring big plans to expand into new broadband markets using 5G wireless local loops.

AT&T was late to the fiber game but has been quietly yet steadily adding residential and business fiber customers over the last few years. They have adopted a strategy of chasing pockets of customers anywhere they own fiber.

CenturyLink had started down the path to replace DSL customers when they built a lot of fiber-to-the-home in 2017. Continuing with fiber construction would have positioned the company to take back a lot of the broadband market in the many large cities it serves. It’s clear that the new CenturyLink CEO doesn’t like the slow returns from investing in last-mile infrastructure and it appears that any hopes to grow the telco part of the business are off the table.

Everything I read says that CenturyLink is facing a corporate crisis. Diving stock prices always put strain on a company. CenturyLink faces extra pressure since the activist investor group Southeastern Asset Management holds more than a 6% stake in CenturyLink and made an SEC filing arguing that the company’s fiber assets are undervalued.

The company has underperformed compared to its peers ever since it was spun off from AT&T as US West. The company then had what turned out to be a disastrous merger with Qwest. There was hope a few years back that the merger with CenturyLink would help to right the company. Most recent was the merger with Level 3, and at least for now that hasn’t made a big difference. It’s been reported that CenturyLink has hired advisors to consider whether they should sell or spin off the telco business unit. That analysis has just begun, but it won’t be surprising to hear about a major restructuring of the company.

Setting the Right Goals for Grants

Most past federal broadband grant programs had very specific goals. For example, the USDA Community Connect grants that have been around for many years target grants to the poorest parts of the country – the awards are weighted towards communities with the highest levels of poverty. For any grant program to be effective the goals of the program need to be clearly defined, and then the award process needs to be aligned with those goals.

The FCC needs to define the goals of the upcoming $20.4 billion grant program. If the goals are poorly defined then the resulting grant awards are likely to be all over the board in terms of effectiveness. What are the ideal goals for a grant program of this magnitude?

The first goal to be decided is the scope of the coverage – will the goal be to bring somewhat better broadband to as many households as possible, or will it be to bring a long-term broadband solution to a smaller number of households? If the goal is to serve the most households possible, then the grants are going to favor lower-cost technologies and the grants will likely go to the wireless providers and satellite providers – as we saw happen in the recent CAF II reverse auction.

If the grants are aimed at a more permanent solution, then the grants will favor fiber. Perhaps the grants could also go towards anybody willing to extend a hybrid fiber-coaxial cable network into rural areas – but no other technology can be considered a permanent solution.

There are huge consequences for choosing the first option of serving as many households as possible. These new grants are mostly going to be awarded in the geographic areas covered by the original CAF II program. That program awarded over $11 billion to the big telcos to beef up broadband to speeds of at least 10/1 Mbps. Now, before that program is even finished, the FCC is talking about overbuilding those same areas with another $20 billion grant program. If this grant program is used to upgrade homes to fixed wireless, it doesn’t take a crystal ball to see that ten years from now we’ll be talking about overbuilding these areas again with fiber. It would be incredibly wasteful to use multiple rounds of grants to upgrade the same geographic areas several times.

The other big issue for these grants to deal with is defining which parts of the country are eligible for the grants. What should be the criteria to decide which homes can be upgraded?

If the test is going to be related to existing speeds, the FCC is going to have to deal with the existing broadband coverage maps that everybody in the industry knows to be badly flawed. The FCC is talking about tackling a new mapping effort – but it’s highly likely that the new maps will just swap old mapping errors for new mapping errors. The reality on the ground is that it’s virtually impossible to map the real speeds on copper or fixed wireless networks. In real life, two rural neighbors can have drastically different speeds due to something as simple as being on different copper pairs. It’s impossible to accurately map DSL or wireless broadband coverage.

To make matters even worse, the current ReConnect grants are saddled with a rule that says that no more than 10% of grant-covered homes can have existing broadband of more than 10/1 Mbps. Layering that kind of rule on top of terrible maps creates an environment where an ISP is largely unable to define a believable grant footprint.

The FCC must figure out some way to rectify the mapping problem. One of the easiest ways is what I call the technology test – anybody that wants to overbuild copper with fiber should automatically be eligible without trying to figure out the current speeds on the copper. Perhaps the easiest rule could be that any place where there is telco copper and no cable company network should be grant-eligible for fiber overbuilders.

Assuming the grants won’t all go to fiber, there has to be an alternate way for an ISP or a community to challenge poor maps. Perhaps the FCC needs to provide a realistic time frame for local governments to demonstrate the actual speeds in an area, much like what was done in the recent Mobility Fund Phase II challenge process.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Designing the Ideal Federal Broadband Grant Program

In April, FCC Chairman Ajit Pai announced a new rural broadband initiative that will provide $20.4 billion of new funding. We don’t know many details yet, but here are a few things that will likely be involved in awarding the funding:

  • The FCC is leaning towards a reverse auction.
  • The program will likely require technologies that can deliver at least 25/3 Mbps broadband speeds.
  • The program will be funded within the existing Universal Service Fund, mostly by repositioning the original CAF II plan.
  • The grants might all be awarded at once, similar to A-CAM and CAF II awards, meaning that there might be only one chance to apply, with the awards to be paid out over a longer time period.

I’m writing a series of blogs that will examine the ideal way to design and administer a grant program of this size. We’ve seen both good and disastrously bad federal broadband programs before, and I’m hoping the FCC will take some time to make this grant program one of the effective ones. I’m sure the details of this new program are not yet set in stone, and folks in rural America need to make their voices heard now if they want some of this money to benefit their communities.

I’m going to look at the following topics, and perhaps more as I write this. At the end of this process I’ll post a whitepaper on my website that consolidates all of these discussions into one document.

A well-designed broadband grant program of this magnitude should consider the following:

What is the End Goal?

It’s important up-front for the FCC to determine how the grant moneys are to be used. The best grant programs have a specific goal, and then the application and award process is designed to best meet the goals. The goal can’t be something as simple as ‘promote rural broadband’, because a goal that simplistic is bound to create a hodgepodge of grant awards.

What Broadband Speeds Should be Supported?

This is an area where the FCC failed miserably in the past. They awarded over $11 billion in the CAF II program to upgrade broadband to speeds of only 10/1 Mbps. When the FCC set the 10/1 Mbps target, it didn’t even meet their own definition of broadband. How should the FCC determine eligible speeds this time to avoid a repeat of the CAF II debacle?

Who Should be Eligible?

FCC programs in the past have usually made the monies available to a wide range of recipients. However, the specific details of the grant programs have often made it hard for whole classes of entities like cities or counties to accept the awards. As an example, many entities are electing not to participate in the current ReConnect grant program because they can’t accept any part of the awards that include RUS loans.

Is a Reverse Auction the Right Mechanism?

The FCC and numerous politicians currently favor reverse auctions. Like any mechanism, there are situations where reverse auctions are a great tool and others where they will distort the award process. Are reverse auctions a good tool for this grant program?

Other Issues

There are two drastically different ways to hand out these grants. One is to follow the CAF II mechanism and award all of the $20 billion in one huge auction and then pay it out over 6 or 8 years. The other would be to divide the award money into even tranches and have a new grant award for each of those years.

In the recent ReConnect grants the USDA decided to blend grants and loans. I know the loan component stopped most of my clients from pursuing these grants. Should there be a loan component of the grants?

There are also technical issues to consider. I had clients who were outbid in the recent CAF II reverse auction by wireless companies that gained bidding preference by promising that their fixed wireless networks could achieve across-the-board 100 Mbps broadband. I still don’t know of a wireless technology that can do that over a large footprint. How should the FCC make sure that technologies deliver what’s promised?

What’s the Role of States in this Process?

What should states be doing to maximize the chance for federal grant money to be awarded to their state?

This blog is part of a series:

Setting the Right Goals for Grants

Speed Goals for FCC Grants

Clearing Mid-range Spectrum

The FCC is in the process of trying to free up mid-range spectrum for 5G. They just opened a Notice of Proposed Rulemaking looking at the 2.5 GHz spectrum located in the contiguous block between 2496 and 2690 MHz. This is the largest contiguous block of mid-range spectrum, and over half of it sits idle today, particularly in rural America. The history of this spectrum demonstrates the complications involved in trying to reposition spectrum for broadband and cellular use.

The frequency was first assigned by the FCC in 1963 when it was made available to school systems to transmit educational TV between multiple schools. The spectrum band was called Instructional Television Fixed Service (ITFS). The band was divided into twenty channels and could transmit a TV signal up to about 35 miles. I grew up in a school system that used the technology and from elementary school onward we had a number of classes taught on the TV. Implementing the technology was expensive and much of the spectrum was never claimed.

In 1972 the FCC recognized the underuse of the spectrum and allowed commercial operators to use the bands of 2150 to 2162 MHz on an unlicensed basis for pay-TV transmissions to rooftop antennas. The spectrum could only carry a few TV channels and in the 1970s was used in many markets to transmit the early version of HBO and Nickelodeon. This spectrum band was known as Multipoint Distribution Service (MDS) and also was good for about 35 miles.

Reacting to pressure from cellular companies, the FCC reallocated eight additional channels of the spectrum for commercial use. Added to the MDS spectrum, this became known as Multichannel Multipoint Distribution Service (MMDS). At the time this displaced a few school systems, and anybody claiming the spectrum had to pay to move the displaced school system to another workable channel. This spectrum was granted upon request to operators for specific markets.

In 1991 the FCC changed the rules for MMDS and allowed the channels to be used to transmit commercial TV signals. In 1995 any unused MMDS spectrum was sold under one of the first FCC auctions, which was the first to divide service areas into the geographic areas known as Basic Trading Areas (or BTAs) that are still used today. Before this auction, the spectrum was awarded in 35-mile circles called Geographic Service Areas (GSAs). The existing GSAs were left in place and the spectrum sold at auction had to work around existing GSAs.

The FCC started getting pressure from wireless companies to allow for the two-way transmission of data in the frequency (up to now it had been all one-way delivery to a customer site). In 2005 the FCC changed the rules and renamed the block of spectrum as Broadband Radio Service (BRS). This added significant value to licenses since the spectrum could now be repositioned for cellular usage.

At this point, Clearwire entered the picture and saw the value of the spectrum. They offered to buy or lease the spectrum from school systems at prices far lower than market value and were able to amass the right to use a huge amount of the spectrum nationwide. Clearwire never activated much of the spectrum and was in danger of losing the rights to use it. In 2013 Sprint purchased Clearwire, and Sprint is the only cellular company using the spectrum band today.

Today the spectrum band has all sorts of users. There are still school districts using the spectrum to transmit cable TV. There are still license holders who never stopped using the 35-mile GSA areas. There are still MMDS license holders who found a commercial use for the spectrum. And Sprint holds much of the spectrum not held by these other parties.

The FCC is wrestling in the NPRM with how to undo the history of the spectrum to make it more valuable to the industry. Education advocates are still hoping to play in the space since much of the spectrum sits idle in rural America (as is true with a lot of cellular and other mid-range spectrum). The other cellular carriers would like to see chunks of the spectrum sold at auction. Other existing license holders are fighting to extract the biggest value out of any change of control of the spectrum.

The challenge in repositioning this spectrum is complicated because the deployment of the spectrum differs widely today by market. The FCC is struggling to find an easy set of rules to put the genie back in the bottle and start over again. In terms of value for 5G, this spectrum sits in a sweet spot in terms of coverage characteristics. Using the spectrum for cellular data is probably its best use, but the FCC has to step carefully so as not to spend years in court defending any order. Reallocating spectrum is probably the most difficult thing the FCC does, and it’s not hard to see why when you look at the history of this particular block and realize that every block of spectrum has a similarly messy past.

AT&T and Augmented Reality

Lately it seems like I find a news article almost every week talking about new ways that people are using broadband. The latest news is an announcement that AT&T is selling Magic Leap augmented reality headsets in six cities plus online.

The AT&T launch is being coordinated with the release of an augmented reality immersive experience that brings Game of Thrones into people’s homes through a themed gaming experience called The Dead Must Die, previewed in a teaser trailer.

Augmented reality differs from virtual reality in that augmented reality overlays images into the local environment. A user will see characters in their living room as opposed to being immersed in a total imaginary environment with virtual reality.

Magic Leap is one of the most interesting tech start-ups. They started in 2014 with a $542 million investment, and since then have raised over $2.3 billion dollars. The company’s investors and advisors include people like Alibaba executive vice chair Joe Tsai and director Steven Spielberg. There have been rumors over the years of an impending product, but until now they’ve never brought a product to market. AT&T will be selling Magic Leap’s first headset, called the Magic Leap One Creator Edition for a price of $2,295. The mass-market headset will surely cost a lot less.

AT&T’s interest in the technology extends past selling the headsets. Magic Leap recently signed a deal with the NBA and its broadcast partner Turner, which is now owned by AT&T, and the companies will obviously be looking at augmented reality broadcasts of basketball games.

AT&T’s interest goes even further than that, and they are looking at the Magic Leap technology as an entry into the spatial Internet – moving today’s web experience to three dimensions. AT&T sees the Magic Leap headset as an entry point for bringing augmented reality to industries like healthcare, retail, and manufacturing. They envision people shopping in 3D, doctors getting 3D computer assistance for visualizing a patient during an operation, and manufacturing workers aided by 3D blueprints overlaid on the factory floor.

While the Magic Leap headset will work on WiFi today, AT&T is promoting Magic Leap as part of their 5G Innovation Program. AT&T is touting this as a technology that will benefit greatly from 5G, which will allow users to go mobile and use the augmented reality technology anywhere.

I couldn’t find any references to the amount of bandwidth used by this first-generation headset, but it has to be significant. Looking at the Game of Thrones application, a user is immersed in a 3D environment and can move and interact with elements in the augmented reality. That means a constant transmission of the elements in the 3D environment. I have to think that is at least equivalent to several simultaneous video transmissions. Regardless of the bandwidth used today, you can bet that as augmented reality becomes mainstream that content makers will find ways to use greater bandwidth.
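
Since Magic Leap hasn’t published bandwidth figures, any estimate is a guess. Here’s the kind of rough equivalence I have in mind, with per-stream rates assumed from typical video services:

```python
# Rough, purely illustrative estimate of AR bandwidth. Magic Leap has
# published no figures; stream counts and per-stream rates are guesses
# based on typical video services.
HD_STREAM_MBPS = 5    # typical HD video stream
UHD_STREAM_MBPS = 25  # typical 4K video stream

for streams, rate in ((3, HD_STREAM_MBPS), (3, UHD_STREAM_MBPS)):
    print(f"{streams} streams x {rate} Mbps = {streams * rate} Mbps sustained")

# 3 streams x 5 Mbps = 15 Mbps sustained
# 3 streams x 25 Mbps = 75 Mbps sustained
```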

We are already facing a big increase in bandwidth that is needed to support gaming from the cloud – as is now being pushed by the major game vendors. Layering augmented reality on top of that big data stream will increase bandwidth needs by another major increment.

The Fastest and Slowest Internet in the US

The website HighSpeedInternet.com has calculated and ranked the average Internet speeds by state. The site offers a speed test and then connects visitors to the web pages of the various ISPs in each zip code in the country. I have to imagine the site earns a commission on broadband customers who subscribe through its links.

Not surprisingly, the east coast states with Verizon FiOS ranked at the top of the list for Internet speeds since many customers in those states have the choice between a fiber network and a big cable company network.

For example, Maryland topped the list with an average speed of 65 Mbps, as measured by the site’s speed tests. It was followed by New Jersey at 59.6 Mbps, Delaware at 59.1 Mbps, Rhode Island at 56.8 Mbps, and Virginia at 56 Mbps.

Even though they are at the top of the list, Maryland is like most states and there are still rural areas of the state with slow or non-existent broadband. The average speed test results are the aggregation of all of the various kinds of broadband customers in the state:

  • Customers with fast Verizon FiOS products
  • Customers with fast broadband from Comcast, the largest ISP in the state
  • Customers that have elected slower, but less expensive DSL options
  • Rural customers with inferior broadband connections

Considering all of the types of customers in the state, an average speed test result of 65 Mbps is impressive, and it means that a lot of households in the state have speeds of 65 Mbps or faster. That’s not a surprise considering that both Verizon FiOS and Comcast have base product speeds considerably faster than 65 Mbps. If I were a Maryland politician, I’d be more interested in the distribution curve behind this average. I’d want to know how many speed tests were done by households getting speeds of only a few Mbps. I’d want to know how many gigabit homes were in the mix – gigabit is so much faster than the other broadband products that it pulls up the average speed.
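
To see why the distribution matters more than the average, consider a hypothetical mix of speed-test results. Every share and speed below is invented for illustration:

```python
# Hypothetical speed-test mix showing how a few gigabit results pull
# up an average. All shares and speeds are invented for illustration.
mix = [
    (0.05, 940),  # 5% gigabit homes (tests typically read ~940 Mbps)
    (0.60, 60),   # 60% mid-tier cable customers
    (0.25, 15),   # 25% DSL customers
    (0.10, 3),    # 10% rural connections
]

average = sum(share * speed for share, speed in mix)
print(f"average: {average:.0f} Mbps")
# average: 87 Mbps, even though 95% of these homes test at 60 Mbps or less
```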

I’d also be interested in speeds by zip code. I took a look at the FCC broadband data reported on the 477 forms just for the city of Baltimore and I see widely disparate neighborhoods in terms of broadband adoption. There are numerous neighborhoods just north of downtown Baltimore with broadband adoption rates as low as 30%, and numerous neighborhoods under 40%. Just south of downtown and in the northernmost extremes of the city, the broadband adoption rates are between 80% and 90%. I have to guess that the average broadband speeds are also quite different in these various neighborhoods.

I’ve always wondered about the accuracy of compiling the results of mass speed tests. Who takes these tests? Are people with broadband issues more likely to take them? I have a friend who has gigabit broadband and tests his speed all the time just to see that he’s still getting what he’s paying for (just FYI, he’s never measured a true gigabit, just readings in the high 900s Mbps). I take a speed test every time I read something about speeds. I took the speed test at this site from my office and got a download speed of 43 Mbps. My office happens to be in the corner of the house most distant from the incoming cable modem, and at the connection to the Charter modem we get 135 Mbps. My slower results on this test are due to WiFi, and yet this website will log me as an underperforming Charter connection.

There were five states at the bottom of the ranking. Last was Alaska at 17 Mbps, then Mississippi at 24.8 Mbps, Idaho at 25.3 Mbps, Montana at 25.7 Mbps, and Maine at 26 Mbps. That’s five states where the average Internet speed is below or barely above the FCC’s 25 Mbps download definition of broadband.

The speeds in Alaska are understandable due to the remoteness of many of the communities. There are still numerous towns and villages that receive Internet backhaul through satellite links. I recently read that the first fiber connection between the US mainland and Alaska is just now being built. That might help speeds some, but there is a long way to go to string fiber backhaul to the remote parts of the state.

Mostly what the bottom of the scale shows is that states that are both rural and somewhat poor end up at the bottom of the list. Interestingly, the states with the lowest household densities such as Wyoming and South Dakota are not in the bottom five due to the widespread presence of rural fiber built by small telcos.

What matters most about this kind of headline is that even the states with fast broadband still have plenty of customers with lousy broadband. I would hope that Maryland politicians don’t look at this headline and think their job is done – measured by square miles of geography, the majority of the state still lacks good broadband.

Broadband and Food Safety

I recently saw a presentation that showed how food safety is starting to rely on good rural broadband. I’ve already witnessed many other ways that farmers use broadband like precision farming, herd monitoring, and drone surveillance, but food safety was a new concept for me.

The presentation centered around the romaine lettuce scare of a few months ago. The food industry was unable to quickly identify the source of the contaminated produce, and the result was a nationwide recall of all romaine. It turns out the problem came from one farm in California with E. coli contamination, but farmers everywhere paid a steep price as all romaine was yanked from store shelves and restaurants, also resulting in cancellations of upcoming orders.

Parts of the food industry have already implemented the needed solution. You might have noticed that the meat industry is usually able to identify the source of problems relatively quickly and can track problems back to an individual rancher or packing house. Cattle farmers are probably the most advanced at tracking the history of herd animals, but all meat producers track products to some extent.

The ideal solution to the romaine lettuce problem is to document every step of the farming process and to make that information available to retailers and eventually to consumers. In the case of romaine that might mean tracking and recording the basic facts of each crop at each farm. That would mean recording the strain of seeds used. It would mean logging the kinds of fertilizer and insecticide applied to a given field. It would mean recording the date when the romaine was picked. The packing and shipping process would then be tracked so that everything from the tracking number on the box or crate to the dates and identity of every intermediate shipper between farm and grocery store would be recorded.
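
As a sketch of what that data collection might look like, here’s a minimal record structure. Every field name is hypothetical, since no federal standard exists yet:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Hypothetical traceability record; field names are illustrative,
# not drawn from any existing federal standard.
@dataclass
class ProduceLot:
    farm_id: str
    field_id: str
    crop: str
    seed_strain: str
    treatments: List[str]   # fertilizers and insecticides applied
    harvest_date: date
    crate_ids: List[str] = field(default_factory=list)
    shippers: List[str] = field(default_factory=list)  # every handler, farm to store

lot = ProduceLot(
    farm_id="CA-0417", field_id="F12", crop="romaine",
    seed_strain="Green Towers", treatments=["fertilizer-X", "insecticide-Y"],
    harvest_date=date(2019, 4, 2),
)
lot.crate_ids.append("CRT-20190402-0088")
lot.shippers.append("regional-distributor-3")
```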

Initially this would be used to avoid the large blanket recalls like the one that happened with romaine. Ultimately, this kind of information could be made available to consumers. We could wave our smartphones at produce and find out where it was grown, when it was picked, and how long it’s been sitting in the store. There are a whole lot of steps that have to happen before the industry can reach that ultimate goal.

The process needs to start with rural broadband. The farmer needs to be able to log the needed information in the field. The day may come when robots can automatically log everything about the growing process, and that will require even more intensive and powerful broadband. The farmer today needs an easy data entry system that allows data to be scanned into the cloud as they work during the growing, harvesting, and packing process.

There also needs to be some sort of federal standard so that every farmer collects the same data, in a format that can be used by every grocery store and restaurant. There is certainly a big opportunity for any company that can develop the scanners and the software involved in such a system.

In many places this can probably be handled with robust cellular data service that extends into the fields. However, there is a lot of rural America that doesn’t have decent, or even any cell service out in the fields. Any farm tracking data is also going to need adequate broadband to upload data into the cloud. Farms with good broadband are going to have a big advantage over those without. We already know this is true today for cattle and dairy farming where detailed records are kept on each animal. I’ve talked to farmers who have to drive every day to find a place to upload their data into the cloud.

In the many counties where I work today the farmers are among those leading the charge for better broadband. If selling produce or animals requires broadband we are going to see farmers move from impatience to insistence when lack of connectivity means loss of profits.

I know as a consumer that I would feel better knowing more about the produce I buy. I’d love to buy more produce that was grown locally or regionally, but it’s often nearly impossible to identify in the store. I’d feel a lot safer knowing that the batch of food I’m buying has been tracked and certified as safe. Just in the last year there’s been recalls on things like romaine, avocados, spring onions, and packaged greens mixes. I don’t understand why any politician that serves a farming district is not screaming loudly for a national solution for rural broadband.