Surveys for Grants and Loans

Many of the federal and state grant programs and many broadband lenders want applicants to undertake a survey to quantify the likely success of a new broadband venture. Unfortunately, there are far too many broadband projects being launched that are unable to answer the basic question, “How many customers are likely to buy service from the new network?” There are only two ways to get a reliable answer to that question – a canvass or a statistically valid survey.

A canvass is the easiest to understand: it involves knocking on the door of or calling every potential customer in a market. I’ve seen many clients have good luck with this when overbuilding a small town or a subdivision. A canvass will be most successful when an ISP has all of the facts needed by potential customers, such as specific products and prices. Many companies would label the canvass process as pre-selling – getting potential customers to tentatively commit before construction.

The alternative to a canvass is a ‘statistically valid’ survey. Any survey that doesn’t meet the statistically valid test isn’t worth the paper it’s printed on. There are a few key aspects of doing a statistically valid survey:

Must be Random. This is the most important aspect of a valid survey and is where many surveys fail. Random means that you are sampling the whole community, not just a subset of respondents. A survey that is mailed to people or put online for anybody to take is not random.

The problem with a non-random survey is that the respondents self-select. For example, if you mail a survey to potential customers, then people who are interested in broadband are the most likely to respond and to return the completed survey. It can feel good to get back a lot of positive responses, but it’s far more important to hear from those who don’t support fiber.

The whole purpose of doing a broadband survey is to quantify the amount of support – and that also means quantifying those who won’t buy fiber. I’ve seen results from mailed surveys where almost every response was pro-broadband, and of course, that is unlikely. That result just means that the people who aren’t interested in broadband didn’t bother to complete or return the survey. The only way you can put any faith in a mailed survey is if you get so many responses that it approaches being a canvass. A good analogy for the problems with a mailed survey would be to stand in front of a grocery store and ask customers if they like to shop there. While there may be a few customers with complaints, such a survey would not tell you anything about how the community feels about that store since the question was never put to those who don’t shop there.

This blog is too short to describe survey methods – but there are specific acceptable techniques for conducting a random survey either by telephone or by knocking on doors. It’s possible to do those tasks non-randomly, so you should seek advice before conducting a phone or door-knocking survey.
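
To give a sense of what ‘statistically valid’ implies in practice, below is a minimal sketch of the standard sample-size math in Python. The town size, margin of error, and confidence level in the example are hypothetical, and a real survey firm will also adjust for response rates and stratification.

    import math

    def sample_size(population, margin_of_error=0.05, confidence_z=1.96, p=0.5):
        # Standard sample-size formula for a proportion, with a
        # finite-population correction for small communities.
        n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        n = n0 / (1 + (n0 - 1) / population)
        return math.ceil(n)

    # Hypothetical example: a town of 4,000 households, +/-5% at 95% confidence
    print(sample_size(4000))   # about 351 completed responses

The math only holds if the responses are random – 351 self-selected mail responses from that same town would still tell you very little.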

Non-biased Questions. Survey questions must be non-biased, meaning that they can’t lead a respondent towards a certain answer. A question like, “Do you want to save money on broadband?” is worthless because it’s hard to imagine anybody answering no to that question. It’s a lot harder to write non-biased questions than you might think, and bias can be a lot more subtle than that question.

Respondent Bias. People who conduct surveys know that there are some kinds of questions that many respondents won’t answer truthfully. For example, I’ve read that nearly half of applicants lie about their annual income when applying for a credit card. For various reasons people want others to think they earn more than they actually do.

Respondent bias can apply to a broadband survey as well. I’ve learned that you can’t rely on responses having to do with spending. For example, many respondents will under-report what they pay each month for broadband. Perhaps people don’t want the survey taker to think they spend too much.

Respondent bias is one of the reasons that political surveys are less reliable than surveys on more factual topics – respondents may not tell the truth about who they will vote for or how they feel about political issues. Luckily, most people are truthful when asked about non-emotional topics and factual questions, and we’ve found residential broadband surveys to be a great predictor of market interest in broadband.

Survey Fatigue. Respondents have a natural tendency to give up if a survey takes too long. They will hang up on a phone survey or start giving quick and inaccurate answers to get rid of somebody at their door. A survey ought to last no longer than 10 minutes, and the ideal length should be closer to five minutes.

The big takeaway from this discussion is that doing a survey the wrong way will likely give you the wrong answer to the basic question of likely market penetration. You’re better off not doing a survey than doing one that is not statistically valid. I don’t know of anything more deadly in launching a new broadband market than having false expectations of the number of customers that will buy broadband.

What’s the Future for CenturyLink?

I don’t know how many of you watch industry stock prices. I’m certainly not a stock analyst, but I’ve always tracked the stock prices of the big ISPs as another way to try to understand the industry. The stock prices for big ISPs are hard to compare because every big ISP operates multiple lines of business these days. AT&T and Verizon are judged more as cellular companies than as ISPs. AT&T and Comcast stock prices reflect that both are major media companies.

With that said, the stock price for CenturyLink has performed far worse than other big ISPs over the last year. A year ago a share of CenturyLink stock was at $19.24. By the end of the year the stock price was down to $15.44. As I wrote this blog the price was down to $10.89. That’s a 43% drop in share price over the last year and a 30% drop since the first of the year. For comparison, following are the stock prices of the other big ISPs and also trends in broadband customers:

Company        Stock Price 1 Year Ago   Stock Price Now   % Change   2018 Change in Broadband Customers
CenturyLink    $19.24                   $10.89            -43.4%     -262,000
Comcast        $32.14                   $43.15             34.3%      1,353,000
Charter        $272.84                  $377.89            38.5%      1,271,000
AT&T           $32.19                   $30.62             -4.9%     -18,000
Verizon        $48.49                   $56.91             17.4%      2,000

As a point of comparison to the overall market, the Dow Jones Industrial average was up 4% over this same 1-year period. The above chart is not trying to make a correlation between stock prices and broadband customers since that is just one of dozens of factors that affect the performance of these companies.
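
For readers who want to check the math, the percentage changes in the table are simple year-over-year comparisons; here is a quick Python check using the share prices shown above.

    # Quick check of the percentage changes in the table above
    prices = {
        "CenturyLink": (19.24, 10.89),
        "Comcast": (32.14, 43.15),
        "Charter": (272.84, 377.89),
        "AT&T": (32.19, 30.62),
        "Verizon": (48.49, 56.91),
    }

    for company, (year_ago, now) in prices.items():
        change = (now - year_ago) / year_ago * 100
        print(f"{company}: {change:+.1f}%")
    # CenturyLink -43.4%, Comcast +34.3%, Charter +38.5%, AT&T -4.9%, Verizon +17.4%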

Again, I’ve never fully understood how Wall Street values any given company. In reading analyst reports on CenturyLink it seems that the primary reason for the drop in stock price is that all of the company’s business units are trending downward. In the recently released 1Q 2019 results the company showed a year-over-year drop in results for the international, enterprise, small and medium business, wholesale, and consumer business units. It seems that analysts had hoped that the merger with Level 3 would reverse some of the downward trends. Stock prices also dropped when the company surprised the market by cutting its dividend payment in half in February.

CenturyLink faces the same trends as all big ISPs – traditional business lines like landline telephone and cable TV are in decline. Perhaps the most important trend affecting the company is the continued migration of broadband customers from copper-based DSL to cable company broadband. CenturyLink is not replacing the DSL broadband customers it’s losing. In 2018 CenturyLink lost a lot of broadband customers with speeds under 20 Mbps, but had a net gain of customers using more than 20 Mbps. CenturyLink undertook a big fiber-to-the-home expansion in 2017 and built fiber to pass 900,000 homes and businesses – but currently almost all expansion of last-mile networks is on hold.

It’s interesting to compare CenturyLink as an ISP with the big cable companies. The obvious big difference is the trend in broadband customers and revenues. Where CenturyLink lost 262,000 broadband customers in 2018, the two biggest cable companies each added more than a million new broadband customers for the year. CenturyLink and other telcos are losing the battle of DSL versus cable modems with customers migrating to cable companies as they seek faster speeds.

It’s also interesting to compare CenturyLink to the other big telcos. From the perspective of being an ISP, AT&T and Verizon are hanging on to total broadband customers. Both companies are also losing the DSL battle with the cable companies, but each is adding fiber customers to compensate for those losses. Both big telcos are building a lot of new fiber, mostly to provide direct connectivity to their own cell sites, but secondarily to then take advantage of other fiber opportunities around each fiber node.

Verizon has converted over a hundred telephone exchanges in the northeast to fiber-only and is getting out of the copper business in urban areas. Verizon has been quietly filling in its FiOS fiber network to cover the copper it’s abandoning. While nobody knows yet if it’s real, Verizon also has been declaring big plans to expand into new broadband markets using 5G wireless loops.

AT&T was late to the fiber game but has been quietly yet steadily adding residential and business fiber customers over the last few years. They have adopted a strategy of chasing pockets of customers anywhere they own fiber.

CenturyLink had started down the path to replace DSL customers when they built a lot of fiber-to-the-home in 2017. Continuing with fiber construction would have positioned the company to take back a lot of the broadband market in the many large cities it serves. It’s clear that the new CenturyLink CEO doesn’t like the slow returns from investing in last-mile infrastructure and it appears that any hopes to grow the telco part of the business are off the table.

Everything I read says that CenturyLink is facing a corporate crisis. Diving stock prices always put a strain on a company. CenturyLink faces added pressure since the activist investor group Southeastern Asset Management holds more than a 6% stake in the company and made an SEC filing asserting that the company’s fiber assets are undervalued.

The company has underperformed compared to its peers ever since it was spun off from AT&T as US West. The company then had what turned out to be a disastrous merger with Qwest. There was hope a few years back that the merger with CenturyLink would help to right the company. Most recently came the merger with Level 3, and at least for now that hasn’t made a big difference. It’s been reported that CenturyLink has hired advisors to consider whether to sell or spin off the telco business unit. That analysis has just begun, but it won’t be surprising to hear about a major restructuring of the company.

Setting the Right Goals for Grants

Most past federal broadband grant programs had very specific goals. For example, the USDA Community Connect grants that have been around for many years target grants to the poorest parts of the country – the awards are weighted towards communities with the highest levels of poverty. For any grant program to be effective the goals of the program need to be clearly defined, and then the award process needs to be aligned with those goals.

The FCC needs to define the goals of the upcoming $20.4 billion grant program. If the goals are poorly defined, then the resulting grant awards are likely to be all over the board in terms of effectiveness. What are the ideal goals for a grant program of this magnitude?

The first goal to be decided is the scope of the coverage – will the goal be to bring somewhat better broadband to as many households as possible, or will it be to bring a long-term broadband solution to a smaller number of households? If the goal is to serve the most households possible, then the grants are going to favor lower-cost technologies and the grants will likely go to the wireless providers and satellite providers – as we saw happen in the recent CAF II reverse auction.

If the grants are aimed at a more permanent solution, then the grants will favor fiber. Perhaps the grants could also go towards anybody willing to extend a hybrid fiber-coaxial cable network into rural areas – but no other technology can be considered a permanent solution.

There are huge consequences for choosing the first option of serving as many households as possible. These new grants are mostly going to be awarded in the geographic areas covered by the original CAF II program. That program awarded over $11 billion to the big telcos to beef up broadband to speeds of at least 10/1 Mbps. Now, before that program is even finished, the FCC is talking about overbuilding those same areas with another $20 billion grant program. If this grant program is used to upgrade homes to fixed wireless, it doesn’t take a crystal ball to see that ten years from now we’ll be talking about overbuilding these areas again with fiber. It would be incredibly wasteful to use multiple rounds of grants to upgrade the same geographic areas several times.

The other big issue for these grants to deal with is defining which parts of the country are eligible for the grants. What should be the criteria to decide which homes can be upgraded?

If the test is going to be related to existing speeds, the FCC is going to have to deal with the existing broadband coverage maps that everybody in the industry knows to be badly flawed. The FCC is talking about tackling a new mapping effort – but it’s highly likely that the new maps will just swap old mapping errors for new mapping errors. The reality on the ground is that it’s virtually impossible to map the real speeds on copper or fixed wireless networks. In real life, two rural neighbors can have drastically different speeds due to something as simple as being on different copper pairs. It’s impossible to accurately map DSL or wireless broadband coverage.

To make matters even worse, the current Re-Connect grants are saddled with a rule that says that no more than 10% of grant-covered homes can have existing broadband of more than 10/1 Mbps. Layering that kind of rule on top of terrible maps creates an environment where an ISP is largely unable to define a believable grant footprint.

The FCC must figure out some way to rectify the mapping problem. One of the easiest ways is what I call the technology test – anybody that wants to overbuild copper with fiber should automatically be eligible without trying to figure out the current speeds on the copper. Perhaps the easiest rule could be that any place where there is telco copper and no cable company network should be grant-eligible for fiber overbuilders.
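
The technology test boils down to a very blunt rule, which a minimal sketch in Python can make concrete. The function and its inputs are hypothetical, just to show how simply the rule could be stated for a given census block:

    def grant_eligible_for_fiber(has_telco_copper: bool, has_cable_network: bool) -> bool:
        # Proposed technology test: skip speed measurements entirely and make
        # any copper-only area eligible for a fiber overbuilder.
        return has_telco_copper and not has_cable_network

    # Hypothetical census block served only by DSL over telco copper
    print(grant_eligible_for_fiber(has_telco_copper=True, has_cable_network=False))  # True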

Assuming the grants won’t all go to fiber, then there has to be an alternate way for an ISP or a community to challenge poor maps. Perhaps the FCC needs to provide a realistic time frame to allow local governments to demonstrate the actual speeds in an area, much like what was done in the recent Mobility II grant process.

This blog is part of a series on Designing the Ideal Federal Grant Program.

Designing the Ideal Federal Broadband Grant Program

In April, FCC Chairman Ajit Pai announced a new rural broadband initiative that will provide $20.4 billion of new funding. We don’t know many details yet, but here are a few things that will likely be involved in awarding the funding:

  • The FCC is leaning towards a reverse auction.
  • The program will likely require technologies that can deliver at least 25/3 Mbps broadband speeds.
  • The program will be funded within the existing Universal Service Fund, mostly by repositioning the original CAF II plan.
  • The grants might all be awarded at once, similar to A-CAM and CAF II awards, meaning that there might be only one chance to apply, with the awards to be paid out over a longer time period.

I’m writing a series of blogs that will examine the ideal way to design and administer a grant program of this size. We’ve seen both good and disastrously bad federal broadband programs before, and I’m hoping the FCC will take some time to make this grant program one of the effective ones. I’m sure the details of this new program are not yet set in stone, and folks in rural America need to make their voices heard now if they want some of this money to benefit their communities.

I’m going to look at the following topics, and perhaps more as I write this. At the end of this process I’ll post a whitepaper on my website that consolidates all of these discussions into one document.

A well-designed broadband grant program of this magnitude should consider the following:

What is the End Goal?

It’s important up-front for the FCC to determine how the grant moneys are to be used. The best grant programs have a specific goal, and then the application and award process is designed to best meet the goals. The goal can’t be something as simple as ‘promote rural broadband’, because a goal that simplistic is bound to create a hodgepodge of grant awards.

What Broadband Speeds Should be Supported?

This is an area where the FCC failed miserably in the past. They awarded over $11 billion in the CAF II program to upgrade broadband to speeds of only 10/1 Mbps. When the FCC set the 10/1 Mbps target, it didn’t even meet their own definition of broadband. How should the FCC determine eligible speeds this time to avoid a repeat of the CAF II debacle?

Who Should be Eligible?

FCC programs in the past have usually made the monies available to a wide range of recipients. However, the specific details of the grant programs have often made it hard for whole classes of entities like cities or counties to accept the awards. As an example, there are many entities electing to not participate in the current Re-Connect grant program because they can’t accept any part of the awards that include RUS loans.

Is a Reverse Auction the Right Mechanism?

The FCC and numerous politicians currently favor reverse auctions. Like any mechanism, there are situations where reverse auctions are a great tool and others where they will distort the award process. Are reverse auctions a good tool for this grant program?

Other Issues

There are two drastically different ways to hand out these grants. One is to follow the CAF II mechanism and award all of the $20 billion in one huge auction and then pay it out over 6 or 8 years. The other would be to divide the award money into even tranches and have a new grant award for each of those years.

In the recent Re-Connect grants the FCC decided to blend grants and loans. I know the loan component stopped most of my clients from pursuing these grants. Should there be a loan component of the grants?

There are also technical issues to consider. I had clients who were outbid in the recent CAF II reverse auction by wireless companies that gained bidding preference by promising that their fixed wireless networks could achieve across-the-board 100 Mbps broadband. I still don’t know of a wireless technology that can do that over a large footprint. How should the FCC make sure that technologies deliver what’s promised?

What’s the Role of States in this Process?

What should states be doing to maximize the chance for federal grant money to be awarded to their state?

This blog is part of a series:

Setting the Right Goals for Grants

Clearing Mid-range Spectrum

The FCC is in the process of trying to free up mid-range spectrum for 5G. They just opened a Notice of Proposed Rulemaking looking at 2.5 GHz spectrum, located in the contiguous block between 2495 and 2690 MHz. Overall this is the largest contiguous block of mid-range spectrum. Over half of the spectrum sits idle today, particularly in rural America. The history of this spectrum demonstrates the complications involved in trying to reposition spectrum for broadband and cellular use.

The frequency was first assigned by the FCC in 1963 when it was made available to school systems to transmit educational TV between multiple schools. The spectrum band was called Instructional Television Fixed Service (ITFS). The band was divided into twenty channels and could transmit a TV signal up to about 35 miles. I grew up in a school system that used the technology and from elementary school onward we had a number of classes taught on the TV. Implementing the technology was expensive and much of the spectrum was never claimed.

In 1972 the FCC recognized the underuse of the spectrum and allowed commercial operators to use the bands of 2150 to 2162 MHz on an unlicensed basis for pay-TV transmissions to rooftop antennas. The spectrum could only carry a few TV channels and in the 1970s was used in many markets to transmit the early version of HBO and Nickelodeon. This spectrum band was known as Multipoint Distribution Service (MDS) and also was good for about 35 miles.

Reacting to pressure from cellular companies, the FCC reallocated eight additional channels of the spectrum for commercial use. Added to the MDS spectrum this became known as Multichannel Multipoint Distribution Service (MMDS). At the time this displaced a few school systems and anybody using the spectrum had to pay to move a school system to another workable channel. This spectrum was granted upon request to operators for specific markets.

In 1991 the FCC changed the rules for MMDS and allowed the channels to be used to transmit commercial TV signals. In 1995 any unused MMDS spectrum was sold under one of the first FCC auctions, which was the first to divide service areas into the geographic areas known as Basic Trading Areas (or BTAs) that are still used today. Before this auction, the spectrum was awarded in 35-mile circles called Geographic Service Areas (GSAs). The existing GSAs were left in place and the spectrum sold at auction had to work around existing GSAs.

The FCC started getting pressure from wireless companies to allow for the two-way transmission of data in the frequency (up to now it had been all one-way delivery to a customer site). In 2005 the FCC changed the rules and renamed the block of spectrum as Broadband Radio Service (BRS). This added significant value to licenses since the spectrum could now be repositioned for cellular usage.

At this point, Clearwire entered the picture and saw the value of the spectrum. They offered to buy or lease the spectrum from school systems at prices far lower than market value and were able to amass the right to use a huge amount of the spectrum nationwide. Clearwire never activated much of the spectrum and was in danger of losing the rights to use it. In 2013 Sprint purchased Clearwire, and Sprint is the only cellular company using the spectrum band today.

Today the spectrum band has all sorts of users. There are still school districts using the spectrum to transmit cable TV. There are still license holders who never stopped using the 35-mile GSA areas. There are still MMDS license holders who found a commercial use for the spectrum. And Sprint holds much of the spectrum not held by these other parties.

The FCC is wrestling in the NPRM with how to undo the history of the spectrum to make it more valuable to the industry. Education advocates are still hoping to play in the space since much of the spectrum sits idle in rural America (as is true with a lot of cellular and other mid-range spectrum). The other cellular carriers would like to see chunks of the spectrum sold at auction. Other existing license holders are fighting to extract the biggest value out of any change of control of the spectrum.

The challenge for repositioning this spectrum is complicated because the deployment of the spectrum differs widely today by market. The FCC is struggling to find an easy set of rules to put the genie back in the bottle and start over again. In terms of value for 5G, this spectrum sits in a sweet spot in terms of coverage characteristics. Using the spectrum for cellular data is probably the best use of the spectrum, but the FCC has to step carefully to do this in such a way as to not end up in court cases for years disputing any order. Reallocating spectrum is probably the most difficult thing the FCC does and it’s not hard to see why when you look at the history of this particular block of spectrum and realize that every block of spectrum has a similar messy past.

AT&T and Augmented Reality

Lately it seems like I find a news article almost every week talking about new ways that people are using broadband. The latest news is an announcement that AT&T is selling Magic Leap augmented reality headsets in six cities plus online.

The AT&T launch is being coordinated with the release of an augmented reality immersive experience that will bring The Game of Thrones into people’s homes with a themed gaming experience called The Dead Must Die, with a teaser in this trailer.

Augmented reality differs from virtual reality in that augmented reality overlays images into the local environment. A user will see characters in their living room as opposed to being immersed in a total imaginary environment with virtual reality.

Magic Leap is one of the most interesting tech start-ups. They started in 2014 with a $542 million investment, and since then have raised over $2.3 billion. The company’s investors and advisors include people like Alibaba executive vice chair Joe Tsai and director Steven Spielberg. There have been rumors over the years of an impending product, but until now they’ve never brought a product to market. AT&T will be selling Magic Leap’s first headset, called the Magic Leap One Creator Edition, for a price of $2,295. The mass-market headset will surely cost a lot less.

AT&T’s interest in the technology extends past selling the headsets. Magic Leap recently signed a deal with the NBA and its broadcast partner Turner, which is now owned by AT&T, and the companies will obviously be looking at augmented reality broadcasts of basketball games.

AT&T’s interest goes even further than that: they are looking at the Magic Leap technology as the entry into the spatial Internet – moving today’s web experience to three dimensions. AT&T sees the Magic Leap headset as the entry point for bringing augmented reality to industries like healthcare, retail and manufacturing. They envision people shopping in 3D, doctors getting 3D computer assistance for visualizing a patient during an operation, and manufacturing workers aided by 3D blueprints overlaid on the manufacturing floor.

While the Magic Leap headset will work on WiFi today, AT&T is promoting Magic Leap as part of their 5G Innovation Program. AT&T is touting this as a technology that will benefit greatly from 5G, which will allow users to go mobile and use the augmented reality technology anywhere.

I couldn’t find any references to the amount of bandwidth used by this first-generation headset, but it has to be significant. Looking at the Game of Thrones application, a user is immersed in a 3D environment and can move and interact with elements in the augmented reality. That means a constant transmission of the elements in the 3D environment. I have to think that is at least equivalent to several simultaneous video transmissions. Regardless of the bandwidth used today, you can bet that as augmented reality becomes mainstream that content makers will find ways to use greater bandwidth.
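
As a back-of-envelope illustration of that guess, here is a tiny sketch in Python. Every number in it is an assumption chosen for illustration, since neither AT&T nor Magic Leap has published bandwidth figures for the headset.

    # Rough guess at what an augmented reality session might consume.
    # All figures are assumptions for illustration only.
    hd_video_stream_mbps = 5      # a typical compressed 1080p video stream
    equivalent_streams = 4        # treat AR as several simultaneous video feeds
    overhead_factor = 1.25        # positional tracking, audio, protocol overhead

    ar_session_mbps = hd_video_stream_mbps * equivalent_streams * overhead_factor
    print(f"Rough AR session estimate: {ar_session_mbps:.0f} Mbps")   # ~25 Mbps

Even if the real number is half or double that, the point stands: this is a meaningful new increment of household bandwidth.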

We are already facing a big increase in bandwidth that is needed to support gaming from the cloud – as is now being pushed by the major game vendors. Layering augmented reality on top of that big data stream will increase bandwidth needs by another major increment.

The Fastest and Slowest Internet in the US

The web site HighSpeedInternet.com has calculated and ranked the average Internet speeds by state. The site offers a speed test and then connects visitors to the web pages for the various ISPs in each zip code in the country. I have to imagine the site makes a commission for broadband customers that subscribe through their links.

Not surprisingly, the east coast states with Verizon FiOS ranked at the top of the list for Internet speeds since many customers in those states have the choice between a fiber network and a big cable company network.

For example, Maryland was top on the list with an average speed of 65 Mbps, as measured by the site’s speed tests. This was followed by New Jersey at 59.6 Mbps, Delaware at 59.1 Mbps, Rhode Island at 56.8 Mbps and Virginia at 56 Mbps.

Even though it sits at the top of the list, Maryland is like most states in that there are still rural areas with slow or non-existent broadband. The average speed test results are the aggregation of all of the various kinds of broadband customers in the state:

  • Customers with fast Verizon FiOS products
  • Customers with fast broadband from Comcast, the largest ISP in the state
  • Customers that have elected slower, but less expensive DSL options
  • Rural customers with inferior broadband connections

Considering all of the types of customers in the state, an average speed test result of 65 Mbps is impressive. This means that a lot of households in the state have speeds of 65 Mbps or faster. That’s not a surprise considering that both Verizon FiOS and Comcast have base product speeds considerably faster than 65 Mbps. If I was a Maryland politician, I’d be more interested in the distribution curve making up this average. I’d want to know how many speed tests were done by households getting only a few Mbps speeds. I’d want to know how many gigabit homes were in the mix – gigabit is so much faster than the other broadband products that it pulls up the average speed.
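
The reason the distribution matters more than the average is easy to show with a toy example. The speed-test values below are made up, but they illustrate how a handful of gigabit results can pull the mean well above what a typical household actually gets.

    import statistics

    # Hypothetical speed-test results in Mbps - not real data
    speed_tests = [3, 5, 10, 12, 25, 40, 60, 75, 100, 100, 200, 940, 940]

    print(f"Mean:   {statistics.mean(speed_tests):.0f} Mbps")    # 193 Mbps
    print(f"Median: {statistics.median(speed_tests):.0f} Mbps")  # 60 Mbps
    share_slow = sum(s < 25 for s in speed_tests) / len(speed_tests)
    print(f"Share under 25 Mbps: {share_slow:.0%}")              # 31%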

I’d also be interested in speeds by zip code. I took a look at the FCC broadband data reported on the 477 forms just for the city of Baltimore and I see widely disparate neighborhoods in terms of broadband adoption. There are numerous neighborhoods just north of downtown Baltimore with broadband adoption rates as low as 30%, and numerous neighborhoods under 40%. Just south of downtown and in the northernmost extremes of the city, the broadband adoption rates are between 80% and 90%. I have to guess that the average broadband speeds are also quite different in these various neighborhoods.

I’ve always wondered about the accuracy of compiling the results of mass speed tests. Who takes these tests? Are people with broadband issues more likely to take the tests? I have a friend who has gigabit broadband and he tests his speed all of the time just to see that he’s still getting what he’s paying for (just FYI, he’s never measured a true gigabit, just readings in the high 900s Mbps). I take a speed test every time I read something about speeds. I took the speed test at this site from my office and got a download speed of 43 Mbps. My office happens to be in the most distant corner of the house from the incoming cable modem, and at the connection to the Charter modem we get 135 Mbps. My slower result on this test is due to WiFi, and yet this website will log me as an underperforming Charter connection.

There were five states at the bottom of the ranking. Last was Alaska at 17 Mbps, Mississippi at 24.8 Mbps, Idaho at 25.3 Mbps, Montana at 25.7 Mbps and Maine at 26 Mbps. That’s five states where the average internet speed is at or below the FCC’s definition of broadband.

The speeds in Alaska are understandable due to the remoteness of many of the communities. There are still numerous towns and villages that receive Internet backhaul through satellite links. I recently read that the first fiber connection between the US mainland and Alaska is just now being built. That might help speeds some, but there is a long way to go to string fiber backhaul to the remote parts of the state.

Mostly what the bottom of the scale shows is that states that are both rural and somewhat poor end up at the bottom of the list. Interestingly, the states with the lowest household densities such as Wyoming and South Dakota are not in the bottom five due to the widespread presence of rural fiber built by small telcos.

What most matters about this kind of headline is that even in the states with fast broadband there are still plenty of customers with lousy broadband. I would hope that Maryland politicians don’t look at this headline and think that their job is done – by square miles of geography the majority of the state still lacks good broadband.

Broadband and Food Safety

I recently saw a presentation that showed how food safety is starting to rely on good rural broadband. I’ve already witnessed many other ways that farmers use broadband like precision farming, herd monitoring, and drone surveillance, but food safety was a new concept for me.

The presentation centered around the romaine lettuce scare of a few months ago. The food industry was unable to quickly identify the source of the contaminated produce and the result was a recall of all romaine nationwide. It turns out the problem came from one farm in California with E. coli contamination, but farmers everywhere paid a steep price as all romaine was yanked from store shelves and restaurants, also resulting in cancellations of upcoming orders.

Parts of the food industry have already implemented the needed solution. You might have noticed that the meat industry is usually able to identify the source of problems relatively quickly and can usually track problems back to an individual rancher or packing house. Cattle farmers are probably the most advanced at tracking the history of herd animals, but all meat producers track products to some extent.

The ideal solution to the romaine lettuce problem is to document every step of the farming process and to make that information available to retailers and eventually to consumers. In the case of romaine that might mean tracking and recording the basic facts of each crop at each farm. That would mean recording the strain of seeds used. It would mean logging the kinds of fertilizer and insecticide applied to a given field. It would mean recording the date when the romaine was picked. The packing and shipping process would then be tracked so that everything from the tracking number on the box or crate to the dates and identity of every intermediate shipper between farm and grocery store is recorded.
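
To make the idea concrete, below is a minimal sketch in Python of what one such record might look like. The field names and sample values are hypothetical; a real system would follow whatever federal standard eventually emerges.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ShipmentLeg:
        carrier: str
        picked_up: str      # ISO date
        delivered: str      # ISO date

    @dataclass
    class ProduceLot:
        farm_id: str
        field_id: str
        crop: str
        seed_strain: str
        fertilizers: List[str]
        insecticides: List[str]
        harvest_date: str
        crate_tracking_number: str
        shipping_legs: List[ShipmentLeg] = field(default_factory=list)

    # Hypothetical lot of romaine, traceable from field to store
    lot = ProduceLot("CA-1042", "field-7", "romaine", "Green Towers",
                     ["compost"], ["spinosad"], "2019-03-14", "LOT-88231")
    lot.shipping_legs.append(ShipmentLeg("CoolHaul Trucking", "2019-03-15", "2019-03-17"))

With records like this in the cloud, a recall could be narrowed to one lot from one field instead of pulling every head of romaine off every shelf in the country.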

Initially this would be used to avoid the large blanket recalls like the one that happened with romaine. Ultimately, this kind of information could be made available to consumers. We could wave our smartphone at produce and find out where it was grown, when it was picked and how long it’s been sitting in the store. There are a whole lot of steps that have to happen before the industry can reach that ultimate goal.

The process needs to start with rural broadband. The farmer needs to be able to log the needed information in the field. The day may come when robots can automatically log everything about the growing process, and that will require even more intensive and powerful broadband. The farmer today needs an easy data entry system that allows data to be scanned into the cloud as they work during the growing, harvesting, and packing process.

There also needs to be some sort of federal standards so that every farmer is collecting the same data, and in a format that can be used by every grocery store and restaurant. There is certainly a big opportunity for any company that can develop the scanners and the software involved in such a system.

In many places this can probably be handled with robust cellular data service that extends into the fields. However, there is a lot of rural America that doesn’t have decent, or even any cell service out in the fields. Any farm tracking data is also going to need adequate broadband to upload data into the cloud. Farms with good broadband are going to have a big advantage over those without. We already know this is true today for cattle and dairy farming where detailed records are kept on each animal. I’ve talked to farmers who have to drive every day to find a place to upload their data into the cloud.

In the many counties where I work today the farmers are among those leading the charge for better broadband. If selling produce or animals requires broadband we are going to see farmers move from impatience to insistence when lack of connectivity means loss of profits.

I know as a consumer that I would feel better knowing more about the produce I buy. I’d love to buy more produce that was grown locally or regionally, but it’s often nearly impossible to identify in the store. I’d feel a lot safer knowing that the batch of food I’m buying has been tracked and certified as safe. Just in the last year there’s been recalls on things like romaine, avocados, spring onions, and packaged greens mixes. I don’t understand why any politician that serves a farming district is not screaming loudly for a national solution for rural broadband.

Access to Low-Price Broadband

The consumer advocate BroadbandNow recently made an analysis of broadband prices across the US and came up with several conclusions:

  • Broadband prices are higher in rural America.
  • They conclude that 45% of households don’t have access to a ‘low-priced plan’ for a wired Internet connection.

They based their research on the published prices of over 2,000 ISPs. As somebody who does that same kind of research in individual markets, I can say that there is often a big difference between published rates and actual rates. Smaller ISPs tend to charge the prices they advertise, so the prices that BroadbandNow found in rural America are likely the prices most customers really pay.

However, the big ISPs in urban areas routinely negotiate rates with customers and a significant percentage of urban broadband customers pay something less than the advertised rates. But the reality is even messier than that since a majority of customers still subscribe to a bundle of services. It’s usually almost impossible to know the price of any one service inside a bundle, and the ISP only reveals the actual rate when a customer tries to break the bundle to drop one of the bundled services. For example, a customer may think they are paying $50 for broadband in a bundle but find out their real rate is $70 if they try to drop cable TV. These issues make it hard to make any sense out of urban broadband rates.

I can affirm that rural broadband rates are generally higher. A lot of rural areas are served by smaller telcos and these companies realize that they need to charge higher rates in order to survive. As the federal subsidies to rural telcos have been reduced over the years these smaller companies have had to charge realistic rates that match their higher costs of doing business in rural America.

I think rural customers understand this. It’s a lot more expensive for an ISP to provide broadband in a place where there are only a few customers per road-mile of network than in urban areas where there might be hundreds of customers per mile. A lot of other commodities cost more in rural America for this same reason.

What this report is not highlighting is that the lower-price broadband in urban areas is DSL. The big telcos have purposefully priced DSL below the cost of cable modem broadband as their best strategy to keep customers. When you find an urban customer that’s paying $40 or $50 for broadband it’s almost always going to be somebody using DSL.

This raises the question of how much longer urban customers will continue to have the DSL option. We’ve already seen Verizon abandon copper-based products in hundreds of urban exchanges in the last few years. Customers in those exchanges can theoretically now buy FiOS on fiber – and pay more for the fiber broadband. This means for large swaths of the northeast urban centers that the DSL option will soon be gone forever. There are persistent industry rumors that CenturyLink would like to get out of the copper business, although I’ve heard no ideas of how they might do it. It’s also just a matter of time before AT&T starts walking away from copper. Will there even be any urban copper a decade from now? Realistically, as DSL disappears with the removal of copper the lowest prices in the market will disappear as well.

There is another trend that impacts the idea of affordable broadband. We know that the big cable companies now understand that their primary way to keep their bottom line growing is to raise broadband rates. We’ve already seen big broadband rate increases in the last year, such as the $5 rate increase from Charter for bundled broadband.

The expectation on Wall Street is that the cable companies will regularly increase broadband rates going into the future. One analyst a year ago advised Comcast that basic broadband ought to cost $90. The cable companies are raising broadband rates in other quieter ways. Several big cable companies have told their boards that they are going to cut back on offering sales incentives for new customers and they want to slow down on negotiating rates with existing customers. It would be a huge rate increase for most customers if they are forced to pay the ‘list’ prices for broadband.

We also see carriers like Comcast starting to collect significant revenues from customers who go over their monthly data caps. As household broadband volumes continue to grow, the percentage of people exceeding their monthly cap should grow rapidly. We’ve also seen ISPs jack up the cost of WiFi or other modems as a backdoor way to get more broadband revenue.
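
To illustrate how quickly overage charges can add up, here is a small sketch in Python. The cap, block size, and prices are assumptions loosely modeled on the kind of per-block overage structure that has been reported for Comcast, not a statement of any carrier’s actual tariff.

    import math

    def overage_charge(usage_gb, cap_gb=1024, block_gb=50, block_price=10, max_charge=200):
        # Assumed structure: a flat fee per block of data above the monthly cap,
        # limited by a maximum monthly overage charge.
        if usage_gb <= cap_gb:
            return 0
        blocks = math.ceil((usage_gb - cap_gb) / block_gb)
        return min(blocks * block_price, max_charge)

    print(overage_charge(1200))   # 176 GB over the cap -> 4 blocks -> $40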

As the cable companies find ways to extract more revenue out of broadband customers and as the big telcos migrate away from DSL, my bet is that a decade from now there will be very few customers with ‘affordable’ broadband. Every trend is moving in the opposite direction.

5G and Home IoT

I’ve been asked a lot recently about the potential future of 5G – everybody in the industry wants to understand the potential threat from 5G. One of the biggest proposed uses for 5G is to connect IoT devices to the cloud. Today I’m going to look at what that might mean.

It’s clear that 5G cellular will be the choice for connecting to outdoor IoT sensors. Sensors for farm equipment in rural areas or for outdoor weather and traffic sensors in urban areas are going to be most easily handled by 5G cellular since the technology will eventually be everywhere. 5G is particularly suited for serving IoT devices due to frequency slicing, where just the right amount of bandwidth, large or small, can be allocated to each small outdoor sensor. 5G has another interesting feature that will allow it to poll sensors on a pre-set schedule rather than have the sensor constantly trying to connect – which will reduce power consumption at the sensor.
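
The power savings from scheduled polling are easy to see with some rough duty-cycle math, sketched below in Python. Every number is an assumption chosen for illustration, not a spec from any 5G or sensor vendor.

    # Illustrative duty-cycle math for a battery-powered sensor that is polled
    # on a schedule instead of keeping its radio connected all the time.
    battery_mah = 2000
    active_current_ma = 100     # radio transmitting during a poll
    sleep_current_ma = 0.01     # deep sleep between polls
    polls_per_day = 24
    seconds_per_poll = 2

    active_hours = polls_per_day * seconds_per_poll / 3600
    sleep_hours = 24 - active_hours
    avg_current_ma = (active_current_ma * active_hours + sleep_current_ma * sleep_hours) / 24
    battery_days = battery_mah / avg_current_ma / 24
    print(f"Average draw: {avg_current_ma:.3f} mA, battery life: {battery_days:.0f} days")

The same sensor holding an always-on connection at even a few milliamps of idle current would drain that battery in weeks rather than years.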

It’s clear that the cellular carriers also have their eye on indoor IoT devices. It’s harder to say that 5G will win this battle because today almost all indoor devices are connected using WiFi.

There are a couple of different 5G applications that might work in the indoor environment. The cellular carriers are going to make a pitch to be the technology of choice to connect small inside devices. In my home I can get a good cellular signal everywhere except in the old underground basement. There is no question that cellular signal from outside the home could be used to connect to many of the smaller bandwidth applications within the home. I can’t see any technical reason that devices like my Amazon Echo or smart appliances couldn’t connect to 5G instead of WiFi.

But 5G cellular has a number of hurdles to overcome to break into this market. I’m always going to have a wired broadband connection to my home, and as long as that connection comes from somebody other than one of the big cellular carriers I’m not going to want to use 5G if that means paying for another monthly subscription to a cellular provider. I’d much rather have my inside devices connected to the current broadband connection. I also want all of my devices on the same network for easy management. I want to use one hub to control smart light switches or other devices and want everything on the same wireless network. That means I don’t want some devices on WiFi and others on cellular.

One of the sales pitches for 5G is that it will be able to easily accommodate large numbers of IoT connections. Looking into the future, there might come a time when there are a hundred or more smart devices in the house. It’s not that hard to picture the Jetsons’ house, where window shades change automatically to collect or block sunlight, where music plays automatically when I enter a room, and where my coffee is automatically ready for me when I get out of bed in the morning. These things can be done today with a lot of effort, but with enough smart devices in a home these functions will probably eventually become mainstream.

One of the limitations of WiFi today is that it degrades in a busy environment. A WiFi network pauses each time it gets a new request for a connection, which is the primary reason it’s so hard to keep a good connection in a busy hotel or convention center.

However, the next generation of WiFi, WiFi 6, is already anticipating these needs in the home. WiFi can adopt the same frequency slicing used by 5G so that only a small portion of a channel is used to connect to a given device. Events can be scheduled so that the network polls certain sensors only periodically. The WiFi network might only interact with the smart coffee pot or the smart window shades when something needs to be done, rather than maintaining a constantly open channel. It’s likely that the next iterations of WiFi will become nearly as good as 5G for these functions within a closed home environment.

There is an even better solution that is also being discussed. There’s no reason that indoor routers can’t be built that use both WiFi and 5G frequencies. While the cellular companies are gobbling up millimeter wave spectrum, as long as there is an unlicensed slice of spectrum set aside for public use it will be possible to deploy both WiFi on mid-range frequencies and 5G on millimeter wave frequencies at the same time. This would blend the benefits of both technologies. It might mean using WiFi to control the smart coffee pot and indoor 5G to connect to the smart TV.

Unfortunately for the cellular carriers, these dual-function routers won’t need them. The same companies that make WiFi routers today can make combination 5G / WiFi routers that work with the full range of unlicensed spectrum – meaning no revenue opportunity for the cellular carriers. When I look at all of the issues, I have a hard time seeing 5G cellular becoming a preferred technology within the home.