Existing 4G Spectrum

I suspect that most people don’t realize how few frequencies are used today to support cellular service. Below is a list of the frequencies used by each US cellular carrier to provide 4G LTE. Except for Sprint, they all use the same basic frequencies.

Frequencies (in MHz)

AT&T  – 1900, 1700 abcde, 700 bc

Verizon – 1900, 1700 f, 700 c

T-Mobile – 1900, 1700 def, 700 a, 600

Sprint – 1900 g, 850, 2500

The letters represent separate licenses for specific sub-bands of the various frequencies. For example, the 1700 MHz band has been licensed in bands a through f, and the carriers own rights to various sub-bands rather than to the whole spectrum. The same is true for the 1900 MHz and 700 MHz spectrum. In many cases, the licenses for the various spectrum bands are not nationwide, meaning the frequencies used in Cleveland by one of the carriers might be slightly different from the spectrum used in San Francisco.

The carriers are using these limited spectrum bands today to support both 4G voice and data. In metropolitan areas, the carriers are in big trouble. They are finding it impossible to satisfy customer requests for data service, which is resulting in customer blockages or greatly reduced broadband speeds.

One of the primary reasons that the carriers are running into blockages on 4G data is that they aren’t deploying enough different bands of spectrum for broadband. The carriers have three remedies that can be used to improve cellular data – use more bands of spectrum, build more cell sites (small cells), and implement 5G which will allow for more simultaneous connections.

The CTIA, the lobbying group for the wireless carriers, has been heavily lobbying the FCC to allocate 400 MHz of additional mid-range spectrum for cellular data. The FCC is considering repositioning numerous bands of spectrum, and the CTIA wants to grab everything possible for data purposes.

Unfortunately, spectrum alone is not going to provide the solution the wireless carriers are hoping for. One of the primary reasons that the cellular carriers only use a few different bands of spectrum today is to simplify handsets. There is a huge price to pay for using multiple bands of spectrum in a cell phone. The more bands of spectrum, the more antennas that must be supported and the more power that is used.

If the cellular companies try to load many more bands of mid-range spectrum onto cellphones, they will severely overstress the battery life of phones. Most cellphone customers are unlikely to want to trade faster data speeds for shorter battery life. As I look forward at the strategies of the cellular carriers, the battery life of cellphones might be their biggest limitation. The question is not so much how much data a cellphone can handle, but rather how much battery life must be sacrificed to gain broadband performance. The only solution for this is likely some new battery technology that is not yet on the horizon.

I don’t believe that the average cellphone user values cellular data speeds in the same way that they value fast landline data speeds. 4G today is easily capable of streaming video, and there’s no reason on a cellphone to stream more than one video at the same time. 4G is reasonably okay today at operating most cellular apps. The one group of cellphone users that always wants more bandwidth is gamers – but there is no way that cellphones are ever going to match the capabilities of gaming systems or gaming computers using landline broadband connections.

I scratch my head every time I hear 5G claims about providing gigabit cellular service. I don’t want to sound like an old-timer who sees no need for greater speeds. But I think we need to be realistic and ask if superfast cellular bandwidth is really needed today – after all, there are still no landline applications for homes that require anything near a gigabit of bandwidth. The primary reason homes need faster download speeds is to handle multiple big-bandwidth applications at the same time, something that is not a requirement for cellphones today.

The idea of gigabit cellular is mostly coming from the imagination of the cellular company marketers. The 5G standard calls for eventual ubiquitous 100 Mbps cellular speeds. Even achieving that much speed is going to require tying together multiple mid-range bands of spectrum. I’m having a hard time seeing the additional revenue streams that will pay for the massive upgrades needed to reach the 100 Mbps goal. The cellular companies all know this but aren’t talking about it because that would dilute the message that 5G will transform the world.

Will Congress Be Forced to Re-regulate Broadband?

Last year the current FCC largely deregulated broadband. They killed Title II regulation and handed off any remaining vestiges of broadband regulation to the Federal Trade Commission. The FCC is still left with a few tasks associated with broadband. For instance, they still have to track broadband adoption rates. They are still required to try to solve the rural digital divide. They still approve electronics used to provide broadband. But this FCC has killed its own authority to make ISPs change their behavior.

I wrote a blog a month ago talking about the regulatory pendulum. Industries that become dominated by monopolies are always eventually regulated in some manner – governments either prescribe operating rules or else break up monopolies using antitrust laws. One only has to look at the conversation going on in Washington (and around the world) about somehow regulating Facebook, Google, and other big web platforms to see that this is inevitable. Big monopolies always grow to trample consumers, and eventually the public demands that monopoly abuses be curbed.

It’s only been a little over a year since the FCC deregulated broadband, and there are already topics looming that beg for regulation. There is nothing to stop this FCC or a future FCC from reintroducing regulation – the courts already gave approval for regulating broadband using Title II. Regulation can also come from Congress – the preferred path to stop the wild swings every time there’s a new administration. Even the ISPs would rather be regulated by Congress than bounce back and forth between FCCs with differing philosophies.

Over half of the states have introduced bills that seek to regulate data privacy. Consumers are tired of data breaches and tired of having their personal information secretly peddled to the highest bidder. A year ago the California legislature passed data rules that largely mimic what’s being done in Europe. The Maine legislature just passed rules that are even more stringent than California in some ways.

It’s going to be incredibly expensive and complicated for web companies to try to comply with rules that differ by state. Web companies are in favor of one set of federal privacy rules – the big companies are already complying with European Union rules and they’ve accepted that providing some privacy to consumers is the cost of doing business. Privacy rules need to apply to ISPs as much as they do to the big web companies. Large ISPs are busy gathering and selling customer data in the same manner as web companies. Cellular companies are gathering and selling huge amounts of customer data.

There are other regulatory issues that are also looming. It seems obvious that if the administration and the Senate turn Democratic, one of their priorities will be to reimplement net neutrality. The ISPs are already starting to quietly violate net neutrality rules. They are first tackling things that customers like, such as sponsored video as part of a cellular plan – but over time you can expect the worst kinds of abuses that were the reasons behind the net neutrality rules.

I think that broadband prices are going to become a major issue. The big ISPs have all acknowledged that one of the few tools they have to maintain earnings growth is to raise broadband prices. Cord cutting is accelerating, and in the first quarter the big providers lost cable customers at a rate of 6% annually. Cord cutting looks like it’s going to go much faster than the industry anticipated, with over a million customers bailing on traditional cable each quarter. The pressure to raise broadband rates is growing.

We’ve already seen the start of broadband price increases. Over the last few years the ISPs have been raising rates around the edges, such as increasing the monthly price for a broadband modem. More recently we’ve seen direct broadband price increases, such as the $5 rate increase for bundled broadband by Charter. We’re seeing Comcast and other ISPs start billing people for crossing data caps. Most recently we know that several ISPs are talking about significantly curtailing special rates and discounts for customers – eliminating those discounts probably equates to a 10% – 15% rate increase.

At some point, the FCC will have to deal with rising broadband rates. Higher broadband rates will widen the digital divide as households get priced out of broadband. The public will put a lot of pressure on politicians to do something about ISP prices.

Deregulating broadband at a time when a handful of ISPs have the vast majority of broadband customers was one of the most bizarre regulatory decisions I’ve ever seen. All monopolies, regardless of industry, need to be regulated – we’ve known this for over a hundred years. It’s just a matter of time before Congress is forced to step up and re-regulate broadband. It may not be tomorrow, but I find it highly unlikely that broadband will still be deregulated a decade from now, and I expect it much sooner.

Cord Cutting Picking Up Pace

Leichtman Research Group has published the cable TV customer counts for the first quarter of 2019, and it’s apparent that the rate of cord cutting is accelerating. The large companies in the table below represent roughly 95% of the traditional cable market.

                    1Q 2019                               2018
                    Customers      Change      % Change  Losses
DirecTV / AT&T      22,383,000     (543,000)   -2.4%     (1,189,000)
Comcast             21,866,000     (120,000)   -0.5%     (371,000)
Charter             16,431,000     (145,000)   -0.9%     (244,000)
Dish TV              9,639,000     (266,000)   -2.7%     (1,125,000)
Verizon              4,398,000      (53,000)   -1.2%     (168,000)
Cox                  3,980,000      (35,000)   -0.9%     (115,000)
Altice               3,297,300      (10,200)   -0.3%     (98,000)
Frontier               784,000      (54,000)   -6.4%     (123,000)
Mediacom               764,000      (12,000)   -1.5%     (45,000)
Cable One              320,611      (11,500)   -3.5%     (37,465)
Total               83,862,911   (1,249,700)   -1.5%     (3,515,465)

A few things strike me about this table. First, the annual rate of loss is now 6%. That’s faster than we ever saw for telephone landlines, which lost 5% annually at the peak of the market losses. We are only into the third real year of cord cutting, and already the rate of customer loss has leaped to 6% annually.

The other striking number is that the overall traditional cable penetration rate has now dropped to 70%. According to the Census, there are 127.59 million households, and adding in the customers of smaller providers shows a 70% market penetration. That’s still a lot of homes with traditional cable TV, but obviously the conversation about cutting the cord is happening in huge numbers of homes.
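For readers who want to check the math, here’s a quick back-of-the-envelope sketch in Python. The inputs come straight from the table and the Census figure above; the choice to compound the quarterly rate is mine (simple multiplication by four gives essentially the same answer):

```python
# Back-of-the-envelope math from the table and Census figure above.

big_company_subs = 83_862_911    # 1Q 2019 subscribers for the companies listed
quarterly_loss = 1_249_700       # net subscribers lost in 1Q 2019
us_households = 127_590_000      # Census household count cited above

# Annualize the 1.5% quarterly loss rate by compounding four quarters.
q_rate = quarterly_loss / big_company_subs
annual_rate = 1 - (1 - q_rate) ** 4
print(f"Annualized loss rate: {annual_rate:.1%}")   # -> 5.8%, roughly 6%

# The listed companies are ~95% of the traditional cable market, so gross
# up to estimate the whole industry, then compute household penetration.
industry_subs = big_company_subs / 0.95
print(f"Cable penetration: {industry_subs / us_households:.0%}")  # -> 69%, roughly 70%
```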

Another interesting observation is that AT&T is now at the top of the list. They’ve stopped reporting customers separately for DirecTV and for AT&T U-verse, which combined make them the largest cable provider in the country. However, at the rate the company is bleeding traditional cable customers, Comcast is likely to be number one again by the end of this year. AT&T has been encouraging customers to shift to DirecTV Now, delivered only online. However, that service also lost 83,000 customers in the first quarter, so the overall AT&T losses are staggering – an annual rate of loss of over 8%.

The big losers in total customers are still the satellite companies. As those companies have gotten more realistic about pricing they’ve seen customers flee. There have been numerous articles in publications like Forbes wondering if Dish Networks is even a viable company after these kinds of losses. There is also recent speculation that AT&T might spin off DirecTV and perhaps even merge it with Dish Networks.

The biggest percentage loser is Frontier, losing 6.4% of their customers in just the first quarter. It’s been obvious that the wheels are coming off of Frontier and the company just sold off properties in western states last month in order to raise cash.

For the last few years, Comcast and Charter were still holding onto their overall cable customers, buoyed mostly by new cable subscriptions that came with big increases in broadband customers – these two companies have added the bulk of new nationwide broadband customers over the last two years. But even with continued broadband growth, these companies are now seeing cable counts drop, and the rate of cord cutting among their long-standing customers is probably as high as in the rest of the industry.

It’s still hard to predict the trajectory of cable TV. In just two years the industry as a whole has gone from minor customer losses to losing customers at a rate of 6% per year. I don’t see any analysts predicting where this will bottom out – will it level off or will losses continue to accelerate? In any event, any industry losing 6% of customers annually is in trouble. It’s not going to take many years of losses at this rate for the industry to become irrelevant.

Is the FCC Really Solving the Digital Divide?

The FCC recently released the 2019 Broadband Deployment Report, with the subtitle: Digital Divide Narrowing Substantially. Chairman Pai is highlighting several facts that he says demonstrate that more households now have access to fast broadband. The report highlights rural fiber projects and other efforts that are closing the digital divide. The FCC concludes that broadband is being deployed on a reasonable and timely basis – a determination they are required to make every year by Congressional mandate. If the FCC ever concludes that broadband is not being deployed fast enough, they are required by law to rectify the situation.

To give the FCC some credit, there is a substantial amount of rural fiber being constructed – mostly from the ACAM funds being provided to small telephone companies, with some other fiber being deployed via rural broadband grants. Just to provide an example, two years ago Otter Tail County, Minnesota had no fiber-to-the-premises. Since then, the northern half of the county is seeing fiber deployed by several telephone companies. This kind of fiber expansion is great news for rural counties, but counties like Otter Tail are now wondering how to upgrade the rest of the county.

Unfortunately, this FCC has zero credibility on the issue. The 2018 Broadband Deployment Report reached the same conclusion, but it turns out that there was a huge reporting error in the data supporting that report – the ISP Barrier Free had erroneously reported that it had deployed fiber to 62 million residents in New York. Even after the FCC recently corrected for that huge error, they kept the original conclusion. This raises a question about what defines ‘reasonable and timely deployment of broadband’ if having fiber to 62 million fewer people doesn’t change the answer.

Anybody who works with rural broadband knows that the FCC databases are full of holes. The FCC statistics come from the data that ISPs report to the FCC each year about their broadband deployment. In many cases, ISPs exaggerate broadband speeds and report marketing speeds instead of actual speeds. The reporting system also contains a huge logical flaw in that if a census block has only one customer with fast broadband, the whole census block is assumed to have that speed.

I work with numerous rural counties where broadband is still largely non-existent outside of the county seat, and yet the FCC maps routinely show swaths of broadband availability in many rural counties where it doesn’t exist.

Researchers at Penn State recently looked at broadband coverage across rural Pennsylvania and found that the FCC maps grossly overstate the availability of broadband for huge parts of the state. Anybody who has followed the history of broadband in Pennsylvania already understands this. Years ago, Verizon reneged on a deal to introduce DSL everywhere – a promise made in exchange for becoming deregulated. Verizon ended up ignoring most of the rural parts of the state.

Microsoft has blown an even bigger hole in the FCC claims. Microsoft is in an interesting position in that customers in every corner of the country ask for online upgrades for Windows and Microsoft Office, so Microsoft is able to measure the actual download speed for tens of millions of upgrades every quarter. Microsoft reports that almost half of all downloads of their software are done at speeds slower than the FCC’s definition of broadband of 25/3 Mbps. Measuring a big download is the ultimate test of broadband speeds, since ISPs often boost download speeds for the first minute or two to give the impression they have fast broadband (and to fool speed tests). Longer downloads show the real speeds. Admittedly, some of Microsoft’s findings are due to households that subscribe to slower broadband to save money, but the Microsoft data still shows that a huge number of ISP connections underperform. The Microsoft figures are also understated because they don’t include the many millions of households that can’t download software at all since they have no access to home broadband.
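As a rough illustration of why a long download exposes burst-boosting while a short speed test doesn’t, here’s a minimal Python sketch that streams a large file and reports throughput over successive intervals. The URL is a placeholder to replace with any large test file – this is my illustration of the technique, not Microsoft’s actual methodology:

```python
import time
import requests

# Placeholder URL - substitute any large test file.
URL = "https://example.com/large-test-file.bin"

def measure_throughput(url, interval_secs=30, max_secs=300):
    """Stream a download and print Mbps for each successive interval.

    A burst-boosted connection shows a fast first interval and slower
    later intervals; the later intervals reflect the sustained speed.
    """
    start = time.monotonic()
    window_start, window_bytes = start, 0
    with requests.get(url, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        for chunk in resp.iter_content(chunk_size=64 * 1024):
            window_bytes += len(chunk)
            now = time.monotonic()
            if now - window_start >= interval_secs:
                mbps = window_bytes * 8 / (now - window_start) / 1_000_000
                print(f"{now - start:6.0f}s: {mbps:6.1f} Mbps")
                window_start, window_bytes = now, 0
            if now - start > max_secs:
                break

measure_throughput(URL)
```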

The FCC is voting this week to undertake a new mapping program to better define real broadband speeds. I’m guessing that effort will take at least a few years, giving the FCC more time to hide behind bad data. Even with a new mapping process, the data is still going to have many problems if it’s self-reported by the ISPs. I’m sure any new mapping effort will be an improvement, but I don’t hold out any hopes that the FCC will interpret better data to mean that broadband deployment is lagging.

Selling Transport to Small Cell Sites

A lot of my clients make money by selling transport to the big traditional cell sites. For all but the few of them that operate middle-mile networks, cell site transport adds a relatively high-margin product to the last-mile network.

Companies are now wondering how hard they should pursue small cell sites. They keep reading about the real-estate grab in the major cities where a number of companies are competing to build small cell enclosures, hoping to attract multiple carriers. They want to understand the size of the potential market for small cells outside of the major metropolitan areas. It’s not an easy question to answer.

The cellular carriers are building small cell sites in larger markets because they have exhausted the capabilities of the traditional large cell sites. The cellular companies have pushed bigger data plans and convinced customers that it’s okay to watch video on cellphones, and now they find that they are running out of bandwidth capacity. The only two immediate fixes are to build additional cell sites (thus, the small cells) or else add more spectrum. They will eventually layer on full 5G capability, which will stretch spectrum a lot farther.

There are varying estimates for the eventual market for small cell sites. For example, the CTIA, the lobbying group for the wireless industry, estimates that small cells will grow from 86,000 in 2018 to 800,000 by 2026. The Wall Street analyst firm Cowen estimates 275,000 small cells by the end of 2023.

The big companies that are in the cellular backhaul business are asking the same questions as my clients. Crown Castle is embracing the small cell opportunity and sees it as a big area of future growth. Its competitor American Tower is more cautious and only chases small cell opportunities that have high margins. They caution that the profit opportunity for small cells is a lot less than at big towers. Other companies like Zayo and CenturyLink are pursuing small cells where it makes sense, but neither has yet made this a major part of their growth strategy – they are instead hoping to monetize the opportunity by adding small cells where they already own fiber.

The question that most of my clients want to understand is if the small cell building craze that has hit major metropolitan areas will ever make it out to smaller cities. In general, the big cellular carriers report that the amount of data used on their cell sites is doubling every two years. That’s a huge growth rate that can’t be sustained for very long on any network. But it’s likely that this rate of growth is not the same everywhere, and there are likely many smaller markets where cell sites are still underutilized.

Metropolitan cell sites were already using a lot of broadband even before customers started using more data. We know this because the cellular carriers have been buying and using robust data backhaul to urban sites of a gigabit or more in capacity. One good way to judge the potential for small cell sites is to look at the broadband used on existing tall tower sites. If a big tower site is using only a few hundred Mbps of bandwidth, then the cell site is not overloaded and still has room to accommodate broadband growth.

Everybody also wants to understand the revenue potential. The analyst firm Cowen estimates that the revenue opportunity per small cell site will average between $500 and $1,000 per site per month. That seems like a high price anywhere outside of the metropolitan areas, where expensive fiber construction can justify such rates. I’ve already seen the big cellular carriers pushing for much lower transport rates for the big cell sites, and in smaller markets the carriers want to pay less than $1,000 per big tower. It probably takes 5 – 7 small cells to fully replace a large tower, and it’s hard to envision the cellular carriers greatly expanding their backhaul bill unless they have no option.

It’s also becoming clear that both Verizon and AT&T have a strategy of building their own fiber anyplace where the backhaul costs are too high. We’ve already seen each carrier build some fiber in smaller markets in the last few years to avoid high transport costs. If both companies continue to be willing to overbuild to avoid transport costs, they have great leverage for negotiating lower transport rates.

As usual, I put pen to paper. If the CTIA is right and there will be 800,000 small cell sites within six years, that would mean almost $5 billion annually in new backhaul costs for the cellular companies, even at only $500 per site per month. While this is a profitable industry, the carriers are not going to absorb that kind of cost increase unless they have no option. If the 800,000 figure is a good estimate, I predict that within that same six-year period the cellular carriers will build fiber to a significant percentage of the new sites.
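Here’s that pen-to-paper math, using the CTIA forecast and the per-site transport pricing cited above:

```python
# Backhaul cost implied by the CTIA small cell forecast.
small_cells_2026 = 800_000           # CTIA estimate cited above
rate_low, rate_high = 500, 1_000     # $ per site per month, from the analyst estimate

annual_low = small_cells_2026 * rate_low * 12
annual_high = small_cells_2026 * rate_high * 12
print(f"${annual_low / 1e9:.1f}B - ${annual_high / 1e9:.1f}B per year")
# -> $4.8B - $9.6B per year in new transport spending
```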

Perhaps the most important factor about the small cell business is that it’s local. I have one client in a town of 7,000 that recently saw several small cell sites added. I have clients in much larger communities where the carriers are not currently looking at small cell sites.

The bottom line for me is that anybody who owns fiber probably ought to provide backhaul for small cells on existing fiber routes. I’d be a lot more cautious about building new fiber for small cell sites. If that new fiber doesn’t drive other good revenue opportunities, then it’s probably a much riskier investment than building fiber for the big tower cell sites. It’s also worth understanding the kind of small cell site being constructed. Many small cell sites will continue to be used strictly for cellular service, while others might also support 5G local loops. Every last-mile fiber provider should be leery of providing access to a broadband competitor.

How’s CAF II Doing in Your County?

The CAF II program was tasked with bringing broadband of at least 10/1 Mbps to large parts of the country. I’ve been talking to folks in rural counties all over the country who don’t think that their area has seen much improvement from the CAF II plan.

The good news is that there is a way to monitor what the big telcos are reporting to the FCC in terms of areas that have seen the CAF II upgrades. This web site provides a map that reports progress on several different FCC broadband plans. The map covers reported progress for the following programs:

  • CAF II – This was the $11 billion subsidy to big telcos to improve rural broadband to at least 10/1 Mbps.
  • CAF II BLS – This was Broadband Loop Support that was made available to small telcos. I’m not entirely sure why the FCC is tracking this on a map.
  • ACAM – This is a subsidy given to smaller telcos to improve broadband to at least 25/3 Mbps, but which many are using to build gigabit fiber.
  • The Alaska Plan – This is the Alaska version of ACAM. Alaska is an extremely high-cost place to serve and has a separate broadband subsidy plan.
  • RBE – These are the Experimental Broadband Grants from 2015.

Participants in each of these programs must report GIS data for locations that have been upgraded, and those upgraded sites are then shown on the map at this site. There is, of course, some delay between completing upgrades and getting the information onto the map. We are now 4.5 years into the six-year CAF II plan, and the carriers have told the FCC that many of the required upgrades are completed. All CAF II upgrades must be finished by the end of 2020 – and most will likely be completed sometime next year during the summer construction season that dictates the pace of building in much of the country.

The map is easy to use. For example, if you change the ‘Fund’ box at the upper right of the map to CAF II, then all of the areas that were supposed to get CAF II upgrades are shown in light purple. In these areas, the big telcos were supposed to upgrade every residence and business to be able to receive 10/1 Mbps or better broadband.

The map allows you to drill down into more specific detail. For example, if you want to see how CenturyLink performed on CAF II, then choose CenturyLink in the ‘Company Name’ box. This will place a pin on the map for all of the locations that CenturyLink has reported as complete. As you zoom in on the map the upgraded locations will show as dark purple dots. You can zoom in on the map to the point of seeing many local road names.

The map has an additional feature that many will want to use. At the bottom left of the map, under ‘Boundaries’, you can overlay political boundaries like county borders.

Most counties are really interested in the information shown on the map. The map shows the areas that were supposed to see upgrades along with the areas that have been upgraded to date. This information is vital to counties for a number of reasons. For example, new federal grants and most state grant programs rely on this data to determine if an area is eligible for additional funding – the current $600 million ReConnect grants, for instance, can’t be used for areas where more than 10% of homes already have 10/1 Mbps broadband. Any areas on this map that have the purple dots will probably have a hard time qualifying for these grants. The big telcos will likely try to disqualify any grant requests that propose to build where the telcos say they have upgraded.

Probably the most important use of the map is as a starting point for counties to gather accurate data about broadband. For example, you might want to talk to folks who live in the upgraded areas to see if they can really buy 10/1 Mbps DSL now. My guess is that many of the areas shown on these maps as having CAF II upgrades still have speeds of less than 10/1 Mbps. If you find that to be the case, I recommend documenting your findings, because areas that didn’t get a full upgrade should be eligible for future grant funding.

It’s common knowledge that rural copper has been ignored for decades, often with no routine maintenance. It’s not surprising to anybody who has worked in a DSL environment that many rural lines are incapable of carrying faster DSL. It’s not easy for a big telco to bring 10/1 Mbps broadband over bad copper lines, but unfortunately, it’s easy for them to tell the FCC that the upgrades have been done, even if the speed is not really there.

This map is just one more piece of the puzzle and one more tool for rural counties to use to understand their current broadband situation. It’s definitely a plus if the big telcos really upgraded DSL in these areas to at least 10/1 Mbps – many of these areas had no DSL or incredibly slow DSL before. On the flip side, if the big telcos are exaggerating about these upgrades and the speeds aren’t there, they are likely going to block your region from getting future grant money to upgrade to real broadband. The big telcos have every incentive to lie to protect their DSL and telephone revenues in these remote areas. What’s not tolerable is for the big telcos to use incorrect mapping data to keep homes from getting better broadband.

Big ISP Customer Service Still at the Bottom

This time each year we get a peek at how customers view the telecom industry, and for many years running it’s not been a pretty story. The annual American Customer Satisfaction Index (ACSI) was recently published and shows ISPs still ranked at the bottom of all industries in terms of customer satisfaction.

The survey behind the ACSI rankings is huge – it involves over 300,000 households and looks at the services that households use the most, covering 400 companies in 46 different industries across 10 economic sectors.

Customers really hate the big cable TV companies and big ISPs. The ACSI index ranks companies on a scale of 1 to 100 and the two lowest ranking industries are Subscription TV Services (62) and Internet Service Providers (62) – both with the same composite ranking as last year. All other industries have rankings in the 70s and 80s, with industries like breweries (85), TV manufacturers (83), soft drinks (82), food companies (82), and automobiles (82) at the top.

The companies ranked just above ISPs have much higher rankings and include the US Postal Service (70), Fixed-Line Telephone Service (71), and Social Media Companies (72).

The big TV providers range from a low of Altice (55) to a high for AT&T U-verse (69). The only other companies that rank higher than the industry average of 62 are Verizon FiOS (68), Dish Networks (67), and DirecTV (66). The biggest cable companies fare poorly – Charter (59) and Comcast (57).

Internet Service Providers didn’t fare any better than cable companies, with the overall industry rating at the same 62. The only three ISPs with rankings above the average are Verizon FiOS (70), AT&T Internet (69), and Altice (63). At the bottom of the rankings are Frontier (55), Mediacom (56), and Windstream (57). The big cable companies don’t fare well as ISPs either – Charter (59) and Comcast (61).

This continues to be good news for competitive overbuilders that provide good customer service. It’s been obvious over the years that customers hate calling the big cable companies and ISPs because the process of navigating through live customer service is time-consuming and painful.

But these rankings go far deeper than that. At CCG we conduct surveys for our clients, who are usually looking at entering a new market, and we also interview a lot of telecom customers during the course of a year. The poor opinion of the big providers in our industry runs deep. Customers really dislike the process many of these companies force on them of having to negotiate every year or two to get lower rates. People don’t like finding out that they are paying a lot more than their neighbors for the same services. People also dislike service outages, which happen far more often than they should. In the last year we had several headline-grabbing major outages, but more aggravating to customers are the small daily outages that can hit without notice. Households have come to rely on broadband as much as they rely on other necessities like electricity and water, so outages are becoming intolerable.

Competitive ISPs are not automatically better at customer service than the big companies. Some competitive providers also offer too many product options and are willing to negotiate rates with customers. Small ISPs can also fall into the trap of turning every phone call to the company into a sales pitch. Good ISPs are learning to deal with each customer in the way that customer prefers. I know I personally would be thrilled to have my entire ISP relationship handled by email or text, as long as by doing so I could be assured that I’m getting a good price. Most ISPs still have a long way to go – although I doubt that any ISP is ever going to be liked more than beer!

Talk to Me – Voice Computing

Technologists predict that one of the most consequential changes in our daily lives will soon come from being able to converse with computers. We are starting to see the early stages of this today as many of us now have personal assistants in our homes such as Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana or Google’s Personal Assistant. In the foreseeable future, we’ll be able to talk to computers in the same way we talk to each other, and that will usher in perhaps the most important change ever of the way that humans interact with technology.

In the book Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think the author James Vlahos looks at the history of voice computing and also predicts how voice computing will change our lives in the future. This is a well-written book that explains the underlying technologies in an understandable way. I found this to be a great introduction to the technology behind computer speech, an area I knew little about.

One of the first things made clear in the book is the difficulty of the technical challenge of conversing with computers. There are four distinct technologies involved in conversing with a computer. First is Automatic Speech Recognition (ASR), where human speech is converted into digitized ‘words’. Natural Language Understanding (NLU) is the process used by a computer to interpret the meaning of the digitized words. Natural Language Generation (NLG) is how the computer formulates the way it will respond to a human request. Finally, Speech Synthesis is how the computer converts its answer into audible words.
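To make the division of labor concrete, here is a minimal sketch of how those four stages chain together in a voice assistant pipeline. The function bodies are toy stand-ins of my own invention – not any real product’s API – but the flow of data between the stages is the point:

```python
def automatic_speech_recognition(audio: bytes) -> str:
    # Toy stand-in: a real ASR model would transcribe the audio.
    return "what is the weather"

def natural_language_understanding(transcript: str) -> dict:
    # Toy stand-in: a real NLU model would classify intent and extract entities.
    return {"intent": "get_weather", "entities": {}}

def natural_language_generation(meaning: dict) -> str:
    # Toy stand-in: a real NLG model would compose a fluent response.
    return "Today will be sunny with a high of 75."

def speech_synthesis(text: str) -> bytes:
    # Toy stand-in: a real TTS engine would render audio of the text.
    return text.encode("utf-8")

def handle_utterance(audio: bytes) -> bytes:
    """One round trip through the four-stage voice computing pipeline."""
    transcript = automatic_speech_recognition(audio)      # ASR
    meaning = natural_language_understanding(transcript)  # NLU
    response = natural_language_generation(meaning)       # NLG
    return speech_synthesis(response)                     # speech synthesis

print(handle_utterance(b"<audio bytes>"))
```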

There is much progress being made in each of these areas. For example, ASR developers are training computers on how humans talk using machine learning and huge libraries of actual human speech and human interactions from social media sites. They are seeing progress as computers learn the many nuances of the ways that humans communicate. In our science fiction we’ve usually portrayed future computers that talk woodenly, like HAL from 2001: A Space Odyssey. It looks like our future instead will be personal assistants that speak to each of us using our own slang, idioms, and speaking style, and in realistic-sounding voices of our choosing. The goal for the industry is to make computer speech indistinguishable from human speech.

The book also includes some interesting history of the various voice assistants. One of the most interesting anecdotes is about how Apple blew its early lead in computer speech. Steve Jobs was deeply interested in the development of Siri and told the development team that Apple was going to give the product a high priority. However, Jobs died on the day that Siri was announced to the public and Apple management put the product on the back burner for a long time.

The book dives into some technologies related to computer speech and does so in an understandable way. For instance, the book looks at the current status of Artificial Intelligence and at how computers ‘learn’ and how that research might lead to better voice recognition and synthesis. The book looks at the fascinating attempts to create computer neural networks that mimic the functioning of the human brain.

Probably the most interesting part of the book is the last few chapters that talk about the likely impact of computer speech. When we can converse with computers as if they are people, we’ll no longer need a keyboard or mouse to interface with a computer. At that point, the computer is likely to disappear from our lives and computing will be everywhere in the background. The computing power needed to enable computer speech is going to have to be in the cloud, meaning that we just speak when we want to interface with the cloud.

Changing to a voice interface with the cloud also drastically changes our interface with the web. Today most of us use Google or some other search engine when searching for information. While most of us select one of the choices offered on the first or second page of the search results, in the future the company that is providing our voice interface will be making that choice for us. That puts a huge amount of power into the hands of the company providing the voice interface – they essentially get to choreograph our entire web experience. Today the leading candidates to be that voice interface are Google and Amazon, but somebody else might grab the lead. There are ethical issues associated with a choreographed web – the company doing our voice searches is deciding the ‘right’ answer to the questions we ask. It will be incredibly challenging for any company to do this without bias, and more likely they will do it to drive profits. Picture Amazon driving all buying decisions to its own platform.

The transition to voice computing also drastically changes the business plans of a lot of existing technology companies. Makers of PCs and laptops are likely to lose most of their market. Search engines become obsolete. Social media will change drastically. Web advertising will take a huge hit when we no longer see ads – it’s hard to imagine users tolerating many spoken ads as part of the voice interface experience.

The book makes it clear that this is not science fiction but is a technology that will be maturing during the next decade. I recently saw a video of teens trying to figure out how to use a rotary dial phone, but it might not be long before kids will grow up without ever having seen a mouse or a QWERTY keyboard. I will admit that a transition to voice is intimidating, and they might have to pry my keyboard from my cold, dead hands.

Should Rural Fiber be a Utility?

I’ve heard or read half a dozen people in the last month say that the way we get rural fiber everywhere is to make fiber a regulated utility. This idea certainly has appeal for the many rural places that don’t have fiber today. On the surface this sounds like a way to possibly get fiber everywhere, and it’s hard to see a downside to that.

However, I can think of a number of hurdles and roadblocks to this concept that might be hard to overcome. This blog is too short to properly explore most of these ideas, and it would take a 40-page whitepaper to do the topic justice. With that caveat, here are some of the big issues to be solved if we wanted to create rural fiber utilities.

What About Existing Fiber? What would we do about all of those who have already built rural fiber? There are small telcos, cooperatives, and rural communities that have already acted and found a way to fund a rural fiber network. Would we force someone who has already taken the commercial risk to convert those existing fiber properties into a utility? Most small companies that have built rural fiber took on a huge debt burden to do so. Rural communities that have built fiber likely put tax revenues on the line to do so. It seems unfair to force those who had the vision to tackle this already to transform into a regulated utility.

What About Choice? One of the most important goals of almost every community I have worked with is to have broadband choice. One of the key aspects of a fiber utility is that it will almost certainly be a monopoly. Are we going to kick out WISPs in favor of a fiber utility? Would a fiber monopoly be able to block satellite broadband?

The Definition of Rural. What areas are eligible to be part of a regulated fiber utility? If eligibility is defined by customer density, then we could end up with farms with fiber and county seats without it. There’s also the more global consideration that most urban areas don’t have fiber today. Do we ask cities that don’t have fiber to help subsidize rural broadband? It’s impractical to think that you could force city networks to become a utility, because that would financially confiscate networks from the big cable companies.

Who Pays for It? Building fiber in rural America would probably require low-interest loans from the government for the initial construction – we did this before when we built rural electric grids, so this can be made to work. But what about keeping fiber utilities solvent for the long run? The rural telephone network functioned so well because revenues from urban customers were used to subsidize service in rural places. When the big telcos were deregulated the first thing they did was to stop the internal subsidies. Who would pay to keep fiber networks running in rural America? Would urban ISPs have to help pay for rural broadband? Alternatively, might this require a tax on urban broadband customers to subsidize rural broadband customers?

Who Operates It? This might be the stickiest question of all. Do we hand utility authority to local governments, even those that are reluctant to take on the responsibility? Would people favor a fiber utility if the government handed over the operations to AT&T, Verizon, CenturyLink, or Frontier? What do we do about cooperatives where the customers want to own their fiber network? Do we force existing fiber owners to somehow sell or give their networks to a new utility?

What About Carrier of Last Resort? One of the premises of being a utility is the idea that everybody in the monopoly service area can get service. Would we force fiber utilities to serve everybody? What about a customer who is so remote that it takes hundreds of thousands of dollars of construction to reach them? Who gets to decide who gets service? Does a fiber utility have to build to reach every new home?

What About Innovation? Technology never sits still. How do we force fiber utilities to upgrade over time to stay current and relevant? Upgrading infrastructure is an expensive problem for existing utilities – as I found out recently when a water problem uncovered the fact that my local water utility still has some of the original main feeder pipes, built out of wood. The common wisdom is that fiber will last a long time – but who pays to eventually replace it, as we are now doing with the wooden water pipes? And what about the electronics upgrades that happen far more often?

Government’s Role. None of this can be done without strong rules set and enforced by the government. For example, the long-term funding mechanisms can only be created by the government, and this almost certainly would require a new telecom act from Congress. Considering how lobbyists can sideline almost any legislative effort, is it even possible to create a fiber utility that would work? Fiber utilities would also require a strong FCC that agrees to take back broadband authority and to strongly regulate and enforce it.

Summary. I’ve only described a partial list of the hurdles faced in creating rural fiber utilities. There is no issue on this list that can’t be solved – but collectively they create huge hurdles. My biggest fear is that politics and lobbying would intervene, and we’d do it poorly. I suspect that similar hurdles faced those who created the rural electric and telephone companies – and they found a way to get it done. But done poorly, fiber utilities could be a disaster.

Summary Conclusions for Designing an FCC Broadband Grant

The earlier series of blogs looked at a number of ideas on how the FCC could create the most effective federal grant program for the upcoming $20.4 billion of announced grants. Following is a summary of the most important conclusions of those blogs:

Have a Clearly Defined Goal. If a federal grant program’s goal is something soft, like ‘improve rural broadband’, then the program is doomed to failure and will fund solutions that only incrementally improve broadband. The grant program should have a bold goal, such as bringing a permanent broadband solution to a significant number of households. For example, done well, this grant could bring fiber to 4 – 5 million homes rather than make incremental broadband improvements everywhere (see the quick math after this list).

Match the Grant Process with the Grant Goals. Past federal grants have often had grant application rules that didn’t match the goals. Since the results of grants are governed by the application rules, those are all that matter. Stated goals for a grant are just rhetoric if those goals are not realized in the grant application requirements. As an example, if a grant goal is to favor the fastest broadband possible, then all grant application rules should be weighted towards that goal.

Match Speed Requirement with the Grant Construction Period. The discussion for the proposed $20.4 billion grant contemplates a minimum speed goal of 25/3 Mbps. That’s a DSL speed that is already becoming obsolete today, and a goal of 25/3 Mbps will be badly outdated by the time any grant-funded networks are built. The FCC should not repeat its worst decision ever, which gave out $11 billion of CAF II funding to build 10/1 Mbps networks – a speed that was obsolete even before the grants were awarded. The FCC should be requiring future-looking speeds.

Make the Grants Available to Everybody. FCC grant and loan programs often include a statement that they are available to every kind of entity. Yet the actual award process often discriminates against some kinds of applicants. For example, grants that include a loan component make it generally impossible for most municipal entities to accept the awards. Loan rules can also eliminate non-RUS borrowers. Grant rules that require recipients to become Eligible Telecommunications Carriers – a regulatory designation – discriminate against open access networks where the network owner and the ISP are separate entities. If not written carefully, grant rules can discriminate against broadband partnerships where the network owner is a different entity than the operating ISP.

Reverse Auction is not a Good Fit. Reverse auctions are a good technique when taking bids for some specific asset, but they won’t work well when the awarded area is the whole US. Since reverse auctions favor those who will take the lowest amount of funding, a reverse auction will, by definition, favor lower-cost technologies (the toy model after this list illustrates the mechanic). A reverse auction will also favor parts of the country with lower costs and will discriminate against the high-cost places that need broadband help the most, like Appalachia. A reverse auction also favors upgrades over new construction and would favor upgrading DSL over building faster new technologies. From a political perspective, a reverse auction won’t spread the awards geographically and could favor one region, one technology, or even only a few grant applicants. And once the auction is started, the FCC would have zero input over who wins the funds – something that would not sit well with Congress.

Technology Matters. The grants should not be awarded to technologies that are temporary broadband band-aids. For example, if the grants are used to upgrade rural DSL or to provide fixed cellular broadband, then the areas receiving the grants will be back at the FCC in the future asking for something better. It’s hard to justify any reason for giving grants to satellite providers.

States Need to Step Up. The magnitude of the proposed federal grant program provides a huge opportunity for states. Those states that increase state grant funding should attract more federal grants to their state. State grants can also influence the federal awards by favoring faster speeds or faster technologies.
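Two quick sketches to back up the numbers above. First, a sanity check on the 4 – 5 million homes figure from the first conclusion; the per-passing cost range is my own assumption for rural fiber construction, not a number from the FCC program:

```python
# Rough check of how many homes $20.4B could reach with fiber.
# The per-passing cost range is an assumed figure for rural FTTH builds.
grant_total = 20_400_000_000
cost_per_home_low, cost_per_home_high = 4_000, 5_000

homes_high = grant_total / cost_per_home_low
homes_low = grant_total / cost_per_home_high
print(f"{homes_low / 1e6:.1f}M - {homes_high / 1e6:.1f}M homes")
# -> roughly 4.1M - 5.1M homes, consistent with the 4-5 million estimate
```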
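Second, a toy model of the reverse auction mechanic described above. The bid amounts are invented for illustration, but they show why the lowest funding request – and therefore usually the cheapest technology and the lowest-cost region – always wins:

```python
# Toy model: in a reverse auction the lowest subsidy request wins.
# Bids are (applicant, technology, requested subsidy per location) - invented numbers.
bids = [
    ("Fiber overbuilder, Appalachia", "FTTH",      4_500),
    ("Fiber overbuilder, Midwest",    "FTTH",      2_800),
    ("Telco DSL upgrade",             "DSL",       1_200),
    ("Fixed wireless ISP",            "Fixed LTE",   900),
]

winner = min(bids, key=lambda bid: bid[2])
print(f"Winner: {winner[0]} ({winner[1]}) at ${winner[2]:,} per location")
# -> the cheapest (often slowest) technology wins, regardless of speed or region
```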

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.