Portugal and Net Neutrality

Last week I talked about FCC Chairman Ajit Pai’s list of myths concerning net neutrality. One of the ‘myths’ he listed is that Internet service will be provided in bundles like cable television, as has happened in Portugal.

This observation has been widely repeated on social media and has been used as a warning of what would happen to US Internet access without net neutrality. The social media postings have included a screenshot of the many ‘bundles’ available from the mobile carrier Meo in Portugal. Taken out of context, these look exactly like mobile data bundles.

Meo offers various packages of well-known web applications that customers can buy to exempt those applications from monthly data caps. For example, there is a video bundle that includes Netflix, YouTube, Hulu, ESPN, Joost and TV.Com. There are a number of similar bundles, like the social bundle that includes Facebook and Twitter, or the shopping bundle that contains Amazon and eBay.

But the reality is that these bundles are similar to the zero-rating done by cellular carriers in the US. The base product from Meo doesn’t block any use of cellular data. These ‘bundles’ are voluntary add-ons and allow a customer to exclude the various packaged content from monthly data caps. If a customer uses a lot of social media, for example, they can exclude this usage from monthly data caps by paying a monthly fee of approximately $5.

The previous FCC, headed by Tom Wheeler, took a look at zero-rating practices here in the US. It ruled that the zero-rating plans of AT&T and Verizon violated net neutrality because each carrier had bundled in its own content. But the FCC found that T-Mobile did not violate net neutrality when it included content from others in its zero-rating package. The current FCC has not followed through on those rulings and has taken no action against AT&T or Verizon.

The Meo bundles are similar to the T-Mobile zero-rating packages, with the difference being that the Meo bundles are voluntary while T-Mobile’s are built into the base product. The FCC is correct in pointing out that Portugal did not create mobile ‘bundles’ that are similar to packages of cable TV channels. If anything, I see these bundles as insurance – in effect, customers spend a small amount up front to avoid larger data overages later.
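The insurance framing comes down to simple arithmetic, sketched below. Only the $5 bundle price comes from the text; the overage fee and the usage figures are hypothetical assumptions for illustration:

```python
def bundle_saves_money(bundle_price, overage_fee_per_gb, capped_usage_gb):
    # The flat bundle fee wins whenever it is cheaper than paying
    # per-GB overage charges on the same amount of zero-rated usage
    return bundle_price < overage_fee_per_gb * capped_usage_gb

# $5/month bundle vs. an assumed $15/GB overage fee on 2 GB of social media use
print(bundle_saves_money(5.00, 15.00, 2.0))  # True: $5 beats $30 in overages
```

For a heavy user of the bundled apps the small flat fee wins easily, which is why customers see these bundles as a good deal even though they favor the packaged content.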

It is also worth noting that Portugal is a member of the European Union, which has a strong set of net neutrality rules. But the EU is obviously struggling with zero-rating in the same way we are in the US. The real question this raises is whether zero-rating is really a violation of net neutrality. It’s certainly something that customers like. As long as we have stingy monthly data caps, customers are going to like the idea of excusing their most popular apps from measurement against those caps. If cellular carriers offered actual unlimited data then there would be no need for zero-rating.

I disagreed with the Wheeler FCC’s ruling on T-Mobile’s zero-rating. That ruling basically said that zero-rating is okay as long as the content is not owned by the cellular carrier. This ignores the fact that zero-rating of any kind has a long-term negative impact on competition. T-Mobile is like Meo in that they exclude the most popular web applications from data cap measurement. One of the major principles of net neutrality is to not favor any Internet traffic, and by definition, zero-rating favors the most popular apps over newer or less popular apps.

If enough customers participate in zero-rating, the popular apps will maintain prominence over start-up apps because customers can view them for free. This is not the same thing as paid prioritization. That would occur if Netflix were to pay T-Mobile to exclude its app from data caps, which would clearly give Netflix an advantage over other video content. But voluntary zero-rating by the cellular carriers has the exact same market impact as paid prioritization.

None of this is going to matter, though, if the FCC kills Title II regulations. At that point not only will zero-rating be allowed in all forms, but ISPs will be able to ask content providers for payment to prioritize their content. ISPs will be able to create Internet bundles that are exactly like cable bundles and that only allow access to certain content. And carriers like AT&T or Comcast are going to be free to bundle in their own video content. It’s ironic that Chairman Pai used this as an example of an Internet myth, because killing net neutrality will make this ‘myth’ come true.

5G Networks and Neighborhoods

With all of the talk about the coming 5G technology revolution I thought it might be worth taking a little time to talk about what a 5G network means for the aesthetics of neighborhoods. Just what new construction might a street getting 5G see that is not there today?

I live in Asheville, NC and our town is hilly and has a lot of trees. Trees are a major fixture in lots of towns in America, and people plant shade trees along streets and in yards even in states where there are not many trees outside of towns.

5G is being touted as a fiber replacement, capable of delivering speeds up to a gigabit to homes and businesses. This kind of 5G (which is different from 5G cellular) is going to use the millimeter wave spectrum bands. A few characteristics of that spectrum define how a 5G network must be deployed. This spectrum has extremely short wavelengths, and that means two things. First, the signal isn’t going to travel very far before it dissipates and grows too weak to deliver fast data. Second, these short wavelengths don’t penetrate anything. They won’t go through leaves, walls, or even a person walking past the transmitter – so these frequencies require a true unimpeded line-of-sight connection.
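The distance problem follows directly from free-space path loss, which grows with frequency. Here is a minimal sketch using the standard textbook formula; the sample frequencies and the 200-meter distance are illustrative assumptions, and real-world losses from foliage and rain come on top of these numbers:

```python
import math

def fspl_db(distance_km, freq_ghz):
    # Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Compare a mid-band cellular frequency against two millimeter wave bands
for f_ghz in (1.9, 28.0, 60.0):
    print(f"{f_ghz:>4} GHz at 200 m: {fspl_db(0.2, f_ghz):.1f} dB")
```

Every tenfold jump in frequency adds 20 dB of loss even before any obstructions, which is part of why millimeter wave transmitters must sit so close to the homes they serve.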

These requirements are going to be problematic on the typical residential street. Go outside your own house and see if there is a perfect line-of-sight path from any one pole to your home as well as to three or four of your neighbors. The required unimpeded path means there can be no tree, shrub or other impediment between the transmitter on a pole and each home getting this service. This may not be an issue in places with few trees like Phoenix, but it sure doesn’t look very feasible on my street, where the only way to make this work would be to impose a severe tree trimming regime – something that I know most people in Asheville would resist. I would never buy this service if it meant butchering my old ornamental crepe myrtle. And tree trimming must then be maintained into the future to keep new growth from blocking signal paths.

Even where this can work, this is going to mean putting up some kind of small dish on each customer location in a place that has line-of-sight to the pole transmitter. This dish can’t go just anywhere on a house in the way that satellite TV dishes can often be put in places that aren’t very noticeable. While these dishes will be small, they must go where the transmitter can always see them. That’s going to create all sorts of problems if this is not where the existing wiring enters the home. In my home the wiring comes into the basement in the back of the house while the best line-of-sight options are in the front – and that is going to mean some costly new wiring by an ISP, which might negate the cost advantage of 5G.

The next consideration is back-haul – how to get the broadband signals into and out of the neighborhood. Ideally this would be done with fiber. But I can’t see somebody spending the money to string fiber in a town like Asheville, or in most residential neighborhoods just to support wireless. The high cost of stringing fiber is the primary impediment today for getting a newer network into cities.

One of the primary alternatives to stringing fiber is to feed neighborhood 5G nodes with point-to-point microwave radio shots. In a neighborhood like mine these won’t be any more practical than the 5G signal paths. The solution I see being used for this kind of back-haul is to erect tall poles of 100’ to 120’ to provide a signal path over the tops of trees. I don’t think many neighborhoods are going to want to see a network of tall poles built around them. And tall poles still suffer the same line-of-sight issues. They still have to somehow beam the signal down to the 5G transmitters – and that means a lot more tree trimming.

All of this sounds dreadful enough, but to top it off the network I’ve described would be needed for a single wireless provider. If more than one company wants to provide wireless broadband then the number of devices multiplies accordingly. The whole promise of 5G is that it will allow for multiple new competitors, and that implies a town filled with multiple wireless devices on poles.

And beyond all of these physical deployment issues there is still the cost issue. I haven’t seen any numbers for the cost of the needed neighborhood transmitters that make a compelling business case for 5G.

I’m the first one to say that I’ll never declare that something can’t work, because over time engineers might find solutions for some of these issues. But where the technology sits today, it is not going to work on the typical residential street that is full of shade trees and relatively short poles. And that means that much of the talk about gigabit 5G is hype – nobody is going to be building a 5G network in my neighborhood, for the same sorts of reasons they aren’t building fiber here.

Big ISPs and Elections

Before you stop reading, this blog isn’t about party politics – the elections I am talking about are those where citizens vote on building a fiber optic network in their community. The incumbents don’t seem able to pass up the chance to turn an election their way when competition is put onto the ballot.

The latest example of this is the upcoming election on November 7 in Ft. Collins, Colorado. Voters in that community will be voting on whether to amend the city charter to allow the city to build and operate a fiber optic network in the city. Colorado law makes these elections mandatory, but I’ve seen other cities hold voluntary elections on the issue so that they are certain the citizens are behind their efforts to build fiber. A positive vote in Ft. Collins would allow the city to take the next step to investigate if they want to build a fiber network in the city.

Ft. Collins is a community of 59,000 homes, and Comcast and the other incumbent ISPs have spent over $200,000 so far on advertising against the ballot measure – a phenomenal amount of money for a local election and the most ever seen in Ft. Collins.

As is usual for fiber ballot initiatives, the incumbents are fighting against the passage of the measure by spreading lies and misinformation. For example, in Ft. Collins they are saying that voting for the measure would preclude the city from making other infrastructure upgrades for things like roads. In fact, this ballot measure just gives the city the legal authority to explore fiber and it’s likely that they would have another election to approve a bond measure if they decide to float a bond for fiber – a decision that would be some time in the future.

The misinformation being floated in Ft. Collins is tame compared to some of the other ways that incumbents have tried to stop fiber initiatives. In Lafayette, Louisiana, Cox and BellSouth (now AT&T) were extremely aggressive in trying to stop the fiber initiative, including filing several lawsuits against the effort. Prior to the election, when fiber was going to be on the ballot, they called every home in the community with a push poll that asked ludicrous questions about the fiber project. An alert citizen recorded the push poll and it can be found here. It takes 30 minutes to hear the whole thing, but if you are interested in the tactics the big ISPs use to fight fiber, it is well worth a listen. There are some amazing questions in this poll, and the gall of this push poll might have been what pushed the electorate to pro-fiber. The city needed to get more than a 65% yes vote on the fiber initiative, and due to a strong community effort the ballot measure passed easily.

I also remember a similar election in North St. Paul, Minnesota, a small community surrounded by the city of St. Paul. When the city put a fiber initiative on the ballot Comcast sent busloads of people to the city who went door-to-door to talk people out of voting for fiber. They deployed the usual misinformation campaign and scared a community that had a lot of elderly citizens into voting against the fiber initiative, which narrowly lost at the polls.

There was a similar election recently in Longmont, Colorado. When the city first held a vote on the same ballot measure as Ft. Collins, the money from the big ISPs defeated the ballot measure. The ISPs won using a misinformation campaign that talked about how the fiber effort would raise taxes. But the citizens there really wanted fiber, and so they asked for a second vote, and in the second election there was a massive grass-roots effort to inform the community about the facts. The fiber initiative on the second ballot won resoundingly and the city now has its fiber network.

There are several lessons to be learned from these ballot battles. First, the incumbents are willing to make a sizable investment to stop competition. But what they are spending, like the $200,000 in Ft. Collins, is a drop in the bucket compared to what they stand to lose. Second, they always attack fiber initiatives with misinformation, such as scaring people about higher taxes. They don’t fight by telling what a good job they are doing with broadband. And finally, we’ve seen the ISP efforts be successful unless there is a strong grass-roots effort to battle against their lies. Cities are not allowed by law to take sides in ballot initiatives during an election cycle and must sit quietly on the sidelines. And so it’s up to citizens to take on the incumbents if they want fiber. The big ISPs will always outspend the pro-fiber side, but we’ve seen organized grass-roots efforts beat the big money almost every time.

New Technology – October 2017

I’ve run across some amazing new technologies that hopefully will make it to market someday.

Molecular Data Storage. A team of scientists at the University of Manchester recently made a breakthrough with a technology that allows high volumes of data to be stored within individual molecules. They’ve shown the ability to create high-density storage that could save 25,000 gigabits of data on something the size of a quarter.

They achieved the breakthrough using molecules that contain the element dysprosium (that’s going to send you back to the periodic table) cooled to a temperature of -213 C. At that temperature the molecules retain magnetic alignment. Previously this had required molecules cooled to -259 C. The group’s goal is to find a way to do this at -196 C, the temperature of affordable liquid nitrogen, which would make this a viable commercial technology.

The most promising use of this kind of dense storage would be in large data centers, since this storage is 100 times more dense than existing technologies. This would make data centers far more energy efficient while also speeding up computing. That kind of improvement matters since there are predictions that within 25 years data centers will be the largest users of electricity on the planet.

Bloodstream Electricity. Researchers at Fudan University in China have developed a way to generate electricity from a small device immersed in the bloodstream. The device uses stationary nanoscale carbon fibers that act like a tiny hydropower generator. They’ve named the device the ‘fiber-shaped fluidic nanogenerator’ (FFNG).

Obviously there will need to be a lot of testing to make sure that the devices don’t cause problems like blood clots. But the devices hold great promise. A person could use these devices to charge a cellphone or wearable device. They could be used to power pacemakers and other medical devices. They could be inserted to power chips in farm animals that could be used to monitor and track them, or used to monitor wildlife.

Light Data Storage. Today’s theme seems to be small, and researchers at Caltech have developed a small computer chip that is capable of temporarily storing data using individual photons. This is the first team that has been able to reliably capture photons in a readable state on a tiny device. This is an important step in developing quantum computers. Traditional computers store data as either a 1 or a 0, but quantum computers can also store data that is both a 1 and a 0 simultaneously. This has been shown to be possible with photons.

Quantum computing devices need to be small and operate at the nanoscale because they hold data only fleetingly until it can be processed, and nanochips can allow rapid processing. The Caltech device is small – around the size of a red blood cell. The team was able to store a photon for 75 nanoseconds, and the ultimate goal is to store information for a full millisecond.

Photon Data Transmission. Researchers at the University of Ottawa have developed a technology to transmit a secure message using photons that are carrying more than one bit of information. This is a necessary step in developing data transmission using light, which would free the world from the many limitations of radio waves and spectrum.

Radio wave data transmission technologies send one bit of data at a time with each passing wavelength. Being able to send more than one bit of data with an individual photon creates the possibility of sending massive amounts of data through the open atmosphere. Scientists have achieved the ability to encode multiple bits with a photon in the lab, but this is the first time it’s been done through the atmosphere in a real-world application.
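The gain from multi-bit encoding can be put in numbers: a photon prepared in one of d distinguishable states carries log2(d) bits. This short sketch uses illustrative state counts; the 4-state case corresponds to the kind of ‘4D’ encoding described in this piece:

```python
import math

def bits_per_photon(num_states):
    # A photon encoded in one of n distinguishable states carries log2(n) bits
    return math.log2(num_states)

print(bits_per_photon(2))  # ordinary binary encoding: 1.0 bit per photon
print(bits_per_photon(4))  # a 4-state ('4D') encoding: 2.0 bits per photon
```

Doubling the number of distinguishable states only adds one bit per photon, so big capacity gains require reliably distinguishing many states through a turbulent atmosphere.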

The scientists are now working on a trial between two locations that are almost three miles apart and that will use a technology they call adaptive optics that can compensate for atmospheric turbulence.

There are numerous potential uses for the technology in our industry. This could be used to create ultrahigh-speed connections between a satellite and earth. It could be used to transmit data without fiber between locations with a clear line-of-sight. It could be used as a secure method of communications with airplanes since small light beams can’t be intercepted or hacked.

The other use of the technology is to leverage the ability of photons to carry more than one bit of data to create a new kind of encryption that should be nearly impossible to break. The photon data transmission allows for the use of 4D quantum encryption to carry the keys needed to encrypt and decrypt packets, meaning that every data packet could use a different encryption scheme.

Generations Matter

Nielsen recently published their Total Audience Report for the first quarter of 2017. It’s the best evidence I’ve seen yet that there is a huge difference between generations when it comes to video viewing habits. Compared to most surveys that look at a few thousand people, these statistics are based on almost 300,000 households.

The report examined in detail the viewing habits of the different US generations – Generation Z (ages 2 – 20), Millennials (ages 21 – 37), Generation X (ages 38 – 52), Baby Boomers (ages 53 – 70) and the Greatest Generation (ages 71+). What might surprise a lot of people is that Generation Z and the Millennials together now make up 48% of the US population – and that means their viewing habits are rapidly growing in importance to the cable TV industry.

The report outlines how the various generations own or use various devices or services. But note that these responses represent the entire household. So, for example, when Nielsen sought answers from somebody in Generation Z, it’s likely that the answers represent what is owned by their parents, who are likely Millennials or in Generation X. Here are a few interesting statistics:

  • The broadband penetration rate between generations is about the same, ranging from 82% to 85% of households. It wasn’t too many years ago when the baby boomer households lagged in broadband adoption.
  • There is a significant difference in the use of OTT services like Netflix. 73% of homes representing Generation Z subscribe to an OTT service, but only 51% of baby-boomer-only households do.
  • Baby boomers also lag in smartphone adoption at 86% with the younger generations all between 95% and 97% adoption.
  • Baby boomers also lag in the adoption of an enabled smart TV (meaning it’s connected to the web). 28% of baby boomers have an enabled smart TV while younger households are at about 39%.

The biggest difference highlighted in the report is the daily time spent using various entertainment media that includes such things as TV, radio, game consoles, and surfing the Internet.

The big concern to the cable industry is the time spent watching cable content. For example, the average monthly TV viewing for those over 65 is 231 hours of live TV and 34 hours of time-shifted TV. But for people aged 12-17 it’s only 60 hours live and 10 hours time-shifted. For ages 18-24 it’s 72 hours live and 12 hours time-shifted. For ages 25-34 it’s 101 hours live and 19 hours time-shifted. This is probably the best proof I’ve seen of how much less younger generations are invested in traditional TV.

This drastic difference for TV stands out because for other kinds of media there is not such a stark difference. For example, those over 65 spend about 67 hours per month using apps on smartphones while those 18-24 use 77 hours and those 25-34 use 76 hours.

There wasn’t even a drastic difference in the number of hours spent monthly watching video on a smartphone, with those over 65 watching 2 hours per month compared to 7 hours for those 18-24 and 6 hours for those 25-34.

The only other media with a stark difference is video game consoles with those over 65 using 13 hours per month while those 18-24 use 49 hours per month. Other things like listening to the radio or using a multimedia device (like Roku or Apple TV) are similar across generations.

The drastic difference in TV viewing has serious repercussions for the industry. For example, TV is no longer an effective medium for reaching those aged 18-24, since they watch over 180 hours less TV per month than those over 65. We’re seeing a big shift in advertising dollars, and during the last year the amount spent on web advertising surpassed TV advertising for the first time. When you trend this forward a decade it spells bad news for the broadcasting and cable industries. For many years there was a big hope that as people got older they would revert to the usage patterns of their parents. But the evidence shows that the opposite is true – kids keep their viewing habits as they grow older.

When you compare this report to earlier ones it’s obvious that the difference between generations is widening. Just comparing to 2016 those over 65 are watching more TV each month while the youngest generations are cutting back on TV over time – Generation Z watched 15 minutes less TV per day just since 2016.

The Future of AT&T and Verizon

The cellphone companies have done such a great job of getting everybody to purchase a smartphone that cellular service in the country is quickly turning into a commodity. And, as is typical with most commodity products, that means less brand loyalty from customers and lower market prices for the products.

We’ve recently seen the cellular market demonstrate the turn toward becoming a commodity. In the first quarter of this year the cellular companies had their worst performance since the industry began. Both AT&T and Verizon posted net losses of post-paid customers for the quarter. T-Mobile added fewer customers than expected and Sprint continued to lose money.

This is a huge turnaround for an industry where the big two cellular companies were each making over $1 billion per month in profits. The change in the industry comes from two things. First, people are now shopping for lower prices and are ready to change carriers to get lower monthly bills. The trend for lower prices was started by T-Mobile to gain market share, but low prices are also being pushed by cellular resellers – being fed by the big carriers. The cellular industry is only going to get more competitive when the cable companies soon enter the market. That will provide enough big players to make cellular minutes a true commodity. The cable companies have said they will be offering low prices as part of packages aimed at making customers stickier and will put real price pressure on the other cellular providers.

But the downturn in the first quarter was almost entirely due to the rush by all of the carriers to sell ‘unlimited’ data plans – which, as I’ve noted in some earlier blogs, are really not unlimited. But these plans offer lower prices for data and are freeing consumers to use their smartphones without the fear of big overage fees. Again, this move was started by T-Mobile, but it was also driven heavily by public demand. AT&T and Verizon recognized that if they didn’t offer this product set they were going to start bleeding customers to T-Mobile.

It will be really interesting to watch what happens to AT&T and Verizon, which are now predominantly cellular companies that also happen to own networks. The vast majority of revenues for these companies comes from the cellular parts of their businesses. When I looked at both of their annual reports last year I had a hard time finding evidence that these companies were even in the landline network business. Discussions of those business lines are buried deep within the annual reports.

These companies obviously need to find new forms of revenues to stay strong. AT&T is tackling this for now by going in a big way after the Mexican market. But one only has to look down the road a few years to see that Mexico and any other cellular market will also trend towards commoditization.

Both companies have their eyes on the same potential growth plays:

  • Both are making the moves necessary to tackle the advertising business. They look at the huge revenues being made by Facebook and Google and realize that as ISPs they are sitting on customer data that could make them major players in the targeted marketing space. Ad revenues are the predominant revenue source at Google and if these companies can grab even a small slice of that business they will make a lot of money.
  • Both are also chasing content. AT&T’s bid for the purchase of Time Warner is still waiting for government approval. Verizon has made big moves with the purchases of AOL and Yahoo and is rumored to be looking at other opportunities.
  • Both companies have been telling stockholders that there are huge amounts of money to be made from the IoT. These companies want their cellular networks to be the default networks for collecting data from IoT devices. They certainly ought to win the business for things like smart cars, but there will be a real battle between cellular and WiFi/landline connections for most other IoT usage.
  • Both companies are making a lot of noise about 5G. They are mostly concentrating on high-speed wireless connections using millimeter wave spectrum that they hope will make them competitive with the cable companies in urban areas. But even that runs a risk because if we see true competition in urban areas then prices for urban broadband might also tumble. And that might start the process of making broadband into a commodity. On the cellular side it’s hard to think that 5G cellular won’t quickly become a commodity as well. Whoever introduces faster cellphone data speeds might get a bump upward for a few years, but the rest of the industry will certainly catch up to any technological innovations.

It’s hard to foresee any business line where AT&T and Verizon are going to get the same monopoly power that they held in the cellular space for the past few decades. Everything they might undertake is also going to be available to competitors, meaning they are unlikely to make the same kind of huge margins they have historically made with cellular. No doubt they are both going to be huge companies for many decades to come since they own the cellular networks and spectrum. But I don’t think we can expect them to be the cash cows they have been in the past.

White Space Spectrum for Rural Broadband – Part II

Word travels fast in this industry, and in the last few days I’ve already heard from a few local initiatives that have been working to get rural broadband. They’re telling me that the naysayers in their communities are now pushing them to stop working on a broadband solution since Microsoft is going to bring broadband to rural America using white space spectrum. Microsoft is not going to be doing that, but some of the headlines could make you think they are.

Yesterday I talked about some of the issues that must be overcome in order to make white space spectrum viable. It is certainly no slam dunk that the spectrum will be viable for unlicensed use under the FCC spectrum plan. And as we’ve seen a few times in recent decades, it doesn’t take a lot of uncertainty for a spectrum launch to fall flat on its face.

With that in mind, let me discuss what Microsoft actually said in both their blog and whitepaper:

  • Microsoft will partner with telecom companies to bring broadband by 2022 to 2 million of the 23.4 million rural people that don’t have broadband today. I have to assume that these ‘partners’ are picking up a significant portion of the cost.
  • Microsoft hopes their effort will act as a catalyst for this to happen in the rest of the country. Microsoft is not themselves planning to fund or build to the remaining rural locations. They say that it’s going to take some combination of public grants and private money to make the numbers work. I just published a blog last Friday talking about the uncertainty of having a federal broadband grant program. Such funding may or may not ever materialize. I have to wonder where the commercial partners are going to be found who are willing to invest the $8 billion to $12 billion that Microsoft estimates this will cost.
  • Microsoft only thinks this is viable if the FCC follows their recommendation to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has been favoring creating just one channel of unlicensed spectrum per market. The cellular companies that just bought this spectrum are screaming loudly to keep this at one channel per market. The skeptic in me says that Microsoft’s white paper and announcement is a clever way for Microsoft to put pressure on the FCC to free up more spectrum. I wonder if Microsoft will do anything if the FCC sticks with one channel per market.
  • Microsoft admits that for this idea to work that manufacturers must mass produce the needed components. This is the classic chicken-and-egg dilemma that has killed other deployments of new spectrum. Manufacturers won’t commit to mass producing the needed gear until they know there is a market, and carriers are going to be leery about using the technology until there are standardized mass market products available. This alone could kill this idea just as the FCC’s plans for the LMDS and MMDS spectrum died in the late 1990s.

I think it’s also important to discuss a few important points that this whitepaper doesn’t talk about:

  • Microsoft never mentions the broadband data speeds that can be delivered with this technology. The whitepaper does talk about being able to deliver broadband to about 10 miles from a given tower. One channel of white space spectrum can deliver about 30 Mbps up to 19 miles in a point-to-point radio shot. From what I know of the existing trials these radios can deliver speeds of around 40 Mbps at six miles in a point-to-multipoint network, with less speed as the distance increases. Microsoft wants multiple channels in a market because bonding multiple channels could greatly increase speeds, to perhaps 100 Mbps. Even with one channel this is great broadband for a rural home that’s never had broadband. But the laws of physics mean these radios will never get faster, and those will still be the speeds offered a decade or two from now, when they are going to feel like slow DSL does today. Too many broadband technology plans fail to recognize that our demand for broadband has been doubling every three years since 1980. What is a pretty good speed today can become inadequate in a surprisingly short period of time.
  • Microsoft wants to be the company that operates the wireless databases behind this and other spectrum. That gives them a profit motive for spurring the use of these spectrum bands. There is nothing wrong with wanting to make money, but this is not a 100% altruistic offer on their part.
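The demand-doubling claim above is easy to sanity-check with a little compound-growth arithmetic. Here is a quick illustrative sketch; the 30 Mbps starting point is the single-channel point-to-point estimate quoted in this post, and the three-year doubling period is the rule of thumb cited above:

```python
# Sketch: project broadband demand that doubles every three years.
# Starting point: 30 Mbps, the single-channel white space estimate.

def projected_demand(base_mbps: float, years: float, doubling_period: float = 3) -> float:
    """Demand after `years`, assuming it doubles every `doubling_period` years."""
    return base_mbps * 2 ** (years / doubling_period)

# A connection that meets demand today falls an order of magnitude
# behind within a decade, and two orders within two decades:
print(round(projected_demand(30, 10)))  # roughly 300 Mbps
print(round(projected_demand(30, 20)))  # roughly 3,000 Mbps
```

Under this rule of thumb, a fixed-speed radio that looks generous today would be delivering a small fraction of expected demand within ten years, which is exactly the point the bullet above is making.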

It’s hard to know what to conclude from this. Microsoft is certainly not bringing broadband to all of rural America on its own, but it sounds like the company is willing to work toward that goal. Still, we can’t ignore the huge hurdles that must be overcome to realize the vision painted in the white paper.

  • First, the technology has to work, and the interference issues I discussed in yesterday’s blog need to be solved before anybody will trust using this spectrum on an unlicensed basis. Nobody will use the spectrum if unlicensed users constantly get bumped off by licensed ones. The trials done to date were not conducted in a busy spectrum environment.
  • Second, somebody has to be willing to fund the $8B to $12B Microsoft estimates this will cost. There may or may not be any federal grants ever available for this technology, and there may never be commercial investors willing to spend that much on a new technology in rural America. The fact that Microsoft thinks this needs grant funding tells me that a business plan based upon this technology might not stand on its own.
  • Third, the chicken-and-egg issue of getting over the hurdle to have mass-produced gear for the spectrum must be overcome.
  • Finally, the FCC needs to adopt Microsoft’s view that there should be three unlicensed channels available everywhere – something the license holders are strongly resisting. And from what I see of the current FCC, there is a good chance they are going to side with the big cellular companies.

White Space Spectrum for Rural Broadband – Part I

Microsoft has announced that they want to use white space spectrum to bring broadband to rural America. In today’s and tomorrow’s blogs I’m going to discuss the latest thinking on white space spectrum. Today I’ll cover the hurdles that must be overcome to use the spectrum, and tomorrow I’ll discuss in more detail what I think Microsoft is really proposing.

The spectrum being called white space has historically been used to transmit television over the air. In the recent incentive auction the FCC got a lot of TV stations to migrate their signals elsewhere to free up this spectrum for broadband uses. And in very rural America much of this spectrum has sat unused for decades.

Before Microsoft or anybody can use this spectrum on a widespread basis the FCC needs to determine how much of the spectrum will be available for unlicensed use. The FCC has said for several years that they want to allocate at least one channel of the spectrum for unlicensed usage in every market. But Microsoft and others have been pushing the FCC to allocate at least three channels per market and argue that the white space spectrum, if used correctly, could become as valuable as WiFi. It’s certainly possible that the Microsoft announcement was aimed at putting pressure on the FCC to provide more than one channel of spectrum per market.

The biggest issue the FCC is wrestling with is interference. One of the best characteristics of white space spectrum is that it can travel great distances, and it passes easily through things that kill higher frequencies. I remember as a kid being able to watch, in our basement, UHF TV stations broadcast from a tall tower in Baltimore 90 miles away. That ability to travel significant distances is what makes the spectrum promising for rural broadband. Yet those great distances also exacerbate the interference issues.

Today the spectrum has numerous users. There are still some TV stations that did not abandon the spectrum. There are two bands used for wireless microphones. A huge swath of this spectrum was just sold to various carriers in the incentive auction and will probably be used to provide cellular data. And the FCC wants to create the unlicensed bands. To confound things, the mix among the various users varies widely by market.

Perhaps the best way to understand white space interference issues is to compare them to WiFi. One of the best characteristics (and many would also say one of the worst) of WiFi is that it allows multiple users to share the bandwidth at the same time. These multiple uses cause interference, so no user gets full use of the spectrum, but this sharing philosophy is what made WiFi so popular – except in the most crowded environments, anybody can create an application using WiFi knowing that in most cases the bandwidth will be adequate.

But licensed spectrum doesn’t work that way, and the FCC is obligated to protect all spectrum license holders. The FCC has proposed to solve the interference issue by requiring radios that dynamically check for licensed uses of the spectrum in the area before an unlicensed user can transmit. If a radio senses a licensed use it cannot broadcast, and if it senses a licensed use once broadcasting it must abandon the channel.

This would all be done using a database that identifies the licensed users in any given area, along with radios that can search for licensed usage before making a connection. This sort of frequency scheme has never been tried before. Rather than sharing spectrum like WiFi, an unlicensed user will only be allowed to use the spectrum when there is no interference. As you can imagine, the licensed cellular companies, which just spent billions for this spectrum, are worried about interference. But there are also concerns from churches, city halls and musicians who use wireless microphones.

It seems unlikely to me that unlicensed white space spectrum is going to be very attractive in urban areas where the spectrum is heavily used. If it’s hard to make or maintain an unlicensed connection, nobody is going to try to use the spectrum in a crowded-spectrum environment.

The question that has yet to be answered is whether this kind of frequency plan will work in rural environments. There have been a few trials of this spectrum over the past five years, but those tests mostly proved the viability of the spectrum for providing broadband; they did not test the databases or the interference issue in a busy spectrum environment. We’ll have to see what happens in rural America once the cellular companies start using the spectrum they just purchased. Because of the great distances over which the spectrum is viable, I can imagine a scenario where the use of licensed white space in a county seat makes it hard to use the spectrum in adjoining rural areas.

And like any new spectrum, there is a chicken-and-egg situation with the wireless equipment manufacturers. They are not likely to commit to making huge amounts of equipment, which would make this affordable, until they know it is really going to work in rural areas. And we might not know whether it works in rural areas until there have been mass deployments. This same dilemma largely sank the LMDS and MMDS spectrum fifteen years ago.

The white space spectrum has huge potential. One channel can deliver 30 Mbps to the horizon on a point-to-point basis. But there is no guarantee that the unlicensed use of the spectrum is going to work well under the frequency plan the FCC is proposing.

AT&T’s Broadband Trials

John Donovan, the chief strategy officer for AT&T, spoke at the Mobile World Congress recently and said that the company was trying five different technologies for the last mile: WLL (wireless local loop), G.Fast, 5G, AirGig and fiber-to-the-premise. He said the company would be examining the economics of all of the different technologies. Let me look at each one in relation to AT&T.

Wireless Local Loop (WLL). The technology uses the company’s LTE bandwidth but employs a point-to-multipoint network configuration. By using a small dish on the house to receive the signal, the company gets better bandwidth than can be received from normal broadcast cellular. The company has been trialing various versions of the technology for many years, but there are a few recent trials of the newest version, which AT&T will be using for much of its deployment in rural America as part of the CAF II plan. That plan requires the ISP to deliver at least 10/1 Mbps. AT&T says the technology is delivering speeds of 15 to 25 Mbps, and that even at the edge of a cellular network a customer can get 10 Mbps about 90% of the time.

G.Fast. This is a technology that uses high frequencies to put more bandwidth on telephone copper wire. Speeds are reported to be as high as 500 Mbps, but only for very short distances under 200 feet. AT&T recently announced a G.Fast trial in an apartment building in Minneapolis. The technology is also being tested by CenturyLink and Windstream. All of these trials are using existing telephone copper inside of existing apartment buildings to deliver broadband. So this is not really a last mile technology. AT&T brings fiber to the apartment complex and then uses G.Fast as an inside wire technology. If they find it to be reliable this would be a great alternative to rewiring apartments with fiber.

5G. AT&T recently announced a few trials of early 5G technologies in Austin. They are looking at several technology ideas such as carrier aggregation (combining many frequencies). But these are just trials, and AT&T is one of the companies helping to test pre-5G ideas as part of the worldwide effort to define the 5G specifications. These are not tests of market-ready technologies, but field trials of various concepts needed to make 5G work. There is no doubt that AT&T will eventually replace LTE wireless with 5G, but that transition is still many years in the future. The company is claiming to be testing 5G for the press-release benefits – these are not tests of a viable last-mile technology, just early field trials of lab concepts.

AirGig. This one remains a mystery. AT&T says it will begin trialing the technology later this year with two power companies. There has been a little clarification of the technology since the initial press release. This is not a broadband-over-powerline technology – it’s completely wireless and uses the open lines-of-sight on top of power poles to create a clear path for millimeter-wave radios. The company has also said that they don’t yet know which wireless technology will be used to go from the poles into the home – the whole range of licensed spectrum is under consideration, including the LTE frequencies. If that’s the case, then AirGig is a fiber replacement, and the delivery to homes would be about the same as WLL.

FTTP. Donovan referred to fiber-to-the-home as a trial, but by now the company understands the economics of fiber. The company keeps stretching the truth a bit about their fiber deployments, saying that they have deployed fiber to 4 million homes, with 8 million more coming in the next three years. But the fact is they have only passed the 4 million homes they can market to, as disclosed on their own web site. The twelve-million-home target was dictated by the FCC as part of the settlement allowing the company to buy DirecTV.

We don’t know how many fiber customers AT&T has. They are mostly marketing this to apartment buildings, although there are residential customers around the country saying they have it. But they have not sold big piles of fiber connections like Verizon FiOS. This can be seen by looking at the steady drop in total AT&T data customers – 16.03 million in 2014, 15.78 million in 2015 and 15.62 million at the end of the third quarter of 2016. AT&T’s fiber is not really priced to be super-competitive, except in markets where they compete with Google Fiber. Their normal prices elsewhere on fiber are $70 for 100 Mbps, $80 for 300 Mbps and $99 for a gigabit.

The Resurgence of Rabbit Ears

There is perhaps no better way to understand the cord-cutting phenomenon than by looking at the booming sales of the home TV antennas known as ‘rabbit ears’ used to receive local television off the airwaves. A study released by Parks Associates shows that 15% of households now use rabbit ears, which is a pretty amazing statistic. That is up from 8% of households as recently as 2013, and I recall an earlier time when this had fallen below 5%.

For the longest time the TV-watching public was counted in three groups – those who had cable TV (including satellite), those who used rabbit ears to watch local TV only, and those with no TV. We now have a fourth category – those who only watch OTT programming such as Netflix.

I was once in the category of not watching TV at all. I remember twenty years ago I went to Circuit City (now gone) to consider buying a set of rabbit ears and the clerks there weren’t even sure if the store carried them. With some asking around they found that they had a few units of one brand that had been gathering dust.

But today there is a resurgence in rabbit ears, there are easily a dozen major brands, and new options come on the market all the time. For example, Sling TV just launched AirTV, a $99 box that integrates Sling TV, Netflix and a high-quality antenna, together with a voice-activated remote control that makes it easy to cut the cord. This looks to be one of the better voice-activation systems around and lets you search programming by show name, actor or genre.

Since most people have had cable TV for a long time many have no idea of what they can receive off air for free. The FCC has an interesting map that shows you the expected reception in your area. In my case the map shows that I can get a strong signal from every major network including CW and PBS along with signals from MyTV, Univision and a few independent local stations.

The Parks study also looks at other industry statistics. A few of the most interesting ones include:

  • Penetration of pay-TV was down to 81% in 2016 and has fallen every year since 2014. Parks cites the normal reasons for the decline including the growth of OTT programming, the increasing cost of a cable TV subscription and growing consumer awareness that there are viable alternatives to cable TV.
  • Satisfaction with pay-TV keeps dropping and only one-third of households now say that they are very satisfied with their pay-TV service.
  • OTT viewing continues to rise and 63% of US households now subscribe to at least one OTT offering like Netflix while 31% of households subscribe to more than one.
  • In 2016 12% of households downgraded their pay-TV service (meaning dropped it or went to a less expensive option). This was double the percentage (6%) who upgraded their pay-TV service in 2016.
  • Very few cord nevers (those who have never had cable TV) are deciding to buy pay-TV, with only 2% of them doing so in 2016. This is the statistic that scares the cable companies, because cord nevers include new Millennial households. This generation is apparently not interested in being saddled with a pay-TV subscription. In past generations the percentage of new homes that bought pay-TV closely matched the overall penetration of the market – buying TV was something you automatically did when you moved to a new place.

These statistics show how much choice the OTT phenomenon has brought to the marketplace. Ten years ago no industry expert would have predicted a resurgence of rabbit ears. In fact, rabbit ears were lumped in with obsolete technologies like buggy whips and used as the butt of jokes about those who didn’t like the modern world. But that is no longer true, and new rabbit-ear homes are perhaps among the most tech-savvy, households that know they can craft an entertainment platform without sending a big check to a cable company.