Generation Z as Customers

There has been a lot written about how Millennials are not buying telecom products at nearly the same rate as older generations. A large percentage of new Millennial households reject both traditional cable TV and telephone service, though they seem to be buying home broadband at about the same rate as Gen X. I remember seeing a study a few years back suggesting that people's telecom buying habits are heavily influenced by what they did as teenagers, and the buying habits of Millennials seem to bear that out.

But now we’re starting to see studies of the next generation – Generation Z, born between 1995 and 2012. This is the first generation that was handed smartphones at an early age and it can be argued that this makes them the first generation that has been immersed in computer technology for their whole life.

CommScope has been doing an annual report of the technology behavior of Generation Z, and this year’s report can be downloaded here. The study is not for the whole generation, just those 13 and older. It’s fascinating and takes an in-depth look at how this young generation uses technology. Some of the more interesting takeaways of this year’s study include:

  • First, this is a large generation. In 2015 they made up 26% of the worldwide population, a share that will increase to 33% by 2020.
  • Already today this generation accounts for 25% of consumer spending which is expected to increase to 40% in ten years.
  • 96% of Generation Z in advanced countries own smartphones.
  • This generation is nearly always online and spends 74% of its time outside of school and work online. According to the report this is the "first generation that appears to live equally in the digital world and the real world".
  • 60% of Generation Z will not use an app or website that is too slow.
  • 2/3 of Generation Z are interested in buying things directly from social media sites.
  • The generation's use of technology is increasing year-over-year. 80% say they are using their smartphone more than last year, and usage is also up 42% for laptops/desktops, 24% for tablets, and 10% for smartwatches.
  • 44% of the generation expect to buy a new smartphone every two years.
  • Their most popular apps are YouTube (56%), WhatsApp (47%), Facebook (38%), Instagram (30%), Twitter (22%), Snapchat (20%) and Google Apps (19%).
  • The generation expects performance from devices and services. They want fast bandwidth, long-lasting batteries, efficient and easy-to-use apps. They are likely to be demanding consumers.
  • 70% of the generation are satisfied with their bandwidth at home, and far less satisfied with bandwidth elsewhere like school (41%), shopping (38%), or outdoor public spaces (36%).
  • I don’t know how this compares with older generations, but a lot of Generation Z is interested in new technologies – virtual reality (44%), AI (41%), driverless cars (39%).
  • This generation is the first where more than half are content creators. 52% share content they have created with others. 43% create content weekly.
  • 2/3 of Generation Z agree with the sentiment that the age of personal privacy is over.
  • The generation is split on cellphone choice, with 51% preferring Android and 49% iOS.
  • 63% of Generation Z say that they would be lost without their smartphone.

What does all of this mean for an ISP? I think there are a few key takeaways.

  • This generation values high speed broadband. They also value mobility more than anything else. They are likely buyers of home broadband products unless future cellphone data products get fast and affordable enough to be a reasonable substitute.
  • Generation Z will be even less likely than Millennials to buy traditional cable TV, and practically none will buy landline telephones. This generation is not going to be buying the bundle.
  • This generation grew up with the small cellphone screen and will be happy with that format for much of what they do.
  • Generation Z is going to be more demanding than past generations in terms of bandwidth and product performance and will quickly bail on providers that don’t live up to their expectations. This is not going to be a generation of loyal customers, but one that will switch to something better.
  • They are far more likely to be early technology adopters, particularly for technologies that will save them time, like driverless cars.

Who Owns Customer Data?

Our homes are starting to get filled with Internet-enabled devices. I recently looked around my own home, and in addition to the expected devices like computers, printers, tablets and smartphones we have many other devices that can connect to the Internet. We have a smart TV, an eero WiFi network, three Amazon Echos, several fitness trackers, and a smart watch. Many homes have other Internet-connected devices like smart burglar alarms, smart thermostats, smart lighting and even smart major appliances. Kids these days can have smart toys and game consoles with more computing power than most PCs.

Every one of these devices gathers data on us and a good argument can be made that we are all being spied on by our devices. Each device witnesses a different part of our lives, but add them all together and they paint a detailed picture of the activity in your home and of each person living there.

There are numerous examples of companies that we know are using our data:

  • Last year it was revealed that Roomba was selling detailed information about the layouts of homes to data brokers.
  • The year before we found out that Samsung smart TVs were capable of listening to conversations in our living rooms and also had backdoor connections to the Internet.
  • There has been an uproar about smart talking toys that not only interact with kids but also listen and essentially build profiles on them.
  • Smart devices like smartphones, tablets and computers come with software aimed at gathering data on us for marketing purposes. This software is generally baked in and can't be easily removed. Some companies, like Lenovo with its Superfish malware, went even further and hijacked user web traffic in favor of vendors willing to pay Lenovo.
  • Buyers of John Deere tractors found out that while they own the tractor they don't own the software. The company penalizes customers who have their tractor repaired by anybody other than an authorized John Deere repairperson.

Probably the most insidious result of all of this spying is that there are now data brokers who gather and sell data that can paint a detailed profile of us. These data profiles are then used to market directly to us or are sold to politicians who can target those most sympathetic to their message. It’s also been reported that smart criminals are using this data to choose victims for their crimes.

I’m sure by now that everybody has searched for something on the web, and then noticed that for the next few weeks they are plastered with ads trying to sell them the subject of their search. This happened to me a few years ago when I was looking at new pick-up trucks on the web. But today this goes a lot farther and people complain about getting medical ads after they have searched the web about an illness.

To make matters worse, we have a government regulatory policy in this country that benefits the corporations that are spying on us. Last year Congress rolled back the privacy rules, letting ISPs and anybody else gathering raw digital data off the hook. There are essentially no real privacy rules today. Data privacy is now under the purview of the Federal Trade Commission, which might intervene in a particularly egregious invasion of privacy, but its rules are not proactive and can only be used to fine companies that have already broken them. Unless fines grow to be gargantuan it's unlikely that the FTC will change many of the worst practices involving our data.

The European Union is in the process of enacting rules that will clamp down on data gathering. Those rules, which go into effect in a few months, will require that customers opt in to being monitored. That is great in concept, but my guess is that it's going to take a decade of significant fines to get the attention of the companies that gather our data. Unless the fines are larger than the gains from spying on people, companies will continue to monitor us and will just work harder to hide the evidence from the government.

I think there are very few of us who don't believe our data should belong solely to us. Nobody really wants outsiders knowing about their web searches. Nobody wants unknown companies tracking their movement inside their homes, their purchases and even their conversations. But for now, the companies that are gathering and using our data have the upper hand and are largely free to do nearly anything they want with our data.

Portugal and Net Neutrality

Last week I talked about FCC Chairman Ajit Pai’s list of myths concerning net neutrality. One of the ‘myths’ he listed is: Internet service will be provided in bundles like cable television as has happened in Portugal.

This observation has been widely repeated on social media and has been used as a warning of what could happen to US Internet access without net neutrality. The social media postings have included a screenshot of the many 'bundles' available from the mobile carrier Meo in Portugal. Taken out of context these look exactly like cable TV bundles.

Meo offers various packages of well-known web applications that customers can buy to exempt those applications from monthly data caps. For example, there is a video bundle that includes Netflix, YouTube, Hulu, ESPN, Joost and TV.Com. There are a number of similar bundles, like the social bundle that includes Facebook and Twitter, or the shopping bundle that contains Amazon and eBay.

But the reality is that these bundles are similar to the zero-rating done by cellular carriers in the US. The base product from Meo doesn’t block any use of cellular data. These ‘bundles’ are voluntary add-ons and allow a customer to exclude the various packaged content from monthly data caps. If a customer uses a lot of social media, for example, they can exclude this usage from monthly data caps by paying a monthly fee of approximately $5.
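The billing mechanics behind such a bundle are easy to sketch. This is a hypothetical illustration of how zero-rated traffic is excluded from a data cap; the app names and cap size are made up for the example, not Meo's actual plans or logic:

```python
# Hypothetical sketch of zero-rated data-cap accounting.
# App names and the cap size are illustrative, not Meo's actual billing.

ZERO_RATED = {"netflix", "youtube", "hulu"}  # apps in a purchased bundle
CAP_GB = 10.0                                # monthly data cap in gigabytes

def billable_gb(usage_by_app, zero_rated=ZERO_RATED):
    """Return the GB that count against the cap: everything not zero-rated."""
    return sum(gb for app, gb in usage_by_app.items() if app not in zero_rated)

# A heavy video month: 11 GB of video, but only web + email count.
usage = {"netflix": 8.0, "youtube": 3.0, "web": 2.5, "email": 0.5}
counted = billable_gb(usage)              # 3.0 GB counts against the cap
overage = max(0.0, counted - CAP_GB)      # 0.0 - no overage fee this month
print(counted, overage)
```

The point the sketch makes is the one in the paragraph above: nothing is blocked or throttled; the bundle only changes which bytes are metered.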

The last FCC headed by Tom Wheeler took a look at zero-rating practices here in the US. They ruled that the zero-ratings by AT&T and Verizon violated net neutrality because each carrier has bundled in their own content. But the FCC found that T-Mobile did not violate net neutrality when they included content from others in their zero-rating package. The current FCC has not followed through on those rulings and has taken no action against AT&T or Verizon.

The Meo bundles are similar to the T-Mobile zero-rating packages, with the difference being that the Meo bundles are voluntary while T-Mobile’s are built into the base product. The FCC is correct in pointing out that Portugal did not create mobile ‘bundles’ that are similar to packages of cable TV channels. If anything, I see these bundles as insurance – in effect, customers spend a small amount up front to avoid larger data overages later.

It is also worth noting that Portugal is a member of the European Union, which has a strong set of net neutrality rules. But the EU is obviously struggling with zero-rating in the same way we are in the US. The real question this raises is whether zero-rating is really a violation of net neutrality. It's certainly something that customers like. As long as we have stingy monthly data caps, customers are going to like the idea of excusing their most popular apps from measurement against those caps. If cellular carriers offered actual unlimited data there would be no need for zero-rating.

I disagreed with the Wheeler FCC's ruling on T-Mobile's zero-rating. That ruling basically said that zero-rating is okay as long as the content is not owned by the cellular carrier. This ignores the fact that zero-rating of any kind has a long-term negative impact on competition. T-Mobile is like Meo in that they exclude the most popular web applications from data cap measurement. One of the major principles of net neutrality is to not favor any Internet traffic, and by definition, zero-rating favors the most popular apps over newer or less popular apps.

If enough customers participate in zero-rating, the popular apps will maintain prominence over start-up apps because customers can view them for free. This is not the same mechanism as paid prioritization, which would occur if Netflix were to pay T-Mobile to exclude its app from data caps and would clearly give Netflix an advantage over other video content. But voluntary zero-rating by the cellular carriers has the same market impact as paid prioritization.

None of this is going to matter, though, if the FCC kills Title II regulations. At that point not only will zero-rating be allowed in all forms, but ISPs will be able to ask content providers for payment to prioritize their content. ISPs will be able to create Internet bundles that are exactly like cable bundles and that only allow access to certain content. And ISPs like AT&T or Comcast are going to be free to bundle in their own video content. It's ironic that Chairman Pai used this as an example of an Internet myth, because killing net neutrality will make this 'myth' come true.

5G Networks and Neighborhoods

With all of the talk about the coming 5G technology revolution I thought it might be worth taking a little time to talk about what a 5G network means for the aesthetics of neighborhoods. Just what might a street getting 5G see in new construction that is not there today?

I live in Asheville, NC and our town is hilly and has a lot of trees. Trees are a major fixture in lots of towns in America, and people plant shade trees along streets and in yards even in states where there are not many trees outside of towns.

5G is being touted as a fiber replacement, capable of delivering speeds up to a gigabit to homes and businesses. This kind of 5G (which is different than 5G cellular) is going to use the millimeter wave spectrum bands. There are a few characteristics of that spectrum that define how a 5G network must be deployed. This spectrum has extremely short wavelengths, and that means two things. First, the signal isn't going to travel very far before it dissipates and grows too weak to deliver fast data. Second, these short wavelengths don't penetrate anything. They won't go through leaves, walls, or even a person walking past the transmitter - so these frequencies require a true unimpeded line-of-sight connection.
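The distance problem shows up even before counting foliage. A quick sketch using the standard free-space path loss formula makes the point; the frequencies here (28 GHz for millimeter wave, 2.4 GHz for familiar WiFi spectrum) are chosen for illustration:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 3.0e8  # speed of light in m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# At 100 meters in open air, 28 GHz millimeter wave loses about 21 dB more
# than 2.4 GHz - over 100x weaker - before any trees or walls are in the way.
loss_mmwave = fspl_db(100, 28e9)   # roughly 101 dB
loss_wifi = fspl_db(100, 2.4e9)    # roughly 80 dB
print(round(loss_mmwave - loss_wifi, 1))
```

And free-space loss is the best case: the paragraph's real point is that leaves and bodies add further attenuation that hits millimeter wave far harder than lower bands.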

These requirements are going to be problematic on the typical residential street. Go outside your own house and see if there is a perfect line-of-sight from any one pole to your home as well as to three or four of your neighbors. The required unimpeded path means there can be no tree, shrub or other impediment between the transmitter on a pole and each home getting this service. This may not be an issue in places with few trees like Phoenix, but it sure doesn’t look very feasible on my street. On my street the only way to make this work would be by imposing a severe tree trimming regime – something that I know most people in Asheville would resist. I would never buy this service if it meant butchering my old ornamental crepe myrtle. And tree trimming must then be maintained into the future to keep new growth from blocking signal paths.

Even where this can work, it's going to mean putting some kind of small dish at each customer location in a spot that has line-of-sight to the pole transmitter. Unlike satellite TV dishes, which can often be tucked in places that aren't very noticeable, these dishes must go where the transmitter can always see them. That's going to create all sorts of problems when that spot isn't where the existing wiring enters the home. In my home the wiring comes into the basement at the back of the house while the best line-of-sight options are in the front, and that is going to mean some costly new wiring by an ISP, which might negate the cost advantage of 5G.

The next consideration is back-haul – how to get the broadband signals into and out of the neighborhood. Ideally this would be done with fiber. But I can’t see somebody spending the money to string fiber in a town like Asheville, or in most residential neighborhoods just to support wireless. The high cost of stringing fiber is the primary impediment today for getting a newer network into cities.

One of the primary alternatives to stringing fiber is to feed neighborhood 5G nodes with point-to-point microwave radio shots. In a neighborhood like mine these won't be any more practical than the 5G signal paths. The solution I see being used for this kind of back-haul is to erect tall poles of 100' to 120' to provide a signal path over the tops of trees. I don't think many neighborhoods are going to want to see a network of tall poles built around them. And tall poles still suffer the same line-of-sight issues. They still have to somehow beam the signal down to the 5G transmitters - and that means a lot more tree trimming.

All of this sounds dreadful enough, but to top it off the network I’ve described would be needed for a single wireless provider. If more than one company wants to provide wireless broadband then the number of devices multiply accordingly. The whole promise of 5G is that it will allow for multiple new competitors, and that implies a town filled with multiple wireless devices on poles.

And with all of these physical deployment issues there is still the cost issue. I haven’t seen any numbers for the cost of the needed neighborhood transmitters that makes a compelling business case for 5G.

I’m the first one to say that I’ll never declare that something can’t work because over time engineers might find solutions for some of these issues. But where the technology sits today this technology is not going to work on the typical residential street that is full of shade trees and relatively short poles. And that means that much of the talk about gigabit 5G is hype – nobody is going to be building a 5G network in my neighborhood, for the same sorts of reasons they aren’t building fiber here.

Big ISPs and Elections

Before you stop reading, this blog isn’t about party politics – the elections I am talking about are those where citizens vote on building a fiber optic network in their community. The incumbents don’t seem able to pass up the chance to turn an election their way when competition is put onto the ballot.

The latest example of this is the upcoming election on November 7 in Ft. Collins, Colorado. Voters in that community will be voting on whether to amend the city charter to allow the city to build and operate a fiber optic network in the city. Colorado law makes this election mandatory, but I've seen other cities hold voluntary elections on the issue so that they are certain that the citizens are behind their efforts to build fiber. A positive vote in Ft. Collins would allow the city to take the next step to investigate if they want to build a fiber network in the city.

Ft. Collins is a community of 59,000 homes and Comcast and the other incumbent ISPs have spent over $200,000 so far in advertising against the ballot measure – a phenomenal amount of money spent on a local election and the most ever seen in Ft. Collins.

As is usual for fiber ballot initiatives, the incumbents are fighting against the passage of the measure by spreading lies and misinformation. For example, in Ft. Collins they are saying that voting for the measure would preclude the city from making other infrastructure upgrades for things like roads. In fact, this ballot measure just gives the city the legal authority to explore fiber and it’s likely that they would have another election to approve a bond measure if they decide to float a bond for fiber – a decision that would be some time in the future.

The misinformation being floated in Ft. Collins is tame compared to some of the other ways that incumbents have tried to stop fiber initiatives. In Lafayette, Louisiana the combination of Cox and BellSouth (now AT&T) was extremely aggressive in trying to stop the fiber initiative, including filing several lawsuits. Prior to the election when fiber was going to be on the ballot they called every home in the community with a push poll that asked ludicrous questions about the fiber project. An alert citizen recorded the push poll and it can be found here. It takes 30 minutes to hear the whole thing, but if you are interested in the tactics the big ISPs use to fight fiber, it is well worth a listen. There are some amazing questions in this poll, and the gall of it might have been what pushed the election pro-fiber. The city needed more than a 65% yes vote on the fiber initiative, and due to a strong community effort the ballot measure passed easily.

I also remember a similar election in North St. Paul, Minnesota, a small community surrounded by the city of St. Paul. When the city put a fiber initiative on the ballot Comcast sent busloads of people to the city who went door-to-door to talk people out of voting for fiber. They deployed the usual misinformation campaign and scared a community that had a lot of elderly citizens into voting against the fiber initiative, which narrowly lost at the polls.

There was a similar election recently in Longmont, Colorado. When the city first held a vote on the same ballot measure as Ft. Collins, the money from the big ISPs defeated the ballot measure. The ISPs won using a misinformation campaign that talked about how the fiber effort would raise taxes. But the citizens there really wanted fiber, and so they asked for a second vote, and in the second election there was a massive grass-roots effort to inform the community about the facts. The fiber initiative on the second ballot won resoundingly and the city now has its fiber network.

There are several lessons to be learned from these ballot battles. First, the incumbents are willing to make a sizable investment to stop competition. But what they are spending, like the $200,000 in Ft. Collins, is a drop in the bucket compared to what they stand to lose. Second, they always attack fiber initiatives with misinformation, such as scaring people about higher taxes. They don't fight by telling what a good job they are doing with broadband. And finally, we've seen the ISP efforts succeed unless there is a strong grass-roots effort to battle their lies. Cities are not allowed by law to take sides in ballot initiatives during an election cycle and must sit quietly on the sidelines, so it's up to citizens to take on the incumbents if they want fiber. The big ISPs will always outspend the pro-fiber side, but we've seen organized grass-roots efforts beat the big money almost every time.

New Technology – October 2017

I’ve run across some amazing new technologies that hopefully will make it to market someday.

Molecular Data Storage. A team of scientists at the University of Manchester recently made a breakthrough with a technology that allows high volumes of data to be stored within individual molecules. They’ve shown the ability to create high-density storage that could save 25,000 gigabits of data on something the size of a quarter.

They achieved the breakthrough using molecules that contain the element dysprosium (that’s going to send you back to the periodic table) cooled to a temperature of -213 centigrade. At that temperature the molecules retain magnetic alignment. Previously this has taken molecules cooled to a temperature of -259 C. The group’s goal is to find a way to do this at -196 C, the temperature of affordable liquid nitrogen, which would make this a viable commercial technology.
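Some back-of-envelope arithmetic puts the claimed density in perspective. The 25,000 gigabit figure comes from the report above; the only other input is the diameter of a US quarter, and the comparison to hard drives of the era is my own rough framing:

```python
import math

REPORT_GIGABITS = 25_000       # from the report: data stored on a quarter-sized area
QUARTER_DIAMETER_MM = 24.26    # diameter of a US quarter

# Area of the quarter, then convert the density to terabits per square inch
# (1 square inch = 645.16 square mm), the unit used for hard drive platters.
area_mm2 = math.pi * (QUARTER_DIAMETER_MM / 2) ** 2
terabits_per_sq_inch = (REPORT_GIGABITS / 1000) / (area_mm2 / 645.16)

# Hard drives of the era stored very roughly 1 terabit per square inch,
# so ~35 Tb per square inch is indeed on the order of 100x denser.
print(round(terabits_per_sq_inch, 1))
```

The arithmetic lines up with the "100 times more dense" claim discussed below.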

The most promising use of this kind of dense storage would be in large data centers, since it is 100 times denser than existing technologies. This would make data centers far more energy efficient while also speeding up computing. That kind of improvement matters, since there are predictions that within 25 years data centers will be the largest user of electricity on the planet.

Bloodstream Electricity. Researchers at Fudan University in China have developed a way to generate electricity from a small device immersed in the bloodstream. The device uses stationary nanoscale carbon fibers that act like a tiny hydropower generator. They've named the device the 'fiber-shaped fluidic nanogenerator' (FFNG).

Obviously there will need to be a lot of testing to make sure that the devices don’t cause problems like blood clots. But the devices hold great promise. A person could use these devices to charge a cellphone or wearable device. They could be used to power pacemakers and other medical devices. They could be inserted to power chips in farm animals that could be used to monitor and track them, or used to monitor wildlife.

Light Data Storage. Today's theme seems to be small, and researchers at Caltech have developed a small computer chip that is capable of temporarily storing data using individual photons. This is the first team that has been able to reliably capture photons in a readable state on a tiny device. This is an important step in developing quantum computers. Traditional computers store data as either a 1 or a 0, but quantum computers can also store data that is both a 1 and a 0 simultaneously. This has been shown to be possible with photons.

Quantum computing devices need to be small and operate at the nanoscale because they hold data only fleetingly until it can be processed, and nanochips allow rapid processing. The Caltech device is small, around the size of a red blood cell. The team was able to store a photon for 75 nanoseconds, and the ultimate goal is to store information for a full millisecond.

Photon Data Transmission. Researchers at the University of Ottawa have developed a technology to transmit a secure message using photons that carry more than one bit of information. This is a necessary step in developing data transmission using light, which would free the world from the many limitations of radio waves and spectrum.

Radio wave data transmission technologies send one bit of data at a time with each passing wavelength. Being able to send more than one bit of data with an individual photon creates the possibility of sending massive amounts of data through the open atmosphere. Scientists have previously encoded multiple bits on a photon in the lab, but this is the first time it's been done through the atmosphere in a real-world application.

The scientists are now working on a trial between two locations that are almost three miles apart and that will use a technology they call adaptive optics that can compensate for atmospheric turbulence.

There are numerous potential uses for the technology in our industry. It could be used to create ultrahigh-speed connections between a satellite and earth. It could be used to transmit data without fiber between locations with a clear line-of-sight. It could be used as a secure method of communicating with airplanes, since small light beams can't be intercepted or hacked.

The other use of the technology is to leverage the ability of photons to carry more than one bit of data to create a new kind of encryption that should be nearly impossible to break. The photon data transmission allows for the use of 4D quantum encryption to carry the keys needed to encrypt and decrypt packets, meaning that every data packet could use a different encryption scheme.

Generations Matter

Nielsen recently published its Total Audience Report for the first quarter of 2017. It's the best evidence that I've seen yet that there is a huge difference between generations when it comes to video viewing habits. Compared to most surveys that look at a few thousand people, these statistics are based on almost 300,000 households.

The report examined in detail the viewing habits of the different US generations – Generation Z (ages 2 – 20), Millennials (ages 21 – 37), Generation X (ages 38 – 52), Baby Boomers (ages 53 – 70) and the Greatest Generation (ages 71+). What might surprise a lot of people is that Generation Z and the Millennials together now make up 48% of the US population – and that means their viewing habits are rapidly growing in importance to the cable TV industry.

The report outlines how the various generations own or use various devices or services. But note that these responses represent the entire household. So, for example, when Nielsen sought answers from somebody in Generation Z it's likely that the answers represent what is owned by their parents, who are likely Millennials or in Generation X. Here are a few interesting statistics:

  • The broadband penetration rate between generations is about the same, ranging from 82% to 85% of households. It wasn’t too many years ago when the baby boomer households lagged in broadband adoption.
  • There is a significant difference in the use of OTT services like Netflix. 73% of Generation Z households subscribe to an OTT service, but only 51% of baby-boomer-only households do.
  • Baby boomers also lag in smartphone adoption at 86% with the younger generations all between 95% and 97% adoption.
  • Baby boomers also lag in the adoption of an enabled smart TV (meaning it’s connected to the web). 28% of baby boomers have an enabled smart TV while younger households are at about 39%.

The biggest difference highlighted in the report is the daily time spent using various entertainment media that includes such things as TV, radio, game consoles, and surfing the Internet.

The big concern for the cable industry is the time spent watching cable content. For example, the average monthly TV viewing for those over 65 is 231 hours of live TV and 34 hours of time-shifted TV. But for people aged 12-17 it is only 60 hours live and 10 hours time-shifted. For ages 18-24 it's 72 hours live and 12 hours time-shifted. For ages 25-34 it's 101 hours live and 19 hours time-shifted. This is probably the best proof I've seen of how much less invested younger generations are in traditional TV.
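Totaling the live and time-shifted figures makes the generational gap concrete. The numbers below are the Nielsen figures just quoted:

```python
# Monthly TV hours (live + time-shifted) from the Nielsen figures above.
monthly_tv_hours = {
    "65+":   231 + 34,   # 265 hours
    "12-17":  60 + 10,   #  70 hours
    "18-24":  72 + 12,   #  84 hours
    "25-34": 101 + 19,   # 120 hours
}

# Those over 65 watch 181 more hours per month than the 18-24 group -
# roughly six extra hours of traditional TV every single day.
gap = monthly_tv_hours["65+"] - monthly_tv_hours["18-24"]
print(gap)
```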

This drastic difference for TV stands out because for other kinds of media there is not such a stark difference. For example, those over 65 spend about 67 hours per month using apps on smartphones while those 18-24 use 77 hours and those 25-34 use 76 hours.

There even wasn’t a drastic difference in the number of hours spent monthly watching video on a smartphone with those over 65 watching 2 hours per month compared to 7 hours for those 18-24 and 6 hours for those 25-34.

The only other media with a stark difference is video game consoles with those over 65 using 13 hours per month while those 18-24 use 49 hours per month. Other things like listening to the radio or using a multimedia device (like Roku or Apple TV) are similar across generations.

The drastic difference in TV viewing has serious repercussions for the industry. For example, TV is no longer a medium to be used to reach those aged 18-24 since they watch TV over 180 hours less per month than those over 65. We’re seeing a big shift in advertising dollars and during the last year the amount spent on web advertising surpassed TV advertising for the first time. When you trend this forward a decade it spells bad news for the broadcasting and cable industries. For many years there was a big hope that as people get older that they would revert to the usage patterns of their parents. But the evidence shows that the opposite seems to be true – that kids keep their viewing habits as they grow older.

When you compare this report to earlier ones, it’s obvious that the difference between generations is widening. Just since 2016, those over 65 are watching more TV each month while the youngest generations are cutting back; Generation Z watches 15 minutes less TV per day than they did in 2016.

The Future of AT&T and Verizon

The cellphone companies have done such a great job of getting everybody to purchase a smartphone that cellular service in the country is quickly turning into a commodity. And, as is typical with most commodity products, that means less brand loyalty from customers and lower market prices for the products.

We’ve recently seen the cellular market turn toward commodity status. In the first quarter of this year the cellular companies had their worst performance since the industry began. Both AT&T and Verizon posted net losses of post-paid customers for the quarter, T-Mobile added fewer customers than expected, and Sprint continued to lose money.

This is a huge turnaround for an industry where the big two cellular companies were each making over $1 billion per month in profits. The change in the industry comes from two things. First, people are now shopping for lower prices and are ready to change carriers to get lower monthly bills. The trend for lower prices was started by T-Mobile to gain market share, but low prices are also being pushed by cellular resellers – being fed by the big carriers. The cellular industry is only going to get more competitive when the cable companies soon enter the market. That will provide enough big players to make cellular minutes a true commodity. The cable companies have said they will be offering low prices as part of packages aimed at making customers stickier and will put real price pressure on the other cellular providers.

The second factor, which accounted for most of the first-quarter downturn, was the rush by all of the carriers to sell ‘unlimited’ data plans – which, as I’ve noted in earlier blogs, are really not unlimited. These plans offer lower prices for data and free consumers to use their smartphones without fear of big overage fees. Again, this move was started by T-Mobile, but it was also driven heavily by public demand. AT&T and Verizon recognized that if they didn’t offer this product set they would start bleeding customers to T-Mobile.

It will be really interesting to watch what happens to AT&T and Verizon, which are now predominantly cellular companies that also happen to own landline networks. The vast majority of their revenues comes from cellular. When I looked at both of their annual reports last year I had a hard time finding evidence that these companies were even in the landline network business; discussions of those business lines are buried deep within the reports.

These companies obviously need to find new forms of revenues to stay strong. AT&T is tackling this for now by going in a big way after the Mexican market. But one only has to look down the road a few years to see that Mexico and any other cellular market will also trend towards commoditization.

Both companies have their eyes on the same potential growth plays:

  • Both are making the moves necessary to tackle the advertising business. They look at the huge revenues being made by Facebook and Google and realize that as ISPs they are sitting on customer data that could make them major players in the targeted marketing space. Ad revenues are the predominant revenue source at Google and if these companies can grab even a small slice of that business they will make a lot of money.
  • Both are also chasing content. AT&T’s bid for the purchase of Time Warner is still waiting for government approval. Verizon has made big moves with the purchases of AOL and Yahoo and is rumored to be looking at other opportunities.
  • Both companies have been telling stockholders that there are huge amounts of money to be made from the IoT. These companies want their cellular networks to be the default networks for collecting data from IoT devices. They certainly ought to win the business for things like smart cars, but there will be a real battle between cellular and WiFi/landline connections for most other IoT usage.
  • Both companies are making a lot of noise about 5G. They are mostly concentrating on high-speed wireless connections using millimeter wave spectrum that they hope will make them competitive with the cable companies in urban areas. But even that runs a risk because if we see true competition in urban areas then prices for urban broadband might also tumble. And that might start the process of making broadband into a commodity. On the cellular side it’s hard to think that 5G cellular won’t quickly become a commodity as well. Whoever introduces faster cellphone data speeds might get a bump upward for a few years, but the rest of the industry will certainly catch up to any technological innovations.

It’s hard to foresee any business line where AT&T and Verizon are going to get the same monopoly power that they held in the cellular space for the past few decades. Everything they might undertake is also going to be available to competitors, meaning they are unlikely to make the same kind of huge margins they have historically made with cellular. No doubt they are both going to be huge companies for many decades to come since they own the cellular networks and spectrum. But I don’t think we can expect them to be the cash cows they have been in the past.

White Space Spectrum for Rural Broadband – Part II

Word travels fast in this industry, and in the last few days I’ve already heard from a few local initiatives that have been working to get rural broadband. They’re telling me that the naysayers in their communities are now pushing them to stop working on a broadband solution since Microsoft is going to bring broadband to rural America using white space spectrum. Microsoft is not going to be doing that, but some of the headlines could make you think they are.

Yesterday I talked about some of the issues that must be overcome to make white space spectrum viable. It’s certainly no slam dunk that the spectrum will be usable for unlicensed service under the FCC’s spectrum plan. And as we’ve seen in the past, it doesn’t take much uncertainty for a spectrum launch to fall flat on its face, something that has happened several times in recent decades.

With that in mind, let me discuss what Microsoft actually said in both their blog and whitepaper:

  • Microsoft will partner with telecom companies to bring broadband by 2022 to 2 million of the 23.4 million rural people that don’t have broadband today. I have to assume that these ‘partners’ are picking up a significant portion of the cost.
  • Microsoft hopes their effort will act as a catalyst for this to happen in the rest of the country. Microsoft is not themselves planning to fund or build to the remaining rural locations. They say that it’s going to take some combination of public grants and private money to make the numbers work. I just published a blog last Friday talking about the uncertainty of having a federal broadband grant program. Such funding may or may not ever materialize. I have to wonder where the commercial partners are going to be found who are willing to invest the $8 billion to $12 billion that Microsoft estimates this will cost.
  • Microsoft only thinks this is viable if the FCC follows their recommendation to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has been favoring creating just one channel of unlicensed spectrum per market. The cellular companies that just bought this spectrum are screaming loudly to keep this at one channel per market. The skeptic in me says that Microsoft’s white paper and announcement is a clever way for Microsoft to put pressure on the FCC to free up more spectrum. I wonder if Microsoft will do anything if the FCC sticks with one channel per market.
  • Microsoft admits that for this idea to work that manufacturers must mass produce the needed components. This is the classic chicken-and-egg dilemma that has killed other deployments of new spectrum. Manufacturers won’t commit to mass producing the needed gear until they know there is a market, and carriers are going to be leery about using the technology until there are standardized mass market products available. This alone could kill this idea just as the FCC’s plans for the LMDS and MMDS spectrum died in the late 1990s.

I think it’s also important to discuss a few important points that this whitepaper doesn’t talk about:

  • Microsoft never mentions the broadband speeds the technology can deliver. The whitepaper does talk about reaching homes up to about 10 miles from a given tower. One channel of white space spectrum can deliver about 30 Mbps up to 19 miles in a point-to-point radio shot. From what I know of the existing trials, these radios can deliver speeds of around 40 Mbps at six miles in a point-to-multipoint network, with less speed as the distance increases. Microsoft wants multiple channels in a market because bonding multiple channels could greatly increase speeds, perhaps to 100 Mbps. Even with one channel this is great broadband for a rural home that’s never had broadband. But the laws of physics mean these radios will never get faster, and those will still be the speeds offered a decade or two from now, when they are going to feel like slow DSL does today. Too many broadband technology plans fail to recognize that our demand for broadband has been doubling every three years since 1980. What’s a pretty good speed today can become inadequate in a surprisingly short time.
  • Microsoft wants to be the company that operates the wireless databases behind this and other spectrum bands. That gives them a profit motive to spur the use of these bands. There is nothing wrong with wanting to make money, but this is not a 100% altruistic offer on their part.
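The doubling trend mentioned above compounds quickly. Here’s a hypothetical projection assuming a constant three-year doubling period (my simplification of the historical trend, not a figure from the whitepaper):

```python
def projected_demand(current_mbps, years, doubling_period=3):
    """Project bandwidth demand assuming it doubles every `doubling_period` years."""
    return current_mbps * 2 ** (years / doubling_period)

# If 100 Mbps (bonded white space channels) meets demand today,
# here's the demand a household might have down the road:
for years in (3, 6, 9, 12):
    print(years, "years:", round(projected_demand(100, years)), "Mbps")
```

Under this assumption a 100 Mbps connection that feels generous today corresponds to a demand of 1,600 Mbps twelve years out, which is the sense in which fixed-speed radios "feel like slow DSL" within a decade or two.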

It’s hard to know what to conclude. Microsoft is certainly not bringing broadband to all of rural America, but it sounds like they are willing to help make it happen. Still, we can’t ignore the huge hurdles that must be overcome to realize the vision painted in the whitepaper.

  • First, the technology has to work, and the interference issues I discussed in yesterday’s blog need to be solved before anybody will trust using this spectrum on an unlicensed basis. Nobody will use the spectrum if unlicensed users constantly get bumped off by licensed ones. The trials done to date were not conducted in a busy spectrum environment.
  • Second, somebody has to be willing to fund the $8B to $12B Microsoft estimates this will cost. There may or may not be any federal grants ever available for this technology, and there may never be commercial investors willing to spend that much on a new technology in rural America. The fact that Microsoft thinks this needs grant funding tells me that a business plan based upon this technology might not stand on its own.
  • Third, the chicken-and-egg issue of getting over the hurdle to have mass-produced gear for the spectrum must be overcome.
  • Finally, the FCC needs to adopt Microsoft’s view that there should be three unlicensed channels available everywhere – something the license holders are strongly resisting. And from what I see of the current FCC, there is a good chance they will side with the big cellular companies.

White Space Spectrum for Rural Broadband – Part I

Microsoft has announced that they want to use white space spectrum to bring broadband to rural America. In today’s and tomorrow’s blogs I’m going to discuss the latest thinking on white space spectrum. Today I’ll cover the hurdles that must be overcome to use the spectrum, and tomorrow I’ll look in more detail at what I think Microsoft is really proposing.

The spectrum known as white space has historically been used for over-the-air television broadcasts. In the recent incentive auction the FCC got a lot of TV stations to migrate their signals elsewhere to free up this spectrum for broadband uses. And in very rural America much of this spectrum has sat unused for decades.

Before Microsoft or anybody can use this spectrum on a widespread basis the FCC needs to determine how much of the spectrum will be available for unlicensed use. The FCC has said for several years that they want to allocate at least one channel of the spectrum for unlicensed usage in every market. But Microsoft and others have been pushing the FCC to allocate at least three channels per market and argue that the white space spectrum, if used correctly, could become as valuable as WiFi. It’s certainly possible that the Microsoft announcement was aimed at putting pressure on the FCC to provide more than one channel of spectrum per market.

The biggest issue that the FCC is wrestling with is interference. One of the best characteristics of white space spectrum is that it can travel great distances. The spectrum passes easily through things that kill higher frequencies. I remember as a kid being able to watch UHF TV stations in our basement that were broadcast from 90 miles away from a tall tower in Baltimore. It is the ability to travel significant distances that makes the spectrum promising for rural broadband. Yet these great distances also exacerbate the interference issues.

Today the spectrum has numerous users. There are still some TV stations that did not abandon the spectrum. There are two bands used for wireless microphones. There was a huge swath of this spectrum just sold to various carriers in the incentive auction that will probably be used to provide cellular data. And the FCC wants to create the unlicensed bands. To confound things, the mix between the various users varies widely by market.

Perhaps the best way to understand white space interference issues is to compare the spectrum to WiFi. One of the best characteristics (and many would say the worst characteristic) of WiFi is that it allows multiple users to share the bandwidth at the same time. The multiple uses cause interference, so no user gets full use of the spectrum, but this sharing philosophy is what made WiFi so popular: except in the most crowded environments, anybody can create an application using WiFi and know that in most cases the bandwidth will be adequate.

But licensed spectrum doesn’t work that way, and the FCC is obligated to protect all spectrum license holders. The FCC has proposed to solve the interference issues by requiring that radios be equipped so that unlicensed users first dynamically check to make sure there are no licensed uses of the spectrum in the area. If they sense a licensed use they cannot broadcast, or, if already broadcasting, they must abandon the channel.

This would all be done using a database that identifies the licensed users in any given area, along with radios that can search for licensed usage before making a connection. This sort of frequency scheme has never been tried before. Rather than sharing spectrum, as with WiFi, unlicensed users will only be allowed to use the spectrum when no licensed use is present. As you can imagine, the licensed cellular companies, which just spent billions for this spectrum, are worried about interference. But there are also concerns from churches, city halls, and musicians who use wireless microphones.
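The database-driven scheme can be sketched in a few lines. This is purely illustrative; the function names and the database contents are invented, and a real white space database involves geolocation, device registration, and protected-contour calculations far beyond this:

```python
# Hypothetical sketch: an unlicensed radio consults a database of licensed
# users before transmitting, and stays silent if every channel is taken.

LICENSED_USES = {
    # (channel, area) -> licensed user; entries invented for illustration
    (21, "county_seat"): "cellular carrier",
    (27, "county_seat"): "wireless microphones",
}

def channel_is_free(channel, area):
    """Return True if the database shows no licensed use of this channel here."""
    return (channel, area) not in LICENSED_USES

def select_channel(candidate_channels, area):
    """Pick the first channel with no licensed user, or None if all are taken."""
    for ch in candidate_channels:
        if channel_is_free(ch, area):
            return ch
    return None  # the unlicensed device must not broadcast at all

print(select_channel([21, 27, 33], "county_seat"))  # 33 is free
print(select_channel([21, 27], "county_seat"))      # None: stay silent
```

The key design point is visible even in this toy version: unlike WiFi, where everybody transmits and tolerates interference, the unlicensed user here gets nothing at all when licensed users occupy every channel, which is exactly why a single unlicensed channel per market worries proponents.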

It seems unlikely to me that unlicensed white space spectrum will be very attractive in an urban area with a lot of usage on the spectrum. If it’s hard to make or maintain an unlicensed connection, then nobody is going to try to use the spectrum in a crowded-spectrum environment.

The question that has yet to be answered is whether this kind of frequency plan will work in rural environments. There have been a few trials of the spectrum over the past five years, but those tests proved the viability of the spectrum for providing broadband and did not test the databases or the interference issues in a busy spectrum environment. We’ll have to see what happens in rural America once the cellular companies start using the spectrum they just purchased. Because of the great distances over which the spectrum is viable, I can imagine a scenario where licensed white space use in a county seat makes it hard to use the spectrum in the adjoining rural areas.

And like any new spectrum, there is a chicken-and-egg situation with the wireless equipment manufacturers. They are not likely to commit to making huge amounts of equipment, which is what would make this affordable, until they know it’s really going to work in rural areas. And we might not know whether it works in rural areas until there have been mass deployments. This same dilemma largely sank the LMDS and MMDS spectrum fifteen years ago.

The white space spectrum has huge potential. One channel can deliver 30 Mbps to the horizon on a point-to-point basis. But there is no guarantee that the unlicensed use of the spectrum is going to work well under the frequency plan the FCC is proposing.