Big Companies and Telecommuting

One of the biggest benefits most communities see when they first get good broadband is the ability for people to telecommute or work from home. Communities that get broadband for the first time report that this is one of the most visible changes in the community and that soon after getting broadband almost every street and road has somebody working from home.

CCG is a great example of telecommuting; our company went virtual fifteen years ago. The main thing that sent us home in those days was that residential broadband was better than what we could get at the office. All of our employees could get 1 – 2 Mbps broadband at home, which was also the only speed available at our offices over a T1. But we found even in those early days that a T1 was not enough speed to share among multiple employees.

Telecommuting really picked up at about the same time that CCG went virtual. I recall that AT&T was an early promoter of telecommuting, as was the federal government. At first these big companies let employees work at home a day or two a week as a trial. But that worked out so well that over time big organizations felt comfortable with people working out of their homes. I’ve seen a number of studies showing that telecommuting employees are more productive than office employees and work longer hours – due in part to not having to commute. Telecommuting has become so pervasive that there was a cover story in Forbes in 2013 announcing that one out of five American workers worked at home.

Another one of the early pioneers in telecommuting was IBM. A few years ago they announced that 40% of their 380,000 employees worked outside of traditional offices. But last week the company announced that they were ending telecommuting. They told employees in many of their major divisions like Watson development, software development and digital marketing and design that they must move back into a handful of regional offices or leave the company.

The company has seen decreasing revenues for twenty straight quarters and there is speculation that this is a way to reduce their workforce without having to go through the pain of choosing who will leave. But what is extraordinary about this announcement is how rare it is. IBM is only the second major company to end telecommuting in recent memory, the other being Yahoo in 2013.

Both IBM and Yahoo were concerned about earnings and that is probably one of the major reasons that drove their decision to end telecommuting. It seems a bit ironic that companies would make this choice when it’s clear that telecommuting saves money for the employer – something IBM crowed about earlier this year.

Here are just a few of the major findings from studies of the benefits of telecommuting. It improves employee morale and job satisfaction. It reduces attrition as well as sick and unscheduled leave. It saves companies on office space and overhead costs. It reduces discrimination by judging people by personality and talent rather than race, age or appearance. It increases productivity by eliminating unneeded meetings and because telecommuters work more hours than office workers.

But there are downsides. It’s hard to train new employees in a telecommuting environment. One of the most common ways to train new people is to have them spend time with somebody more experienced – something that is difficult with telecommuting. Telecommuting makes it harder to brainstorm ideas, something that benefits from live interaction. And possibly the biggest drawback is that telecommuting isn’t for everybody. Some people cannot function well outside of a structured environment.

As good as telecommuting is for companies it’s even better for smaller and rural communities. A lot of people want to live in the communities they grew up in, around friends and family. We’ve seen a brain drain from rural areas for decades as kids graduate from high school or college and are unable to find meaningful work. But telecommuting lets people live where there is broadband. Many communities that have had broadband come to town report that they see an almost instant uptick in housing prices and demand for housing. And part of that increased demand is from those who want to choose a community rather than follow a job.

One of the more interesting projects I’ve worked on with the telecommuting issue was when I helped the city of Lafayette, Louisiana get a fiber network. Lafayette is not a rural area but a thriving mid-size city, and yet one of the major reasons the residents wanted fiber was the chance to keep their kids at home. The area is largely Cajun with a unique culture and the community was unhappy to see their children have to relocate to larger cities to get jobs after graduating from the university there. Broadband alone can’t fix that kind of problem, but Lafayette is reportedly happy with the changes brought by the fiber network. That’s the kind of benefit that’s hard to quantify in dollar terms.

Net Neutrality and the Digital Divide

There is an interesting idea floating around the industry that is bound to annoy fans of net neutrality. The idea comes from Roslyn Layton who does telecom research at Aalborg University in Denmark. She served on the FCC Transition team for the new administration.

She envisions zero-rating as the best way to solve the digital divide and to finally bring Internet access to everybody. She says that after decades of not finding any other solution, this might be the only reasonable path to get Internet access to people who can’t afford a monthly subscription.

The idea is simple – there are companies who will provide an advertising-driven broadband connection for free to customers, particularly on a cellphone. It’s not hard to envision big companies like Facebook or Google sponsoring cellphone connections and providing data access to customers who would be a captive audience for their ads and content.

This idea is already working elsewhere. Facebook offers this same service in other countries today under the brand name “Free Basics.” While it certainly costs Facebook to buy the wholesale data connections, they must have done the math and figured that having a new customer on their platform is worth more than the cost. Facebook’s stated goal is to serve most of the billions of people on earth and this is a good way to add a lot of customers. With Free Basics customers get full use of the Facebook platform along with the basic ability to surf the web. However, the free basic service does not allow a user to freely watch streaming video or to do other data-intensive activities that are not part of the Facebook universe – it’s not an unlimited data plan. I can remember similar products in the US back in the dial-up days when several dial-up providers gave free connections as long as the customers didn’t mind being bombarded by ads.

There are certainly upsides to this. Such a service would provide enough bandwidth for people to use the web for the basics like hunting for a job or doing school work. And users would get unlimited use of the Facebook platform for functions such as messaging or watching Facebook-sponsored video and content. There are still a substantial number of people in the US who can’t afford a broadband subscription and this would provide a basic level of broadband to anybody willing to deal with the ad-heavy environment.

But there are downsides. This idea violates net neutrality. Even if the current FCC does away with net neutrality one has to think that a future FCC will institute something similar. But even with net neutrality rules in place the FCC could make an exception for a service that tackles the digital divide.

The real downside is that this is not the same as the real internet access that others enjoy. Users would be largely trapped inside whatever platform sponsors their product. That could be Facebook or Google, but it could also be an organization with a social or political agenda. Anybody using this kind of free platform would have something less than unfettered Internet access, and they would be limited to whatever the platform sponsor allows them to see or do outside the base platform. At best this could be called curated Internet access, but realistically it’s a platform to give sponsors unlimited access to users.

But I think we have to be realistic that nobody has yet found a solution to the digital divide. The FCC’s Lifeline program barely makes a dent in it. And I’m not aware of any major ISP who has ever found any mechanism to solve the digital divide issue.

While Facebook offers this in many countries around the globe, they received massive pushback when they tried to bring it to India. The Indian government did not want a class of people given a clearly inferior class of Internet connectivity, and the government there is working hard itself to solve the digital divide. Meanwhile, there is nobody in the US giving the issue any more than lip service. The issue has been with us since the dial-up days and there has been little progress in the decades since then.

I read some persuasive articles a few years ago when the net neutrality debate was being discussed about this kind of product. There were arguments made that there would be long-term negative ramifications from having a second-class kind of Internet access. The articles worried about the underlying sponsors heavily influencing people with their particular agenda.

But on the flip side, somebody who doesn’t have broadband access probably thinks this is a great idea. It’s unrealistic to think that people have adequate broadband access when they can only get it at the library or a coffee shop. For broadband to benefit somebody it needs to be available when and where they need to use it.

I lean towards thinking this is an idea worth trying. I would hope that there would be more than one or two companies willing to sponsor this, in which case any provider who is too obnoxious or restrictive would not retain customers. People who go to sites like Facebook today are already voluntarily subjected to ads, so this doesn’t seem like too steep a price to pay to get more people connected to the Internet.

The Myth of OTT Savings

One of the reasons touted in the press for the recent popularity of cord cutting is the desire of people to save money over a traditional cable TV subscription. But as I look at what’s popular on the web I wonder if the savings are really going to be there for people who like to watch a variety of the best content.

There has been an explosion of companies that are pursuing unique video content, and this means that great content can now be found at many different places on the web. Interestingly, most of this great content is not available on traditional TV, other than the content provided by the premium movie channels. But consider the following web platforms that are creating unique content:

  • Netflix. They are the obvious king of unique content and release new shows, specials, movies and documentaries seemingly weekly. And they seem to have a wide variety of content aimed at all demographics.
  • Hulu. They are a bit late to the game. But the newly released The Handmaid’s Tale is getting critical acclaim and will be part of a quickly growing portfolio of unique content.
  • HBO. HBO has always had a few highly popular series with Game of Thrones still drawing huge audiences.
  • CBS All-Access. CBS has made a bold move by offering the new series Star Trek: Discovery only online. It’s bound to draw a lot of customers to the online service.
  • Amazon Prime. The company says they are going to invest billions in unique programming and are aiming at overtaking Netflix. Their recent hit The Man in the High Castle is evidence of the quality programming they are pursuing.
  • Showtime. They have historically created limited amounts of unique content but are now also looking to create a lot more. Their new show Twin Peaks has come out with high reviews.
  • Starz. This network is also now chasing new content and has a hit series with American Gods.
  • Seeso. Even services that most people have never heard of, such as Seeso, are creating popular content such as the comedy series My Brother, My Brother and Me.
  • YouTube Red. The industry leader of unique content is YouTube which has allowed anybody to create content. While most of this is still free, the platform is now putting a lot of great content such as the comedy Rhett and Link’s Buddy System behind a paywall.

Subscribing to the above online services with the minimum subscriptions costs $79 per month (and that’s without figuring in the annual cost of Amazon Prime, which most people buy because of the free shipping from Amazon). The above line-up doesn’t include any sports and you’d have to buy a $30 subscription from Sling TV to watch ESPN and a few other popular sports networks. ESPN recently announced that they still don’t have any plans to launch a standalone web product but are instead pursuing being included in the various skinny bundles.
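For readers who want to check the arithmetic, here is a minimal sketch of how a monthly total like that stacks up. The per-service prices are illustrative assumptions (roughly 2017-era entry-level pricing), not the exact figures behind the $79 quoted above, so substitute current prices before drawing any conclusions.

```python
# Illustrative only: assumed entry-level monthly prices in the 2017-era ballpark,
# not quotes from the providers. Swap in current prices before drawing conclusions.
ott_services = {
    "Netflix": 9.99,
    "Hulu": 7.99,
    "HBO Now": 14.99,
    "CBS All-Access": 5.99,
    "Showtime": 10.99,
    "Starz": 8.99,
    "Seeso": 3.99,
    "YouTube Red": 9.99,
}
sling_tv_sports = 30.00  # optional add-on to get ESPN and other sports networks

base_total = sum(ott_services.values())
print(f"Unique-content services only: ${base_total:.2f} per month")
print(f"With a sports bundle added:   ${base_total + sling_tv_sports:.2f} per month")
```

Even with these assumed prices the total lands in the same neighborhood as the figure above, and once sports is added it rivals the price of a mid-tier traditional cable package.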

Not considered, though, in the above list are numerous other less-known paid OTT subscriptions available on line. As listed in this recent blog there are dozens of other platforms for people who like specialized content like Japanese anime or British comedies.

Of course, one thing the above list shows is that there is a world of content these days that is not being created by the major networks or the traditional cable networks. There is likely more money pouring into the creation of content outside of the traditional networks.

So OTT doesn’t seem to save as much as hoped for people who wish to enjoy a variety of popular content across different providers. But there are other benefits driving people to OTT programming. One of the great benefits is the ability to subscribe to and cancel services at will. I have been trying various OTT networks and it’s really tempting to subscribe to each for a month or two until you’ve seen what you want and then move on to something else. I’m starting to think that’s the way I will use these services as long as they continue to allow easy entry and exit.

And OTT programming allows for non-linear TV watching. As long as somebody lives near a metropolitan area, a cord cutter can still view the traditional network channels using rabbit ears. But what a lot of cord cutters are finding is that they quickly lose their tolerance for linear programming. I know that when I travel and have a TV available in the room, I only watch it if I want to catch a football or basketball game. I can no longer tolerate the commercial breaks or the inability to pause linear TV when I want to go do something else. And that, perhaps more than anything, is what will bring down traditional cable TV. As much as cable companies tout TV Everywhere, their basic product is still showing content linearly at fixed times. There is such a huge volume of great OTT content available any time on any device that it’s not hard for somebody to walk away from the traditional networks and still always have something they want to watch.

The Future of WiFi

There are a lot of near-term improvements planned for WiFi. The IEEE 802.11 Working Group, which develops the standards that underlie WiFi, has a number of improvements in the works. Many, but not all, of the improvements look at the future of using the newly available millimeter wave spectrum.

It’s been twenty years since the first WiFi standard was approved. I remember how great it felt about fifteen years ago when Verizon gave me a WiFi modem as part of my new FiOS service. Up until then my computing had always been tied to cables and it was so freeing to use a laptop anywhere in the house (although that first generation WiFi didn’t do a great job of penetrating the plaster walls in my old house).

Here are some of the improvements being considered:

802.11ax. The goal of this next-gen WiFi is to enable speeds up to 10 Gbps using the 5 GHz band of unlicensed WiFi spectrum. The standard also seeks to provide more bandwidth in the 2.4 GHz band. The developing standard is looking at the use of Orthogonal Frequency Division Multiple Access (OFDMA), multi-user MIMO and other technology improvements to squeeze more bandwidth out of the currently available WiFi frequencies.

Interestingly, this standard only calls for an improvement of about 37% over today’s 802.11ac technology, but the various improvements in the way the spectrum is used will hopefully mean roughly a fourfold increase in delivered bandwidth.

Probably the biggest improvement with this standard is the ability to connect efficiently to a greater number of devices. At first this will make 802.11ax WiFi more useful in crowded environments like stadiums and other public places. But the real benefit is to make WiFi the go-to spectrum for the Internet of Things. There is a huge race going on between WiFi and cellular technologies to grab the majority of that exploding market. For now, for indoor uses WiFi has the lead and most IoT devices today are WiFi connected. But today’s WiFi networks can get bogged down when there are too many simultaneous requests for connections. We’ll have to wait to see if the changes to the standards improve WiFi enough to keep it ahead in the IoT race.

Of course, the 10 Gbps speed is somewhat theoretical in that it assumes all of the bandwidth is delivered to one device located close to the transmitter – but the overall improvement in bandwidth promises to be dramatic. This new standard is expected to be finalized by 2019, but there will probably be new hardware that incorporates some of the planned upgrades by 2018.
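For the curious, here is a back-of-the-envelope sketch showing where a headline number like 10 Gbps comes from, using the standard OFDM data-rate formula and the commonly cited top-end 802.11ax parameters (a 160 MHz channel, 1024-QAM, 5/6 coding and 8 spatial streams). These are published maximums; real-world throughput will be far lower.

```python
# Back-of-the-envelope peak PHY rate for 802.11ax using the usual OFDM formula:
# rate = data_subcarriers * bits_per_subcarrier * coding_rate * spatial_streams / symbol_time

data_subcarriers = 1960          # data tones in a 160 MHz 802.11ax channel
bits_per_subcarrier = 10         # 1024-QAM carries 10 bits per subcarrier
coding_rate = 5 / 6              # highest coding rate defined by the standard
spatial_streams = 8              # maximum number of spatial streams
symbol_time = 12.8e-6 + 0.8e-6   # 12.8 us OFDM symbol plus the shortest 0.8 us guard interval

peak_rate_bps = (data_subcarriers * bits_per_subcarrier * coding_rate
                 * spatial_streams) / symbol_time
print(f"Theoretical peak PHY rate: {peak_rate_bps / 1e9:.1f} Gbps")  # about 9.6 Gbps
```

That peak rate is shared among all users on a channel, which is why the per-device gains from OFDMA and multi-user MIMO matter more in practice than the headline speed.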

802.11ay. 802.11ay is the successor to 802.11ad, which never got any market traction. These two standards utilize the 60 GHz spectrum and are intended to deliver large amounts of bandwidth over short distances, such as inside a room. This new standard promises to improve short-range bandwidth to as much as 20 Gbps, roughly a threefold improvement over 802.11ad. The new standard might face the same market acceptance issues if most users are satisfied instead with 802.11ax. The primary improvement over 802.11ad is the addition of MIMO antennas supporting up to four simultaneous data streams.

802.11az. The two improvements discussed above are aimed at improving bandwidth to WiFi users. The 802.11az standard instead looks at ways to improve the location and positioning of users on a WiFi network. Since many of the improvements in WiFi use MIMO (multiple input, multiple output) antennas, system performance is improved significantly if the WiFi router can accurately and quickly keep track of the precise location of each user on the network. That’s a relatively simple task in a static environment when talking to fixed-location devices like a TV or appliances, but much harder to do with mobile devices like smartphones and tablets. Improvements in locating technology allow a WiFi network to more quickly track and connect to a device without having to waste frequency resources finding the device before each transmission.

The other big improvement promised by this standard is increased energy efficiency of the network. As the network becomes adroit at identifying and remembering the location of network devices, the standard allows WiFi devices to shut down, go to sleep and drop off the network when not in use, saving energy for devices like IoT sensors. The WiFi hub and sensor devices can be ‘scheduled’ to connect at fixed times, allowing devices to save power by sleeping between connections.
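The power-saving argument is easy to quantify with a simple duty-cycle calculation. The current-draw numbers below are hypothetical, chosen only to show the order of magnitude of savings when a sensor sleeps between scheduled connections rather than staying associated all the time.

```python
# Hypothetical sensor current draw; the values are illustrative, not from any datasheet.
ACTIVE_MA = 120.0   # radio awake and transmitting
SLEEP_MA = 0.02     # deep sleep between scheduled wake-ups

def average_current_ma(awake_seconds: float, interval_seconds: float) -> float:
    """Average current for a device awake for awake_seconds out of every interval_seconds."""
    sleep_seconds = interval_seconds - awake_seconds
    return (ACTIVE_MA * awake_seconds + SLEEP_MA * sleep_seconds) / interval_seconds

always_on = average_current_ma(60.0, 60.0)   # never sleeps
scheduled = average_current_ma(0.5, 60.0)    # wakes for half a second each minute
print(f"Always-on average draw:  {always_on:.2f} mA")
print(f"Scheduled-wake average:  {scheduled:.2f} mA")
print(f"Rough battery-life gain: {always_on / scheduled:.0f}x")
```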

These changes are necessary to keep WiFi useful and relevant. The number of devices connected to WiFi is expected to continue to grow at exponential rates, and today’s WiFi can bog down under heavy use, as anybody who tries to use WiFi in a business hotel understands. But a lot of the problems with today’s WiFi can be fixed with the combination of faster data throughput and tweaks that reduce the problems caused by interference among devices trying to gain the attention of the hub modem. The various improvements planned by the IEEE Working Group address all of these issues.

Two Views on Skinny Bundles

The industry is abuzz this year with talk about skinny bundles. But there is a lot of disagreement about whether skinny bundles are really going to be effective and if they will put a serious dent in the pay-TV market. Today I look at opposing views from two major players in the industry.

First are the recent statements by David Zaslav, the CEO of Discovery Communications. He says the skinny bundles we see in the US are not really ‘skinny’ and are instead just another way to package traditional programming. He says that Discovery sells programming around the world and that in almost 200 other worldwide cable markets there are true skinny bundles that cost between $8 and $12 per month. He says these bundles are popular and give people a real alternative to the big cable bundles.

By contrast, all of the major skinny bundles on the market today in the US are priced at $30 to $60 and are really just another alternative to the cable companies. The current US bundles are expensive because they include high-cost programming like sports, movie channels and major cable networks.

Zaslav’s statements are somewhat ironic since his company is one of the major programmers that drives up the size and the cost of traditional cable TV big channel line-ups. Discovery today includes a suite of 13 channels such as the Discovery Channel, TLC, Animal Planet, Science, and a host of other Discovery channels. Many of my clients are required to carry all of these channels if they want to carry any of them, and at least eight of these channels are required to be in the lower expanded basic tier where most customers have to pay for them. It’s also interesting that most of the current on-line skinny bundles in the US are not carrying the Discovery networks.

An interesting contrast to this comes from Charlie Ergen, Chairman and CEO of Dish Network. He is wildly enthusiastic about the current US skinny bundles, including his own Sling TV. He says the company first launched Sling TV to try to lure cord cutters back to a paid subscription. But the company found that it was instead taking customers away from pay-TV, including Dish Network’s own satellite customers.

He believes that the public perceives the current US skinny bundles as a real alternative to the traditional pay-TV bundle. Sling TV has done better in the market than original projections. At the end of the 1st quarter of 2017 the company had 1.3 million customers, about double where they sat just last June. The other similar subscription services from Hulu and YouTube are also doing quite well and together are carving off a noticeable slice of the traditional TV market.

But Ergen admits that his Sling TV is a replacement for traditional TV, not a wildly different alternative. A lot of customers like on-line services because they offer the ability to start and stop service at will or to add or subtract small packages of channels to the line-up as their interests change. It’s certainly possible that much of the success of these new bundles comes from consumers who are fed up with the big cable companies.

It’s also debatable whether people who move from traditional cable to Sling TV or similar services can be classified as cord cutters. They are cord cutters in that they got rid of the coaxial cable feed from the cable company, but they are still subscribing to a lot of the same channels as before, which are still broadcast at set times on a line-up.

For now it looks like the current skinny bundles are meeting with moderate success and are attracting a few million customers. They haven’t been around very long and I suspect that a lot of consumers have either never heard of them or haven’t given them any serious consideration. But you can save money with these packages while gaining the flexibility to connect and disconnect on-line at any time – avoiding those dreadful calls to cable customer service.

I know I would love to see the skinny bundles that David Zaslav describes. I imagine that each $8 – $12 bundle contains a limited number of channels. At a small size these are probably as close as anybody can get to a la carte programming. And at the end of the day that’s what a lot of cord cutters really want.

Cellular Networks and Fiber

We’ve known for a while that the future 5G that the cellular companies are promising is going to need a lot of fiber. Recently Verizon CEO Lowell McAdam verified this when he said that the company will be building dense fiber networks for this purpose. The company has ordered fiber cables as large as 1,700 strands for their upcoming build in Boston in order to support the future fiber and wireless network there. That’s a huge contrast from Verizon’s initial FiOS builds, which covered a lot of the Northeast using mostly 6-strand fibers.

McAdam believes that the future of urban broadband will be wireless and that Verizon intends to build the fiber infrastructure needed to support that future. Of course, with that much fiber in the environment the company will also be able to supply fiber-to-the-premise to those that need the largest amounts of bandwidth.

Boston is an interesting test case for Verizon. They announced in 2015 that they would be expanding their FiOS network to bring fiber to the city – one of many urban areas that they skipped during their first deployment of fiber-to-the-premise. The company also has engaged with the City government in Boston to develop a smart city – meaning using broadband to enhance the livability of the city and to improve the way the government delivers services to constituents. That effort means building fiber to control traffic systems, police surveillance systems and other similar uses.

And now it’s obvious that the company has decided that building for wireless deployment in Boston is part of that vision. It’s clear that Verizon and AT&T are both hoping for a world where most devices are wireless and that the wireless connections use their networks. They both picture a world where their wireless is not just used for cellphones like today, but will also be used to act as the last mile broadband connection for homes, for connected cars, and for the billions of devices used for the Internet of Things.

With the kind of money Verizon is talking about spending in Boston this might just become the test case for a connected urban area that is both fiber rich and wireless rich. To the extent that they can do it with today’s technology it sounds like Verizon is hoping to serve homes in the City with wireless connections of some sort.

I’ve discussed several times how millimeter wave radios have become cheap enough to be a viable alternative for bringing broadband to urban apartment buildings. That’s a business plan that is also being pursued by companies like Google. But I still am not aware of hardware that can reasonably be used with this same technology to serve large numbers of single family homes. At this point the electronics are still too expensive and there are other technological issues to overcome (such as having fiber deep in neighborhoods for backhaul).

So it will be interesting to watch how Verizon handles their promise to bring fiber to the homes in Boston. Will they continue with the promised FTTP deployment or will they wait to see if there is a wireless alternative on the horizon?

It’s also worth noting that Verizon is tackling this because of the density of Boston. The city has over 3,000 housing units per square mile, making it, and many other urban centers, a great place to consider wireless alternatives instead of fiber. But I have to contrast this with rural America. I’m working with several rural counties right now in Minnesota that have housing densities of between 10 and 15 homes per square mile.

This contrast alone shows why I don’t think rural areas are ever going to see much of the advantages of 5G. Even though it’s expensive to build fiber in a place like Boston, the potential payback is commensurate with the cost of the construction. I’ve always thought that Verizon made a bad strategic decision years ago when they halted their FiOS construction before finishing building in the metropolitan areas on the east coast, because in the markets where they did build FiOS, Verizon has fared well in its competition with Comcast and others.

But there is no compelling argument for the wireless companies or anybody else to build fiber in the rural areas. The cost per subscriber is high and the paybacks on investment are painfully long. If somebody is going to invest in rural fiber they might as well use it to connect directly to customers rather than spend the money on fiber plus a wireless network on top of it.
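A rough cost-per-home-passed comparison makes the payback problem concrete. The construction cost and road-mile figures below are purely illustrative assumptions, not numbers from any engineering study, but they show how housing density drives the economics.

```python
# Purely illustrative assumptions - substitute real engineering estimates for an actual market.
COST_PER_ROUTE_MILE = 30000.0   # assumed blended aerial/buried fiber construction cost

def cost_per_home_passed(homes_per_sq_mile: float, road_miles_per_sq_mile: float) -> float:
    """Construction cost for one square mile divided by the homes passed in it."""
    return (COST_PER_ROUTE_MILE * road_miles_per_sq_mile) / homes_per_sq_mile

urban = cost_per_home_passed(homes_per_sq_mile=3000, road_miles_per_sq_mile=20)
rural = cost_per_home_passed(homes_per_sq_mile=12, road_miles_per_sq_mile=3)
print(f"Urban cost per home passed: ${urban:,.0f}")   # on the order of a few hundred dollars
print(f"Rural cost per home passed: ${rural:,.0f}")   # thousands of dollars per home
```

Even with generous assumptions the rural figure comes out more than an order of magnitude higher per home, which is the heart of the payback problem described above.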

We are going to continue to see headlines about how wireless is the future, and for some places like Boston it might be. Past experience has shown us that wireless technology often works a lot differently in the field than in the lab, so we need to see if the wireless technologies being considered really work as promised. But even if they do, those same technologies are going to have no relevance to rural America. If anything, the explosion of urban wireless might further highlight the stark differences between urban and rural America.

Can a Small Cable Company Succeed?

Today I ask the question of whether anybody small can really succeed with a cable TV product. This was prompted by the news that Cable One, one of the mid-sized cable companies, is bleeding cable customers. For those not familiar with the company they are headquartered in Phoenix, AZ and operate cable systems in 19 states with the biggest pockets of customers in Idaho, Mississippi and Texas.

The company just reported that for the 12 months ending on March 31 they lost 12.7% of their cable customers and dropped below 300,000 total cable customers. Most of my clients would consider anybody of this size to be a large cable company. But their struggles raise the question of whether anybody smaller than the really giant cable companies can maintain a profitable and viable cable product in today’s environment.

The drop in their cable customers was precipitated by a number of factors. One that is very familiar to small cable operators is that Cable One decided in 2015 to drop the Viacom suite of channels from their system. We all remember that in that year Viacom announced huge and unprecedented rate increases of over 60% for the suite of channels that include MTV, Comedy Central, BET and a number of other channels. A number of my clients also decided to drop Viacom rather than pay for the huge increases in programming.

Cable One also shares another characteristic with smaller companies in that they are too small to unilaterally negotiate alternative packages of programming to sell as skinny bundles. So they and other small companies are likely to see customers abandoning them for smaller line-ups from Sling TV and other purveyors of on-line skinny bundles – including Hulu, which just announced entry into this quickly growing market.

And finally, Cable One and most other cable companies are now starting to feel the impact of cord cutting. While only a fraction of their customer losses can be blamed on cord cutting, it is now a real phenomenon and all cable companies can expect to lose a few percent of customers every year to Netflix and others.

The really large cable companies are not immune to these same market influences. The giants like Comcast and Charter / Spectrum are going to continue to see big increases in programming costs. Recent Comcast financials show that the company saw a 13% increase in programming cost over the last year (although some of that increase was paid to their own subsidiaries of programmers).

But the handful of giant cable companies are so big that they look like they will be able to offset losses in cable revenues and margins with new sources of revenue. For example, Comcast and Charter announced recently that they will be launching a jointly-provisioned cellular business that will help them grow revenues significantly instead of just treading water like smaller cable companies. And I’ve recently written here about all of the other ways that Comcast is still growing their business, which smaller companies are unable to duplicate.

The biggest dilemma for small cable companies is that the TV product still drives positive margin for them. While every small cable provider I know moans that they lose money on the cable product, the revenues generated from cable TV are still in excess of programming costs, and almost every company I know would suffer at the bottom line if they killed the TV product line.

It has to be troubling for programmers to see cable companies struggling this hard. If somebody the size of Cable One is in crisis then the market for the programmers is quickly shrinking to only serving the handful of giant cable companies. The consolidation of cable providers might mean that the huge cable companies will finally be able to band together to fight back against the big rate increases. Just last week Charter announced that they were demoting a number of Viacom channels to higher tiers (meaning that the channels would not automatically be included in the packages that all customers get).

It’s hard to think of another industry that is trying so hard to collectively drive away their customer base. But all of the big companies – cable providers and programmers – are all publicly traded companies that have huge pressure to keep increasing earnings. As customers continue to drop the programmers raise rates higher, which then further drives more customers to drop out of the cable market. It doesn’t take sophisticated trending to foresee a day within the next decade where cable products could become too expensive for most homes. We are all watching a slow train wreck which the industry seems to have no will or ability to stop.

What is ‘Light Touch’ Regulation?

The new FCC Chairman Ajit Pai has made several speeches in the last month talking about returning to ‘light-touch regulation’ of the big ISPs. He is opposed to using Title II to regulate ISPs and wants to return to what we had in place before that.

His argument is that the Internet has grown and thrived under the prior way that it was regulated. And he has a point – the Internet has largely been unregulated since its inception. And in many ways the industry has even received preferential regulatory treatment such as the way that Congress has repeatedly exempted broadband services from taxes.

It’s certainly hard to argue with the fact that the Internet has thrived. It’s a little harder to draw the conclusion that light regulation was the cause for this, as the Internet has primarily grown because people love the online content they find there.

But we are now at a different point in the broadband industry than we were when it was in its infancy. Consider the following:

  • The vast majority of homes now have broadband. While the industry is still adding customers, there aren’t many more households left that can get broadband but don’t have it.
  • Look back just ten years ago and there was a lot more competition for broadband. In 2007 cable modems and DSL served roughly the same number of customers with similar products in terms of speed. But today cable broadband has become a near-monopoly in most markets.
  • One of the drivers towards implementing net neutrality was the explosive growth of video. Just a few years ago there were many reports of the big ISPs slowing down Netflix and other video traffic. The ISPs were trying to force video providers to pay a premium price to gain access to their networks.
  • While broadband prices have held reasonably stable for a decade, both the cable TV and voice products of the large ISPs are under fire and it’s widely expected that the ISPs will have to start raising broadband rates every year to meet earnings expectations.
  • The ISPs have changed a lot over the last decade and all of the big ones now own content and are no longer just ISPs. This gives them competitive leverage over other competitors.
  • The Internet has become a far more dangerous place for consumers. Hacking and viruses run rampant. And the ISPs and web services like Google and Facebook routinely gather data on consumers for marketing purposes.

I would be the first to agree that hands-off regulation probably contributed to the growth of the Internet. But this is no longer the same industry and it’s hard to think that any of the big ISPs or transport providers need any further protection. These are huge companies with big profits.

It seems to me that the Chairman’s use of the term ‘light-touch regulation’ is code for basically having no regulations at all. And since that was the state of the industry just a few years ago we don’t have to stretch the imagination very far to know what that means.

Before Title II regulation the FCC had almost no power over the big ISPs. The most they could do was to encourage them to do the right thing. Interestingly, in the two or three years leading up to the Title II order it was the threat of coming regulation that kept the ISPs in line more than anything else. The FCC tried to intercede in disputes between the ISPs and video providers and found that they had no leverage on the ISPs. The FCC also didn’t like data caps but they had no power to do anything about them. However, since the ISPs feared price regulation under Title II most of them raised data cap limits to defuse the public outcry over the issue.

So my recollection of the past five years is that it was the threat of coming regulation that kept the big ISPs in line. Because at the end of the day a big ISP could challenge the FCC on broadband issues in court and win every time. So the FCC’s best way to influence the ISPs was to hold the threat of regulation over their heads.

If we go back to that same regulatory place (which is what would happen if Title II is reversed) then there will no longer be any leverage at the FCC. ISPs will be free to do almost anything they want in the broadband arena. The FCC has already let them off the hook for consumer privacy, and that is just the beginning.

You can expect without regulation that the ISPs will do all of those things that net neutrality was supposed to protect against. They all say today that will never happen, and that they believe in the core tenets of net neutrality. But I think we all know that is public relations talk and that the big ISPs will pursue anything that will make them money. That means discriminating against traffic and demanding payments from video providers for unimpeded transmission. It means the ISPs favoring their own content over the content of others. And it means a return of data caps and broadband price increases with no fear of FCC intervention. I have a hard time thinking that ‘light-touch’ means anything other than ‘no-touch.’

Ownership of Software Rights

There is an interesting fight currently at the US Copyright Office that involves all of us in the telecom industry. The argument is over ownership of the software that comes along these days with almost any type of electronics. The particular fight is between John Deere and tractor owners, but it sets a precedent for similar software anywhere.

John Deere is arguing that, while a farmer may buy one of their expensive tractors, John Deere still owns the software that operates the tractor. When a farmer buys a tractor they must agree to the terms of the software license, just like we all agree with similar licenses and terms of service all of the time. The John Deere software license isn’t unusual, but what irks farmers is that it requires them to use John Deere authorized maintenance and parts for the term of the software license (which is seemingly forever).

The fight came to a head when some farmers experienced problems with tractors during harvest season and were unable to get authorized repairs in a timely manner. Being resourceful, they found alternatives and there is now a small black market for software that can replace or patch the John Deere software. But John Deere is attacking farmers who use alternate software, saying they are violating the DMCA (Digital Millennium Copyright Act), which prohibits bypassing digital locks on copyrighted content. They argue that farmers have no right to open or modify the software on the tractors, which remains the property of John Deere. The Copyright Office is siding with John Deere.

This is not a fight unique to farmers, and many other electronics manufacturers are taking the same approach. For example, all of the major car manufacturers except Tesla have taken the same position. Apple has long taken this position with its iPhone.

So how does this impact the telecom industry? First, it seems like most sophisticated electronics we buy these days come with a separate software license agreement that must be executed as part of a purchase. So manufacturers of most of the gear you buy still think they own the proprietary software that runs your equipment. And many of them charge you yearly after buying electronics to ‘maintain’ that software. In our industry this is a huge high margin business for the manufacturers because telcos and ISPs get almost nothing in return for these annual software license fees.

I don’t think I have a client who isn’t still operating some older electronics. This may be older Cisco routers that keep chugging along, an old voice switch, or even something major like the electronics operating an entire FTTH network. It’s normal in the telecom industry for manufacturers to stop supporting most electronics within 7 to 10 years of their initial release. But unlike twenty years ago, when a lot of electronics didn’t last more than that same 7 – 10 years, the use of integrated chips means that electronics keep working a lot longer.

And therein lies the dilemma. Once a vendor stops supporting a technology they literally wash their hands of it – they no longer issue software updates, they stop stocking spare parts. They do everything in their power to get you to upgrade to something newer, even though the older gear might still be working reliably.

But if a telco or ISP makes any tweaks to this older equipment to keep it working – something many ISPs are notorious for – then theoretically anybody doing that has broken the law under the DMCA and could be subject to a fine of up to $500,000 and a year in jail for a first offense.

Of course, we all face this same dilemma at home. Almost everything electronic these days comes with proprietary software and the manufacturers of your PCs, tablets, smartphones, personal assistants, security systems, IoT gear and almost all new appliances probably think that they own the software in your device. And that raises the huge question of what it means these days to buy something, if you don’t really fully own it.

I know many farmers and I think John Deere is making a huge mistake. If another tractor company like Kubota or Massey Ferguson declares that they don’t maintain rights to the software then John Deere could see its market dry up quickly. There is also now a booming market in refurbished farm equipment that pre-dates proprietary software. But this might be a losing battle when almost everything we buy includes software. It’s going to be interesting to see how both the courts and the court of public opinion handle this.

Trends in Traditional TV

Nielsen has now been publishing quarterly reports on TV viewing habits since 2011. Comparing the latest report for the 4th quarter of 2016 to the original 2011 report shows a major decrease in the hours spent by younger Americans in watching traditional television – which is defined as the combination of both live viewing and time-delayed viewing of network television content.

The changes differ by age group and don’t paint a pretty picture for the traditional TV market:

  • Teens (12-17) watched almost 14 hours per week of television, but that’s down almost 11% from 2015 and down 38% over five years.
  • Younger Millennials (18-24) watched 15.5 hours per week of TV, which is down 39%, or about 1.5 hours per day, over five years.
  • Older Millennials (25-34) watched 22 hours per week, which is down 26.5% over five years.
  • Gen-Xers (35-49) watched almost 40 hours per week and have seen a 10% drop over five years.
  • Baby Boomers (50-64) watched 43 hours per week and have had a slight increase of 1.6% in viewing time over five years.
  • 65+ viewers watched 52 hours per week, which is up 0.6% over 2015 and up 8.4% over five years.

So what are the younger people doing other than watching traditional TV? The numbers for 18-24 year old viewers are interesting.

  • They spend 15.5 hours per week watching traditional television (including time-shifting).
  • They spend 20.8 hours watching subscription-based OTT content like Netflix or Amazon.
  • They spend another 17 hours watching something else, which includes things like DVRs, video on social media, or free web content like YouTube.
  • That’s an average of 53 hours per week, about the same amount of screen time as those over 65 watching traditional TV.

This same group also uses a variety of different screens. That includes an average of 9.2 hours per week watching video on a PC or laptop, 1.5 hours per week on a tablet and 1 hour per week on a smartphone. The rest of their viewing is still on a television screen, even if the content is not a traditional TV feed.

The good news for the whole industry is that young people are not tuning out from watching video content – they are just watching a lot less traditional television. And that means less of the major networks, less sports, and less of all of the various networks found on cable systems. They have decided, as a group, that other content is of more interest.

It’s soon going to be harder for Nielsen and others to quantify the specific types of content viewing because the lines are starting to blur between the various categories. If somebody watches a live feed of a basketball game or a traditional network show on Sling TV, that is basically the same as watching traditional TV. But on that same platform you can also watch streaming movies in the same manner as Netflix. And traditional broadcasters are doing something similar. For example, CBS All-Access not only includes traditional CBS programming, but there is new content like the new Star Trek series that is only going to be available on-line.

We’ve known for a long time that younger viewers are not watching television in the same way as older generations, but these numbers really highlight the differences. Those over 65 years old are watching four times more traditional television than teens. And viewing hours for younger viewers are steadily dropping while older viewers are watching as much or more TV than five years ago. You only have to trend this forward for a decade to foresee continued dramatic drops in total TV viewership.
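To illustrate the ‘trend it forward’ point, here is a simple compounded extrapolation that applies each group’s five-year decline from the Nielsen figures above twice more. It’s a straight-line illustration, not a forecast, since viewing habits rarely follow such a neat curve.

```python
# Apply each group's reported five-year decline twice more to project a decade forward.
# This is a simple trend-line illustration, not a forecast.
viewing = {
    # group: (weekly hours today, five-year decline as a fraction)
    "Teens (12-17)":               (14.0, 0.38),
    "Younger Millennials (18-24)": (15.5, 0.39),
    "Older Millennials (25-34)":   (22.0, 0.265),
    "Gen-Xers (35-49)":            (40.0, 0.10),
}

for group, (hours_now, five_year_decline) in viewing.items():
    projected = hours_now * (1 - five_year_decline) ** 2
    print(f"{group}: {hours_now:.1f} hrs/week today -> roughly {projected:.1f} in a decade")
```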

For years there has been hope in the industry that as kids age, start families and buy homes they will return to the traditional pay-TV packages. But numerous surveys have shown that this is not happening. It seems that the viewing habits of youth influence viewing habits for life. And that creates a real challenge for the advertising-supported pay-TV model. TV advertisers are only reliably reaching older viewers, and yet most advertisers still believe that TV advertising is one of their most effective tools. But each year TV advertising is going to reach fewer and fewer younger viewers, and at some point the advertisers are going to be forced to look elsewhere.