The End of the MP3?

Last month the Fraunhofer Institute for Integrated Circuits ended its licensing program for the MP3 digital file format. This probably means that the MP3 format will begin fading away, to be replaced over time by newer file formats. MP3 stands for MPEG Audio Layer III and was the first standard that allowed audio files to be compressed with little perceptible loss of sound quality. The US patent for MP3 was issued to Fraunhofer in 1996, and since then the institute has collected royalties on all devices able to create or play files in that format.

While it might seem a bit odd to be reading a blog about the end of a file format, MP3 files have had such a huge impact on the tech and music industries that they are partly responsible for the early success of the Internet.

The MP3 file revolutionized the way that people listened to music. In the decade before that there had been a proliferation of portable devices that would play cassette tapes or CDs. But those devices did not really bring freedom to listen to music easily everywhere. I can remember the days when I’d have a pile of tapes or CDs in the car so that I could listen to my favorite music while I drove. But the MP3 file format meant that I could rip all of my music into digital files and could carry my whole music collection along with me.

And the MP3 digital files were small enough that people could easily share files with friends and could send music as attachments to emails. But file-sharing of MP3 files really took off in 1999 when Shawn Fanning, John Fanning, and Sean Parker launched the peer-to-peer network Napster. This service gave people access to the entire music collections of huge numbers of others. Napster was so popular that the traffic generated by the platform crashed broadband networks at colleges and caused havoc with many ISP networks.
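
To put some rough numbers behind how small these files were, here is a back-of-the-envelope sketch. The CD-audio parameters are the standard ones, while the 128 kbps MP3 bitrate and the four-minute song length are my own illustrative assumptions rather than figures from any source.

```python
# Back-of-the-envelope: why a ripped song was small enough to email or share.
# CD audio is 44.1 kHz, 16-bit, stereo; 128 kbps was a common MP3 bitrate.
# The four-minute song length is just an illustrative assumption.

cd_bitrate_bps = 44_100 * 16 * 2        # ~1.4 Mbps of uncompressed CD audio
mp3_bitrate_bps = 128_000               # typical MP3 encoding rate of the era
song_seconds = 4 * 60

cd_size_mb = cd_bitrate_bps * song_seconds / 8 / 1_000_000
mp3_size_mb = mp3_bitrate_bps * song_seconds / 8 / 1_000_000

print(f"Uncompressed CD audio: {cd_size_mb:.0f} MB")    # roughly 42 MB
print(f"128 kbps MP3:          {mp3_size_mb:.1f} MB")   # roughly 3.8 MB
print(f"Compression ratio:     {cd_size_mb / mp3_size_mb:.0f}:1")
```

At roughly an 11:1 reduction, a whole song fit comfortably in an email attachment, which is exactly what made the file-sharing described above so easy.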

In 2001 Apple launched iTunes, which let people rip and organize their music as digital files, and in 2003 it opened the iTunes Music Store for legal downloads. Apple chose the AAC format for the store rather than MP3, probably in large part to avoid paying the MP3 licensing fees. Internet traffic from digital music grew to be gigantic. It’s hard to remember when the Internet was so much smaller, but the transfer of MP3 files was as significant to Internet traffic in the early 2000s as Netflix is today.

Napster, along with Apple iTunes, revolutionized the music industry and the two are together credited with ending the age of albums. People started listening to their favorite songs and not to entire albums – and this was a huge change for the music industry. Album sales dropped precipitously and numerous music labels went out of business. I remember the day I cancelled my subscription to Columbia House because I no longer felt the need to buy CDs.

Of course, Napster quickly ran into trouble for helping people violate music copyrights and was driven out of business. But the genie was out of the bottle and the allure of sharing MP3 files was too tempting for music lovers. I remember musician friends who always had several large-capacity external hard drives in their car and would regularly swap music collections with others.

One of the consequences from ending the licensing of the MP3 format is that over time it’s likely that computers and other devices won’t be able to read the MP3 format any longer. MP3s are still popular enough that the music players on computers and smartphones all still recognize and play MP3 files. But the history of the Internet has shown us that unsupported formats eventually fizzle away into obscurity. For example, much of the programming behind the first web sites is no longer supported and many of today’s devices can no longer view old web sites without downloading software capable of opening the old files.

It’s interesting that most people think that once something has been digitized it will last forever. That might be true for important data if somebody makes a special effort to save the digitized files in a place that will keep them safe for a long time. But we’ve learned that digital storage media are not permanent. Old CDs become unreadable. Hard drives eventually stop working. And even when files are somehow kept, the software needed to read the files can fall into obscurity.

A huge amount of the music created since 2000 exists only in digital format. Music by famous musicians will likely be maintained and replayed as long as people have an interest in those musicians. But music by lesser-known artists will probably fade away and much of it will disappear. It’s easy to envision that in a century or two most of the music we listen to today might have disappeared.

Of course there are the online music streaming services like Spotify that are maintaining huge libraries of music. But if we’ve learned anything in the digital age it’s that companies that make a living peddling digital content don’t themselves have a long shelf life. So we have to wonder what happens to these large libraries when Spotify and similar companies fade away or are replaced by something else.

Regional Differences in Broadband Adoption

The latest Akamai report State of the Internet Q1 2017 contains a lot of interesting facts about broadband adoption and usage in the US and around the world. One of the things that they track is the percentage of broadband users at various data speeds. I think their tracking is the most accurate in the industry because they measure the actual speeds of connectivity, not the subscribed rate that users think they are getting. Most of the largest Internet hubs use Akamai and so they get to see huge volumes of web connections.

Probably the most interesting statistic in the report from a US perspective is that the average broadband connection speed for the whole US has grown to 18.7 Mbps. This is up 8.8% over the last quarter of 2016 and up 22% from a year ago. This increase was enough to move the US up to tenth place in the world in terms of average connectivity speed. The worldwide average connectivity speed is 7.2 Mbps, but that comes with the caveat that it doesn’t include some parts of the world and also doesn’t include the billions who don’t yet have any broadband available.

What I find most interesting in the connectivity data is how disparate broadband is in different parts of the US. For the first time there are places in the US with average connectivity speeds greater than the FCC’s 25 Mbps definition of broadband – the District of Columbia at 28.1 Mbps and Delaware at 25.2 Mbps. Contrast this with Idaho, with an average connectivity speed of 12 Mbps, less than half the speed of the fastest states. Perhaps the most useful statistic in the report is the percentage of connections in each state that meet various speed thresholds (summarized in the small sketch that follows):

4 Mbps Adoption Rate. Akamai says that Delaware leads in this category with 98% of connections exceeding a speed of 4 Mbps, with Rhode Island close behind at 97%. Contrast this to the bottom of the list where West Virginia has only 77% of connections exceeding 4 Mbps, with Arkansas the next lowest at 81%.

10 Mbps Adoption Rate. Delaware also leads this category with 86% of the broadband connections from the state exceeding 10 Mbps, again just ahead of Rhode Island with 85%. But at the bottom of this list are Idaho at 45%, and Arkansas and New Mexico at 47%.

15 Mbps Adoption Rate. Rhode Island leads this category with 66% of broadband connections exceeding 15 Mbps. At the bottom of this list was Idaho with only 23% of connections exceeding 15 Mbps.

25 Mbps Adoption Rate. The District of Columbia tops this list with 38% of connections exceeding 25 Mbps, with Delaware second at 33%. At the bottom of the list is Idaho where only 7.5% of connections exceeded 25 Mbps, with New Mexico the next lowest at 7.9%.
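
To make the spread concrete, here is a small sketch that puts the figures quoted above into one structure and computes the gap between the leading and trailing states at each threshold. The percentages are simply the Akamai numbers re-keyed by hand; nothing new is added.

```python
# The state adoption figures quoted above, grouped by speed threshold so the
# top-to-bottom spread is easy to see at a glance.

adoption = {
    "4 Mbps":  {"Delaware": 98, "Rhode Island": 97, "Arkansas": 81, "West Virginia": 77},
    "10 Mbps": {"Delaware": 86, "Rhode Island": 85, "Arkansas": 47, "New Mexico": 47, "Idaho": 45},
    "15 Mbps": {"Rhode Island": 66, "Idaho": 23},
    "25 Mbps": {"District of Columbia": 38, "Delaware": 33, "New Mexico": 7.9, "Idaho": 7.5},
}

for threshold, states in adoption.items():
    best = max(states, key=states.get)
    worst = min(states, key=states.get)
    gap = states[best] - states[worst]
    print(f"{threshold:>7}: {best} {states[best]}% vs {worst} {states[worst]}% (gap of {gap:.1f} points)")
```

The gap runs from about 21 points at the 4 Mbps threshold to more than 40 points at 10 and 15 Mbps, which is the widening disparity discussed below.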

Since these are the actual speeds of Internet connections, one can conjecture a number of reasons that contribute to the differences across the various states, such as:

  • Availability of fast broadband. The states with the fastest broadband rates happen to be those where a significant percentage of the population has both fiber (Verizon FiOS) and cable modem broadband available. By contrast the states near the bottom of the list tend to have far fewer communities with fiber, and even many communities without cable systems.
  • Affordability. Numerous surveys have shown that price is still a major barrier, with many homes unable to afford the broadband connection they want.
  • Choice. Even in places where there is fast broadband available, many households choose slower broadband speeds due to lack of perceived need.
  • Geography. Terrain plays a role as well. In working with rural communities across the country I see that in the Plains states, with their wide-open expanses of land, there has been a proliferation of rural homes served by point-to-multipoint wireless networks delivering speeds of 10 – 50 Mbps. But this technology is of far less value in places like West Virginia with hilly and wooded terrain.

One thing this report shows is that the disparity between the top and bottom states on these various lists is widening. In places where fast broadband is available, the statistics show that a lot of people are upgrading to faster speeds. But in the states near the bottom of the list, where the broadband networks are the least robust, the same upward migration to faster speeds is not possible due to the lack of options. One would think that most of the country would look like Delaware in terms of broadband adoption rates if fast broadband were available to everybody. But the differences in technologies and infrastructure limit households from buying the broadband speeds they want.

The other thing to remember about these statistics is that they are only measuring the speeds for actual broadband connections, and so obviously exclude the millions of households in the country that still don’t have a reasonable broadband alternative. If those households were weighted into these statistics then states with large rural areas with no broadband would sink down the list.

The WISP Dilemma

For the last decade I have been working with many rural communities seeking better broadband. For the most part these are places that the large telcos have neglected and never provided with any functional DSL. Rural America has largely rejected the current versions of satellite broadband because of the low data caps and because the latency won’t support streaming video or other real-time activities. I’ve found that lack of broadband is at or near the top of the list of concerns in communities without it.

But a significant percentage of rural communities have access today to WISPs (wireless ISPs) that use unlicensed frequency and point-to-multipoint radios to bring a broadband connection to customers. The performance of WISPs varies widely. There are places where WISPs are delivering solid and reliable connections that average between 20 – 40 Mbps download. But unfortunately there are many other WISPs that are delivering slow broadband in the 1 – 3 Mbps range.

The WISPs that have fast data speeds share two characteristics. They have a fiber connection directly to each wireless transmitter, meaning there are no backhaul bandwidth constraints. And they don’t oversubscribe customers. Anybody who was on a cable modem five or ten years ago understands oversubscription. When there are too many people on a network node at the same time the performance degrades for everybody. A well-designed broadband network of any technology works best when there are no more customers than the technology can optimally serve.
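
A minimal sketch of the oversubscription arithmetic follows. The node capacity, subscriber counts and 40% busy-hour activity factor are hypothetical values chosen purely for illustration, not numbers from any particular WISP.

```python
# Why oversubscription hurts: the customers active at the busy hour split a
# node's capacity among themselves. All figures below are hypothetical.

def busy_hour_share(node_capacity_mbps, subscribers, active_fraction=0.40):
    """Rough per-customer throughput when active users share the node evenly."""
    active = max(1, round(subscribers * active_fraction))
    return node_capacity_mbps / active

for subs in (30, 100, 250):
    share = busy_hour_share(150, subs)   # a hypothetical 150 Mbps wireless sector
    print(f"{subs:>3} subscribers -> roughly {share:.1f} Mbps each at the busy hour")
```

Under these assumptions the same 150 Mbps sector that feels like a dedicated connection with 30 subscribers feels like slow DSL with 250.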

But a lot of rural WISPs are operating in places where there is no easy or affordable access to a fiber backbone. That leaves them with no alternative but to use wireless backhaul. This means using point-to-point microwave radios to get bandwidth to and from a tower.

Wireless backhaul is not in itself a negative issue. If an ISP can use microwave to deliver enough bandwidth to a wireless node to satisfy the demand there, then they’ll have a robust product and happy customers. But the problems start happening when networks include multiple ‘hops’ between wireless towers. I often see WISP networks where the bandwidth goes from tower to tower to tower. In that kind of configuration all of the towers and all of the customers on those towers are sharing whatever bandwidth is sent to the first tower in the chain.

Adding hops to a wireless network also adds latency, and each hop means it takes longer for traffic to get to and from customers at the outer edges of one of these wireless chains. Latency, or time lag, is an important factor in being able to perform real-time functions like streaming video, voice over IP, gaming, or maintaining connections to an on-line class or a distant corporate WAN.

Depending upon the brand of the radios and the quality of the internet backbone connection, a wireless transmitter that is connected directly to fiber can have a latency similar to that of a cable or DSL network. But when chaining multiple towers together the latency can rise significantly, and real-time applications start to suffer at latencies of 100 milliseconds or greater.
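
Here is a rough model of the chained-tower effect described above: every tower shares whatever bandwidth reaches the first tower in the chain, and each microwave hop adds delay. The backhaul size, base latency and per-hop latency are assumptions for illustration, not measurements of any real network.

```python
# Daisy-chained wireless backhaul: all towers split the feed that reaches the
# first tower, and each additional hop adds latency. All figures are assumptions.

FEED_TO_FIRST_TOWER_MBPS = 300   # hypothetical backhaul delivered to tower 1
BASE_LATENCY_MS = 30             # assumed latency for a directly fed tower
PER_HOP_LATENCY_MS = 25          # assumed added latency per microwave hop

for towers in range(1, 6):
    shared_mbps = FEED_TO_FIRST_TOWER_MBPS / towers      # simplification: even split per tower
    latency_ms = BASE_LATENCY_MS + (towers - 1) * PER_HOP_LATENCY_MS
    note = "  <- real-time apps start to suffer" if latency_ms >= 100 else ""
    print(f"{towers} towers in the chain: ~{shared_mbps:.0f} Mbps to share, ~{latency_ms} ms latency{note}")
```

With these assumed numbers, the fourth tower in a chain is already past the 100-millisecond mark where real-time applications start to struggle, while sharing only a quarter of the original feed.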

WISPs also face other issues. One is the age of the wireless equipment. There is no part of our industry that has made bigger strides over the past ten years than the manufacturing of subscriber microwave radios. The newest radios have significantly better operating characteristics than radios made just a few years ago. WISPs are for the most part relatively small companies and have a hard time justifying upgrading equipment until it has reached the end of its useful life. And unfortunately there is not much opportunity for small incremental upgrades of equipment. The changes in the technologies have been significant enough that upgrading a node often means replacing the transmitters on towers as well as the subscriber radios.

The final dilemma faced by WISPs is that they are often trying to serve customers in locations that are not well situated to receive a wireless signal. The unlicensed frequencies require good line-of-sight and suffer degraded signals from foliage, rain and other impediments, so it’s hard to reliably serve customers who are surrounded by trees or whose homes are blocked by the terrain.

All of the various issues mean that reviews of WISPs vary as widely as you can imagine. I was served by a WISP for nearly a decade and since I lived a few hundred feet from the tower and had a clear line-of-sight I was always happy with the performance I received. I’ve talked to a few people recently who have WISP speeds as fast as 50 Mbps. But I have also talked to a lot of rural people who have WISP connections that are slow and have high latency that provides a miserable broadband experience.

It’s going to be interesting to see what happens to some of these WISPs as rural telcos deploy CAF II money and provide a faster broadband alternative that will supposedly deliver at least 10 Mbps download. WISPs who can beat those speeds will likely continue to thrive while the ones delivering only a few Mbps will have to find a way to upgrade or will lose most of their customers.

Can the States Regulate Internet Privacy?

Since Congress and the FCC have taken steps to remove restrictions on ISPs using customer data, a number of states and even some cities have taken legislative steps to reintroduce some sort of privacy restrictions on ISPs. This is bound to end up in the courts at some point to determine where the authority lies to regulate ISPs.

Congress just voted in March to end restrictions on the ways that ISPs can use customer data, leading to a widespread fear that ISPs could profit from selling customer browsing history. Since then all of the large telcos and cable companies have made public statements that they would not sell customer information in this way, but many of these companies have histories that would indicate otherwise.

Interestingly, a new bill has been introduced in Congress called the BROWSER Act of 2017 that would add back some of the restrictions imposed on ISPs and would also make those restrictions apply to edge providers like Google and Facebook. The bill would give the authority to enforce the privacy rules to the Federal Trade Commission rather than the FCC. The bill was introduced by Rep. Marsha Blackburn who was also one of the architects of the earlier removal of ISP restrictions. This bill doesn’t seem to be getting much traction and there is a lot of speculation that the bill was mostly offered to save face for Congress for taking away ISP privacy restrictions.

Now states have jumped in to fill the void. Interestingly the states looking into this are from both sides of the political spectrum which makes it clear that privacy is an issue that worries everybody. Here is a summary of a few of the state legislative efforts:

Connecticut. The proposed law would require consumer buy-in before any “telecommunication company, certified telecommunications provider, certified competitive video service provider or Internet service provider” could profit from selling such data.

Illinois. The proposed privacy measures would allow consumers to ask what information about them is being shared. The bills would also require customer approval before apps can track and record location information on cellphones.

Massachusetts. The proposed legislation would require customer buy-in for sharing private information. It would also prohibit ISPs from charging more to customers who don’t want to share their personal information (something AT&T has done with their fiber product).

Minnesota. The proposed law would stop ISPs from even recording and saving customer information without their approval.

Montana. The proposed law there would prohibit any ISPs that share customer data from getting any state contracts.

New York. The proposed law would prohibit ISPs from sharing customer information without customer buy-in.

Washington. One proposed bill would require written permission from customers to share their data. The bill would also prohibit ISPs from denying service to customers that don’t want to share their private information.

Wisconsin. The proposed bill essentially requires the same restrictions on privacy that were included in the repealed FCC rules.

This has even made it down to the city level. For example, Seattle just issued new rules for the three cable providers with a city franchise telling them not to collect or sell customer data without explicit customer permission or else face losing their franchise.

A lot of these laws will not pass this year since the new laws were introduced late into the legislative sessions for most states. But it’s clear from the laws that have been proposed that this is a topic with significant bipartisan support. One would expect a lot of laws to be introduced and enacted in legislative sessions that will occur later this year or early next year.

There is no doubt that at some point this is going to result in lawsuits to resolve the conflict between federal and state rules. An issue of this magnitude will almost certainly end up at the Supreme Court at some point. But as we have seen in the past, during these kinds of legislative and legal fights the status of any rules is muddy. And that generally means that ISPs are likely to continue with the status quo until the laws become clear. That likely means that ISPs won’t openly be selling customer data for a few years, although one would think that the large ones have already been collecting data for future use.

Are You Ready for 4K Video?

The newest worry for ISPs is the expansion of 4K video. Already today Netflix and Amazon are offering on-line 4K video to customers. Almost all of the new programming being created by both companies is being shot in 4K.

Why is this a concern for ISPs? Netflix says that in order to enjoy a streaming 4K signal a user ought to have a spare 15 – 20 Mbps of bandwidth available if streaming with buffering. The key word is spare, meaning that any other household activity ought to be using other bandwidth. Netflix says that without buffering a user ought to have a spare 25 Mbps.
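
A quick sizing sketch using the Netflix figures above; the allowance for other household activity is an assumption I have added for illustration.

```python
# How much connection a household would want for concurrent 4K streams, using
# the ~20 Mbps buffered figure cited above plus an assumed 15 Mbps of headroom
# for everything else the household is doing at the same time.

SPARE_PER_4K_STREAM_MBPS = 20   # buffered 4K, per the Netflix guidance above
OTHER_HOUSEHOLD_USE_MBPS = 15   # assumption: browsing, gaming, other video

for streams in (1, 2, 3):
    needed = streams * SPARE_PER_4K_STREAM_MBPS + OTHER_HOUSEHOLD_USE_MBPS
    print(f"{streams} concurrent 4K stream(s): plan for roughly {needed} Mbps")
```

Even two simultaneous 4K streams push a household well past the FCC’s 25 Mbps broadband definition, which is why busy-hour video is such a concern for network engineers.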

When we start seeing a significant number of users stream video at those speeds even fiber networks might begin experiencing problems. I’ve never seen a network that doesn’t have at least a few bottlenecks, which often are not apparent until traffic volumes are high. Already today busy-hour video is causing stress to a lot of networks. I think about millions of homes trying to watch the Super Bowl in 4K and shudder to think what that will mean for most networks.

While 4K video is already on-line it is not yet being offered by cable companies. The problem for most of the industry is that there is no clear migration path between today and tomorrow’s best video signal. There are alternatives to 4K being explored by the industry that muddy the picture. Probably the most significant new technology is HDR (high-dynamic range) video. HDR has been around for a few years, but the newest version which captures video in 10-bit samples adds both contrast and color accuracy to TVs. There are other video improvements also being explored such as 10-bit HEVC (high-efficiency video coding) which is expected to replace today’s H.264 standard.

The uncertainty of the best technology migration path has stopped cable companies from making upgrades to HDR or 4K. They are rightfully afraid to invest too much in any one version of the early implementations of the technology to then face more upgrades in just a few years. But as the popularity of 4K video increases, the pressure is growing for cable companies to introduce something soon. It’s been reported that Comcast’s latest settop box is 4K capable, although the company is not making any public noise about it.

But as we’ve seen in the past, once customers start buying 4K capable TVs they are going to want to use them. It’s expected that by 2020 almost every new TV will include some version of HDR technology, which means that the quality of watching today’s 1080p video streams will improve. And by then a significant number of TVs will come standard with 4K capabilities as well.

I remember back when HD television was introduced. I have one friend who is a TV buff and once he was able to get HD channels from Comcast he found that he was unable to watch anything that was broadcast in standard definition. He stopped watching any channel that did not broadcast HD and ignored a huge chunk of his Comcast line-up.

The improvements from going to 4K and/or true HDR will be equally dramatic. The improvement in clarity and color is astonishing as long as you have a TV screen large enough to see the difference. And this means that as people grow to like 4K quality they will migrate towards 4K content.

One thing that is clear is that 4K video will force cable companies to broadcast video over the IP stream. A single 4K signal eats up an entire 6 MHz channel on a cable system making it impossible for any cable system to broadcast more than a tiny number of 4K channels in the traditional way. And, like Comcast is obviously preparing to do, it also means all new settop boxes and a slew of new electronics at the cable headend to broadcast IPTV.
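
Some rough channel arithmetic sits behind that point. The usable payload of a 6 MHz QAM256 channel (roughly 38 Mbps) and the per-stream bitrates below are approximations I am assuming for illustration; actual cable plants and encoders vary.

```python
# Why 4K does not fit the traditional broadcast model: a 6 MHz QAM256 channel
# carries roughly 38 Mbps of payload (approximate), while a broadcast-quality
# 4K stream is assumed here to need about 30 Mbps.

QAM256_PAYLOAD_MBPS = 38   # approximate usable payload per 6 MHz channel
HD_STREAM_MBPS = 6         # assumed compressed HD broadcast stream
UHD_STREAM_MBPS = 30       # assumed broadcast-quality 4K stream

print(f"HD streams per channel: {QAM256_PAYLOAD_MBPS // HD_STREAM_MBPS}")   # about 6
print(f"4K streams per channel: {QAM256_PAYLOAD_MBPS // UHD_STREAM_MBPS}")  # just 1
```

Under those assumptions each traditionally broadcast 4K channel displaces roughly a half dozen HD channels, which is why moving the video into the IP stream is the more practical path.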

Of course, like any technology improvement we’ve seen lately, the improvements in video quality don’t stop with 4K. The Japanese plan to broadcast the 2020 Olympics in 8K video. That requires four times as much bandwidth as 4K video – meaning an 80 – 100 Mbps spare IP path. I’m sure that ways will be found to compress the transmission, but it’s still going to require a larger broadband pipe than what most homes buy today. It’s expected that by 2020 there will only be a handful of users in Japan and South Korea ready to view 8K video, but like anything dramatically new, the demand is sure to increase in the following decade.

Seniors and Broadband

A recent poll from the Pew Research Center shows that for the first time more than half of Americans over 65 have a landline broadband connection in their homes. This is a milestone for the industry and is significantly higher than the last time Pew asked the same questions in 2013.

Since the inception of the web seniors have always had a significantly lower broadband adoption rate than other age groups, but this survey shows that seniors are now starting to close the gap. Part of this shift is probably due to the fact that baby boomers are now joining the senior category and bringing their much higher adoption rate for technology with them. But one also has to think that the benefits of broadband are luring more seniors into buying broadband.

The survey also showed the following:

  • 67% of seniors say that they use the Internet.
  • 42% of seniors now own a smartphone, which is triple the percentage from 2013.
  • Of those that use the Internet, 17% go on-line once a day, 51% use the Internet several times per day and 8% say they are on the Internet almost constantly.
  • A much smaller percentage of seniors use social media, but the ones that do use it often. For example, 70% of seniors on Facebook use the service daily.
  • 25% of seniors that go on-line play on-line video games.
  • 58% of seniors think that technology has a positive effect on society. Only 4% think technology is mostly negative.

The survey also looked deeper into the reasons why seniors say they don’t use broadband and found the following:

  • Only 26% of seniors say that they are very confident when using electronic devices. The percentages are far higher for younger age groups.
  • 73% of seniors say they need help using a new electronic device.
  • Disabled seniors seem to use broadband at a much lower rate than those with no disabilities.

ISPs have obviously always had challenges in selling to seniors. But I have clients that have done very well selling to seniors, and following are a few things I have seen work.

I have one client that has been holding weekly computer training classes for the public for nearly 15 years. Their free classes are filled every week mostly by seniors. They teach what people really want to learn – how to use Facebook, how to deal with emails and spam, how to save and send pictures, etc. They have a much higher broadband penetration rate with seniors than is shown by this survey and they credit their training classes for making seniors comfortable using broadband.

I have another client that sends an employee to sit with every new broadband customer to help them set up everything they want to use. They say they will often spend up to four hours with a new senior customer and will set up their Facebook and email accounts, show them how to use bookmarks, show them how to search for information, etc. And this ISP will take calls from these new customers to answer all of their questions and will make return home visits if needed. They say that word of mouth has encouraged a lot of seniors to buy broadband, and because of their continued support they can’t recall any senior who has dropped broadband. They think this up-front assistance is time and money well spent because their seniors become the most loyal customers and also have the best track record of paying the monthly broadband bill on time.

I have another client that also holds training classes, but rather than have potential customers come to their office, they have placed computers in several places in the community where seniors gather daily – places like a senior community center, an indoor community swimming pool and gym, and in a popular restaurant that allowed them to put a few computers in a back room. This telco sends somebody to these locations a few times a week to answer questions and to show people how to use the Internet. They say this program has led to significant sales of broadband to seniors.

But I also have a lot of clients that have not done anything specific to help seniors and then see poor broadband adoption rates. My advice to them has always been to look at the efforts to sell to seniors as just another part of the sales process. As this survey shows, it is fear of technology that is still the primary reason why many seniors don’t buy broadband. Any ISP that makes a genuine effort to allay these fears will reap the benefits of increased broadband sales and an appreciative new customer base.

The Proliferation of Small Wireless Devices

Cities nationwide are suddenly seeing requests to place small wireless devices in public rights-of-way. Most of the requests today are for placing mini-cell sites, but in the near future there is going to be a plethora of other outdoor wireless devices to support 5G broadband and wireless local loops.

Many cities are struggling with how to handle these requests. I think that once they understand the potential magnitude of future requests it’s going to become even more of an issue. Following are some of the many issues involved with outdoor wireless electronics placement:

Franchising. One of the tools cities have always used to control and monitor placement of things in rights-of-way is through the use of franchise agreements that specifically spell out how any given company can use the right-of-way. But FCC rules have prohibited franchises for cellular carriers for decades – rules that were first put into place to promote the expansion of cellular networks. Those rules made some sense when cities only had to deal with large cellular towers that are largely located outside of rights-of-way, but make a lot less sense for devices that can be placed anywhere in a city.

Aesthetics. These new wireless devices are not going to be placed in the traditional locations like large cellular towers, water towers and rooftops of buildings. Instead the wireless providers will want to place them on existing telephone poles and light poles. Further, I’ve heard of requests for the placement of new, taller poles as tall as 100 feet that would be used just for the wireless network.

The devices that will be used are going to vary widely in size and requirements, making it difficult to come up with any one-size-fits-all rules. They might range in size from a laptop computer up to a small dorm refrigerator. And some of the devices will be accompanied by support struts and other gear that together make for a fairly large new structure. The vast majority of these devices will need an external power feed (some might be solar powered) and many are also going to need a fiber feed.

It’s also expected that 5G devices are going to want relatively clear line-of-sight and this means a lot more tree-trimming, including trimming at greater heights than in the past. I can picture this creating big issues in residential neighborhoods.

Proliferation. I doubt that any city is prepared for the possible proliferation of wireless devices. Not only are there four major cellular companies, but these devices are going to be deployed by the cable companies that are now entering the cellular market along with a host of ISPs that want to deliver wireless broadband. There will also be significant demand for placement for connecting private networks as well as for the uses by the cities themselves. I remember towns fifty years ago that had unsightly masses of telephone wires. Over the next decade or two it’s likely that we will see wireless devices everywhere.

Safety. One of the concerns for any city and the existing utilities that use poles and rights-of-way is the safety of the technicians that work on poles. Adding devices to poles always makes it more complicated to work on a pole. But add live electric feeds to the devices (something that is fairly rare on poles today) plus new fiber wires, and the complexity increases again – particularly for technicians trying to make repairs in storm conditions.

Possible Preemption of City Rights. Even after considering all these issues, it’s possible that the choice might soon be moot for cities. At the federal level both the FCC and Congress are contemplating rules that make it easier for cellular companies to deploy these devices. There are also numerous bills currently in state legislatures that are looking at the same issues. In both cases most of the rules being contemplated would override local control and would institute the same rules everywhere. And as you might imagine, almost all of these laws are being pushed by the big cellular companies and largely favor them over cities.

It’s easy to understand why the cellular companies want universal rules. It would be costly for them to negotiate this city by city. But local control of rights-of-way has been an effective tool for cities to use to control haphazard proliferation of devices in their rights-of-way. This is gearing up to be a big battle – and one that will probably come to a head fairly soon.

Big Companies and Telecommuting

One of the biggest benefits most communities see when they first get good broadband is the ability for people to telecommute or work from home. Communities that get broadband for the first time report that this is one of the most visible changes in the community, and that soon after getting broadband almost every street and road has somebody working from home.

CCG is a great example of telecommuting, since our company went virtual fifteen years ago. The main thing that sent us home in those days was that residential broadband was better than what we could get at the office. All of our employees could get 1 – 2 Mbps broadband at home, which was also the only speed available at our offices over a T1. But we found that even in those early days a T1 was not enough speed to share among multiple employees.

Telecommuting really picked up at about the same time that CCG went virtual. I recall that AT&T was an early promoter of telecommuting, as was the federal government. At first these big organizations let employees work at home a day or two a week as a trial. But that worked out so well that over time big organizations felt comfortable with people working out of their homes. I’ve seen a number of studies showing that telecommuting employees are more productive than office employees and work longer hours – due in part to not having to commute. Telecommuting has become so pervasive that a cover story in Forbes in 2013 announced that one out of five American workers worked at home.

Another one of the early pioneers in telecommuting was IBM. A few years ago they announced that 40% of their 380,000 employees worked outside of traditional offices. But last week the company announced that they were ending telecommuting. They told employees in many of their major divisions like Watson development, software development and digital marketing and design that they must move back into a handful of regional offices or leave the company.

The company has seen decreasing revenues for twenty straight quarters and there is speculation that this is a way to reduce their work force without having to go through the pain of choosing who will leave. But what is extraordinary about this announcement is how rare it is. It’s only the second major company that has ended telecommuting in recent memory, the last being Yahoo in 2013.

Both IBM and Yahoo were concerned about earnings and that is probably one of the major reasons that drove their decision to end telecommuting. It seems a bit ironic that companies would make this choice when it’s clear that telecommuting saves money for the employer – something IBM crowed about earlier this year.

Here are just a few of the major findings from studies on the benefits of telecommuting. It improves employee morale and job satisfaction. It reduces attrition and cuts sick and unscheduled leave. It saves companies on office space and overhead costs. It reduces discrimination by evaluating people on personality and talent rather than race, age or appearance. It increases productivity by eliminating unneeded meetings and because telecommuters work more hours than office workers.

But there are downsides. It’s hard to train new employees in a telecommuting environment. One of the most common ways to train new people is to have them spend time with somebody more experienced – something that is difficult with telecommuting. Telecommuting makes it harder to brainstorm ideas, something that benefits from live interaction. And possibly the biggest drawback is that telecommuting isn’t for everybody. Some people cannot function well outside of a structured environment.

As good as telecommuting is for companies it’s even better for smaller and rural communities. A lot of people want to live in the communities they grew up in, around friends and family. We’ve seen a brain drain from rural areas for decades as kids graduate from high school or college and are unable to find meaningful work. But telecommuting lets people live where there is broadband. Many communities that have had broadband come to town report that they see an almost instant uptick in housing prices and demand for housing. And part of that increased demand is from those who want to choose a community rather than follow a job.

One of the more interesting projects I’ve worked on with the telecommuting issue was when I helped the city of Lafayette, Louisiana get a fiber network. Lafayette is not a rural area but a thriving mid-size city, and yet one of the major reasons the residents wanted fiber was the chance to keep their kids at home. The area is largely Cajun with a unique culture and the community was unhappy to see their children have to relocate to larger cities to get jobs after graduating from the university there. Broadband alone can’t fix that kind of problem, but Lafayette is reportedly happy with the changes brought from the fiber network. That’s the kind of benefit that’s hard to quantify in dollar terms.

Net Neutrality and the Digital Divide

There is an interesting idea floating around the industry that is bound to annoy fans of net neutrality. The idea comes from Roslyn Layton who does telecom research at Aalborg University in Denmark. She served on the FCC Transition team for the new administration.

She envisions zero-rating as the best way to solve the digital divide and to finally bring Internet access to everybody. She says that after decades of not finding any other solutions, this might be the only reasonable path to get Internet access to people who can’t afford a monthly subscription.

The idea is simple – there are companies who will provide an advertising-driven broadband connection for free to customers, particularly on a cellphone. It’s not hard to envision big companies like Facebook or Google sponsoring cellphone connections and providing data access to customers who would be a captive audience for their ads and content.

This idea is already working elsewhere. Facebook offers this same service in other countries today under the brand name “Free Basics.” While it certainly costs Facebook to buy the wholesale data connections, they must have done the math and figured that having a new customer on their platform is worth more than the cost. Facebook’s stated goal is to serve most of the billions of people on earth and this is a good way to add a lot of customers. With Free Basics, customers get full use of the Facebook platform along with the basic ability to surf the web. However, the service does not allow a user to freely watch streaming video or do other data-intensive activities that are not part of the Facebook universe – it’s not an unlimited data plan. I can remember similar products in the US back in the dial-up days, when several dial-up providers gave free connections as long as the customers didn’t mind being bombarded by ads.

There are certainly upsides to this. Such a service would provide enough bandwidth for people to use the web for the basics like hunting for a job or doing school work. And users would get unlimited use of the Facebook platform for functions such as messaging or watching Facebook-sponsored video and content. There are still a substantial number of people in the US who can’t afford a broadband subscription and this would provide a basic level of broadband to anybody willing to deal with the ad-heavy environment.

But there are downsides. This idea violates net neutrality. Even if the current FCC does away with net neutrality one has to think that a future FCC will institute something similar. But even with net neutrality rules in place the FCC could make an exception for a service that tackles the digital divide.

The real downside is that this is not the same as the real internet access that others enjoy. Users would be largely trapped inside whatever platform sponsors their product. That could be Facebook or Google, but it could also be an organization with a social or political agenda. Anybody using this kind of free platform would have something less than unfettered Internet access, and they would be limited to whatever the platform sponsor allows them to see or do outside the base platform. At best this could be called curated Internet access, but realistically it’s a platform to give sponsors unlimited access to users.

But I think we have to be realistic that nobody has yet found a solution to the digital divide. The FCC’s Lifeline program barely makes a dent in it. And I’m not aware of any major ISP who has ever found any mechanism to solve the digital divide issue.

While Facebook offers this in many countries around the globe, they received massive pushback when they tried to bring it to India. The Indian government did not want a class of people given a clearly inferior class of Internet connectivity, and in India the government is working hard itself to solve the digital divide. But there is nobody in the US giving the issue more than lip service. The issue has been with us since the dial-up days and there has been little progress in the decades since then.

I read some persuasive articles a few years ago when the net neutrality debate was being discussed about this kind of product. There were arguments made that there would be long-term negative ramifications from having a second-class kind of Internet access. The articles worried about the underlying sponsors heavily influencing people with their particular agenda.

But on the flip side, somebody who doesn’t have broadband access probably thinks this is a great idea. It’s unrealistic to think that people have adequate broadband access when they can only get it at the library or a coffee shop. For broadband to benefit somebody it needs to be available when and where they need to use it.

I lean towards thinking this is an idea worth trying. I would hope that there would be more than one or two companies willing to sponsor it, in which case any provider who is too obnoxious or restrictive would not retain customers. People who go to sites like Facebook today already voluntarily subject themselves to ads, so this doesn’t seem like too steep a price to pay to get more people connected to the Internet.

The Myth of OTT Savings

One of the reasons touted in the press for the recent popularity of cord cutting is the desire of people to save money over a traditional cable TV subscription. But as I look at what’s popular on the web I wonder if the savings are really going to be there for people who like to watch a variety of the best content.

There has been an explosion of companies that are pursuing unique video content, and this means that great content can now be found in many different places on the web. Interestingly, most of this great content is not available on traditional TV, other than the content provided by the premium movie channels. Consider the following web platforms that are creating unique content:

  • Netflix. They are the obvious king of unique content and release new shows, specials, movies and documentaries seemingly weekly. And they seem to have a wide variety of content aimed at all demographics.
  • Hulu. They are a bit late to the game. But the newly released The Handmaid’s Tale is getting critical acclaim and will be part of a quickly growing portfolio of unique content.
  • HBO. HBO has always had a few highly popular series with Game of Thrones still drawing huge audiences.
  • CBS All-Access. CBS has made a bold move by offering the new series Star Trek: Discovery only online. It’s bound to draw a lot of customers to the online service.
  • Amazon Prime. The company says they are going to invest billions in unique programming and are aiming at overtaking Netflix. Their recent hit The Man in the High Castle is evidence of the quality programming they are pursuing.
  • Showtime. They have historically created limited amounts of unique content but are now also looking to create a lot more. Their new show Twin Peaks has come out with high reviews.
  • Starz. This network is also now chasing new content and has a hit series with American Gods.
  • Seeso. Even services that most people have never heard of, such as Seeso, are creating popular content such as the comedy series My Brother, My Brother and Me.
  • YouTube Red. The industry leader of unique content is YouTube which has allowed anybody to create content. While most of this is still free, the platform is now putting a lot of great content such as the comedy Rhett and Link’s Buddy System behind a paywall.

Subscribing to the above online services at their minimum subscription levels costs $79 per month (and that’s without figuring in the annual cost of Amazon Prime, which most people buy because of the free shipping from Amazon). The above line-up doesn’t include any sports, and you’d have to buy a $30 subscription from Sling TV to watch ESPN and a few other popular sports networks. ESPN recently announced that they still don’t have any plans to launch a standalone web product but are instead pursuing being included in the various skinny bundles.

Not considered in the above list, though, are numerous other lesser-known paid OTT subscriptions available on-line. As listed in this recent blog, there are dozens of other platforms for people who like specialized content like Japanese anime or British comedies.

Of course, one thing the above list shows is that there is a world of content these days that is not being created by the major networks or the traditional cable networks. There is likely more money pouring into the creation of content outside of the traditional networks.

So OTT doesn’t seem to save as much as hoped for people who wish to enjoy a variety of popular content across different providers. But there are other benefits driving people to OTT programming. One of the great benefits of OTT programming is the ability to subscribe to and cancel services at will. I have been trying various OTT networks and it’s really tempting to subscribe to each for a month or two until you’ve seen what you want and then move on to something else. I’m starting to think that’s the way I will use these services as long as they continue to allow easy entry and exit.

And OTT programming allows for non-linear TV watching. As long as somebody lives near a metropolitan area, a cord cutter can still view the traditional network channels using rabbit ears. But what a lot of cord cutters are finding is that they quickly lose their tolerance for linear programming. I know that when I travel and have a TV available in the room I only watch it if I want to catch a football or basketball game. I can no longer tolerate the commercial breaks or the inability to pause linear TV when I want to do something else. And that, perhaps more than anything, is what will bring down traditional cable TV. As much as cable companies tout TV Everywhere, their basic product is still showing content linearly at fixed times. There is such a huge volume of great OTT content available any time on any device that it’s not hard for somebody to walk away from the traditional networks and still always have something they want to watch.