Do We Need International Digital Laws?

German Chancellor Angela Merkel said a few weeks ago that the world needs international regulation of digital technology, much like we have international regulation for financial markets and banking.

She says that without some kind of regulation, isolated 'islands' of bad digital actors can emerge that are a threat to the rest of the world. I am sure her analogy is a reference to the handful of islands around the globe that play that same maverick role in the banking arena.

We now live in a world where a relatively small number of hackers can cause incredible harm. For instance, while never definitively proven, it seems that North Korean hackers pulled off the major hack of Sony a few years ago. There are accusations across western democracies that the Russians have been using hacking to interfere with elections.

Merkel certainly has a valid point. Small 'islands' of hackers are one of the major threats we face in the world today. They can cause incredible economic harm. They threaten basic infrastructure like electric grids. They make it risky for anybody to be on the Internet at a time when broadband access is becoming an integral part of the lives of billions.

There currently aren't international laws aimed at fighting the nefarious practices of bad hackers or at punishing them for their crimes. Merkel wasn't specific about possible remedies. She said that the US and Germany have undertaken discussions on the topic but that they haven't gone very far. There are certainly a few things that would make sense at the international level:

  • Make certain kinds of hacking an international crime so that hacker criminals can more easily be pursued across borders.
  • Create a forum for governments to better coordinate monitoring hackers and devising solutions for blocking or stopping them.
  • Make laws to bring cryptocurrency under the same international auspices as other currencies.

But as somebody who follows US telecom regulation in this blog, I wonder how fruitful such regulations might be. We now live in a world where hackers always seem to be one step ahead of the security industry that works to block them. The cat and mouse game between hackers and security professionals is constantly changing, and I have to wonder how any set of rules could be nimble enough to make any difference.

This does not mean that we shouldn't have an international effort to fight against the bad actors – but I wonder if that cooperation might best be technical cooperation rather than the creation of regulations that might easily be out of date by the time they are signed into law.

Any attempt to create security regulations also has to wrestle with the fact that a lot of what we think of as hacking is probably really government-sponsored cyberwarfare. How do we tell the difference between cyber-criminals and cyber-warriors? In a murky world where it's always going to be hard to know who specifically wrote a given piece of code, I wonder how we tell the criminal bad guys from the government bad guys.

I also see a dilemma in that any agreed-upon international laws must, by definition, filter back into US law. We now have an FCC that is trying to rid itself of having to regulate broadband. Assuming that Title II regulation will be reversed, I have to wonder whether the FCC would even be able to require ISPs to comply with international laws at a time when there might not be many US laws that can be enforced on them.

It makes sense to me that there ought to be international cooperation in identifying and stopping criminal hackers and others that would harm the web. But I don't know if there has ever been an issue where the governments of the world engage in many of the same practices as the bad actors – and that makes me wonder if there can ever be any real cooperation between governments to police or control bad practices on the web.

The Fastest ISPs

PC Magazine has been rating ISPs in terms of speed for a number of years. They develop their rankings based upon speed tests taken at their own speed test site. About 124,000 speed tests went into this year's rankings. The score for each ISP is a composite number based 80% on download speed and 20% on upload speed. To be included in the rankings an ISP needed to have 100 customers or more take the speed test.
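To make the scoring concrete, here is a minimal sketch of how an 80/20 composite like that might be computed. The weighting comes from the description above; the function name and the assumption that the weights apply directly to measured Mbps values are mine, not PC Magazine's published methodology.

```python
# Hypothetical 80/20 download/upload composite, assuming the weights
# apply directly to the measured speeds in Mbps. Not PC Magazine's
# actual published formula.
def composite_score(download_mbps: float, upload_mbps: float) -> float:
    return 0.8 * download_mbps + 0.2 * upload_mbps

# Example: a connection measuring 100 Mbps down / 10 Mbps up
print(composite_score(100.0, 10.0))  # 82.0
```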

You always have to take these kinds of rankings with a grain of salt for several reasons. For example, the tests don't only measure the ISP – they also reflect conditions at the customer's end. The time of day can affect a speed test, but the type of connection probably affects it the most. We know these days that a lot of people are using out-of-date or poorly located WiFi routers that reduce the speeds at their computers.

Measured speeds also vary between the different speed tests. In writing this blog I took four different speed tests just to see how they compare. The one at the PC Magazine site showed my speeds at 27.5 Mbps down / 5.8 Mbps up. Ookla showed 47.9 Mbps down / 5.8 Mbps up. The Speakeasy speed test showed 17.6 Mbps down and 5.8 Mbps up. Finally, the test from Charter Spectrum, my ISP, showed 31.8 Mbps down / 5.9 Mbps up. That's a pretty startling range of speeds measured just minutes apart – which demonstrates why speed test results are not a great measure of actual speeds. I look at these results and I have no idea what speed I am actually receiving. With that said, one would hope that any given speed test would be reasonably consistent in measuring the differences between ISPs.
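If you want to repeat this kind of informal comparison yourself, here is a rough sketch that runs several back-to-back measurements and reports the spread. It assumes the third-party speedtest-cli Python package (pip install speedtest-cli), which tests against Ookla-style servers, so its numbers won't necessarily match the other test sites mentioned above.

```python
# Rough sketch: run a few back-to-back speed tests and show the spread.
# Requires the third-party speedtest-cli package (pip install speedtest-cli).
import statistics
import speedtest

downloads = []
for _ in range(4):
    st = speedtest.Speedtest()
    st.get_best_server()
    down_mbps = st.download() / 1_000_000  # bits/sec -> Mbps
    up_mbps = st.upload() / 1_000_000
    downloads.append(down_mbps)
    print(f"down {down_mbps:.1f} Mbps / up {up_mbps:.1f} Mbps")

print(f"download range: {min(downloads):.1f}-{max(downloads):.1f} Mbps, "
      f"mean {statistics.mean(downloads):.1f} Mbps")
```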

The speed test 'contest' results are broken out for different categories of ISPs. For years the winner of the annual speed test among the large incumbents was Verizon FiOS. However, in this year's test it fell to third in its group. Leading that category now is Hotwire Communications, which largely provides broadband to multi-tenant buildings, with a score of 91.3. Second was Suddenlink at 49.1, with Verizon, Comcast and Cox close behind. The lowest in the top 10 was Wow! at a score of 26.7.

Another interesting category is the competitive overbuilders and ISPs. This group is led by Google Fiber with a score of 324.5. EPB Communications, the municipal network in Chattanooga, is second at 136.1. Also in the top 10 are companies like Grande Communications, Sonic.net, RCN, and Comporium.

PC Magazine also ranks ISPs by region and it's interesting to see how the speeds for a company like Comcast vary in different parts of the country.

Results are also ranked by state. I find some of the numbers on this list startling. For instance, Texas tops the list with a score of 100.3. Next is South Dakota at 80.3 and Vermont at 70.6. If anything this goes to show that the rankings are not any kind of actual random sample – it's impossible to think that this represents the true composite speeds of all of the people living in those states. The results of this contest also differ from results shown by others like Ookla, which looks at millions of actual connection speeds at Internet POPs. Consider Texas. Certainly there are fast broadband speeds in Austin due to Google Fiber, where all of the competitors have picked up their game. There are rural parts of the state with fiber networks built by telcos and cooperatives. But a lot of the state looks much like anywhere else, with a lot of people on DSL or using something less than the top speeds from the cable companies.

But there is one thing this type of study shows very well: over the years the cable companies have gotten significantly faster. Verizon FiOS used to be far faster than the cable companies and now lies in the middle of the pack with many of them.

This test is clearly not a statistically valid sample. And as I showed above with my own results from various speed tests, the results are likely not even very accurate. But ISPs care about these kinds of tests because they can provide bragging rights for those near the top of one of the charts. And, regardless of the flaws, one would think the shortcomings of this particular test are similar across the board, which means it does provide a decent comparison between ISPs. That is further validated by the fact that the results of this exercise are pretty consistent from year to year.

Industry Shorts, June 2017

Following are some topics I found of interest but which don’t justify a whole blog.

Amazon Bringing Alexa to Settop Boxes. Amazon has created a developer kit that allows any settop box maker to integrate its Alexa voice service. The Alexa voice platform currently powers the popular Echo home assistant device. It's also being integrated into some new vehicles, and Amazon has made it available for integration into a whole range of home automation devices. The Amazon Alexa platform is currently ahead of the competitors at Apple, Google and Microsoft, mostly because Amazon opened the product to developers, who have already created over 10,000 applications that work on the platform. Adding Alexa to a settop box could make it a lot easier to use the settop box as the hub for a smart home.

Comcast Tried to Shut Down Anti-Comcast Website. LookingGlass Cyber Security Center, a vendor for Comcast, sent a cease-and-desist letter to the advocacy group Fight for the Future. The group operates a website called comcastroturf.com. The advocacy group claims that Comcast used bots to generate over a half million fake filings to the FCC in the network neutrality docket. These comments were all in favor of killing net neutrality, and the group claims that Comcast used real people's names to sign the filings without their permission. The website allows people to see if their name has been used. The cease-and-desist letter was withdrawn after news of it got a lot of coverage on social media.

Net Neutrality Wins in Court. Not that it probably makes much difference now that the FCC is trying to undo Title II regulation, but the challenge filed by Verizon and other large ISPs against the FCC’s net neutrality decision was rejected at appeal. This affirms the ability of the FCC to use Title II rules for regulating broadband. The full U.S. Court of Appeals for the D.C. Circuit upheld an earlier court ruling that affirmed the FCC had the needed authority to implement the net neutrality decision.

Altice Buys Ad-Tech Company. Altice joins other big ISPs looking to take advantage of the repeal of the new FCC privacy rules, which allows ISPs to monetize customers' private data. Altice, now the fourth largest US cable company after the acquisition of Cablevision, joins the other big ISPs that have added the expertise to slice and dice customer data. Altice paid $300 million for Teads, a company specializing in targeted advertising based upon customer-specific data.

Other large ISPs are already poised to take advantage of the new opportunity. For example, Verizon's purchases of AOL and Yahoo bring this same expertise in-house. It has been widely speculated that the ISPs have been gathering customer data for many years and are sitting on a huge treasure trove detailing customers' web browsing, on-line purchasing habits, email and text information, and, for the wireless ISPs, the location data of cellphones.

Charter Rejects $100 Billion Offer from Verizon. The New York Post reported that Charter rejected a purchase offer from Verizon. The Post reports that Charter thought the offer wasn't high enough. It also came with tax implications that would complicate the deal. Whether this particular offer is real or not, it points to the continuing consolidation of the industry's ISPs, cable providers and cellular companies. The current administration is reportedly not against large mergers, so there's no telling what other megadeals we might see over the next few years.

Top 7 Media CEOs Made $343.8 Million in 2016. The CEOs of CBS, Comcast, Discovery Communications, Disney, Fox, Time Warner and Viacom collectively made a record amount last year, up 21.1% from 2015. It's interesting that at a time when the viewership of specific cable networks is dropping rapidly the industry would reward its leaders so handsomely. But all of these companies are compensating for losses of customers with continuing rate hikes for programming, and most are having banner earnings.

Frontier Lays Off WV Senate President. Frontier just laid off Mitch Carmichael, the President of the Senate in West Virginia. This occurred right after the Senate passed a broadband infrastructure bill that was aimed at bringing more broadband competition to the state. The bill allows individuals or communities to create broadband cooperatives to build broadband infrastructure in areas with poor broadband coverage. Frontier is the predominant ISP in the state after its purchase of the Verizon property there. The West Virginia legislature is a part-time job that pays $20,000 per year and most legislators hold other jobs. West Virginia is at or near the bottom in most statistics concerning broadband speeds and customer penetration rates.

Latest Industry Statistics

The statistics are out for the biggest cable TV and data providers for the first quarter of the year and they show an industry that is still undergoing big changes. Broadband keeps growing and cable TV is starting to take some serious hits.

Perhaps the most relevant statistic of all is that there are now more broadband customers in the country than cable TV customers. The crossover happened sometime during the last quarter, a little sooner than predicted due to plunging cable TV subscriber numbers.

For the quarter the cable companies continued to clobber the telcos in terms of broadband customers. Led by big growth in broadband customers at Comcast and Charter the cable companies collectively added a little over 1 million new broadband customers for the quarter. Charter led the growth with 458,000 new broadband subscribers with Comcast a close second at 430,000 new customers.

Led by Frontier's loss of 107,000 broadband customers, the telcos collectively lost a net 45,000 customers for the quarter. Most of Frontier's losses stem from its botched acquisition of Verizon FiOS properties. Verizon lost 27,000 customers, while AT&T U-verse was the only success among the telcos, adding 90,000 new customers.

Looking back over the last year the telcos together lost 727,000 broadband customers while the cable companies together gained 3.11 million customers during the same period. The cable companies now control 63.2% of the broadband market, up from 61.5% of the market a year ago.

Overall the broadband market grew by 2.38 million subscribers over the year ending March 31. It's a market controlled largely by the giant ISPs – the largest cable companies and telcos together account for 93.9 million broadband subscribers.

Cable TV shows a very different picture. The seven largest pay-TV providers collectively lost 487,000 video subscribers for the quarter. That includes AT&T losing 233,000, Charter losing 100,000, Dish Networks losing 143,000, Verizon losing 13,000, Cox losing 4,000 and Altice losing 35,000. The only company to gain video subscribers was Comcast, which added 41,000.
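As a quick sanity check, the per-company changes listed above do net out to the quarterly total – a trivial sketch using only the figures quoted in this post:

```python
# Verify that the per-company changes quoted above sum to the reported
# net loss of 487,000 video subscribers for the quarter.
q1_changes = {
    "AT&T": -233_000,
    "Charter": -100_000,
    "Dish Networks": -143_000,
    "Verizon": -13_000,
    "Cox": -4_000,
    "Altice": -35_000,
    "Comcast": 41_000,
}
print(f"Net change: {sum(q1_changes.values()):,}")  # -487,000
```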

Total industry cable subscriber losses were 762,000 for the quarter, as smaller cable companies and telcos are also losing customers. That is more than five times the industry loss of 141,000 in the first quarter of last year. The industry is now losing 2.4% of the market per year, and that rate is clearly accelerating and will probably grow larger – the annual rate of decline is already significantly higher than last year's rate of 1.8%.

At this point it’s clear that cord cutting is picking up steam and this was the worst performance ever by the industry.

The biggest losers have explanations for their poor performance. Charter says it is doing better among its own historic customers but is losing a lot of customers from the Time Warner acquisition as Charter raises rates and does away with Time Warner promotional discounts. AT&T has been phasing out cable TV over its U-verse network – a DSL service with speeds as high as 45 Mbps, which is proving inadequate to carry both cable TV and broadband together. Dish Networks has been bogged down in numerous carriage and retransmission fights with programmers and has had a number of channels taken off the air.

But even considering all of these stories it's clear that customers are leaving the big companies. Surveys of cord cutters show that very few of them come back to traditional cable once they get used to getting programming in a different way.

What is probably most striking about the numbers is that the first quarter has historically been the best one for the cable industry – in recent years it still produced customer gains even while other quarters trended downward. We'll have to see what this terrible first quarter means for the rest of 2017.

Comparing Streaming and Broadcast Video

One thing that doesn’t get talked about a lot in the battle between broadcast TV and on-line video is video quality. For the most part today broadcast TV still holds the edge over on-line video.

When I think of broadcast TV over a cable system I can't help but remember back twenty years ago when the majority of the channels on a cable system were analog. I remember when certain channels were snowy, when images were doubled with ghosts, and when the first couple of channels in the lineup were nearly unwatchable. Today the vast majority of channels on most cable systems are digital, though there are still exceptions. The conversion to digital resulted in a big improvement in transmission quality.

When cable systems introduced HDTV the quality got even better. I can remember flipping back and forth between the HD and SD versions of the same channel on my Comcast system just to see the huge difference.

This is not to say that cable systems have eliminated quality issues. It's still common on many cable systems to see pixelation, especially during high-action scenes where the background is constantly changing. All cable systems are not the same, so there are differences in quality from one city to the next. All digital video on cable systems is compressed at the head-end and decompressed at the settop box. That process robs a significant amount of quality from a transmission, and one only has to compare any cable movie to one from a Blu-ray to realize how much is lost in the translation.

In the on-line world buffered video can be as good as cable system video. But on-line video distributors tend to compress video even more than cable systems – something they largely get away with since a lot of on-line video is watched on smaller screens. That means a side-by-side comparison of SD or HD movies would usually favor the cable system. But Netflix, Amazon and a few others have one advantage today with the spectacular quality of their 4K video – there is nothing comparable on cable networks.

But on-line live-streamed video still has significant issues. I watch sports on-line and the quality is often poor. The major problem with live-streamed video is delay in the signal getting to the user. Some of that delay is latency – either latency in the backbone network between the video creator and the ISP or latency in the connection between the ISP and the end-user. Unlike downloading a data file, where your computer will wait until it has collected all of the needed packets, live-streamed video is sent to end-users with whatever pixels have arrived by the needed time. This creates all sorts of interesting issues when watching live sports. For instance, there is pixelation, but it doesn't look like the pixelation you see on a cable network. Instead, parts of the screen often get fuzzy when they aren't receiving all the pixels. There are other visible glitches as well, and it's still not uncommon for the entire picture to freeze for a while, which can cause an agonizing gap when you are watching sports since it always seems to happen at a critical moment.

Netflix and Amazon have been working with the Internet backbone providers and the ISPs to fix some of these issues. Latency delays in getting to the ISPs are shrinking and, at least for the major ISPs, will probably not be an issue. But the one issue that still needs to be resolved is the slowdowns and crashes that happen when demand overloads the network. We see ISPs bogging down when showing a hugely popular stream like the NBA finals, compared to a normal NBA game that might only be watched by a hundred thousand viewers nationwide.

One thing in the cable system’s favor is that their quality ought to be improving a lot over the next few years. The big cable providers will be implementing the new ATSC 3.0 video standard that is going to result in a significant improvement in picture quality on HD video streams. The FCC approved the new standard earlier this year and we ought to see it implemented in systems starting in 2018. This new standard will allow cable operators to improve the color clarity and contrast on existing HD video. I’ve seen a demo of a lab version of the standard and the difference is pretty dramatic.

One thing we don’t know, of course, is how much picture quality means to the average video user. I know my teenage daughter seems quite happy watching low-quality video made by other teens on Snapchat, YouTube or Facebook Live. Many people, particularly teens, don’t seem to mind watching video on a smartphone. Video quality makes a difference to many people, but time will tell if improved video quality will stem the tide of cord cutting. It seems that most cord cutters are leaving due to the cost of traditional TV as well as the hassle of working with the cable companies and better video might not be a big enough draw to keep them paying the monthly cable bill.

The End of the MP3?

Last month the Fraunhofer Institute for Integrated Circuits ended its licensing program for the MP3 digital file format. This probably means that the MP3 format will begin fading away, to be replaced over time by newer file formats. MP3 stands for MPEG Audio Layer III and was the first standard that allowed audio files to be compressed with little perceptible loss of sound quality. The US patent for MP3 was issued to Fraunhofer in 1996, and since then the institute has collected royalties on devices that create or play files in that format.

While it might seem a bit odd to be reading a blog about the end of a file format, MP3 files have had such a huge impact on the tech and music industries that they are partly responsible for the early success of the Internet.

The MP3 file revolutionized the way that people listened to music. In the decade before that there had been a proliferation of portable devices that would play cassette tapes or CDs. But those devices did not really bring freedom to listen to music easily everywhere. I can remember the days when I’d have a pile of tapes or CDs in the car so that I could listen to my favorite music while I drove. But the MP3 file format meant that I could rip all of my music into digital files and could carry my whole music collection along with me.

And the MP3 digital files were small enough that people could easily share files with friends and could send music as attachments to emails. But file-sharing of MP3 files really took off in 1999 when Shawn Fanning, John Fanning, and Sean Parker launched the peer-to-peer network Napster. This service gave people access to the entire music collections of huge numbers of others. Napster was so popular that the traffic generated by the platform crashed broadband networks at colleges and caused havoc with many ISP networks.

In 2001 Apple launched iTunes, which let people rip and organize digital music files, and in 2003 it opened a store where people could legally buy songs. Apple supported MP3 in iTunes but sold its store downloads in the AAC format, probably mostly to avoid paying the MP3 licensing fees. Internet traffic from digital music grew to be gigantic. It's hard to remember when the Internet was so much smaller, but the transfer of MP3 files was as significant to Internet traffic in the early 2000s as Netflix is today.

Napster, along with Apple iTunes, revolutionized the music industry and the two are together credited with ending the age of albums. People started listening to their favorite songs and not to entire albums – and this was a huge change for the music industry. Album sales dropped precipitously and numerous music labels went out of business. I remember the day I cancelled my subscription to Columbia House because I no longer felt the need to buy CDs.

Of course, Napster quickly ran into trouble for helping people violate music copyrights and was driven out of business. But the genie was out of the bottle and the allure of sharing MP3 files was too tempting for music lovers. I remember musician friends who always had several large-capacity external hard drives in their car and would regularly swap music collections with others.

One of the consequences of the end of MP3 licensing is that over time it's likely that computers and other devices won't be able to read the format any longer. MP3s are still popular enough that the music players on computers and smartphones all still recognize and play MP3 files. But the history of the Internet has shown us that unsupported formats eventually fizzle into obscurity. For example, much of the programming behind the first web sites is no longer supported, and many of today's devices can no longer view old web sites without downloading software capable of opening the old files.

It's interesting that most people think that once something has been digitized it will last forever. That might be true for important data if somebody makes a special effort to save the digitized files in a place that will keep them safe for a long time. But we've learned that digital storage media are not permanent. Old CDs become unreadable. Hard drives eventually stop working. And even when files are somehow kept, the software needed to read them can fall into obscurity.

There is a huge amount of music created since 2000 that exists only in digital format. Music by famous musicians will likely be maintained and replayed as long as people have an interest in those musicians. But music by lesser-known artists will probably fade away and much of it will disappear. It's easy to envision that in a century or two most of the music we listen to today might have disappeared.

Of course there are the online music streaming services like Spotify that are maintaining huge libraries of music. But if we’ve learned anything in the digital age it’s that companies that make a living peddling digital content don’t themselves have a long shelf life. So we have to wonder what happens to these large libraries when Spotify and similar companies fade away or are replaced by something else.

Regional Differences in Broadband Adoption

The latest Akamai report State of the Internet Q1 2017 contains a lot of interesting facts about broadband adoption and usage in the US and around the world. One of the things that they track is the percentage of broadband users at various data speeds. I think their tracking is the most accurate in the industry because they measure the actual speeds of connectivity, not the subscribed rate that users think they are getting. Most of the largest Internet hubs use Akamai and so they get to see huge volumes of web connections.

Probably the most interesting statistic in the report from a US perspective is that the average broadband connection speed for the whole US has grown to 18.7 Mbps. This is up 8.8% over the last quarter of 2016 and is up 22% from a year ago. This increase was enough to move the US up to tenth place in the world in terms of average connectivity speed. The worldwide connectivity speed is 7.2 Mbps, but that comes with the caveat that it doesn’t include some parts of the world and also doesn’t include the billions who don’t yet have any broadband available.
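A quick back-of-the-envelope calculation from the quoted growth rates gives the implied earlier averages – my own arithmetic, not figures taken from the Akamai report:

```python
# Working backward from the 18.7 Mbps Q1 2017 US average and the quoted
# growth rates. These implied prior averages are my own arithmetic, not
# numbers published by Akamai.
current = 18.7                   # Mbps, Q1 2017 US average
prior_quarter = current / 1.088  # up 8.8% over Q4 2016
prior_year = current / 1.22      # up 22% over Q1 2016
print(f"Implied Q4 2016 average: {prior_quarter:.1f} Mbps")  # ~17.2
print(f"Implied Q1 2016 average: {prior_year:.1f} Mbps")     # ~15.3
```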

What I find most interesting in the connectivity data is how disparate broadband is in different parts of the US. For the first time there are places in the US with average connectivity speeds greater than the FCC's 25 Mbps definition of broadband – the District of Columbia at 28.1 Mbps and Delaware at 25.2 Mbps. Contrast this with Idaho, with an average connectivity speed of 12 Mbps, less than half the speed of the fastest states. Perhaps the most useful statistic in the report is the percentage of connections in each state that meet various speed thresholds:

4 Mbps Adoption. Akamai says that Delaware leads in this category with 98% of connections exceeding a speed of 4 Mbps, with Rhode Island close behind at 97%. Contrast this to the bottom of the list where West Virginia has only 77% of connections exceeding 4 Mbps and Arkansas the next lowest at 81%.

10 Mbps Adoption Rate. Delaware also leads this category with 86% of the broadband connections from the state exceeding 10 Mbps, again just ahead of Rhode Island with 85%. But at the bottom of this list are Idaho at 45%, and Arkansas and New Mexico at 47%.

15 Mbps Adoption Rate. Rhode Island leads this category with 66% of broadband connections exceeding 15 Mbps. At the bottom of this list was Idaho with only 23% of connections exceeding 15 Mbps.

25 Mbps Adoption Rate. The District of Columbia tops this list with 38% of connections exceeding 25 Mbps, with Delaware second at 33%. At the bottom of the list is Idaho where only 7.5% of connections exceeded 25 Mbps, with New Mexico the next lowest at 7.9%.

Since these are the actual speeds of Internet connections one can conjecture there are a number of reasons that contribute to the differences across various states, such as:

  • Availability of fast broadband. The states with the fastest broadband rates happen to be those where a significant percentage of the population has both fiber (Verizon FiOS) and cable modem broadband available. By contrast the states near the bottom of the list tend to have far fewer communities with fiber, and even many communities without cable systems.
  • Affordability. Numerous surveys have shown that affordability is still a major factor for homes being able to afford the broadband connection they want.
  • Choice. Even in places where there is fast broadband available, many households choose slower broadband speeds due to lack of perceived need.
  • Geography. Terrain plays a role as well. In working with rural communities across the country I see that in the Plains states, with their wide-open expanses of land, there has been a proliferation of rural homes served by point-to-multipoint wireless networks delivering speeds of 10 – 50 Mbps. But this technology is of far less value in places like West Virginia with hilly and wooded terrain.

One thing this report shows is that the disparity between the top and the bottom states on these various lists is widening. In places where fast broadband is available, the statistics show that a lot of people are upgrading to faster speeds. But in states near the bottom of the list, where the broadband networks are the least robust, the same upward migration to faster speeds is not possible due to the lack of options. One would think that most of the country would look like Delaware in terms of broadband adoption rates if fast broadband were available to everybody. But differences in technologies and infrastructure limit households from buying the broadband speeds they want.

The other thing to remember about these statistics is that they are only measuring the speeds for actual broadband connections, and so obviously exclude the millions of households in the country that still don’t have a reasonable broadband alternative. If those households were weighted into these statistics then states with large rural areas with no broadband would sink down the list.

Big Companies and Telecommuting

One of the biggest benefits most communities see when they first get good broadband is the ability for people to telecommute or work from home. Communities that get broadband for the first time report that this is one of the most visible changes in the community, and that soon after getting broadband almost every street and road has somebody working from home.

CCG is a great example of telecommuting – our company went virtual fifteen years ago. The main thing that sent us home in those days was that residential broadband was better than what we could get at the office. All of our employees could get 1 – 2 Mbps broadband at home, which was about the same speed as the T1 at our offices. But we found that even in those early days a T1 was not enough speed to share among multiple employees.

Telecommuting really picked up at about the same time that CCG went virtual. I recall that AT&T was an early promoter of telecommuting, as was the federal government. At first these big organizations let employees work at home a day or two a week as a trial. But that worked out so well that over time they became comfortable with people working out of their homes. I've seen a number of studies showing that telecommuting employees are more productive than office employees and work longer hours – due in part to not having to commute. Telecommuting has become so pervasive that a cover story in Forbes in 2013 announced that one out of five American workers worked at home.

Another one of the early pioneers in telecommuting was IBM. A few years ago they announced that 40% of their 380,000 employees worked outside of traditional offices. But last week the company announced that they were ending telecommuting. They told employees in many of their major divisions like Watson development, software development and digital marketing and design that they must move back into a handful of regional offices or leave the company.

The company has seen decreasing revenues for twenty straight quarters, and there is speculation that this is a way to reduce the workforce without having to go through the pain of choosing who will leave. But what is extraordinary about this announcement is how rare it is. IBM is only the second major company to end telecommuting in recent memory, the other being Yahoo in 2013.

Both IBM and Yahoo were concerned about earnings and that is probably one of the major reasons that drove their decision to end telecommuting. It seems a bit ironic that companies would make this choice when it’s clear that telecommuting saves money for the employer – something IBM crowed about earlier this year.

Here are just a few of the major findings from studies of the benefits of telecommuting. It improves employee morale and job satisfaction. It reduces attrition and cuts sick and unscheduled leave. It saves companies on office space and overhead costs. It reduces discrimination by equalizing people based on personality and talent rather than race, age or appearance. And it increases productivity by eliminating unneeded meetings and because telecommuters work more hours than office workers.

But there are downsides. It’s hard to train new employees in a telecommuting environment. One of the most common ways to train new people is to have them spend time with somebody more experienced – something that is difficult with telecommuting. Telecommuting makes it harder to brainstorm ideas, something that benefits from live interaction. And possibly the biggest drawback is that telecommuting isn’t for everybody. Some people cannot function well outside of a structured environment.

As good as telecommuting is for companies it’s even better for smaller and rural communities. A lot of people want to live in the communities they grew up in, around friends and family. We’ve seen a brain drain from rural areas for decades as kids graduate from high school or college and are unable to find meaningful work. But telecommuting lets people live where there is broadband. Many communities that have had broadband come to town report that they see an almost instant uptick in housing prices and demand for housing. And part of that increased demand is from those who want to choose a community rather than follow a job.

One of the more interesting projects I’ve worked on with the telecommuting issue was when I helped the city of Lafayette, Louisiana get a fiber network. Lafayette is not a rural area but a thriving mid-size city, and yet one of the major reasons the residents wanted fiber was the chance to keep their kids at home. The area is largely Cajun with a unique culture and the community was unhappy to see their children have to relocate to larger cities to get jobs after graduating from the university there. Broadband alone can’t fix that kind of problem, but Lafayette is reportedly happy with the changes brought from the fiber network. That’s the kind of benefit that’s hard to quantify in dollar terms.

Net Neutrality and the Digital Divide

There is an interesting idea floating around the industry that is bound to annoy fans of net neutrality. The idea comes from Roslyn Layton who does telecom research at Aalborg University in Denmark. She served on the FCC Transition team for the new administration.

She envisions zero-rating as the best way to solve the digital divide and to finally bring Internet access to everybody. She says that after decades of not finding any other solution, this might be the only reasonable path to get Internet access to people who can't afford a monthly subscription.

The idea is simple – there are companies who will provide an advertising-driven broadband connection for free to customers, particularly on a cellphone. It’s not hard to envision big companies like Facebook or Google sponsoring cellphone connections and providing data access to customers who would be a captive audience for their ads and content.

This idea is already working elsewhere. Facebook offers this same service in other countries today under the brand name "Free Basics." While it certainly costs Facebook to buy the wholesale data connections, they must have done the math and figured that having a new customer on their platform is worth more than the cost. Facebook's stated goal is to serve most of the billions of people on earth, and this is a good way to add a lot of customers. With Free Basics, customers get full use of the Facebook platform along with the basic ability to surf the web. However, the service does not allow a user to freely watch streaming video or do other data-intensive activities outside the Facebook universe – it's not an unlimited data plan. I can remember similar products in the US back in the dial-up days, when several dial-up providers gave free connections as long as the customers didn't mind being bombarded by ads.

There are certainly upsides to this. Such a service would provide enough bandwidth for people to use the web for the basics like hunting for a job or doing school work. And users would get unlimited use of the Facebook platform for functions such as messaging or watching Facebook-sponsored video and content. There are still a substantial number of people in the US who can’t afford a broadband subscription and this would provide a basic level of broadband to anybody willing to deal with the ad-heavy environment.

But there are downsides. This idea violates net neutrality. Even if the current FCC does away with net neutrality one has to think that a future FCC will institute something similar. But even with net neutrality rules in place the FCC could make an exception for a service that tackles the digital divide.

The real downside is that this is not the same as the real internet access that others enjoy. Users would be largely trapped inside whatever platform sponsors their product. That could be Facebook or Google, but it could also be an organization with a social or political agenda. Anybody using this kind of free platform would have something less than unfettered Internet access, and they would be limited to whatever the platform sponsor allows them to see or do outside the base platform. At best this could be called curated Internet access, but realistically it’s a platform to give sponsors unlimited access to users.

But I think we have to be realistic that nobody has yet found a solution to the digital divide. The FCC's Lifeline program barely makes a dent in it. And I'm not aware of any major ISP that has ever found a mechanism to solve the digital divide issue.

While Facebook offers this in many countries around the globe, it received massive pushback when it tried to bring the service to India. The Indian government did not want a class of people given a clearly inferior class of Internet connectivity, and it is working hard itself to solve the digital divide. But nobody in the US is giving the issue more than lip service. The issue has been with us since the dial-up days and there has been little progress in the decades since.

I read some persuasive articles about this kind of product a few years ago when net neutrality was being debated. There were arguments that there would be long-term negative ramifications from having a second-class kind of Internet access, and the articles worried about the underlying sponsors heavily influencing people with their particular agendas.

But on the flip side, somebody who doesn’t have broadband access probably thinks this is a great idea. It’s unrealistic to think that people have adequate broadband access when they can only get it at the library or a coffee shop. For broadband to benefit somebody it needs to be available when and where they need to use it.

I lean towards thinking this is an idea worth trying. I would hope that there would be more than one or two companies willing to sponsor it, in which case any provider that is too obnoxious or restrictive would not retain customers. People who go to sites like Facebook today already voluntarily subject themselves to ads, so this doesn't seem like too steep a price to pay to get more people connected to the Internet.