Data Caps Again?

My prediction is that we are going to see more stringent data caps in our future. Some of the bigger ISPs have data caps today, but for the most part the caps are not onerous. Still, I foresee stricter caps being introduced as another way for big ISPs to improve revenues.

You might recall that Comcast tried to introduce a monthly 300 GB data cap in 2015. When customers hit that mark Comcast was going to charge $10 for every additional 50 GB of download, or $30 extra for unlimited downloading.

There was a lot of public outcry about those data caps, and Comcast backed down from the plan due to pressure from the Tom Wheeler FCC. At the time the FCC probably didn’t have the authority to force Comcast to kill the data caps, but the nature of regulation is that big companies don’t go out of their way to antagonize regulators who can cause them trouble in other areas.

To put that Comcast data cap into perspective, in September of 2017 Cisco predicted that home downloading of video would increase 31% per year through 2021. They estimated that the average household data download in 2017 was already around 130 GB per month. You might think that means most people wouldn’t be worried about the data caps. But it’s easy to underestimate the impact of compound growth, and at a 31% growth rate the average household download of 130 GB would grow to 383 GB by 2021 – considerably over Comcast’s proposed data cap.
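
The compounding is easy to underestimate, so here’s a minimal sketch of that arithmetic using the Cisco figures above:

```python
# Project average household usage at 31% annual growth,
# starting from Cisco's 2017 estimate of 130 GB per month.
usage_gb = 130.0
for year in range(2017, 2022):
    print(f"{year}: {usage_gb:.0f} GB/month")
    usage_gb *= 1.31
# 2021 comes out around 383 GB - well past a 300 GB cap.
```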

Even now there are a lot of households that would be over that cap. It’s likely that most cord cutters use more than 300 GB per month – and it can be argued that Comcast’s data cap would punish those who drop their video. My daughter is off to college now and our usage has dropped, but when she was a senior we got a report from Comcast that said we used over 600 GB per month.

So what are the data caps for the largest ISPs today?

  • Charter, Altice, Verizon and Frontier have no data caps.
  • Comcast moved their data cap to 1 terabyte, with $10 for each additional 50 GB or $50 monthly for unlimited download.
  • AT&T has some of the stingiest data caps. The cap on DSL is 150 GB, on U-verse it’s 250 GB, on 300 Mbps FTTH it’s 1 TB, and gigabit service is unlimited. They charge $10 per extra 50 GB.
  • CenturyLink has a 1 TB cap on DSL and no cap on fiber.
  • Cox has a 1 TB cap with $30 for an extra 500 GB or $50 unlimited.
  • Cable One has no overage charge but largely forces customers who go over the caps to upgrade to more expensive data plans. Their caps are stingy – the cap on a 15 Mbps connection is only 50 GB.
  • Mediacom has perhaps the most expensive data caps – the 60 Mbps plan has a 150 GB cap and the 100 Mbps plan a 1 TB cap. The charge for exceeding a cap is $10 per GB, or $50 for unlimited.

Other than the plans from AT&T, Mediacom and Cable One, none of these caps sound too restrictive.
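
To make the overage math concrete, here’s a minimal sketch of how a Comcast-style overage bill is computed (each ISP’s actual proration rules may differ):

```python
import math

def overage_bill(usage_gb, cap_gb, block_gb=50, per_block=10.0, unlimited_fee=50.0):
    """Overage under a Comcast-style plan: a flat fee per block of data
    started over the cap, but never more than the unlimited add-on fee."""
    if usage_gb <= cap_gb:
        return 0.0
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)
    return min(blocks * per_block, unlimited_fee)

# The 600 GB/month household from above, under the 2015 proposal
# (300 GB cap, $10 per 50 GB, $30 unlimited):
print(overage_bill(600, 300, unlimited_fee=30.0))   # -> 30.0
# The same household under today's 1 TB cap:
print(overage_bill(600, 1000))                      # -> 0.0
```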

Why do I think we’ll see data caps again? All of the ISPs are looking forward just a few years and wondering where they will find the revenues to meet Wall Street’s demand for ever-increasing earnings. The biggest cable companies are still growing broadband customers, mostly by taking customers from DSL. But they understand that the US broadband market is approaching saturation – much as has happened with cellphones. Once every home that wants broadband has it, these companies are in trouble, because bottom-line growth for the last decade has been fueled by the growth of broadband customers and revenues.

A few big ISPs are hoping for new revenues from other sources. For instance, Comcast has already launched a cellular product and is also seeing good success with security and smart home services. But even Comcast will be impacted when broadband sales inevitably stall – and other ISPs will feel the pinch before they do.

ISPs only have a few ways to make more money once customer growth has stalled, with the primary one being higher rates. We saw some modest increases in broadband rates earlier this year – noticeable because rates had been flat for many years. I fully expect we’ll start seeing sizable annual increases in broadband rates – increases that go straight to the bottom line for ISPs. The impact of broadband rate increases is major for these companies – Comcast and Charter, for example, make an extra $250 million per year from a $1 increase in broadband rates.
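
That arithmetic is worth spelling out – a quick sketch, with the subscriber count being a round assumption of mine rather than a reported figure:

```python
def annual_gain(subscribers, monthly_increase=1.00):
    """Extra annual revenue from a flat monthly broadband rate increase."""
    return subscribers * monthly_increase * 12

# Assuming roughly 21 million broadband subscribers for one big ISP:
print(f"${annual_gain(21_000_000):,.0f}")   # -> $252,000,000 per year
```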

Imposing stricter data caps can be as good as a rate increase for an ISP. They can justify it by saying that they are only charging more for those who use the network the most. As we see earnings pressure on these companies I can’t see them passing up such an easy way to increase earnings. In most markets the big cable companies are a near monopoly, and consumers who need decent speeds have fewer alternatives as each year passes. Since the FCC has now walked away from broadband regulation, there will be no regulatory hindrance to the return of stricter data caps.

Charter Upgrading Broadband

We are now starting to see the results of cable companies upgrading to DOCSIS 3.1. Charter, the second biggest ISP in the country, recently announced that it will be able to offer gigabit speeds to virtually its whole footprint of over 40 million passings.

DOCSIS 3.1 is the newest protocol from Cable Labs and allows bonding a practically unlimited number of spare channel slots for broadband. A gigabit data path requires roughly 24 channels on a cable network using the new DOCSIS protocol. In bigger markets this replaces DOCSIS 3.0, which was limited to maximum download speeds in the range of 250 Mbps. I know there are Charter markets with even slower speeds that either operate under older DOCSIS standards or that are slow for some other reason.
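
As a rough illustration of the channel-bonding math (the ~40 Mbps per-channel figure is my approximation for a 6 MHz QAM-256 channel, not a CableLabs specification):

```python
# Back-of-the-envelope bonded-throughput math for a cable network.
MBPS_PER_CHANNEL = 40  # approximate payload of one 6 MHz QAM-256 channel

for channels in (4, 8, 16, 24, 32):
    print(f"{channels:2d} bonded channels ~ {channels * MBPS_PER_CHANNEL:,} Mbps")
# 24 channels lands right around a gigabit, matching the estimate above.
```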

Charter has already begun the upgrades and is now offering gigabit speeds to 9 million passings in major markets like Oahu, Hawaii; Austin, Texas; San Antonio, Texas; Charlotte, North Carolina; Cincinnati, Ohio; Kansas City, Missouri; New York City; and Raleigh-Durham, North Carolina. It’s worth noting that those are all markets with fiber competition, so it’s natural they would be upgraded first.

The new increased speed won’t actually be a gigabit – it will be 940 Mbps download and 35 Mbps upload. (It’s hard to think there is anybody who is really going to care about that distinction.) Cable Labs recently came out with a DOCSIS upgrade that can increase upload speeds, but there’s been no talk from Charter about making that upgrade. Like the other big cable companies, Charter serves businesses that want faster upload speeds with fiber.

Along with the introduction of gigabit broadband, the company also says it’s going to increase the speed of its minimum broadband product. In the competitive markets listed above Charter has already increased the speed of its base product from 100 Mbps to 200 Mbps download.

It’s going to be interesting to find out what Charter means by the promise to cover ‘virtually’ their whole footprint. Charter grew by purchasing systems in a wide range of conditions. I know of smaller Charter markets where customers don’t get more than 20 Mbps. There is also a well-known lawsuit against Charter in New York State claiming that a lot of households in upstate New York are getting speeds far slower than advertised due to outdated cable modems.

The upgrade to DOCSIS 3.1 can be expensive in markets that have not yet been upgraded to DOCSIS 3.0. An upgrade might mean replacing power taps and other portions of the network, and in some cases might even require replacement of the coaxial cable. My guess is that the company won’t rush to upgrade these markets to DOCSIS 3.1 this year; I’m sure it will look at them on a case-by-case basis.

The company has set a target price for a gigabit at $124.95, but in competitive markets like Oahu it was already selling introductory packages for $104.99. There is also a bundling discount for cable subscribers.

The pricing list highlights that they still have markets with advertised speeds as low as 30 Mbps – and the company’s price for the minimum-speed product is the same everywhere, regardless of whether that product is 30 Mbps or 200 Mbps. And as always with cable networks, these are ‘up to’ speeds – as I mentioned, there are markets that don’t meet these advertised speeds today.

Overall this ought to result in a lot of homes and businesses getting faster broadband than they have today. We saw something similar back when the cable companies implemented DOCSIS 3.0 and the bigger companies unilaterally increased customer speeds without increasing prices. Like other Charter customers, I’m curious what they will do in my market. I have the 60 Mbps product, and I’ll be interested to see if my minimum speed is increased to 100 Mbps or 200 Mbps and if I’m offered a gigabit here. With the upgrade time frame they are promising, I shouldn’t have to wait long to find out.

Spectrum and 5G

All of the 5G press has been talking about how 5G is going to bring gigabit wireless speeds everywhere. But that is only going to be possible with millimeter wave spectrum, and even then it requires a reasonably short distance between sender and receiver, as well as bonding together more than one signal using multiple MIMO antennas.

It’s a shame that we’ve let the wireless marketeers equate 5G with gigabit speeds, because that’s what the public is going to expect from every 5G deployment. As I look around the industry I see a lot of other uses for 5G that are going to produce speeds far slower than a gigabit. 5G is a standard that can be applied to any wireless spectrum and brings some benefits over earlier standards. 5G makes it easier to bond multiple channels together for reaching one customer. It also can increase the number of connections that can be made from any given transmitter – with the biggest promise being that the technology will eventually allow connections to large numbers of IoT devices.

Anybody who follows the industry knows about the 5G gigabit trials. Verizon has been loudly touting its gigabit 5G connections using 28 GHz spectrum and plans to launch the product in up to 28 markets this year. They will likely use this as a short-haul fiber replacement to let them more quickly add a new customer to a fiber network or provide a redundant data path to a big data customer. AT&T has been a little quieter about its plans and is going to launch a similar gigabit product using 39 GHz spectrum in three test markets soon.

But there are also a number of announcements for using 5G with other spectrum. For example, T-Mobile has promised to launch 5G nationwide using its 600 MHz spectrum. This is a traditional cellular spectrum that is great for carrying signals for several miles and for going around and through obstacles. T-Mobile has not announced the speeds it hopes to achieve with this spectrum. But the data capacity of 600 MHz is limited, and bonding numerous signals together for one customer will create something faster than LTE, but not spectacularly so. It will be interesting to see what speeds they can achieve in a busy cellular environment.

Sprint is taking a different approach and is deploying 5G using 2.5 GHz spectrum. They have been testing massive MIMO antennas that contain 64 transmit and 64 receive channels. This spectrum doesn’t travel far when used for broadcast, so the technology will be best used in small cell deployments. The company claims to have achieved speeds as fast as 300 Mbps in trials in Seattle, but that requires bonding together a lot of channels, so a commercial deployment in a congested cellular environment is going to be a lot slower.

Outside of the US there seems to be a growing consensus to use 3.5 GHz – the spectrum known in the US as the Citizens Broadband Radio Service (CBRS). That raises the interesting question of which frequencies will end up winning the 5G race. In every new wireless deployment the industry needs to reach an economy of scale in the manufacture of both the radio transmitters and the cellphones or other receivers. Only then can equipment prices drop to the point where a 5G-capable phone will be similar in price to a 4G LTE phone. So at some point soon the industry will need to reach a consensus on the frequencies to be used.

In the past we rarely saw a consensus; rather, some manufacturer and wireless company won the race to get customers and dragged the rest of the industry along. This has practical implications for early adopters of 5G. For instance, somebody buying a 600 MHz phone from T-Mobile is only going to be able to use that data function when near a T-Mobile tower or mini-cell. Until industry consensus is reached, phones that use a unique spectrum are not going to be able to roam on other networks the way they do today with LTE.

Even phones that use the same spectrum might not be able to roam on other carriers if those carriers are using the frequency differently. There are now 5G standards, but we know from practical experience with past wireless deployments that true portability between networks often takes a few years as the industry works out the bugs. This interoperability might be sped up a bit this time because Qualcomm looks to have an early lead in the manufacture of 5G chipsets. But there are other chip manufacturers entering the game, so we’ll have to watch this race as well.

A word of warning to buyers of first-generation 5G smartphones: they are going to have issues. For now it’s likely that the MIMO antennas are going to use a lot of power and will drain cellphone batteries quickly. And the ability to reach a 5G data signal is going to be severely limited for a number of years while the cellular providers extend their 5G networks. Unless you live and work in the heart of one of the trial 5G markets, it’s likely that these phones will be a bit of a novelty for a while – though they will still give a user bragging rights for the ability to get a fast data connection on a cellphone.

Edging Closer to Satellite Broadband

A few weeks ago Elon Musk’s SpaceX launched two test satellites that are the first in a planned low-orbit satellite network that will blanket the earth with broadband. The eventual network, branded as Starlink, will consist of 4,425 satellites deployed at 700 miles above earth and another 7,518 deployed at around 210 miles of altitude.

Getting that many satellites into orbit is a daunting logistical task. To put this into perspective, the nearly 12,000 satellites needed are twice the number of satellites that have been launched in history. It’s going to take a lot of launches to get these into the sky. SpaceX’s workhorse rocket, the Falcon 9, can carry about ten satellites at a time. They have also tested a Falcon Heavy system that could carry 20 or so satellites at a time. Even with a weekly launch of the larger rocket, that’s nearly 600 launches and would take about 11.5 years. To put that number into perspective, the US led the world with 29 successful satellite launches last year, with Russia second at 21 and China at 16.
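
The launch math is worth writing out (the satellite counts are from the filing cited above; the 20-per-launch figure is the assumption from this paragraph):

```python
import math

satellites = 4425 + 7518            # the two planned Starlink shells
per_launch = 20                     # assumed Falcon Heavy capacity
launches = math.ceil(satellites / per_launch)
print(launches)                     # -> 598 launches
print(round(launches / 52, 1))      # -> 11.5 years at one launch per week
```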

SpaceX is still touting this as a network that can make gigabit connections to customers. I’ve read the FCC filing for the proposed network several times, and it looks to me like that kind of speed will require combining signals from multiple satellites to a single customer. I have to wonder if that’s practical when talking about deploying this network to tens of millions of simultaneous subscribers. It’s likely that the standard bandwidth offering is going to be something significantly slower.

There is also a big question in my mind about the capacity of the backhaul network that carries signals to and from the satellites. It’s going to take some major bandwidth to handle the volume of broadband users that SpaceX has in mind. We are seeing landline long-haul fiber networks today that are stressed and reaching capacity. The satellite network will face the same backhaul problems as everybody else and will have to find ways to cope with a world where broadband demand doubles every 3 years or so. If the satellite backhaul gets clogged or if the satellites get over-subscribed, then the quality of broadband will degrade like on any other network.
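
To see why that matters, consider the growth rate implied by that doubling: demand that doubles every 3 years compounds at about 26% per year and is up an order of magnitude within a decade. A quick check:

```python
# Demand doubling every 3 years implies this annual growth rate:
annual = 2 ** (1 / 3) - 1
print(f"{annual:.1%} per year")        # -> 26.0% per year

# Ten years out, total demand is roughly an order of magnitude higher:
print(f"{2 ** (10 / 3):.1f}x today")   # -> 10.1x today
```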

Interestingly, SpaceX is not the only one chasing this business plan. For instance, billionaire Richard Branson wants to build a similar network that would put 720 low-orbit satellites over North America. Telesat has launched two different test satellites and also wants to deploy a large satellite network. Boeing has also announced intentions to launch a 1,000-satellite network over North America. It’s sounding like our skies are going to get pretty full!

SpaceX is still predicting that the network is going to cost roughly $10 billion to deploy. There’s been no talk of consumer prices yet, but the company obviously has a business plan – Musk wants to use this business as the primary way to fund the colonization of Mars. But pricing is an issue for a number of reasons. The satellites will have some finite capacity for customer connections. In one of the many articles I read I saw a goal for the network of 40 million customers (I don’t know if that’s the right number, but there is some number of simultaneous connections the network can handle). 40 million customers sounds huge, but with a current worldwide population of over 7.6 billion people it’s minuscule for a worldwide market.

There are those predicting that this will be the salvation for rural broadband. But I think that’s going to depend on pricing. If this is priced affordably then there will be millions in cities who would love to escape the cable company monopoly, and who could overwhelm the satellite network. There is also the issue of local demand. Only a limited number of satellites can see any given slice of geography. The network might easily accommodate everybody in Wyoming or Alaska, but won’t be able to do the same anywhere close to a big city.

Another issue is worldwide pricing. A price that might be right in the US might be ten times more than what is affordable in Africa or Asia. So there are bound to be pricing differences based upon regional incomes.

One of the stickier issues will be the reaction of governments that don’t want citizens using the network. There is no way China is going to let citizens bypass the Great Firewall by going through these satellites. Repressive regimes like North Korea will likely make it illegal to use the network. And even democratic countries like India might not like the idea – last year they turned down free Internet from Facebook because it wasn’t an ‘Indian’ solution.

The bottom line is that this is an intriguing idea. If the technology works as promised, and if Musk can find the money and figure out the logistics to get this launched, it’s going to be another new source of broadband. But satellite networks are not going to solve the world’s broadband problems, because they are only going to be able to help a small percentage of the world’s population. With that said, a remote farm in the US or a village in Africa is going to love this when it’s available.

Abandoned Telecom Infrastructure

I saw an article about Merrill, Oregon, where the city was wrestling with what to do about an abandoned cable TV network hanging on poles in the City. It’s actually a fairly common occurrence to have abandoned telecom property on poles, and I’ve been contacted over the years by a number of cities wondering how to deal with the situation.

In this particular case the historic cable system in the city was operated by Rapid Communications out of Texas. That company sold cable properties to a number of companies in 2008 and the Merrill system went to Almega Cable Company, which stopped offering service in the City and went out of business in 2011.

There are all sorts of telecom assets that have been abandoned on poles, and defunct cable companies are only one example. I saw a lot of WiFi mesh networks abandoned fifteen years ago as operators folded and never retrieved their equipment. There are numerous CLECs that folded in the late 1990s and walked away from fiber networks on poles.

Having an abandoned set of wires on poles complicates the lives of any other pole users in the market. The unused wires take up space on poles and make it hard for anybody else to add additional wires onto the pole.

Abandoned networks also create havoc for the normal pole attachment process. This process requires buy-in from existing utilities to move or rearrange cables to make room for a new attacher. A new attacher can be paralyzed if they are unable to create the required clearance from existing wires.

In the end I’ve almost always seen the responsibility for getting rid of the network fall to the local government. Somebody has to go through the process of making certain there is no remaining active owner of the network before it can be condemned. Generally the pole owner is not willing to take on that role unless they have a need of their own to add wires to the poles.

Merrill is now undertaking the task of condemning the network. They have to follow the law and post public notices to make sure that nobody claims rights to the cables. In the case of a cable company the City not only has to deal with the wires on poles, but also with customer drops and pedestals scattered throughout the community.

Merrill is hoping that some new carrier will want to use the cable network for overlashing fiber. Overlashing is the process of tying the fiber to existing wires and is generally the lowest-cost method of fiber construction. But even if they find a taker, my guess is that the new fiber provider is not going to want to assume ownership of the coaxial cables, since that would give them liability for any issues or problems with the old wiring. So the City might end up owning the cables in perpetuity. If they don’t find a buyer, the city will have to pay to have the cables removed – although in today’s market there might be enough value in the copper inside the coaxial cables to offset the cost of removal.

We are going to see a lot more abandoned assets on poles in the future. We are just now entering a period when numerous companies are going to want to hang wireless devices of all types on poles. Some of these devices are tiny and I’ve seen others that are the size of a dorm refrigerator. It’s inevitable that some of the wireless deployments will fail, or that the wireless companies will lose the customers served by a given device.

Over time a significant inventory of abandoned wireless devices will likely grow in most cities. And unlike an abandoned cable network, my guess is that it’s often going to be hard to know which wireless devices have been abandoned or even who owns many of them. Cities ought to be considering ordinances today that require the companies that deploy wireless devices to somehow notify them of what they are doing and to also clearly label the ownership of each device.

But there is a movement at the FCC, in Congress and in state legislatures to institute rules for wireless carriers that would override any local rules. Such global rules are going to hinder cities in the coming decades when they try to deal with abandoned assets clogging their pole lines. Most of the proposed new rules I’ve seen don’t address this issue, which will make it messy to deal with later.

Is AT&T Violating Net Neutrality?

I got a text on my AT&T cellphone last month that told me that my wireless plan now includes sponsored data. Specifically they told me that I could now stream movies and other content from DirecTV or U-Verse TV without the video counting against my monthly data cap. This has been available to AT&T post-paid customers for a while, but now is apparently available to all customers. What I found most interesting about the message was that it coincided with the official end of net neutrality.

AT&T is not the first cellular company to do this. Verizon tried this a few years ago, although that attempt was largely unsuccessful because they didn’t offer much content that people wanted to watch. T-Mobile does something similar with their Binge On program, but since most of their data plans are unlimited, customers can watch anything on their phones, not just the Binge On video.

The sponsored data from AT&T would be a direct violation of net neutrality if it were still in effect, and is a textbook example of paid prioritization. By excusing the DirecTV content from cellular data caps they have created an advantage for DirecTV over competitors. It doesn’t really matter that AT&T happens to own DirecTV, and I imagine that AT&T is now shopping this same idea around to other video providers.

So what is wrong with what AT&T is doing? Certainly the many customers who buy both AT&T cellphones and DirecTV will like the plan. Cellular data in the US is still some of the most expensive data in the world, and letting customers watch unlimited video from a sponsored video provider is a huge benefit. Most people are careful not to go over monthly data limits, and that means they carefully curtail watching video on cellphones. But customers taking advantage of sponsored video are going to watch video that would likely have exceeded their monthly data cap – it doesn’t take more than a handful of movies to do that.
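
To put ‘a handful of movies’ into numbers: assuming roughly 3 GB per hour for an HD stream (my assumption – actual bitrates vary by service and resolution), a couple of movies will blow through a typical cellular data allowance:

```python
GB_PER_HD_HOUR = 3.0           # assumed HD streaming rate
movie_gb = 2 * GB_PER_HD_HOUR  # a roughly 2-hour movie

for plan_gb in (5, 10, 15):
    print(f"{plan_gb} GB plan: {plan_gb / movie_gb:.1f} movies before hitting the cap")
```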

AT&T has huge market power, with almost 140 million cellphone users on their network at the end of last year. Any video provider they sponsor is going to gain a significant advantage over other video providers. AT&T customers who like watching video on their cellphones are likely to pick DirecTV over Comcast or any other video provider.

It’s also going to be extremely tempting for AT&T to give prioritized routing to DirecTV video – which means implementing an Internet fast lane. AT&T is going to want their cellular customers to have a quality experience, and they can do that by making sure that DirecTV video has the best connections throughout their network. They don’t necessarily have to throttle other video to make DirecTV better – they can just make sure that DirecTV video gets the best possible routing.

I know that to many people the AT&T plan is going to feel somewhat harmless. After all, they are bundling together their own cellular and video products. But it’s a short step from here for AT&T to start giving priority to content from others who are willing to pay for it. It’s not too hard to imagine them offering the same plan to Netflix, YouTube or Facebook.

If this plan expands beyond AT&T’s own video, we’ll start seeing the negative impacts of paid prioritization:

  • Only the biggest companies like Netflix, Facebook or Google can afford to pay AT&T for this treatment. That is going to shut out smaller video providers and start-ups. Already in the short history of the web we’ve seen a big turnover in popular platforms – gone or greatly diminished are earlier platforms like AOL, CompuServe and Prodigy. But with the boost given by paid prioritization, the big companies of today will get a leg up to remain the predominant players on the web. Innovation is going to be severely hampered.
  • This is also the beginning of a curated web where many people only see the world through the filter of the predominant web services. We already see that phenomenon a lot today, but when people are funneled to only using the big web services this will grow and magnify.
  • It’s not hard to imagine the next step where we see reduced price data plans that are ‘sponsored’ by somebody like Facebook. Such platforms will likely make it a challenge for customers to step outside their platform. And that will lead to a segmentation and slow death of the web as we know it.

Interestingly, the Tom Wheeler FCC told AT&T that this practice was unacceptable. But through the change of administration AT&T never stopped the practice, and it is now expanding it. It’s likely that courts are going to stay some or all of the net neutrality order until the various lawsuits on the issue get resolved. But AT&T clearly feels emboldened to move forward with this, probably because they know the current FCC won’t address the issue even if net neutrality stays in effect.

Industry Shorts – March 2018

Following are a few topics that I find interesting, but which are too short to cover in a full blog:

Surge in Online Video Subscriptions. The number of households buying online video is surging. Netflix added almost 2 million US and 6.36 million international customers in the 4th quarter of 2017 – 18% more than in the same quarter a year earlier. There are also a growing number of vMVPD customers. At the end of last year CBS All Access had nearly 5 million customers, and Showtime OTT also had nearly 5 million. Sling TV now has nearly 2 million customers. AT&T’s DirecTV Now hit the 1 million customer mark in December. PlayStation Vue reported 670,000 customers in mid-December. The new YouTube TV service has about 300,000. Hulu is also growing but doesn’t report its live TV customers separately from its video-on-demand customers (reported at 17 million total in December). Note that Hulu lets customers buy a single TV series or movie without needing a subscription.

Cellphone Data Usage Growth. According to the research firm NPD, the average US smartphone is now used for 31.4 GB of data per month. This is combined cellular and WiFi usage and is evidence that people are starting to really accept the small screen for video. It’s up over 25% from a year earlier. The firm reports that video accounts for 83% of the usage.

The number of people willing to use a cellphone for video has also surged. NPD reports that 67% of cellphone users watched at least one video during the 3rd quarter of 2017, up from 57% in the 2nd quarter. Another research firm, Strategy Analytics, reported that worldwide cellular data usage grew 115% in 2017 – more than doubling.

Global Streaming Doubled in 2017. Conviva, which provides systems to monitor and analyze online usage, also reports that online video viewing more than doubled last year. They report that there were 12.6 billion viewing hours of online video in 2017, measured across 2.4 billion viewing devices. They report that 58% of video viewing came from North America; 21% from Europe; 19% from Asia; and 2% from the rest of the world.

Satellite TV Taking the Brunt of Cord Cutting. For some reason cord cutting seems to be hitting the two big satellite TV providers even harder than the landline cable companies. Dish Networks and DirecTV together lost 4.7% of their subscribers in the fourth quarter of 2017. We can only speculate about the reasons for the shift. The bundles of the landline cable companies make it harder for customers to drop their cable subscription. Offsetting this, many satellite customers are in rural areas where there is often no good broadband alternative – which ought to make them keep satellite TV. But perhaps the FCC’s CAF II and ACAM programs are speeding up rural broadband enough for households to consider cutting the cord. It should be noted that AT&T is pushing their DirecTV Now product more than their satellite TV, which also might account for part of the shift away from satellite TV.

Apple Jumps into Programming. Apple has quietly gotten into the programming business. They’ve allocated over $1 billion in 2018 for the creation of new content. They’ve landed some big-name talent such as Steven Spielberg, Jennifer Aniston and Reese Witherspoon for projects. Apple doesn’t have a content platform, and the industry is buzzing with speculation about how they might market and distribute the content.

Pirated Video Content on the Rise. Sandvine reports that 6.5% of North American households have accessed pirated video content in the last year. I’ve read reports from Canada of companies openly advertising pirated content, including providing set-top boxes that can download IPTV content directly from the Internet. Yesterday’s blog talked about new efforts by content owners to force ISPs to police copyright infringement.

The Newest Battle of Copyright Infringement

For years the big ISPs have paid lip service to complaints about customers who violate copyrights by sharing music and video content on the web. Every big ISP has had a process in place that was intended to police violations of the Digital Millennium Copyright Act (DMCA).

The owners of copyrighted materials have long complained that the ISP response to violators has been weak and ineffective. And they are right, in that most ISPs notify customers that they are accused of violating copyrights, but there have been few or no consequences for violators.

However, that might now be changing due to a lawsuit that’s been in the courts for a few years. Music label BMG sued Cox Communications for not providing adequate protection of its copyrighted music. Recently the 4th Circuit Court, on appeal, reversed the original verdict against Cox. However, in doing so the court threw out Cox’s primary defense, which was that they were protected by the ‘safe harbor’ provisions that are part of the DMCA.

The safe harbor rules protect ISPs like Cox against damages from customer theft of copyrighted materials. Removing the safe harbor means that the owners of copyrighted materials can seek and win damages against ISPs if they don’t take adequate steps to protect copyrights. In the specific case against Cox, the BMG issue was that Cox didn’t do anything to deter repeat offenders.

There are apparently a lot of repeat offenders – customers who share a lot of copyrighted material – so this ruling instantly got the attention of other big ISPs. Comcast responded last week by notifying customers of a new policy for repeat offenders of copyright theft. The new policy has several progressive stages of severity:

  • Customers notified of DMCA violations might be forced to log in fresh to their broadband account, and in doing so will probably have to agree to abide by the company’s DMCA policy before getting access. Customers might also have to talk to Comcast customer service before they can log into their broadband account.
  • Customers that continue to violate DMCA policies after this first stage face termination of their broadband and all other Comcast services.

This is going to have a chilling effect on those who share copyrighted materials. A majority of people live in markets where the cable company offers the best broadband, and losing the home broadband connection is drastic. I have to assume the telcos will come up with similar policies, meaning that DSL also won’t be a refuge for anybody who continues to violate copyrights.

There have always been people who share content. The old public bulletin boards were full of copyrighted songs and pictures that could be shared. Over time this morphed into Napster and other file-sharing services. Today there are still a number of sharing sites on Tor and elsewhere on the web, and people have figured out how to use Kodi and other technologies to capture and share copyrighted video files.

Although they don’t want to play the role of policeman, I suspect the big ISPs will be forced to at least somewhat enforce policies like the one Comcast just initiated. There has always been a big tug of war between ISPs and content owners, and this new response from Comcast shows that content owners now have the upper hand. It certainly means that those who continue to share copyrighted materials will eventually face losing their broadband. In today’s world that’s a severe penalty.

Smaller ISPs need to pay attention to this and watch what the big companies are doing. I wouldn’t be surprised to see BMG or some other content owner sue a smaller ISP to make the point that this applies to everybody – and nobody wants to be that ISP. If the big ISPs really enforce this, then small ISPs need to follow suit and figure out an effective way to police and deter repeat copyright violators.

Public Networks and Privacy

I’ve been investigating smart city applications, and one of the features that many smart network vendors are touting is expanded public safety networks that provide cameras and other monitoring devices for police, making it easier to monitor neighborhoods and solve crimes. This seems like something most police departments have on their wish list, because cameras work 24/7 and can see things that people are never likely to witness.

The question I ask today is whether this is what America wants. There are a few examples of cities with ubiquitous video surveillance, like London, but is that kind of surveillance going to work in the US?

I think we’ve gotten our first clue from Seattle. The City installed a WiFi mesh network using Aruba wireless equipment in 2013 with a $3.6 million grant from the Department of Homeland Security. The initial vision for the network was that it would be a valuable tool to provide security in the major port in Seattle as well as provide communications for first responders during emergencies. At the time of installation the city intended to also make the surveillance capabilities available to numerous departments within the City, not just to the police.

But when the antennas, like the one shown with this blog, went up in downtown Seattle in 2013, a number of groups began questioning the city about their surveillance policies and the proposed use of these devices. Various groups including the ACLU voiced concerns that the network would be able to track cellphones, laptops and other devices that have MAC addresses. This could allow the City to gather information on anybody moving in downtown or the Port and might allow the City to do things like identify and track protesters, monitor who enters and leaves downtown buildings, track the movement of homeless people who have cellphones, etc.

Privacy groups and the ACLU complained to the City that the network effectively was a suspicionless surveillance system that monitors the general population and is a major violation of various constitutional rights. The instant and loud protests about the network caught City officials by surprise and by the end of 2013 they deactivated the network until they developed a surveillance policy. The city never denied that the system could monitor the kinds of things that citizens were wary of. That surveillance policy never materialized, and the City recently hired a vendor to dismantle the network and salvage any usable parts for use elsewhere in the City.

I can think of other public outcries that have led to discontinuance of public monitoring systems, particularly speed camera networks that catch and ticket speeders. Numerous communities tried that idea and scrapped it after massive citizen outrage. New York City installed a downtown WiFi network a few years ago that was to include security cameras and other monitoring devices. From what I read they’ve never yet activated the security features, probably for similar reasons. A web search shows that other cities like Chicago have implemented a network similar to Seattle’s and have not gotten the negative public reaction.

The Seattle debacle leads to the question of what constitutes reasonable surveillance. The developers of smart city solutions today are promoting the same kinds of features contained in the Seattle network, plus new ones. Technology has advanced since 2013, and newer systems promise to include the first generation of facial recognition software and even the ability to identify people by their walking gait. These new monitoring devices won’t just track people with cellphones – they can identify and track everybody.

I think there is probably a disconnect between what smart city vendors are developing and what the public wants out of their city government. I would think that most citizens are in favor of smart city solutions like smart traffic systems that would eliminate driving backups, such as changing the timing of lights to get people through town as efficiently as possible.

But I wonder how many people really want their city to identify and track them every time they come within reach of one of these monitors. The information gathered can be incredibly personal – it identifies where somebody is, with a time stamp. The worry is not just that a city might misuse such personal information; IT security people I’ve talked to believe that many municipal IT networks are susceptible to hacking.

In the vendors’ defense, they are promoting features that already function well. Surveillance cameras and other associated monitors are tried and true technologies. Some of the newer features like facial recognition are cutting edge, but surveillance systems installed today can likely be upgraded with software changes as the technology gets better.

I know I would be uncomfortable if my city installed this kind of surveillance system. I don’t go downtown except to go to restaurants or bars, but what I do is private and is not the city’s business. Unfortunately, I suspect that city officials all over the country will get enamored with the claims from smart city vendors and will be tempted to install these kinds of systems. I just hope there is enough public discussion of city plans so that the public understands what their city is proposing. I’m sure there are cities where the public will support this technology, but plenty of others where citizens will hate the idea. Just because we have the technical capability to monitor everybody doesn’t mean we ought to.

The Infrastructure Plan and Broadband

The administration finally published their infrastructure plan last week. The document is an interesting read, particularly for those with a financial bent like me. There is no way this early to know if this plan has any chance to make it through Congress, or how much it might change if it does pass. But it’s worth reviewing because it lets us know what the government is thinking about infrastructure and rural broadband.

First, the details of the plan:

  • The proposed plan provides $200B of federal funding over 10 years;
  • $100B goes to states in the form of a 20% grant for new projects that won’t require additional future federal spending;
  • $50B is set aside as grants to states for rural infrastructure, which states can use as they wish;
  • $20B goes to projects that are in the demonstration phase of new technologies and that can’t easily attract other financing;
  • $20B would go towards existing federal loan programs, including the Transportation Infrastructure Finance and Innovation Act (TIFIA) and the Water Infrastructure Finance and Innovation Act (WIFIA);
  • Finally, $10B would be used to create a revolving fund to allow the purchase, rather than the lease, of federal infrastructure.

The funding for the program is a bit murky, as you would expect at this early stage. It appears that some of the funding comes from existing federal highway infrastructure funds, and one might suppose those funds will still be aimed at highways.

This plan gives governors and state legislators a lot of new money to disperse, meaning that every state is likely to tackle this in a different way. That alone is going to mean a varied approach to funding or not funding rural broadband.

The plan is completely silent on broadband funding. This makes sense since the plan largely hands funds to the states. The program does not promote rural broadband, but it certainly does not preclude it. The most likely source of any funding for rural broadband would come out of the $50B rural pot. We’ll have to wait and see what strings are attached to that money, but the proposal would largely hand it to the states and let them decide how to use it.

The larger $100B pot is to be used to provide up to 20% of the funding for projects, and there are very few rural fiber projects that don’t need more than 20% assistance to be viable. If the 20% basis is firm for this pot of funding, I can’t see it being used much for broadband.

States are not going to like this $100B funding pool because this completely flips the role of the federal government in infrastructure funding. Today, for many road and bridge projects the federal government supplies as much as 80% of the funding, and this flips the 80% to the states. Because of this, States are likely to view this as an overall long-term decrease in federal infrastructure spending. The trade-off for the flip, though, is that the money is immediate and the states get to decide what to fund. Today, when the feds are involved it can take literally decades to finish some road projects.

The overall goal of the plan is to promote private investment in infrastructure projects, which contrasts with today, when almost all infrastructure projects are 100% government funded. Historically, public/private partnerships (PPPs) have played only a small role in US infrastructure spending. PPPs have been successful in other countries at helping to build infrastructure projects on time and on budget – a vast improvement over government projects that routinely go over on both. But incorporating PPP financing into infrastructure spending is going to take a change of mindset. That challenge is complicated by the fact that most of this spending will be dispersed by the states, and it’s the states that will or will not embrace PPPs – so we’ll probably see a varied response across the country.

One of the most interesting ideas embedded in the plan is that projects should be funded in such a way as to cover their long-term maintenance costs. That’s a big change from today, where roads, bridges and other major projects are constructed with no thought given to funding ongoing maintenance, or to related environmental and other ancillary costs. This will force a change in thinking about infrastructure to account for the full life-cycle cost of a project up front.

I’ve read a few private reports from investment houses on the plan. The analysis I’ve seen believes that the vast majority of the money will go to highway, bridge and water projects. That might mean very little for rural broadband.

One thing is for sure, though. If something like this plan becomes law then the power to choose infrastructure projects devolves largely to states rather than the federal government. Today states propose projects to the feds, but under this plan the states would be able to use much of the federal funding as they see fit.

There are states that already fund some rural broadband infrastructure, and you might suppose those states would shuttle some of this new funding into those programs. But there are other states, some very rural, that have rejected the idea of helping to fund broadband. Expect a widely varying response if the states get the power to choose projects.

In summary, this plan is not likely to mean any federal broadband grant program. But states could elect to take some of this funding, particularly the $50B rural fund, and use it to promote rural broadband. There are likely to be as many different responses to this funding as there are states. We have a long way to go yet to turn this proposal into concrete funding opportunities.