Setting the FCC Definition of Broadband

In the recently released 2018 Broadband Progress Report, the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record for either eliminating the standard altogether or else reverting to the older definition of 10/1 Mbps.

I’m guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted to allow cellular data to count the same as landline broadband for a household – and that desire was a big factor in the push to lower the definition, since cellular data rarely meets the 25/3 speed standard.

The deliberation on the topic this year raises the question of whether there is some way to create a rule that would better define the speed of needed broadband. It’s worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to want to use broadband. In doing so, they added together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC’s method was too simple and relied on the assumption that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four, which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that’s not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP and your home router try to receive and untangle multiple simultaneous streams, packets collide, get lost, and have to be retransmitted. This is described as adding ‘overhead’ to the transmission process. Depending on the nature of the data streams, the amount of collision overhead can be significant.

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to the various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I’ve covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your own home; it also slows down when it pauses to recognize demands for connection from your neighbors’ WiFi networks.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to download 25 Mbps of usage from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.
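
To make that concrete, here is a minimal back-of-the-envelope sketch in Python. The stream rates and overhead percentages are illustrative assumptions, not measured values:

```python
# A rough model of how simultaneous streams can exceed their nominal sum.
# The overhead factors below are assumptions for illustration only.

STREAMS_MBPS = [5.8, 5.8, 5.8]   # e.g., three simultaneous HD streams
COLLISION_OVERHEAD = 0.15        # assumed 15% retransmission overhead
WIFI_OVERHEAD = 0.20             # assumed 20% WiFi protocol/contention overhead

def effective_demand(streams, collision=COLLISION_OVERHEAD, wifi=WIFI_OVERHEAD):
    """Nominal sum of stream rates, inflated by the assumed overheads."""
    return sum(streams) * (1 + collision) * (1 + wifi)

print(f"Nominal demand:   {sum(STREAMS_MBPS):.1f} Mbps")               # 17.4 Mbps
print(f"Effective demand: {effective_demand(STREAMS_MBPS):.1f} Mbps")  # ~24 Mbps
```

Under these assumed overheads, three HD streams that nominally total 17.4 Mbps would consume nearly all of a 25 Mbps connection.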

The FCC’s definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it’s available. That means a stream of 15 – 20 Mbps download. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

Finally, the FCC’s speed definition needs to consider the busy hour of the day – the time when a household uses the most broadband. It’s the busy-hour demand, not the average demand, that defines the broadband speed the home needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that would measure actual download speeds at a number of homes over time to understand what homes are really using for bandwidth. There are ways this could be done. For example, the FCC could do something for broadband similar to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage, such as Akamai, to sample a large number of US homes. Volunteer homes that meet specific demographics could allow monitoring of their bandwidth usage. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC’s definition of broadband. Rather than changing the official speed periodically, the FCC could change the definition as dictated by the real-world data.
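
As a sketch of how such a program might turn samples into a definition, consider the following; the usage numbers and the 90th-percentile rule are invented for illustration:

```python
# Hypothetical busy-hour peak demand (Mbps) measured at volunteer homes.
import statistics

busy_hour_mbps = [8, 12, 14, 18, 22, 25, 27, 30, 33, 41]

def suggested_definition(samples, quantile=0.9):
    """Pick a speed that covers the busy-hour demand of most sampled homes."""
    ranked = sorted(samples)
    return ranked[int(quantile * (len(ranked) - 1))]

print(f"Median busy-hour demand: {statistics.median(busy_hour_mbps)} Mbps")     # 23.5
print(f"90th-percentile demand:  {suggested_definition(busy_hour_mbps)} Mbps")  # 33
```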

The FCC does some spot checking today of the broadband speeds reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn’t provide the same kind of feedback that a formal ongoing measuring program would. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors used in setting the official definition of broadband, and so perhaps the FCC doesn’t want real facts to get in the way.

Comparing Streaming and Broadcast Video

One thing that doesn’t get talked about a lot in the battle between broadcast TV and on-line video is video quality. For the most part, broadcast TV today still holds the edge over on-line video.

When I think of broadcast TV over a cable system I can’t help but remember twenty years ago, when the majority of the channels on a cable system were analog. I remember when certain channels were snowy, when images were doubled with ghosts, and when the first couple of channels in the cable system were nearly unwatchable. Today the vast majority of channels on most cable systems are digital, though there are still exceptions. The conversion to digital resulted in a big improvement in transmission quality.

When cable systems introduced HDTV the quality got even better. I can remember flipping back and forth between the HD and SD versions of the same channel on my Comcast system just to see the huge difference.

This is not to say that cable systems have eliminated quality issues. It’s still common on many cable systems to see pixelation, especially during high-action scenes where the background is constantly changing. All cable systems are not the same, so there are differences in quality from one city to the next. All digital video on cable systems is compressed at the headend and decompressed at the settop box. That process robs a significant amount of quality from a transmission, and one only has to compare any cable movie to one from a Blu-ray to realize how much is lost in the translation.

In the on-line world buffered video can be as good as cable system video. But on-line video distributors tend to compress video even more than cable systems do – something they can largely get away with since a lot of on-line video is watched on smaller screens. This means that a side-by-side comparison of SD or HD movies would usually favor the cable system. But Netflix, Amazon and a few others have one advantage today with the spectacular quality of their 4K videos – there is nothing comparable on cable networks.

But on-line live-streamed video still has significant issues. I watch sports on-line and the quality is often poor. The problems with live-streamed video are mostly due to delays in the signal getting to the user. Some of that delay is due to latency – either latency in the backbone network between the video creator and the ISP or latency in the connection between the ISP and the end user. Unlike downloading a data file, where your computer will wait until it has collected all of the needed packets, live-streamed video is sent to end users with whatever pixels have arrived at the needed time. This creates all sorts of interesting issues when watching live sports. For instance, there is pixelation, but it doesn’t look like the pixelation you see on a cable network. Instead, parts of the screen often get fuzzy when they aren’t receiving all the pixels. There are other video glitches as well. And it’s still not uncommon for the entire picture to freeze for a while, which can cause an agonizing gap when you are watching sports since it always seems to happen at a critical moment.

Netflix and Amazon have been working with the Internet backbone providers and the ISPs to fix some of these issues. Latency delays in getting to the ISPs are shrinking and, at least for the major ISPs, will probably stop being an issue. But one issue that still needs to be resolved is the crashes that happen when the Internet gets overloaded because demand is too high. We’re seeing ISPs bog down when showing a hugely popular stream like the NBA finals, compared to a normal NBA game that might only be watched by a hundred thousand viewers nationwide.

One thing in the cable system’s favor is that their quality ought to be improving a lot over the next few years. The big cable providers will be implementing the new ATSC 3.0 video standard that is going to result in a significant improvement in picture quality on HD video streams. The FCC approved the new standard earlier this year and we ought to see it implemented in systems starting in 2018. This new standard will allow cable operators to improve the color clarity and contrast on existing HD video. I’ve seen a demo of a lab version of the standard and the difference is pretty dramatic.

One thing we don’t know, of course, is how much picture quality means to the average video user. I know my teenage daughter seems quite happy watching low-quality video made by other teens on Snapchat, YouTube or Facebook Live. Many people, particularly teens, don’t seem to mind watching video on a smartphone. Video quality makes a difference to many people, but time will tell if improved video quality will stem the tide of cord cutting. It seems that most cord cutters are leaving due to the cost of traditional TV as well as the hassle of working with the cable companies, and better video might not be a big enough draw to keep them paying the monthly cable bill.

Are You Ready for 4K Video?

The newest worry for ISPs is the expansion of 4K video. Already today Netflix and Amazon are offering on-line 4K video to customers. Almost all of the new programming being created by both companies is being shot in 4K.

Why is this a concern for ISPs? Netflix says that in order to enjoy a streaming 4K signal a user ought to have a spare 15 – 20 Mbps of bandwidth available if streaming with buffering. The key word is spare, meaning that any other household activity ought to be using other bandwidth. Netflix says that without buffering a user ought to have a spare 25 Mbps.

When we start seeing a significant number of users stream video at those speeds even fiber networks might begin experiencing problems. I’ve never seen a network that doesn’t have at least a few bottlenecks, which often are not apparent until traffic volumes are high. Already today busy-hour video is causing stress to a lot of networks. I think about millions of homes trying to watch the Super Bowl in 4K and shudder to think what that will mean for most networks.
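
For a sense of the scale involved, here’s a trivial sketch; the viewer count and per-stream rate are assumptions:

```python
# Aggregate load from simultaneous 4K streams of one live event.
STREAM_MBPS = 20          # assumed 4K stream rate including headroom
viewers = 1_000_000       # hypothetical homes watching the same event

aggregate_gbps = viewers * STREAM_MBPS / 1000
print(f"{viewers:,} concurrent 4K streams ≈ {aggregate_gbps:,.0f} Gbps")
# 1,000,000 streams at 20 Mbps ≈ 20,000 Gbps, or 20 Tbps, of concurrent capacity
```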

While 4K video is already on-line it is not yet being offered by cable companies. The problem for most of the industry is that there is no clear migration path between today and tomorrow’s best video signal. There are alternatives to 4K being explored by the industry that muddy the picture. Probably the most significant new technology is HDR (high-dynamic range) video. HDR has been around for a few years, but the newest version which captures video in 10-bit samples adds both contrast and color accuracy to TVs. There are other video improvements also being explored such as 10-bit HEVC (high-efficiency video coding) which is expected to replace today’s H.264 standard.

The uncertainty over the best technology migration path has stopped cable companies from making upgrades to HDR or 4K. They are rightfully afraid of investing too much in an early implementation of the technology only to face more upgrades in just a few years. But as the popularity of 4K video increases, the pressure is growing for cable companies to introduce something soon. It’s been reported that Comcast’s latest settop box is 4K capable, although the company is not making any public noise about it.

But as we’ve seen in the past, once customers start buying 4K-capable TVs they are going to want to use them. It’s expected that by 2020 almost every new TV will include some version of HDR technology, which means that the quality of watching today’s 1080p video streams will improve. And by then a significant number of TVs will come standard with 4K capabilities as well.

I remember back when HD television was introduced. I have one friend who is a TV buff and once he was able to get HD channels from Comcast he found that he was unable to watch anything that was broadcast in standard definition. He stopped watching any channel that did not broadcast HD and ignored a huge chunk of his Comcast line-up.

The improvements from going to 4K and/or true HDR will be equally dramatic. The improvement in clarity and color is astonishing as long as you have a TV screen large enough to see the difference. And this means that as people grow to like 4K quality they will migrate towards 4K content.

One thing that is clear is that 4K video will force cable companies to broadcast video over the IP stream. A single 4K signal eats up an entire 6 MHz channel on a cable system, making it impossible for any cable system to broadcast more than a tiny number of 4K channels in the traditional way. And, as Comcast is obviously preparing to do, it also means all new settop boxes and a slew of new electronics at the cable headend to broadcast IPTV.

Of course, like any technology improvement we’ve seen lately, the improvements in video quality don’t stop with 4K. The Japanese plan to broadcast the 2020 Olympics in 8K video. That requires four times as much bandwidth as 4K video – meaning an 80 – 100 Mbps spare IP path. I’m sure that ways will be found to compress the transmission, but it’s still going to require a larger broadband pipe than what most homes buy today. It’s expected that by 2020 there will only be a handful of users in Japan and South Korea ready to view 8K video, but like anything dramatically new, the demand is sure to increase in the following decade.

Do We Really Want to be Watched?

I noticed an article that mentioned that the Google free WiFi hotspots in New York City are equipped with cameras and are able to be used for surveillance. Google says that this function has not been activated, but it got me to thinking about public video surveillance in general.

There has been a surge in recent years in the installation of public surveillance cameras, fostered in part by the fiber networks that are being built everywhere. The sale of outdoor surveillance equipment is growing at about 7% per year. And the quality of that equipment is rapidly improving. New surveillance cameras no longer produce the grainy pictures we all think of as synonymous with security footage but are now using high definition and even 4K video technologies to drastically improve the quality of the images. Fiber bandwidth is allowing for higher frame rates and fewer gaps in the sequence of images.

The city of London led the way over a decade ago and installed enough cameras to saturate certain downtown neighborhoods. Now that these have been in place for a long time, the statistics show that the cameras haven’t really changed the crime rate in the watched neighborhoods. While they did change the nature of the crimes in those areas somewhat, the overall crime rate is close to what it was before the cameras.

Probably the biggest public fear about surveillance is that public cameras will be used to track where we go in public. I know I’m not nuts about the idea of a store knowing who I am as soon as I walk through the door, and I’m even more skeptical of having the government track me as I walk down city streets.

That aspect of surveillance is going to require better facial recognition technology. Currently, Facebook’s facial recognition is said to be able to identify people 98% of the time. Facebook gets such good results by limiting its search to friends and friends-of-friends of the person who posts a picture. Facebook also benefits from having pictures of people from different angles and in different lighting, which lets it build better profiles. The FBI’s software is said to be 85% accurate if a search can be limited to no more than 50 people for the software to consider.

There is no facial recognition software yet that is very good at identifying random people on a public street. However, everybody expects that software to be here in less than a decade through assistance from Artificial Intelligence.

Public surveillance cameras open up a number of ethical issues. The first is that it’s too tempting for law enforcement insiders to misuse surveillance information. Back in 1997 a high-ranking police officer in DC was convicted of using surveillance cameras near a gay bar to identify patrons through license plates and then blackmail them. The Detroit Free Press has reported on cases of policemen using surveillance systems to help friends, stalk estranged spouses, or harass those with whom they had a traffic altercation.

Terrorism experts say that public surveillance cameras not only don’t deter terrorist attacks, but might instead invite them, since the cameras guarantee dramatic images of an attack.

There are also arguments that video surveillance constitutes a Fourth Amendment violation as an unreasonable search. The concern comes not just from having government cameras identify you on the street, but from using that data over time to create a profile of where and when you go out, who you see, and what you do in public.

I know that a lot of US cities are considering putting in many more surveillance cameras as part of smart city initiatives. Tampa, near me, has already begun the installation of an extensive outdoor camera network. I’m sure the city officials that do this have what they think are good reasons for watching their citizens, but our short history with the technology shows that such systems get used for purposes different than what was intended. I, for one, am not a fan of the whole concept, and I suspect most people don’t really want to be watched.

The Growth of 4K Video

It looks like 4K video is making it into the mainstream and is going to put a big strain on broadband networks serving residential customers. 4K video resolution is 3840 x 2160 pixels, or about 8 million pixels on a screen, which is about four times the resolution of an HD display. It takes a lot of bandwidth to stream that many pixels, and with current compression technologies 4K video requires 15 – 20 Mbps download speeds. Google and others are working on better compression techniques that might cut that in half, but even so that would mean video streams at 7 – 10 Mbps. That’s a whole new level of broadband demand that will increase the household need for faster data speeds.
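
A quick sanity check of those numbers; the 50% compression gain is the estimate mentioned above, not a measured result:

```python
# Pixel counts, and the effect of a hypothetical 50% compression improvement.
pixels_4k = 3840 * 2160
pixels_hd = 1920 * 1080
print(f"4K: {pixels_4k:,} pixels ({pixels_4k / pixels_hd:.0f}x HD)")  # 8,294,400 (4x)

low, high = 15, 20                                 # today's 4K stream range in Mbps
print(f"With a 50% codec gain: {low // 2} - {high // 2} Mbps")  # 7 - 10 Mbps
```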

Just a year ago it wasn’t easy to find 4K video on the web, but this year there is a lot of content being shot in the format. This includes:

  • Netflix is currently shooting most of its original content like House of Cards, Breaking Bad, Jessica Jones, and Daredevil in 4K. It also has a big array of documentaries in the format as well as a number of classic movies being reformatted to 4K.
  • Amazon Prime is also filming new content like Alpha House, Transparent, Mozart in the Jungle and Man in the High Castle in 4K. They have a small library of movies in the format.
  • Sony probably has more mainstream movies in the 4K format than anybody. Rather than streaming, you download Sony movies, and a typical movie can take 40 GB of storage space. It doesn’t take too many movie downloads to blow past the data caps of AT&T or Comcast.
  • M-Go has developed a small but growing 4K library in conjunction with Samsung. They also will be adding titles from Fox.
  • Comcast offers a few movies in 4K online for customers in partnership with NBC Universal.
  • YouTube has a huge amount of user-generated 4K video of all different types. YouTube is also now producing original content sold under YouTube Red and which contains 4K content.
  • Ultraflix has a big library of 4K nature documentaries including some originally produced for IMAX. They are also carrying a lot of Hollywood movies.
  • Vudu, which is owned by Walmart, has a small but high-quality set of 4K content. They are the first to marry 4K video to Dolby surround sound.

If 4K follows the same migration path as standard definition video to HD video, then within a few years 4K content is going to be everywhere. Where just a few years ago there was little video on the web, video now seems to be everywhere. There are video ads on all sorts of websites, and social media services like Facebook and Twitter spit out piles of video at a user these days.

One of the biggest problems with broadband regulation in this country is that it fails to recognize the ever-growing nature of broadband demand. Once households start using 4K video, the FCC’s newly minted definition of broadband at 25 Mbps download will already be getting stressed. The fact is that household needs for broadband are just going to keep growing year after year, and any regulatory definition of demand will be obsolete almost as soon as it is established.

Broadband demand has been growing steadily, doubling about every three years, and there is no reason to think that we are anywhere close to the time when that growth curve will slow. 4K video is not the last new technology that will stretch our need for broadband. When I read about where virtual and augmented reality are headed over the next five years it’s not too hard to see where the next big push for more broadband will come from.

A Network Without Wires

There is an emerging trend in the industry to try to create home networks without wires. ISPs and cable companies are putting a lot of faith in WiFi as an alternative to wires running to computers and settop boxes.

It’s an interesting trend but one that is not without peril. The problem is that WiFi, at least as the big ISPs deliver it, is not always the best solution. The big cable companies like Comcast tend to provide customers with a cable modem that has a decent-quality WiFi router built in. This router is placed wherever the cable enters the home, which might not be the ideal location.

A single strong WiFi router can be a great device in a home with a simple network and uncomplicated demands. A family with two TVs, one computer, and a few smartphones is probably going to do fine with a strong WiFi router as long as the house isn’t too large for the signal to get where it’s needed.

But we are quickly changing to a society where many homes have complex data needs scattered throughout the house. People are likely to be demanding video streams from all over the home, often many at the same time. There are bound to be a few computers, and it’s not unlikely that somebody in the house works at home at least part of the time. Demand for big bandwidth is increasing for things like gaming and the new virtual reality headsets just now hitting the market. And we are on the verge of seeing 4K video streams at 15 Mbps. On top of all this will be a variety of smart IoT devices that want occasional attention from the network.

When a home gets crowded with devices it’s very easy to overwhelm a WiFi router. The new routers are pretty adept at setting up multiple data paths, but with too many streams a router loses efficiency as it constantly tries to monitor and adjust the bandwidth for each stream it is managing. When this happens the whole home network can bog down precipitously.
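
The following toy model (not a real WiFi simulation) shows the shape of the problem; the efficiency curve is an assumption invented for illustration:

```python
# Toy model: each added stream costs the router some scheduling efficiency.
def router_efficiency(n_streams: int) -> float:
    """Assumed curve: 5% efficiency lost per additional stream, floor of 20%."""
    return max(0.2, 1.0 - 0.05 * (n_streams - 1))

RAW_MBPS = 60  # hypothetical broadband connection

for n in (1, 4, 8, 12):
    usable = RAW_MBPS * router_efficiency(n)
    print(f"{n:2d} streams: {usable:4.1f} Mbps usable, {usable / n:4.1f} Mbps each")
```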

There are a few solutions to this problem. First, you can run wires directly to a few of the bigger data eaters in a house and remove them from the WiFi network. Just make sure that in doing so you also disable their search for a WiFi signal. But people don’t really want more wires in their home, and ISPs definitely do not like this idea.

The other solution is to add additional WiFi hotspots in the home. The simplest example of this is a WiFi repeater, which simply amplifies the signal from the base WiFi hotspot. However, repeaters don’t improve the contention issue; they simply bring a stronger signal closer to some of the devices that need it.

A more complex solution is to set up a network of interconnected WiFi hotspots. This consists of separate WiFi routers that all feed through one base router, a configuration that is familiar to any network engineer but alien to most homeowners. The main problem with this solution is obvious to anybody who has ever operated a network with multiple routers – getting them to work together efficiently. Setting up a multiple-router network can be challenging for those unfamiliar with networks. And if configured poorly this kind of network can operate worse than one big hotspot.

But these kinds of interconnected WiFi networks are the cutting edge of home networking. I was recently talking to an engineer from a mid-size cable company and he admitted that as many as 20% of their customers already need this kind of solution. It’s a bit ironic that the demand for WiFi is mushrooming so soon after the ISPs went to the one-router solution. The percentage of homes that need a better solution is growing rapidly as homes jam more devices onto WiFi.

So there is an opportunity here for any ISP. Customers need better networks in their homes and there is a revenue opportunity in helping them to set these up. The downside, at least for now, is that this is labor intensive and there may be a lot of maintenance to keep these networks running right. But there are a number of vendors looking into solutions and one would hope that home WiFi networks will soon become plug and play.

Video and Broadband Speeds

Akamai has released its latest quarterly report on the state of broadband around the world. Akamai runs network monitoring software for large ISPs and the Internet backbone providers, which gives it a peek at the actual broadband speeds achieved by end users.

Overall the worldwide Internet keeps getting faster each year. The average speed achieved by end users was 5.1 Mbps download in the third quarter of 2015, up 14% from the year before. Topping the list was South Korea at 20.5 Mbps, followed by Sweden at 17.4 Mbps and Norway at 16.4 Mbps. The US placed 16th globally with an average speed of 12.6 Mbps, up 9.4% from a year ago.

Akamai says that only about 15% of the connections in the world are ready for 4K video, which they estimate requires about a 15 Mbps connection. That’s not a precise figure, but rather an average speed for a 4K video connection. As with all video, the speed required for any given video clip varies by how much the picture changes, with high-action video requiring more bandwidth than low-action scenes.

And so a house with exactly a 15 Mbps connection could watch some 4K video, but it might not be able to watch a very high-action film. Further, this measurement ignores the fact that these days homes need additional bandwidth for a host of other uses, ranging from email to programs and apps that talk to the cloud to other things that happen in the background. It’s more realistic to think that a home is going to need something closer to 20 Mbps if it wants to reliably watch 4K video while accommodating other normal uses of bandwidth.

One of the most interesting statistics in the survey is that the share of homes that get at least 15 Mbps rose to 15% from only 5.2% a year earlier. It’s obvious that ISPs are selling more high-bandwidth connections.

There was a recent announcement that is going to have a big impact on the ability of people to watch quality video. Netflix announced that it is rolling out a new technology that is going to maximize the quality of video for each user’s experience. It is going to offer what it thinks is the best bit rate based upon the content being viewed and the viewer’s connection. Again, this goes back to the fact that there is a significant difference between a high-action movie and one that just has people sitting and talking.

In the past Netflix only had a few standard speeds to try. If it was unable to get a stream through at the speed people requested, it would step the speed down to a fairly low level and hope it worked. For people on slow connections this often meant lower-quality movies, but also transmission problems such as pauses in the movie stream when viewing outpaced the download.

The new technology is supposedly going to be a lot more dynamic. Before, if somebody asked for an HD stream then Netflix tried to send it out at 5.8 Mbps. If a customer’s ISP couldn’t handle this they were automatically downgraded to something much slower.

But now Netflix will first set the download speed according to the content. There are low-action HD videos that might only need 3 or 4 Mbps, and so Netflix will figure out the optimum target speed for each type of content. Further, it will use a wide range of possible step-downs in speed rather than going directly from HD to a very slow speed.
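
A minimal sketch of this kind of content-aware bitrate selection is below. The ladders and thresholds are invented for illustration; Netflix’s actual algorithm is proprietary:

```python
# Hypothetical per-content bitrate ladders (Mbps), best quality first.
LADDERS = {
    "high_action": [5.8, 4.3, 3.0, 1.75, 0.75],
    "low_action":  [4.0, 3.0, 2.0, 1.2, 0.6],
}

def pick_bitrate(content_type: str, measured_mbps: float, headroom: float = 0.8):
    """Choose the highest rendition that fits the measured bandwidth."""
    budget = measured_mbps * headroom      # leave margin for variation
    for rate in LADDERS[content_type]:
        if rate <= budget:
            return rate
    return LADDERS[content_type][-1]       # fall back to the lowest rung

print(pick_bitrate("high_action", 5.0))    # 3.0 - steps down gently
print(pick_bitrate("low_action", 5.0))     # 4.0 - fits at top quality
```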

I’ve seen this being touted in a number of articles as something that will save a lot of bandwidth for Netflix since they will not force all HD content into 5.8 Mbps streams. But those articles also see this as a savings for ISPs, and I think that’s only half right. ISPs with very fast speeds will see a bandwidth savings, but interestingly, ISPs with slow network speeds will probably see an overall increase in bandwidth demand from Netflix.

Today if an ISP offers 3 Mbps, then Netflix might send them an HD video at a third of that speed. But with this new technology Netflix is going to try to maximize the customer experience and will use up more of the available bandwidth. This technology will also make it easier for households with somewhat slow bandwidth to watch more than one video at a time and the Netflix algorithms will try to fit the content into the available data path.

For now Netflix is the only company doing this, but as with all breakthroughs you can expect the rest of the industry to catch up in a year or so. One thing is certain: web video is here to stay, and ISPs are going to be under tremendous pressure to provide enough bandwidth to let people watch what they want online. There doesn’t seem to be any end in sight for the demand for household bandwidth.

 

What’s Up with 4K Video?

It seems like I can’t read tech news lately without seeing an article mentioning something new going on with 4K video. So I thought I would talk a bit about what 4K is and how it differs from other current types of video.

4K is the marketing term to cover what is officially named Ultra High Definition (UHD) video. UHD video is distinguished from current high definition video by having a higher picture resolution (more pixels) as well as more realistic colors and higher frame rates (meaning more pictures per second).

Let’s start with some definitions. 4K video is defined by the Consumer Electronics Association as a video stream that has at least 3,840 x 2,160 pixels. This contrasts with existing high definition (HD) video at 1,920 x 1,080 pixels and standard definition (SD) video at 720 x 480 pixels. These are not precise standards; for example, there is some SD video that is broadcast at 540 pixels. There is also an existing standard for some video cameras that record at 4,096 x 2,160 pixels, which is also considered 4K.
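
Here is a quick comparison of the pixel counts behind those labels, using the resolutions cited above:

```python
RESOLUTIONS = {"SD": (720, 480), "HD": (1920, 1080), "4K": (3840, 2160)}

sd_pixels = 720 * 480
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / sd_pixels:4.1f}x SD)")

# SD:   345,600 pixels ( 1.0x SD)
# HD: 2,073,600 pixels ( 6.0x SD)
# 4K: 8,294,400 pixels (24.0x SD)
```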

The 4K standard was developed in an attempt to deliver digital media to movie theaters, which would save a lot of money compared to shipping around reels of film. Standard HD does not project well onto the big screen, and 4K overcomes a lot of those shortfalls. But high-action movies require more definition than 4K provides and will need the upcoming 8K video standard before they can be digitally transmitted for use on the largest screens.

Interestingly, there is not a huge increase in quality from shifting home viewing from HD to 4K. There is a huge improvement in quality between SD and HD, but the incremental improvements between HD and 4K are much harder to discern. The improvements are due more to the number of different colors being projected, because the human eye cannot really see the pixel differences on relatively small computer or home TV screens. It’s easy to get fooled about the quality of 4K by some of the spectacular demo videos of the technology being shown on the web. But these demos are far different from what run-of-the-mill 4K will look like, and if you think back there were equally impressive demos of HD video years ago.

The major difference between HD and 4K for the broadband industry is the size of the data stream needed to transmit all of the pixel data. Current 4K transmissions online require a data path between 18 Mbps and 22 Mbps. This is just below the FCC’s definition of broadband and according to the FCC’s numbers, only around 20% of homes currently have enough broadband speed to watch 4K video. Google just recently announced that they have developed some coding schemes that might reduce the required size of a 4K transmission by 40% to 50%, but even with that reduction 4K video is going to put a lot of strain on ISPs and broadband networks, particularly if homes want to watch more than one 4K video at a time.

I recently read that 15% of the TVs sold in 2015 were capable of 4K, and that percentage is growing rapidly. Lagging behind this, however, are 4K-capable settop boxes; anybody who wants to get 4K from their cable provider will need a new box. Most of the large cable providers now offer these boxes, but often at the cost of another monthly fee.

Interestingly, there is a lot of 4K video content on the web, much of it filmed by amateurs and available on sites like YouTube or Vimeo. But there is a quickly increasing array of for-pay content. For instance, most of the Netflix original content is available in 4K. Amazon Prime also has Breaking Bad and other original content in 4K. It’s been reported that the next soccer World Cup will be filmed in 4K. There are a number of movies now being shot in 4K as well as a library of existing IMAX films which fit well into this format. Samsung has even lined up a few movies and series in 4K which are only available to people with Samsung 4K TVs.

One thing is for sure, it looks like 4K is here to stay. More and more content is being recorded in the format and one has to imagine that over the next few years 4K is going to become as common as HD video is today. And along with the growth of 4K demand will come demand for better bandwidth.

Comcast Trying Data Caps Again

Yet again Comcast is trying to introduce data caps. They have introduced what they are calling ‘data usage trials’ in Miami and the Florida Keys. For some reason most of their past trials have also been in the southeast. The new plan gives customers a monthly data cap of 300 gigabytes of downloaded data. After you hit that cap, every additional 50 gigabytes costs $10. For $30 extra you can get unlimited downloads.

When Comcast tried caps a few years ago they used a monthly cap of 250 gigabytes. Since the average household has been doubling the amount of data used every three years, the new cap is stingier than the old one – households will have nearly doubled their usage since the last time Comcast tried this. This means the 300 GB cap is going to affect a lot more people than the old cap did.
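
The arithmetic behind that claim, as a quick sketch; the three-year gap between trials is my assumption:

```python
# If usage doubles every three years, what would the old 250 GB cap
# translate to today?
OLD_CAP_GB = 250
YEARS_BETWEEN_TRIALS = 3         # assumed elapsed time between the two trials
DOUBLING_PERIOD_YEARS = 3        # the growth rate cited above

growth = 2 ** (YEARS_BETWEEN_TRIALS / DOUBLING_PERIOD_YEARS)
print(f"Usage growth: {growth:.1f}x")
print(f"Equivalent of the old cap today: {OLD_CAP_GB * growth:.0f} GB")  # 500 GB
```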

What is probably most annoying is that Comcast is refusing to call these data caps this time. Instead they are calling this a ‘data usage trial’ and are trying hard to compare themselves to the plans sold by the cellphone companies. Of course, everybody in the world understands those cellular plans to be data caps.

It’s not hard to understand why Comcast wants to do this. While broadband subscriptions continue to grow, with the overall US market at an 83% broadband penetration there is not a lot of future upside in broadband sales. Further, I know that Comcast is eyeing the financial performance of the cellphone companies with envy since they can see the significant revenues generated by AT&T and Verizon with their data caps.

But Comcast also must understand that customers are absolutely going to hate these caps. Households are watching online video more and more and it is that usage that is driving the vast majority of downloads. There are other households that have big usage due to gaming, and some households that still engage in file-sharing, even though that is often illegal and riskier than it used to be.

The last time Comcast did this they saw a massive customer revolt and I certainly expect that to happen again. Take my case. I estimate that we probably use at least 500 GB per month, so for me this basically means a $30 increase in my data rate. They have already pushed me to the edge of tolerance by forcing me to buy a basic TV package that I don’t use in order to get a 50 Mbps cable modem. If they introduce this cap they would push me over $100 per month just to get a broadband connection. At that point I start taking a very serious look at CenturyLink, the other provider in my neighborhood.

The biggest problem with any data cap is that, no matter where the cap is set, over time more and more customers are going to climb over it. We are just now starting to see the first proliferation of 4K video, and at download requirements of 18 – 22 Mbps this will blow past the data cap in no time.
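
Here’s the arithmetic; the stream rate comes from the range above:

```python
# Hours of 4K viewing needed to exhaust a 300 GB cap.
CAP_GB = 300
STREAM_MBPS = 20                 # middle of the 18 - 22 Mbps range

gb_per_hour = STREAM_MBPS / 8 * 3600 / 1000   # Mbps -> GB per hour
print(f"4K at {STREAM_MBPS} Mbps ≈ {gb_per_hour:.0f} GB/hour")   # 9 GB/hour
print(f"Cap reached after ~{CAP_GB / gb_per_hour:.0f} hours")    # ~33 hours
```

At roughly 9 GB per hour, a single 4K stream hits the cap after about 33 hours – barely more than an hour of viewing a day.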

What is most ridiculous about data caps either for cellular or landline data is that the US already has the most expensive Internet access of all of the developed countries. ISPs are already reaming us with ridiculously expensive broadband access and are now scheming for ways to make us pay more. The margins on US broadband are astronomical, in the 90% plus profit margin range. So data caps at a company like Comcast are purely greed driven, nothing else. There are zero network or performance issues that could justify penalizing customers who actually use the data they are paying for.

I am not entirely against data caps. For example, I have one client that has a 1 terabyte cap on their basic data product and a 2 terabyte cap on their fastest product. They don’t use these caps to jack up customer prices, but instead use them as an opportunity to discuss usage with customers. For instance, they might convince somebody who is constantly over the 1 terabyte cap to upgrade to a product with a higher cap. But mostly they use the caps as a way to force themselves to monitor customers. Their monitoring found a few customers who went over the cap because they were operating some kind of commercial retail server out of their home. Their terms of service prohibit operating a business service over a residential product, and they upgraded those customers to a business product, which has no data cap.

If you want to get really annoyed, look at this Comcast blog which explains the new ‘data usage trials.’ It is frankly one of the worst cases of corporate doublespeak that I have read in a long time. You have to feel a bit sorry for the corporate communications people who had to write this drivel, but the ones to hate are their corporate bosses who are determined to make us all pay more for using data.

 

Will 4K Video Make It?

It usually takes a while to see if a new technology gets traction with the public. For example, the 3D television craze of a few years ago fell flat on its face with the viewing public. And now 4K ultra high definition (UHD) video is making enough waves to get its real-world test in the marketplace.

The high-end TV makers certainly are pushing the technology, and 2.1 million UHD televisions were shipped in the second quarter of 2014, up from 1.6 million sets for all of 2013. Amazon announced a deal with Samsung to roll out worldwide availability of 4K video streams to Samsung smart TVs. Amazon announced earlier this year that it is building a UHD library by filming all of the unique programming made for Amazon in UHD. Netflix has already been filming Breaking Bad and House of Cards in UHD. Fox is marketing a set of 40 movies in UHD that includes Star Trek: Into Darkness.

But there are some obstacles to overcome before UHD becomes mainstream. The cameras and associated hardware and storage needed to film in UHD are expensive, so filmmakers are being cautious about converting to the technology until they know there is a market for it. But the big obstacle for UHD being universally accepted is getting the content into homes. There are issues of both bandwidth and quality.

Pure uncompressed UHD video is amazing. I saw a UHD clip at a trade show of House of Cards running next to an HD clip, and the difference was astounding. But it is not practical to broadcast in uncompressed UHD, and the compression techniques in use today reduce the quality of the picture. The UHD being delivered by Netflix today is better than their HD quality, but not nearly as good as uncompressed UHD.

For those not familiar with compression: these are techniques that reduce the transmission size of video signals, which is necessary to make programming fit into channels on traditional cable systems. The same sorts of compression techniques are applied to video streams sent over the Internet by companies like Netflix and Amazon Prime. There are many different techniques used to compress video streams, but the one that saves the most bandwidth is called block-matching, which finds and then re-uses similarities between video frames.
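
For the curious, here is a toy illustration of the core idea behind block-matching: find where a block from the previous frame moved by minimizing the sum of absolute differences (SAD). Real codecs do this hierarchically and in hardware; this sketch only shows the principle:

```python
import numpy as np

def best_match(block, frame, origin=(8, 8), search=4):
    """Find the offset from origin where block best matches frame (lowest SAD)."""
    h, w = block.shape
    best_pos, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = origin[0] + dy, origin[1] + dx
            if y < 0 or x < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
                continue
            sad = np.abs(frame[y:y+h, x:x+w].astype(int) - block.astype(int)).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (dy, dx)
    return best_pos, best_sad

rng = np.random.default_rng(0)
prev_frame = rng.integers(0, 256, (24, 24), dtype=np.uint8)
block = prev_frame[8:16, 8:16]                         # 8x8 block at (8, 8)
new_frame = np.roll(prev_frame, (2, 1), axis=(0, 1))   # whole scene shifts by (2, 1)

motion, sad = best_match(block, new_frame)
print(f"Motion vector: {motion}, SAD: {sad}")          # (2, 1), SAD 0 - perfect match
```

A codec that finds such a match only has to transmit the motion vector and any small residual difference instead of the whole block, which is where the bandwidth savings come from.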

Bandwidth is another roadblock to UHD acceptance. Netflix reports that it requires a steady 15 Mbps download stream to bring UHD to a home, and a significant percentage of American homes don’t get enough bandwidth to view UHD. Even having enough bandwidth is no guarantee of a quality experience, as has been witnessed with Netflix’s recent fights with Comcast and Verizon over the quality of SD and HD video streams. It was reported that even some customers who subscribed to 100 Mbps download products were not getting good Netflix streams.

There are also the normal issues we see in the television industry due to a lack of standards. Each manufacturer is coming up with a different way to make UHD work. For example, there are two different HDMI standards already in use by different TV manufacturers, and there are predictions that HDMI might need to be abandoned altogether as the industry works to goose better quality out of UHD using higher frame rates and enhanced color resolution. All of this causes confusion for homeowners and the companies that install high-end TVs.

But there is some hope that new technologies and new compression techniques can improve the quality and decrease the digital footprint of UHD streams. As an example, Faroudja Enterprises, owned by Yves Faroudja, one of the pioneers of HD television standards, announced it has found some improvements that will greatly benefit UHD. The new technique pre-processes content before compression and post-processes it after decompression to get better efficiency in the sharing of bits between frames. Faroudja believes he can reliably cut the size of video streams in half using the new technology. The process would also bring efficiencies to HD streams, which is good news for an Internet that is getting bogged down by video.

Only time will tell if the technology is widely accepted. Certainly there is going to be demand from cinephiles who want the absolute best quality from the movies they watch. But we’ll have to see if that creates enough demand to convince more filmmakers to shoot in the UHD format. Like many new technologies, there is a bit of the cart before the horse in bringing this fully to market. But there are many in the industry who are predicting that the extra quality that comes from UHD will make it a lasting technology.