Categories
Technology

An Update on ATSC 3.0

This is the year when we’ll finally start seeing the introduction of ATSC 3.0, the newest upgrade to broadcast television and the first big upgrade since TV converted to all-digital over a decade ago. ATSC 3.0 is the latest standard released by the Advanced Television Systems Committee, the body that creates the standards used by over-the-air broadcasters.

ATSC 3.0 will bring several upgrades to broadcast television that should make it more competitive with cable company video and Internet-based programming. For example, the new standard will make it possible to broadcast over-the-air in 4K quality. That’s four times more pixels than 1080i TV and rivals the best quality available from Netflix and other online content providers.

ATSC 3.0 also will support the HDR (high dynamic range) protocol that enhances picture quality by creating a better contrast between light and dark parts of a TV screen. ATSC 3.0 also adds additional sound channels to allow for state-of-the-art surround sound.

Earlier this year, Cord Cutters News reported that the new standard was to be introduced in 61 US markets by the end of 2020 – however, that has slowed a bit due to the COVID-19 pandemic. But the new standard should appear in most major markets by sometime in 2021. Homes will either have to buy ATSC 3.0-enabled TVs, which are just now hitting the market, or buy an external ATSC 3.0 tuner to receive the enhanced signals.

One intriguing aspect of the new standard is that a separate data path is created alongside TV transmissions. This opens up some interesting new features for broadcast TV. For example, a city could selectively send safety alerts and messages to homes in just certain neighborhoods. This also could lead to targeted advertising that is not the same in every part of a market. Local advertisers have often hesitated to advertise on broadcast TV because of the cost and waste of advertising to an entire market instead of just the parts where they sell service.

While still in the early stages of exploration, it’s conceivable that ATSC 3.0 could be used to create a 25 Mbps data transmission path. This might require several stations joining together to create that much bandwidth. While a 25 Mbps data path is no longer a serious competitor to much faster cable broadband, it opens up a lot of interesting possibilities. For example, this bandwidth could offer a competitive alternative for providing data to cellphones and could present a major challenge to cellular carriers and their stingy data caps.

ATSC 3.0 data could also be used to bring broadband into the home of every urban school student. If this broadband was paired with computers for every student, this could go a long way towards solving the homework gap in urban areas. Unfortunately, like most other new technologies, we’re not likely to see the technology in rural markets any time soon, and perhaps never. The broadband signals from tall TV towers will not carry far into rural America.

The FCC voted on June 16 on a few issues related to the ATSC 3.0 standard. In a blow to broadcasters, the FCC decided that TV stations could not use close-by vacant channels to expand ATSC 3.0 capabilities. The FCC instead decided to maintain vacant broadcast channels to be used for white space wireless broadband technology.

The FCC also took a position that isn’t going to sit well with the public. As homeowners have continued to cut the cord, there have been record sales in the last few years of indoor antennas for receiving over-the-air TV. Over-the-air broadcasters are going to be allowed to sunset the older ATSC 1.0 standard in 2023. That means that homes will have to replace TVs or install an external ATSC 3.0 tuner if they want to continue to watch over-the-air broadcasts.

Categories
The Industry

How We Use More Bandwidth

We’ve known for decades that the demand for broadband has been doubling roughly every three years since 1980. At any point along that growth curve there have been those who look at the statistics and think we are nearing the end of the growth curve. It’s hard for a lot of people to accept that bandwidth demand keeps growing on that steep curve.

But the growth is continuing. The company OpenVault measures broadband usage for big ISPs and they recently reported that the average monthly data use for households grew from 201.6 gigabytes in 2017 to 268.7 gigabytes in 2018 – a growth rate of 33%. What is astounding is the magnitude of growth, with an increase of 67.1 gigabytes in just a year. You don’t have to go back very many years to find a time when that number couldn’t have been imagined.
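To put that growth curve in perspective, here’s a quick back-of-the-envelope sketch of what doubling roughly every three years implies, using the OpenVault 2018 average as the starting point. The later numbers are just the curve extended forward, not a forecast.

```python
# Rough projection of average household usage if demand keeps doubling roughly
# every three years. The 2018 starting point is the OpenVault average cited
# above; later numbers are simply the curve extended, not a forecast.

def project_usage(start_gb, years, doubling_period=3.0):
    """Usage after `years`, assuming a doubling every `doubling_period` years."""
    return start_gb * 2 ** (years / doubling_period)

base_2018 = 268.7  # GB per month (OpenVault, 2018)
for year in (2021, 2024, 2027, 2030):
    print(f"{year}: ~{project_usage(base_2018, year - 2018):,.0f} GB/month")
```

On that curve the average home crosses a terabyte per month within about six years.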

That kind of growth means that households are finding applications that use more bandwidth. Just in the last few weeks I saw several announcements that highlight how bandwidth consumption keeps growing. I wrote a blog last week describing how Google and Microsoft are migrating gaming to the cloud. Interactive gaming already uses a significant amount of bandwidth, but that usage is going to explode upwards when the machine operating the game is in a data center rather than on a local computer or game console. Google says most of its games will operate using 4K video, meaning a download speed of at least 25 Mbps for one stream plus an hourly download usage of 7.2 GB.
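For anyone who wants to check figures like these, converting a sustained bitrate into hourly data usage is simple arithmetic. Here’s a minimal sketch; note that the 25 Mbps above is the recommended connection speed, while the 7.2 GB per hour implies the stream itself averages roughly 16 Mbps.

```python
def gb_per_hour(mbps):
    """Convert a sustained bitrate in megabits per second to gigabytes per hour."""
    return mbps * 3600 / 8 / 1000  # seconds per hour, bits to bytes, MB to GB

print(gb_per_hour(5))   # ~2.3 GB/hour - a typical HD stream
print(gb_per_hour(16))  # 7.2 GB/hour - matches the 4K usage figure above
print(gb_per_hour(25))  # ~11.3 GB/hour - if a stream actually sustained 25 Mbps
```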

I also saw an announcement from Apple that users of the Apple TV box can now use it with PlayStation Vue to watch up to four separate video streams simultaneously. That’s intended for the serious sports fan, and there are plenty of households that would love to keep track of four sporting events at the same time. If the four separate video streams are broadcast in HD that would mean downloading 12 GB per hour. If the broadcasts are in 4K that would be an astounding 29 GB per hour.

The announcement that really caught my eye is that Samsung is now selling an 8K video-capable TV. It takes a screen of over 80 inches for the human eye to perceive any benefit from 8K video. There are no immediate plans for anybody to broadcast in 8K, but the same was true when the first 4K TVs were sold. When people buy these TVs, somebody is going to film and stream content in the format. I’m sure that 8K video will have some improved compression techniques, but without a new compression scheme, an 8K video stream is 16 times larger than an HD stream – meaning a theoretical download of 48 GB per hour.
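The 16x figure comes straight from the pixel counts. Here’s a rough sketch of the math, assuming bitrate scales in proportion to raw pixels with no new compression scheme – the same simplification made above.

```python
# Scale an HD stream's hourly usage by raw pixel count, assuming no improvement
# in compression (the same simplification made above). The 3 GB/hour HD figure
# is the rough number used elsewhere in this post.
resolutions = {"1080p HD": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
hd_gb_per_hour = 3.0
hd_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    scale = (w * h) / hd_pixels
    print(f"{name}: {scale:.0f}x the pixels, ~{scale * hd_gb_per_hour:.0f} GB/hour")
# Real 4K codecs already beat this simple scaling (hence the 7.2 GB/hour figure
# earlier), which is exactly why better compression matters so much for 8K.
```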

Even without these new gadgets and toys, video usage is certainly the primary driver of the growth of household broadband. In 2014 only 1% of homes had a 4K-capable TV – the industry projects that to go over 50% by the end of this year. As recently as two years ago you had to search to find 4K programming. Today almost all original programming from Netflix, Amazon, and others is shot in 4K, and the web services automatically feed 4K speeds to any customer connection able to accept it. User-generated 4K video, often non-compressed, is all over YouTube. There are now 4K security cameras on the market, just when HD cameras have completely replaced older analog cameras.

Broadband usage is growing in other ways. Cisco projects machine-to-machine connections will represent 51% of all online connections by 2022, up from 40% today. Parks and Associates just reported that the average broadband home now has ten connected devices, and those devices all make internet connections on their own. Our computers and cellphones automatically update software over our broadband connections. Many of us set our devices to automatically back up our hard drives, pictures, and videos in the cloud. Smart home devices constantly report back to the alarm monitoring service. None of these connections sound large, but in aggregate they really add up.

And sadly, we’re also growing more inefficient. As households pull down multiple simultaneous streams of music, video, and file downloads we overload our WiFi connection and/or our broadband connection and thus trigger significant retransmission of missing or incomplete packets. I’ve seen estimates that this overhead can easily average 20% of the bandwidth used when households try to do multiple things at the same time.

I also know that when we look up a few years from now and see that broadband usage is still growing, there will be a new list of reasons for the growth. It may seem obvious, but when handed enough bandwidth, households find a way to use it.

Categories
The Industry

Gaming Migrates to the Cloud

We are about to see a new surge in demand for broadband as major players in the game industry have decided to move gaming to the cloud. At the recent Game Developers Conference in San Francisco both Google and Microsoft announced major new cloud-based gaming initiatives.

Google announced Stadia, a platform that they tout as being able to play games from anywhere with a broadband connection on any device. During the announcement they showed a live streaming game being transferred from desktop to laptop to cellphone. Microsoft announced the new xCloud platform that lets Xbox gamers play a game from any connected device. Sony has been promoting online play between PlayStation gamers for many years and now also offers some cloud gaming on the PlayStation Now platform.

OnLive tried this in 2011, offering a platform that was played in the cloud using OnLive controllers, but without needing a computer. The company failed due to the quality of broadband connections in 2011, but also due to limitations at the gaming data centers. Both Google and Microsoft now operate regional data centers around the country that house state-of-the-art whitebox routers and switches that are capable of handling large volumes of simultaneous gaming sessions. As those companies have moved large commercial users to the cloud they created the capability to also handle gaming.

The gaming world was ripe for this innovation. Current gaming ties gamers to gaming consoles or expensive gaming computers. Cloud gaming brings mobility to gamers, but also eliminates the need to buy expensive gaming consoles. This move to the cloud probably signals the beginning of the end for the Xbox, PlayStation, and Nintendo consoles.

Google says it will support some games at the equivalent of an HD video stream, at 1080p and 60 frames per second. That equates to about 3 GB downloaded per hour. But most of the Google platform is going to operate at 4K video speeds, requiring download speeds of at least 25 Mbps per gaming stream and using 7.2 GB of data per hour. Nvidia has been telling gamers that they need 50 Mbps per 4K gaming connection.

This shift has huge implications for broadband networks. First, streaming causes the most stress on local broadband networks since the usage is continuous over long periods of time. A lot of ISP networks are going to start showing data bottlenecks when significant numbers of additional users stream 4K connections for hours on end. Until ISPs react to this shift, we might return to those times when broadband networks bogged down in prime time.

This is also going to increase the need for download and upload speeds. Households won’t be happy with a connection that can’t stream 4K, so they aren’t going to be satisfied with the 25 Mbps connection that the FCC says is broadband. I have a friend with two teenage sons who both run two simultaneous game streams while watching a streaming gaming TV site. It’s good that he is able to buy a gigabit connection on Verizon FiOS, because his sons alone are using a continuous broadband connection of at least 110 Mbps, and probably more.
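Here’s the rough tally behind that 110 Mbps estimate. The per-stream speeds come from figures used in this post; the bitrate for the streaming gaming TV site is an assumption.

```python
# Rough tally of the gaming household described above. The per-stream speeds
# come from figures used in this post; the TV-site stream is an assumption.
streams = [
    ("4K cloud gaming stream", 4, 25),    # two sons x two streams each, 25 Mbps apiece
    ("streaming gaming TV site", 1, 10),  # assumed ~10 Mbps video feed
]
total = sum(count * mbps for _, count, mbps in streams)
print(f"Concurrent demand: ~{total} Mbps")  # ~110 Mbps, before any WiFi overhead
```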

We are also going to see more people looking at the latency on networks. The conventional wisdom is that a gamer with the fastest connection has an edge. Gamers value fiber over cable modems and value cable modems over DSL.

This also is going to bring new discussion to the topic of data caps. Gaming industry statistics say that the average serious gamer averages 16 hours per week of gaming. Obviously, many play longer than the average. My friend with the two teenagers is probably looking at at least 30 GB per hour of broadband download usage plus a decent chunk of upload usage. Luckily for my friend, Verizon FiOS has no data cap. Many other big ISPs like Comcast start charging for data usage over one terabyte per month – a number that won’t be hard to reach for a household with gamers.
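A quick sketch shows how fast 4K cloud gaming runs into a one-terabyte cap, using the 7.2 GB per hour figure from above. The weekly hours are assumptions for illustration.

```python
# How quickly 4K cloud gaming eats a 1 TB monthly cap, using the 7.2 GB/hour
# figure from above. The weekly hours are illustrative assumptions.
cap_gb = 1000
gb_per_hour_4k = 7.2
weeks_per_month = 4.33

for hours_per_week in (16, 30, 60):  # average serious gamer, heavier, multi-gamer home
    monthly_gb = gb_per_hour_4k * hours_per_week * weeks_per_month
    print(f"{hours_per_week} hrs/week: ~{monthly_gb:,.0f} GB/month "
          f"({monthly_gb / cap_gb:.0%} of the cap)")
```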

I think this also opens up the possibility for ISPs to sell gamer-only connections. These connections could be routed straight to peering arrangements with Google or Microsoft to guarantee the fastest connection through their network and wouldn’t mix gaming streams with other household broadband streams. Many gamers will pay extra to have a speed edge.

This is just another example of how the world finds ways to use broadband when it’s available. We’ve obviously reached a time when online gaming can be supported. When OnLive tried this there were not enough households with fast enough connections, there weren’t fast enough regional data centers, and there wasn’t a peering network in place where ISPs connect directly to big data companies like Google and bypass the open Internet.

The gaming industry is going to keep demanding faster broadband and I doubt they’ll be satisfied until we have a holodeck in every gamer’s home. But numerous other industries are finding ways to use our increasing household broadband capacity and the overall demand keeps growing at a torrid pace.

 

Categories
What Customers Want

The Terabyte Household

I was just in a meeting the other day with a bunch of ISPs who were talking about household downloads. Several said that they were now seeing monthly data usage exceed a terabyte, and those with Comcast were lamenting that this is costing them a lot of money.

I wrote a lot about Comcast data caps a few years ago when the company experimented with really low data caps of 300 gigabytes per month. At that time a lot of households complained that they were exceeding those caps. Comcast was arguing at the time to end net neutrality and I think this persuaded them to back off of the low caps, which they set to 1 terabyte.

Here we are only a few years later and a lot of households are bumping up against and exceeding that data cap. Comcast absolutely knew this was coming and they just pushed the ability to monetize data caps a few years into the future. As an ISP the company knows better than most that the household demand for total downloaded data has been doubling every three years or so. That kind of growth will push a huge number of households over a terabyte within a decade – with many already hitting it now.

Comcast tries to justify data caps by arguing fairness – the same argument they made a few years ago. They say that those that use the Internet the most ought to pay the most. Even if you buy that argument, the penalties for exceeding the data cap are excessive. Comcast doesn’t charge a household for the first two months they exceed a terabyte. After that they have two plans. They will automatically bill $10 for every extra 50 gigabytes over the data cap – with total excess charges capped at $200 per month. Customers who expect to exceed the data cap can also agree to pay $50 extra every month to get unlimited usage.
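Here’s what those rules work out to at a few usage levels – a simple sketch of the pricing described above, not Comcast’s actual billing logic.

```python
import math

def monthly_overage(usage_gb, cap_gb=1000, block_gb=50, block_price=10, max_charge=200):
    """Sketch of the overage rules described above: $10 per 50 GB block over the
    cap, with total excess charges capped at $200 per month."""
    blocks = math.ceil(max(0, usage_gb - cap_gb) / block_gb)
    return min(blocks * block_price, max_charge)

for usage in (1100, 1400, 2500):
    print(f"{usage} GB: ${monthly_overage(usage)} in overage vs. the $50 unlimited add-on")
```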

Comcast goes on to explain away the terabyte cap by describing what it takes to exceed the cap, as follows:

  • Stream between 600 and 700 hours of HD video
  • Play online games for more than 12,000 hours
  • Stream more than 15,000 hours of music
  • Upload or download more than 60,000 hi-res photos

This explanation is simplistic for a number of reasons. First, full Netflix HD broadcast at 1080p streams at over 7 Mbps and uses roughly 2.5 GB per hour, meaning a terabyte will cover about 400 hours of full HD video. If you have a good broadband connection the chances are that you are watching a lot of 4K video today – it’s something that Netflix and Amazon Prime offer automatically. It takes only about 180 hours of 4K video in a month to hit the terabyte data cap – a number that is not hard to imagine in a cord-cutting home. The list also misses obvious large uses like downloading games – with download sizes over 40 GB for one game becoming common.

The Comcast list also fails to recognize the hidden ways that we all burn through bandwidth today. It’s not unusual for the average household to have a 30% to 40% overhead on Internet usage. That comes from the network having to transmit data multiple times to complete a download request. This overhead has a number of causes. First are inefficiencies inherent in the open Internet. There are always packets lost in transit that must be sent multiple times. There are also delays caused by the ISP network, particularly networks that are undersized in neighborhoods and that hit capacity during the busy hours. The biggest cause of delays for most of us is the in-home WiFi network, which creates a lot of collisions from competing signals.

There is also a lot of background use of the Internet today that surprises people. We now routinely use web storage to back up files. All of the software on our machines upgrades automatically. Many now use applications like video cameras and home alarms that connect to the cloud and ping back and forth all day. All sorts of other things go on in the background that are a mystery – I’ve noticed my house has significant broadband usage even when we aren’t home. I’ve estimated that this background communication probably eats about 150 gigabytes per month at my house.

When I consider those issues the Comcast terabyte data caps are stingy. A household with a lot of network noise and with a lot of background traffic might hit the data caps using only half of a terabyte of downloaded video or other services like those listed by Comcast. A home today might hit the cap with 200 hours of full HD streaming or 90 hours of 4K streaming.
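Here’s the arithmetic behind those numbers – a sketch that backs out an assumed 150 GB of background traffic and a 35% overhead (the midpoint of the 30% to 40% range above) before counting streaming hours. It lands in the same ballpark as the figures above.

```python
# Back out background traffic and overhead from a 1 TB cap before counting
# streaming hours. The 150 GB and 35% figures are the rough estimates above.
cap_gb = 1000
background_gb = 150   # always-on background usage per month
overhead = 0.35       # midpoint of the 30% to 40% overhead range

usable_gb = (cap_gb - background_gb) / (1 + overhead)
print(f"Effective streaming budget: ~{usable_gb:.0f} GB/month")
print(f"~{usable_gb / 2.5:.0f} hours of full HD at 2.5 GB/hour")
print(f"~{usable_gb / 7.2:.0f} hours of 4K at 7.2 GB/hour")
```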

The other amazing aspect of the terabyte data caps is the charge for using more than a terabyte in a month. As mentioned above, Comcast charges $10 for every extra 50 GB. I’ve done the math for dozens of ISPs and most of my clients spend between $2 and $4 per month on average for bandwidth per broadband customer. That number includes not only residential users, but for most ISPs also includes some huge commercial broadband customers. The average price varies the most according to how far an ISP is from the Internet, and that component of the cost is fixed and doesn’t increase with higher data volumes. After backing out this fixed transport cost, my math says that an extra 50 GB of broadband costs an ISP only a few pennies. For a large ISP like Comcast that cost is significantly lower since they peer with the big broadband companies like Netflix, Google and Amazon – and traffic exchanged in those arrangements has nearly zero incremental cost for extra bandwidth.

Finally, the Comcast website claims that less than 1% of their users exceed the terabyte data caps. Only they know the numbers, but I find that hard to believe. When you look at the amount of usage needed to hit that cap there has to be a lot of cord-cutter households already exceeding a terabyte.

The bottom line is that Comcast is extorting homes when they force them to spend $50 per month for unlimited data usage. That extra bandwidth costs them almost nothing. Unfortunately, there isn’t a damned thing any of us can do about this since Comcast and the other big ISPs got their wish and broadband is no longer regulated by the FCC.

Categories
Regulation - What is it Good For?

Setting the FCC Definition of Broadband

In the recently released 2018 Broadband Progress Report the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record for either eliminating the standard altogether or else reverting back to the older definition of 10/1 Mbps.

I’m guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted to allow cellular data to count the same for a household as landline broadband – and that desire was a big factor in the push to lower the definition, since cellphones rarely meet the 25/3 speed standard.

The deliberation on the topic this year raises the question of whether there is some way to create a rule that would better define the speed of needed broadband. It’s worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to want to use broadband. In doing so, they added together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC’s method was too simple and used the assumption that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four, which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that’s not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP and your home router try to receive and untangle multiple simultaneous streams there are collisions, and packets that get lost have to be retransmitted. This is described as adding ‘overhead’ to the transmission process. Depending on the nature of the data streams the amount of collision overhead can be significant.
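A toy example makes the point. If each additional simultaneous stream adds even a small amount of retransmission and contention overhead, the needed bandwidth grows faster than simple addition suggests. The 5% per extra stream used here is purely illustrative, not a measured figure.

```python
# Toy illustration: if every extra simultaneous stream adds a little
# retransmission and contention overhead, total demand grows faster than
# simple addition suggests. The 5% per extra stream is purely illustrative.
def effective_demand(stream_mbps, overhead_per_extra_stream=0.05):
    overhead = 1 + overhead_per_extra_stream * max(0, len(stream_mbps) - 1)
    return sum(stream_mbps) * overhead

print(effective_demand([5]))        # 5.0 Mbps - one HD Netflix stream
print(effective_demand([5, 5, 5]))  # 16.5 Mbps - three streams, not just 15
```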

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I’ve covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your home; it is also slowed when it pauses to recognize demands for connection from your neighbor’s WiFi network.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to download 25 Mbps of usage from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.

The FCC’s definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it’s available. That means a stream of 15-20 Mbps download. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

The FCC’s speed definition finally needs to consider the busy hour of the day – the time when a household uses the most broadband. That’s the broadband speed that the home needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that would somehow measure actual download speeds at a number of homes over time to understand what homes are really using for bandwidth. There are ways that this could be done. For example, the FCC could do something similar for broadband to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage, such as Akamai, to sample a large number of US homes. There could be a panel of volunteer homes meeting specific demographics that would allow monitoring of their bandwidth usage. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC’s definition of broadband. Rather than changing the official speed periodically, the FCC could change the definition as needed as dictated by the real-world data.
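As a sketch of what that measurement could look like: collect hourly usage samples from a panel of volunteer homes and report a busy-hour figure, such as the 95th percentile of each home’s hourly demand. The data below is randomly generated purely for illustration.

```python
# Sketch of the measurement idea: sample hourly usage from volunteer homes and
# report a busy-hour figure (here, the 95th percentile of each home's hourly
# demand). The data is randomly generated purely for illustration.
import random
import statistics

random.seed(1)
homes = [[random.uniform(2, 60) for _ in range(24 * 30)] for _ in range(100)]

def busy_hour_mbps(hourly_samples, percentile=95):
    ordered = sorted(hourly_samples)
    return ordered[int(len(ordered) * percentile / 100) - 1]

busy = [busy_hour_mbps(h) for h in homes]
print(f"Median busy-hour demand across sampled homes: ~{statistics.median(busy):.0f} Mbps")
```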

The FCC does some spot checking today of the broadband speeds reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn’t provide the same kind of feedback that a formal ongoing measuring program would show. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors used in setting the official definition of broadband, and so perhaps the FCC doesn’t want real facts to get in the way.

Categories
Technology The Industry

Comparing Streaming and Broadcast Video

One thing that doesn’t get talked about a lot in the battle between broadcast TV and on-line video is video quality. For the most part today broadcast TV still holds the edge over on-line video.

When I think of broadcast TV over a cable system I can’t help but remember back twenty years ago when the majority of the channels on a cable system were analog. I remember when certain channels were snowy, when images were doubled with ghosts, and when the first couple of channels in the cable system were nearly unwatchable. Today the vast majority of channels on most cable systems are digital, but there are still exceptions. The conversion to digital resulted in a big improvement in transmission quality.

When cable systems introduced HDTV the quality got even better. I can remember flipping back and forth between the HD and SD versions of the same channel on my Comcast system just to see the huge difference.

This is not to say that cable systems have eliminated quality issues. It’s still common on many cable systems to see pixelation, especially during high-action scenes where the background is constantly changing. All cable systems are not the same, so there are differences in quality from one city to the next. All digital video on cable systems is compressed at the headend and decompressed at the settop box. That process robs a significant amount of quality from a transmission, and one only has to compare any cable movie to one from a Blu-ray to realize how much is lost in the translation.

In the on-line world buffered video can be as good as cable system video. But on-line video distributors tend to compress video even more than cable systems – something they largely can get away with since a lot of on-line video is watched on smaller screens. And this means that a side-by-side comparison of SD or HD movies would usually favor the cable system. But Netflix, Amazon and a few others have one advantage today with the spectacular quality of their 4K videos – there is nothing comparable on cable networks.

But on-line live-streamed video still has significant issues. I watch sports on-line and the quality is often poor. The major problem with live-streamed video is mostly due to delays in the signal getting to the user. Some of that delay is due to latency – either latency in the backbone network between the video creator and the ISP or latency in the connection between the ISP and the end-user. Unlike downloading a data file, where your computer will wait until it has collected all of the needed packets, live-streamed video is sent to end-users with whatever pixels have arrived by the needed time. This creates all sorts of interesting issues when watching live sports. For instance, there is pixelation, but it doesn’t look like the pixelation you see on a cable network. Instead, parts of the screen often get fuzzy when they aren’t receiving all the pixels. There are other video artifacts as well. And it’s still not uncommon for the entire picture to freeze for a while, which can cause an agonizing gap when you are watching sports since it always seems to happen at a critical time.

Netflix and Amazon have been working with the Internet backbone providers and the ISPs to fix some of these issues. Latency delays in getting to the ISPs are shrinking and, at least for the major ISPs, will probably not be an issue. But the one issue that still needs to be resolved is the crashes that happen when demand is too high and the Internet gets overloaded. We’re seeing ISPs bogging down when showing a popular stream like the NBA finals, compared to a normal NBA game that might only be watched by a hundred thousand viewers nationwide.

One thing in the cable systems’ favor is that their quality ought to improve a lot over the next few years. The big cable providers will be implementing the new ATSC 3.0 video standard that is going to result in a significant improvement in picture quality on HD video streams. The FCC approved the new standard earlier this year and we ought to see it implemented in systems starting in 2018. This new standard will allow cable operators to improve the color clarity and contrast on existing HD video. I’ve seen a demo of a lab version of the standard and the difference is pretty dramatic.

One thing we don’t know, of course, is how much picture quality means to the average video user. I know my teenage daughter seems quite happy watching low-quality video made by other teens on Snapchat, YouTube or Facebook Live. Many people, particularly teens, don’t seem to mind watching video on a smartphone. Video quality makes a difference to many people, but time will tell if improved video quality will stem the tide of cord cutting. It seems that most cord cutters are leaving due to the cost of traditional TV as well as the hassle of working with the cable companies and better video might not be a big enough draw to keep them paying the monthly cable bill.

Categories
Technology

Are You Ready for 4K Video?

The newest worry for ISPs is the expansion of 4K video. Already today Netflix and Amazon are offering on-line 4K video to customers. Almost all of the new programming being created by both companies is being shot in 4K.

Why is this a concern for ISPs? Netflix says that in order to enjoy a streaming 4K signal a user ought to have a spare 15 – 20 Mbps of bandwidth available if streaming with buffering. The key word is spare, meaning that any other household activity ought to be using other bandwidth. Netflix says that without buffering a user ought to have a spare 25 Mbps.

When we start seeing a significant number of users stream video at those speeds even fiber networks might begin experiencing problems. I’ve never seen a network that doesn’t have at least a few bottlenecks, which often are not apparent until traffic volumes are high. Already today busy-hour video is causing stress to a lot of networks. I think about millions of homes trying to watch the Super Bowl in 4K and shudder to think what that will mean for most networks.

While 4K video is already on-line it is not yet being offered by cable companies. The problem for most of the industry is that there is no clear migration path between today and tomorrow’s best video signal. There are alternatives to 4K being explored by the industry that muddy the picture. Probably the most significant new technology is HDR (high-dynamic range) video. HDR has been around for a few years, but the newest version which captures video in 10-bit samples adds both contrast and color accuracy to TVs. There are other video improvements also being explored such as 10-bit HEVC (high-efficiency video coding) which is expected to replace today’s H.264 standard.

The uncertainty of the best technology migration path has stopped cable companies from making upgrades to HDR or 4K. They are rightfully afraid to invest too much in any one version of the early implementations of the technology, only to face more upgrades in just a few years. But as the popularity of 4K video increases, the pressure is growing for cable companies to introduce something soon. It’s been reported that Comcast’s latest settop box is 4K capable, although the company is not making any public noise about it.

But as we’ve seen in the past, once customers start buying 4K capable TVs they are going to want to use them. It’s expected by 2020 that almost every new TV will include some version of HDR technology, which means that the quality of watching today’s 1080 pixel video streams will improve. And by then a significant number of TVs will come standard with 4K capabilities as well.

I remember back when HD television was introduced. I have one friend who is a TV buff and once he was able to get HD channels from Comcast he found that he was unable to watch anything that was broadcast in standard definition. He stopped watching any channel that did not broadcast HD and ignored a huge chunk of his Comcast line-up.

The improvements of going to 4K and/or true HDR will be equally dramatic. The improvement in clarity and color is astonishing as long as you have a TV screen large enough to see the difference. And this means that as people grow to like 4K quality they will migrate towards 4K content.

One thing that is clear is that 4K video will force cable companies to broadcast video over the IP stream. A single 4K signal eats up an entire 6 MHz channel on a cable system making it impossible for any cable system to broadcast more than a tiny number of 4K channels in the traditional way. And, like Comcast is obviously preparing to do, it also means all new settop boxes and a slew of new electronics at the cable headend to broadcast IPTV.

Of course, like any technology improvement we’ve seen lately, the improvements in video quality don’t stop with 4K. The Japanese plan to broadcast the 2020 Olympics in 8K video. That requires four times as much bandwidth as 4K video – meaning an 80 – 100 Mbps spare IP path. I’m sure that ways will be found to compress the transmission, but it’s still going to require a larger broadband pipe than what most homes buy today. It’s expected that by 2020 there will be only a handful of viewers in Japan and South Korea ready to view 8K video, but like anything dramatically new, the demand is sure to increase in the following decade.

Categories
Current News Technology

Do We Really Want to be Watched?

I noticed an article that mentioned that the Google free WiFi hotspots in New York City are equipped with cameras and are able to be used for surveillance. Google says that this function has not been activated, but it got me to thinking about public video surveillance in general.

There has been a surge in recent years in the installation of public surveillance cameras, fostered in part by the fiber networks that are being built everywhere. The sale of outdoor surveillance equipment is growing at about 7% per year. And the quality of that equipment is rapidly improving. New surveillance cameras no longer produce the grainy pictures we all think of as synonymous with security footage but are now using high definition and even 4K video technologies to drastically improve the quality of the images. Fiber bandwidth is allowing for higher frame rates and fewer gaps in the sequence of images.

The city of London led the way over a decade ago and installed cameras to saturate certain downtown neighborhoods in the city. After having had these in place for a long time the statistics show that the cameras haven’t really changed the crime rate in the watched neighborhoods. While it did change the nature of the crimes in the areas somewhat, the overall crime rate is close to what it was before the cameras.

Probably the biggest public fear about surveillance is that public cameras will be used to track where we go in public. I know I’m not nuts about the idea of a store knowing who I am as soon as I walk through the door, and I’m even more skeptical of having the government track me as I walk down city streets.

That aspect of surveillance is going to require better facial recognition technology. Currently, Facebook facial recognition is said to be able to identify people 98% of the time. Facebook is able to get such good results by limiting its search to friends and friends-of-friends of the person that posts a picture. Facebook also benefits from the pictures of people from different angles and different lighting which lets it build better profiles. The FBI’s software is said to be 85% accurate if they can limit a search to no more than 50 people for the software to consider.

There is no facial recognition software yet that is very good at identifying random people on a public street. However, everybody expects that software to be here in less than a decade through assistance from Artificial Intelligence.

Public surveillance cameras open up a number of ethical issues. The first is that it’s too tempting for law enforcement insiders to misuse surveillance information. Back in 1997 a high-ranking police officer in DC was convicted of using surveillance cameras near a gay bar for identifying patrons through license plates and then blackmailing them. The Detroit Free Press reported on cases of policemen using surveillance systems to help friends, stalk estranged spouses, or harass those with whom they had a traffic altercation.

Terrorism experts say that public surveillance cameras not only don’t deter terrorist attacks, but might instead invite them by producing images of a terrorist attack.

There are also arguments that video surveillance constitutes a fourth amendment violation through unreasonable searches. The concern comes not just from having government cameras identify you on the street, but from using that data over time to create a profile of where and when you go out, who you see, and what you do in public.

I know that a lot of US cities are considering putting in a lot more surveillance cameras as part of smart city initiatives. Tampa, near me, has already begun the installation of an extensive outdoor camera network. I’m sure the city officials that do this have what they think are good reasons for watching their citizens, but our short history with the technology shows that such systems will be used for purposes different than what was intended. I, for one, am not a fan of the whole concept and I suspect most people don’t really want to be watched.

Categories
The Industry

The Growth of 4K Video

It looks like 4K video is making it into the mainstream and is going to put a big strain on broadband networks serving residential customers. 4K video resolution is 3840 x 2160 pixels, or about 8 million pixels on a screen, which is about four times more resolution than an HD display. It takes a lot of bandwidth to stream that many pixels and with current compression technologies 4K video requires 15 – 20 Mbps download speeds. Google and others are working on better compression techniques that might cut that in half, but even so that would mean videos streams at 7 – 10 Mbps. That’s a whole new level of broadband demand that will increase the household need for faster data speeds.
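Here’s a quick check of those numbers – the pixel count versus HD, and what the quoted bitrates mean in data volume per hour of viewing.

```python
# Pixel counts and what the quoted bitrates mean per hour of viewing.
uhd_pixels = 3840 * 2160
hd_pixels = 1920 * 1080
print(f"4K: {uhd_pixels:,} pixels ({uhd_pixels / hd_pixels:.0f}x HD)")

for mbps in (20, 15, 10, 7):  # today's 4K range, then the hoped-for compression
    print(f"{mbps} Mbps -> {mbps * 3600 / 8 / 1000:.2f} GB per hour")
```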

Just a year ago it wasn’t easy to find 4K video on the web, but this year there is a lot of content being shot in the format. This includes:

  • Netflix is currently shooting most of its original content like House of Cards, Breaking Bad, Jessica Jones, Daredevil in 4K. It also has a big array of documentaries in the format as well as a number of classic movies being reformatted to 4K.
  • Amazon Prime is also filming new content like Alpha House, Transparent, Mozart in the Jungle and Man in the High Castle in 4K. They have a small library of movies in the format.
  • Sony probably has more mainstream movies in the 4K format than anybody. Rather than streaming you download Sony movies and a typical movie can take 40 GB of storage space. It doesn’t take too many movie downloads to blow the data caps of AT&T or Comcast.
  • M-Go has developed a small but growing 4K library in conjunction with Samsung. They also will be adding titles from Fox.
  • Comcast offers a few movies in 4K online for customers in partnership with NBC Universal.
  • YouTube has a huge amount of user-generated 4K video of all different types. YouTube is also now producing original content sold under YouTube Red, some of which is in 4K.
  • Ultraflix has a big library of 4K nature documentaries including some originally produced for IMAX. They are also carrying a lot of Hollywood movies.
  • Vudu, which is owned by Walmart, has a small but high-quality set of 4K content. They are the first to marry 4K video to Dolby surround sound.

If 4K follows the same migration path that took video from standard definition to HD, then within a few years 4K content is going to be everywhere. Where just a few years ago there was little video on the web, video now seems to be everywhere. There are video ads on all sorts of websites, and social media services like Facebook and Twitter spit out piles of video at a user these days.

One of the biggest problems with broadband regulation in this country is that it fails to recognize the ever-growing nature of broadband demand. Once households start using 4K video then the FCC’s newly minted definition of broadband at 25 Mbps download will already be getting stressed. The fact is that the household needs for broadband are just going to keep growing year after year and any regulatory definition of demand will be obsolete almost as soon as it is established.

Broadband demand has been growing steadily and doubling about every three years and there is no reason to think that we are anywhere close to the time when that growth curve is going to slow. 4K video is not the last new technology that will stretch our needs for broadband. When I read about where virtual and augmented reality are headed over the next five years it’s not too hard to see where the next big push for more broadband will come from.

Categories
Technology

A Network Without Wires

There is an emerging trend in the industry to try to create home networks without wires. ISPs and cable companies are all putting a lot of faith into WiFi as an alternative for wires running to computers and settop boxes.

It’s an interesting trend but one that is not without peril. The problem is that WiFi, at least as the big ISPs deliver it, is not always the best solution. The big cable companies like Comcast tend to provide customers with a cable modem with a decent quality WiFi router built in. This router is placed wherever the cable enters the home, which might not be the ideal location.

A single strong WiFi router can be a great device in a home with a simple network and uncomplicated demands. A family with two TVs, one computer, and a few smartphones is probably going to do fine with a strong WiFi router as long as the house isn’t too large for the signal to get where it’s needed.

But we are quickly changing to a society where many homes have complex data needs scattered throughout the house. People are likely to be demanding video streams from all over the home, and often many at the same time. There are bound to be a few computers and it’s not unlikely that somebody in the house works at home at least part of the time. Demands for big bandwidth for things like gaming and the new virtual reality sets that are just now hitting the market are increasing. And we are on the verge of seeing 4K video streams at 15 Mbps. On top of all this will be a variety of smart IoT devices that are going to want occasional attention from the network.

When a home gets crowded with devices it’s very easy to overwhelm a WiFi router. The new routers are pretty adept at setting up multiple data paths. But with too many streams the router will lose efficiency as it constantly tries to monitor and change the bandwidth for each stream it is managing. When this happens a home network can bog down, dropping the efficiency of the router precipitously.

There are a few solutions to this problem. First, you can run wires directly to a few of the bigger data eaters in a house and remove them from the WiFi network. Just make sure in doing so that you also disable having them search for a WiFi signal. But people don’t really want more wires in their home, and ISPs definitely do not like this idea.

The other solution is to add additional WiFi hotspots in the home. The simplest examples of this are WiFi repeaters that simply amplify the signal from the base WiFi hotspot. However, repeaters don’t improve the contention issue; they simply bring a stronger signal closer to some of the devices that need it.

A more complex solution is to set up a network of interconnected WiFi hotspots. This consists of separate WiFi routers that all feed through one base router, a configuration that is familiar to any network engineer but alien to most homeowners. The main problem with this solution is obvious to anybody who has ever operated a network with multiple routers – getting them to work together efficiently. Setting up a multiple-router network can be challenging for those unfamiliar with networks. And if configured poorly this kind of network can operate worse than one big hotspot.

But these kinds of interconnected WiFi networks are the cutting edge of home networking. I was recently talking to an engineer from a mid-size cable company and he admitted that as many as 20% of their customers already need this kind of solution. It’s a bit ironic that the demand for WiFi is mushrooming so soon after the ISPs went to the one-router solution. The percentage of homes that need a better solution is growing rapidly as homes jam more devices onto WiFi.

So there is an opportunity here for any ISP. Customers need better networks in their homes and there is a revenue opportunity in helping them to set these up. The downside, at least for now, is that this is labor intensive and there may be a lot of maintenance to keep these networks running right. But there are a number of vendors looking into solutions and one would hope that home WiFi networks will soon become plug and play.
