The Future Viability of WiFi

WiFi logo, vectorized (Photo credit: Wikipedia)

Just last week I wrote a blog post about how busy the WiFi spectrum is getting. It seems like every telecom business is using, or has plans to use, the spectrum in a big way. Since writing that post I noticed the following article, which outlines how wireless companies intend to deploy WiFi transmitters as part of their urban cell sites (and any other cell sites that experience congestion).

Just about every telecom business has some use for WiFi. Telcos and cable companies use WiFi in their data routers to spread data around homes and businesses. The wireless carriers are all planning to use WiFi to offload their tremendously busy licensed spectrum. Businesses use it to set up public hotspots. Settop manufacturers are going to use it to serve multiple TVs in your home. Devices that connect to your TV, like Roku and PlayStation, use it. And it is becoming the default spectrum for the Internet of Things, with billions of devices being made WiFi capable.

So this raises the issue of whether there is some point when there are just too many people trying to use the same spectrum in the same area at the same time. People use the WiFi spectrum because it’s free. But like any radio spectrum, it has physical limitations. At some point we can simply overwhelm a given spectrum band in a given area, and it will not work well for any of the applications trying to use it. It sure seems to me like we are headed toward that possibility with WiFi. I am sure everybody remembers the ’90s, when cordless phones came out: the spectrum got so busy in some neighborhoods that the phones just wouldn’t work.

Without getting too technical, let me discuss some of the issues associated with radio interference. There are a whole lot of different ways that interference can affect a spectrum in a given location. Consider some of the following:

Adjacent Channel Interference (ACI). The WiFi spectrum is not one big swath of bandwidth but is divided into discrete channels. Earlier versions of WiFi used one channel per transmission, but the latest standards allow for bonding channels together. Many of the problems experienced in the real world with WiFi stem from devices that are not built to the same high standards as licensed-spectrum gear. Numerous devices bleed into adjacent channels, so when such a device transmits it not only uses its assigned channel but also pollutes the channels on either side.
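
As a rough illustration of why adjacent channels collide, here is a small Python sketch of the classic 2.4 GHz channel plan. The center-frequency formula and the 22 MHz width are standard values; the simple "centers closer than one channel width" test is an approximation of real spectral masks, not a precise model.

```python
# Sketch: why adjacent 2.4 GHz WiFi channels interfere.

def channel_center_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz WiFi channel (channels 1-13)."""
    return 2407 + 5 * channel

def channels_overlap(a: int, b: int, width_mhz: int = 22) -> bool:
    """Approximate test: channels overlap if centers are closer than one width."""
    return abs(channel_center_mhz(a) - channel_center_mhz(b)) < width_mhz

# Channels 1, 6 and 11 are the classic non-overlapping set:
print(channels_overlap(1, 2))   # True  - centers only 5 MHz apart
print(channels_overlap(1, 6))   # False - centers 25 MHz apart
print(channels_overlap(6, 11))  # False
```

This is why a sloppy transmitter on channel 3 can degrade neighbors sitting on channels 1 and 6 at the same time.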

Electromagnetic Interference (EMI). This is the interference we all remember from listening to AM radio. It comes from outside sources ranging from microwave ovens, computer terminals, solar flares and doorbell transformers to hundreds of others. WiFi is not immune from external interference, so part of the spectrum is eaten up by this ambient noise.

Co-channel Interference (CCI). This is the interference that comes when more than one user is trying to use the same channel at the same time. In the voice world this is known as crosstalk, which we have all experienced on cell phones from time to time. But in a data transmission it manifests as slower data speeds, since each concurrent user loses part of their signal.
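
The "slower speeds for everybody" effect can be sketched with a toy airtime-sharing model. The 10% contention loss per extra user is purely an illustrative assumption, not a measured figure:

```python
# Sketch: co-channel interference as shared airtime. With n active users
# on one channel, each gets roughly 1/n of capacity, minus contention
# overhead. The per-user overhead figure is an illustrative assumption.

def per_user_throughput_mbps(channel_mbps: float, n_users: int,
                             contention_loss: float = 0.10) -> float:
    """Approximate throughput each user sees on a shared channel."""
    if n_users < 1:
        raise ValueError("need at least one user")
    # Total usable capacity shrinks as more users contend for airtime.
    usable = channel_mbps * max(0.0, 1 - contention_loss * (n_users - 1))
    return usable / n_users

print(per_user_throughput_mbps(54, 1))  # 54.0
print(per_user_throughput_mbps(54, 4))  # ~9.45 - far less than 54/4
```

The point of the model is the shape, not the numbers: per-user speed falls faster than simple division would suggest, because contention itself wastes airtime.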

Common Mode Interference (CMI). This is interference that comes from using spectrum to conduct two-way transmissions. This is basically interference between transmitting and receiving WiFi signals at the same time.

As we put more and more discrete WiFi paths into the same neighborhood, the effects of each of these types of interference get magnified. In any given area there is at least a little bit of all of them; that is inherent in the way radio waves interact with each other as a matter of physics. There are engineering techniques that can minimize interference. For example, it’s typical to put the transmit and receive signals as far apart as you can get them. But if you put enough different signals into the same environment, there comes a point where no technique can overcome the sheer physics of interference. The spectrum gets overwhelmed and essentially becomes worthless until the demand on it drops.

I know there are a lot of scientists and engineers who look at all of the planned uses for WiFi and just shudder, because in urban environments the spectrum is likely to get overwhelmed, and none of the uses will work as they should.

The Explosion of WiFi

Wi-Fi Signal logo (Photo credit: Wikipedia)

WiFi has been around since the mid-1990s as a local wireless data connection. WiFi products grew somewhat slowly, with the two primary uses being external WiFi networks used to supply point-to-point data in mostly rural areas, and wireless connections to computers within a home or business. Companies like Cisco and Linksys made a decent living selling WiFi transmitters.

But then along came the smartphone and suddenly cellular data offload became a huge business as everybody scrambled to use WiFi data from their landline network rather than pay for more expensive cellular data. All of a sudden WiFi routers became a necessity and most homes that have a landline data connection now also have a WiFi router. In fact, most cable companies, FTTX companies and telcos have built WiFi into their standard data modems.

And as successful as WiFi has been, the spectrum is about to get a lot busier. Consider the following industry trends:

Proliferation of Commercial Hotspots. There has been a proliferation of public WiFi hotspots in recent years. It used to be that when you wanted free WiFi you would head to a Starbucks. But since most businesses now have data connections, many of them have added WiFi for their customers’ convenience. One good indicator of this is a website that tracks known public WiFi hotspots and conveniently maps them, and even that map misses many additional hotspots that are not shown.

In addition to businesses deploying hotspots, some carriers have started deploying them as part of their business plan. For example, it was recently reported that cable companies have deployed over 300,000 public WiFi hotspots, with most of those deployed by Comcast. Comcast is deploying public hotspots in areas where it faces stiff competition from fast landline data, such as areas with Verizon FiOS, and it has concentrated them where the public tends to congregate. For instance, the company touts that it has completely covered the Jersey shore. When it can, Comcast sells hotspots to businesses as a money-making venture, but many of its hotspots are free for any Comcast customer and have been installed to give the company a competitive marketing advantage over local competition. Comcast reports that the public is flocking to its hotspots with cell phones and tablets.

Settop Boxes. Many of the settop makers for cable television are coming out with versions of their boxes that use WiFi to connect and transmit TV from one central hub to other televisions or to tablets, PCs or cell phones. There has already been a trend toward a ‘whole-house’ centralized DVR / settop box that can record and play back multiple shows to any other TV in the home. Settop box manufacturers are counting on the new 802.11ac standard to provide enough bandwidth to transmit cable signals between TVs.

City-wide WiFi networks. A number of municipalities and other entities have been expanding free WiFi networks. Wikipedia now lists 65 US cities that have deployed WiFi networks in some or all parts of the city. For the most part these networks offer free service, although some instead offer WiFi by the hour or day, similar to what is available in airports. I know of cities that do this which are not on the list, so the actual count of cities with some public WiFi coverage is probably quite a bit higher than 65. And I read almost daily of cities that are thinking about adding more. Additionally, many cities have added WiFi networks for first responders and city employees without offering these networks to the public.

The Internet of Things. But the real explosion of WiFi is going to come from the Internet of Things. There are only two reliable ways today for the multitude of IoT devices to communicate with a central hub: WiFi or Bluetooth. It appears that most device makers are leaning toward WiFi as the preferred communications method, since Bluetooth is mostly limited by line of sight to the central router. It’s estimated that over the next decade billions of new IoT devices will be deployed and will start sharing the WiFi bandwidth.

There are a lot of concerns that the number of devices that will be using WiFi is going to cause a lot of local interference, which is an issue I will cover in a later blog.

The Internet of Things Today

Image representing Electric Imp (Image via CrunchBase)

I’ve written a number of blogs about the future potential of the Internet of Things. But there are a number of devices on the market now that make the first steps of the IoT a reality today. I look at these devices, and the approaches they are taking to the world of connected things, as a precursor to where we are headed over the next few decades.

SmartThings. SmartThings sells a box of smart sensors that you can use today for functions like home security and energy management. But they also provide a developer kit for those who want to program custom applications, and there is a huge range of possible functions for the system. One would think that custom apps will soon begin appearing on the web, letting you do cool things without having to be a coder.

MobiPlug. MobiPlug is a start-up that promises to integrate all of your wireless devices regardless of the platform or wireless protocol used. Its most unique feature is that it makes your smartphone the hub device. Most other platforms supply a separate box as a hub, and I am picturing a closet full of old hubs one day, in the same way that I gathered a pile of WiFi routers. Most IoT systems already let your smartphone control settings, so why not just make it the hub too?

FitBit. By now you have probably seen your Facebook friends with the annoying posts showing how fast and where they ran today, brought to you by FitBit. But FitBit has it in its sights to become a lot more than just a training aid and monitor; the company hopes to evolve its system into everything fitness and health related in your life. FitBit is already storing data on you that can become the future basis for a health monitoring system.

AllJoyn. AllJoyn is not a device but rather a software platform being created by Qualcomm. They are taking a very different approach to the IoT, developing a platform that works independently of the Internet. This has some basic merit, in that many of the other platforms store at least some of their central core in the cloud and so become non-functional during an Internet outage. But it is also a bold step in separating our IoT data from the general Internet for privacy reasons. Do you really want your medical monitor data or security system to be hackable?

Evrythng. This company is looking at a totally different aspect of the IoT: how you interact with your devices and with the outside world. Evrythng is a software platform that lets you interface more dynamically with your IoT devices in a Facebook-type platform. One aspect of this system, however, is that your devices can ‘suggest’ additional purchases to you. This platform brings advertising into your life; your smart fridge might suggest what you should purchase to create a recipe with what you already have stored inside.

Electric Imp. And let us not forget the geeks among us who want a fully customized IoT. Electric Imp has developed an SD-card-sized WiFi node that can be used with any device. A user can program it to do anything they wish, and the cards are swappable because the programming is stored in the cloud. Think of this as the never-ending coding party that lets you program your toaster to perform amazing feats.

Freescale. This is still under development, but Freescale is looking at swallowable monitors for inside the body. Nobody is entirely sure yet just what these will monitor, but the guess is that they will be partnered with some other system, such as FitBit, as additional health monitors. Probably one of the most promising long-term uses of the IoT is in-blood monitors that catch an infection at its first signs and stop pre-cancerous cells before they get started. This technology has to start somewhere, and hopefully this is the first step.

The Quiet Expansion of Wi-Fi Networks


Wi-Fi (Photo credit: kristinmarshall)

I am sure I am like most business travelers and one of the first things I look for when I get to a new place is a WiFi connection for both my laptop and cellphone. Finding WiFi lets me get online with the computer and stops me from racking up data charges on my cell plan.

And for the longest time there has been very little public WiFi outside of Starbucks and hotels. But that is starting to change, at least in some places. Several companies have quietly been pursuing WiFi deployments.

The biggest of these is the cable companies. It’s hard to get accurate counts of how many hotspots they have deployed. In 2012 a consortium of cable companies – Comcast, Cox, Time Warner, Bright House and Optimum – banded together as the Cable WiFi consortium to deploy hotspots. Comcast claims that the industry has deployed over 300,000 hotspots, while the Cable WiFi web site claims over 200,000. But whatever the number, this is far larger than anybody else.

The Cable WiFi networks are offered to the customers of those companies as a mobile data extension of their service. Today these hotspots are centered around big cities – the northeastern corridor, San Francisco, Chicago, Los Angeles, Tampa, Austin and others.

The next biggest provider is AT&T which claims about 30,000 hot spots. AT&T claims over 705 million WiFi connections onto its WiFi network in the fourth quarter of 2012. However, Google has announced that it is getting in the game and nobody knows how big they might get with this effort. But their first announcement is that they are taking over all of the hotspots at Starbucks Coffee (which is a lot of the AT&T hotspots).

The cable companies have been deploying the hotspots in several ways. In some communities they are installing them on utility poles. In other situations they are going into establishments similar to the Starbucks WiFi.

WiFi is becoming more and more important to people’s daily lives, so this trend is going to be very popular. Cellphone plans are getting stingier and stingier with cellular data at the same time that cell phones and tablets are able to use more and more data. If that data is not offloaded onto WiFi networks, customers are facing some gigantic cellphone bills.

WiFi is never going to be a replacement for cellular. For example, the technology and spectrum used make it very difficult to do the dynamic handoffs that happen with your cell phone. You can literally walk out of WiFi coverage on foot, while cellular coverage will stay with you in a car at 60 miles per hour.

But people are finding more and more uses for WiFi all of the time, and so the desire for public WiFi is probably going to explode. The cable companies report that every time they open a new hot spot that usage explodes soon after people figure out it is available. One area where they have seen the biggest use is at the Jersey shore where vacationers and visitors are relieved to find WiFi available.

Anybody building a fiber network ought to consider a wireless deployment. There are several ways to monetize the investment. The obvious revenue from WiFi is through daily, weekly and monthly usage fees. But if you are a triple play provider, a more subtle benefit of wireless is in making your customers stickier since you are giving them a mobile component of their data service. Another revenue stream is to sell prioritized WiFi access to the local municipality, electric company and others, with priority meaning that their employees get a prioritized access to the network, with first responders trumping everybody else. There are also smaller revenue streams such as earning commissions on the DNS traffic for people who purchase products over your WiFi network.

The Future of Rural Broadband

Verizon Wireless “Rule the Air” Ad Campaign (Photo credit: Wikipedia)

There were several events this week that tell rural subscribers the future of rural broadband. It is a bleak picture.

First, at a Goldman Sachs conference on Tuesday, the CEO of AT&T said that he hoped that the new FCC chairman Tom Wheeler would be receptive to AT&T’s desire to begin retiring its copper network in favor of its wireless network. At the end of last year AT&T had said in an FCC filing that they were going to be seeking to retire the copper plant from ‘millions of subscribers’.

In that filing AT&T had asked to move from the copper network to an all-wireless all-IP network. Stephenson said that cost savings from getting rid of the copper network would be dramatic.

On that same day, Verizon CEO Lowell McAdam said that the idea of offering unlimited data plans for wireless customers was not sustainable and defied the laws of physics. Earlier this year Verizon had ended all of its unlimited wireless data plans and now has caps on every plan.

Verizon already has a rural wireless-based landline surrogate product that it calls VzW. This uses the 4G network to deliver a landline phone and data anywhere that Verizon doesn’t have landline coverage. The base plan is $60 per month and includes voice and 10 gigabytes of data. Every extra gigabyte costs $10. There is an option to buy a $90 plan that includes 20 gigabytes or $120 for 30 gigabytes.
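
The plan arithmetic above is easy to sketch in code. The prices and data allowances are the figures quoted in the text; the assumption that overage is billed per whole gigabyte is mine, for illustration:

```python
# Sketch of the VzW-style plan arithmetic described above.
import math

PLANS = [(60, 10), (90, 20), (120, 30)]  # (monthly price $, included GB)
OVERAGE_PER_GB = 10                      # $ per extra GB, per the text

def monthly_cost(gb_used: float, price: int, included_gb: int) -> int:
    """Cost of one plan for a given usage, assuming whole-GB overage billing."""
    overage_gb = max(0, math.ceil(gb_used - included_gb))
    return price + OVERAGE_PER_GB * overage_gb

def cheapest_plan(gb_used: float) -> int:
    """Cheapest monthly total across the three tiers for this usage."""
    return min(monthly_cost(gb_used, p, g) for p, g in PLANS)

print(cheapest_plan(10))  # 60  - the base plan covers it
print(cheapest_plan(25))  # 120 - cheaper to buy the big tier than pay overage
```

Even modest household usage lands a rural family on the $120 tier, which is the point the surrounding posts keep making about capped wireless replacing unlimited copper.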

Finally, at the same Goldman Sachs conference mentioned above, the CFO of Time Warner said that they saw more room for increasing data rates.

So what does all of this mean for rural subscribers? First, it means that if you are served by a large incumbent like AT&T, they are going to be working hard to retire your copper and force you onto wireless. And we all know that wireless data coverage in rural America is not particularly fast, when you can even get data. The data speeds delivered from a cell tower drop drastically with distance. In urban areas, where towers are only a mile or less apart, this doesn’t have much practical effect. But in a rural environment towers are much further apart. People lucky enough to live near a cell tower can probably get okay data speeds, but those further away will not.
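
The distance effect is straightforward physics. A minimal sketch using the standard free-space path loss formula shows how quickly signal strength falls; real rural terrain is worse than free space, and the 700 MHz frequency and distances here are illustrative:

```python
# Sketch: why cellular data speeds fall with distance from the tower.
# Free-space path loss grows ~6 dB per doubling of distance.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# A 700 MHz LTE signal, roughly 1 mile out vs 5 miles out:
print(round(fspl_db(1.6, 700), 1))  # ~93.4 dB
print(round(fspl_db(8.0, 700), 1))  # ~107.4 dB, i.e. 14 dB weaker
```

A 14 dB weaker signal forces the radio down to far less aggressive modulation, which is exactly the "okay speeds near the tower, poor speeds further out" pattern described above.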

And even if you can get wireless data, your usage is going to be capped. Rural landline data today may be slow, but it is unlimited. Customers have learned that if they put in WiFi routers they can channel all of the data usage from their cell phones and tablets to their unlimited landline data connections. But once those connections are wireless, every byte of data leaving your home, whether directly from a device or through the WiFi router, is going to count against the data caps. So rural America can expect a future where they will have data caps while people in urban areas will not.

Finally, one can expect the price of data to keep climbing. I have been predicting this for a decade. The large telcos and cable companies are facing a future where the old revenue streams of voice and cable TV are starting to decline. The only sustainable product they have is data. And so as voice and cable continue to tumble, expect incumbents to get into the habit of raising data prices every year to make up for those declines. Competition won’t help, because cellular data is already expensive, and the incumbent cable companies and telcos will be raising data rates together.

This is not a pretty picture for a rural subscriber. Customers will be forced from copper to wireless. Speeds are not likely to get much faster. Data is going to be capped and prices will probably be increased year after year.

Time for a New Spectrum Plan

The spectrum in this country is a mess. And this is not necessarily a complaint against the FCC because much of the mess was not foreseeable. But the FCC has contributed at least some to the mess and if we are going to be able to march into the future we need to start from scratch and come up with a new plan.

Why is this needed? Because of the sheer volume of devices and uses that we see coming for wireless spectrum. The spectrum the wireless carriers are using today is already inadequate for the data they are selling to customers. The cellular companies are only making it because a large percentage of wireless data is being handed off to WiFi. But what happens when WiFi gets too busy or there are just too many devices?

As of early 2013 there were over half a billion internet connected devices in the US. This is something that ISPs can count, so we know that is fairly accurate. And the number of devices being connected is growing really quickly. We are not device nuts in my house and our usage is pretty normal. And we have a PC, a laptop, a tablet, a reader and two cell phones connected to wireless. And I am contemplating adding the TV and putting in a new burglar alarm system which would easily double our devices overnight.

A huge number of devices are counting on WiFi to work adequately to handle everything that is needed. But we are headed for a time when WiFi is going to be higher power and capable of carrying a lot more data, and with that comes the risk that the WiFi waves will get saturated in urban and suburban environments. If every home has a gigabit router running full blast a lot of the bandwidth is going to get cancelled out by interference.

What everybody seems to forget, and which has already been seen in the past with other public spectrum, is that every frequency has physical limits. And our giant conversion to the Internet of Things will come to a screeching halt if we ask more of the existing spectrum than it can physically handle.

So let’s jump back to the FCC and the way it has handled spectrum. Nobody saw the upcoming boom in wireless data two decades ago. Three decades ago the smartest experts in the country were still predicting that cell phones would be a market failure. But for the last decade we have known what was coming – and the use of wireless devices is growing faster than anybody expected, due in part to the success of smartphones. And we are on the edge of the Internet of Things needing gigantic bandwidth, which will make cell phone data usage look tiny.

One thing the FCC has done that hurts the way we use the data is to chop almost every usable spectrum into a number of small channels. There are advantages to this in that different users can grab different discrete channels without interfering with other users, but the downside to small channels is that any given channel doesn’t carry much data. So one thing we need is some usable spectrum with broader channels.
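
The channel-width tradeoff can be illustrated with Shannon's classic capacity bound, which ties maximum data rate directly to channel bandwidth. The 20 dB signal-to-noise ratio used here is an assumed, illustrative figure:

```python
# Sketch: why wider channels carry more data. Shannon's bound
# C = B * log2(1 + SNR) is the textbook upper limit on capacity.
import math

def capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon capacity in Mbps for a channel of given width and SNR."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Common WiFi channel widths at an assumed 20 dB SNR:
for width in (20, 40, 80, 160):
    print(width, "MHz ->", round(capacity_mbps(width, 20)), "Mbps")
```

At a fixed SNR the bound scales linearly with bandwidth, which is why both channel bonding in WiFi and the call above for "broader channels" matter: a band chopped into narrow slivers can never carry much data per channel, no matter how clever the radios get.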

The other way we can get out of the spectrum pinch is to reallocate more spectrum to wireless data and then let devices roam over a large range of spectrum. With software defined radios we now have chips that are capable of using a wide variety of spectrum and can change on the fly. So a smart way to move into the future is to widen the spectrum available to our wireless devices. If one spectrum is busy in a given local area the radios can find something else that will work.

Anybody who has ever visited a football stadium knows what it’s like when spectrum gets full. Practically nobody can get a connection and everybody is frustrated. If we are not careful, every downtown and suburban housing area is going to look like a stadium in terms of frequency usage, and nobody is going to be happy. We need to fix the spectrum mess and have a plan for a transition before we get to that condition. And it’s going to be here a lot sooner than anybody hopes.

Delivering Gigabit Speeds

A gigabit HP-ProCurve network switch in a nest of Cat5 cables (Photo credit: Wikipedia)

There is a lot of talk about companies like Google and many municipal networks delivering Gigabit speeds to homes and residences. But what is not discussed is that there are no existing wiring technologies that can deliver that bandwidth for any significant distance. Most people are shocked when they find out how quickly data speeds drop with existing wiring technologies.

Existing wiring is adequate to deliver Gigabit speeds to the smaller homes or to small offices. Carriers have typically used category 5 wiring to deliver data signal, and that technology can deliver 1 Gigabit for about 100 feet from the fiber terminal. But after that the speeds drop off significantly.

Wiring technology was never a significant issue when we were using the wiring to deliver slower data speeds. The same fall-off occurs regardless of the data speeds being delivered, but a customer won’t notice as much when a 20 Mbps data connection falls to a few Mbps as when a Gigabit connection falls to the same very slow speed.

Many carriers are thinking of using the new 802.11ac WiFi technology as a surrogate for inside wiring. But speeds on WiFi drop off faster than speeds on data cabling. So one has to ask whether a customer ought to bother paying extra for a Gigabit if most of it never gets delivered to his devices.

Below is a chart comparing the different technologies used today for data wiring, along with a few that have been proposed, like WiGig. The speeds in this table are at the ‘application layer’. These are theoretical speeds, but they are the easiest numbers to use in a chart because they are the speeds touted when each technology is promoted. But note that actual delivered data speeds are significantly less than these application-layer speeds for every technology listed, due to things like protocol overhead and the modulation techniques used.
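
The gap between headline and delivered speeds can be sketched as a simple efficiency factor per technology. The factors below are illustrative assumptions of mine, not measured values; real overhead varies with protocol, distance and interference:

```python
# Sketch: turning an advertised "application layer" rate into a rough
# delivered estimate. Efficiency factors are illustrative assumptions.

EFFICIENCY = {                 # assumed fraction of headline rate delivered
    "cat5e_ethernet": 0.94,    # framing, interframe gaps
    "802.11n_wifi":   0.50,    # MAC overhead, contention, retries
    "802.11ac_wifi":  0.60,
}

def delivered_mbps(headline_mbps: float, technology: str) -> float:
    """Rough delivered throughput for an advertised headline rate."""
    return headline_mbps * EFFICIENCY[technology]

print(round(delivered_mbps(1000, "cat5e_ethernet")))  # 940
print(round(delivered_mbps(1300, "802.11ac_wifi")))   # 780
```

Whatever the exact factors, the structure of the calculation is the point: every hop multiplies the headline number by something less than one, and wireless hops multiply by the smallest factors.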

Speeds Chart

The technology that stands out on the chart is ultra-broadband from PulseLink of Carlsbad, California. PulseLink uses the radio frequency (RF) spectrum on coaxial cable above 2 GHz and can deliver data rates exceeding 1 Gbps. They are marketing the technology under the name CWave. The technology uses a wide swath of RF spectrum in the 3 to 5 GHz range. As a result, the RF signal is out-of-band (OOB) to both cable TV and satellite and will peacefully coexist with both. Typically, RF spectrum above 3 GHz on coax has been considered unusable, but due to the unique techniques used in PulseLink’s CWave chipset, the technology reliably delivers Gigabit data rates without disturbing the existing frequencies used by cable TV and cable modems. Effectively it adds a whole new Ethernet data path over existing coaxial cable, needing no new wires where coax is already present.

The differences among the various technologies really matter when you are looking at delivering data to larger buildings like schools and hospitals. As was recently in the news, President Obama announced a ConnectED initiative with the stated goal of bringing a minimum of 100 Mbps, and a target of 1 Gbps, to 99% of students within five years. But there does not seem to be any good reason to bring a gigabit to a school if only a tiny fraction of that bandwidth can be delivered to the classrooms. I think the PulseLink ultra-broadband technology might be the only reasonable way to get that much broadband to our classrooms.

FCC Makes Changes to 60 GHz Spectrum

United States radio spectrum frequency allocations chart as of 2003 (Photo credit: Wikipedia)

On August 12, 2013 the FCC, in [ET Docket No 07-113], amended the rules for outdoor use of the 60 GHz spectrum. The changes were prompted by the industry to make the spectrum more useful. This spectrum is commonly known as millimeter-wave spectrum, meaning it has a very short wavelength, and it operates between 57 GHz and 64 GHz. Radios at frequencies this high have very short antennas, which are typically built into the unit.

The spectrum is used today in two applications, a) as outdoor short-range point-to-point systems used in place of fiber, such as connecting two adjacent buildings, and b) as in-building transmission of high-speed data between devices for functions such as transmitting uncompressed high-definition (HD) video between devices like blu-ray recorders, cameras, laptops and HD televisions.

The new rules modify the outside usage to increase power and thus increase the distance of the signal. The FCC is allowing an increase in emissions from 40 dBm to 82 dBm which will increase the outdoor distance for the spectrum up to about 1 mile. The order further eliminates the need for outside units to send an identifying signal, which now makes this into an unlicensed application. This equipment would be available to be used by anybody, with the caveat that it cannot interfere with existing in-building uses of the spectrum.
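
Under free-space assumptions alone, the extra 42 dB of allowed power would buy an enormous range increase; it is largely the atmospheric absorption and rain fade at 60 GHz that keep the practical limit near a mile. A quick sketch (the ~15 dB/km oxygen absorption figure is a commonly cited approximation, not from the FCC order):

```python
# Sketch: what the 40 dBm -> 82 dBm increase buys in range.
# With pure spreading loss, distance scales as 10^(extra_dB / 20).

def range_multiplier(extra_db: float) -> float:
    """Free-space range gain from extra link budget (no absorption)."""
    return 10 ** (extra_db / 20)

extra = 82 - 40                        # dB of additional allowed power
print(round(range_multiplier(extra)))  # 126 (free space only)

# But 60 GHz also suffers roughly 15 dB/km of oxygen absorption, so
# each extra kilometer costs ~15 dB on top of spreading loss, and the
# realized range grows far more slowly than the 126x free-space figure.
```

This is why the order only extends practical outdoor reach to about a mile despite a power increase that sounds gigantic on paper.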

One of the uses of these radios is that multiple beams can be sent from the same antenna site due to the very tight confinement of the beams. One of the drawbacks of this spectrum is it is susceptible to interference from heavy rain, which is a big factor in limiting the distance.

Radios in this spectrum can deliver up to 7 Gbps of Ethernet (minus some for overhead), so this is intended as an alternative to fiber drops for buildings needing less bandwidth than that limit. A typical use might be to connect multiple buildings in a campus or office-park environment rather than having to build fiber. The FCC sees this mostly as a technology to serve businesses, probably due to the cost of the radios involved.

Under the new rules the power allowed by a given radio is limited to the precision of the beam created by that radio. Very precise radios can use full power (and get more distance) while the power and distance are limited for less precise radios.

The FCC also sees this as an alternative for backhaul to 4G cellular sites, although the one-mile limitation is a rather short one; most 4G sites within a mile of fiber have already been connected.

This technology will have a limited use, but there will be cases where using these radios could be cheaper than installing fiber and/or dealing with inside wiring issues in large buildings. I see the most likely use of these radios to get to buildings in crowded urban environments where the cost of leasing fiber or entrance facilities can be significant.

The 60 GHz spectrum has also been allowed for indoor use for a number of years. Used indoors, the 60 GHz band has a lot of limitations related to both cost and technical issues. The technical limitations: 60 GHz must be line-of-sight, and the spectrum doesn’t go through walls. The transmitters are also very power-hungry and require big metal heat sinks and high-speed fans for cooling. Even if a cost-effective 60 GHz solution were available tomorrow, battery-operated devices would need a car battery to power them.

One issue that doesn’t get much play is the nature of 60 GHz RF emissions. A 60 GHz transmitter can radiate up to 10 watts under the spectrum mask currently in place for indoor operation. People are already concerned about the 500 mW from a cell phone or WiFi router, and it is a concern in a home environment to have constant radiation at 10 watts of RF energy. That’s a meaningful fraction of the power of a microwave oven, radiated in your house and around your family all of the time.
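
A rough way to put these power levels in perspective is the inverse-square law for power density. Treating each transmitter as an isotropic point source is a big simplification (real antennas are directional), but it shows how strongly distance shapes actual exposure:

```python
# Sketch of the exposure comparison: power density from an isotropic
# radiator falls off as 1/(4*pi*r^2). Distances are illustrative.
import math

def power_density_w_per_m2(tx_watts: float, distance_m: float) -> float:
    """Power density of an isotropic source at a given distance."""
    return tx_watts / (4 * math.pi * distance_m ** 2)

# A 10 W 60 GHz node across the room vs a 0.5 W phone at your ear:
print(power_density_w_per_m2(10, 3.0))    # ~0.088 W/m^2
print(power_density_w_per_m2(0.5, 0.05))  # ~15.9 W/m^2
```

The sketch cuts both ways: a 10 W source across the room actually delivers less power density than a phone held at the ear, but unlike a phone call, a home transmitter runs continuously.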

At some point in the distant future there may be reasonable applications for indoor use of 60 GHz in some vertical niche market, but not for years to come.

The Future of TV – The Sets

English: Various remote controls for TV set, DVD and VHS. (Photo credit: Wikipedia)

I think everybody agrees that television viewing is changing rapidly, and everybody in the industry has been thinking about how these changes will impact the cable business. I am going to do a series of blogs for a few Mondays looking at where industry experts think the business is moving. I will start off today looking at the future of the television set and then move on to other aspects of the business such as advertising, content production and viewing habits.

For the first time in many decades the purchase of new television sets is down. This seems to be due to two primary factors. First, 11% of homes now say that they watch all of their video on computers, laptops, tablets or smartphones. So some households have given up on having a centralized set that everybody can watch together. However, the communal nature of TV viewing probably means that most households are going to want to keep a TV set of some sort. Second, TVs are being upgraded less often, and people are treating them more as a screen than as a standalone device. When somebody connects a streaming device like a Roku or Google Chromecast to their TV, they have in effect upgraded without the necessity of buying a new monitor.

So I looked around to see what the experts think will happen to the TV set over time. Here are some guesses, both short-term and long-term.

Short-Term.  In the short term, TV sets are going to get glitzier and gain even more functions than they have today. Of course, not all big TV innovations succeed; 3D TV, for example, fizzled after its 2010 debut. But before TV manufacturers concede that the standalone TV set is dead, they are going to try to sell new sets by pushing new features. Some of the features showing up on new TVs now include:

  • Split screens. This takes the idea of picture-in-picture and creates up to four separate pictures on the screen at the same time. Thus, a sports fan could watch four football games simultaneously. This has to be giving nightmares to companies delivering IPTV over DSL, since each set could request up to four HD channels at the same time.
  • Ultra High Definition. There are now TVs being made with 4K resolution, which provides four times as many pixels (a 3840 x 2160 pixel grid compared to today's 1920 x 1080 grid).
  • OLED (Organic Light Emitting Diode) TVs. These are ultrathin TVs made of layers of sprayed-on materials that create a new kind of diode. The diodes emit their own light and turn black when not in use. Korean manufacturers have made an OLED screen that is flexible and only 4 mm thick.
  • IGZO (Indium Gallium Zinc Oxide). Sharp has introduced a new LCD screen that is much brighter and can change colors much faster than older LCD screens. This is ideal for gaming but also makes a superior TV screen.
  • Smart TVs. It is rumored that Apple is almost ready to release its iTV, the next generation of smart TV. A smart TV is really a new kind of smarter settop box combined with a screen. Apple will probably build Siri, iSight and other computer and smartphone features into the box. The smart TV will no longer be just a tuner and recorder but a full-functioning application machine that brings the web and smartphone apps, fully integrated, to the TV set.
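
The split-screen worry for IPTV over DSL is easy to put numbers on. Here is a back-of-the-envelope sketch; every bitrate and the DSL loop speed are illustrative assumptions, not any carrier's actual figures.

```python
# Back-of-the-envelope IPTV demand for a home with split-screen sets.
# All numbers are illustrative assumptions, not any carrier's figures.
HD_STREAM_MBPS = 8          # one compressed HD channel, roughly
STREAMS_PER_SET = 4         # a split-screen set showing four games at once
SETS_IN_HOME = 2            # two such sets in the household

demand_mbps = HD_STREAM_MBPS * STREAMS_PER_SET * SETS_IN_HOME

DSL_LOOP_MBPS = 25          # a hypothetical DSL delivery to the home
print(f"worst-case demand: {demand_mbps} Mbps vs a {DSL_LOOP_MBPS} Mbps loop")
print("congested" if demand_mbps > DSL_LOOP_MBPS else "fits")
```

Even with these modest assumptions, two split-screen sets ask for more than double what the loop can carry, which is exactly the nightmare scenario for a DSL-based IPTV provider.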

Long Run. In the long run it is likely that the settop box functionality will be completely separated from the display. Flexible, transparent OLED displays will mean that a TV can be installed anywhere by laying a film over an existing surface. So there could easily be an inexpensive TV display on the side of the refrigerator, on every mirror in the house or on any wall. These TVs will be operated by a smart box combined with very fast in-home WiFi that lets all of the displays be integrated into one system. This will allow interesting new features such as 'follow-me' TV, where the signal follows a person from device to device and from room to room as they move through the house.

TV is also likely to become far more personal to each person in the household, a topic which I will look at in a future blog.

One small detail I almost forgot: the lowly TV remote is likely to die soon. The remote we have today is largely still with us due to an FCC rule called the integration ban, which requires cable settop box manufacturers to use a removable security module, called a CableCARD. And so the current remotes still work on ancient infrared technology.

Remotes are starting to be replaced by smartphones and there are apps which can take over many of the remote functions. But in the not-too-distant future the smart TVs are going to do away with the need for any device and you will be able to control the TV by voice commands or by gestures. I know this will save me the five minutes it takes me every time I go to watch TV and try to remember where I left the remote!

Do You Understand Your Chokepoints?

Almost every network has chokepoints. A chokepoint is some place in the network that restricts data flow and degrades the performance of the network beyond that point. In today's environment, where everybody is trying to coax more speed out of their network, these chokepoints are becoming more obvious. Let me look at the chokepoints throughout the network, starting at the customer premises.

Many don’t think of the premise as a chokepoint, but if you are trying to deliver a large amount of data, then the wiring and other infrastructure at the location will be a chokepoint. We are always hearing today about gigabit networks, but there are actually very few wiring schemes available that will deliver a gigabit of data for more than a very short distance. Even category 5 and 6 cabling is only good for short runs at that speed. There is no WiFi on the market today that can operate at a gigabit. And technologies like HPNA and MOCA are not fast enough to carry a gigabit.

But premises wiring and customer electronics can create a chokepoint even at slower speeds. It is a very difficult challenge to bring speeds of 100 Mbps throughout large premises like schools and hospitals. One can deliver fast data to the building, but once the data is put onto wires of any kind the performance decays with distance, and generally a lot faster than you would think. I look at the recently announced federal goal of bringing a gigabit to every school in the country and I wonder how they plan to move that gigabit around the school. The answer, mostly, is that with today's wiring and electronics they won't. They will be able to deliver a decent percentage of the gigabit to classrooms, but the chokepoint of wiring is going to eat up a lot of the bandwidth.

The next chokepoint for most technologies is the neighborhood node. Cable TV HFC networks, fiber PON networks, cellular data networks and DSL networks all rely on neighborhood nodes of some kind, a node being the place where the network hands the data signal off to the last mile. These nodes are often chokepoints due to what is called oversubscription. In an ideal network there would be enough bandwidth for every customer to use everything they have been sold simultaneously. But very few network operators want to build that network because of the cost, and so carriers oversell bandwidth to customers.

Oversubscription is the practice of selling the same bandwidth to multiple customers on the statistical bet that only a few customers in a given node will be making heavy use of it at the same time. Effectively, a network owner can sell the same bandwidth many times over, knowing that the vast majority of the time it will be available to whoever wants to use it.
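
The statistical bet behind oversubscription can be sketched with a toy model. All of the numbers below are made up for illustration, and real traffic engineering uses far more sophisticated models than a simple binomial coin flip, but the shape of the trade-off comes through.

```python
from math import comb

def oversubscription_ratio(subscribers: int, speed_mbps: float,
                           node_capacity_mbps: float) -> float:
    """How many times over the node's capacity has been sold."""
    return subscribers * speed_mbps / node_capacity_mbps

def p_congestion(subscribers: int, p_active: float, max_concurrent: int) -> float:
    """Probability that more than max_concurrent subscribers are heavy
    users at once, modeling each as an independent coin flip (binomial)."""
    p_ok = sum(comb(subscribers, k) * p_active**k * (1 - p_active)**(subscribers - k)
               for k in range(max_concurrent + 1))
    return 1 - p_ok

# Hypothetical node: 200 subscribers sold 100 Mbps each on a 1 Gbps node.
ratio = oversubscription_ratio(200, 100, 1000)   # sold 20x over capacity
# The node can carry 10 full-rate users at once; how often is that exceeded
# if 5% versus 10% of subscribers are maxing out their connection?
quiet = p_congestion(200, p_active=0.05, max_concurrent=10)
busy = p_congestion(200, p_active=0.10, max_concurrent=10)
print(f"{ratio:.0f}x oversubscribed; P(congested): {quiet:.2f} quiet, {busy:.2f} busy")
```

The point of the sketch is how sharply the congestion probability climbs when usage patterns change: a node engineered for a 5% heavy-use rate becomes almost permanently congested at 10%, which is exactly what happens to nodes that gain customers beyond their original design.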

We are all familiar with the chokepoints that occur in oversubscribed networks. Cable modem networks have been infamous for years for bogging down each evening when everybody uses the network at the same time. And we are also aware of how cell phone and other networks get clogged and become unavailable in times of emergency. These are all chokepoints caused by oversubscription at the node. Oversubscription is not a bad thing when done well, but many networks end up, through their own success, with more customers per node than they were originally designed for.

The next chokepoint in many networks is the backbone fiber electronics that deliver bandwidth from the hub to the nodes. Data usage has grown at a very rapid pace over the last decade, and it is not unusual to find backbone data feeds where today's usage exceeds the original design parameters. Upgrading the electronics is often costly because in some networks you have to replace the electronics at every node in order to fix the ones that are full.

Another chokepoint in the network can be hub electronics. It’s possible to have routers and data switches that are unable to smoothly handle all of the data flow and routing needs at the peak times.

Finally, there can be a chokepoint in the data pipe that leaves a network and connects to the Internet. It is not unusual to find Internet pipes that hit capacity at peak usage times of the day which then slows down data usage for everybody on the network.

I have seen networks that have almost all of these chokepoints and I’ve seen other networks that have almost no chokepoints. Keeping a network ahead of the constantly growing demand for data usage is not cheap. But network operators have to realize that customers recognize when they are getting shortchanged and they don’t like it. The customer who wants to download a movie at 8:00 PM doesn’t care why your network is going slow because they believe they have paid you for the right to get that movie when they want it.