Keep People in the Equation

As I keep reading about the coming Internet of Things I keep running into ideas that make me a bit uneasy. And since I am a tech head, I imagine that things that make me a little uneasy might make many people a whole lot uneasy.

For instance, I read about the impending introduction of driverless cars. I have to admit that when I am making a long drive on the Interstate, having the ability to just hand the driving off to a computer sounds very appealing. I would think that the challenge of driving on wide-open highways at a consistent speed is quite achievable.

But it makes me uneasy to think about all cars everywhere becoming driverless. I sit here wondering if I really want to trust my personal safety to traveling in a car in which software is making all of the decisions. I know how easily software systems crash, get into loops and otherwise stutter, and I can’t help picturing being in a vehicle when a software glitch raises its ugly head.

I know that a road accident can happen to anybody, but when I drive myself I have a sense of control, however misplaced. I feel like I have the ability to avoid problems a lot better than software might when it comes down to a bad situation.

I am probably wrong, but it makes me uneasy to think about climbing into a cab in a crowded city and trusting my life to an automated vehicle. And I really get nervous thinking about sharing the road with robot tractor-trailers. The human-driven ones are scary enough.

I am probably somewhat irrational in this fear because I would guess that if all vehicles were computer-controlled there would be a lot fewer accidents, and we certainly would be protected from drunk drivers. Yet a nagging part of my brain still resists the idea.

I also worry about hacking. Perhaps one of the easiest ways to bump somebody off would be to hack their car and make it have an accident at high speed. You know it’s going to happen, and that will make people distrust the automated systems. Hacking can break our faith in a whole lot of the IoT, since there will be ample opportunities to hurt people by interfering with their car or their medicine or other technology that can harm as easily as it can help.

I can’t think I am atypical in this kind of fear. I think that as we make these big changes, people somehow have to be part of the equation. I don’t have an answer to this, and frankly this blog just voices the concern. But it’s something we need to consider and talk about as a society.

The people issue is going to spring up around a lot of the aspects of IoT. It has already surfaced with Google Glass and many people have made it clear that they don’t want to be recorded by somebody else surreptitiously. As the IoT grows past its current infancy there are bound to be numerous clashes coming where tech confronts human fears, feelings and emotions.

There are certainly many aspects of the IoT that excite me, but as I think about them I would bet these same changes will frighten others. For instance, I love the idea of nanobots in my bloodstream that will tell me days early if I am getting sick, or that will be able to kill pre-cancerous cells before they get a foothold in my body. But I am sure the idea of having technology in our blood scares the living hell out of other people.

I don’t know how it’s going to happen, but the human equation must become part of the IoT. It has to. If nothing else, people will boycott the technology if it doesn’t make us feel safe.

An Industry of Hype?

Almost every day I see somebody in this industry making a claim that makes me wince a little bit. It might be vendors talking about gigabit speeds. It may be service providers talking about gigabit cities. And to some extent I get it. It’s a world driven by marketing and everybody competes first with hype. Those in the know quickly figure out the truth, but I guess what bothers me is that others don’t.

Let’s start with the equipment vendors. The country is pushing hard to get gigabit bandwidth into our schools. And since schools are already wired with coaxial cable, this led me to look at the technologies in use today that can deliver bandwidth over existing cable in schools. After all, what good is bringing a gigabit to a school if you can’t actually get it to the classroom? The various technologies, including HPNA, MoCA and HomePlug, all claim gigabit-capable speeds. Additionally, the new WiFi standard 802.11ac promises gigabit and above speeds. Another upcoming technology is G.fast, which promises gigabit speeds on copper.

But none of these technologies actually delivers a gigabit at the application layer, which is the usable speed of bandwidth that is available to an end user. Some of these technologies do provide a gigabit of theoretical data at the transport layer, but after accounting for the various overheads, noise, interference and other factors, the actual bandwidth is much slower than advertised. Additionally, the speeds they tout are the total bandwidth of the technology and those speeds need to be divided into an upstream and downstream component, further diluting the bandwidth.

At best, these various technologies today deliver maybe 400 Mbps in total bandwidth, and a few of them are quite a bit slower than that. So it turns out that these gigabit technologies are not really gigabit, or even half a gigabit. But a non-engineer would not know this from looking at how they are advertised.
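To put some rough numbers on this, here is a back-of-the-envelope sketch in Python. The 35% overhead figure and the 70/30 downstream split are my own illustrative assumptions, not measurements of any specific technology, but they show how a ‘gigabit’ claim at the transport layer shrinks to something far smaller at the application layer:

```python
# Rough effective-throughput estimate for a "gigabit" LAN technology.
# Both parameters below are illustrative assumptions, not measurements.
def effective_throughput_mbps(raw_mbps, overhead_fraction=0.35,
                              downstream_share=0.7):
    """Usable downstream speed after protocol overhead and the
    upstream/downstream split."""
    usable_total = raw_mbps * (1 - overhead_fraction)
    return usable_total * downstream_share

print(round(effective_throughput_mbps(1000)))  # → 455
```

With even mildly pessimistic assumptions the ‘gigabit’ headline number lands in the same 400 Mbps range described above.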

We have the same thing going on with service providers. For years broadband providers have sold ‘up-to’ data speeds that they were never able to achieve. There is still a lot of that going on, particularly in smaller markets, where the advertisements talk about speeds available in nearby urban areas that are far in excess of what can actually be achieved in small towns.

But the one that really gets me is the term gigabit cities. When I hear gigabit cities I picture a place that is building a network that will make a gigabit data product available to every home and business in the community. And there are almost no cities like that.

People think Google is bringing gigabit everywhere, but they aren’t. First, they only go to neighborhoods that guarantee a certain penetration rate for Google. And once there they don’t serve any apartment complexes or businesses. Google is basically cherry-picking residents willing to pay $70 per month for data. While laudable (and I wish I could get it), Google is not building gigabit cities.

There seem to be other cities announcing themselves as gigabit cities almost weekly. Some of them offer gigabit speeds to residents, but at very high prices, as much as $250 per month. Most of these cities only supply gigabit speeds to schools and a handful of large businesses. Again, very laudable, and I am happy to see anybody invest in fiber. But gigabit to the schools and factories does not make a gigabit city. It just makes fast schools and factories.

There are a small handful of places that really are gigabit communities. There are some small telcos, municipalities and cooperatives that are offering gigabit to everybody in their footprint. But this is really rare and for the most part these are small communities. Interestingly, the folks that actually do it don’t tout themselves and just quietly deliver fast speeds to customers. I’m starting to think that the ones who yell loudest are the least likely to actually be doing it. I hope somebody can prove me wrong about this.

Another Regulatory Gotcha

The FCC recently went through the process of eliciting stories about ideas for rural broadband. I had a bit of a problem with how they went about it, because they made it sound like anybody who told them their story was eligible to be chosen to get funding for a rural broadband experiment. But this wasn’t true; the FCC really was just gathering stories. The actual applications to get funded will come later this year.

There is another thing that the FCC didn’t make very obvious to possible applicants. Any entity that wants to get money out of the Connect America Fund must be an Eligible Telecommunications Carrier (ETC). To be fair, the FCC says that companies that request funding don’t have to be an ETC at the time of filing, but that they must achieve that status before they can actually receive funds. The FCC language makes it clear that it expects ETC status to be obtained rather quickly.

What the FCC doesn’t seem to understand is that it can be very time consuming to become an ETC and in some cases impossible for some of the entities who are interested in the broadband experiments.

In most states there is a two-step process to become an ETC. First you must be certified as a carrier in your home state. The type of certification required varies by state. In some states you would have to obtain a Certificate of Public Convenience and Necessity (CPCN) and in other states you would have to become a CLEC or some other form of carrier.

Getting that kind of certification is not a slam dunk for start-ups and municipalities. Generally somebody wanting these certifications needs to pass three tests: that they are financially, managerially and technically capable of being a carrier. A start-up trying to get the FCC funding might fail one of these tests. For instance, a city might not be able to demonstrate technical capability because that is expertise they were planning to hire after getting the funding. And in some states start-ups have trouble meeting the financial capability test set by their state regulatory commission. The process of getting certified can take anywhere from 90 to 180 days in most states, assuming you can meet all of the requirements.

Then, after getting the certification as a carrier, an entity can file to become an ETC. There are some very specific requirements for becoming an ETC that are going to stop some filers. For instance, an ETC must be willing and able to serve everybody in an existing ‘exchange’. Exchanges are the service areas of the incumbent telcos, and most rural exchanges have a town in the center surrounded by a sizable rural area. So anybody who wants to be an ETC must agree to serve that whole area. In some states a municipality is prohibited from serving, or has a very difficult time serving, anybody outside its city borders. And let’s face it, serving broadband to farms is expensive, and so having to agree to serve those areas can break a start-up business plan. So even if a city or ISP gets certified, it’s no slam dunk that they will meet the requirements to become an ETC. And even if they can, I know there are many states where the ETC process can take a year.

Additionally, at both of these steps, the process can be further delayed if somebody intervenes in the regulatory process. The local telco or cable company can (and often does) intervene in the certification and/or ETC process as a delaying tactic to slow down potential competition. It’s not hard for the whole end-to-end process of becoming a carrier and then an ETC to take two years. And that will not work for the funding process. So many of those who are thinking about asking for this money have no idea that the regulatory cards are stacked against them.

At the end of the day, all that getting ETC status proves is that you are good at the paperwork of regulation. The status really has no other practical benefit. And I say this as somebody who gets paid to obtain these kinds of certifications. Some regulation is good, but I hate regulation for regulation’s sake. And this requirement to be an ETC in order to bring broadband to rural places is a stupid dinosaur of a regulatory requirement.

5G Already?

We knew it was coming, and the wireless industry is already bandying about the term 5G. Nobody knows exactly what it is going to be, but the consensus is that it’s going to be fast. The South Koreans are devoting $1.5 billion in research to develop the next generation of wireless. And there are vendors like Samsung who are already starting to claim that the upgrades in their labs today are 5G.

And of course, all of this is hype. There is still no wireless broadband anywhere that complies with the original 4G specifications. This all got out of hand when the marketing groups started to tout 3G performance for networks that were not yet at the 3G specs. And then came 3.5G and 4G, and now I guess 5G.

But let’s look at the one claim that it seems 5G is going to have, which is blistering fast speeds, perhaps up to 1 gigabit per second. What would it really take to provide a 1 gigabit cell phone data link? The answers can all be derived by looking at the basic physics of the spectrum.

Probably the first limiting characteristic is going to be proximity to the transmitter. When you look at spectrum between 3 GHz and 6 GHz, the likely candidates for US deployment, the math tells you that it’s going to be hard to send a 1 gigabit signal very far, maybe 150 feet from the transmitter. Beyond that the signal is still fast, but the speeds quickly drop with distance. Unless we are going to place a mini cell site in every home and on every floor of a business, it is not very likely that we are going to get people close enough to transmitters to achieve gigabit speeds.
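For the curious, that math can be sketched with the Friis free-space path loss formula and the Shannon capacity limit. The transmit power, channel width and noise figure below are my own illustrative assumptions, and real modulation schemes only achieve a fraction of the Shannon ceiling, so practical distances would be shorter still:

```python
import math

C = 3e8  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Friis free-space path loss in dB (ideal line of sight)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

def capacity_bps(distance_m, freq_hz, bw_hz=160e6,
                 tx_dbm=23, noise_figure_db=7):
    """Shannon upper bound on link capacity at a given distance.
    Transmit power, channel width and noise figure are assumed values."""
    rx_dbm = tx_dbm - fspl_db(distance_m, freq_hz)
    # Thermal noise floor: -174 dBm/Hz plus receiver noise figure.
    noise_dbm = -174 + 10 * math.log10(bw_hz) + noise_figure_db
    snr = 10 ** ((rx_dbm - noise_dbm) / 10)
    return bw_hz * math.log2(1 + snr)

for meters in (15, 50, 150, 500):
    gbps = capacity_bps(meters, 5e9) / 1e9
    print(f"{meters:4d} m: {gbps:5.2f} Gbps theoretical ceiling")
```

Even this idealized ceiling falls below a gigabit within a few hundred meters at 5 GHz, and real hardware gets nowhere near the ceiling.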

It certainly is possible to generate speeds that fast at the transmitter. But such a network would need fiber everywhere to feed cell phones. A network with fiber that dense probably wouldn’t even need to be cellular and could handle nearby phones using WiFi.

We certainly need new antenna technologies, and those are being worked on in labs. I’ve written previous blog posts about the various breakthroughs in antenna technology, such as very big arrays using large numbers of MIMO antennas. I think we can believe that antennas will get better with more research.

We need better processors and chips. A chip capable of receiving and processing a gigabit of data is going to be an energy hog in terms of the power available in a cell phone. Such chips are already here, but they are deployed in bigger devices with enough power to run them. So we are going to need next-generation chipsets that require less energy and generate less heat before any cell phone can actually use a gigabit of data.

We need carriers willing to supply that much data. Let’s face it, cellular networks are designed to provide something okay to many rather than something great to a few. Perhaps making cell sites smaller would help alleviate this issue, but it is a real one. If somebody is really dragging a gigabit out of a cell site there is not a whole lot left for anybody else. And this would require increasing the backhaul to cell sites to 100 Gbps or even terabit speeds if gigabit phones became the norm.

Finally, we need a new FCC. Because the way that spectrum is divvied up in the US makes these kinds of speeds nearly impossible. Gigabit speeds would be easily achievable today if there were some giant swaths of bandwidth available. But our bandwidth is split into little discrete pieces, and most of those pieces are further divided into channels. This makes it really hard to cobble together a big consistent bandwidth delivery system. We tend to think of wireless as a big pipe in the same manner as a fiber. But it’s really a whole lot of discrete little signals that somebody has to join together to get a huge throughput.

Hot Spot 2.0

Hot Spot 2.0 is the name of an effort to link WiFi networks together to make it easier for people to get WiFi when they are not at home. Anybody who travels understands the incredible hassle of constantly looking for hot spots, figuring out how to connect to them and then doing it over and over as you move around.

The goal of Hot Spot 2.0 is to make it as easy to use WiFi as it is to use your cell phone when you step off an airplane. Your smartphone automatically finds a compatible cellular data network and begins downloading emails as soon as you are on the runway. By the time you walk into the terminal you are up and running.

There are several different aspects that need to come together to make this work. First is a set of standards, and that process is already underway. The concept is being developed by the WiFi Alliance, which is using the trademarked name ‘Passpoint’ to certify the hardware and software involved in the process. The first set of standards was released in June 2012 and further releases are in the development pipeline. The first release covered the basics, including automating network discovery, authentication and security.

Next is hardware and software systems that will support the standard. There are already companies like Ruckus that are working on solutions you can buy. A network that implements this is going to have to be able to recognize users trying to connect with the protocol and then be able to authenticate users automatically.

Finally is selling the idea to users. It’s a no-brainer to sell this to people who travel a lot, but the goal will be to get this out to all of your customers who want the ease of being connected to WiFi whenever it’s available. With the data caps on cell phones this is a no-brainer for smartphone users, but it can be valuable to people with tablets and laptops as well.

One can imagine that as more and more carriers get on board with the concept, large nationwide associations of networks will grow. There is already talk of creating WiFi roaming arrangements that will let your customers use Hot Spot 2.0 on your network or any other network with which you have an arrangement. As big consortiums are created, the value of this to customers will grow. Roaming might even become a source of revenue if you have a network that entertains a lot of visitors.

But there is merit in implementing this on your own network. This product brings a valuable service that some customers will be willing to pay for. Or you might bundle it in for anybody who buys one of your landline data products. Customers will love the mobility of being connected automatically to WiFi as they move around their home town. And this gives you a good reason to sell more hot spots to businesses in town so that they can be part of the network.

Cities often talk about the goal of being wired, and mobile data is a huge component of that concept. If a city has enough hot spots then it has enabled cheap broadband access for anybody with a cell phone, tablet or laptop. This could finally be the solution to the digital divide, since it could enable broadband for even the poorest among us. Get them a Hot Spot 2.0 capable device and they will have broadband at many locations around town.

This effort certainly has potential and should have legs because of the gigantic number of smartphone users that can benefit by automatically connecting to WiFi when it is available. And I know carriers are thinking about this. When I signed up for my new Comcast data service in November the terms of service included a caveat that my broadband connection might be shared with other users. That was something new that I had never seen before and I think Comcast is preparing for the day soon when every one of their data users is also a Hot Spot 2.0 site. That creates a huge network from the moment it is activated.

Only now, instead of yelling at kids to get off my lawn I will be yelling at them to get out of my WiFi!

Beyond Cookies

This is not a blog entry about cakes and pies, but rather more discussion about how companies are tracking people on the web. A few weeks back I wrote a primer on cookies, which are the small files left on your machine to store facts about you. Cookies can store almost anything and can be as simple as something that remembers your login and password or as complex as storing all sorts of other information about what you are doing on the web.

But many people have become very conscious of cookies and routinely delete them from their computers. Further, our web habits have changed and we access the web from multiple platforms. Cookies are only good for the device they are stored on and are not particularly effective in today’s multi-device environment. So there are new techniques being used to track what you do on the web including authenticated tracking, browser footprinting and cross-device tracking.

We make it easy for big companies to track us without cookies because we basically tell them who we are when we log onto our devices. You routinely authenticate who you are when you use sites like Facebook, iTunes, Gmail and others. An example is your Android phone. The first thing you are asked to do when you buy an Android phone is to log on with a Gmail account as part of the activation process. It never asks you for this again, but every time you turn on your phone it automatically logs you into that Gmail account, and Google always knows who you are. Apple phones and tablets have something similar in that each device is given a unique identifier code known as a UDID.

So Google is tracking Android phones, Apple is tracking iPhones, and I have to guess that Microsoft is tracking its phones. Since you ‘authenticate’ yourself by logging onto a cell phone, you have basically given somebody permission to learn a lot about you without the need for cookies: where you are and what you are doing on your cell phone.

The next tool that can be used to identify you is browser footprinting. This is interesting because each of us basically creates our own digital fingerprint telling the world who we are through our browser footprint. The browser footprint is the sum total of all of the things that are stored in your browser. Some of this is pretty basic data like your screen size, the fonts you prefer, your time zone and your screen settings. But there are other identifying features like plugins or any other program that creates a place on one of your toolbars.

As it turns out, almost everybody has a unique browser footprint. You can test this yourself: go to the website Panopticlick and it will tell you if your browser footprint is unique. It will show the kind of information that others can see online about you and your machine. One would think that most people have the same sort of stuff on their computers, but it only takes one difference to give you a unique browser footprint, and almost everybody is unique. And the people who are not unique still share a browser footprint with only a small number of other people.
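The technique is easy to sketch. The code below is a simplified illustration of the idea, not how Panopticlick actually works, and the attribute names and hashing scheme are my own assumptions. It combines a handful of browser attributes into a single fingerprint, and changing just one installed font produces an entirely different fingerprint:

```python
import hashlib

def browser_fingerprint(attrs: dict) -> str:
    """Hash a browser's visible attributes into a short identifier."""
    # Sort keys so the same attributes always hash the same way.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

alice = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080x24",
    "timezone": "America/New_York",
    "fonts": "Arial,Calibri,Times New Roman",
    "plugins": "PDF Viewer",
}
# Identical machine except for one extra font.
bob = dict(alice, fonts="Arial,Calibri,Comic Sans MS")

print(browser_fingerprint(alice))
print(browser_fingerprint(bob))
assert browser_fingerprint(alice) != browser_fingerprint(bob)
```

That single-difference sensitivity is why it only takes one odd plugin or font to make your browser footprint unique.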

Finally there is cross-device tracking, and Google is at the forefront of this effort. Over time, as you log onto Google from different devices, or as you authenticate yourself on multiple devices, Google and others can note that the information coming from these various devices is all from you. And so when you browse from home looking at new cars, it will become possible for them to tell an auto dealer what research you have already done once Google notices from your cellphone GPS that you are at a car dealer. They aren’t doing this quite yet, and for now they are just linking and tracking you across your multiple devices. But this tracking effort gives them a more complete picture of who you are, which is what big data is all about.

What’s up with Cord Cutters?

Morgan Stanley just released its fourth annual survey on the media, cable and satellite business. For this survey they talked to 2,501 adults nationwide and looked in detail at how people use media – what they watch and how they watch it.

The most interesting statistic to come out of the survey is that for the fourth straight year a significant percentage of people said they are going to cut the cord in the coming year. 10% of respondents said that they were definitely going to cut the cord and another 11% said that they would probably be cutting the cord. If these percentages were true, and 21% of the country was going to be cutting the cord, the cable industry would be in a major tailspin. This survey ought to be major headlines on every business page, right?

But it’s not, and that’s because there have been similar responses to this survey for the last few years. In last year’s survey those same two percentages added up to 17%. The prior year they added up to 15%. But the cable companies did not experience cord cutting anywhere even remotely close to those percentages in the last two years. Certainly there is cord cutting going on, and the industry has lost at least several million subscribers to the new trend.

But what this survey tells us is that people want to cut the cord. One full fifth of households with cable are clearly unhappy with the big bundles of channels, and eventually that is going to come home to roost with the cable industry. The other statistic that bears this out is that only 50% of the respondents in the survey actually like the big package bundles, a number that is dropping every year.

We’ve seen the same thing before with home telephones. For years people talked about getting rid of their home phone and yet it took a number of years for many people to do so. But eventually people will act on how they feel and the cable industry has a big problem brewing.

As you might expect, there is an age component to potential cord cutters. 30% of the people who said they would or might cut the cord are in the 18 – 29 year-old age bracket. And that percentage decreases as the age increases.

I find these results interesting because almost everybody I talk to is unhappy with what they pay for cable TV. Maybe that’s because most of the people I talk to are in the industry. But I know many cord cutters and I know that this is really happening. I would be a cord cutter myself, but Comcast made me take basic TV (20 or so channels) if I wanted to buy a cable modem faster than 12 Mbps. So I am officially a TV subscriber, even though I don’t own a TV and the cable box they gave me is gathering dust in the closet. There can’t be a lot of people with the same story as mine, coerced into buying cable they don’t even watch. But this does show that perhaps the reported subscriber counts of the big cable companies are a bit inflated due to these kinds of policies.

The survey also showed that the OTT programmers are doing quite well. 30% of the households in the survey watch NetFlix, up over 5% from the last survey. 18% watch AmazonPrime, up 10% from the last survey. And while the free Hulu service lost about a percentage point of viewers, its for-pay service Hulu Plus is up almost 5%.

I titled this blog ‘What’s up with Cord Cutters?’, but perhaps a better title would be ‘What’s up with the Almost-Cord Cutters?’ There are apparently a whole lot of people who are thinking of cutting the cord. Perhaps one year soon a large percentage of the people who say they are going to cut the cord will actually do it. And then the wheels will start coming off the cable model.

A Little Bit Closer to OTT

We keep inching closer and closer to the day when customers will have viable access to real-time over-the-top programming. The first company to make any progress in this area was Aereo, which sends the network channels to people’s cellphones and tablets in major markets. But Aereo has an upcoming day in court, and the US Supreme Court could put them out of business.

It’s not like there isn’t any programming available on the web, because there are mountains of old TV shows and movies available on NetFlix and AmazonPrime and from the many other companies that have deals to put content on the web. And many customers of the major cable providers have TV Everywhere, where the cable company lets them watch some of the channels they subscribe to on remote devices.

But what is still missing, and what will finally give a lot of people the impetus to cut the cord, is the ability to get the programming they most want in real time on devices other than televisions. I have largely cut the cord and watch the programming available on NetFlix and AmazonPrime. But I would be very happy if I could buy ESPN and the Big10 Network a la carte. And maybe some news network like CNN.

There were two announcements this past week that inch us closer to an OTT alternative. The CEO of Verizon Wireless, Lowell McAdam, announced that he has had discussions with content providers about launching an OTT service for customers using the Verizon LTE network, and possibly also for those using other broadband providers.

The second announcement came from Dish Networks, who announced a major deal with Disney that would allow them to distribute Disney and ESPN content wirelessly. The agreement was complex and also resolved a number of issues between Disney and Dish over satellite carriage. Last week I reported on the spectrum that Dish has been buying, and this announcement demonstrates that they have plans to use some of that spectrum to offer an OTT product.

When the Verizon CEO was asked about the Dish Networks announcement, his response was that he thought Verizon has a huge head start and that it would take Dish at least a year to construct a wireless network. So I think we can expect Verizon to roll something out soon to take advantage of its existing network.

Both announcements make it sound like customers will be able to buy the OTT programming without having to subscribe elsewhere to a landline version of the same channels. This would be the first time that such live content like sports has been made available this way. I wrote last year that there are only a handful of channels with enough market power to pull off OTT programming, and that very short list includes ESPN. I know that I would gladly pay $20 for ESPN a la carte rather than have to buy a $60 package to get it. And I don’t think I am that unusual. Just in the last week I have had conversations with several other sports fans who say the same thing.

I had cable service several years ago with all of the channels and all of the movies. And I found that I would go weeks, and sometimes even months without turning on the TV (especially outside of football season). I am really hoping that these announcements are the first little crack in the programming monopoly and that the first pieces of OTT are here. But I won’t believe it until I can buy it. It’s possible that Dish and Verizon Wireless will be forced to also sell bundles of programming including a lot of things I won’t want. But I can’t see them getting into the OTT business if they aren’t going to let customers buy the smaller packages they really want. I will be watching.

The Young and the Old

I’ve just seen some recent statistics about TV viewing in different demographics. On the young side, Verizon just released a study of the TV viewing habits of Millennials, which it defined as those between the ages of 16 and 34. On the older side, there have been some interesting statistics released about who watches network TV.

Verizon’s study quantifies what we have all already suspected – that the viewing habits of young people are a lot different from those of the rest of us. This is not to say that everybody’s viewing habits aren’t changing, but the young have changed to a greater degree. For example, all age groups watch over-the-top video online, but Millennials spend three times as much of their viewing time online as everybody else.

Millennials have not yet abandoned cable services, and 75% of them still watch cable TV. Only 13% of Millennials have cut the cord, compared to 9% of the rest of us. But unlike the rest of us, they are also huge subscribers to other services like AmazonPrime, NetFlix and Hulu. They simply have a lower tolerance for linear TV programming and want to watch things on their terms, when they are ready to watch. Millennials also like to browse more than watch specific TV shows at set times, and they are more likely (64%) than everybody else (49%) to be using some other viewing device like a tablet, laptop or cell phone.

Millennials seem to be very brand-loyal, but the brands they like are not the same as everybody else's. For example, when naming their top entertainment brands, Millennials don't put any of the broadcast networks (ABC, NBC, CBS and FOX) into their top ten, while all four make it into the top brands for non-Millennials. Interestingly, the one company that makes it as the top brand for everybody is Amazon.

Contrary to what other surveys have found, Millennials are willing to pay for multiple kinds of TV services, and they are more likely to subscribe to both cable and online entertainment sources. But looking at their viewing habits, they are more likely to engage in binge watching and are more attuned to when entire series of shows are released online.

Millennials also get more of their entertainment from non-traditional sources like YouTube and social sites. Millennials are more likely to game and to play fantasy sports than others. They also frequent a number of social sites like Reddit, Imgur, 4chan and 9gag that almost nobody else uses.

Very much as a complement to the Verizon survey, I was looking through statistics about who watches network TV. The demographics of the major networks are aging even faster than the population. The median age of a network TV viewer is now 54, while twenty years ago it was 41. In 1993 the number one show was 'Home Improvement', with a median viewer age of 34. Today the most popular show is 'NCIS', with a median viewer age of 61.

Interestingly, the networks still get the majority of the advertising dollars, but the increasing age of their viewers is probably going to change this a lot. Back in the Verizon survey, only 32% of Millennials said they would even miss the major networks if they went away. Advertisers want to find better ways to reach Millennials and other younger viewers, but the way they watch programming makes it hard to get to them in the same way as they can get to viewers with network TV.

The Verizon survey should give pause to anybody in the cable TV business. The Millennials and the following generations will be the majority of viewers in a few decades, and one has to ask if it is possible to have a set of products that they are willing to pay for. They are not afraid to spend money on entertainment, but a lot of that money goes to online sources instead of to the local cable TV provider.

The New Satellite Internet

A new satellite Internet service launched earlier this year. I've been meaning to write about this and was prompted by seeing them in a booth at a rodeo I went to on Saturday. The service is provided by ViaSat under the brand name of Exede. They launched a new satellite, ViaSat-1, last October for the sole purpose of selling rural broadband.

The broadband they are selling is a big step up over previous satellite broadband, including earlier products offered by ViaSat. The basic broadband product offers up to 12 Mbps download and 3 Mbps upload. I went to the web and read reviews, and people say they are actually getting those speeds, and in some cases even a little more. I would caution that, like any shared broadband system, as more customers are added the satellite will experience contention and the speeds will slow down.

The base product is priced at $50 per month and is a huge improvement over other satellite products. Exede's older base product was also $50 but offered 512 kbps download and 128 kbps upload. For $79 you could get 1.5 Mbps download and 256 kbps upload.

But like everything there is a catch, and that catch is data caps. The speeds are a great improvement, because even web browsing at 512 kbps is nearly impossible. But the caps are a killer. For the $50 product the cap is 7 gigabytes of total download for the month. To put that into perspective, that is around 4 HD movies per month.
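That "4 HD movies" figure is easy to sanity-check. A minimal sketch, assuming an HD movie streams at roughly 1.8 GB per title (my assumption – actual sizes vary with the encoding):

```python
# Back-of-the-envelope check on the 7 GB monthly cap.
cap_gb = 7              # monthly data cap from the $50 tier
gb_per_hd_movie = 1.8   # assumed average size of one streamed HD movie

movies_per_month = cap_gb / gb_per_hd_movie
print(f"~{movies_per_month:.1f} HD movies fit under the {cap_gb} GB cap")
```

At that assumed movie size the cap works out to just under 4 movies a month, which matches the estimate above.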

You can buy larger caps. For $80 per month you can get a 15 GB cap and for $130 per month you can get a 25 GB cap. If you hit the cap Exede doesn’t shut you down, but instead sets you to a very slow crawl for the rest of the month.
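It's worth noting what the bigger tiers actually buy you per gigabyte. A quick comparison, using only the prices and caps quoted above:

```python
# Effective price per gigabyte across the three Exede tiers in the text.
tiers = {50: 7, 80: 15, 130: 25}  # monthly price ($) -> monthly cap (GB)

for price, cap_gb in tiers.items():
    print(f"${price}/month at {cap_gb} GB -> ${price / cap_gb:.2f} per GB")
```

The larger tiers are somewhat cheaper per gigabyte, but every tier still prices data at several dollars per gigabyte – far above what a landline ISP effectively charges.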

So obviously the satellite product is not going to be useful for anybody who wants to use the Internet for watching video or the other things most families use the Internet for. There can be no real gaming over a satellite connection, due both to the cap and to the latency, since the signal has to bounce high above the earth and back. The latency also plays hell with voice over the Internet. You can do a mountain of emails and web surfing within that cap, but you have to always be cautious about downloading too much. Imagine if you worked from home and one of your kids watched too many videos, and for the rest of the month you just crawled along at dial-up speeds.

For now this is only available on the east and west coasts and won’t be available in the middle of the country until they launch another satellite. Exede has a product in the Midwest that is $50 for up to 5 Mbps download and 1 Mbps upload, but reports are that most people there are not getting those speeds.

I am the first to say that this is a big step up for rural areas. If I were on dial-up this would feel wonderful. But any home that gets this is not getting the same Internet that the rest of us get. One of my employees has four kids, and they watch 4 – 5 hours per day of Internet video. We estimated that some months he is probably using a terabyte of total download. His speeds are only half of this satellite service's, but the unlimited download makes a huge difference in the way his family can use the Internet.
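That household estimate holds up under rough arithmetic. A sketch, assuming the 4 – 5 hours is per child and HD video streams at about 2 GB per hour (both are my assumptions, not figures from the estimate itself):

```python
# Rough monthly usage for a four-kid household of heavy video watchers.
kids = 4
hours_per_day = 4.5   # midpoint of the 4-5 hours per day in the text
gb_per_hour = 2.0     # assumed HD streaming data rate
days = 30

monthly_gb = kids * hours_per_day * gb_per_hour * days
print(f"~{monthly_gb:.0f} GB per month, i.e. about {monthly_gb / 1000:.1f} TB")
```

Under those assumptions the family lands right around a terabyte a month – more than 150 times the $50 satellite cap, and over 40 times even the $130 tier.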

The scariest thing about this product is that I know one of these days some policy-head at the FCC is going to announce that the whole country has broadband, and then they can wash their hands of the rural broadband gap. These are the fastest download speeds anybody has brought to much of rural America. But anybody on this service is going to be so throttled by the data caps that they are not going to be able to use the Internet like the rest of us. So this is a good service, but it's not broadband – it's something else.