Telephone Deregulation Continues

There is a package of five bills before the Colorado legislature that would deregulate telephone service there. If they pass, and it looks likely that they will, Colorado will become the 21st state where telephone deregulation has occurred to some extent.

I have been part of that regulatory process for years and I have mixed feelings about it. One of the primary bulwarks of telecom policy in most states has been to hold the cost of residential telephone service as low as possible to keep landlines affordable. I can remember many rate cases in various states where requests for residential rate increases were either denied or sharply curtailed. And the result of this persistent regulation over the years has been that in many states residential rates have been held below cost.

When I say that, I am referring to fully allocated costs, where all of the costs of the business are spread across all of the products being sold. Regulators had many reasons for keeping rates low. One was public policy and the belief that every household should be able to afford a landline. And back when 98%+ of homes had landlines that probably was good policy. But over the years there has also grown a feeling that the large telcos have milked the profits out of the copper network and that almost anything they charge for it is excess profit.

One thing that is for sure is that after deregulation the price for residential phones rises. The most extreme example was in California, where rates went up over a few years by 260%. But that is partially due to the rates there having been held extraordinarily low for many years compared to other states.

But the current move toward deregulation is not about charging more for phone service, although that is a natural consequence. Instead, the phone companies are trying to create the mindset that copper is obsolete and needs to be phased out of service.

Let’s face it, the copper networks are getting old. An AT&T spokesperson in Colorado has been quoted as saying that landlines will be obsolete by 2020. Many of the copper networks still operating in big cities and older suburbs were built in the 1960s. And frankly, the phone companies have cut back on maintenance of copper and the networks have deteriorated. Small-town America understands this better than anybody, since in many rural areas there are barely any technicians to be found to fix problems.

Replacing the copper is inevitable. The problem is that in many areas there are no alternatives. First is the issue of cost. In rural areas the only alternative to copper is wireless, and households would need to replace a $20 landline with one or more cell phones, which could easily cost $100 or even much more per month. And that is only where cell phones work. No matter how pretty a picture AT&T and Verizon try to paint with their nationwide coverage maps, there are still plenty of places in the country where cell phone coverage is terrible. If you have ever been in a town where you have to walk around outside to find that one magic spot where you can get a cell signal, you will understand that cellular is not always an option. And you don’t have to go very far outside city limits in rural counties to find huge zones with no cellular coverage.

The Colorado bills don’t just deregulate the price of phone service. The bills go further and take the complaint process for phone service away from the state commission there. If a consumer wants to file a complaint they would have to go to the FCC. Another bill deregulates VoIP service, which is telephone service delivered over the Internet or over an IP connection on a cable network. When you see those kinds of provisions in laws you know they were written by the large phone companies, who want a lot more than a plan to look at the end of copper.

One Year and Counting

Today marks the one-year anniversary of this Pots and Pans blog. I must tell you that it feels like a lot longer, because I have a hard time now remembering when I didn’t write a blog every day. My new routine is to get up early, make some tea, feed the cats, walk the dog and write a blog entry.

As the year has gone on I have slowly and steadily picked up readers, and I thank you all. I have 114 people who get the blog every day by email and roughly fifty other people read the blog on an average day. I know that writing about telecom is only of interest to the few and I am pleased with those numbers. Many of the people who read this are my friends and colleagues and these blogs have led to some lively discussions.

I find that my brain has gotten good at thinking about the world in the format of the 500-word essay. My brain fought against this format when I first got started because I am an explainer. I want to tell you everything I know about a topic and that is impossible to do in a short essay. But the blog has taught me to get pithy and to get straight to the heart of a matter quickly. I have written enough blogs now that my brain composes my thoughts easily into the blog format, which is interesting all by itself.

That discipline of trying to write every day has been very good for me, both personally and professionally. Personally it has made me a better writer. The blog has also let me express myself. My first blogs were very factual, but over time I have allowed my opinions to come into the blog. I am certain that I am not always right, but I have strong opinions about many telecom topics, be they right or wrong.

Professionally writing this blog has forced me to read a lot more to keep up with what is going on in the industry. And in my job as a telecom strategic consultant, the more I know the better my advice. In the last year I have done a much better job of keeping up with the industry, but I also have expanded outward in my reading to learn more about all sorts of technology that tangentially affects our industry. For example, I find myself fascinated by the Internet of Things. I feel lucky to be alive at a time when human knowledge is literally doubling every few years. The stuff that scientists and engineers are working on is fascinating.

I try to write a blog every business day, but I don’t always succeed. I skip major holidays just because. And there have been a few days where I had the flu and my brain had a hard time remembering my name. I can only imagine what would have hit the page had I been dumb enough to write on those days. And once in a great while I just run out of time, particularly on days when I am traveling. But I somehow managed to get 241 of them done this year and I am proud of that.

When I started doing this I thought I would be out of ideas in a month. I see many other bloggers who only write a few blogs per month and suspect that lack of ideas is a problem. Early on it was a struggle every day figuring out what to write about. But now I have more ideas than I have days, because we work in an industry that is changing quickly. Almost every single aspect of telecom is different today or soon will be different than what we all grew up with.

I have no idea how long I can keep this up. But for now I love the daily discipline of writing this blog and I love the way this mental exercise is making me think. As long as I keep getting such positive waves from the experience I guess I’ll keep writing for a while.

Those Insane Programmers

There is currently a dispute going on between a programmer and a bunch of cable companies that illustrates the nearly insane greed in the cable industry. I say the greed is insane because it seems like the programmers want to hasten their own demise.

The programmer is Viacom and they own channels like MTV, Nickelodeon, Spike and Comedy Central. The dispute is with the National Cable Television Cooperative (NCTC) which represents 890 of the smaller cable companies in the country. That’s just about everybody who isn’t large.

The current contract between Viacom and NCTC expires on March 31. None of us know the exact numbers, but an NCTC spokesperson says that Viacom wants a rate increase that is 40 times the rate of inflation. That itself may be an inflated claim, but in 2012 Viacom asked for a 30% rate increase from DirecTV. When DirecTV refused to pay, Viacom pulled its channels from DirecTV until the issue was settled.

And Viacom is still today asking for huge fee increases. If Viacom was the only programmer doing this we could say that they are extra greedy. But the fact is that all of the programmers are doing the same thing. Every programmer is increasing fees to cable systems at rates far above inflation. The cost to cable providers for programming has climbed over 7% per year for nearly a decade, and in recent years I know some systems that have seen increases over 10%. Cable companies have no choice but to pass these increases on to customers and so we keep seeing big rate increases year after year.

But we are now at a time when customers are getting tired of the price of big cable packages and when a significant percentage of customers are thinking about abandoning cable. To keep pushing these big increases makes no market sense. Just do the math to see how insane this is. For a customer paying $50 per month today, these increases will push their rates to $70 in five years and $98 in ten years. Somebody paying $70 today will see rates of $138 in ten years.
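
The arithmetic behind those numbers is simple compounding. Here is a minimal sketch in Python that reproduces them, assuming a steady 7% annual increase passed straight through to the customer (the starting bills and the flat 7% rate are illustrative assumptions, not figures from any actual provider):

# Compound growth of a cable bill at a steady annual increase.
# The 7% rate and the starting bills are illustrative assumptions.

def future_bill(monthly_bill: float, annual_increase: float, years: int) -> float:
    """Project a monthly bill forward at a fixed annual percentage increase."""
    return monthly_bill * (1 + annual_increase) ** years

for start in (50, 70):
    for years in (5, 10):
        print(f"${start}/month today -> ${future_bill(start, 0.07, years):.0f}/month in {years} years")

# $50/month today -> $70/month in 5 years
# $50/month today -> $98/month in 10 years
# $70/month today -> $98/month in 5 years
# $70/month today -> $138/month in 10 years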

This can’t be sustained. It seems as if programmers like Viacom are making the risky bet that people will not cut the cord and find an alternative to the big cable packages. But every one of my smaller cable clients is losing customers, and at a faster and faster pace. They hear the stories every day of people who are fed up with the big monthly bill and who decide that they can get by with rabbit ears, Netflix and Amazon Prime.

I’ve heard the idea that once most people drop cable that all of the networks will just go a la carte and sell to people over the Internet. But that is naïve and there are only a small handful of channels that have enough appeal to survive in an a la carte world. It’s likely that people will pay a monthly fee to get ESPN or Disney and maybe even Comedy Central. But probably 90% of the channels on cable systems will die if the big cable model breaks.

So why are Viacom and the other programmers being so greedy? One can chalk part of it up to the large company mentality that all that matters is the earnings next quarter. But one can also blame amazing arrogance in that they believe that the people of the US will keep paying huge fees to watch them. Are there really that many households where Nick at Nite is so important to people that they will pay $100 monthly in order to watch it? As OTT programming gets better, I foresee a day in the not too distant future when customers will bail on cable companies in droves. And then the hubris of the programmers will be fully exposed. The programmers are pushing hard to speed up the day when they will fail. If that is not insane, what is?

Do the Big Companies Even Want to Get it Right?

The latest Consumer Reports rankings are out for telecom providers, and the results are much the same as in past years. There are many different groups that rate companies, and we often hear of reports that put the cable companies at the bottom of all companies in terms of customer service.

But the Consumer Reports ranking is more comprehensive. It looks at a lot of factors such as the value customers perceive from the provider, reliability, speeds, and support both in the home and over the phone. And it compares all of the major telecom companies against each other rather than against other industries.

Not surprisingly, HughesNet and its satellite broadband rank the lowest. I’ve never heard anybody talk nicely about the product, since it’s slow, costly and also has a lot of latency and delay. Many people say it is barely better than dial-up. It will be interesting to see how satellite ranks now that Exede is in the market with a faster product. As I reported a few weeks ago, the issue with Exede is the low total data caps, but at least the 12 Mbps download is a huge improvement.

Ranked next to satellite is Mediacom, which always comes in dead last among cable companies and telcos. Ranked next at the bottom are the various DSL providers: Frontier, FairPoint, Windstream and AT&T DSL. For the most part the customers on these services have older DSL technology that is only delivering a few Mbps of download speed. There is faster DSL technology available today and better ways to deploy it by bringing the DSLAMs closer to customers, but the companies listed are for the most part not pumping much money into DSL. The exception is Frontier, which has gotten a pile of federal subsidy money from the new USF fund to upgrade and expand its DSL footprint.

But right next to this old DSL technology is Comcast, followed closely by Verizon DSL and then Time Warner. Verizon barely even advertises that it has DSL anymore, and it is a surprise to see it viewed more favorably by customers than Comcast.

Doing the best job are the smaller cable companies and fiber providers. At the top of the list are WOW and Wave (Astound), followed by Verizon FiOS.

It just amazes me to see large companies like Comcast and Time Warner do so poorly with their customer service. They have been at the bottom of these kinds of rankings for well over a decade now, and it’s obvious that they are willing to live with giving poor service. When you look at the rankings and see that Comcast is viewed by customers as doing a worse job than Verizon and CenturyLink DSL, you just have to shake your head.

It’s very obvious that they don’t care to become better, because by spending some money they could do much better. Doing customer service well is not some unreachable mountain of a task. Thousands of companies do it well. If WOW and Astound can do it well, so can Comcast and Time Warner. It’s a matter of investing in the right systems, the right training and the right management of the process. Being big is not an excuse for being crappy, and if it is a valid excuse, then that alone ought to stop the Comcast / Time Warner merger.

I would think that the management of these companies would hate seeing themselves at the bottom of these lists. But they obviously like profits more than they hate doing a poor job. And that is what I don’t get. These companies have lost millions of customers due to dissatisfaction with their service, and their best growth strategy would be to lure back customers in their existing markets by doing a better job.

Keep People in the Equation

As I keep reading about the coming Internet of Things I keep running into ideas that make me a bit uneasy. And since I am a tech head, I imagine that things that make me a little uneasy might make many people a whole lot uneasy.

For instance, I read about the impending introduction of driverless cars. I have to admit that when I am making a long drive on the Interstate that having the ability to just hand the driving off to a computer sounds very appealing. I would think that the challenge of driving on wide-open highways at a consistent speed is something that is quite achievable.

But it makes me uneasy to think about all cars everywhere becoming driverless. I sit here wondering if I really want to trust my personal safety to traveling in a car in which software is making all of the decisions. I know how easily software systems crash, get into loops and otherwise stutter, and I can’t help picturing being in a vehicle when a software glitch rears its ugly head.

I know that a road accident can happen to anybody, but when I drive myself I have a sense of control, however misplaced. I feel like I have the ability to avoid problems a lot better than software might when it comes down to a bad situation.

I am probably wrong, but it makes me uneasy to think about climbing into a cab in a crowded city and trusting my life to an automated vehicle. And I really get nervous thinking about sharing the road with robot tractor-trailers. The human-driven ones are scary enough.

I am probably somewhat irrational in this fear because I would guess that if all vehicles were computer-controlled there would be a lot fewer accidents, and we certainly would be protected from drunk drivers. Yet a nagging part of my brain still resists the idea.

I also worry about hacking. Perhaps one of the easiest ways to bump somebody off would be to hack their car and make it have an accident at high speed. You know it’s going to happen, and that will make people not trust the automated systems. Hacking can break our faith in a whole lot of the IoT, since there will be ample opportunities to hurt people by interfering with their car or their medicine or other technology that can harm as easily as it can help.

I don’t think I am atypical in this kind of fear. I think that as we make these big changes, people somehow have to be part of the equation. I don’t have an answer to this and frankly this blog just voices the concern. But it’s something we need to consider and talk about as a society.

The people issue is going to spring up around a lot of the aspects of IoT. It has already surfaced with Google Glass and many people have made it clear that they don’t want to be recorded by somebody else surreptitiously. As the IoT grows past its current infancy there are bound to be numerous clashes coming where tech confronts human fears, feelings and emotions.

There are certainly many aspects of the IoT that excite me, but as I think about them I would bet these same changes will frighten others. For instance, I love the idea of nanobots in my bloodstream that will tell me days early if I am getting sick or that will be able to kill pre-cancerous cells before they get a foothold in my body. But I am sure that same concept, the idea of having technology in our blood, scares the living hell out of other people.

I don’t know how it’s going to happen, but the human equation must become part of the IoT. It has to. If nothing else, people will boycott the technology if it doesn’t make us feel safe.

An Industry of Hype?

Almost every day I see somebody in this industry making a claim that makes me wince a little bit. It might be vendors talking about gigabit speeds. It may be service providers talking about gigabit cities. And to some extent I get it. It’s a world driven by marketing and everybody competes first with hype. Those in the know quickly figure out the truth, but I guess what bothers me is that others don’t.

Let’s start with the equipment vendors. The country is pushing hard to get gigabit bandwidth into our schools. And since schools are already wired with coaxial cable, this led me to look at the technologies in use today that can deliver bandwidth over existing cable in schools. After all, what good is bringing a gigabit to a school if you can’t actually get it to the classroom? The various technologies, including HPNA, MoCA and HomePlug, all claim gigabit-capable speeds. Additionally, the new WiFi standard 802.11ac promises gigabit-and-above speeds. Another upcoming technology is G.fast, which promises gigabit speeds on copper.

But none of these technologies actually delivers a gigabit at the application layer, which is the usable speed of bandwidth that is available to an end user. Some of these technologies do provide a gigabit of theoretical data at the transport layer, but after accounting for the various overheads, noise, interference and other factors, the actual bandwidth is much slower than advertised. Additionally, the speeds they tout are the total bandwidth of the technology and those speeds need to be divided into an upstream and downstream component, further diluting the bandwidth.

At best, these various technologies today deliver maybe 400 Mbps in total bandwidth, and a few of them are quite a bit slower than that. So it turns out that these gigabit technologies are not really a gigabit, or even half a gigabit. But a non-engineer would not know this by looking at how they are advertised.
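
To make that concrete, here is a rough sketch of how an advertised gigabit shrinks to something like 400 Mbps of usable bandwidth. The overhead, noise and upstream/downstream percentages below are illustrative assumptions, not measured values for any particular standard:

# Rough illustration of why an advertised gigabit shrinks by the time it
# reaches the application layer. All of the percentages here are assumptions.

def usable_downstream_mbps(advertised_gbps: float,
                           protocol_overhead: float = 0.20,   # framing, coding, retransmissions
                           noise_penalty: float = 0.20,       # interference, distance, shared wiring
                           downstream_share: float = 0.60) -> float:
    """Approximate downstream throughput, in Mbps, left for the end user."""
    total_mbps = advertised_gbps * 1000
    total_mbps *= (1 - protocol_overhead)
    total_mbps *= (1 - noise_penalty)
    return total_mbps * downstream_share

print(f"~{usable_downstream_mbps(1.0):.0f} Mbps usable downstream from a '1 Gbps' technology")
# ~384 Mbps usable downstream from a '1 Gbps' technology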

We have the same thing going on with service providers. For years broadband providers have sold ‘up-to’ data speeds that they were never able to achieve. There is still a lot of that going on, particularly in smaller markets where the advertisements tout the speeds available in nearby urban areas, far in excess of what can actually be achieved in small towns.

But the one that really gets me is the term gigabit cities. When I hear gigabit cities I picture a place that is building a network that will make a gigabit data product available to every home and business in the community. And there are almost no cities like that.

People think Google is bringing gigabit everywhere, but they aren’t. First, they only go to neighborhoods that guarantee a certain penetration rate for Google. And once there they don’t serve any apartment complexes or businesses. Google is basically cherry-picking residents willing to pay $70 per month for data. While laudable (and I wish I could get it), Google is not building gigabit cities.

There seem to be other cities announcing themselves as gigabit cities almost weekly. Some of them offer gigabit speeds to residents, but at very high prices, as much as $250 per month. Most of these cities only supply gigabit speeds to schools and a handful of large businesses. Again, very laudable, and I am happy to see anybody invest in fiber. But gigabit to the schools and factories does not make a gigabit city. It just makes fast schools and factories.

There are a small handful of places that really are gigabit communities. There are some small telcos, municipalities and cooperatives that are offering gigabit to everybody in their footprint. But this is really rare and for the most part these are small communities. Interestingly, the folks that actually do it don’t tout themselves and just quietly deliver fast speeds to customers. I’m starting to think that the ones who yell loudest are the least likely to actually be doing it. I hope somebody can prove me wrong about this.

Another Regulatory Gotcha

The FCC recently went through the process of eliciting stories about ideas for rural broadband. I had a bit of a problem with how they went about it, because they made it sound like anybody who told them their story was eligible to be chosen to get funding for a rural broadband experiment. That wasn’t true; the FCC really was just gathering stories. The actual applications for funding will come later this year.

There is another thing that the FCC didn’t make very obvious to possible applicants. Any entity that wants to get money out of the Connect America Fund must be an Eligible Telecommunications Carrier (ETC). To be fair, the FCC says that companies that request funding don’t have to be an ETC at the time of filing, but that they must achieve that status before they can actually receive funds. The FCC language makes it clear that it expects ETC status to be obtained rather quickly.

What the FCC doesn’t seem to understand is that it can be very time consuming to become an ETC and in some cases impossible for some of the entities who are interested in the broadband experiments.

In most states there is a two-step process to become an ETC. First you must be certified as a carrier in your home state. The type of certification required varies by state. In some states you would have to obtain a Certificate of Public Convenience and Necessity (CPCN) and in other states you would have to become a CLEC or some other form of carrier.

Getting that kind of certification is not a slam dunk for start-ups and municipalities. Generally, somebody wanting these certifications needs to pass three tests – that they are financially capable, managerially capable and technically capable of being a carrier. A start-up trying to get the FCC funding might fail one of these tests. For instance, a city might not be able to demonstrate technical capability because that is expertise they were planning to hire after they got the funding. And in some states start-ups have trouble meeting the financial capability test set by their state regulatory commission. The process of getting certified can take anywhere from 90 to 180 days in most states, assuming you can meet all of the requirements.

Then, after getting certified as a carrier, an entity can file to become an ETC. There are some very specific requirements for becoming an ETC that are going to stop some filers. For instance, an ETC must be willing and able to serve everybody in an existing ‘exchange’. An exchange is the service area of an incumbent telco, and most rural exchanges have a town in the center surrounded by a sizable rural area. So anybody who wants to be an ETC must agree to serve that whole area. In some states a municipality is prohibited from serving, or has a very difficult time serving, anybody outside its city borders. And let’s face it, serving broadband to farms is expensive, so having to agree to serve those areas can break a start-up business plan. So even if a city or ISP gets certified, it’s no slam dunk that they will meet the requirements to become an ETC. And even if they can, I know that there are many states where the ETC process can take a year.

Additionally, in both of these steps, the process can be further delayed if somebody intervenes in the regulatory proceeding. The local telco or cable company can (and often does) intervene in the certification and/or ETC process as a delaying tactic to slow down potential competition. It’s not hard for the whole end-to-end process of becoming a carrier and then an ETC to take two years. And that will not work for the funding process. So many of those who are thinking about asking for this money have no idea that the regulatory cards are stacked against them.

At the end of the day, all that is proven by getting ETC status is that you are good at the paperwork process of regulation. The status really has no other practical benefit. And I say this as somebody who gets paid to obtain these kinds of certifications. Some regulation is good, but I hate regulation for regulation’s sake. And this requirement of having to be an ETC to bring broadband to rural places is a stupid dinosaur kind of regulatory requirement.

5G Already?

We knew it was coming, and the wireless industry is already bandying about the term 5G. Nobody knows exactly what it is going to be, but the consensus is that it’s going to be fast. The South Koreans are devoting $1.5 billion in research to develop the next generation of wireless. And there are vendors like Samsung who are already starting to claim that the upgrades in their labs today are 5G.

And of course, all of this is hype. There is still not any broadband anywhere that complies with the original 4G specifications. This all got out of hand when the marketing groups started to tout 3G performance for networks that were not yet at the 3G specs. And then came 3.5G and 4G, and now I guess 5G.

But let’s look at the one claim that it seems 5G is going to have, which is blistering fast speeds, perhaps up to 1 gigabit per second. What would it really take to provide a 1 gigabit cell phone data link? The answers can all be derived by looking at the basic physics of the spectrum.

Probably the first requirement is going to be proximity to the transmitter. When you look at spectrum between 3 GHz and 6 GHz, the likely candidates for US deployment, the math tells you that it’s going to be hard to send a 1 gigabit signal very far, maybe 150 feet from the transmitter. After that the signal is still fast, but the speeds drop quickly with distance. Unless we are going to place a mini cell site in every home and on every floor of a business, it is not very likely that we are going to get people close enough to transmitters to achieve gigabit speeds.
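
For those who like to see the physics, a back-of-the-envelope sketch using Shannon’s capacity formula, C = B·log2(1 + SNR), and free-space path loss shows why the distances are so short. The 100 MHz channel, the 5 GHz carrier and the distances below are illustrative assumptions, not a model of any real carrier’s network:

import math

# Back-of-the-envelope: the SNR a gigabit link needs in a given channel width,
# and how quickly free-space path loss eats into the link budget with distance.
# The channel width, frequency and distances are illustrative assumptions.

def required_snr_db(target_bps: float, channel_hz: float) -> float:
    """SNR (in dB) needed to reach target_bps in channel_hz, from C = B*log2(1+SNR)."""
    snr_linear = 2 ** (target_bps / channel_hz) - 1
    return 10 * math.log10(snr_linear)

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) + 20 * math.log10(4 * math.pi / c)

print(f"SNR needed for 1 Gbps in a 100 MHz channel: {required_snr_db(1e9, 100e6):.0f} dB")
for feet in (50, 150, 500, 1500):
    loss = free_space_path_loss_db(feet * 0.3048, 5e9)
    print(f"Path loss at {feet} ft and 5 GHz: {loss:.0f} dB")
# Every extra ~10 dB of loss has to be bought back with power, antenna gain or lower
# speed, which is why gigabit rates only hold very close to the transmitter.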

It certainly is possible to generate speeds that fast at the transmitter. But such a network would need fiber everywhere to feed cell phones. A network with fiber that dense probably wouldn’t even need to be cellular and could handle nearby phones using WiFi.

We certainly need new antenna technologies, and those things are being worked on in labs. I’ve written previous blog posts about the various breakthroughs in antenna technology, such as very large arrays using big numbers of MIMO antennas. I think we can believe that antennas will get better with more research.

We need better processors and chips. A chip capable of receiving and processing a gigabit of data is going to be an energy hog in terms of the power available in a cell phone. Such chips are already here, but they are deployed in bigger devices with enough power to run them. So we are going to need next-generation chipsets that require less energy and generate less heat before any cell phone can actually use a gigabit of data.

We need carriers willing to supply that much data. Let’s face it, cellular networks are designed to provide something okay to many rather than something great to a few. Perhaps making cell sites smaller would help alleviate this issue, but it is a real one. If somebody is really dragging a gigabit out of a cell site there is not a whole lot left for anybody else. And this would require increasing the backhaul to cell sites to 100 Gbps or even terabit speeds if gigabit phones became the norm.
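
The backhaul arithmetic is straightforward. A quick sketch, where the user counts and the oversubscription factor are illustrative assumptions:

# Rough backhaul sizing for a cell site if handsets routinely pull gigabit rates.
# The user counts and the oversubscription factor are illustrative assumptions.

def backhaul_gbps(active_users: int, per_user_gbps: float, oversubscription: float = 10.0) -> float:
    """Approximate backhaul needed, assuming not everybody peaks at the same time."""
    return active_users * per_user_gbps / oversubscription

for users in (100, 1000):
    print(f"{users} active gigabit users -> ~{backhaul_gbps(users, 1.0):.0f} Gbps of backhaul")
# 100 active gigabit users -> ~10 Gbps of backhaul
# 1000 active gigabit users -> ~100 Gbps of backhaul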

Finally, we need a new FCC, because the way that spectrum is divvied up in the US makes these kinds of speeds nearly impossible. Gigabit speeds would be easily achievable today if there were some giant swaths of bandwidth available. But our bandwidth is split into little discrete pieces, and most of those pieces are further divided into channels. This makes it really hard to cobble together a big, consistent bandwidth delivery system. We tend to think of wireless as a big pipe in the same manner that a fiber is. But it’s really a whole lot of discrete little signals that somebody has to join together to get a huge throughput.

Hot Spot 2.0

Hot Spot 2.0 is the name used by an effort to link WiFi networks together to make it easier for people to get WiFi when they are not at home. Anybody who travels understands the incredible hassle of constantly looking for hot spots, figuring out how to connect to them and then doing it over and over as you move around.

The goal of Hot Spot 2.0 is to make it as easy to use WiFi as it is to use your cell phone when you step off an airplane. Your smartphone automatically finds a compatible cellular data network and begins downloading emails as soon as you are on the runway. By the time you walk into the terminal you are up and running.

There are several different pieces that need to come together to make this work. First is a set of standards, and that process is already underway. The concept is being developed by the WiFi Alliance, which is using the trademarked name ‘Passpoint’ to certify the hardware and software involved in the process. The first set of standards was released in June 2012 and further releases are in the development pipeline. The first release covered the basics, including automated network discovery, authentication and security.

Next are the hardware and software systems that will support the standard. There are already companies like Ruckus working on solutions you can buy. A network that implements this is going to have to recognize users trying to connect with the protocol and then authenticate them automatically.
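
Conceptually, the automation rests on the access point advertising which authentication realms and roaming partners it supports (via the ANQP queries defined in 802.11u) and the device matching those against the credentials it already carries. Here is a much-simplified sketch of that matching step; the realm names and data structures are made up for illustration and are not the actual Passpoint data formats:

from __future__ import annotations
from dataclasses import dataclass

# Much-simplified sketch of Passpoint-style network selection: the device compares
# the authentication realms advertised by nearby hot spots against the realms of
# the credentials it already holds, and joins a match without prompting the user.
# The realm names and structures here are illustrative only.

@dataclass
class HotspotAd:
    ssid: str
    advertised_realms: set[str]   # realms the hot spot can authenticate against

@dataclass
class Credential:
    realm: str                    # e.g. the realm tied to a carrier SIM or an ISP login

def select_network(ads: list[HotspotAd], creds: list[Credential]) -> str | None:
    """Return the first SSID whose advertised realms match one of our credentials."""
    for ad in ads:
        for cred in creds:
            if cred.realm in ad.advertised_realms:
                return ad.ssid
    return None

ads = [HotspotAd("CoffeeShopFree", {"guest.example.net"}),
       HotspotAd("AirportPasspoint", {"wlan.carrier.example", "roam.partner.example"})]
creds = [Credential("wlan.carrier.example")]
print(select_network(ads, creds))   # -> AirportPasspoint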

Finally, there is selling the idea to users. It’s a no-brainer to sell this to people who travel a lot, but the goal will be to get this out to all of your customers who want the ease of being connected to WiFi whenever it’s available. With the data caps on cell phones this is a no-brainer for smartphone users, but it can be valuable to people with tablets and laptops as well.

One can imagine that as more and more carriers get on board with the concept, large nationwide associations of networks will grow. There is already talk of creating WiFi roaming arrangements that will let your customers use Hot Spot 2.0 on your network or any other network with which you have an arrangement. As big consortiums are created, the value of this to customers will grow. Roaming might even become a source of revenue if you have a network that entertains a lot of visitors.

But there is merit in implementing this on your own network. This product brings a valuable service to customers who are willing to pay for it. Or you might bundle it in with anybody who buys one of your landline data products. Customers will love the mobility aspect of being connected automatically to WiFi as they move around their home town. And this gives you a good reason to sell more hot spots to businesses in the town so that they can be part of this network.

Cities often talk about the goal of being wired, and mobile data is a huge component of that concept. If a city has enough hot spots then it has enabled cheap broadband access for anybody with a cell phone, tablet or laptop. This could finally be the solution to the digital divide, since it could enable broadband for even the poorest among us. Get them a Hot Spot 2.0 capable device and they will have broadband at many locations around town.

This effort certainly has potential and should have legs because of the gigantic number of smartphone users that can benefit by automatically connecting to WiFi when it is available. And I know carriers are thinking about this. When I signed up for my new Comcast data service in November the terms of service included a caveat that my broadband connection might be shared with other users. That was something new that I had never seen before and I think Comcast is preparing for the day soon when every one of their data users is also a Hot Spot 2.0 site. That creates a huge network from the moment it is activated.

Only now, instead of yelling at kids to get off my lawn I will be yelling at them to get out of my WiFi!

Beyond Cookies

This is not a blog entry about cakes and pies, but rather more discussion about how companies are tracking people on the web. A few weeks back I wrote a primer on cookies, which are the small files left on your machine to store facts about you. Cookies can store almost anything and can range from something as simple as remembering your login and password to something as complex as storing all sorts of other information about what you are doing on the web.

But many people have become very conscious of cookies and routinely delete them from their computers. Further, our web habits have changed and we access the web from multiple platforms. Cookies are only good for the device they are stored on and are not particularly effective in today’s multi-device environment. So there are new techniques being used to track what you do on the web including authenticated tracking, browser footprinting and cross-device tracking.

We make it easy for big companies to track us without cookies because we basically tell them who we are when we log onto our devices. You routinely authenticate who you are when you use sites like Facebook, iTunes, Gmail and others. An example is your Android phone. The first thing you are asked to do when you buy an Android phone is to log on with a Gmail account as part of the activation process. It never asks you for this again, but every time you turn on your phone it automatically logs you into that Gmail account, and Google always knows who you are. Apple phones and tablets have something similar in that each device is given a unique identifier code known as a UDID.

So Google is tracking Android phones, Apple is tracking iPhones, and I have to guess that Microsoft is tracking its phones. Since you ‘authenticate’ yourself by logging onto a cell phone, you have basically given permission for somebody to learn a lot about you without the need for cookies – where you are and what you are doing on your cell phone.

The next tool that can be used to identify you is browser footprinting. This is interesting because each of us basically creates our own digital fingerprint, telling the world who we are through our browser footprint. The browser footprint is the sum total of all of the things that your browser reports about itself. Some of this is pretty basic data like your screen size, the fonts you prefer, your time zone and your screen settings. But there are other identifying features like plugins or any other program that wants to create a place on one of your toolbars.

As it turns out, almost everybody has a unique browser footprint. You can test this yourself by going to the website Panopticlick, which will tell you if your browser footprint is unique. It will show the kind of information that others can see online about you and your machine. One would think that most people have the same sort of stuff on their computers, but it only takes one difference to give you a unique browser footprint, and almost everybody is unique. And the people who are not unique still share a browser footprint with only a small number of other people.
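
To illustrate why uniqueness is so easy to achieve, here is a toy sketch of how a tracker might boil a handful of browser attributes down to a single fingerprint. The attribute list and the hashing choice are illustrative, not how any particular tracking company actually does it:

import hashlib
import json

# Toy browser-fingerprint sketch: hash a handful of attributes that any web page
# can read about your browser. The attribute list is illustrative; real trackers
# collect far more (canvas rendering, audio stack, installed codecs and so on).

def browser_fingerprint(attributes: dict) -> str:
    """Return a short, stable hash of the browser attributes."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/27.0",
    "screen": "1920x1080x24",
    "timezone": "UTC-5",
    "fonts": ["Arial", "Courier New", "DejaVu Sans"],
    "plugins": ["Shockwave Flash 12.0", "Java Applet Plug-in"],
}
print(browser_fingerprint(visitor))
# Change a single font or plugin and the fingerprint is entirely different.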

Finally there is cross-device tracking, and Google is at the forefront of this effort. Over time, as you log onto Google from different devices, or as you authenticate who you are on multiple devices, Google and others can note that the information coming from these various devices is all from you. And so when you browse from home and are looking at new cars, it will become possible for them to tell an auto dealer what research you have already done once Google notices by your cellphone GPS that you are at a car dealer. They aren’t doing this quite yet, and for now they are just linking and tracking you across your multiple devices. But this tracking effort gives them a more complete picture of who you are, which is what big data is all about.