The History of Cellphones

This is another blog that looks at the history of the industry, and today I look at the history of the cellphone. Cellphones are arguably the most successful product in the history of our industry, but young people are often surprised to find out that the industry and its technology are still relatively new.

Prior to 1973, and stretching back into the 1920s, there were various radio phone services that were mostly used by businesses with vehicle fleets. These services were generally of poor quality and were limited either by the number of simultaneous users (only 3 at a time per city in the early 1950s) or by geography (you couldn’t leave the range of the tower you were connected to).

But several breakthroughs enabled the cellphone technology we know today. First, in the late 1960s Philip T. Porter and a team of engineers at Bell Labs proposed the system of directional cell phone towers that we still have in place today. In 1970 Amos E. Joel of Bell Labs invented the ‘three-sided trunk circuit’ that is the basis for cellular roaming, allowing a call to be handed from one cell tower to another.

The big breakthrough came in 1973 when Martin Cooper of Motorola and researchers at Bell Labs came up with the first hand-held cellphone. That first phone weighed two and a half pounds and was nine inches long. It held enough charge for 30 minutes of talking and took ten hours to recharge. But the idea of a handheld portable phone took hold and several companies began developing a wireless product. Interestingly, none of the prognosticators at the time thought that the technology had much of a future. They predicted future customers in the tens of thousands, not the billions that we see today.

The first commercial use of the new cellular technologies was introduced in Tokyo in 1979, in Scandinavia in 1981 and in the US in 1983. The technology was analog and referred to as Advanced Mobile Phone System (AMPS). It had a number of flaws by modern standards: it was susceptible to eavesdropping by use of a scanner and it was easy to introduce unauthorized phones onto the network. I can recall occasionally seeing somebody talking on one of these mobile phones in the 80s, but they were relatively rare. But the phones got smaller, batteries improved, and the first flip phone was introduced in 1989.

The first system that was more like what we have today was also introduced in the US by DynaTAC using 1G technology. Early 1G was an analog service and was made into a digital offering in 1990. In the early 1990s the second-generation (2G) networks were introduced. There were two competing technologies at the time (and still are today) that differed by the underlying standards – the GSM standard from Europe and the US-developed CDMA standard. The first GSM network was introduced in Finland in 1991 and hit the US in 1993.

Also introduced in 1993 was the IBM Simon, which could be called the first smartphone. It merged a pager, fax machine and PDA with a cellphone, and it included advanced features for the time such as a stylus touch screen, address book, calendar, calculator, notepad and email. About this same time came the introduction of texting. The first text message was sent in England in December 1992, followed by Finland in 1993. Texting was everywhere by the mid-1990s.

The demand for accessing the web from a cellphone drove the creation of 3G. This changed the phone network from circuit switching to packet switching, allowing the introduction of a data connection. The first 3G network was introduced in Japan in 2001, in Korea in 2002 and in the rest of the world starting in 2003. By the end of 2007 there were 295 million customers using a 3G network, which represented 9% of worldwide cell phone subscribers. Apple released its first iPhone in 2007, with a 3G-capable model following in 2008. That phone was the first ‘modern’ smartphone and today smartphone sales dominate the worldwide market. Finally, around 2009 came the introduction of the first 4G networks, which increased theoretical data speeds by a factor of ten. There were two different commercial standards for 4G data – WiMAX and LTE. Many of these networks in the US have just been completed for most urban and suburban customers.

So it’s easy for a kid to think we have always had cellphones. But the first iPhone was only seven years ago, and the flip phone was the predominant phone for more than a decade before that. Before the flip phone there were very few cellphone users compared to today. This is an industry that has grown entirely during my career, and it’s still hard sometimes to believe how well it has done. Now, if I had just bought that Apple stock . . .

AT&T’s IP Transition

A few weeks ago I talked about how Verizon and AT&T are using the FCC’s IP Transition to try to get out of serving regulated services on copper landlines. Today I want to talk more about AT&T. Earlier this month they met with the FCC staff to talk about their ideas on the IP transition, and they followed up that meeting with this memo.

The FCC’s IP Transition looks at replacing the PSTN (Public Switched Telephone Network). This is the complex network that has been used to carry voice traffic and that assures that every attempted call can be completed. It’s now an old network, it is separate from the Internet, and it still mostly uses time division multiplexing (TDM) technology based upon circuits that are some multiple of T1s.
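For readers who haven’t lived with TDM, here is a quick back-of-the-envelope sketch of what ‘some multiple of T1s’ means in practice. The channel counts and line rates below are the standard North American digital carrier sizes; nothing here comes from the FCC proceeding or the AT&T memo.

```python
# Back-of-the-envelope TDM hierarchy arithmetic (standard North American
# digital carrier sizes; these figures are general knowledge, not from the docket).
DS0_KBPS = 64          # one digitized voice channel
T1_CHANNELS = 24       # a T1/DS1 carries 24 DS0s plus framing bits
T1_MBPS = 1.544        # 24 x 64 kbps = 1.536 Mbps of payload + 8 kbps framing
DS3_T1S = 28           # a DS3 bundles 28 T1s

print(f"Voice channels in a T1:  {T1_CHANNELS}")
print(f"Voice channels in a DS3: {T1_CHANNELS * DS3_T1S}")                  # 672
print(f"Payload share of a T1:   {T1_CHANNELS * DS0_KBPS / (T1_MBPS * 1000):.1%}")
```

Every trunk in the legacy network is built out of fixed 64 kbps slices like these, which is a big part of why an IP network, which allocates bandwidth only as needed, is so much more efficient.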

The PSTN has served the country well, but pure IP technology is a lot more efficient and the FCC is working toward replacing the PSTN with something new and IP-based. The PSTN is a series of connections, called trunks, that connect the central office switches of all of the carriers in the country, along with the electronics that control the network. It’s important to note that the PSTN is only the network between carriers and does not involve any connections to customers. The PSTN has always been technology agnostic in terms of supporting any kind of network, such as copper, coaxial or wireless, and in allowing any kind of phone or customer device as long as a carrier can locally support it.

But the PSTN is more than just the wires connecting carriers, because it includes things like the SS7 network that is used along with each call to transmit the calling number and other information. And the PSTN comes with a number of specific regulatory requirements that define the ways that carriers of different types can interconnect with each other. The FCC’s major role in this process is to rework all of the rules that define how carriers interact in the new IP world. So the FCC is being careful in dismantling the PSTN, because if it’s done incorrectly there could be chaos between carriers and even problems in completing calls. Before ordering a mass migration away from the PSTN, the FCC has authorized a number of trials that convert parts of it to IP so that everyone can verify in detail that everything works as hoped.

But AT&T and Verizon have hijacked the IP Transition and persist in using it as an excuse for replacing copper connections with something else. In the case of AT&T, they talk about wanting to replace millions of home phones on copper with cellular. The FCC’s IP Transition was never intended to require, or be associated with, changing the technologies used by customers. The FCC must be getting very frustrated to see AT&T and Verizon continuously blame them for the changes they are trying to foist on customers. I’m actually somewhat surprised that the FCC hasn’t told them to knock it off. If I were the FCC I would be telling customers that it is not my intention to kick people off of copper. There is no technical or regulatory reason that copper networks can’t work with an IP version of the PSTN.

As you can see from this memo, AT&T intends to kick people off copper in several communities as part of what they call Technology Trials. But they don’t want to say ahead of time where those communities are, because they know full well they will be met with a lot of resistance. It’s funny that AT&T says they don’t want to divulge where their trials will be done due to fear of how competitors will act. The only competitors that benefit from AT&T’s plan will be the cable company in each town, which will pick up most of the abandoned customers, along with AT&T’s own cellular business. It’s incredibly unlikely that the cable companies are going to find any problem with AT&T’s plans or that releasing the information early would somehow give the cable companies some kind of edge.

AT&T has some really great writers and their memos always sound very logical and well thought out. This memo certainly seems reasonable if one doesn’t understand what they are actually talking about doing. They want to knock people off copper, wait until the last possible minute to announce who that will be, and then blame it on the FCC as part of the IP Transition. AT&T will largely be forcing customers to the cable companies if they want landline voice or data.

One might not think this is all that bad of a thing. After all, the copper is getting old and perhaps it is time for it to go. But a large part of the reason rural copper is so bad is years of neglect. One might not feel so bad about people living in small towns who end up having to go to the cable company if AT&T bails on a town. But you have to realize that small-town cable networks are sometimes in worse condition than the copper. And in most places, if AT&T shuts down the copper then the cable company becomes the only game in town. One really has to feel bad for the people who live outside rural towns, beyond the reach of the cable companies. They are going to lose the only wire to their homes.

An Alternative to GPS?

A few weeks ago DARPA (Defense Advanced Research Projects Agency) issued a request to the electronics and aviation industries to consider whether there can and ought to be an alternative to GPS (Global Positioning System). DARPA has several concerns about GPS: it doesn’t work underground or underwater; it can be severely degraded by solar flares; and, since GPS is satellite-based, it is susceptible to being jammed or knocked out of commission by an enemy. And so DARPA asks if we should be exploring an alternative to GPS that overcomes these deficiencies. Since so much of what we do relies on GPS, any replacement has to be at least as accurate as GPS, which can pinpoint anything within 25 feet anywhere on earth 95% of the time. When coupled with land-based GPS augmentation technologies the accuracy can be narrowed in the best cases to within a few centimeters for land-based locations.

GPS was developed by the military in the late 1970s as a needed component for more accurately firing missiles from atomic submarines. The submarines needed to know exactly where they were located in order to calculate the desired path of a missile. But in 1983, after the Soviet Union shot down a civilian airliner, KAL 007, that had strayed into their airspace, Ronald Reagan ordered that GPS be made available to all commercial aircraft.

GPS basically works by trilateration (often loosely called triangulation). Today there is a constellation of thirty GPS satellites at about 12,500 miles above the earth. To get the most accurate reading, a location must be able to see at least four of these satellites. Each GPS satellite carries a very accurate atomic clock, kept in step with the clocks that are the basis for official time. Each GPS satellite continuously transmits a message that includes the time the message was transmitted and the position of the satellite at the time it was sent. On the earth, a GPS device reads these transmissions and does a calculation to determine its own coordinates. The math is somewhat involved: a sphere is calculated around each of the received GPS signals, and where those spheres intersect is the location of the GPS device.
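To make the sphere-intersection idea concrete, here is a minimal sketch of how a receiver could turn satellite positions and measured distances into a position fix. This is an illustration only, not how any particular GPS chipset works; it simply solves for position plus a receiver clock error by least squares and ignores the atmospheric and relativistic corrections a real receiver applies.

```python
# A minimal sketch of the sphere-intersection idea behind a GPS fix.
# Illustration only: satellite positions are assumed known in meters in an
# Earth-centered frame, and pseudoranges include an unknown receiver clock
# bias (expressed in meters).
import numpy as np

def gps_fix(sat_positions, pseudoranges, iterations=10):
    """Estimate receiver position (x, y, z) and clock bias b from >= 4 satellites
    using Gauss-Newton least squares on the model: pseudorange = range + b."""
    sats = np.asarray(sat_positions, dtype=float)
    rho = np.asarray(pseudoranges, dtype=float)
    x = np.zeros(4)                         # initial guess: Earth's center, zero bias

    for _ in range(iterations):
        diffs = x[:3] - sats                # vectors from each satellite to the guess
        ranges = np.linalg.norm(diffs, axis=1)
        residuals = rho - (ranges + x[3])   # measured minus predicted pseudorange
        # Jacobian: unit vectors toward the guess, plus a column of ones for the bias
        J = np.hstack([diffs / ranges[:, None], np.ones((len(sats), 1))])
        dx, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x += dx
    return x[:3], x[3]

# Toy usage: a receiver on the surface and four satellites roughly 12,500 miles up.
receiver = np.array([6_371_000.0, 0.0, 0.0])
sats = np.array([
    [26_600_000.0,   1_000_000.0,    500_000.0],
    [20_000_000.0,  17_000_000.0,  2_000_000.0],
    [21_000_000.0, -15_000_000.0,  6_000_000.0],
    [19_000_000.0,   3_000_000.0, 18_000_000.0],
])
clock_bias = 150.0                          # receiver clock error, in meters of range
measured = np.linalg.norm(sats - receiver, axis=1) + clock_bias
pos, bias = gps_fix(sats, measured)
print(pos.round(1), round(bias, 1))         # recovers ~[6371000, 0, 0] and ~150
```

The fourth satellite is what lets the receiver solve for its own clock error along with the three position coordinates, which is why at least four satellites are needed for an accurate reading.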

In 1996 President Clinton authorized GPS to be used for any commercial use and by around 2000 it became widely adopted. Since then the number of ways that GPS is used has mushroomed. Following are some of the more important uses today of GPS, with the telecom uses listed first:

  • Cellular telephony. GPS is essential today in cellular roaming, handing cell phone calls from one tower to the next. GPS is also used to determine the caller’s location for cellular calls to 911.
  • Telematics. GPS is used to determine the location of moving vehicles. Telematics enables technologies like using Siri to help you with driving directions. This is also used for tracking and locating ships. This same technology enables stores to track the location of shoppers based upon their cellphone signal.
  • Geotagging. This is used in modern mapping systems to overlay photographs onto maps.
  • Surveying and mapping. We now use GPS when mapping the routes of proposed fiber or other utilities and to determine property boundaries.
  • Geofencing. This is the technology used in fitness trackers, dog collars and other systems used today to track the location and travel history of a GPS device.
  • Clock synchronization. The accuracy of GPS time signals (±10 ns) is second only to the atomic clocks upon which they are based and many of our telecom devices get their timing from the GPS satellites.
  • Automated vehicles. GPS is going to be key in developing automated vehicles and drones.
  • Meteorology. GPS is used in sensors and balloons used to measure and calculate atmospheric pressure, wind speed and direction in the upper atmosphere.
  • Aircraft tracking and navigation.
  • Tectonics. GPS enables direct fault motion measurements to pinpoint the epicenter of an earthquake.

This partial list shows you how quickly GPS has been integrated into our everyday lives in just the last decade. GPS is now a key component of a huge number of industries and functions that we count on daily. I can see why DARPA is concerned about the security of GPS. The thought that the GPS system could be disabled in an attack on the country is scary. Luckily DARPA thinks there are alternatives and suggests some possibilities including “electro-optic/infrared (EO/IR) and radio frequency (RF) imaging (active or passive imaging), active/semi-active/passive guidance by EO/IR and/or RF signals, and tracking by exploitation of signals of opportunity.”

The Soundtrack of Our Lives

Today is my 365th blog entry, and while that has taken over a year and a half to publish, it represents a full year’s worth of short essays. I am going to use this personal milestone to step out of my normal daily blog and talk about something that has been on my mind. It’s still somewhat tech-related, but it’s also quite personal, and I bet most of you reading this will see yourself in here somewhere.

I want to talk about how I grew up with music and how the web has changed that experience. I was prompted to think about this a few days ago when on the last day of my recent vacation I played four Beatles albums end-to-end. That’s something I haven’t done for a while because the modern music experience doesn’t favor listening to whole albums.

I did this using a modern music service, Spotify. This service provides millions of songs but also lets me import and integrate my own music library. I generally let Spotify mix up my music and use it like a radio station, but instead I listened straight through Magical Mystery Tour, Revolver, Rubber Soul and Sgt. Pepper’s Lonely Hearts Club Band. And as I listened I got that old feeling of listening to music linearly, like when we plopped albums onto a turntable and listened to them end-to-end. The satisfaction of listening this way came from the fact that I knew the words to every song, having listened to these albums many times, but I also always knew what song was coming next. My brain not only stored all of the lyrics of these Beatles songs, but also the play order on the albums.

This was refreshing to me since I hadn’t done it for a while. It was like meeting a long-lost friend. But it made me think about the difference in the personal experience of music today versus music back then. When I was young we obviously did not have millions of songs at our disposal. What we had instead was the radio, music stores and friends with album collections. Radio was pretty vibrant in those days, particularly when I moved to Washington DC, and it introduced you to a lot of great music. You would listen as much as you could to the radio or to friends’ collections to see what you liked, and then you made an investment in buying an album. Since none of us had unlimited funds, the choices you made became the music that you listened to over and over (and over). You got to know certain artists really well.

I remember the great satisfaction once a month when I had enough excess funds to make a trip to the music store. This would always be on a Friday night and I would linger from bin to bin making the choices that I knew I would have to live with. Whether I had enough money to buy one album or half a dozen, these trips were one of the highlights of every month. And while buying a few albums at a time was somewhat limiting, it didn’t stop me over the years from migrating from classic rock, to punk, to folk, to reggae, and to new wave with many other side trips.

But then jump forward to today. Spotify, iTunes and other music services are more geared to songs than albums. I look at my daughter’s playlist and she has one or two songs from hundreds of artists rather than a lot of stuff from a few of them. And to some degree I have jumped on the same bandwagon, because there is such an immense library of music available, including many of those things that I almost bought years ago on a Friday night buying trip. I can now indulge every musical whim.

But this smorgasbord of choices turns our music into a personal radio station. What I notice is that my daughter and wife drop and add songs all of the time, keeping their playlists fresh and different. Artists are sampled, and if something tickles their fancy it gets added to the playlist, and if it gets boring it goes. This is so different from the linear experience where you listened to an album with its good songs, bad songs and great songs and you came to know and love them all.

I’m not being nostalgic because I love the options that Spotify offers me. One of my favorite activities when I have a spare hour is to just leap from song to song, from artist to artist and listen to music I’ve never heard before. That is a freedom that was not there in the analog days. But I do lament the loss of intimacy and commitment that came from choosing an album and choosing an artist. That became your music and you listened to it and you learned it and it became ingrained in your mind and in your soul. Every person’s album collection was different and we each created our own personal soundtrack to accompany our lives.

Can Web Experiments Go Too Far?

I remember a few months back when there was a big stir in the Facebook community when it was announced that Facebook had been experimenting to see if it could influence the moods of Facebook users. They gave some people very upbeat feeds and gave others more negative feeds to see if the different feeds would influence people’s moods positively or negatively. And as one would suspect it did impact people, and there was a difference between seeing puppies and kittens versus bus wrecks and war stories. But Facebook got caught, and they issued the appropriate apology and promised they would never do it again.

I find the whole story amusing since people are experimented on every day on the web. I’m not sure that everybody gets that the vast majority of our web experience is funded by advertising. Most of the sites that people enjoy are there because of advertising, and web advertisers experiment on us every day trying to find that one technique, that one color scheme, that one catch phrase that will get more people to buy what they are selling.

There are countless examples of how experiments are done on users to find out what works and doesn’t work. The companies that run these experiments are often open about it and not apologetic like Facebook was. For example, just last month Google announced that it was launching a major set of experiments to improve its performance on cellphones. They’ve gotten very good on computers but are not getting the same results from phones. Google’s ultimate goal is to be able to track people’s purchasing across all platforms so that they can know when somebody sees an ad on a cellphone but completes the purchase on a computer.

Most big companies that sell things on the web experiment with their web site to see what best influences the number of sales or clicks they get. The process of experimenting with website design is called Conversion Rate Optimization (CRO). That’s a fancy way of saying that a company will change subtle things about their site to see if it makes a difference in sales. They can change everything from colors, fonts, pictures, messaging and layout to see what is most effective. The web sales process is basically one large ongoing experiment on customers.
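To make the idea of a CRO experiment concrete, here is a minimal sketch of how a two-variant (A/B) test can be evaluated. The statistical test and the numbers are my own illustrative assumptions; this is not how any particular company judges its experiments.

```python
# A minimal sketch of evaluating a two-variant (A/B) website experiment.
# The test statistic and numbers are illustrative assumptions, not any
# particular company's method. Variant A is the current page; B changes
# one element (say, a button color).
from math import sqrt, erf

def ab_test(conv_a, visitors_a, conv_b, visitors_b):
    """Two-proportion z-test: is B's conversion rate really different from A's?"""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal p-value
    return p_a, p_b, z, p_value

# Example: 10,000 visitors see each variant and B converts slightly better.
p_a, p_b, z, p = ab_test(400, 10_000, 460, 10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p:.4f}")
# A small p-value (say below 0.05) suggests the change, not chance, drove the lift.
```

The statistics exist only to separate a real lift from random noise before a company rolls a change out to everybody.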

What works often defies logic. For example, Trip Advisor found that having a blue background was more effective when a customer came to their site from Google but that having a yellow background was more effective for customers who came straight to Trip Advisor. They have no idea why.

The subtle differences that come from CRO can make a big difference in results. For example, Google revealed earlier this year that using a different shade of blue on search results caused more people to click links and this one change increased their revenue by $200 million for the year.

This is not to say that all such CRO changes are ethical or as easy as changing colors. For example, some web sellers use techniques like deliberately confusing language to get people to buy or click something. Or they may trick customers into checking boxes that give away the right to return a product. And web sales have always used techniques like hiding expensive shipping prices until the last step of the process. There has always been an unsavory side to sales, and it’s no different on the web, which has its own version of high-pressure sales techniques.

You can take some advantage of CRO with your own website. If you are trying to sell broadband products or add-on features on the web you should be taking steps to maximize your sales. You may not have the time or resources to conduct continuous CRO experiments, but you can still take advantage of the process. For example, take heed of the companies that are successful at selling on the web. Some of the most successful sellers of web telecom services are the various companies that sell VoIP services, so you might want to look closely at their web sites, or those of similar companies, and compare them to your own. What colors are they using? What’s their mix of text and pictures? Do they use full sentences or phrases?

I often browse carrier web sites and I see many that are terrible at describing their products and prices. Too many companies build a website once and never really look at the design again for many years. This might be acceptable if your website is used for nothing more than to provide basic information about your company. But if you are hoping to drive any sales from your web site you have to put more effort into the details. Don’t be afraid to experiment a bit with different ideas, different looks, different presentations. And if you do, take notes so that you know what worked and didn’t work.

Living Within Our Data Caps

An interesting thing happened to the wireless carriers on their trip to bring us 4G. They warned us repeatedly that we could expect issues as they upgraded their networks, and they forced us onto skinny data plans of a few gigabytes so that most of us have learned to use WiFi with our cellphones rather than spend a fortune with the cellphone provider.

But maybe the wireless carriers have gone too far. Adobe Systems reported last week that more than half of all data from cell phones now travels over WiFi instead of 3G or 4G. Total WiFi traffic from mobile devices passed the data carried directly on the wireless networks more than a year ago. This has to be troubling to AT&T and Verizon because their business plans rely on consumers using the faster 4G LTE networks. They have made huge investments over the last few years in increasing data speeds and that is the basis of all of their advertising.

So perhaps the tactic of imposing small data caps has backfired on them. Their expensive new networks are not being used nearly as much as they counted on, and this is limiting their ability to monetize the expensive upgrades. I know that I personally am very happy buying a 2 gigabyte monthly cap, and I only use cellular data for directions while driving or when I have no other choice while traveling. I would never consider watching a video on my phone when I’m not at home. Apparently there are a lot of people like me in the world.
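To put some rough numbers on why a small cap pushes people toward WiFi, consider this quick back-of-the-envelope sketch. The per-hour figures are my own assumptions, roughly in line with standard- and high-definition streaming rates of the time; they are not from the Adobe report.

```python
# Rough arithmetic on how quickly video eats a small cellular cap.
# (The per-hour figures are assumptions, as noted above.)
CAP_GB = 2.0
GB_PER_HOUR = {"SD video": 1.0, "HD video": 3.0, "music streaming": 0.06}

for activity, rate in GB_PER_HOUR.items():
    hours = CAP_GB / rate
    print(f"{activity:>15}: ~{hours:.1f} hours uses the entire monthly cap")
```

Under assumptions like these, a single movie in HD would blow through the whole monthly allowance, which is exactly why people like me park on WiFi.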

When AT&T and Verizon realized that people weren’t using as much data as they had hoped, they both got into the tablet business hoping that it would boost the use of their 4G LTE data. They have been bundling tablets into plans and even selling them below cost as a way to drive more data usage on their networks. But that move has also backfired, and I saw a report estimating that 93% of tablet data usage goes over WiFi instead of the LTE network.

The WiFi trend is only going to get worse for the carriers as Hotspot 2.0 rolls out. That is the new WiFi standard that is going to let cellphones and other devices easily and automatically log into public hotspots without going through today’s annoying process of having to log onto each new network. With Hotspot 2.0 you can be pre-certified to join any WiFi router that is part of that network. So as you walk down the street in a business district you might log onto numerous different WiFi routers as you walk by them – while staying off the LTE network.

The precursors to Hotspot 2.0 are already in the market today. Once I have logged my cellphone into any AT&T or Comcast hotspot, my phone no longer asks my permission whenever I come into range of another of their hotspots and just automatically connects me.

It’s been reported that the wireless carriers have had pretty good success getting families to upgrade to monthly 10 GB deluxe plans. But what they didn’t count on is how many people would be careful to stay within their plan to avoid getting hit with charges for extra data.

It’s been reported that both AT&T and Verizon have invested heavily in the Internet of Things, and they are touting 4G connectivity as the best way to connect a wide range of devices from wireless utility meters to animal-tracking collars. But a lot of the IoT devices in the world are going to be inside homes and businesses, where an LTE connection is often not as good as the signal from an inside-the-home WiFi router. The fact is that any outdoor radio signal is going to vary with factors like weather, temperature and the amount of the spectrum being used by others. This often makes LTE less reliable locally than a solid WiFi signal.

It will be interesting to see how the wireless carriers react to this. They have spent many billions upgrading their wireless networks and are not seeing the kind of revenue they expected from that effort. This might make them more cautious about leaping into the next big network upgrade, which seems to be needed every few years. It’s possible that they will expand their networks with more mini-cell sites to make their signal stronger where people live as a way to make it more usable. The one thing they are unlikely to do, at least for a while, is give customers more data in the base wireless plans. They are likely to stick with the incremental data usage plans in place today.

One market the wireless carriers are counting on is the connected car, since that is one place where WiFi is not a real alternative. It is expected that every new car will come with data connectivity and that the amount of data used by each car will climb over time as more and more apps are included with cars. Expect the carriers to be selling tens of millions of small monthly data plans to car owners as a way to make up for the rest of us avoiding their expensive data on our cellphones. But even in that market they are competing against the smartphone, which can handle most of the functions promised for the connected car. I know I would rather get driving directions as part of my existing cellphone plan than buy a second data plan for my car.

Comments to the FCC on Data Speeds

I’ve been reading through the comments in FCC Docket 14-126, which asks whether the FCC should increase the definition of broadband. The comments stick mostly to the expected script. It seems that all of the large incumbents think the current definition of 4 Mbps download and 1 Mbps upload is just fine, and just about everybody else thinks broadband should be something faster. In the Docket the FCC suggested that a low-use home today needs 4 Mbps download, a moderate-use home needs 7.9 Mbps and a high-use home needs 10 Mbps.

AT&T says that the current definition of 4 Mbps is adequate to define ‘advanced telecommunications capability’ per Section 706 of the Telecommunications Act. They argue that customers don’t use as much bandwidth as the FCC is suggesting; for example, most of their customers who pay for 12 Mbps service rarely hit a maximum of 10 Mbps during a typical month. They also argue that the FCC is trying to change the definition of broadband by looking only at what the heaviest users of broadband are using.

AT&T goes on to say that they and other companies like Google and the large cable companies are now deploying gigabit-capable technology, so the FCC has no reason to worry about data speeds since the industry will take care of the problem by increasing speeds. I obviously disagree with AT&T on this argument. They are using the red herring of what is happening in places like Austin, Texas and extrapolating that to mean that the whole country is seeing huge broadband upgrades. As I have written many times, small-town America is not getting any of the new broadband investment that AT&T touts in their comments. And rural America is still often stuck with dial-up, satellite or cellphone data. Further, AT&T has been actively saying elsewhere that they want to kick millions of customers off copper and get rid of their DSL option.

Verizon took a different tack in their filing. They also don’t want the definition increased from 4 Mbps. They first argue that they have made a lot of investments in broadband, and they certainly have done so with their FiOS fiber network in cities and suburbs. But they then go on to argue that cellular data ought to be counted as broadband and that they are offering a great cellular alternative to people. They cite that 97.5% of people in the country have access to LTE with speeds greater than 10 Mbps download and say that this should be counted as broadband.

There are a few problems with their claim. First, Akamai collects the speeds from millions of cellular data downloads and reports that the average cellular data speed actually achieved in the country is 4.4 Mbps, not Verizon’s theoretical 10 Mbps. And cellular data is bursty, meaning that it’s designed to be fastest for the first few seconds of a download and then normally slows down. More interestingly, a few months back Comcast cited Verizon and AT&T cellular data as evidence that Comcast has robust broadband competition. Verizon Wireless’s CEO countered Comcast’s claim and said, “LTE certainly can compete with broadband, but if you look at the physics and the engineering of it, we don’t see LTE being as efficient as fiber coming into the home.” Finally, everybody is aware that cellular data plans include tiny data caps of only a few cumulative gigabytes of download per month, and cellphone users know that they must park on WiFi from landline data sources as much as possible to make their cellphones usable for video and other heavy data usage.

Verizon goes on to cite the National Broadband Map several times as justification that there is already great broadband coverage in the US today. They say that 99% of households already have access to broadband according to the map. I have written several times about the massive inaccuracies in that map due to the fact that all of the data in it is self-reported by the carriers.

The big cable companies did not make comments in the docket, but there is a filing from the National Cable & Telecommunications Association on behalf of all of them. NCTA says that the definition of broadband should not be increased. Their major argument is that the FCC is not measuring broadband deployment correctly and should measure it every year and report within six months of such measurements. They also say that the FCC should take more consideration of the availability of cellular and satellite data, which they say are broadband. I haven’t commented on satellite data for a while. Some parts of the country can now get a satellite connection advertised with a maximum download speed of 15 Mbps. It’s been reported to be a little slower than that, but like cellular data, satellite connections come with tiny data caps that make it nearly impossible for a family to watch video.

In a speech last week FCC Chairman Tom Wheeler said that 10 Mbps is too low to be considered broadband and that federal funds like the Connect America Fund should not be funding the construction of any broadband with speeds lower than that. It’s going to be interesting to see where the FCC comes out on this. Because if they raise the threshold too much then a whole lot of households are going to be declared to no longer have true broadband, which is pretty much the truth.

The Growth of Mobile Video

One trend worth noting is the explosion of mobile video, meaning on-line video that is watched on devices other than televisions or computers. Ooyala recently published a report looking at the trends in mobile video and the numbers are eye-opening.

  • In the past year mobile video watching has doubled and the rate of growth is accelerating. In February 2014 it represented 21% of all on-line video being watched and by June it had grown to 27%.
  • It’s projected that by 2016 more on-line video will be watched on mobile devices than on televisions and computers.
  • Cisco projects out further and says that mobile data could represent 69% of the world’s Internet traffic by 2018.

This has some real implications for anybody in the video business. Not only is on-line video growing rapidly, with content being provided by Netflix, Amazon Prime and YouTube, but that video is being watched more and more on smartphones and tablets rather than televisions and computers.

This trend is being driven by a lot of different factors:

  • In the US this trend is partially driven by age. A recent Nielsen poll showed that Millennials are now watching 4.5 hours less of traditional TV per month than they did a year ago.
  • There is a big increase in TV Everywhere and cable operators say that about 90% of US cable subscribers now have access to TVE.
  • There has been an explosion in the number of mobile devices capable of watching video and sales of smartphones and tablets are sharply up.
  • There are now more worldwide users connected to the Internet through mobile devices than through landline connections.
  • There has also been rapid growth worldwide in both 3G and 4G mobile networks. Akamai reports that the average mobile data speed in the US is 5.5 Mbps. They also say that there are 21 countries that now have mobile speeds that average over 4 Mbps.
  • There are huge amounts of content being produced, particularly in the shorter lengths of under 30 minutes.

Viewing habits still vary by size of screen:

  • 81% of the on-line video watched on television screens is of lengths greater than 10 minutes.
  • 70% of the on-line video watched on tablets is of lengths greater than 10 minutes.
  • But smartphone users prefer shorter content: 45% of the video watched on smartphones is 6 minutes or less, though viewing of 30-minute+ videos on smartphones is growing rapidly.

The interesting thing about mobile data is that in the US a large percentage of this traffic is being carried over WiFi using landline connections. Capped mobile data plans make it very hard for most customers in the US to watch much video on their mobile plans without paying a big premium. As I’ve reported in other blogs, American consumers are getting very smart about using WiFi whenever it’s available.

It’s also worth noting that video quality is increasing. Netflix and others broadcast a lot of content in high definition and are now starting to stream in ultra-high-definition 4K. The quality of shorter videos on sites like YouTube is also getting better, with much more HD content. And better quality means more bandwidth demand for the network operator.

What does this all mean to network owners? My takeaways include:

  • The explosive growth of on-line video being watched on landline networks means continued pressure to offer faster speeds in order to support multiple devices watching video.
  • A cable provider must offer a TV Everywhere product to stay relevant.
  • This is more mounting evidence that we are losing Millennials from traditional cable TV packages.

When Customers Comment

Pew Research Center has released another interesting poll that looks at how people interact with each other on social networks. There were two primary findings from the poll, both things that most of us have observed but that were interesting to see validated.

The first is that social networks tend to have a suppressive impact on people’s willingness to express personal opinions. On sites like Facebook and Twitter people tend to hang out with people of a like mind, and this creates what Pew calls a ‘spiral of silence’. This is something I have always thought of as peer pressure. When people are on the same network with their kids, their parents, other relatives, their coworkers and their friends, they tend to be reluctant to share views that they know are contrary or controversial to the views shared by their ‘friends’ on the social sites.

The study was conducted by looking at how willing people were to discuss the Edward Snowden – NSA story about the government spying on apparently everybody in the world. It turns out that people were less likely to discuss the topic on Facebook and Twitter (42%) than they were when talking live with somebody (86%). It’s obvious that the peer pressure of a social network stops people from expressing views that they might freely express somewhere else.

That’s interesting, but the other finding is that the opposite happens when people post on other sites like newspapers or customer service sites. There, the peer pressure seems to have the opposite effect and people tend to pile on to negative comments made by others. It’s almost as if seeing a negative comment gives them the courage to also say something negative. Anybody who owns a web site with a customer service contact page that allows public comments knows about this phenomenon. On such sites many people will say things that they would never say in public, and comments can quickly escalate and get incredibly nasty.

This creates a real dilemma for a company that wants to maintain a place for customers to comment, seek help or ask questions on the web. Many companies have shown that having a public forum can be an extremely effective way to identify problems that they might otherwise never know about. And the web creates a way to respond and often solve problems quickly.

But you need to have a thick skin if the comments on your site take a turn toward the ugly. One angry comment can lead to another until your site is flooded with angry people, many of whom might not even be your customers. All of the social media experts I read recommend that a company engage customers in this sort of situation rather than withdraw or delete comments. They say experience shows that when a company addresses hostility in a reasonable, calm, persistent and truthful way, the company will be viewed as more human.

If you can further demonstrate that you are willing to solve some of the problems that caused the comments to escalate it’s quite possible to win some of your detractors over to your side. It seems that the phenomenon of piling on to negative comments, perhaps described as negative peer pressure, can be defused by reasonable tactics by a company.

You can’t wade into such a situation and just try to mollify people by being nice. That is the sort of behavior that people expect from customer service reps on the phone, and they generally don’t like it online. Instead you can feel free to disagree with people as long as you do so with facts and respect.

There are going to be angry people that you cannot mollify or even have a discussion with and sometimes the comments might get so vile that you will have little choice but to delete or ignore them. But when people have legitimate concerns and they go overboard in expressing unhappiness and frustration you can usually win them over by providing facts and solutions. After all, anybody complaining on your site obviously has a vested interest in your product or service and they generally want to like it. Your job in this situation is to help them do so.

The Second Machine Age

I just finished reading The Second Machine Age: Work, Progress and Prosperity in a Time of Brilliant Technologies by Erik Brynjolfsson and Andrew McAfee. These are two MIT professors who looked at a topic that I have written about often in this blog – the accelerating speed of everything related to computers and what that might mean for society.

They call it the second machine age and define the first machine age as starting with the invention and implementation of the steam engine. That was the invention that created the first industrial revolution and changed the world in myriad ways. They say that we are on the cusp of the same kinds of changes due to everything that is going to come out of the development of computer technology.

This book was preaching to the choir with me, and I have talked about a lot of the same trends in this blog. But these guys went out and talked to many of the technology innovators to make sure that what they were seeing was real, and so this book is basically an affirmation of the idea that the exponential improvements in computer technology – chip speeds, memory, download speeds, size of components – are going to change the world in drastic ways, and soon.

We have already seen a lot of changes in our daily lives due to computers, and the changes have been spectacular, from the growth of PCs through a ubiquitous Internet. But the relentless improvements being made in the underlying technology, with performance doubling every few years, are going to result in changes that are almost hard to believe.

Years ago I worked for Control Data, one of the two companies, along with Cray, that developed the first supercomputers. What most people don’t realize is that your smartphone has far more computing power than the Cray-1, the best-selling supercomputer of the late 1970s, a machine that was huge and required a ton of power to operate and cool.

Most people fail to grasp the nature of exponential growth. If computers and the related technologies double in power / speed / capacity every 18 months, then nine years from now the computers we use will be 64 times more powerful than the ones we use today (which is also true of today compared to 2005). Carry that out about twenty years and the computers of the mid-2030s will be over 8,000 times more powerful than those of today. That is a mind-boggling number. And there is no end in sight for this growth. Scientists and engineers have continued to find ways to improve all aspects of computer technology. I just reported last week on a new storage technology developed by IBM that is 100 times more efficient than the best memory today.
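The arithmetic behind those numbers is simple compounding, and it is easy to check. Here is a small worked sketch; the 18-month doubling period is the rule of thumb cited above, and the time horizons are just examples.

```python
# A small worked example of the doubling arithmetic. The 18-month doubling
# period is the rule of thumb cited above; the horizons are just examples.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """How much more capable computing becomes after `years` of steady doubling."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (9, 19.5):
    print(f"after {years} years: ~{growth_factor(years):,.0f}x more powerful")
# after 9 years: ~64x more powerful
# after 19.5 years: ~8,192x more powerful
```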

It is the sheer power of computing in the near future that is going to reshape the world in the same way that the steam engine changed the world a few centuries ago. The authors postulate that while we have all lived through the doubling of computer power for a few decades, the things that computers are capable of today are still within our mental grasp. We are just now starting to see applications of big data processing that do things that were unimaginable in the past. For instance, we can now sequence a genome quickly and tell you all sorts of things about your current and future health. Computers will soon be able to drive cars and converse easily in any language, acting like the universal translator of Star Trek. Through sheer processing power, computers can now enter a new environment and make the same sort of assessment of what is there as a human would. And who knows what they will be able to do in ten or twenty years.

The authors call this a revolution because it’s going to change a lot of the paradigms we are used to. For example, computers are likely to take over any job that is repetitive in nature, and that doesn’t mean just factory line jobs, but things like writing news articles, analyzing requests for bank loans, and handling packages. All sorts of jobs will likely disappear over the next two decades because machines are far more economical than people.

What nobody knows is whether new jobs will be created to replace those missing jobs. The authors think that there will be a period of decades where the replacements won’t happen and that the old worry that robots will replace us will become reality for a lot of people. In the book they cite Eastman Kodak as an example of how this might happen. The whole photo industry, with its hundreds of thousands of jobs, was displaced in a very short time by digital cameras and finally by online photo sharing services like Instagram. They believe industry after industry will implode as better economic alternatives emerge through computerization.

This book is not looking out at a distant future like the end of this century, but instead is predicting widespread changes and disruptions to society starting a decade from now. The book holds forth some exciting possibilities of the things that will be available in our futures. But it also paints a scary picture of the possibility of displacing a large chunk of humanity from the workforce. As a society we need to start thinking about these changes now, because they are going to be here a whole lot sooner than most people think.