New Technology – December 2014

Here are some of the interesting new technologies I’ve run across in recent weeks:

Faster Data Speeds. Researchers at Aalborg University, MIT and Caltech have developed a new mathematically based technique that can boost Internet data speeds by up to 10 times. In a nutshell, they encode data packets and embed them within an equation. The equation can be solved once all of the packets have been received at the other end.

While this sounds complicated, it is vastly faster than the current TCP/IP standard used to transmit packets. With TCP/IP, once a data file begins to be transmitted the packets must be both sent and received in order, and they all use the same data path over the Internet. If a packet is bad or gets lost, the TCP/IP process slows down trying to recover the missing packet. But under the new technique, different packets can take different paths over the Internet and it doesn’t matter if they are received in the right order. They are reordered as the equation is solved.
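The idea is easiest to see as linear algebra: each coded packet is a random linear combination of the originals, and the receiver simply solves the resulting system once it has enough packets, in any order. Here is a toy sketch of that idea in Python. To be clear, this is my own illustration, not the researchers’ actual code; I use the prime field GF(257) purely because it is simple to compute with, where a real implementation would use something like GF(2^8):

```python
import random

P = 257  # prime field big enough to hold byte values 0..255

def encode(packets, n_coded):
    """Produce coded packets, each a random linear combination of all originals."""
    coded = []
    for _ in range(n_coded):
        coeffs = [random.randrange(1, P) for _ in packets]
        payload = [sum(c * pkt[k] for c, pkt in zip(coeffs, packets)) % P
                   for k in range(len(packets[0]))]
        coded.append((coeffs, payload))  # coefficients travel with the packet
    return coded

def decode(coded, n_original):
    """Recover the originals by Gaussian elimination mod P.

    Arrival order is irrelevant: any n linearly independent coded
    packets are enough to solve the system.
    """
    rows = [list(c) + list(p) for c, p in coded[:n_original]]
    n = n_original
    for col in range(n):
        # find a pivot row (raises StopIteration if the packets are dependent)
        piv = next(r for r in range(col, n) if rows[r][col] != 0)
        rows[col], rows[piv] = rows[piv], rows[col]
        inv = pow(rows[col][col], P - 2, P)  # modular inverse via Fermat
        rows[col] = [x * inv % P for x in rows[col]]
        for r in range(n):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % P for a, b in zip(rows[r], rows[col])]
    return [row[n:] for row in rows]
```

Because the coefficients ride along with each coded packet, the receiver can reconstruct the originals no matter which paths the packets took or in what order they arrived.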

In prototype trials this sped up data transmissions by between 5 and 10 times. And transmissions are inherently safer because all of the packets don’t take the same path, making it a lot harder to intercept them. This technology can apply to any data transmission network. This is one of those changes that is a fundamental breakthrough, because we have been using TCP/IP for decades and everything is geared to use it. But this has promise to become the new data transmission standard.

Any Surface Can be an Antenna. Scientists at Southeast University in Nanjing, China have developed a meta-material that can turn any hard surface into an antenna. They do this by embedding tiny U-shaped metallic components in the surface. These little Us act like what is called a Luneburg lens. Normal lenses are made out of one material and refract light in a consistent way. But a Luneburg lens is made up of multiple materials and can bend the light in multiple ways. For example, these materials can be used to focus on a point that is off to the side of the lens (something normal lenses can’t do), or they can radiate all incoming radiation in the same direction.

These meta-material surfaces can be designed to act as an antenna, meaning that almost any surface could become an antenna without having to have an external dish or receiver. Perhaps even more interesting, these same meta-materials can be used to scatter radiation which could make fighter jets invisible to radar.

Another Step Towards Photonic Chips. Researchers at Stanford have developed an optical link that uses silicon strips to bend light at right angles. This adds a 3D aspect to the chip topography, which will help accommodate the speeds needed by future faster computers. The strips can be reconfigured on the fly to use different light wavelengths, making it possible to change the nature of the computer as needed. This is one of the many steps needed to create a purely photonic computer chip.

Cooling With Magnets. Scientists in Canada and Bulgaria have developed a way to produce cooling using magnetic fields. This works by removing ferromagnetic materials from magnetic fields, which causes them to cool down. They have found several substances that are efficient at heat transfer. Further, they are using water as the heat transfer fluid, eliminating harmful hydrofluorocarbons. This could be used for refrigerators or air conditioners without coils and pipes, by simply rotating the cooling element in a magnetic field.

Synthetic Gasoline out of Water. German company Sunfire GmbH has developed a process that can make synthetic fuel from water and carbon dioxide. The underlying technology has been around for a long time and is called the Fischer-Tropsch process. But the company has found a way to make the process far more efficient. The fuel that is produced has a high energy coefficient of 50%, similar to diesel fuel, compared to a much lower efficiency for gasoline of between 14% and 30%. And the company thinks it can get the efficiency up to 70%.

The interesting thing about the technology is that it is carbon neutral, since it takes the carbon dioxide out of the atmosphere to create the fuel, as compared to pulling it out of the ground. There are also numerous benefits from having a more efficient fuel. With this technology we could keep our gasoline cars without having to rely on the petroleum industry. It could help to take the politics out of oil and could let us cut back on the amount of petroleum we need to refine.

The Dark Side of Web Advertising

Yesterday I talked about the general way that Internet ads function. But today I want to look at one of the darker aspects of web advertising by looking at how ads spread malware.

Cisco’s Annual Security Report for 2013 provided some pretty amazing statistics about Internet advertising:

  • They said that the highest concentration of online security threats is not found on pornography, pharmaceutical or gambling sites; rather, the most danger today comes from major search engines, retail web pages and social media outlets.
  • They said that online shopping sites are 21 times more likely, and search engines are 27 times more likely, to deliver a malicious piece of software than a counterfeit software site.
  • But no threat compares to online advertising: Internet ads are 182 times more likely to give you a virus than searching the web for porn. (Of course, they didn’t say how the intrepid Cisco researchers made the comparison to porn.)

Probably the major culprit for malware in advertising is a practice called real-time bidding. When you go to load a web page that uses real-time bidding, an ad company like AppNexus (or many others) asks for bids for placing ads on your page. The solicitation gives a quick profile of who you are in terms of age, demographics, geography, etc. The highest bidder then gets the ad space, and this all happens in a flash. The problem with this kind of system is that nobody has time to monitor the ads that are placed, and so malicious advertisers gain access to you simply by bidding the highest. And they don’t have to bid much. It takes only a tiny fraction of a penny to get an ad placed in front of one specific user.
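Stripped of all the machinery, the real-time bidding flow is just an auction decided by price alone. This little Python sketch is purely illustrative (the bidder names and numbers are made up, and a real exchange like AppNexus is vastly more elaborate), but it shows why a malicious bidder wins so cheaply: nothing in the auction inspects what the winning ad actually does.

```python
def run_auction(user_profile, bidders):
    """Offer one ad impression to all bidders; the highest bid wins.

    Bids are in cents. Note there is no step that inspects the winning
    ad for malware -- the auction is decided purely on price, in
    milliseconds.
    """
    bids = [(bid_fn(user_profile), name) for name, bid_fn in bidders]
    winning_bid, winner = max(bids)
    return winner, winning_bid

# Hypothetical bidders: each prices an impression from the user profile.
bidders = [
    ("shoe_retailer", lambda u: 0.05 if "shopping" in u["interests"] else 0.01),
    ("car_dealer",    lambda u: 0.04 if u["geo"] == "FL" else 0.0),
    # The malicious bidder doesn't care who you are -- any impression is
    # worth a fraction of a cent to deliver its payload.
    ("malvertiser",   lambda u: 0.06),
]

profile = {"geo": "FL", "age": 45, "interests": ["news", "shopping"]}
winner, price = run_auction(profile, bidders)
```

In this toy example the malvertiser takes the impression for six hundredths of a cent, which is exactly the economics that make the real thing so attractive to bad actors.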

The malicious ads don’t look malicious and are usually disguised to look like an ad for some normal company. But the purpose of the malicious ad is to put a piece of code on your computer. The bad news these days is that you don’t have to click on the ad to get the bad software – the act of opening the web page is often enough to activate it.

I run a malware checker regularly and I am amazed at how many pieces of malicious software I get regularly. It is not unusual for my computer to have picked up a hundred pieces of malware within three days after having scrubbed it. I don’t shop much on-line, but I read a lot of articles and I assume that is the source of most of my malware.

According to my malware software, most of the bad things that I pick up are adware, which they define as a piece of code that is gathering and transmitting data about me to the world. These days adware is generally something a little more complex than a cookie. Cookies are somewhat passive files that sit on your machine to tell somebody later that you have already been to a certain web site or something similar. Think of adware as cookies+, in that they gather specific data and either store it for later retrieval or, in the worst cases, send it out to the world.

I’d say 99% of what I get is adware with only the occasional more malicious malware, which could be a virus or some other nasty piece of code. But think about what I am getting. I am inadvertently downloading 100 pieces of adware within just a few days, each of which is looking for specific facts about me and reporting back to whoever placed the malware. I am sure that mostly they are tracking the web sites I’ve visited in order to build up a more detailed profile about me. But these little pieces of malware can pick up almost anything else from bank account numbers to passwords.

I think we all understand that half of what is on the web these days is designed to build a profile for each of us. But I don’t think most people realize how intrusive this effort has become. They are not building a profile by slowly studying your web usage. They are spying on you directly to know everything you do. It’s a bit scary when the most dangerous place on the web is a search engine or a major news site that has ads.

Yesterday I talked about ad blocking, and perhaps this is what is going to save us from this rash of malware and adware. Certainly if somebody will block all ads to my computer then I can’t be receiving ads with malware. But I would be just as happy if somebody could deliver ads to my machine that are certifiably safe. It doesn’t take a lot of effort for an ad company to test an ad first to make sure it doesn’t leave bad code behind. But that can’t be done in a process where an ad space is put up for bid and sold in milliseconds. This gives the bad guys a really cheap way to get their ads to anybody they want.

So I think Google is onto something with their product that can block all ads. But as I described yesterday, Google is not the last company in the chain between a web site and a user, so I am guessing that even with Google ad blocking some ads and malware are still introduced after Google has wiped out the first ads. Your ISP is the last entity to touch the data stream coming to your house and thus has the final chance to get rid of malware. I think ISPs might be missing the opportunity to offer better security to their customers by either blocking ads or by making sure that ads are safe.

Who Owns Internet Ad Space?

Google made a very interesting announcement a few weeks ago that led me to find out more about the ad space on web sites. Google announced that for $2 per month they would block all ads on web sites for a customer as long as they browse through the Chrome browser.

I find this fascinating because it means that Google thinks that they have the ability to block an ad, even when they are not the one to have placed the ad in the first place. Google sells a lot of ads, and so it makes sense that they can block ads that they have placed on a web page. But when they say they can block all ads it also means that they think they have the ability to block ads placed by somebody else.

Just to be clear about what I mean by ads, look at this web page. At the top is a banner ad. At the top right of the story is an ad. And across the bottom of the article are four ads. After loading this web site multiple times I noticed that the ads changed.

It turns out that there are two kinds of ads on a web page: fixed ads and remnant ads. Fixed ads are placed there by the web site owner or somebody they partner with to advertise for them. Fixed ads are embedded into the web page and can only be accessed by the website owner. The other kind of ads are called remnant ads. These are coded in such a way as to be available to outsiders, and anybody that has access to a website before it reaches a customer can change what is in the remnant ad space.

And as you would expect, these remnant ad spaces get changed all of the time. There are a lot of companies that sell advertising into the remnant ad space including Google (DoubleClick), Yahoo, Amazon, Facebook, AOL, AppNexus, Openx, Adroll, RightMedia and dECN. It was very easy for me to spot remnant ads in the recent election season, because I swear that every web page I looked at here in Florida had a political ad for Rick Scott who was running for reelection as Governor. So somebody was being paid in Florida to put those ads onto Florida computers.

The first question this raised for me is: who owns this ad space? The web page example is from the TechCrunch web site. TechCrunch chose to make the ads open to the public, and I assume they get revenues from at least some of the parties that use that space, which is their motivation to use remnant ad space. Google thinks they have a right to go in and block whatever is in the remnant ad space on that page, so they are sure that it is theirs to grab. I know that some of the larger ISPs like cable companies are also in the advertising business, through partners, and I wouldn’t be surprised if it was Comcast that gave me all of the Rick Scott ads.

I was shown a recent legal opinion by one of the companies that advertises in the remnant space who was gracious enough to share it with me as long as I don’t publish it. The opinion says basically that nobody owns the remnant ad space. The legal opinion says that the act of a web site owner in making this available to the public means just that, and it can be used by anybody who somehow has access to the website before it reaches a customer. That generally is going to mean some company who is part of the chain between a web site and the customer. Obviously the web site owner can hire somebody to place ads in the remnant space. If you reach the web site through a browser then the browser owner can place the ad in there. If you get to a web site through a link on another web site like Yahoo News then they can place ads there. And your ISP also would have access to this ad space.

I really like the Google product that blocks ads. I think there are plenty of customers who would love to avoid all of those ads. Further, blocking ads means a faster Internet experience for a customer. I know there are web sites I go to that have multiple videos automatically running, which seems like an extravagant use of my bandwidth. I have a 50 Mbps Internet connection and there are still web sites that load very slowly due to all of the extra videos that have been layered into the ad spaces. I also learned that remnant ads are one of the most common sources today of adware and malware, and I will talk about that more in tomorrow’s blog.

A History of Net Neutrality

These days it seems like everybody has an opinion about net neutrality. Ever since Arpanet was opened to the public in 1981 we have had almost the same debate we are having today. So today I thought I would look back at some of the key history in the net neutrality debate.

The first key event that could be called the beginning of the net neutrality debate was the publication of a paper entitled End-to-End Arguments in System Design by three computer scientists: Jerome Saltzer, David Reed and David Clark. For the real nerds among us, I’ve included a link to that paper. This paper was written for a conference and was not intended as a scholarly piece, and yet it shaped the thinking of the early public Internet.

In the paper the authors said that the only logical way to design a network that had limited resources and that had to serve a large number of users with widely different interests was to have a network that performed logical operations on the edges, rather than the core. What they meant by this was that the core of the Internet should consist only of fast but dumb pipes and that any manipulation of data, and the paper focused on error correction as the example, should be done at or near the edge of the network with the last mile ISP or the user.

This paper had a big influence on the way the Internet was operated, and for many years the Internet operated in a way consistent with this paper. Everything that was done on the Internet was done near the edge. For instance, the servers for large services like CompuServe or AOL were on the edge. The functions that ISPs used to receive and reconstruct files were on the edge. And end user software was contained in computers on the edge. In the middle were a handful of large carriers that transmitted data from hub to hub.

As the general public got introduced to the Internet, the idea arose that the Internet ought to somehow be regulated. People who used the Internet liked the wide open feel of it and were worried that commercial uses of the Internet would change the nature and experience for everybody. During the 1980s we started seeing things like early versions of VPNs where large corporate data was given priority over other data. There was talk of creating priority bits for real time events like voice calls and video. And so the discussion began on whether the government ought to intervene and regulate the Internet in some fashion.

In 2000 Harvard law professor Lawrence Lessig published a book Code and Other Laws of Cyberspace. This was a scholarly work that explored the topic of Internet regulation. Lessig said that the end-to-end principle was one of the most important reasons that the Internet had produced growth and innovation and that a free and open Internet ought to be maintained. Lessig argued that there was a role for government, which was to maintain the end-to-end principle. He thought that without government regulation of some sort that commercial interests would chip away at the freedom and function of the Internet until it would lose the characteristics that make it so beneficial to society.

He used the word ‘code’ as a surrogate for software, meaning that whoever controls the software of the Internet can control what happens on it. He thought, rightfully so, that either commercial or government code could eventually interfere with the operation of the Internet. Today it’s obvious that both kinds of control are going on. Entire countries have been carved away from the open Internet by governments and other countries like Russia are considering doing the same. US carriers want to create Internet fast lanes and the ones in Europe have already done so. And we find ourselves being spied upon by governments and by commercial entities who either record everything we do or who plant spyware on our computers.

Tim Wu, a law professor at the University of Virginia built on the ideas in Lessig’s book and published an article in 2002, A Proposal for Network Neutrality. Wu argued in favor of the same end-to-end principle and said that an open internet caused a Darwinian competition among every conceivable use of the Internet and that only the best uses would survive. He said that network neutrality (he coined the phrase) was necessary to make sure that there was no bias against any use of the Internet.

Wu understood that some network bias was unavoidable, such as giving priority to voice packets so that voice could be transmitted over the Internet. But he thought that there should be some sort of defined dividing line between permissible bias and impermissible bias. And that dividing line, almost by definition has to be defined by regulators.

And so today we are still at the same point where Wu left the argument. Sadly, much of the debate about network neutrality has wandered off into political directions and no longer has to do with the way we manage packets. But absent some sort of regulation it seems clear to me that commercial and government use of the Internet will continue to chip away a little at a time until the Internet is a controlled environment, and any user’s Internet experience will be subject to the whims of whoever controls their local part of the Internet.

Watching Networks Die

A few weeks ago I went to Phoenix, and in driving around various neighborhoods I noticed a lot of problems with the copper network. I was out in an outer suburb, in horse country where the lots are large and where most people live back long unpaved lanes. It was a rural area, but a pretty upscale rural area. What I saw was that a number of poles were looking pretty ragged, with some looking ready to fall down in a stiff wind. But what was most noticeable was that a lot of pedestals looked to be in bad shape. Many had been knocked over at some point in the past and were lying on the ground. Some were cracked open leaving the wiring exposed. But my favorite was a pedestal that was held up against a stop sign using duct tape.

A few months ago I reported on a telephone network on an Indian reservation where the carrier cabinets were hanging wide open exposing the electronics to the elements. And this wasn’t in the dry southwest desert but in the snowy northern plains. That same network had telephone cables draped for long distances over the tops of barbed wire fences.

I also did a Google search and it appears that it has become a common practice to leave cables running across sidewalks. I know when I got cable service at my house that they ran the cables across the sidewalk and the yard, but within a week a crew showed up and buried it. But I am reading about cases where telephone or coaxial cable has been left lying across sidewalks for years at a time.

I can’t recall ever hearing of this practice until sometime during the last decade. Before that the cable or telephone companies simply did not string cables over the ground for more than a day or so as part of a new installation.

It certainly is possible for poorly maintained cables to lead to disaster. Years ago when I worked for CP National, one of our customers in rural Nevada was killed when they were strangled by a low-hanging cable they ran into while horseback riding. That certainly is a rare occurrence, but it is not hard to foresee all sorts of problems arising when cables are left where they can touch people.

I also recently read an article at Stop the Cap that gave pictorial and video evidence of cables that have been draped permanently over backyard fences or left on the ground for long periods of time. That same article talks about how Cleveland has dozens of complaints about telephone wires that have been cut and are dangling to the ground. The City has tried to get AT&T to fix the dangling cables, but it turns out that due to deregulation the City has no legal authority to require the company to clean up its mess.

One can think of many reasons why we are seeing more and more of these kinds of situations. A lot of carriers are now using contract labor that is paid by the installation, which gives them the incentive to take shortcuts to finish jobs quickly. Years ago installations were done by trained employees who worked to good standards and who took the time to make sure that an installation was done correctly.

The issue in Cleveland is probably the result of competition. As competitors bring new service to a home they often just cut the old service drop without caring what happens to it. This seems like something that state Commissions could deal with, even in this day of declining regulation.

But the really bad networks like the ones I saw in Phoenix and the ones on the Indian reservation are due to the total neglect of the copper network by companies that plan to walk away from copper at some point. Both AT&T and Verizon have made it clear to the FCC that they intend to walk away from large swaths of their rural networks within just a few years.

The large telcos have systematically ignored rural areas. They have closed customer service centers and cut back on maintenance staff to the point where an average rural installer is often in charge of a huge geographic area. It often takes more than a week to get a technician to the house when a customer has a problem.

The big telcos don’t neglect all copper, just the rural copper. CenturyLink serves the area where I live and the network looks to be in great shape. This is partly due to the fact that the town I live in was devastated a decade ago by Hurricane Charley and much of the plant has been rebuilt. But this is also an upscale area where CenturyLink is pushing their Prism TV product, which requires a decent copper network. But I don’t have to travel too far inland away from the water in Florida to see older and more neglected networks.

The neglect of rural networks is not new. The large telcos have severely cut back on copper maintenance for years and even decades in rural areas. It was widely reported by people I know in West Virginia that Verizon basically walked away from the rural parts of the state almost 25 years ago when they decided to sell the whole network to somebody else. It took them almost two decades to find a buyer and in the meantime the copper network degraded significantly. I have a nephew there who is a lineman for Frontier, who now owns that network, and he is not sure that what is left can ever be made to work well.

It seems pretty clear that the telcos are going to walk away from copper. And so perhaps it does no real good to complain about the quality of the copper networks they plan to abandon. In just a few years we will instead be talking about a whole lot of rural people who won’t even be able to get dial-tone to access dial-up Internet. Very rural places are just going to have a harder and harder time being connected to the rest of us.

How We Love to Hate the Large ISPs

I have read a number of articles lately that reminded me of the love / hate relationship that Americans generally have with the large ISPs. Here is a summary of some of these stories.

Americans Pay More for Less Bandwidth. The Open Technology Institute at the New America Foundation recently released its third annual report comparing US broadband speeds and prices in 24 US cities and in cities around the world. This report shows that speeds have increased in US cities since 2012, but on a cost-per-megabit basis most US cities still fall to the bottom of the comparative list. The broadband winner is Seoul, where a gigabit of data costs $30 per month, followed by Hong Kong and Tokyo at $37 and $39. Contrast this to Verizon FiOS, where 500 Mbps costs $300. Very few places in the US outside of Google, some municipalities and some independent telcos offer an affordable gigabit service.

One of the more interesting comparisons made by the report is the cost of buying a 25 Mbps connection. The most affordable place for this was London at $24, followed by Seoul, Paris, Tokyo, Copenhagen and Prague. The cheapest US city is Kansas City at $41, due to competition with Google. The US cities with Verizon FiOS came in around $50. The lowest price in a US city not served by a fiber provider is San Francisco at $58 per month. Most US cities are well over $60. Not surprisingly, the larger municipal networks like Chattanooga and Lafayette, LA are at the head of the US affordability list after Google. The US is also the only country that charges monthly fees for a cable modem, and a cable modem customer spends over $100 per year for the device.
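The cost-per-megabit comparison is simple arithmetic, but it is worth making explicit just how wide the gap is. Using only the figures cited above:

```python
# Monthly price in dollars and advertised speed in Mbps, from the figures above.
plans = {
    "Seoul (gigabit)":       (30, 1000),
    "Hong Kong (gigabit)":   (37, 1000),
    "Tokyo (gigabit)":       (39, 1000),
    "Verizon FiOS 500 Mbps": (300, 500),
}

for name, (price, mbps) in plans.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")
```

Seoul delivers a megabit for about three cents a month; FiOS charges sixty cents, a twenty-to-one difference.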

The report went on to note that 75% of US customers who can get 25 Mbps service have only one service option. The report concluded that around the world that one thing that holds down landline data prices is significant competition with cellular data. For example, in much of the rest of the world the monthly data caps on cellular phone plans are up to 40 times higher than they are in the US. But our low data caps and the relatively slow speeds of our cellular data networks means that cellular is not a good substitute here for a landline connection.

Customers Continue to Rate Large ISPs Poorly. The results of the annual American Customer Satisfaction Index were recently released and showed that satisfaction with large ISPs is still quite low and is getting worse. This is an annual poll of 70,000 consumers that asks about a wide swath of large businesses. The composite satisfaction with all large ISPs was at 63 on a scale of 100, down from 65 a year ago, which puts the ISPs at the bottom of the list of all industries. Within those numbers, Verizon FiOS held steady at a rating of 71. Time Warner did the worst, dropping from a rating of 63 in 2013 down to 54 this year. Comcast was not far behind, dropping from 62 to 57. CenturyLink is the only ISP that improved slightly, going from 64 to 65. Both Cox and Charter dropped 4 points in the last year.

Consumers felt slightly better about their cable TV service, which got a composite rating of 65 compared to the 63 for broadband. But that rating is down from 68 a year ago. The ratings were down for every major cable provider compared to 2013. The highest ratings for cable were 69 for DirecTV and AT&T U-verse, while the lowest rating was again Time Warner with a 56.

What is probably most disheartening about these ratings is that they are dropping year over year. Consumers already rate ISPs and cable companies at the bottom of their satisfaction list across all industries. One would think that would prompt them to improve. And perhaps to some degree they are improving, since speeds are slowly getting faster. But overall satisfaction continues to drop. One might think that price has a lot to do with this, particularly for the cable TV business where there are hefty rate increases each year. But prices have also started to creep up for data, and several of the major ISPs are now planning on raising data rates a little each year.

AT&T U-verse Told to Change Advertising. The National Advertising Division (NAD) told AT&T to modify the way it advertises U-verse data speeds. AT&T has widely advertised the product as offering up to 45 Mbps, and NAD found that in many markets this speed was either not available or not widely enough available to justify the claim. NAD is a division of the Council of Better Business Bureaus and monitors national advertising claims of all sorts. NAD recommendations are not mandatory, but since big companies participate in the Better Business Bureau they generally take heed of NAD findings. NAD has made similar findings against CenturyLink in recent years.

I guess it’s really not surprising that customers rate the large ISPs so poorly when you consider some of their practices. Many of them use poorly trained contract installers who don’t put a good face on their company. Many of these companies are notorious for not showing up for scheduled appointments, which is something that a lot of consumers never get over. This year we heard several recordings of Comcast reps who would not let customers drop service. And there are the annual and persistent rate increases.

The End Game for Technology

There was an intriguing new book published this year by Jeremy Rifkin, The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons and the Eclipse of Capitalism. The book is well written and engaging and is packed with interesting ideas. It takes a look at our economic and technological future and makes some bold predictions.

First, Rifkin says that we are headed towards a time when there will be zero, or near-zero, marginal cost for producing many of the goods we need. Marginal cost refers to the cost of producing the next unit of something, be that a consumer good or a kilowatt of electricity (while ignoring fixed costs). He provides numerous examples of how the cost of many goods has already decreased or disappeared.

An example would be an electronic copy of a popular song. Once the song has been digitized it costs nothing extra to get that song into the hands of hundreds, or millions, or billions of consumers. We have already seen the entertainment and publishing industries devastated by the dissemination of electronic copies of songs, books and videos. There are now millions of people worldwide getting college credit from courses that were recorded one time by distinguished professors. And we see the start of a 3D printing industry that can make a huge variety of consumer items from recycled materials (including new 3D printers).

But Rifkin looks into the near-term future and says that within fifty years this phenomenon will have spread to more of the economy. For instance, he foresees technological breakthroughs that will allow for the nearly free production of green electricity locally, where people need it. He foresees solar cells and other technologies being produced by 3D printers, with the net result that people will not have to rely on a centralized electric grid. But he thinks the biggest breakthrough that will reduce the costs of production is the Internet of Things. He says that as big data becomes available to everybody, productivity will increase so significantly as to transform our society.

Second, the book says that we are headed towards a society of the Collaborative Commons. He says that since the onset of the industrial revolution we have thought of the capitalist market and the government as the only two ways to organize economic life. But before the industrial revolution most things were done cooperatively. Families, neighborhoods and towns all worked together to satisfy people's basic needs. If a new building was needed, people got together and built it. If infrastructure like a water supply or grain storage was needed, it was handled cooperatively.

With the industrial revolution came capitalism, which was the most efficient way to pay for and organize costly factories and the supply chains that fed them. But Rifkin believes that as marginal costs approach zero and profits are largely removed from the picture for many goods and services, we will see a resurgence of cooperatives and other collaborative ways of getting things done locally.

Cooperatives have never died, and today cooperatives and non-profits represent about 5% of GDP in the US and 10% of the workforce. He sees that increasing significantly as cooperatives (collaborative commons) become a normal way for society to work together absent the profit motive. One of the biggest issues with Rifkin’s ideas is that somebody has to supply the basic infrastructure. There will still be a need for government to take care of basic things like roads and water systems. But he foresees collaborative commons eventually taking over much of the rest. As an example, I am currently working with a new start-up cooperative in Minnesota that is building a fiber network – because the capitalist system has failed to bring them the infrastructure they want.

Rifkin’s third idea in the book is that capitalism will decline. He says that as the profit motive is reduced, there is less incentive for capitalists to enter markets for goods that can be produced collaboratively. There will always be a need for capitalism because some things are too complex to produce locally. Somebody is still going to have to make the jet planes and the robots, and somebody will have to mine and process the raw ingredients needed for our technological future.

Rifkin also tackles the idea that only capitalism can bring about innovation. He points to things like open source software as evidence that collaboration can produce a better product than capitalism. Open source software has been demonstrated to have fewer bugs and to respond more quickly to user needs, because thousands of programmers look at a piece of code rather than a small team.

I don’t know if Rifkin is right, but I do know that he has recognized and made a coherent story out of a number of trends that I write about in this blog all of the time. We have already seen technology completely transform industries like music and entertainment. We are on the verge of the widespread use of robots tied into big data analytics and artificial intelligence that is going to transform large swaths of our economy. Rifkin has painted one possible picture of the end game of all of these changes and he has posed dozens of really good questions in this book that are well worth thinking about.

Big Data is Coming Your Way

I read all of the time that there is an explosion in the amount of data that companies have to process and that the data is getting more complex. Today I look at the data that big carriers have at their fingertips and that eventually will be available to smaller carriers.

Let me start by talking about small carriers, the folks who read this blog. For many years we have had a good grasp of what we need to know about our customers. Companies have had somewhat flat database files that captured things like the customer name and address, a list of the services they purchase and a list of the equipment associated with their account. Some small carriers have taken this further and captured things like a history of every service call at a given address. Overall, if you consider the amount of data that we keep on any one customer, it’s a manageable pile of information.

And most of the world was just like the little carriers. If you went back ten or fifteen years it’s likely that the customer database at a large carrier like AT&T Wireless was not very much more complicated than the databases at the small carriers. But over the last decade, and particularly over the last few years, a lot more data about customers has become available to somebody like AT&T. It’s natural that they would start capturing new kinds of data, even if they aren’t entirely sure how to use it yet.

So what kind of data is available to AT&T that was not available ten years ago? Following are a few examples:

Geospatial Data. Once GPS was activated in smartphones we gained the ability to know at all times where the phone (and presumably the caller) is. In the past AT&T could identify the cell site of a caller, but only while they were using the phone. They didn’t have any other way to routinely know the location of a customer.

But today, through the GPS built into a smartphone, they can know where a phone is at all times while it’s powered on. So naturally companies like AT&T gather this kind of data, as do many other companies, such as Google or Apple, who control the operating systems of the smartphones. This data is just now starting to be of use in targeted advertising and is expected to become more and more valuable over time.
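To see why a stream of location pings is valuable for advertising, here is a toy sketch of the kind of analysis a carrier might run over them. Everything here is invented for illustration: the function name, the data, and the idea of rounding coordinates into roughly 1 km grid cells to find where a phone spends most of its time.

```python
from collections import Counter

def top_location(pings, precision=2):
    """Return the most-visited grid cell from a list of GPS pings.

    pings: list of (timestamp, latitude, longitude) tuples.
    precision: decimal places to round coordinates to; two decimal
    places makes a grid cell roughly 1 km on a side.
    """
    cells = Counter((round(lat, precision), round(lon, precision))
                    for _, lat, lon in pings)
    return cells.most_common(1)[0]  # (cell, ping count)

# A phone that pings every few minutes reveals where its owner lives.
pings = [
    (1, 38.895, -77.036),   # downtown
    (2, 38.895, -77.037),
    (3, 39.084, -77.153),   # home
    (4, 39.084, -77.152),
    (5, 39.084, -77.153),
]
print(top_location(pings))  # → ((39.08, -77.15), 3)
```

Even this crude version infers a likely home or work location from nothing but timestamps and coordinates, which hints at what far more sophisticated real systems can do.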

Voice Recognition. In the old days somebody like AT&T would have noted that a given customer called them. They would know what the call was about to the extent that somebody at AT&T took notes that became part of the customer service record. But today natural language processing technology means that customer calls can be recorded and then saved as text. AT&T can have a stored transcript of every call they receive.

Text Analysis. Whether the source is a transcribed voice call, a text message, a tweet or an email, there is now software that can make contextual sense out of correspondence with your customers. This means that you can not only save text from customers, but your software can take a stab at analyzing it and categorizing it according to key words and phrases within the message.
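As a toy illustration of the keyword categorization described above (the category names and keyword lists here are invented for the example, and real systems use far richer models than keyword matching):

```python
import re

# Hypothetical categories a carrier might define for incoming messages.
CATEGORIES = {
    "billing": {"bill", "charge", "invoice", "refund", "overcharged"},
    "outage":  {"outage", "down", "offline", "disconnected"},
    "upgrade": {"upgrade", "faster", "speed", "plan"},
}

def categorize(message):
    """Tag a customer message with every category whose keywords appear."""
    words = set(re.findall(r"[a-z']+", message.lower()))
    return sorted(cat for cat, keywords in CATEGORIES.items()
                  if words & keywords)

print(categorize("My internet has been down and I was overcharged on my bill"))
# → ['billing', 'outage']
```

A message can land in more than one bucket, which is exactly what a customer service system wants: the example above flags both a billing issue and an outage from a single complaint.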

Sentiment Analysis. To go one step further, there is now software that can make a judgment about the mood and intent of customers. This can be done with recorded voice, looking for emotion, or with text, looking for positive or negative sentiments.
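In its simplest text form, sentiment analysis can be sketched as counting words from positive and negative lexicons. The word lists below are invented for the example; production systems use large weighted lexicons or trained models rather than anything this crude.

```python
# Toy sentiment lexicons, invented for this example.
POSITIVE = {"great", "love", "happy", "thanks", "excellent"}
NEGATIVE = {"angry", "terrible", "cancel", "worst", "frustrated"}

def sentiment(message):
    """Classify a message as positive, negative or neutral by word counts."""
    words = message.lower().split()
    score = (sum(w in POSITIVE for w in words)
             - sum(w in NEGATIVE for w in words))
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("this is the worst service ever and I am frustrated"))
# → negative
```

Even a score this simple lets a carrier route angry customers to retention staff, which is the kind of practical payoff driving the adoption of these tools.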

And these are just examples of the new sources of customer data that any company might have access to. AT&T is now storing a vastly greater amount of data about each customer than they did in the past. They are keeping a record of where the customer is at all times, plus records of every interaction with the customer.

This doesn’t even consider the huge amount of data that AT&T Wireless can derive from being the ISP. As an ISP, AT&T knows every website a customer visits, the text of every email, tweet or Facebook posting, every topic they have looked at with search engines, every video watched over the web. The amount of data gathered by the ISP side of a large company is truly daunting.

We are at the very beginning of the big data analytics industry. It’s not hard to believe that within another decade companies will be able to accurately profile their customers (or can buy the data from somebody else to do so). It’s already eerie how much somebody like Google can know about you. The next evolution of the industry is probably going to come from incorporating artificial intelligence from something like IBM’s Watson computer platform to direct the analysis automatically.

We all know that what big companies like AT&T do today, smaller carriers will be doing in ten years. For example, the software to analyze things like customer sentiment will get cheap enough, or common enough, that it will come built into customer service software. And this means that your company will be gathering vastly more data than you do today.

How’s Cable Doing?

With all of the talk of cord cutting, cord shaving and the general demise of the cable industry, I thought it would be useful to take a snapshot of the cable industry at the end of the third quarter of 2014 to see how the industry is doing. Here are some key facts for a number of major cable providers:

Comcast. For the quarter they lost 81,000 TV subscribers compared to losing 127,000 in the 3rd quarter of 2013. Meanwhile they gained 315,000 data customers compared to 297,000 customers a year before. Overall profits were up 4% over the year before. Comcast now has 22.4 million video customers and 21.6 million data customers.

Time Warner Cable. The company lost 184,000 cable subscribers in the third quarter compared to 122,000 in the previous year. But the company did add 92,000 residential data customers for the quarter. Earnings were up 3.6%, driven by cable rate increases and growth in the business services group. The company saw a 9.6% increase in programming costs, driven by a bad deal they made for the programming rights to the LA Dodgers.

Charter Communications. Charter lost 22,000 video customers for the quarter compared to 27,000 a year earlier. They saw data customers increase by 68,000 compared to 46,000 a year ago. Overall profits were up 8% driven by rate increases and data customer gains. Charter finished the quarter with 4.15 million cable customers.

CableVision. The company saw a significant loss of 56,000 cable customers. Profits for the company dropped to $71.5 million for the quarter, down from $294.6 million a year earlier.

Cable One. The company lost 14,000 video subs and ended the quarter with 476,000. The company has not carried Viacom programming since April of this year.

Suddenlink. The company added 2,200 video customers for the quarter, compared to a loss of 3,200 subs the previous year, even though they have dropped Viacom programming. Revenues increased by 6.6% compared to a year ago.

AT&T. U-verse added 216,000 cable customers for the quarter and added 601,000 data customers. The company now has more than 6 million video customers and 12 million data customers. U-verse profits were up 23.8% compared to a year earlier.

Verizon. The company added 114,000 new video customers and 162,000 new data customers for the quarter. The company now has 5.5 million video customers and 6.5 million data customers.

DirecTV. The company saw a decrease of 28,000 customers for the quarter while revenues grew by 6% due to rate increases. The average satellite bill is up to $107.27 per customer per month.

Netflix. Netflix added 1 million US subscribers and 2 million international subscribers for the quarter. They now have 37 million US customers and almost 16 million international ones. But these growth rates were less than their predictions, and their stock tumbled 25% on the news.

Amazon Prime. The company does not report number of customers. But their earnings release says they gained significant customers even while increasing their annual fee from $79 to $99.

What does all of this mean? Looking at all of the major players who make quarterly releases (companies like Cox do not), one can see that total video subs are down by maybe a net of 100,000 for the quarter. But cord cutting is larger than that number suggests when you consider that the industry used to routinely grow by 250,000 customers per quarter from new households being built. So it looks like cord cutting is running at perhaps 1.5 million customers per year.
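The back-of-the-envelope arithmetic behind that estimate can be made explicit. Treating the figures in the paragraph above as round numbers, the swing from the industry's historical growth to its current decline gives the annualized rate of cord cutting:

```python
# Rough quarterly figures from the discussion above.
net_loss_per_quarter = 100000          # current net decline in video subs
historical_growth_per_quarter = 250000  # growth the industry used to see

# The swing from +250k to -100k per quarter is the real rate of cord
# cutting; multiply by four quarters to annualize it.
cord_cutting_per_year = (net_loss_per_quarter
                         + historical_growth_per_quarter) * 4
print(cord_cutting_per_year)  # → 1400000
```

That works out to about 1.4 million per year, consistent with the rough estimate of perhaps 1.5 million.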

Within these numbers one can’t see the effects of cord shaving. It’s been widely reported that customers are downsizing their cable package as a way to save money. None of these companies report on their mix of types of customers.

Netflix and Amazon Prime continue to grow significantly along with other on-line content providers. It’s been reported that over half of the households in the country pay for at least one of the on-line services and many others watch free content available at Hulu and other sites.

One thing that is obvious is that broadband is still growing for all of the service providers. In fact, Comcast and other traditional cable providers are starting to refer to themselves more as ISPs than as cable companies.

Supreme Court to Rule on Internet Threats

Yesterday the Supreme Court heard oral arguments on the issue of Internet threats. Specifically, the Court is being asked to decide if there is a legal line defining when something posted on the Internet can be perceived as an illegal threat.

The specific case being heard is Elonis v. United States. In the case, Anthony Elonis was convicted for posting graphic fantasies on Facebook about killing his wife and other women he knows. Elonis argues that his posts were meant as rap music lyrics and that the court should view them using a subjective intent standard, meaning it should look at how they were intended. The original prosecutors argued that the court should use an objective standard, meaning it should look at whether a reasonable person would perceive what was written as a threat.

This is a difficult thing for courts to interpret. There have been centuries of case law on verbal threats, where the tone and actions of the person making the threats could be taken into consideration. But it’s difficult to understand the tone, context or intent of something written online.

In judging if the standard should be objective the Court has the difficult task of defining a “reasonable person”. Language that might shock somebody’s grandmother might be very normal for teenagers. We live in a world with extreme subcultures. Groups like rappers, skinheads, white supremacists and even war gamers use language within their sub-culture that might be shocking to anybody else. Should people in these subcultures be prosecuted for using language that is different than the societal norm?

The case happens to be very timely because there is a lot of discussion currently about Internet threats. For example, there has been a lot of coverage of ‘gamergate’, where male war gamers have been harassing and making threats against women who say that the gaming culture is too sexist. And there are numerous cases of on-line bullying where kids are harassed on the net.

The outcome of this case has implications for both the police and for ISPs. If the Supreme Court takes the side of the government in the case they could be opening up the floodgates for prosecutors arresting people for what they say on the Internet.

In this particular case Elonis comes across as an unsavory character. The on-line postings began immediately after he split with his wife. He then harassed a coworker and got fired. The final postings that got him arrested talked about killing a female FBI agent who had questioned him about his postings about shooting up an elementary school.

It certainly appears that Mr. Elonis has a lot of issues. But when the Supreme Court chooses between judging him by a subjective or an objective standard, it will be giving guidance for how other on-line speech should be treated. We are all aware of the phenomenon where people on-line tend to be more aggressive and say things that they would never say in person. There is something about the anonymity of sitting at a keyboard that lets people go farther with speech than they normally would. But should people go to jail for saying something outrageous on the Internet?

There is also an implication in this for ISPs. ISPs are often the first party contacted when somebody is threatened on-line. ISPs have a wide range of responses to such requests. Some do nothing and tell people to contact law enforcement. Others will take action and ban a customer who is harassing somebody else. So ISPs ought to pay attention to this ruling to make sure they understand whether there is going to be a change in the national standard for on-line harassment.

There are many in the country who don’t entirely trust law enforcement to be even-handed. One can envision indictments against what people write on the Internet as an easy way for the police to harass somebody. There certainly is a debate raging in the country about how far a police force ought to be able to go with using military might or in shooting suspects.

On the flip side, there is a lot of on-line harassment. But there is also a lot of real-life harassment that people don’t normally get prosecuted for. I’m sure many of you saw a recent video of a woman who was the subject of multiple catcalls and lewd statements as she walked down city streets.

Our society seems to be reassessing what we find acceptable as normal behavior. Laws often change as a reaction to a change in societal norms. We certainly had times in our past when it was unacceptable to accost women in the street. But we are now at the other extreme, and it would not be unusual for the pendulum to swing back the other way towards less permissiveness.

There is probably nothing harder to deal with in this country than our First Amendment speech rights. Courts have usually protected free speech even when it’s vile and outside the norm. But we also know that threats of any kind often turn into later actions, and this Court is being asked to figure out a national policy on a very tough issue.