Shorts for July

Today I am going to look at a few short topics that I found of industry interest.

Comcast Ordered to Identify Commenter. The Illinois Supreme Court ordered Comcast to disclose the identity of a subscriber who is the target of a defamation suit. That is of interest both to people who make comments on the Internet and to ISPs.

The particulars of the case are that an anonymous poster compared Bill Hadley, who was running for a county board seat, to Jerry Sandusky, who was convicted of child molestation. Hadley has persisted in seeking the identity of the poster through several layers of state courts.

The order is an interesting precedent because if it holds then it means that people cannot hide behind avatars and imaginary names and that they can be held responsible for what they say on the Internet. So Internet trolls, beware. But it also puts ISPs in an awkward position. Your customers want to believe that you will protect their identity and if ISPs have to routinely turn over these kinds of records it will be one more reason for people to not trust their ISP.

Computers Don’t Get Sarcasm. There are companies doing data analytics on comments left on websites like Twitter and Facebook to get an indication about how the country feels about topics in the news. One thing they have discovered is that they are unable to get their computers to understand sarcasm.

This matters because they try to classify what people say on a given topic as positive or negative, and they regularly misclassify sarcastic comments. This is not surprising since the primary purpose of sarcasm is to say a thing one way but mean just the opposite. This has implications beyond big data, because as we move towards having digital assistants that are more sophisticated than Siri, they will have to understand the way we really talk, including the ability to recognize sarcasm. I can picture numerous bad consequences arising from having a smart car, for example, misunderstand a direction because it doesn’t recognize when someone is being sarcastic.
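
To make the problem concrete, here is a tiny sketch of the kind of keyword-based sentiment scoring these analytics efforts start from. The word lists and the sample comment are my own invention, and real classifiers are far more sophisticated, but the failure mode is the same: sarcasm uses positive words to express a negative meaning.

```python
# A minimal keyword-based sentiment scorer, illustrating why sarcasm
# defeats naive classification. Word lists and the sample comment are
# invented for illustration only.

POSITIVE = {"great", "love", "wonderful", "fantastic"}
NEGATIVE = {"terrible", "hate", "awful", "broken"}

def naive_sentiment(comment: str) -> str:
    words = comment.lower().replace(",", " ").replace(".", " ").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# A sarcastic complaint: every keyword is positive, but the meaning is
# clearly negative -- the scorer gets it exactly backwards.
print(naive_sentiment("Oh great, another outage. I just love paying for this."))
# -> positive
```

A smarter system needs context, not just words, which is exactly what today's computers struggle with.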

Cell Phones Can Enable Stalking. Both Microsoft and Apple phones will soon let people track the locations of others. Apple has had an app for a while called Find My Friends, and now Microsoft has developed an app called People Sense. Both of these apps let you follow friends across a map to see their location, and soon they will get more sophisticated and let you initiate communication with people just by touching their dot on the map.

I can understand why some people find this useful, and it could be a good way to keep track of your teenagers. But I find these applications disturbing for several reasons. First, it means that it will be even easier for your cell phone provider to keep tabs on where you go. I am further disturbed by the lack of privacy and wonder if people are ever going to have real privacy in the future as these kinds of apps become widespread. Finally, I can imagine hackers sneaking these applications onto phones and then stalking somebody in real time. I know that this is the sort of thing that smartphones can be good at, but that doesn’t mean that it’s a good idea.

Court Rules that Butt-Dials are Not Private. Almost everybody with a smartphone has butt-dialed a call from time to time. I have one business associate who seems to butt-dial me fairly often. Good etiquette is to hang up when somebody butt-dials you and not listen to what is happening at the other end.

But the Sixth Circuit Court of Appeals just ruled that people have a right to listen to a call made in this manner. The case in question involved a person who butt-dialed somebody and then talked for 90 minutes about firing the boss of the person who was called. The person who received the call recorded part of the call.

The court rejected the argument that the person who made the accidental call had any expectation of privacy. Instead, the court said that it’s up to people to implement apps, lock their phones, or in some other way make sure that they don’t make such calls.

This is obviously an important lesson for anybody. Before you have a conversation about anything sensitive you ought to check to make certain your phone isn’t broadcasting your conversation.


What is 5G?

The International Telecommunication Union (ITU) has created an official plan to bring 5G data to the market by 2020. So what is 5G and how does it differ from 4G? The goal of 5G is to increase the data capacity of cell sites, reduce latency, and increase the distance that can be served from a cell site, all by building a wireless data path with built-in intelligence that can maximize the data delivery to a given handset or device. There is no specific bandwidth goal in the 5G plan, but it’s assumed to be a lot faster than today’s 4G networks.

But there are a lot of challenges to overcome to get to that future vision. Delivering more bandwidth is going to require more spectrum. Every slice of spectrum in use has limitations imposed by physics, and since today’s spectrum is already stressed, achieving 5G will mean adding more bands of spectrum into the cellular network.

It looks like the ITU is depending upon using both existing WiFi spectrum as well as a lot of higher frequencies that are not in use today. I’ve recently written about the wireless industry’s hope of poaching existing WiFi spectrum and I hope the FCC stops that attempt in its tracks. If 5G is ever allowed to use WiFi then that spectrum will quickly become cellular spectrum that won’t be useful for anything else.

There is a lot of development work to be done to use higher frequencies, particularly for handsets. The higher the frequency used, the bigger the challenge of holding a connection with a non-fixed receiver like a handset. Even if those issues are solved, the higher frequencies being considered inherently travel very short distances, and so the higher-frequency portion of 5G will only benefit those very close to a cell site. This might be a great solution inside of a convention center, but not so much in the outside world.
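
The distance problem falls straight out of the standard free-space path loss formula, which grows with frequency. The comparison below is purely illustrative (the 28 GHz figure is one of the millimeter-wave bands often discussed, not part of any settled 5G plan):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB using the standard formula.
    Real-world losses at millimeter-wave frequencies (rain, foliage,
    walls) are higher still."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare a typical 4G band against a millimeter-wave candidate at the
# same 1 km distance (frequencies are illustrative).
for f in (1900, 28000):  # MHz: PCS band vs 28 GHz
    print(f"{f/1000:>5.1f} GHz at 1 km: {fspl_db(1.0, f):.1f} dB")
```

The roughly 23 dB gap means the higher-frequency signal arrives more than a hundred times weaker over the same distance, which is why those bands only make sense very close to a cell site.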

What all of this means is that a 5G network is going to require a lot more cell sites packed closer together than today’s network. That has a lot of implications. First, it means a lot more investment in towers or in mini-cell sites of some type. But it also means a lot more fiber to feed the new cell sites. And those two factors together mean that any 5G solution is likely to be an urban solution only, or a suburban solution only for those places where a lot of users are packed tightly together. No wireless company is going to invest in a lot more 5G towers and fiber to cover suburban housing sprawl and certainly nobody will invest in the technology in rural areas.

We already have a cellular wireless divide today with urban areas getting pretty decent 4G and rural areas with 3G and even some 2G. Expect that gulf to become greater as high-bandwidth technologies come into play. This is the big catch-22 of wireless. Rural jurisdictions have always been told to wait a while and not clamor for fiber because there will eventually be a great wireless solution for them. But nobody is going to invest in rural 5G any more than they have invested in rural fiber. So even if 5G is made to work, it’s not going to bring a wireless solution to anywhere outside of cities.

I’ve read a number of technologists who are skeptical about the targeted 2020 date for 5G, but it’s the nature of progress to set aggressive goals in order to goad improvement. Still, when you look at all of the issues that must be resolved to implement 5G, 2020 looks unrealistic.

Instead, what is likely to happen is that the carriers will implement some pieces of 5G over time as each technological challenge is solved. This means we are likely to see a whole series of incremental upgrades over the next decade rather than one big flash-cut to a faster data network. This will provide numerous marketing opportunities and I would expect that by the time that the ITU’s version of 5G is fully implemented we will be calling it 10G. After all, we are still a long way from meeting the original specification for 4G, which was to implement 100 Mbps data speeds for a moving user in a car and 1 Gbps for a stationary user. Even the planned 5G isn’t going to do that.

Is the Internet a Necessity?

In a recent speech, FCC Commissioner Michael O’Rielly said that the Internet is “not a necessity in the day-to-day lives of Americans.” That’s a rather startling statement from somebody who seemingly has the job of making sure that the country has adequate broadband to meet its needs. But if we look at his statement in context, it raises some important policy issues that are worth public discussion.

O’Rielly made the comment as a counterargument to the spreading concept that access to the Internet has become a necessity, perhaps even a right. It’s also widely expressed today that broadband is now a utility, much like electricity and water.

It’s an interesting discussion. Several surveys in the last few years show that a significant majority of households rank Internet access as the most important service purchased for their homes. I certainly know that my daughter and all of her 16-year-old friends would ‘die’ without the Internet, and it seems like the younger you are, the more the Internet is an important component of daily life.

But O’Rielly’s comment was really a political and policy statement. There are certainly a lot of implications for governments if the country adopts the idea that having Internet access is a right. For instance, that would put a lot more pressure to bring Internet access to the places that don’t yet have it and to work hard to close the gap in the quality of the Internet between urban and rural places.

But it seems to me that the FCC has largely already bought into the argument that the Internet is a necessity. They are pouring billions of dollars into improving rural broadband. They are going to subsidize broadband access for low-income households. They have adopted net neutrality as a policy which, to some degree, protects the ability of consumers to get what they pay for from an ISP. These all seem like the actions of an agency that thinks everybody ought to have access to broadband.

FCC Chairman Tom Wheeler responded to O’Rielly’s statement by saying that “broadband is the defining infrastructure of the 21st century. We should not and will not let up on our policies that make broadband more available.”

It’s obvious that Internet access is now a fundamental part of daily life for many people. I work from my home and I can’t imagine how I would function without it. Actually, I can imagine it, because after a hurricane and tornado hit me a few years ago I was without power and Internet access for 6 weeks. I basically regressed to what felt like the Stone Age. I essentially threw my hands up and gave up on work (and spent the time instead cleaning up the huge mess the storm left behind). I use the Internet almost continuously in making my living and as a society we have grown to a place where there is no realistic substitute for email and the ability to quickly exchange files and work products with others.

This is an issue that hundreds of municipalities are wrestling with. Communities look in envy at urban places that have great Internet bandwidth and they understand that if they don’t have adequate Internet in their community that they are liable to decline economically and fade away from relevance. Internet access is to cities today what the railroads were two centuries ago, and what electricity and Interstate highways were in the last century. Put into that context it starts feeling a lot like a necessity, at least at the community level.

I work with dozens of rural communities that have limited or no Internet access today. It’s heart-wrenching to hear people talk about trying to maintain a household of teenagers with only a cellular wireless plan or to hear parents lament that their kids can’t keep up in school without access to the Internet. For the vast majority of us who have Internet access it’s really hard to imagine going without.

I understand where Commissioner O’Rielly is coming from. He was formerly a Republican congressional aide and the Republicans feel generally that there are few ‘rights’ that the Federal government is obligated to recognize. But on this specific topic he might be on the wrong side of history, because my guess is that the vast majority of people in this country have grown to believe that having Internet access is a right and is something they cannot live without.

Can Cable Networks Deliver a Gigabit?

Time Warner Cable recently promised the Los Angeles City Council that they could bring gigabit service to the city by 2016. This raises the question – can today’s cable networks deliver a gigabit?

The short answer is yes, they are soon going to be able to do that, but with a whole list of caveats. So let me look at the various issues involved:

  • DOCSIS 3.1: First, a cable company has to upgrade to DOCSIS 3.1. This is the latest technology from CableLabs that lets cable companies bond multiple channels together in a cable system to be able to deliver faster data speeds. This technology is just now hitting the market and so by next year cable companies are going to be able to have this implemented and tested.
  • Spare Channels: To get gigabit speeds, a cable system is going to need at least 20 empty channels on their network. Cable companies for years have been making digital upgrades in order to cram more channels into the existing channel slots. But they also have continued demands to carry more channels which then eats up channel slots. Further, they are looking at possibly having to carry some channels of 4K programming, which is a huge bandwidth eater. For networks without many spare channels it can be quite costly to free up this much empty space on the network. But many networks will have this many channels available now or in the near future.
  • New Cable Modems: DOCSIS 3.1 requires a new, relatively expensive cable modem. Because of this, a cable company is going to want to keep existing data customers where they are on the system and use the new swath of bandwidth selectively for the new gigabit customers.
  • Guaranteed versus Best Effort: If a cable company wants to guarantee gigabit speeds then it can’t have too many gigabit customers at a given node. This means that as the number of gigabit customers grows they will have to ‘split’ nodes, which often means building more fiber to feed the nodes plus an electronics upgrade. In systems with large nodes this might be the most expensive part of the upgrade to gigabit. The alternative is a best-effort product that is only capable of a gigabit at 3:00 in the morning when the network has no other traffic.
  • Bandwidth to the Nodes: Not all cable companies are going to have enough existing bandwidth between the headend and the nodes to incorporate an additional gigabit of data. That will mean an upgrade of the node transport electronics.
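
The spare-channel requirement above can be sanity-checked with a back-of-the-envelope calculation. The per-channel throughput figures below are approximations commonly cited for DOCSIS (roughly 38 Mbps usable from a 6 MHz 256-QAM channel, and somewhat more per 6 MHz equivalent with DOCSIS 3.1 OFDM), not exact numbers from any spec:

```python
import math

# Back-of-the-envelope: how many 6 MHz channel slots does a gigabit
# tier consume? Per-slot throughputs below are rough approximations.
DOCSIS30_QAM256_MBPS = 38   # usable rate of one 6 MHz 256-QAM channel
DOCSIS31_OFDM_MBPS = 50     # rough 6 MHz equivalent with higher-order OFDM

def slots_needed(target_mbps: float, per_slot_mbps: float) -> int:
    """Number of 6 MHz channel slots needed to reach a target speed."""
    return math.ceil(target_mbps / per_slot_mbps)

print(slots_needed(1000, DOCSIS30_QAM256_MBPS))  # 27 slots under DOCSIS 3.0
print(slots_needed(1000, DOCSIS31_OFDM_MBPS))    # 20 slots under DOCSIS 3.1
```

That is where the "at least 20 empty channels" figure comes from: even with DOCSIS 3.1's better spectral efficiency, a gigabit eats around 20 channel slots.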

So the answer is that Time Warner will be capable of delivering a gigabit next year as long as they upgrade to DOCSIS 3.1, have enough spare channels, and as long as they don’t sell too many gigabit customers and end up needing massive node upgrades.

And that is probably the key point about cable networks and gigabit. Cable networks were designed to provide shared data among many homes at the same time. This is why cable networks have been infamous for slowing down at peak demand times when the number of homes using data is high. And that’s why they have always sold their speeds as ‘up to’ a listed number. It’s incredibly hard for them to guarantee a speed.

When you contrast this to fiber, it’s relatively easy for somebody like Google to guarantee a gigabit (or any other speed). Their fiber networks share data among a relatively small number of households and they are able to engineer to be able to meet the peak speeds.
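
The difference between the two models comes down to how many homes share the same bandwidth pool. A simple illustrative calculation, with every number assumed for the sake of the example:

```python
# A rough look at why node size matters for guaranteed speeds: divide
# the bandwidth feeding a node across concurrently active subscribers.
# All numbers here are illustrative assumptions, not real network data.

def peak_speed_mbps(node_capacity_mbps: float, homes: int,
                    active_fraction: float) -> float:
    """Average bandwidth per home when `active_fraction` of the node's
    homes are pulling data at the same time."""
    return node_capacity_mbps / max(1, round(homes * active_fraction))

# A 5 Gbps pool shared by a 400-home cable node versus a 32-home
# fiber segment, with a quarter of homes active in each case:
print(peak_speed_mbps(5000, 400, 0.25))  # 50.0 Mbps per active home
print(peak_speed_mbps(5000, 32, 0.25))   # 625.0 Mbps per active home
```

The arithmetic is why cable companies must split nodes to guarantee speeds, while a fiber provider with small sharing groups can engineer for the peak from the start.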

Cable companies will certainly be able to deliver gigabit speeds. But I find it unlikely that, for a while at least, they are going to price it at $70 like Google or try to push it to very many homes. There are very few, if any, cable networks that are ready to upgrade all or even most of their customers to gigabit speeds. There are too many chokepoints in their networks that cannot handle that much bandwidth.

But as long as a cable network meets the base criteria I discussed they can sell some gigabit without too much strain. Expect them to price gigabit bandwidth high enough that they don’t get more than 5%, or some similar penetration of customers on the high bandwidth product. There are other network changes coming that will make this easier. I just talked last week about a new technology that will move the CMTS to the nodes, something that will make it easier to offer large bandwidth. This also gets easier as cable systems move closer to offering IPTV, or at least to finding ways to be more efficient with television bandwidth.

Finally, there is always the Comcast solution. Comcast today is selling a 2 gigabit connection that is delivered over fiber. It’s priced at $300 per month and is only available to customers who live very close to an existing Comcast fiber. Having this product allows Comcast to advertise as a gigabit company, even though this falls into the category of ‘press release’ product rather than something that very many homes will ever decide to buy. We’ll have to wait and see if Time Warner is going to make gigabit affordable and widely available. I’m sure that is what the Los Angeles City Council thinks they heard, but I seriously doubt that is what Time Warner meant.

Unbundling the Broadband Networks

The Canadian Radio-television and Telecommunications Commission (CRTC) has ordered that large telecom companies, both telcos and cable companies, must unbundle the last mile of their network and make the facilities available to competitors.

With this ruling the CRTC has said that competition and choice are important. It was a surprising ruling because all of the telecom companies had filed comments stating that forced unbundling would be a disincentive for them to build expensive fiber facilities to homes and businesses.

This ruling was only the first step; the processes and procedures needed to accomplish unbundling still need to be worked out. It’s estimated that perhaps the first unbundled connections will be available to competitors by the end of 2016.

This ruling applies to both fiber and coaxial networks and will apply to the larger providers like BCE (Bell Canada Enterprises) as well as to the two biggest cable companies, Rogers Communications and Shaw Communications. But the biggest impact is expected to be on BCE which has invested heavily in fiber to both businesses and residences.

The CRTC said that this was the only path they saw towards competition since the cost of building duplicate fiber networks was expensive and not likely to happen.

We know something about unbundling in this country. The Telecommunications Act of 1996 ordered large US telcos to unbundle their copper networks and make them available to competition. This promoted the explosion of CLECs in the late 90s, but the use of unbundled copper largely died when many of the CLECs formed during that period imploded during the telecom crash in the early 00s.

But the FCC in this country has never required unbundling of fiber. In fact, the 1996 Act removed the unbundling requirement as soon as a telco replaced copper with fiber. The Act did require the unbundling of dark fiber (fiber sold without electronics), but as is typical in this country, the telcos chipped away at that requirement to the point where it became incredibly difficult for a competitor to get access to telco dark fiber.

Our experience in this country is that the large companies comply with such requirements only reluctantly, and here they put as many roadblocks as they could in the way of competitors. The telcos here required difficult paperwork for every step of the process and dragged their feet as much as possible any time they worked with a competitor. There is a famous rumor in the industry that the work space at one of the large US telcos that dealt with unbundling had a large sign reading “Delay, Delay, Delay”. Too bad this was before cellphone cameras, because several reputable industry people swear this is true.

The idea of unbundling active fiber is an interesting one. Certainly if a competitor could get access to fiber affordably they could offer an alternate suite of products and bring both product and price competition into the network.

The idea of unbundling a cable company’s coaxial network is not as easy to contemplate. Coaxial cables are arranged so that there is not a unique cable for each customer. At the pole each customer is added into the same data and cable TV transmission path as everybody else in their neighborhood. It’s hard to think of a neat technical way to unbundle anything in an HFC network. It might be possible to unbundle the data path, but this is also shared through most of the network. It will be interesting to see how the CRTC deals with the technical issues.

Obviously competitors here will keep an eye on the Canadian experiment to see how it progresses. There has been no cry here for unbundling of fiber networks, but if there was such a ruling I think it would enable a raft of new competitive providers and would bring real competition into the duopoly networks we have in most US markets. Certainly the US suffers from the same duopoly competition that drove Canada to make this ruling.

Remember the Human Equation

I took a lot of economics courses in college – not quite enough to get a degree, but enough to keep me interested today in keeping track of how economists view the world. One thing that economists have always been trying to do is to build economic models that predict how people act in the real world.

Recently the World Bank issued a new report in a series of what it calls World Development Reports, and it suggests that economists are still not accurately predicting some key human behavior in their modeling. It mentions three areas where economists need to improve their models. I found these three areas interesting, because these are also types of behavior that any good salesperson knows very well. The report reminded me that it’s as important for salespeople as it is for economists to keep the human equation in mind. The report said that economists need to do better at reflecting the following three things:

The first principle is that all people think automatically. Automatic thinking means that people are often intuitive and impulsive. This differs from a lot of economic models that assume that people are logical and deliberative when making buying decisions. Certainly some people and businesses make deliberate buying decisions. But the real world is full of examples of things we all do that are not logical. Perhaps one of the most common examples is how we save for our retirements. I don’t think you can find anybody who doesn’t understand that saving for retirement is really important. Yet a majority of people still don’t take the steps needed to be ready for retirement.

And every good telecom salesperson knows that buying decisions are often made on impulse or based upon emotional factors and are not always fully logical. When somebody changes telecom providers they generally do so somewhat blindly and based upon trust. They really hope that the quality of the service or the level of customer service will be better with the new provider than it was with the old provider. And so they often make an emotional decision to change based upon something they don’t like about their old provider—perhaps a negative billing issue or customer service experience.

The second finding in the Development Report is that humans think socially. This means that they often make decisions based upon either pleasing others or conforming to what other people think. By contrast, economic models generally assume that people make decisions based upon their own selfish best interest. This finding isn’t as relevant to telecom buying as the other two items, but salespeople still see it in the market. For example, when selling to people with kids it’s a lot easier to make the sales pitch based upon what’s good for the kids rather than what’s good for the parents.

The third principle is that people often think using mental models. For example, people might identify themselves as part of a larger group and make decisions based upon that identity. Young urban millennials, for instance, are now a very hard sell for traditional cable TV. Once somebody is part of that particular culture they often make many buying decisions based upon the peer pressure of their friends. They might not buy a car and instead use Uber, and they might not buy traditional telecom services, relying completely on their cellphone and other people’s WiFi.

It is possible to break a group identity mindset, but it must be done deliberately. For example, many elderly people are of the mindset that technology is beyond them, and so they are immune to any normal sales pitch you might make to them. But if you take the time to show them what technology might do for them and let them know that there is training and help available for learning to use the Internet, then they can become good customers.

I build a lot of business plans and every client who is thinking about building a new network always wants to know what their market penetration rate is going to be. That’s an easy thing to predict if you build in an area that doesn’t have broadband, because most people in that situation will buy what you have as long as it’s affordable.

But it’s a lot harder to predict market penetration when building to a market that already has broadband. Predicting the take rates in existing markets requires understanding the human equation. Here are a few of the things that I tell people, based upon the experience of having seen hundreds of market launches:

  • If you sell residential broadband you are almost always going to get at least 20% and maybe as much as 30% of the market rather easily as long as you have a decent price and as long as your product works well. It seems that in every market there are at least that many people who just can’t stand the incumbents and who will leap to a new competitor. And if you do a good job you will generally keep these customers.
  • But after this first easy pile of customers, how many customers you get is going to depend upon how good you are at selling. And selling means understanding the market and understanding the human equation. I generally see that companies that sell based upon having a good story to tell will do better than companies that try to sell on price alone. A customer that buys from you due to a low price will also drop you when they find a better price elsewhere. But if you can instead show them that there are reasons other than price to use you, then you have a chance of building a loyal customer base.

Interestingly, almost all businesses buy based upon reliability, and not price. Business customers know how badly they suffer when their voice or Internet service is down and so they care about the reliability of your network first and foremost. So selling telecom to businesses is something that meets existing economists’ models well because most businesses will choose a telecom provider deliberately and logically. It’s easy to build models to predict business penetrations, because if you do a good job and you are willing to put knowledgeable salespeople on the street, they will be successful over time.

New CableLabs Standard will Improve Cable Networks

CableLabs just announced a new set of specifications that is going to improve cable HFC networks and their ability to deliver data services. They announced a new distributed architecture that they are calling the Converged Cable Access Platform (CCAP).

This new platform separates functions that have always been performed at the headend, which is going to allow for a more robust data network. Today, the cable headend is the place where all video is inserted, where all cable management is done, where the QAM modulation and RF modulation are performed, and most importantly where the CMTS (cable modem termination system) function is handled.

The distributed CCAP allows these functions to be separated and geographically distributed as needed throughout the cable network. The main benefit of this is that a cable operator will be able to push pure IP to the fiber nodes. Today, the data path between the headend and the neighborhood nodes needs to carry two separate paths – both a video feed and a DOCSIS data feed. By moving the CMTS and the QAM modulators to the fiber node the data path to the node becomes a single all-IP path that contains both IPTV and IP data. The new CCAP node can then convert everything to RF frequencies as needed at the node.

We’ve been expecting this change, since Chinese cable networks have implemented distributed network functions over the last few years. Probably one of the biggest long-term potentials for this change is that it sets the stage for a cable company to offer IPTV over DOCSIS frequencies, although there is more development work to be done in this area.

There are several immediate benefits to a cable system. First, this improves video signal strength, since the TV signals now originate at the neighborhood nodes rather than back at the headend. This will be most noticeable to customers who are currently at the outer fringes of a cable node. The change also will boost the overall amount of data delivered to a neighborhood node by 20–40%. It’s not likely this means faster speeds; instead it will provide more bandwidth for busy times and make it less likely that customers lose speed during peak hours. Finally, it means that a cable company can get more life out of existing cable nodes and will be able to wait longer before having to ‘split’ nodes to provide faster data to customers.

Cable companies are not likely to rush to implement this everywhere. It would mean an upgrade at each node and most cable companies have a node for every 200–400 customers—that’s a lot of nodes. But one would think this will quickly become the standard for new nodes and that cable companies will implement it over time into the existing network.

This is the first step of what is being called the IP transition for cable companies. Most of my readers are probably aware that the telcos are working feverishly towards making a transition to all-IP. But cable companies are going to want to do that for a different reason. There is a huge amount of bandwidth capability on coaxial cable and if the entire cable network becomes IP from end-to-end then the huge data capacity in the cable network would be realized. Today cable companies use a broadcast system where they send all cable channels to every home and they then provide data services on whatever bandwidth is left. But in an all-IP system they would only send a customer the channels they are watching, meaning that most of the bandwidth on the system would be available for high-speed Internet services.
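
The bandwidth an all-IP system would free up can be sketched with some rough numbers. The channel count, per-stream rate, and plant size below are all illustrative assumptions, not figures for any real cable system:

```python
# Rough comparison of spectrum spent on video under today's broadcast
# model versus an all-IP unicast model. All figures are illustrative
# assumptions, not measurements of any real cable plant.

SLOT_MBPS = 38      # assumed usable rate of one 6 MHz 256-QAM slot
TOTAL_SLOTS = 125   # assumed ~750 MHz plant divided into 6 MHz slots

def video_load_mbps(broadcast_channels: int, watching_streams: int,
                    per_stream_mbps: float = 6.0):
    """Return (broadcast, unicast) video load in Mbps."""
    broadcast = broadcast_channels * per_stream_mbps  # send every channel, always
    unicast = watching_streams * per_stream_mbps      # send only what is watched
    return broadcast, unicast

bcast, ip = video_load_mbps(broadcast_channels=300, watching_streams=120)
capacity = SLOT_MBPS * TOTAL_SLOTS
print(f"plant capacity ~{capacity} Mbps")
print(f"broadcast video load {bcast:.0f} Mbps, all-IP video load {ip:.0f} Mbps")
```

Under the broadcast model the video lineup consumes its full load around the clock; once video is unicast, the load shrinks to only the streams actually being watched, and the rest of the plant's capacity becomes available for high-speed Internet.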

So think of this as the first step in a transition to an all-IP cable network. There are a number of additional steps needed to get there, but this pushes IP out to the neighborhood nodes and starts the transition.

Taxing the Internet

Starting on September 1, Chicago is trying something new and will be adding a 9% tax onto almost every service provided on-line. The city, along with the state of Illinois, is having huge budget problems and they are obviously leaving no stone unturned in looking to fill the tax coffers. But Chicago is the first local jurisdiction in many years that is trying to tax Internet-based services, something that will have wide repercussions.

So what are they trying to tax? The short answer is every information service that uses the Internet. For instance, the tax would apply to services that provide searchable databases—things like the LexisNexis system used by lawyers to find legal cases and precedents. As the data we use moves to the web this is a huge source of potential revenue for the city. Consider all of the services around today that charge people to access data. The Ordinance lists services like access to consumer credit reports, real-estate listings, car prices, stock prices, economic statistics, weather statistics, job listings, resumes, company profiles, consumer profiles, marketing data—any information or data that has been compiled, entered, and stored in a provider’s computer and then sold to others. The tax is also going to apply to taxable leases of personal property that include “cloud computing, cloud services, hosted environment, software as a service, platform as a service, or infrastructure as a service.”

This tax does not apply to buying things over the Internet; it is not a sales tax on tangible goods, so it would not apply, for instance, to the physical products bought from Amazon. It would instead apply to companies like Netflix and Spotify and any other web service that sells electronic products. It would be up to the companies selling the on-line services to collect the tax and remit the revenues to Chicago.

Obviously this new law will be challenged because it taxes a whole lot of things for the first time. It will also be interesting to see if the law infringes on the protections provided several times by Congress in the Internet Tax Freedom Act, as well as multiple times by the FCC, most recently as part of the Net Neutrality ruling.

But the city might have found a clever loophole. They are not taxing Internet access, but rather access to information and information services that happen to be stored somewhere else and then delivered over the Internet. It will be up to the courts to sort out that nuance (or for Congress to pass a new law that is more specific).

One has to think that this law is very bad for businesses in Chicago. A 9% tax on anything is significant. Businesses spend huge amounts of money today on access to on-line databases and on cloud-based services as they move their own information to the cloud. In effect, this law would tax companies for accessing their own data when they choose to store it somewhere other than at their own business. I would not be surprised if this law drives businesses that spend heavily on such IT functions out of the city.

This also directly affects most people who live in the city. Almost everybody who has an Internet connection today buys some service over the web, be that a movie service like Netflix or Amazon Prime or a music service like Spotify or Apple.

This kind of tax potentially adds a lot of cost for on-line service providers. Every town, county and state in the country has a different basis for assessing sales and service taxes like this one, and so this is going to require companies like Spotify to incorporate the tax assessment and collection process when they sell a subscription – something they don’t do today.
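As a sketch of what that collection burden looks like, here is the simplest possible jurisdiction lookup. The 9% Chicago rate comes from the ordinance; the other entry and the keying by city name are made up for illustration, and a real tax engine has to handle thousands of jurisdictions with far messier rules:

```python
# Minimal sketch of the jurisdiction-lookup problem an on-line seller faces.
# The 9% Chicago rate is from the ordinance; everything else here is a
# made-up placeholder, and real tax engines key off much more than a city.
from decimal import Decimal

SERVICE_TAX_RATES = {
    "Chicago, IL": Decimal("0.09"),   # the new 9% tax on on-line services
    "Springfield, IL": Decimal("0"),  # hypothetical: no local service tax
}

def invoice_total(billing_city: str, subscription_price: Decimal) -> Decimal:
    """Return the price plus any local service tax for the billing address."""
    rate = SERVICE_TAX_RATES.get(billing_city, Decimal("0"))
    return (subscription_price * (1 + rate)).quantize(Decimal("0.01"))

print(invoice_total("Chicago, IL", Decimal("9.99")))      # 10.89
print(invoice_total("Springfield, IL", Decimal("9.99")))  # 9.99
```

The hard part in practice is not the arithmetic but keeping thousands of jurisdiction rules current and proving where a subscriber actually lives.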

One would think that there will be a lot of avoidance of such a tax. It’s not hard for a business with multiple locations to be billed from a location that doesn’t get the tax. And since most on-line services don’t verify people’s addresses, somebody living in Chicago could most likely avoid these fees just by telling Spotify that they live somewhere else. It’s hard to think that the city is ever going to be able to dig deep enough into on-line transactions to audit this.

But the real issue is not how the people in Chicago will deal with this. I am sure people and businesses there will take steps to avoid the new taxes if possible. The bigger issue is that other localities will copy the Chicago example if this holds up in court. There is an old maxim that politicians have never seen a tax they don’t like, and so it’s not hard to foresee this tax spreading all over the country. That is going to add cost to the on-line services we buy today, and since more and more things are migrating to the cloud it will only become more significant over time.

Court Jumps into OTT Fray

In a really surprising ruling, a federal judge has ruled that FilmOn X should be able to get access to local network programming like a cable TV company. US District Court Judge George Wu ordered that FilmOn X be treated like a cable company, entitled to retransmit broadcasters’ content.

For those not familiar with FilmOn X, check them out on the web. They have a huge amount of on-line content that includes local TV from around the world as well as 600 other channels. There is a little bit of everything, from non-traditional sports to music from around the world to channels on almost any topic you can imagine. They also carry a mountain of for-pay video-on-demand content that ranges from music to Major League Baseball. All of the free content is ad-supported, and viewers can also create their own channels.

FilmOn X also had its own version of the Aereo model, offering a premium subscription in a few markets that gave customers access to 120 HD channels on any computer or smartphone through the use of a dongle. Just like Aereo, this was done from antenna farms.

The company has been in a battle with the major networks in the US since its inception. The company began carrying the local networks on the Internet in 2010. In 2011 they were ordered by a court to stop the practice. But in 2012, the local channels were all allowed back onto the system through a federal appeal and FilmOn X carried local content on its broadcast dongle product. But in 2013 the US District Court of the District of Columbia issued a nationwide injunction against the antenna service.

This latest ruling overturns that injunction and seemingly gives FilmOn X the same right to content as a cable company. Obviously this is going to be appealed further and one has to doubt that the networks are going to negotiate retransmission agreements with the company while the appeals are still being fought in court.

But the case raises serious questions. Although addressing a different set of issues than the Aereo case, it still sets up conflicting district court decisions. Aereo had taken the legal tactic of dancing around the issue of whether they were a cable company by concentrating on the issue of copyright infringement. FilmOn X took a more direct legal approach and argued that they had the rights to rebroadcast the content as a cable company. And apparently the court bought it.

Realistically nothing is going to happen in the area of on-line content until the FCC decides where it wants to go with this. Recall that in January of this year the FCC opened a Notice of Proposed Rulemaking to look at the issue of on-line content. FilmOn X was mentioned several times in that document, and the FCC is asking if on-line companies can have the same rights as cable companies to get content.

The FCC can put all of these lawsuits to rest by defining the rights, or lack of rights, of on-line providers. It’s fairly clear in reading the NPRM that the FCC has a bias towards allowing content on-line and is probably seeking a legal way to do that since they are required to follow the various cable laws that have been passed by Congress.

It’s hard to think that on-line content providers are ever going to be able to comply with all of the rules included in the current cable regulations. Those rules very rigidly define tiers of programming. They also define the retransmission process whereby cable companies can rebroadcast local content. But there are a ton of other requirements that range from closed captioning to emergency alert systems that also apply to cable companies. It’s going to be a challenge to give just a few of these rights to on-line providers while making cable providers continue to comply with all of the rules.

For now this is just one more confusing entry in the string of court rulings that has defined the on-line broadcast industry so far. There have been several conflicting rulings in earlier cases involving Aereo and FilmOn X that muddy the legal waters for the business model. But this is something the general public very much wants, and traditional cable will be in a lot of trouble if local content ends up on the Internet. It is that content, along with sports, that is the primary driver behind maintaining the cable companies’ grip on customers.

Control of the Future Voice Network

The FCC is looking at how to transition from the traditional TDM-based PSTN to an all-IP telephone network. A number of carriers have submitted proposals laying out their vision of an all-IP network. Today’s blog looks at AT&T’s vision of the future IP network.

AT&T has proposed to the FCC that there be a handful of major switching hubs created in the country. Every carrier would then send their voice traffic to these hubs to be sorted and handed back to the various terminating carriers. Their argument is that the whole idea behind the IP transition is that the network be made as efficient as possible; they are promoting this idea as the simplest network to implement.

But is it the most efficient? Over the years I’ve done a lot of traffic engineering, meaning that I’ve helped companies analyze where their voice traffic comes from and goes to. What I’ve seen is that approximately 80% of voice traffic for most companies stays in a relatively small circle of perhaps 60 miles. This distance can vary a bit by company depending on how far away they might be from a major metropolitan area, but this basic traffic rule seems to apply pretty much everywhere I’ve looked.

So let me first look at the practical application of what AT&T is proposing. One would have to assume that if there were only a handful of nationwide interconnection points, they would be put in the same places as the major Internet hubs – Atlanta, Chicago, Washington DC, New York City, Dallas, etc. What this idea means is that states not near those hubs – say Montana, Nebraska, or Maine – would have to ship all of the voice traffic from their state to the nearest big hub and back again.

While it might be more efficient to have only a few hubs, it certainly would not be efficient from a transport perspective. Somebody has to pay for all of that transport to and from the main hubs, and that is the real purpose behind the AT&T proposal: under it, every carrier other than AT&T would pay to bring all of its traffic to these main hubs. That is a major change from the way the industry works today.
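The transport argument is easy to illustrate with the 80%-within-60-miles rule of thumb from my traffic studies above. The 500-mile haul to the nearest hub is my own assumed figure for a carrier in a state like Montana:

```python
# Rough transport-miles comparison. The 80%-within-60-miles split is the
# rule of thumb from traffic engineering; the 500-mile hub distance is an
# assumed figure for a carrier far from a major Internet hub.

LOCAL_PCT = 80       # percent of calls that terminate within the local area
LOCAL_MILES = 60     # assumed average haul for a locally exchanged call
HUB_MILES = 500      # assumed one-way distance to the nearest national hub

def avg_haul_local_interconnect() -> float:
    # Local calls hand off nearby; only the distant 20% rides to the hub.
    return (LOCAL_PCT * LOCAL_MILES + (100 - LOCAL_PCT) * HUB_MILES) / 100

def avg_haul_hub_only() -> float:
    # Under the hub-only proposal, every call rides to the hub and back.
    return 2.0 * HUB_MILES

print(f"Average haul, local interconnection: {avg_haul_local_interconnect():.0f} miles")
print(f"Average haul, hub-only proposal:     {avg_haul_hub_only():.0f} miles")
```

Whatever the real distances turn out to be, hauling every local call hundreds of miles to a hub and back multiplies transport mileage for the great majority of traffic.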

Today there are two different sets of transport arrangements—one for regulated telcos and one for competitive CLECs and other kinds of carriers. Regulated companies today provide joint trunking between each other. For instance, if there is a fiber route between AT&T and another telco, AT&T generally owns the part that is within their territory and the other telco owns the part in their own territory. Sometimes this is negotiated differently, but under this arrangement both sides bear some of the cost of carrying voice traffic.

CLECs and other competitive carriers have a different situation. CLECs are allowed by the Telecommunications Act of 1996 to establish a point of interface (POI) at any technically feasible spot within a LATA (or region). Once that point is established, the CLEC is responsible for all costs on their side of the POI and AT&T (or Verizon or CenturyLink) is responsible for the costs on the other side of the POI.

AT&T’s suggested new network gets rid of both of these arrangements and instead moves all of the points of interconnection to a small handful of locations, and in doing so shifts 100% of the cost of the transport onto other carriers.

In a further money grab, AT&T would (as the assumed owner of these large hubs) charge a fee to other carriers for handing traffic from one carrier to another. These fees exist today and are called transit fees. But today transit fees are charged on a relatively small percentage of voice calls, since no fees are charged under the jointly-owned arrangements I described above. Under the new arrangement there would instead be a transit fee charged for every call that is handed from one carrier to another.
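Some illustrative arithmetic shows how this expands the fee base. The per-minute rate and the traffic volume here are invented for the example; only the shift from charging a minority of calls to charging every handed-off call comes from the proposal:

```python
# Illustrative arithmetic on the transit-fee expansion. The rate and the
# traffic volume are invented; only the shift from charging a minority of
# calls to charging every handed-off call comes from the AT&T proposal.

TRANSIT_DOLLARS_PER_1000_MIN = 1      # assumed: $1 per 1,000 minutes
MONTHLY_MINUTES = 10_000_000          # assumed carrier-to-carrier minutes

PCT_TRANSITED_TODAY = 15              # assumed: most traffic uses joint trunks
PCT_TRANSITED_HUB = 100               # proposal: every handed-off call pays

def monthly_transit_cost(pct_transited: int) -> float:
    transited_minutes = MONTHLY_MINUTES * pct_transited // 100
    return transited_minutes / 1000 * TRANSIT_DOLLARS_PER_1000_MIN

print(f"Transit cost today:         ${monthly_transit_cost(PCT_TRANSITED_TODAY):,.0f}/month")
print(f"Transit cost, hub proposal: ${monthly_transit_cost(PCT_TRANSITED_HUB):,.0f}/month")
```

At whatever the real rates are, moving from transit fees on some calls to transit fees on every call multiplies the fees collected by the hub owner several times over.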

AT&T’s proposal is ridiculous for several reasons. First, transit fees are not cheap, and they cost more today than the traditional access charges the FCC has been trying to eliminate, so the proposal would increase the cost of making voice calls. The proposal is also a blatant attempt to shove all of the cost of carrying voice traffic onto somebody other than AT&T. And finally, it forces companies to carry calls a much greater distance than they travel today, which will likely lower call quality and increase the danger of the voice network going down due to a fiber cut or other network problem.

There is a much simpler alternative to what AT&T is suggesting, which is to let carriers negotiate how and where they hand off traffic. There are already huge numbers of local and regional interconnection points in existence, most established to hand-off local traffic in the local market. Carriers should be free to continue to use arrangements that work and with which they are happy. Think of these local arrangements as voice peering arrangements and they quickly make technical sense. Nobody is going to be unhappy if local connections transition to IP instead of TDM. But making the IP transition doesn’t mean that the whole nationwide network has to be massively reorganized. That is far more inefficient and costly than letting carriers find their own solutions.