Is the Internet a Necessity?

In a recent speech, FCC Commissioner Michael O’Rielly said that the Internet was “not a necessity in the day-to-day lives of Americans.” That’s a rather startling statement from somebody whose job, seemingly, is to make sure that the country has adequate broadband to meet its needs. But if we look at his statement in context, it raises some important policy issues that are worth public discussion.

O’Rielly made the comment as a counterargument to the spreading concept that access to the Internet has become a necessity, perhaps even a right. It’s also widely expressed today that broadband is now a utility, much like electricity and water.

It’s an interesting discussion. Several surveys in the last few years show that a significant majority of households rank Internet access as the most important service they buy for their homes. I certainly know that my daughter and all of her 16-year-old friends would ‘die’ without the Internet, and it seems like the younger you are, the more important the Internet is to daily life.

But O’Rielly’s comment was really a political and policy statement. There are certainly a lot of implications for governments if the country adopts the idea that having Internet access is a right. For instance, that would put a lot more pressure to bring Internet access to the places that don’t yet have it and to work hard to close the gap in the quality of the Internet between urban and rural places.

But it seems to me that the FCC has largely already bought into the argument that the Internet is a necessity. They are pouring billions of dollars into improving rural broadband. They are going to subsidize broadband access to low income households. They have adopted net neutrality as a policy which, to some degree, protects the ability of consumers to get what they pay for from an ISP. These all seem like the actions of an agency who thinks that everybody ought to have access to broadband.

FCC Chairman Tom Wheeler responded to O’Rielly’s statement by saying that “broadband is the defining infrastructure of the 21st century. We should not and will not let up on our policies that make broadband more available.”

It’s obvious that Internet access is now a fundamental part of daily life for many people. I work from my home and I can’t imagine how I would function without it. Actually, I can imagine it, because after a hurricane and tornado hit me a few years ago I was without power and Internet access for 6 weeks. I basically regressed to what felt like the Stone Age. I essentially threw my hands up and gave up on work (and spent the time instead cleaning up the huge mess the storm left behind). I use the Internet almost continuously in making my living and as a society we have grown to a place where there is no realistic substitute for email and the ability to quickly exchange files and work products with others.

This is an issue that hundreds of municipalities are wrestling with. Communities look in envy at urban places that have great Internet bandwidth, and they understand that if they don’t have adequate Internet in their community they are liable to decline economically and fade from relevance. Internet access is to cities today what the railroads were in the nineteenth century, and what electricity and Interstate highways were in the last century. Put into that context it starts feeling a lot like a necessity, at least at the community level.

I work with dozens of rural communities that have limited or no Internet access today. It’s heart-wrenching to hear people talk about trying to maintain a household of teenagers with only a cellular wireless plan or to hear parents lament that their kids can’t keep up in school without access to the Internet. For the vast majority of us who have Internet access it’s really hard to imagine going without.

I understand where Commissioner O’Rielly is coming from. He was formerly a Republican congressional aide and the Republicans feel generally that there are few ‘rights’ that the Federal government is obligated to recognize. But on this specific topic he might be on the wrong side of history, because my guess is that the vast majority of people in this country have grown to believe that having Internet access is a right and is something they cannot live without.

Can Cable Networks Deliver a Gigabit?

Time Warner Cable recently promised the Los Angeles City Council that they could bring gigabit service to the city by 2016. This raises the question – can today’s cable networks deliver a gigabit?

The short answer is yes, they are soon going to be able to do that, but with a whole list of caveats. So let me look at the various issues involved:

  • DOCSIS 3.1: First, a cable company has to upgrade to DOCSIS 3.1. This is the latest technology from CableLabs that lets cable companies bond multiple channels together in a cable system to be able to deliver faster data speeds. This technology is just now hitting the market and so by next year cable companies are going to be able to have this implemented and tested.
  • Spare Channels: To get gigabit speeds, a cable system is going to need at least 20 empty channels on their network. Cable companies for years have been making digital upgrades in order to cram more channels into the existing channel slots. But they also have continued demands to carry more channels which then eats up channel slots. Further, they are looking at possibly having to carry some channels of 4K programming, which is a huge bandwidth eater. For networks without many spare channels it can be quite costly to free up this much empty space on the network. But many networks will have this many channels available now or in the near future.
  • New Cable Modems: DOCSIS 3.1 requires a new, and relatively expensive cable modem. Because of this a cable company is going to want to keep existing data customers where they are on the system and use the new swath of bandwidth selectively for the new gigabit customers.
  • Guaranteed versus Best Effort: If a cable company wants to guarantee gigabit speeds then they are not going to be able to have too many gigabit customers at a given node. This means that as the number of gigabit customers grows they will have to ‘split’ nodes, which often means building more fiber to feed the nodes plus an electronics upgrade. In systems with large nodes this might be the most expensive part of the upgrade to gigabit. The alternative to this is to have a best-effort product that only is capable of a gigabit at 3:00 in the morning when the network has no other traffic.
  • Bandwidth to the Nodes: Not all cable companies are going to have enough existing bandwidth between the headend and the nodes to incorporate an additional gigabit of data. That will mean an upgrade of the node transport electronics.
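To put rough numbers on these caveats, here is a back-of-the-envelope sketch of the capacity math. The per-channel throughput, channel count, and sharing ratio below are illustrative assumptions, not any cable company’s actual engineering figures:

```python
# Rough DOCSIS capacity math for a gigabit tier (all numbers are assumptions).

CHANNEL_MBPS = 50        # assumed usable throughput of one 6 MHz DOCSIS 3.1 channel
SPARE_CHANNELS = 20      # empty channels freed up for the gigabit tier
OVERSUBSCRIPTION = 4     # assumed peak-hour sharing ratio the operator tolerates

# Raw data capacity the spare channels add to one node.
node_capacity_mbps = SPARE_CHANNELS * CHANNEL_MBPS
print(f"Raw capacity per node: {node_capacity_mbps} Mbps")

# With best-effort sharing, how many gigabit subscribers fit on one node
# before the operator has to split it?
max_gig_subs = node_capacity_mbps * OVERSUBSCRIPTION // 1000
print(f"Gigabit subscribers per node at {OVERSUBSCRIPTION}:1 sharing: {max_gig_subs}")
```

With these assumptions a node supports only a handful of gigabit customers, which is exactly why selling too many of them forces the node splits described above.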

So the answer is that Time Warner will be capable of delivering a gigabit next year as long as they upgrade to DOCSIS 3.1, have enough spare channels, and as long as they don’t sell too many gigabit customers and end up needing massive node upgrades.

And that is probably the key point about cable networks and gigabit. Cable networks were designed to provide shared data among many homes at the same time. This is why cable networks have been infamous for slowing down at peak demand times when the number of homes using data is high. And that’s why they have always sold their speeds as ‘up to’ a listed number. It’s incredibly hard for them to guarantee a speed.

When you contrast this to fiber, it’s relatively easy for somebody like Google to guarantee a gigabit (or any other speed). Their fiber networks share data among a relatively small number of households and they are able to engineer to be able to meet the peak speeds.

Cable companies will certainly be able to deliver gigabit speeds. But I find it unlikely that, for a while at least, they will price it at $70 like Google or try to push it to very many homes. There are very few, if any, cable networks that are ready to upgrade all or even most of their customers to gigabit speeds. There are too many chokepoints in their networks that cannot handle that much bandwidth.

But as long as a cable network meets the base criteria I discussed they can sell some gigabit without too much strain. Expect them to price gigabit bandwidth high enough that they don’t get more than 5%, or some similar penetration of customers on the high bandwidth product. There are other network changes coming that will make this easier. I just talked last week about a new technology that will move the CMTS to the nodes, something that will make it easier to offer large bandwidth. This also gets easier as cable systems move closer to offering IPTV, or at least to finding ways to be more efficient with television bandwidth.

Finally, there is always the Comcast solution. Comcast today is selling a 2 gigabit connection that is delivered over fiber. It’s priced at $300 per month and is only available to customers who live very close to an existing Comcast fiber. Having this product allows Comcast to advertise as a gigabit company, even though this falls into the category of ‘press release’ product rather than something that very many homes will ever decide to buy. We’ll have to wait and see if Time Warner is going to make gigabit affordable and widely available. I’m sure that is what the Los Angeles City Council thinks they heard, but I seriously doubt that is what Time Warner meant.

Unbundling the Broadband Networks

The Canadian Radio-television and Telecommunications Commission (CRTC) has ordered that large telecom companies, both telcos and cable companies, must unbundle the last mile of their network and make the facilities available to competitors.

With this ruling the CRTC has said that competition and choice is important. This was a surprising ruling because all telecom companies had filed comments stating that forced unbundling would be a disincentive for them to build expensive fiber facilities to homes and businesses.

This ruling was only the first step; the processes and procedures needed to accomplish unbundling still need to be worked out. It’s estimated that perhaps the first unbundled connections will be available to competitors by the end of 2016.

This ruling applies to both fiber and coaxial networks and will apply to the larger providers like BCE (Bell Canada Enterprises) as well as to the two biggest cable companies, Rogers Communications and Shaw Communications. But the biggest impact is expected to be on BCE which has invested heavily in fiber to both businesses and residences.

The CRTC said that this was the only path they saw towards competition since the cost of building duplicate fiber networks was expensive and not likely to happen.

We know something about unbundling in this country. The Telecommunications Act of 1996 ordered large US telcos to unbundle their copper networks and make them available to competition. This promoted the explosion of CLECs in the late 90s, but the use of unbundled copper largely died when many of the CLECs formed during that period imploded during the telecom crash in the early 00s.

But the FCC in this country has never required unbundling of fiber. In fact, the 1996 Act removed the unbundling requirement as soon as a telco replaced copper with fiber. The Act did require the unbundling of dark fiber (fiber sold without electronics), but as is typical in this country, the telcos chipped away at that requirement to the point where it became incredibly difficult for a competitor to get access to telco dark fiber.

Our experience in this country is that the large companies complied with this requirement only reluctantly and put as many roadblocks as they could in the way of competitors. The telcos here required difficult paperwork for every step of the process and dragged their feet as much as possible any time they worked with a competitor. There is a famous rumor in the industry that in the work space at one of the large US telcos that dealt with unbundling there was a large sign reading “Delay, Delay, Delay.” Too bad this was before cellphone cameras, because several reputable industry people swear this is true.

The idea of unbundling active fiber is an interesting one. Certainly if a competitor could get access to fiber affordably they could offer an alternate suite of products and bring both product and price competition into the network.

The idea of unbundling a cable company’s coaxial network is not as easy to contemplate. Coaxial cables are arranged so that there is not a unique cable for each customer. At the pole each customer is added into the same data and cable TV transmission path as everybody else in their neighborhood. It’s hard to think of a neat technical way to unbundle anything in an HFC network. It might be possible to unbundle the data path, but this is also shared through most of the network. It will be interesting to see how the CRTC deals with the technical issues.

Obviously competitors here will keep an eye on the Canadian experiment to see how it progresses. There has been no cry here for unbundling of fiber networks, but if there was such a ruling I think it would enable a raft of new competitive providers and would bring real competition into the duopoly networks we have in most US markets. Certainly the US suffers from the same duopoly competition that drove Canada to make this ruling.

Remember the Human Equation

I took a lot of economics courses in college – not quite enough to get a degree, but enough to keep me interested today in keeping track of how economists view the world. One thing economists have always tried to do is build models that predict how people act in the real world.

Recently the World Bank issued a new report in a series of what they call World Development Reports, and it suggests that economists are still not accurately predicting some key human behavior in their modeling. It mentions three areas where economists need to improve their models. I found these three areas interesting, because they are also types of behavior that any good salesperson knows very well. The report reminded me that it’s as important for salespeople as it is for economists to keep the human equation in mind. The report said that economists need to do better in reflecting the following three things:

The first principle is that all people think automatically. Automatic thinking means that people are often intuitive and impulsive. This differs from a lot of economic models that assume people are logical and deliberative when making buying decisions. Certainly some people and businesses make deliberate buying decisions. But the real world is full of examples of things we all do that are not logical. Perhaps one of the most common examples is how we save for retirement. I don’t think you can find anybody who doesn’t understand that saving for retirement is really important, yet a majority of people still don’t take the steps needed to be ready for it.

And every good telecom salesperson knows that buying decisions are often made on impulse or based upon emotional factors and are not always fully logical. When somebody changes telecom providers they generally do so somewhat blindly and based upon trust. They really hope that the quality of the service or the level of customer service will be better with the new provider than it was with the old provider. And so they often make an emotional decision to change based upon something they don’t like about their old provider—perhaps a negative billing issue or customer service experience.

The second finding in the Development Report is that humans think socially. This means they often make decisions based upon pleasing others or in accordance with what other people think. By contrast, economic models generally assume that people make decisions based upon their own selfish best interest. This finding isn’t as relevant to telecom buying as the other two items, but salespeople still see it in the market. For example, it’s a lot easier when selling to people with kids to base the sales pitch on what’s good for the kids rather than what’s good for the parents.

The third principle is that people often think using mental models. People might identify themselves as part of a larger group and make decisions based upon that identity. For example, young urban millennials are now a very hard sell for traditional cable TV. Once somebody is part of that particular culture they often make many buying decisions based upon the peer pressure of their friends. They might not buy a car and instead use Uber, and they might not buy traditional telecom services, relying completely on their cellphone and other people’s WiFi.

It is possible to break a group identity mindset, but it must be done deliberately. For example, many elderly people are of the mindset that technology is beyond them, and so they are immune to any normal sales pitch you might make to them. But if you take the time to show them what technology might do for them, and let them know that there is training and help for them to learn to use the Internet, then they can become good customers.

I build a lot of business plans and every client who is thinking about building a new network always wants to know what their market penetration rate is going to be. That’s an easy thing to predict if you build in an area that doesn’t have broadband, because most people in that situation will buy what you have as long as it’s affordable.

But it’s a lot harder to predict market penetration when building to a market that already has broadband. Predicting the take rates in existing markets requires understanding the human equation. Here are a few of the things that I tell people, based upon the experience of having seen hundreds of market launches:

  • If you sell residential broadband you are almost always going to get at least 20% and maybe as much as 30% of the market rather easily as long as you have a decent price and as long as your product works well. It seems that in every market there are at least that many people who just can’t stand the incumbents and who will leap to a new competitor. And if you do a good job you will generally keep these customers.
  • But after this first easy pile of customers, how many customers you get is going to depend upon how good you are at selling. And selling means understanding the market and understanding the human equation. I generally see that companies that sell based upon having a good story to tell will do better than companies that try to sell on price alone. A customer that buys from you due to a low price will also drop you when they find a better price elsewhere. But if you can instead show them that there are reasons other than price to use you, then you have a chance of building a loyal customer base.
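As a toy illustration of why the selling approach matters, here is a simple projection under made-up churn and sales assumptions. None of these rates come from real market data; the point is only that customer retention compounds over time:

```python
# Toy market-penetration projection for a new entrant (all rates are assumptions).

homes = 10000
easy_take = 0.25           # the 20-30% who readily leave the incumbents
annual_sales_gain = 0.03   # assumed extra share won each year through good selling
churn_price_buyers = 0.20  # assumed annual churn if customers bought on price alone
churn_value_buyers = 0.05  # assumed annual churn if customers bought on the story

def project(years, churn):
    """Project subscriber count after a number of years of churn and new sales."""
    subs = homes * easy_take
    for _ in range(years):
        subs = subs * (1 - churn) + homes * annual_sales_gain
    return round(subs)

print("price-based selling:", project(5, churn_price_buyers))
print("value-based selling:", project(5, churn_value_buyers))
```

Under these assumptions the price-seller ends up with fewer customers after five years than they started with, while the value-seller keeps growing – the compounding effect of loyalty described above.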

Interestingly, almost all businesses buy based upon reliability, not price. Business customers know how badly they suffer when their voice or Internet service is down, and so they care about the reliability of your network first and foremost. Selling telecom to businesses is something that fits existing economic models well, because most businesses choose a telecom provider deliberately and logically. It’s easy to build models to predict business penetration, because if you do a good job and are willing to put knowledgeable salespeople on the street, they will be successful over time.

New CableLabs Standard will Improve Cable Networks

CableLabs just announced a new set of specifications that is going to improve cable HFC networks and their ability to deliver data services. They announced a new distributed architecture that they are calling the Converged Cable Access Platform (CCAP).

This new platform separates functions that have always been performed at the headend, which is going to allow for a more robust data network. Today, the cable headend is the place where all video is inserted, where all cable management is done, where the QAM and RF modulation are performed, and most importantly where the CMTS (cable modem termination system) function is done.

The distributed CCAP allows these functions to be separated and geographically distributed as needed throughout the cable network. The main benefit of this is that a cable operator will be able to push pure IP to the fiber nodes. Today, the data path between the headend and the neighborhood nodes needs to carry two separate paths – both a video feed and a DOCSIS data feed. By moving the CMTS and the QAM modulators to the fiber node the data path to the node becomes a single all-IP path that contains both IPTV and IP data. The new CCAP node can then convert everything to RF frequencies as needed at the node.

We’ve been expecting this change, since Chinese cable networks have implemented distributed network functions over the last few years. Probably one of the biggest long-term potentials of this change is that it sets the stage for a cable company to offer IPTV over DOCSIS frequencies, although there is more development work to be done in this area.

There are several immediate benefits to a cable system. First, this improves video signal strength, since the TV signals now originate at the neighborhood nodes rather than back at the headend. This will be most noticed by customers who are currently at the outer fringes of a cable node. The change also will boost the overall amount of data delivered to a neighborhood node by 20% to 40%. It’s not likely this means faster speeds; instead it will provide more bandwidth for busy times and make it less likely that customers lose speed during peak hours. Finally, it means that a cable company can get more life out of existing cable nodes and will be able to wait longer before having to ‘split’ nodes to provide faster data to customers.

Cable companies are not likely to rush to implement this everywhere. It would mean an upgrade at each node and most cable companies have a node for every 200–400 customers—that’s a lot of nodes. But one would think this will quickly become the standard for new nodes and that cable companies will implement it over time into the existing network.

This is the first step of what is being called the IP transition for cable companies. Most of my readers are probably aware that the telcos are working feverishly towards making a transition to all-IP. But cable companies are going to want to do that for a different reason. There is a huge amount of bandwidth capability on coaxial cable and if the entire cable network becomes IP from end-to-end then the huge data capacity in the cable network would be realized. Today cable companies use a broadcast system where they send all cable channels to every home and they then provide data services on whatever bandwidth is left. But in an all-IP system they would only send a customer the channels they are watching, meaning that most of the bandwidth on the system would be available for high-speed Internet services.
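The bandwidth argument can be made concrete with some back-of-the-envelope numbers. The capacity, channel counts, and bitrates below are assumptions chosen only to show the shape of the trade-off:

```python
# Back-of-envelope: broadcast vs. all-IP (switched) video on one cable node.
# All capacity figures, channel counts, and bitrates are illustrative assumptions.

TOTAL_CAPACITY_MBPS = 5000   # assumed usable capacity of the coax plant
CHANNELS_CARRIED = 300       # full lineup broadcast to every home today
MBPS_PER_CHANNEL = 8         # assumed bitrate of one HD video stream

# Broadcast model: every channel is sent whether anyone is watching or not.
broadcast_video = CHANNELS_CARRIED * MBPS_PER_CHANNEL
left_for_data_broadcast = TOTAL_CAPACITY_MBPS - broadcast_video

# All-IP model: only the streams actually being watched are sent.
homes_on_node = 300
avg_streams_per_home = 0.6   # assumed simultaneous viewing at the peak hour
ip_video = int(homes_on_node * avg_streams_per_home * MBPS_PER_CHANNEL)
left_for_data_ip = TOTAL_CAPACITY_MBPS - ip_video

print(f"Data capacity left, broadcast model: {left_for_data_broadcast} Mbps")
print(f"Data capacity left, all-IP model:    {left_for_data_ip} Mbps")
```

Even with these conservative assumptions the switched model frees up a large slice of the plant for high-speed Internet, and the gap grows as channel lineups grow.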

So think of this as the first step in a transition to an all-IP cable network. There are a number of additional steps needed to get there, but this pushes IP out to the neighborhood nodes and starts the transition.

Taxing the Internet

Starting on September 1, Chicago is trying something new and will be adding a 9% tax onto almost every service provided on-line. The city, along with the state of Illinois, is having huge budget problems, and they are obviously leaving no stone unturned in looking to fill the tax coffers. But Chicago is the first local jurisdiction in many years to try to tax Internet-based services, something that will have wide repercussions.

So what are they trying to tax? The short answer is every information service that uses the Internet. For instance, the tax would apply to services that provide searchable databases—things like the LexisNexis system used by lawyers to find legal cases and precedents. As the data we use moves to the web this is a huge source of potential revenue for the city. Consider all of the services around today that charge people to access data. The ordinance lists services like access to consumer credit reports, real-estate listings, car prices, stock prices, economic statistics, weather statistics, job listings, resumes, company profiles, consumer profiles, marketing data—any information or data that has been compiled, entered, and stored in a provider’s computer and then sold to others. The tax is also going to apply to taxable leases of personal property that include “cloud computing, cloud services, hosted environment, software as a service, platform as a service, or infrastructure as a service.”

This tax does not apply to buying things over the Internet; it is not a sales tax on tangible assets. For instance, it would not apply to the physical products bought from Amazon. It would instead apply to companies like Netflix and Spotify and any other web service that sells electronic products. It would be up to the companies selling the online services to collect the tax and to remit the revenues to Chicago.

Obviously this new law will be challenged because it taxes a whole lot of things for the first time. It will also be interesting to see if the law infringes on the protections provided several times by Congress in the Internet Tax Freedom Act, as well as multiple times by the FCC, most recently as part of the net neutrality ruling.

But the city might have found a clever loophole. They are not taxing Internet access, but rather access to information and information services that happen to be stored somewhere else and then delivered over the Internet. It will be up to the courts to sort out that nuance (or for Congress to pass a new law that is more specific).

One has to think that this law is very bad for businesses in Chicago. A 9% tax on anything is significant. Businesses spend huge amounts of money today on access to online databases and on cloud-based services as they move their own information to the cloud. In effect, this law would tax companies for accessing their own data that they have chosen to store somewhere other than at their own business. I would not be surprised if this law drives businesses that spend heavily on such IT functions out of the city.

This also directly affects most people who live in the city. Almost everybody today who has an Internet connection buys some service over the web, be that a movie service like Netflix or Amazon Prime or a music service like Spotify or Apple.

This kind of tax potentially adds a lot of cost for on-line service providers. Every town, county and state in the country has a different basis for assessing sales and service taxes like this one, and so this is going to require companies like Spotify to incorporate the tax assessment and collection process when they sell a subscription – something they don’t do today.
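A minimal sketch of what a provider would have to bolt onto its billing system might look like the following. Only Chicago’s 9% rate comes from the ordinance; the lookup structure, function names, and the idea that a city-level rate table is sufficient are all simplifying assumptions (real jurisdictions layer state, county, and city rules):

```python
# Hypothetical per-jurisdiction tax lookup for an online service provider.
# Only Chicago's 9% is from the ordinance; everything else here is a sketch.

TAX_RATES = {
    ("IL", "Chicago"): 0.09,
    # ... in reality, one entry per taxing jurisdiction, each with its own rules
}

def bill_with_tax(subtotal_cents, state, city):
    """Return the total in cents after applying any local service tax on file."""
    rate = TAX_RATES.get((state, city), 0.0)
    tax = round(subtotal_cents * rate)
    return subtotal_cents + tax

# A $9.99/month subscription billed to a Chicago address vs. elsewhere:
print(bill_with_tax(999, "IL", "Chicago"))      # Chicago address pays the 9%
print(bill_with_tax(999, "IL", "Springfield"))  # no local tax on file
```

The hard part is not the arithmetic but maintaining that rate table for every town, county, and state, which is exactly the compliance burden described above.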

One would think that there will be a lot of avoidance of such a tax. It’s not hard for a business with multiple locations to be billed from a location that doesn’t get the tax. And since most on-line services don’t verify people’s addresses, somebody living in Chicago could most likely avoid these fees just by telling Spotify that they live somewhere else. It’s hard to think that the city is ever going to be able to dig deep enough into online transactions to audit this.

But the real issue is not how the people in Chicago will deal with this. I am sure people and businesses there will take steps to avoid the new taxes if possible. The bigger issue is that other localities will copy the Chicago example if this holds up in court. There is an old maxim that politicians have never seen a tax they don’t like, and so it’s not hard to foresee this tax spreading all over the country. That is going to add costs to the online services we buy today, and since more and more things are migrating to the cloud this will become even more significant over time.

Court Jumps into OTT Fray

In a really surprising ruling, a federal judge has ruled that FilmOn X should be able to get access to local network programming like a cable TV company. US District Court Judge George Wu ordered that FilmOn X be treated like a cable company and is entitled to retransmit broadcasters’ content.

For those not familiar with FilmOn X, check them out on the web. They have a huge amount of on-line content that includes local TV from around the world as well as 600 other channels. There is a little bit of everything, from non-traditional sports to music from around the world to channels on almost any topic you can imagine. They also carry a mountain of for-pay video-on-demand content that ranges from music to Major League Baseball. All of the free content is ad-supported. Viewers can also create their own channels.

FilmOn X also had their own version of the Aereo model and they offered a premium subscription model in a few markets, which gave customers access to 120 HD channels on any computer or smartphone through the use of a dongle. Just like Aereo this was done from antenna farms.

The company has been in a battle with the major networks in the US since its inception. The company began carrying the local networks on the Internet in 2010. In 2011 they were ordered by a court to stop the practice. But in 2012, the local channels were all allowed back onto the system through a federal appeal and FilmOn X carried local content on its broadcast dongle product. But in 2013 the US District Court of the District of Columbia issued a nationwide injunction against the antenna service.

This latest ruling overturns that injunction and seemingly gives FilmOn X the same right to content as a cable company. Obviously this is going to be appealed further and one has to doubt that the networks are going to negotiate retransmission agreements with the company while the appeals are still being fought in court.

But the case raises serious questions. Although addressing a different set of issues than the Aereo case, it still sets up conflicting district court decisions. Aereo had taken the legal tactic of dancing around the issue of whether they were a cable company by concentrating on the issue of copyright infringement. FilmOn X took a more direct legal approach and argued that they had the rights to rebroadcast the content as a cable company. And apparently the court bought it.

Realistically nothing is going to happen in the area of on-line content until the FCC decides where it wants to go with this. Recall that in January of this year the FCC opened up a Notice for Proposed Rulemaking to look at the issue of on-line content. FilmOn X was mentioned several times in that document and the FCC is asking if on-line companies can have the same rights as cable companies to get content.

The FCC can put all of these lawsuits to rest by defining the rights, or lack of rights, of on-line providers. It’s fairly clear in reading the NPRM that the FCC has a bias towards allowing content on-line and is probably seeking a legal way to do that since they are required to follow the various cable laws that have been passed by Congress.

It’s hard to think that on-line content providers are ever going to be able to comply with all of the rules included in the current cable regulations. Those rules very rigidly define tiers of programming. They also define the retransmission process whereby cable companies can rebroadcast local content. But there are a ton of other requirements that range from closed captioning to emergency alert systems that also apply to cable companies. It’s going to be a challenge to give just a few of these rights to on-line providers while making cable providers continue to comply with all of the rules.

For now this ruling is just one more in the string of confusing court decisions that have shaped the on-line broadcast industry so far. There have been several conflicting rulings in earlier cases involving Aereo and FilmOn X that muddy the legal waters for the business model. But this is something that the general public very much wants, and traditional cable will be in a lot of trouble if local content ends up on the Internet. It is that content, along with sports, that is the primary driver behind maintaining the cable companies' grip on customers.

Control of the Future Voice Network

The FCC is looking at how to transition from the traditional TDM-based PSTN to an all-IP telephone network. A number of carriers have submitted proposals that lay out their vision of an all-IP network. Today's blog looks at AT&T's vision of the future IP network.

AT&T has proposed to the FCC that there be a handful of major switching hubs created in the country. Every carrier would then send their voice traffic to these hubs to be sorted and handed back to the various terminating carriers. Their argument is that the whole idea behind the IP transition is that the network be made as efficient as possible; they are promoting this idea as the simplest network to implement.

But is it the most efficient? Over the years I've done a lot of traffic engineering, meaning that I've helped companies analyze where their voice traffic comes from and goes to. What I've seen is that approximately 80% of voice traffic for most companies stays within a relatively small radius of perhaps 60 miles. This distance can vary a bit by company depending on how far they are from a major metropolitan area, but this basic traffic rule seems to apply pretty much everywhere I've looked.
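As a minimal sketch of what this kind of traffic study looks like, the snippet below classifies call minutes as local or long-haul by great-circle distance. The call records, coordinates, and 60-mile threshold are all hypothetical placeholders, not data from any actual client:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3959.0  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical call records: (origin lat, origin lon, dest lat, dest lon, minutes)
calls = [
    (40.71, -74.00, 40.73, -73.99, 120),  # across town, a couple of miles
    (40.71, -74.00, 40.92, -74.17, 300),  # nearby suburb, ~17 miles
    (40.71, -74.00, 42.36, -71.06, 45),   # NYC to Boston, ~190 miles
    (40.71, -74.00, 41.31, -72.92, 60),   # NYC to New Haven, ~70 miles
]

RADIUS = 60  # miles; the "local circle" from the observation above
local = sum(m for a1, o1, a2, o2, m in calls if haversine_miles(a1, o1, a2, o2) <= RADIUS)
total = sum(m for *_, m in calls)
print(f"{100 * local / total:.0f}% of minutes stay within {RADIUS} miles")
```

With real call detail records in place of the toy list, the same tally shows how much traffic would be hauled cross-country unnecessarily under a few-hubs design.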

So let me first look at the practical application of what AT&T is proposing. One would have to assume that if there were only a handful of nationwide interconnection points, they would be put in the same places as the major internet hubs: Atlanta, Chicago, Washington DC, New York City, Dallas, etc. This means that states not near those hubs (say Montana, Nebraska, or Maine) would have to ship all the voice traffic from their state to the nearest big hub and back again.

While it might be more efficient to have only a few hubs, it certainly would not be efficient from a transport perspective. Somebody has to pay for all of that transport to and from the main hubs, and that is the real purpose behind the AT&T proposal. Under their proposal, carriers other than AT&T would pay to have all traffic brought to these main hubs. That is a major change from the way the industry works today.

Today there are two different sets of transport arrangements—one for regulated telcos and one for competitive CLECs and other kinds of carriers. Regulated companies today provide joint trunking between each other. For instance, if there is a fiber route between AT&T and another telco, AT&T generally owns the part that is within their territory and the other telco owns the part in their own territory. Sometimes this is negotiated differently, but under this arrangement both sides bear some of the cost of carrying voice traffic.

CLECs and other competitive carriers have a different situation. CLECs are allowed by the Telecommunications Act of 1996 to establish a point of interface (POI) at any technically feasible spot within a LATA (or region). Once that point is established, the CLEC is responsible for all costs on their side of the POI and AT&T (or Verizon or CenturyLink) is responsible for the costs on the other side of the POI.

AT&T’s suggested new network gets rid of both of these arrangements and instead moves all of the points of interconnection to the small handful of locations, and in doing so shifts 100% of the cost of the transport onto other carriers.

In a further money grab, AT&T would (as the assumed owner of these large hubs) charge a fee to other carriers for handing traffic from one carrier to another. These fees exist today and are called transit fees. But today transit fees are charged on a relatively small percentage of voice calls, since no fees are charged under the jointly-owned arrangements I described above. Under this new arrangement, by contrast, a transit fee would be charged for every call that is handed from one carrier to another.

AT&T's proposal is ridiculous for several reasons. First, the transit fees are not cheap, and they cost more today than the traditional access charges that the FCC has been trying to eliminate. So AT&T's proposal would increase the cost of making voice calls. The proposal is also a blatant attempt to shove all of the cost of carrying voice traffic onto somebody other than AT&T. And finally, it forces companies to carry calls a much greater distance than they travel today. This will likely lower call quality and increase the danger of the voice network going down due to a fiber cut or other network problem.
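A back-of-the-envelope comparison shows why the economics matter. Every number below is a hypothetical placeholder (actual transit fees and transport rates are negotiated and vary widely), but the structure of the calculation follows the argument above: today transit fees hit only a minority of minutes, while under the hub model every minute pays both a transit fee and long-haul transport:

```python
# Hypothetical monthly interconnection traffic for a small carrier
minutes = 10_000_000

# Today: transit fees apply only to the minority of minutes not covered
# by jointly-owned trunking arrangements (assume 20% of minutes).
transit_rate = 0.002          # $/minute transit fee, hypothetical
today_cost = minutes * 0.20 * transit_rate

# Proposed hub model: a transit fee on every minute handed off, plus
# long-haul transport to a distant hub that the other carrier now pays entirely.
transport_rate = 0.001        # $/minute of long-haul transport, hypothetical
hub_cost = minutes * transit_rate + minutes * transport_rate

print(f"Today: ${today_cost:,.0f}/month   Hub model: ${hub_cost:,.0f}/month")
```

Even with these made-up rates, the carrier's interconnection bill rises several-fold, which is the cost-shifting the proposal is really about.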

There is a much simpler alternative to what AT&T is suggesting, which is to let carriers negotiate how and where they hand off traffic. There are already huge numbers of local and regional interconnection points in existence, most established to hand-off local traffic in the local market. Carriers should be free to continue to use arrangements that work and with which they are happy. Think of these local arrangements as voice peering arrangements and they quickly make technical sense. Nobody is going to be unhappy if local connections transition to IP instead of TDM. But making the IP transition doesn’t mean that the whole nationwide network has to be massively reorganized. That is far more inefficient and costly than letting carriers find their own solutions.

A Few Lessons from Big Companies

I spend a lot of time reading about large corporations, and I think they offer some lessons that are relevant to small companies.

Selling Product versus Building Relationships. There are many large companies that sell products without developing relationships with their customers. In our industry the large cable companies and telcos come to mind. They are consistently rated among the worst of all corporations in delivering customer service, and they even antagonize many of their customers. This works fine for them until they get competition, and then the customers who don't like them quickly jump ship to the new competitor.

But there are large businesses that go out of their way to build customer relationships because they believe that loyal customers are their most important asset. Consider car manufacturers. They realized a long time ago that they were not going to be good at customer service, so they created a network of dealers who are local businesses with ties in each community and these dealers have built trust over generations. And there are many other companies that deliver great customer service. Tech firms like Amazon, Apple, and Google have been consistently rated among the top ten in customer satisfaction for the last few years – showing that tech firms can put an emphasis on customers and still thrive.

My most successful clients build relationships with their customers and as a result have built a loyal customer base. Many of them are or were monopolies, and there was a time when most of my clients could not tell me who their ten largest customers were. But I rarely see that today and small telcos and cable companies have learned to build loyalty through building relationships.

Growing Fast versus Growing Deliberately. Many large companies need to grow fast to be successful. Once you have taken venture capital money or gone public, the pressure is on to grow profits quickly. But growing too fast almost always changes a company in negative ways. It's really common to see companies go into growth mode and then forget who they are. Most tech companies, for example, started with a small core of people who worked hard as a team to build the company. But when it's time to grow and companies hire mountains of new people, it's nearly impossible to maintain the original culture that made the company a great place to work.

Growth can be just as hard for small companies. It can be as hard economically and culturally for a small company to grow from 5,000 to 10,000 customers as it is for a large company to add millions. Small companies are often unprepared for the extra work involved with growth and find that they overwork and overstress their staff during a growth cycle. Growth creates a dilemma for small companies: if you hire the people needed to staff the growth period, your company will be overstaffed when growth stops.

And so a lesson about growth can be learned from large companies. They will often staff growth through temporary employees, contractors, and consultants rather than take on people that they may not need later. Companies of any size are hesitant about hiring employees that they might not need a year from now.

High-Tech versus High-Touch. A lot of large businesses try to feign a good customer service experience by electronically 'touching' their customers often. I recall last year when Comcast introduced a texting system to communicate with customers. After they sent me half a dozen text messages in the same week, I disconnected the texting function because I really didn't want to hear from them that often. But there are large companies who are convinced that if they reach out to customers electronically often enough, they are engaging in relationship building and proactive customer service.

And perhaps they are with some customers. But I am more appreciative of a business where I can talk to a person when it’s needed. Not that I mind electronic communications. I like to know that AT&T has auto-billed me and I like knowing when charges hit my credit cards. But I don’t want to be bothered by a business when they aren’t passing on information I want or need.

The important point here is that you have to touch your customers sometime, and whether you reach out electronically or in person, it's better than not touching and not talking to your customers at all. I know telecom companies that call every customer at least once a year to ask if they like the service and if everything is okay. Such calls are welcomed by most customers, and this is a great tool for businesses to build relationships. But be prepared: if you ask your customers how you are doing, you need to be ready to deal with negative feedback. That is how you build happy customers.

Have We Entered the Age of Robots?

I read a lot of tech news, journals, and blogs, and it recently dawned on me that we have already quietly entered the age of robots. Certainly we are not yet close to having C-3PO from Star Wars, or even Robby the Robot from Forbidden Planet. But I think that we have crossed the threshold that future historians will point to as the start of the age of robots.

There are research teams all over the world working to get robots to do the kinds of tasks that we want from a C-3PO. As the recent DARPA challenge showed, robots are still very awkward at doing simple physical tasks—but they are now able to get them done. There are research teams that are figuring out how to make robots move in the many subtle ways that humans move and they will figure it out.

The voice recognition used by robots still has a long way to go to be seamless and accurate. As you see when you use Apple's Siri, there are still times when voice recognition just doesn't understand us. But voice recognition is getting better all the time.

And robots still are not fabulous at sensing their surroundings, but this, too, is improving. Who would ever have thought that in 2015 we would have driverless cars? Yet they are seemingly now everywhere and a number of states have already made it legal for them to share the road with the rest of us.

The reason I think we might have already entered the Robot Age is that we can now make robots that are capable of doing each of the many tasks we want out of a fully functional robot. Much of what robots can do now is rudimentary but all that is needed to get the robots from science fiction to real life is more research and development and further improvements in computing power. And both are happening. There is a massive amount of robot research underway and computer power continues to grow exponentially. I would think that within a decade computing power will have improved enough to overcome the current limitations.

All of the components needed to create robots have already gotten very cheap. Sensors that once cost $1,000 can now be bought for $10. The various motors used for robot motion have moved from expensive to affordable. And as real mass production comes into play, the cost of building a robot is going to continue to drop significantly.

We already have evidence that robots can succeed. Driverless cars might be the best example. One doesn't have to look very far into the future to foresee driverless cars being a major phenomenon. I can't believe that Uber really expects to make a fortune by poorly paying and mistreating human drivers such that the average Uber driver lasts less than half a year. Surely Uber is positioning itself to have the first fleet of driverless taxis, which will be very profitable without any labor cost.

We see robots being integrated into the workplace more so than into homes. Amazon is working feverishly towards totally automating its distribution centers. I think this has been their goal for a decade, and once it's all done with robots, the part of the business that has always lost money for Amazon will become quite profitable. There are now robots being tested in hospitals to deliver meals, supplies, and drugs. There are robot concierges in Japan. And almost every factory these days has a number of steel-collar workers. You have to know that Apple is looking forward to the day soon when they can make iPhones entirely with robots and avoid the bad publicity they keep getting from their factories today.

The average person will look at video from the recent DARPA challenge, see clumsy robots, and be convinced that robots are still a long way off. But almost every component needed to make robots better is improving at an exponential pace, and we know from history that things that grow exponentially always surprise people by 'bursting' onto the scene. I would not be at all surprised to see a workable home maid robot within a decade and a really awesome one within twenty years. I know that when there is a robot that can do the laundry, load the dishwasher, wash the floor, and clean the cat litter, then I am going to want one. Especially cleaning the cat litter. Is somebody working on that?