Looking Into 2014

As any year comes to a close it’s always fun to look forward to the next year and to make some guesses about the direction of our industry. I have always done this, but this will be the first year I put my guesses out in public with a blog. I plan to come back at the end of next year to see how I did.

More Consolidation of the Big Players. While the new FCC Chairman says that he is pro-competition, I think 2014 is going to see a lot of consolidation among the big players in the industry, which will lead to less competition. It’s likely that there will be major acquisitions in both the cellular and the cable TV space that will reduce the number of major companies in each industry. One has to wonder at what point the FCC will say no to acquisitions, but I don’t think 2014 is going to be that year.

AT&T’s Response in Austin Will Squelch Major Market Competition. I predict that AT&T’s announcement that they will build fiber-to-the-premises to match what Google is doing there is going to kill competition in NFL cities for a while. I don’t expect any major new announcements of plans to build in NFL cities in the coming year. There will still be new FTTP overbuilders in smaller markets, but everybody is going to be gun-shy about committing money to major markets.

Network Neutrality Will Erode. The FCC is going to follow the lead of the new Chairman and will support large company initiatives to weaken network neutrality. This might be done through inaction, in that some large carriers may make arrangements to give preferential treatment in the network and the FCC may fail to halt the practice.

Transition to All-IP Network Will Creep Forward. While the large telcos would all like the transition to happen overnight, it’s probably going to take 3 – 4 years to transition most of the POTS network to IP. However, there will be some major steps taken in 2014 to start defining the regulatory framework that will go along with an all-IP network.

The Large Telcos Will Continue to Shed Copper Networks. The large telcos have made it clear that they would like to get out of the copper business. AT&T’s recent decision to bail on Connecticut is just the beginning. I think all that is probably stopping telcos from shedding more copper immediately is the lack of companies capable of buying large numbers of customers. But there will be more piles of customers shed in the next year.

The Smart Phone Will Begin to be the Hub For the Internet of Things. The main thing lacking for the Internet of Things to leap forward is a consolidated platform to bring devices together. While there is a chance that some sort of home platform could eventually win this battle, I think 2014 is the year when more and more IoT devices are integrated with smartphones as the hub. If smartphones capture this role early they will be the de facto hub for a decade to come.

Customers Will Bail on Cable Faster Than Predicted. The phenomenon of households dropping or downsizing cable subscriptions will pick up steam this year and will go faster than predicted by the cable companies. The industry is not going to implode, but it will become obvious by the end of the year that there has to be a new paradigm created for delivery of programming and that traditional cable bundles cannot be the only product offered. It is going to take five years for the current cable model to break, but 2014 will be the year when the erosion becomes obvious.

Build It and They Will Come

Over the years I’ve seen a number of networks built where the service provider didn’t really know how many customers they might get. I always called this ‘build it and they will come’ because they invested money into an expensive new network with the hope that sales and marketing efforts would somehow make the new venture profitable.

Sometimes this has worked and sometimes it has resulted in spectacular failures. In looking at the failures, I see that every one of them exhibited some degree of “build it and they will come”. There are a number of reasons why a telecom venture can fail, but the number one reason is not getting enough customers. The second most common reason is doing a poor job and not retaining customers after they have been signed.

You can’t always avoid ‘build it and they will come’. For example, if you are building a business-only telecom company in a larger market, there is no amount of research that is going to tell you ahead of time how you will do. Selling to business means slugging it out face-to-face with consultative selling, and your success is ultimately going to come down to your ability to sell, the performance and reliability of your network, and your customer service staff.

But there are steps you can take that will help assure success in smaller communities or with business plans that include a lot of residential customers. Historically you could get pretty good market data using well-designed customer surveys. I wrote an earlier blog discussing why surveys are no longer a reliable tool and I won’t repeat that whole argument here. In a nutshell, the fact that most homes no longer have a landline telephone has killed the ability to do a believable random survey in a small market.

So, if you can’t do a survey you can believe, then how can you know ahead of time how well you will do in a given market? If I were investing in a new venture today I would only commit the money if I had pre-sold enough customers to assure myself of success. There are a number of ways to do this:

Strong Market Name. If you have strong name recognition in a new market then you can get potential customers to pledge to your new network. This is the tactic used by Google in Kansas City. They have strong enough name recognition that they could rely on customers to pledge that they would buy service if Google built to their neighborhood. Certainly not every customer who pledges follows through, but most of them do. And so, if you have a good market name, some sort of pledge campaign is probably good enough to give you assurance.

You don’t have to be Google to have good name recognition. For example, a municipal provider might have a good name in its own City if it already does a great job selling water, power and other utilities. In smaller towns a nearby local telephone or cable company might have a good enough name that most people in town know them and want their service. And in parts of the country where cooperatives are prevalent, a neighboring coop can have tremendous name recognition.

Canvass the Area. If you don’t have great name recognition, or if you are an unknown start-up, then your most effective technique is to do a door-to-door canvass and talk to everybody in the proposed service territory. That sounds expensive, but it is not nearly as expensive as building the network and then finding out that nobody wants to buy from you. A canvass is essentially a survey, but when you knock on most of the doors you overcome the limitations of a poorly designed survey, and you can believe the results of a canvass if it is done correctly.

You also have to consider a few other issues, such as whether you want to take a deposit, and if so, whether that deposit will be refundable in the future. Taking deposits will eliminate some people from signing up, but you can feel pretty secure about building a network to people who pony up money ahead of time – they want your service. If you really want to be sure before you build, make potential customers sign contracts. Just beware that such an effort is very labor intensive and will take a long time.

The days are gone where you can build a network and just assume that your marketing skills will make you successful. You need to put the marketing effort in up front, before you commit to the build. Only then can you have a good assurance that your new venture has a real chance of success. Anything less than that is a crap shoot.

Another Look at the Future of Television

There is a report published by EY (formerly known as Ernst & Young) that identifies six trends it sees changing the future of television.

Better Use of the Omniplatform Environment. In the early days of television in the 1950s, programmers adapted existing radio shows since they didn’t really understand the changes made possible by the new TV medium. Today we are at a similar cusp in the way that people get their entertainment. A large percentage of TV viewers also use a laptop, tablet or smartphone while watching TV. The smart programmers are going to find ways to take advantage of the way viewers want to see content.

So EY is predicting that some programmers will adopt a new kind of programming that crosses multiple screens simultaneously. Such shows will cater to and expect viewers to be watching with multiple screens to get the full story. They also think programmers will use this medium to involve viewers in defining the story line.

Greater Demand for Mobility. If you have been following the tech news you see that there are new technologies involving transparent and flexible screens that are going to make it possible to show video content almost anywhere. As screens get cheaper and more abundant viewers are going to demand a far greater degree of mobility than they have today.

And mobility will mean more than just having the ability to watch a TV show on a pad. It means the ability to have content follow the viewer on the move. It’s already possible for a techie to have viewing follow them from room to room in the house. But the ultimate mobility will come when somebody watching a football game can follow the content from living room to vehicle to the grocery store.

There Will Remain a Social Context to Viewing. People like watching television with other people and social viewing increases for major events like the Super Bowl or the Academy Awards. People are going to continue to want a social context to viewing and EY predicts that programmers will find ways to take advantage of this. They think some of the most popular future content will be that which draws people into it for social reasons instead of just for the content.

Content Will Become Personalized. Surveys show that almost nobody likes the effort required to find the programming they want using a settop box. EY predicts that smart software will be developed to help people quickly find the programming they want. Further, this software will learn each viewer’s tastes and will suggest programming similar to what they most like to watch.

There are already early versions of this concept in play, but the real breakthrough will come when the content suggested is spot on to the viewer rather than just something from a similar category to things they recently watched.

Binge Viewing is Here to Stay. As somebody who is in a household of binge viewers, I agree with this observation. In our house we each tend to watch entire TV series end to end rather than view a number of different shows. We are already addicted to Amazon Prime for letting us watch the way that pleases us. I personally have been doing this for decades and am glad to now be freed from the expensive need to buy boxed DVD sets of my favorite shows.

New Content Providers Can Succeed. The bundled content packages offered by cable companies are under siege. People have so many options to watch what they want that more and more people are leaning away from bundled cable packages. There are new content entrants all of the time, such as the new programming being developed by Netflix and Amazon Prime. This means there will be opportunities for new content providers to succeed without having to have access through the existing network structure.

The Battle for Austin

An interesting battle is shaping up in Austin as AT&T and Google take the early steps toward head-to-head competition. Both have announced that they will build gigabit networks in the City. The obvious beneficiaries of this battle will be the top-end customers in the City. It will be interesting to watch how both companies do there.

Other than a few greenfield tests, this will be AT&T’s first foray into fiber-to-the-premises. They have built their broadband business using DSL over multiple copper lines. It’s obvious that AT&T is drawing a line in the sand with Google and telling them that competition with fiber in AT&T markets is going to be met with competing fiber.

AT&T has announced their pricing for their faster product. Initially they will be offering Internet speeds of up to 300 Mbps, with the promise that those products will be upgraded to a gigabit for free once the fiber has been built. This certainly gives them a leg up early since they have the ability to sign customers now.

There are two pricing options for the AT&T data product. For $99 customers will get the full gigabit (after the upgrade). But interestingly, customers will be able to get the same gigabit speed for $70 if they agree to let AT&T monitor their Internet usage and send them directed advertising. That makes you pause for a second, until you realize that this is the Google model. Every customer who uses a Google product, be it Gmail, Google+ or any of their host of other products, is continuously monitored so that Google can know more about them. I think AT&T is being quite clever, in that this compares their $70 product directly to Google’s product. What I think AT&T is really offering is a premium-priced product that comes without monitoring.

Both companies offer a handful of cable TV options. At least for now one would think AT&T has a leg up in this area since the word in the industry is that customers like all of the programming options they get with today’s U-Verse offering.

If Google sticks with the same product line they have in Kansas City, then they will also be offering a $70 gigabit offering and a few cable options. So the two companies will have the same basic price for gigabit service and will not be competing on price.

A gigabit product priced at $70 is clearly aimed at the more affluent households in the market. A lot of homes are going to find that too pricey regardless of the speeds that come with the product. In Kansas City, Google only rolled out their gigabit product in neighborhoods that guaranteed them at least a 15% take rate. It is going to be interesting to see, with two gigabit providers in Austin, whether there are many places where Google will be able to achieve that same take rate. If they can’t get that, how much will they build in Austin?

In any market a large percentage of households go for products in the $40 range for Internet, regardless of what other speeds are available. To some degree this is a matter of economics, but it also has a practical aspect. Most likely the households who subscribe to a $40 service in Austin are those that have not yet chosen to watch much of their video on the web. Households with multiple people all trying to use the web for video are finding basic Internet products to be inadequate.

There is another competitor in the market, Time Warner, and nobody is talking about them. One has to think that today they are the predominant ISP in Austin, since the cable companies have won that battle over DSL almost everywhere. One would think that if they can offer something relatively fast, say 50 Mbps download for less than $50, they might hang on to the majority of the market while the other two companies beat each other up going for the top end of the market.

One last point to mention is that I am scratching my head trying to figure out how AT&T is going to deliver speeds today of ‘up to 300 Mbps’ over existing copper. Such speeds over DSL either require the customer to be very close to the DSLAM or else require multiple pairs of copper, far more than the normal bonding of two pairs. From what we know about AT&T’s normal networks, those are not practical alternatives. There are fiber-to-the-curb technologies that will deliver 300 Mbps, but those require fiber very close to the home. So that claim has us wondering whether it is a real claim or a marketing claim.
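As a rough sanity check on that skepticism, here is a back-of-the-envelope sketch. The per-pair rates below are purely illustrative assumptions about VDSL2-class DSL at various loop lengths, not AT&T’s published numbers:

```python
import math

# Illustrative per-pair DSL rates at different loop lengths. These are
# rough assumptions for the sake of the arithmetic, not AT&T's actual specs.
rate_per_pair_mbps = {
    "short loop (~1,000 ft)": 100,
    "medium loop (~3,000 ft)": 50,
    "long loop (~5,000 ft+)": 15,
}

TARGET_MBPS = 300

for loop, rate in rate_per_pair_mbps.items():
    pairs = math.ceil(TARGET_MBPS / rate)
    print(f"{loop}: ~{rate} Mbps/pair -> {pairs} bonded pairs needed")
```

Even under the generous short-loop assumption it takes three bonded pairs, and at ordinary loop lengths the pair counts quickly become impractical.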

Should You Have a Data Cap?

Over the last few years most of the cable companies and some telcos have implemented data caps on high-speed Internet access. They always claimed that caps were necessary to help protect their networks from congestion. They claimed that heavy users would clog the networks and make data speeds slow for everybody else. But as someone who sees hundreds of networks, I can say this claim has no technical validity, except in some isolated instances and in some parts of some networks.

Michael Powell, the head of the National Cable & Telecommunications Association, admitted publicly last week that caps are not about congestion but rather about ‘pricing fairness’. In the telecom world there is a general rule of thumb that the most active 15% of your users will use 85% of any resource, be that minutes, data, etc. And it’s a pretty good rule of thumb. If the cable companies had come along and lowered prices for the 85% who are not heavy users and then made up the difference with higher rates for the 15%, then his argument would resonate with the public. But nobody saw any rate reductions, and it’s hard to see data caps as anything more than a way to make even more money from data service.
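To see what real ‘pricing fairness’ would imply under that rule of thumb, here is a quick hypothetical sketch (all of the numbers are invented for illustration):

```python
# Hypothetical market: 100 customers, all paying a $50/month flat rate today.
customers = 100
flat_rate = 50.0
total_revenue = customers * flat_rate  # $5,000/month

# The 15/85 rule of thumb: 15% of users consume 85% of the resource.
heavy_users = 15
light_users = customers - heavy_users

# Usage-based prices that would recover the same total revenue.
heavy_price = total_revenue * 0.85 / heavy_users
light_price = total_revenue * 0.15 / light_users

print(f"Heavy users (15 of 100): ${heavy_price:.2f}/month")  # $283.33
print(f"Light users (85 of 100): ${light_price:.2f}/month")  # $8.82
```

If fairness were truly the goal, the 85% of light users would have seen dramatic rate cuts. Nobody did.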

One need only note that the US has some of the highest-priced Internet service in the world to poke holes in the NCTA’s argument. If you compare US rates to the Far East or Europe, it is easy to see that our rates are way out of line on a cost per megabit available to customers.

To make it worse, cable companies are starting to raise data rates. And this follows a ten-year period in which the underlying cost of raw data has gotten cheaper every year. A cable company may have set a monthly rate of $40 or $50 a decade ago, but over that decade the cost of buying wholesale access to the Internet has probably dropped by 90%. It’s my opinion that cable companies know that in another decade they are going to mostly become ISPs, since cable and telephone are both dying products. They are starting to creep rates up now to hedge against the day when data is their only product.

But even assuming that our rates are too high and that profits are really high, should any ISP consider any sort of cap or limitation on how customers use their data? I think the answer is yes, and not for any of the reasons that the cable companies have given.

Using my metric, 15% of the users on a network create most of the data usage. But absent any rules on how the network can be used, a small number of them could account for most of that group’s usage. For example, customers who operate servers, such as an ecommerce site or another very busy site like a pornography server, can use huge amounts of data on the network. Much of that data is sent in the upload direction and doesn’t cost a carrier as much as downloaded data, but a few such sites can actually clog part of the network if they are busy continuously. The way around this problem is a prohibition against running servers on a basic residential data product. But if you are going to have this kind of policy you also need some way to measure how much data each customer is using.

On the download side of the equation, there are always a few customers who abuse any system. There are Internet hoarders just as there are hoarders of anything else, and so you might want to set a cap that discourages continuous downloading. Comcast has lately implemented data caps of around 300 GB in a lot of markets. A customer downloading movies at very high quality uses around 2 GB per hour; non-HD movies use about half of that. And so a 300 GB data cap limits people to watching 150 hours of HD programming or 300 hours of normal programming per month. That works out to a limit of 5 hours per day of HD programming or 10 hours per day of normal programming. That may seem like a lot, but if each person in the family is watching their own programming, that is a really small limit.

I have advised my clients to institute a fairer cap, but still to have one. For instance, a cap set at 1 TB (1,000 GB) allows for more than three times the usage of the Comcast cap. Anybody going over a 1 TB cap is likely a data hoarder, because that requires somebody to be downloading video more or less continuously every day of the week. Every network has a different configuration, so this is not a hard and fast limit. But I suggest some limit on data, set at a very high rate that will only affect a truly small handful of people. The Comcast cap is set to make any family who actually uses their bandwidth pay more. My suggestion is to set a cap that stops real abuse while giving people what they have paid for.
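The arithmetic behind both caps is easy to reproduce. A quick sketch, using the same rough assumption of 2 GB per hour for HD video:

```python
# Rough streaming assumptions: ~2 GB/hour for HD video, about half that for SD.
GB_PER_HOUR_HD = 2.0
GB_PER_HOUR_SD = 1.0
DAYS_PER_MONTH = 30

for cap_gb in (300, 1000):  # the Comcast-style cap vs. my suggested 1 TB cap
    hd_hours = cap_gb / GB_PER_HOUR_HD
    sd_hours = cap_gb / GB_PER_HOUR_SD
    print(f"{cap_gb:>5} GB cap: {hd_hours:>4.0f} hrs HD ({hd_hours / DAYS_PER_MONTH:.1f}/day), "
          f"{sd_hours:>5.0f} hrs SD ({sd_hours / DAYS_PER_MONTH:.1f}/day)")
```

At 1 TB a household could stream HD video for more than 16 hours every day before hitting the cap, which is why only a true data hoarder would ever notice it.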

The Future Viability of WiFi

Just last week I wrote a blog that talked about how busy the WiFi spectrum is getting. It seems like every telecom business is using or has plans to use the spectrum in a big way. Since I wrote that blog I noticed the following article which outlines how wireless companies intend to deploy WiFi transmitters as part of their urban cell sites (and any other cell site that experiences congestion).

Just about every telecom business has some use for WiFi. Telcos and cable companies use WiFi in their data routers to spread data around homes and businesses. The wireless carriers are all planning on using WiFi to offload their tremendously busy licensed spectrum. Businesses use it to set up public hotspots. Settop box manufacturers are going to use it to serve multiple TVs in your home. Devices that connect to your TV like Roku and PlayStation use it. And it is becoming the default spectrum for the Internet of Things, with billions of devices going to be made WiFi capable.

So this raises the issue of whether there is some point when there are just too many different users trying to use the same spectrum in the same area at the same time. People use the WiFi spectrum because it’s free. But like any radio spectrum, it has physical limitations. At some point we can simply overwhelm a given spectrum band in a given area, and it will not work well for any of the applications trying to use it. It sure seems to me like we are headed toward that possibility with WiFi. I am sure everybody remembers the 90s, when cordless phones came out. The spectrum got so busy in some neighborhoods that the phones just wouldn’t work.

Without getting too technical, let me discuss some of the issues associated with radio interference. There are a whole lot of different ways that interference can affect a spectrum in a given location. Consider some of the following:

Adjacent Channel Interference (ACI). The WiFi spectrum is not one big swath of bandwidth but is divided into discrete channels. Earlier versions of WiFi used one channel per transmission, but the latest standards allow for bonding channels together. Many of the problems experienced in the real world with WiFi come from the fact that many of the devices using it are not built to the same high standards you find in licensed spectrum gear. And so there are numerous devices that bleed into adjacent channels. When such a device is transmitting, it not only uses the assigned channel but pollutes the channels on either side.
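To make the crowding concrete, consider the 2.4 GHz band, where channel centers sit only 5 MHz apart but each transmission occupies roughly 22 MHz of spectrum (using the classic 802.11b numbers as a simplifying assumption). A minimal sketch:

```python
# A sketch of why 2.4 GHz WiFi channels collide: centers sit 5 MHz apart
# but a classic 802.11b transmission occupies roughly 22 MHz of spectrum.
CHANNEL_WIDTH_MHZ = 22  # approximate occupied bandwidth

def center_mhz(channel: int) -> int:
    # US channels 1-11 are centered at 2412, 2417, ... 2462 MHz.
    return 2407 + 5 * channel

def overlaps(a: int, b: int) -> bool:
    # Two channels interfere when their centers are closer than the channel width.
    return abs(center_mhz(a) - center_mhz(b)) < CHANNEL_WIDTH_MHZ

for ch in range(1, 12):
    clashes = [o for o in range(1, 12) if o != ch and overlaps(ch, o)]
    print(f"Channel {ch:>2} overlaps channels {clashes}")
```

Run it and you see that only channels 1, 6 and 11 can operate without stepping on each other, which is why a crowded apartment building runs out of clean channels almost immediately.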

Electromagnetic Interference (EMI). This is the interference that we all remember from listening to AM radio. This is interference that comes from some outside source, which can range from microwave ovens, computer terminals, solar flares and doorbell transformers to hundreds of other sources. WiFi is not immune from external interference, and so part of the spectrum is eaten up by this ambient noise.

Co-channel Interference (CCI). This is the interference that comes when more than one user is trying to use the same channel at the same time. In the voice world this is known as crosstalk, which we have all experienced on cell phones from time to time. But in a data transmission this manifests as slower data speeds, since each concurrent user loses part of their signal.

Common Mode Interference (CMI). This is interference that comes from using spectrum to conduct two-way transmissions. This is basically interference between transmitting and receiving WiFi signals at the same time.

As we put more and more discrete WiFi paths into the same neighborhood, the effects of each of these types of interference get magnified. In any given area there is at least a little bit of all of these types of interference; that is inherent in the physics of how radio waves interact with each other. There are engineering techniques that can be used to minimize interference. For example, it’s typical to put the transmit and receive signals as far apart as you can get them. But if you put enough different signals into the same environment, there comes a point where no technique can overcome the sheer facts of physical interference. The spectrum can get overwhelmed and essentially becomes worthless until the demand on it decreases.

I know there are a lot of scientists and engineers who look at all of the planned uses for WiFi and just shudder, because in urban environments it is likely that the spectrum is going to get overwhelmed and none of the uses will work as they should.

Google and Regulation

AT&T said last week that they were not required to give Google Fiber access to their poles in Austin, Texas. AT&T owns about 20% of the poles there, with the City owning the rest. And from what I can see, AT&T is right. This all comes down to various regulations, and it appears that Google is doing everything possible to avoid being regulated in any way. They seem to have set up a business plan that lets them claim to escape regulation. Let me look at the nuances of what they are doing.

There is a federal set of rules that say that pole owners must provide poles to any certified telecommunications provider. According to the Telecommunications Act of 1996, the states have the right to grant certifications to carriers. Every state provides at least two kinds of carrier certifications – CLEC and IXC. CLEC is the acronym for Competitive Local Exchange Carrier and is the federal term used to describe competitive telephone providers. IXC is the acronym for Interexchange Carrier and is the certification given to companies that only want to sell retail long distance.

Some states have other categories. Some states have a certification for a Competitive Access Provider (CAP) or for a Carrier’s Carrier. These two certifications are generally given to companies that only want to sell services to other carriers. They may sell transport, collocation or other services that only carriers can buy.

A company must obtain a CLEC or CAP certification if it wants to gain all of the rights that come with such certification. This includes access to the poles and conduits of other carriers, the ability to interconnect with other carriers, and the ability to collocate equipment in the offices of other carriers. A CLEC certification also grants a company the right to bill ‘telecom’ products to customers, meaning traditional telephone or traditional TDM point-to-point data services. These are generally rights that anybody who is building a network or providing traditional telecom services must obtain before other carriers will talk to them. But along with those rights come some obligations. Certified carriers are subject to paying some regulatory fees and to collecting other fees and taxes from their customers. Regulated companies have to follow rules that dictate how they can disconnect non-pay customers. Regulated companies in some states even face some light regulation of pricing, although there are very few rules anywhere dictating how a competitive carrier prices its services.

So strictly speaking, AT&T is completely within its rights not to even talk to Google about pole attachments, since Google does not have, and does not plan to obtain, a certification. As it turns out, AT&T reports that they are talking to Google anyway and are negotiating a deal to let them on the poles. And honestly, that steams me a bit, because this is how big companies treat each other. I am sure that there is enough business between AT&T and Google that AT&T doesn’t see any sense in going to war over this kind of issue. They would also be seen in Austin as holding up progress, and further, Google could always get the certification if push came to shove. But if this were any company smaller than Google, AT&T would be refusing to even open a discussion on pole attachments or any of the other issues associated with being certified. AT&T would insist that any other company jump through all of the regulatory hoops first. This I know because I have experienced it numerous times. I guess it pays to be as big as Google.

AT&T would also be required to provide access to the poles if Google were a cable TV company. That is a designation granted by the local community, and the City of Austin could negotiate a cable franchise agreement with Google. But Google is taking the stance that they are not a cable TV company. They claim instead that they are a video service provider, because they deliver two-way cable TV service, meaning that the customer’s settop box can talk back to Google since they offer IPTV. This takes advantage of a loophole in the law, because today every large-city cable system is two-way, since customers in those systems can order pay-per-view or video-on-demand from their settop boxes.

But Google does not want to be a cable provider, because one nuance of the FCC rules says that anybody getting a franchise agreement essentially has to sign onto the same rights and obligations as the incumbent cable company. The big catch in those rules is that Google would have an obligation to cover the whole City, while they instead want to pick and choose the neighborhoods they serve. Google would also have to collect franchise fees from customers for their cable TV product, and such fees are around 3% of the cable bill in most places.

State regulators and cities are both willing to overlook these regulatory nuances for Google because they are so big and because they promise to bring gigabit data speeds. But these same rules never get overlooked for smaller companies, and so I guess regulations only really affect the small guys any more.

How the Internet of Things Will Transform the World

I read a lot about the Internet of Things. Some of the best writing on it recently is a series by Christopher Mims for Quartz. In the two articles he’s written so far he talks about how the IoT is going to transform our lives. I don’t think very many people are yet aware that this change is coming or how life-altering it will be.

It’s easy to think about the IoT as being a smart device evolution that will make our lives a little easier. People today talk about putting together smart homes where the lights turn on and off as you enter the room and where you can change the temperature and other household settings from your smart phone. And today that is the IoT. Right now companies that make devices are starting to allow them to be IP and WiFi addressable so that we can communicate with them using a smart phone, tablet or computer.

But if that is all the IoT ever becomes, it would be an incremental improvement in daily life for the geekiest of us, but it would be far from transformational. I believe that we are entering the century in which the IoT will transform our world. Within a decade we are going to see huge changes in our lives akin to the sort of changes that came from the introduction of the Web. And I am not sure any of us can really imagine what the IoT will do by the end of this century. But I know that the year 2100 will be as different from 2000 as that year was from 1900. We went from a world of horse-drawn vehicles to a computerized world in a century, and the changes due to the IoT will make today’s world look just as quaint as the horse and buggy does to us now.

So what are these big transformational changes? The article talks about some of the tools that the IoT will use. All of the types of new tools and techniques noted in the article are the necessary underpinnings of making a new world where people and devices become fully integrated. That total integration is the biggest change coming from the IoT.

Imagine a world where you walk into a party and nobody there is a total stranger to you because your personal computer will tell you who everybody there is and what their connection is to you. You will know without asking that the guy you don’t recognize is your neighbor’s brother visiting from Cleveland and the couple you’ve never seen are your friend’s dentist and his wife. The social changes coming from the IoT will be transformational because social media will move off the Internet and into the world and will be everywhere you go.

The IoT will also finally bring to pass the early promise of the Internet that all information will be instantly at your fingertips. Today you can search for things on the web, but you generally have to do a lot of work to separate fact from fiction. Web search engines bring up every reference to the topic you are interested in without helping you know what is true or false. But we are not far from a time when you will always be connected to the web by some personal device, and programs will be smart enough to automatically separate fact from fiction and bring you the answers you want. Arming the world with instant facts will be transformational.

Probably the most astounding change for people will be in the medical field. We will have nanobots in our bloodstream that will alert us when they see something wrong. This means we will be able to head off colds and infections immediately, and the days of flu epidemics will be in the past. Many of our current medical practices will look like the dark ages. If a child needs to have their tonsils out, we will let these nanobots gently remove them, cell by cell, over weeks, so that there is no need for a scalpel and the associated risk and trauma. People with various chemical imbalances will finally have a local and effective treatment and cure. By the end of the century people will have to ask what cancer was, because it will be a ‘disease’ of the past.

3D printing will be transformational. Today this is a novelty, but within a few decades this is something that your house will do for you in the background without you having to do anything more than telling it what you want. Your house will order the needed raw materials for the 3D printer, which will be delivered by robot-powered delivery vans. Your smart house will then make the needed item as requested. I remember a Star Trek episode where underground robots made things that people wished for. We are not too many years away from some version of that reality.

Will Software Eat the Telecom Industry?

A few years ago Marc Andreessen, the founder of Netscape, coined the phrase ‘software is eating the world’. By that he meant that software-based systems are killing off traditional industries one by one.

It’s a great observation, because we have seen entire industries crumble from web competition. And just about every industry that doesn’t involve a direct physical service, such as transportation, or the manufacturing of goods has felt the pinch. We still need to go see a doctor when we are sick (although a lot of us will go to WebMD and misdiagnose ourselves first).

Everybody is familiar with the industries that have been decimated by competition with software. How many of you still use a travel agent when you want to book an airline ticket? When was the last time you drove somewhere to rent a movie for the evening? How many music CDs have you purchased this last year? Those industries are largely gone. And many other industries have been injured by software, if not outright killed. For example, a large percentage of stock trading is now done online without the need of a stockbroker. There is barely an industry that hasn’t felt some pinch due to software.

Looking out a decade or two it’s easy to foresee a whole lot more industries and jobs that are going to be killed by software. Driverless vehicles are likely to eventually get rid of cab drivers and truckers. 3D printing is going to wipe out a ton of manufacturing companies. When you need a small widget you’ll just print your own, and you’ll even be able to go to a friend to print a new 3D printer.

Can the same thing happen to telecom? Certainly the executives at cable companies seem to be in denial that the web is changing their business model. They have an excuse and a story for every decline they see in subscribers. I think it is clear to anybody who understands the industry that for a communications company to be relevant a decade from now, it will need to change its corporate identity and become an ISP. If you can deliver a fast pipe you can survive, and possibly even thrive. But at some future point there is no longer going to be a company that is primarily a cable company.

Anybody who thinks that voice and cable TV, two legs of the triple play, are not in danger of going the way of the CD store or the video rental store is kidding themselves. The history of products that have been killed by software is that, at some point, the general public sees more merit in the new software version of the product than in the traditional one. Such changes can be rapid and viral; it only took a few years for people to stop buying music CDs and renting movies at a local store. It was much easier and more affordable to do it the software way.

What the cable companies seem to be ignoring is how much the public is talking about and thinking about dropping their product. I know I am in the industry and so I am attuned to any discussion about telecom topics. But I have noticed that almost everybody I know has either dropped cable or has seriously thought about dropping or downsizing their cable subscription. If that idea goes viral and becomes conventional wisdom, then a huge percentage of cable subscribers could disappear in a very short time.

I am positive that the executives in all of these other industries saw the end coming. But in the US corporate world, with an overriding emphasis on quarterly earnings, no executive from the companies that are now dead ever came out and publicly said, “Our industry as we know it is doomed and our company needs to change massively if we want to survive”. Instead we saw the music and other industries go down in flames rather than admit publicly that they couldn’t make it. Our financial system punishes those who tell an ugly truth, and so company after company died without ever publicly admitting they could not compete with the new paradigm.

So, will software kill the telecom industry? I think it’s inevitable that it will kill voice and cable TV as products. The product that will survive in some form is fast data, because without that none of the software killer products will work. Fast internet, along with electricity and water will be the basic utilities of the future. Will software eat the telecom industry? Probably not, but it sure as hell is going to take a big bite out of it.

How Long Should it Take to Pay for a New Asset?

One common question I get is, “How does a company determine how long it should take to get back a capital investment?” I get this question concerning all types of investments – the investment to add one customer, the investment to add a new product line, the investment to expand to a new town. While the answer is different for every company, the way that you look at the issue is the same for everybody. I think you need to look at the following factors in determining what is right for your company.

Revenue Generated. It’s easier to answer this question if you are building an asset that directly contributes to generating a specific new revenue. For instance, if you build to a new customer’s house or build to a cell site, the revenues from that new location can be directly attributable to the assets being added. But things aren’t always this clear. For instance, you might be adding a new router that contributes to many existing revenues as well as enables some new ones. Or you might be doing an upgrade on a cable headend that will make the product marginally better but that will probably not attract new customers. The more direct the relationship between a revenue stream and an asset, the easier it is to talk about paying for that asset.

Margin Generated. Once you have determined the revenues associated with a given asset, you then want to look harder at the margins on those revenues, because it is only the margins that can be used to recover the cost of the asset. For example, if you pay 50% of your cable revenues for programming, then you only have the remaining 50% of margin left to recover the cost of an asset used to supply cable TV. I very often see companies be too simplistic and say things like ‘this new revenue will pay for itself in two years’. But when you look at the facts, they often mean that they will generate enough gross revenue to equal the cost of the asset, which is not the same as having generated enough margin to actually recover the cost of the asset.
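A minimal sketch of that distinction, with invented numbers for a hypothetical cable asset:

```python
# Hypothetical numbers to show gross-revenue payback vs. margin payback.
asset_cost = 24000.0       # cost of the new asset
monthly_revenue = 1000.0   # gross revenue attributable to the asset
programming_share = 0.50   # e.g., half of cable revenue goes to programmers

naive_months = asset_cost / monthly_revenue               # payback on gross revenue
monthly_margin = monthly_revenue * (1 - programming_share)
real_months = asset_cost / monthly_margin                 # payback on actual margin

print(f"Gross-revenue 'payback': {naive_months:.0f} months")  # 24 months
print(f"Margin-based payback:    {real_months:.0f} months")   # 48 months
```

The ‘two-year payback’ on gross revenue is really a four-year payback once programming costs are netted out.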

Financing Term. If an asset is financed with debt, then you can stretch the recovery of the asset for as long as the term of the debt if you want to. That may not be the best business goal, but you don’t really pay for the asset until you make the debt payments. One important thing, though: if you finance an asset you need to add the interest and other financing costs to the cost of the asset.

Contract. Sometimes you will have a contract specifically designed to recover the cost of an asset. For example, if you build fiber to reach a cell tower and get a 5-year contract for bandwidth at that tower, then ideally the revenue generated at the tower will pay for the asset within the five-year term of the contract. You should think about breakeven and profitability as if the contract will not be renewed.

Churn. Companies often fail to consider churn when looking at the recovery of assets. For example, suppose you are building a fiber network to a new town, and further suppose that you have 10% customer churn each year. In that case, you need to factor the churn into whether you can recover the cost of adding drops and electronics for customers. While you may have some customers who will last ten or more years on the network, there will also be customers who drop off in the first and second year and strand some of your new investment without any compensating revenue stream.
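One way to see the churn effect is to compare the simple payback with what you actually expect to collect, again with invented numbers:

```python
# Hypothetical per-customer numbers showing how churn affects recovery.
monthly_margin = 40.0    # margin per customer per month
drop_cost = 1200.0       # cost of the drop and electronics per customer
annual_churn = 0.10      # assume 10% of customers leave each year

# Simple payback if every customer stayed forever: 30 months.
print(f"No-churn payback: {drop_cost / monthly_margin:.0f} months")

# A customer who churns after one year strands most of the investment.
print(f"Stranded if a customer leaves after year 1: ${drop_cost - monthly_margin * 12:,.0f}")

# Expected margin per install over 10 years, with survival decaying by churn.
expected = sum(monthly_margin * 12 * (1 - annual_churn) ** yr for yr in range(10))
print(f"Expected 10-year margin per install: ${expected:,.0f}")
```

On average the investment is recovered, but every early dropper strands hundreds of dollars of plant, which is exactly why churn belongs in the payback math.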

Tolerance for Risk. This is the most intangible item on the list, but often the most important. It is important for each company to be realistic about their tolerance for risk. For example, a company that doesn’t have much access to debt and which must pay for new assets out of current cash flows has a much lower tolerance for risk than a company with deep pockets or an open credit line. The first company really cannot afford to make any mistakes in rolling out new investment and they also need to recover any investment made faster than the second company.

A company needs to take all of these factors into consideration. There is no industry-standard answer to the question, but there is a right answer for everybody. A lot of the answer comes back to the last bullet point – in your company, when are you going to need the cash back from an investment in order to make another investment?