The Ongoing Special Access Battle

There is a big battle going on at the FCC over special access rates, and the FCC has promised to finally weigh in later this month. Special access in the industry refers to the sale of dedicated TDM circuits like T1s and DS3s. To some extent this is a fight between the big RBOCs and all of the CLECs and other carriers that need special access circuits to reach customers. But this battle actually affects all of us, because the businesses you deal with, such as banks or other entities (like your local government), are big users of these products.

There are several different issues being contested in the special access investigation currently underway at the FCC. But most of the battle is about the price of special access as well as the unfair practices of the big special access providers like Verizon and AT&T. This whole fight comes down to money, and special access is still a major source of revenue for the big telcos. To show you how big, USTelecom, the lobbying arm for the big telcos, sponsored a study that said that regulating special access rates could cost 43,560 jobs and $3.4 billion in economic growth over five years.

Special access rates are very high. They were set in the days when TDM circuits were the state-of-the-art technology. We all remember when it could easily cost you $700 per month or more to get a T1 to a business – a lot of money to pay for a symmetrical 1.5 Mbps connection. Last year I looked at the data bill for an urban county, and it was still buying millions of dollars of these special access circuits – at nearly full cost. I estimated the county could cut its bills by 50% to 60% by shifting to an alternate provider.

But therein lies the big rub with special access. Once you get outside of the main business district in most cities, the RBOCs are still the only ones that have wires connected to most buildings. And so, as absurd as it sounds, for a huge percentage of the geography of the US special access is still the only way to provide dedicated transport to a business (meaning its data doesn’t get commingled with another business’s traffic). And this means that special access is what CLECs and other carriers must buy from the telcos if they need access to a given business.

The RBOCs make deals with the largest carriers – Level3, XO Communications and Sprint. These carriers can get a substantial discount on special access due to the volumes they purchase. That doesn’t sound unfair until you look into all of the strings that come attached to the volume discounts. For example, Level3 has complained in the FCC docket that there are markets where Verizon requires them to buy 90% of their connections from Verizon – or else not get the discounts. All of the carriers complain about termination charges. Should a customer of one of these carriers move or go out of business, the RBOCs still demand that the carriers pay the cost of the circuit for the length of the involved contracts – lengths that the RBOCs largely dictate.

And of course, if you are not a huge carrier you don’t get the bulk discount. A business that wants to buy special access on its own (such as a bank that wants to connect to multiple ATMs) must pay the full tariff rates.

Another part of the battle at the FCC is that the carriers want more access to the fiber and Ethernet services owned by the RBOCs. But the telcos are very selective about deciding which of their facilities are open to competitors and which aren’t.

Of real concern in the carrier world is the announcement that Verizon wants to buy the fiber assets of XO Communications. Today most businesses and smaller carriers will buy from the big carriers like XO, Level3 or Zayo because it’s cheaper, there’s less paperwork and these companies are far easier to work with than the telcos.

By buying XO, Verizon will be eliminating one of their largest and most vocal opponents. They also will be folding a lot more fiber into the Verizon networks. The fear is that Verizon will either convert the XO network to special access, meaning the price will go up, or else treat it as fiber reserved for Verizon’s own needs and withdraw the networks from the open market.

In any market where there is a limited amount of fiber built to businesses, removing one of the biggest fiber owners like XO is going to be a big blow to many of those who use it. Many of them are either going to see rate increases or else have to find alternate transport.

There is no telling what the FCC will order in this docket. But the position taken by the telcos is typical and a bit scary. They claim that there is vibrant competition available in the marketplace and they accuse the CLECs and carriers of whining to get cheaper prices. They love their monopoly power in most markets and aren’t going to give it up easily.

Stranding Fiber Investment

There is one issue with fiber-to-the-home networks that doesn’t get talked about a lot. In areas with normal churn – people moving in and out – a fiber network will end up with stranded fiber drops and ONTs that have been built to homes and businesses which no longer have service.

This happens to all networks of course, but the investment from the curb to the customer is a lot more expensive in a fiber network than it is with a coaxial or copper network. The cable and phone companies normally just leave the drop in place and hope that sometime in the future the residents at the address will want service again.

Most of these stranded investments come from a couple of causes. First are people who don’t pay their bills and have been cut off from service by the fiber provider. In any given market, when a new ISP opens its doors a lot of households that can’t pay their bills will try to get service with the new company. And so if a new fiber provider doesn’t do good credit checks they tend to get flooded with the bad-debt customers, and they will have invested in building fiber to a lot of places that aren’t likely to pay them.

But over time most of the stranded investments come from people who move. The new people moving into a home might not want the same service. But more often, the people moving into a home will have automatically called the incumbent cable or telco provider for service – generally not even knowing there is an alternate broadband provider available to them.

This is not an issue in those places where the incumbent is the fiber provider. But for competitive fiber providers this can turn into a sizable problem over time. I know companies that have accumulated stranded investments as large as 10% of the total passings in a market.

I have clients with different strategies for this problem. First, companies using external ONTs need to have a process for retrieving and reusing the ONT electronics at houses they no longer serve. A surprising number of companies leave the electronics in place hoping that they will get the customers back.

But the bigger issue companies face is how to reach new residents before they choose the competitor. People that move into a new town tend to automatically think of the incumbent provider when ordering the triple play, and it’s generally too late to get to them if they’ve already signed a contract for service.

One common strategy is to make deals with the most active real estate agents and rental agents in a market so that they tell new tenants about your fiber service. I have clients who give free service to such folks as a way to induce them to make sure that new tenants know about the fiber.

It’s also vital these days to keep good records on potential customers. If you miss an opportunity with a household that signs a one- or two-year contract with the incumbent, you should have a software program that alerts you when that contract is going to expire so that you can make your pitch later. I’m always surprised at the number of clients that don’t capture and track this kind of information in any usable way. Over time you should know about every home in your fiber footprint. You should know who doesn’t pay bills, who doesn’t want broadband at any price, who has contracts with the incumbents, etc.
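As an illustration of how simple that kind of tracking can be, here is a minimal sketch in Python. The field names are hypothetical – in practice this would live in your CRM or billing system – but the idea is just an expiration alert:

```python
from datetime import date, timedelta

# Hypothetical prospect records; in practice these come from a CRM.
prospects = [
    {"address": "101 Main St", "provider": "incumbent cable",
     "contract_ends": date(2017, 3, 1)},
    {"address": "202 Oak Ave", "provider": "incumbent telco",
     "contract_ends": date(2016, 8, 15)},
]

def upcoming_expirations(prospects, lead_days=60):
    """Return prospects whose incumbent contracts expire within
    lead_days, so you can make your pitch before they renew."""
    cutoff = date.today() + timedelta(days=lead_days)
    return [p for p in prospects if p["contract_ends"] <= cutoff]

for p in upcoming_expirations(prospects):
    print(f"Pitch {p['address']} - contract ends {p['contract_ends']}")
```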

Two markets with an especially large potential for stranded investment are college towns and towns with a military base, where a significant number of residents turn over every year. I have clients who have gotten very creative and work with the colleges and the military to make sure that information about their service is given to new students and incoming personnel.

But the takeaway from this discussion is that you are going to spend more money building fiber than you might have planned for in your original business plan. Fiber drops are not cheap – particularly buried ones – and you are going to build plenty of drops that never drive enough revenue to cover their costs. Your best way to fight this is to always check the credit of potential customers and to have a plan in place to be able to market to new people who move into your community.

Upcoming Webinar on PPPs

I have written a lot lately about Public Private Partnerships (PPPs) in this blog. After many years in which most carriers were leery of municipalities, we are starting to see a lot of beneficial partnerships arise throughout the industry.

I will be on a webinar panel on April 21 at 3:00 Eastern discussing the topic in more detail. The panel is being sponsored by Finley Engineering and presented as part of the Telecompetitor Interact webinar series.

The webinar will look at some practical considerations for forming a telecom PPP.

You can find more details at this web site.


An Alternative to Title II

Since major sections of last year’s net neutrality ruling are being reviewed by the courts, I started wondering what would happen if the courts reverse that ruling and say that the FCC doesn’t have the authority to regulate broadband under Title II.

It’s hard to think that the courts will overturn this completely, because to a large degree the courts aimed the FCC at the current solution in their order vacating the FCC’s first attempt to regulate broadband. But there are a lot of lawyers who think that the FCC rushed the current ruling into place without following its own rules – and that could cause problems in court.

There is one interesting alternative to the net neutrality ruling that was published late last fall. It’s called the Grand Bargain and was published by the Information Technology and Innovation Council (ITIC). This is a think tank that includes several members of Congress, academics, and representatives of tech companies like Cisco, HP, Amazon, Google, Oracle, Intel, IBM, Qualcomm and Microsoft.

The main thrust of the Grand Bargain is that broadband ought not to be considered a “telecommunications service,” which would mean that it would not be appropriate to regulate it under Title II. The ITIC sees nothing inherently wrong with data prioritization and understands that there are already many places in the web and network today where prioritization is essential – such as the priority given to automated stock traders or to gamers.

The ITIC report says that “the real issue should not be prioritization versus no prioritization, but what kind of traffic can be prioritized under what business arrangements.” The ITIC feels that the net neutrality ruling places too much emphasis on the negative aspects of paid prioritization without looking at the overall good it can create.

The Grand Bargain doesn’t just favor pro-carrier solutions but is also in favor of some of the FCC’s thinking on consumer issues like the broadband adoption programs and strong privacy protections for consumers. It’s an interesting proposal that looks at the beneficial ideas that came from both sides of the net neutrality arguments. It’s very much a middle-of-the-road set of ideas and there is something in there for everybody to like.

But the proposal has one big flaw that I think is shared by every alternative net neutrality idea that I’ve seen: the proposal looks at goals that the ITIC would like to see the FCC achieve but does not look at the FCC’s underlying authority to do what they are suggesting. The current net neutrality order was somewhat heavy-handed in claiming Title II authority over broadband, but I can’t see that the FCC has any alternative.

Recall that the FCC’s first attempt to regulate broadband contained some of the aspects of the Grand Bargain, and the courts said that the FCC did not have the authority to regulate broadband in general.

This inability to regulate broadband was the FCC’s own doing when years earlier they had declared that broadband was an information service and was not a telecommunications service. The Grand Bargain and every other alternative to net neutrality fails to deal with the basic underlying question of the FCC’s authority to regulate broadband outside of Title II.

It’s clear that the companies behind the ITIC don’t want to see FCC regulation of the parts of the Internet that they think should be wide open and unfettered. But without some kind of regulation the Internet was already headed towards a very ugly future. It’s not hard to imagine a future where half a dozen large companies control most aspects of the Internet. We were already starting to see hints of that as Facebook and other big web companies were negotiating with large ISPs to make their products part of the base broadband packages – to the detriment of other web content. It’s inevitable that companies like Facebook and Google will try to make deals that expand their reach and influence on the web. Title II regulation looks to be the only way that a regulator could apply the brakes to such deals by regulating the ISP half of the equation.

It would be nice if we had an FCC that could just pick and choose what to regulate and that was free to do the kinds of things proposed by the Grand Bargain. There are other countries that can do this. But the FCC is constrained by the laws that govern telecom and broadband, and the only way for the FCC to regulate broadband outside of Title II is for Congress to give it the direct authority to regulate broadband without having to jump through any hoops. Unfortunately we live in a time of political gridlock and a largely ineffective Congress, and so this kind of solution is not likely coming any time soon. So I am still hoping that the court can find a way to allow Title II regulation. It’s better than all of the alternatives.

The DARPA Spectrum Challenge

DARPA (the Defense Advanced Research Projects Agency) has launched a grand challenge to find a way to use spectrum in the US more efficiently. The prize is called the Spectrum Collaboration Challenge (SC2), and DARPA is offering a $2 million reward to whoever comes up with the best way to adapt in real time to congested spectrum conditions while maximizing the use of our spectrum. The winning entry won’t be a solution that dominates the use of spectrum; instead DARPA is looking for solutions that collaboratively share spectrum in the best manner between multiple users.

DARPA assumes that it’s going to require artificial intelligence to be able to make real-time decisions about spectrum sharing. They realize there is no easy answer, and so the competition will start in 2017 and last until 2020. Probably the coolest thing about the challenge is that DARPA is creating a large wireless test bed they are calling the Colosseum that is going to let participants try out their ideas. This will provide researchers with remote capabilities to conduct experiments in simulated real-life environments such as a busy urban street or a battlefield (which is the main reason they are interested in this).

It’s a great idea because our spectrum in this country is certainly a mess. There are certain bands of spectrum that are used very heavily and other spectrum that lies fallow. Further, the FCC has chopped most spectrum up into discrete channels and provided buffers between channels that go largely unused.

What really makes spectrum a challenge is that different bands are ‘owned’ by different parties and the whole point of buying spectrum from the FCC is for the buyer to use it in almost any way that makes sense to them. But the consequence of spectrum ownership is that huge swaths of spectrum are unused or at least unusable by everybody except the spectrum owner. But one would think in a battlefield situation that just about any spectrum can be used without worrying about the rules.

And while any solution that is found will probably benefit the military more than anybody else, there is still a huge amount of good that could be done with better spectrum collaboration. Certainly spectrum owners could make some or all of the spectrum they control open to collaborative sharing, for some sort of compensation.

A lot of people might look at this idea and think that it could mean great things for cellphones and other mobile communications. But cellphones have a whole different issue that makes them a very poor candidate for sharing across too many different swaths of spectrum. A primary design goal for cellphones is power conservation, and it costs a lot of power to operate antennas on many frequencies.

Most cellphone makers today limit a phone to only using a few different frequencies at once. This is one of the reasons for the huge variance people get in 3G and 4G data rates – many of the phones on the market only look at a few different frequencies, to the detriment of how much bandwidth can be downloaded at any one time. This is something that cellphone makers don’t talk about and you have to look deep into a cellphone’s specifications to understand the frequency capabilities of a given handset.

There are software-defined radios today that are a lot larger than handsets and which can be easily tuned to different frequencies. But doing this on the fly, and accurately, is incredibly challenging today. And of course, to do what DARPA has in mind means coordination and collaboration so that a given sender and receiver are using the same frequencies at the same time. It’s the kind of challenge that can make a wireless engineer’s head hurt, and it probably will take an AI to handle the complexities involved in truly sharing multiple spectrum bands in real time.
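To make the coordination problem concrete, here is a toy sketch – my own simplification, not anything from the DARPA challenge – of the simplest possible collaborative scheme: if a sender and receiver apply the same deterministic rule to the same sensing data, they can rendezvous on a channel without a dedicated control link. Everything hard (radios never see identical measurements, occupancy changes in real time, other users have their own agendas) is assumed away here, which is exactly the part DARPA wants solved:

```python
import random

NUM_CHANNELS = 16

def sense_occupancy(num_channels, seed):
    """Simulate per-channel interference measurements (0.0 = idle).
    The shared seed stands in for the unrealistic assumption that
    both radios see identical sensing data."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(num_channels)]

def pick_channel(occupancy):
    """Deterministic rule: choose the least-occupied channel. Two
    radios with the same data independently reach the same answer."""
    return min(range(len(occupancy)), key=lambda ch: occupancy[ch])

occupancy = sense_occupancy(NUM_CHANNELS, seed=42)
sender = pick_channel(occupancy)
receiver = pick_channel(occupancy)
assert sender == receiver  # rendezvous without a control channel
print(f"Both radios rendezvous on channel {sender}")
```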

The Increasing Importance of Broadband

Anybody who does what I do for a living has all sorts of evidence that the demand for broadband has been growing. For example, I have worked with rural communities without broadband for many years and have found that the number of people in those communities who say they will buy broadband grows larger every year. I now have clients who have built rural networks and who have gotten 75% to 80% of the customers in the market footprint. These kinds of take rates would have been extraordinary five years ago but are now becoming the expected norm.

Pew Research Center has done a new survey that tries to quantify the importance that people place on broadband. They gave the same survey in 2010, and the new survey lets us see how the responses to questions about broadband have changed over time. Here are a few of the new results:

  • 52% of people feel that those without the Internet are at a major disadvantage for finding out about job opportunities or obtaining new career skills. Only 25% thought that this is not a disadvantage.
  • 46% thought those without broadband are at a major disadvantage for learning about or accessing government services.
  • 44% think lack of broadband is a disadvantage for learning new things that will improve or enrich people’s lives.
  • And probably most significant, 69% of respondents overall felt that people without internet access are at a major disadvantage.

We can also see how these responses have changed just since 2010.

  • The share who feel that the Internet is needed for job skills has grown from 43% to 52%.
  • The share who feel that the Internet is needed to access government services has grown from 29% to 46%.
  • The share who feel that access to broadband enriches people’s lives has grown from 41% to 44%.
  • In 2010, 56% of people overall thought not having access to the Internet was a disadvantage; that is now 69%.

For every question studied, the percentage of African-Americans, Hispanics and young adults (ages 18-29) who thought the Internet was vital was higher than for other groups.

Interestingly, those without broadband access at home were slightly less likely to think that not having broadband is a major disadvantage. For example, in the recent poll 65% of them thought not having broadband was a major disadvantage, compared to 69% of all respondents. But this is also the group that saw the biggest change since 2010, when only 35% of non-broadband households thought that was a disadvantage.

These kinds of surveys are interesting, but of course there are a hundred other questions you’d like to see asked. Still, sticking to the same questions that were asked in 2010 shows how much the importance of broadband has grown in just five years.

I see this shift every day. I’ve been helping communities look for broadband solutions for nearly 15 years. Years ago when a community wanted to talk about broadband there were generally two reasons for it. First was economic development, meaning either attracting new jobs to a community or keeping the existing jobs from leaving. Secondly, communities wanted to get some price competition and thought that the incumbent providers didn’t care about their communities.

But today the demand for better broadband comes from citizens demanding a solution from local politicians. People hear of other communities that have found a way to bring broadband and they want the same. People without broadband are starting to feel like they are being left behind – and to a great extent they are. This kind of survey just reaffirms what we already know.

The High Cost of Using Your Data

AT&T just announced that they will be introducing an option for U-verse broadband customers to get unlimited broadband on any of their plans for an additional $30 per month. Along with this announcement AT&T is also increasing the data caps on existing products. For example, some plans will be increased from 250 GB per month in total download to either 400 GB or 600 GB. And the current 500 GB cap will be raised to 1 TB.

This is very similar to the Comcast data cap plan, where customers who exceed their 300 GB cap can pay $30 or $35 to get unlimited data usage. Comcast also lets customers buy additional 50 GB blocks for $10.
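As a quick worked example using the Comcast numbers above (my arithmetic; the exact billing rules may vary), the $30 unlimited option is simply a price on heavy usage – any customer more than three blocks over the cap is better off paying it:

```python
import math

CAP_GB = 300          # monthly data cap
BLOCK_GB = 50         # size of each overage block
BLOCK_PRICE = 10      # dollars per block
UNLIMITED_PRICE = 30  # flat unlimited add-on

def overage_charge(usage_gb):
    """Overage billed in 50 GB blocks above the 300 GB cap."""
    if usage_gb <= CAP_GB:
        return 0
    return math.ceil((usage_gb - CAP_GB) / BLOCK_GB) * BLOCK_PRICE

for usage in (300, 400, 450, 600):
    charge = min(overage_charge(usage), UNLIMITED_PRICE)
    print(f"{usage} GB -> ${charge}")
# Past 450 GB (three blocks over), unlimited at $30 is always cheaper.
```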

What I find amazing about both of these announcements is that the companies are marketing them as if they are giving people something. What they are really doing, especially in Comcast’s case, is punishing people who dare to drop their cable TV product and instead get their video over the Internet.

For anybody who actually uses the data that they pay for each month, both of these plans are nothing more than a $30 rate increase. There is no cost justification for such a gigantic overage charge. Most of my clients (who are very tiny companies compared to Comcast and AT&T) only pay a few bucks per month on average for the raw bandwidth to the Internet for their broadband customers. It’s hard to think that the cost for these giant companies isn’t under $1 per month on average. Customers that exceed these caps might, at most, cost these companies an extra dollar – and that is probably too high an estimate.

I’ve been predicting for several years that data caps were coming and that caps already in place were going to start getting enforced. While the cable companies added 3.3 million new broadband customers in 2015, they don’t have to look too far into the future to see a time when everybody who can afford broadband will have it. The market is starting to approach saturation. And the big companies are also seeing customers nibbled away by fiber providers like Google, CenturyLink and municipalities.

Meanwhile, just about everybody in the cable business is seeing a drop in revenues as people either cut the cord or else downsize their packages. And that trend is only going to accelerate with skinny bundles from the cable providers and a host of OTT options as alternatives to traditional cable.

If you are a publicly traded company like AT&T or Comcast there is tremendous pressure to always grow revenues quarter over quarter and year over year. But at a time when broadband customer counts are going to top out and when cable and telephone are in decline, these companies have few options for new revenues other than broadband rates. That is the main function of the data caps – the big ISPs are gouging their biggest data users first, with the full knowledge that every year more and more people are going to creep over the data cap threshold.

The AT&T announcement also speaks to duopoly competition. Any community that thought they might see some renewed competition between Comcast and AT&T now knows for sure that that isn’t going to happen. These companies are not competing with prices against each other – they are doing the opposite and matching each other in the ways they will increase prices and revenues. That can only happen in a monopoly or duopoly.

This is only the first step in data price increases, and I think we are going to soon start seeing all broadband prices increase every year from these providers, in the same manner as the cable rate increases we are used to. It’s their only real option to keep making the money that Wall Street expects from them.

Regulating Broadband Rates

FCC Chairman Wheeler testified in front of the House Communications Subcommittee recently about the FCC’s authority to set broadband rates. He was testifying about a bill passed out of subcommittee a few weeks ago, introduced by Rep. Adam Kinzinger (R-Ill.), that would prohibit the FCC from regulating broadband rates.

Wheeler cautioned that he was concerned that any law that curtailed the FCC’s right to regulate rates might also inhibit the FCC’s ability to regulate the three basic tenets of network neutrality – preventing blocking, throttling, or paid prioritization of data.

Unless you are an FCC rules junkie it’s probably hard to understand why rates and net neutrality might be tied together. But the Chairman’s concern comes from the FCC’s reliance on Title II as the basis for regulating net neutrality. Part and parcel with the Title II rules comes the ability to regulate rates.

Back when the Chairman was first talking about using Title II rules he said publicly that the FCC wasn’t intending to get into the rate regulation business for broadband. In these hearings the Chairman repeated this and said that the FCC would be glad to help craft language that would limit the FCC’s ability to do traditional rate regulation while making sure not to undo the other aspects of Title II regulation.

As a consumer and one who tracks industry trends I’m not so sure that the FCC should be so quick to give up rate regulation of broadband. I believe that we are at the beginning of the time when we will see continuous annual price increases for broadband. The large cable companies and telcos are under huge pressure from Wall Street to increase earnings every quarter and a lot of their traditional revenue streams like cable TV and telephone service are in a decline. This is going to leave no alternative to the big ISPs but to raise broadband rates.

We’ve already seen the beginning of this. The recent Comcast data cap trials and the recent announcement from AT&T that customers can buy unlimited data for $30 more than what they are already paying for broadband are both nothing more than big rate increases on the biggest users of broadband. All of these companies understand how fast consumer use of broadband is growing. We have been on a curve since the 1980s in which home use of broadband has doubled about every three years, and there is no sign of a slowdown. So the big ISPs set data caps knowing that they will get extra revenue today from perhaps 10% to 20% of their customers, but also knowing that each year it’s going to affect more and more people.
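For perspective, here is the back-of-the-envelope arithmetic behind that claim (my own calculation, using illustrative numbers rather than any ISP’s data):

```python
import math

# Doubling every 3 years implies annual growth of 2^(1/3) - 1, about 26%.
annual_growth = 2 ** (1 / 3) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # ~26.0%

# An illustrative household at 180 GB/month against a 300 GB cap
# crosses the cap in just over two years at that growth rate.
usage_gb, cap_gb = 180, 300
years_to_cap = math.log(cap_gb / usage_gb) / math.log(2) * 3
print(f"Years until the cap is hit: {years_to_cap:.1f}")  # ~2.2
```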

And data caps are only the first place ISPs will raise rates. We’ve seen a number of the large ISPs raise rates a few bucks in the last few years, and as earnings pressure increases one can expect that we are not many years away from a time when data rates will be increased each year in the same manner that cable rates have increased. But there is a huge difference. Cable rate increases have been driven in large part by increases in programming costs (although cable companies usually tacked on a little extra to boost the bottom line). But it’s already clear today that broadband has a huge margin and that, if anything, the cost of underlying Internet connectivity keeps dropping each year. If ISPs raise data rates it’s due to nothing more than wanting to make more money.

And there is fundamentally nothing wrong with any business wanting to make more money. Except that for most markets in the US there is only one dominant broadband provider in the form of a cable company. And even where there is a second provider, like Verizon FiOS, they will undoubtedly be raising rates in lockstep with the cable companies in a pure demonstration of duopoly competition.

So I hope that the FCC doesn’t give up its rate-setting authority, because the day is coming within a decade when it’s going to be badly needed. You can be sure that the ISPs understand this completely and that they are the authors of the bill that would stop the FCC from looking at rates. They know that the FCC isn’t likely to regulate rates today, but they also know that there is going to be a huge public outcry for the FCC to do so in the future, and they are launching a preemptive strike now to win this battle before it starts.

2015 Broadband Growth

One of the things I’ve figured out about the telecom industry is that statistics are often used to tell very different stories. Consider this example regarding wireline broadband adoption:

In December Pew Research released the results of a survey suggesting that overall wireline broadband adoption had dropped to 67% in 2015, down from a high of 70% in 2013. This was the first time I had ever heard any suggestion that the total number of landline broadband connections had flattened out, let alone dropped.

Pew went on to say that the main culprit for the drop in broadband adoption is broadband prices – a lot of homes feel they cannot afford a broadband connection and instead rely solely upon broadband from their smartphone. That sounds plausible, and Pew was comparing to a very similar survey they had given in 2013.

But the Leichtman Research Group just released a report saying that the big cable companies added 3.3 million broadband customers in 2015. They said that during the year the large telcos lost 187,000 landline broadband connections, meaning an overall net increase of over 3.1 million broadband connections for the year.

The Census estimates there were 124.6 million housing units in the country in 2015, so the big companies in total brought broadband to an additional 2.5% of the total market. That sure does not sound like a year in which broadband has declined as suggested by Pew. And Leichtman has shown total market growth for the last several years as well.

In this case you have to believe the Leichtman numbers. They gather total subscriber numbers from all of the large carriers – cable companies and telcos. Since almost all of these companies are publicly traded, and since Wall Street keeps a close eye on subscribers, one has to think that the Leichtman numbers are pretty accurate.

On the other hand, the Pew numbers come from nationwide surveys. Pew did three surveys in 2015 with a total of 6,687 adult respondents. The 2013 numbers they are comparing to were based on surveys of 6,010 adults.

I have always been suspicious of nationwide surveys. Our firm gives surveys and I have found that local surveys can be very accurate and the results can often be correlated with externally collected facts. For instance, I’ve had clients do surveys to find out how many customers their competition has in a market, and these surveys often prove themselves to be valid by also accurately showing the market penetration of my clients. That makes it easy to believe that the numbers for the other competitors in the market are also accurate.

I know that Pew is very careful about how it randomly chooses survey subjects. For instance, they will call people with cellphones as well as those with landline telephones. If you crunch through the statistical formulas that describe the predicted accuracy of a nationwide survey, the Pew surveys should be very accurate.
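For the curious, the standard margin-of-error arithmetic works out as follows (my calculation from the sample sizes above, not Pew’s published figure):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=6,687 (2015 surveys): +/- {margin_of_error(6687):.1%}")  # ~1.2%
print(f"n=6,010 (2013 surveys): +/- {margin_of_error(6010):.1%}")  # ~1.3%
```

On sample size alone, a drop from 70% to 67% is larger than those error bars, which is part of what makes the conflict with the Leichtman subscriber counts so hard to explain.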

The Leichtman numbers are not a 100% count of broadband customers and only count the customers of the biggest broadband providers – but those providers are something like 95% of the whole market. I know enough about a lot of the companies in the rest of the market, the smaller carriers, to know that many of them are still seeing healthy broadband customer growth.

I have no way to explain this difference and I suspect that Pew can’t either. Their survey should be pretty accurate. Yet sometimes nationwide surveys just don’t give accurate results. This can often be seen with elections, where different surveys given at almost the same time show fairly disparate predictions. The trouble is that surveys from groups like Pew influence decision makers, and there are now going to be those who think that broadband growth has topped out. I was just on a call last week where somebody mentioned the Pew numbers. And while the Pew count of total broadband users might not be accurate, one can still believe that their observation that some people are finding broadband increasingly expensive probably is valid. The problem is, you just can’t really know how many people that might be.

The Downside to Cloud Services

I have a client who has been buying a cloud service for about two years and has a number of reasons to be unhappy with the service. I’m not going to name the specific service because there aren’t many vendors in this particular space. But the issues my client is experiencing look to be common with a lot of cloud services.

My client buys this service to resell along with other services to his own customers. His number one complaint is that he never knows what service he is going to wake up to each day. The cloud software will be working great one day and then the vendor will implement a software change and all of a sudden he gets calls from his end-user customers that things have gone wrong. And inevitably the problem turns out to be something that the cloud service vendor has introduced into the service as a supposed upgrade or improvement.

This is not a new phenomenon, and anybody who purchased a voice switch, a cable TV headend, or a complex billing system can remember back to the day when this same sort of thing happened all of the time. Carriers would shudder each time they got a software update from a vendor because it often caused more harm than good.

And the industry learned how to deal with this problem. First, carriers started to insist that vendors build test labs and try out new software in the lab first rather than foisting it onto end users. Second, they insisted that software vendors issue updates in discrete releases so that each carrier could decide whether or not to install a new update.

I remember a time in the late 90s when CCG routinely recommended that our clients not install software updates. There were so many problems with new software releases that we found that it was better to let the update hit the world and to let other carriers debug each new software release. My clients would purposefully fall numerous software upgrades behind, but as long as they weren’t experiencing end-user issues they were happy.

But now my clients are starting to buy services in the cloud, and in doing so we have gone back to the 90s all over again. The biggest problem with most cloud vendors is that they only run one version of their software – the latest. The vendor will update the cloud software and every one of their customers will have the same version. This certainly makes life easier on the cloud vendor.

But unless the vendor has amazing software coders who never make mistakes (and that is never going to happen), the vendor can release an update that has dreadful bugs in it, and the test lab for those bugs becomes the end-user customers of the carriers. A carrier might not even realize there was a software update until they get complaints from their own customers. But now the situation is much worse than in the old days, because the most common fix used to be to reinstall an earlier version of the software that was known to work right – an option that doesn’t exist when the vendor runs only one version of the software for everybody.

I guess that cloud service providers need to learn the same lessons that the other vendors in the carrier industry learned a few decades ago. Just because software is in the cloud doesn’t change good software practices – in fact it makes them even more important. A software vendor that uses end users as its software testing lab is going to get a horrible reputation and in the long run is not going to keep customers in the carrier space.

And so I hope that software vendors would implement the same kinds of changes that the industry forced vendors to implement decades ago:

  • It should be the carrier’s choice whether to accept a software upgrade. Updates should never be automatic. This means that the cloud vendor needs to keep multiple versions of their software available online (see the sketch after this list).
  • Software vendors need to maintain a test lab of some kind. Most software ultimately controls hardware and the vendor ought to have a lab on which to make certain that changes in the software do what they are intended to do while not screwing up something else.
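
As a concrete illustration of the first point, here is a minimal sketch of what carrier-controlled version pinning might look like from the client side. The endpoint, header, and field names are all invented for the example – no actual cloud vendor’s API is being described:

```python
import json
import urllib.request

# Hypothetical service endpoint and version header, for illustration only.
SERVICE_URL = "https://api.example-cloud-vendor.com/v2/provision"
PINNED_VERSION = "4.7.2"  # the release this carrier has tested and accepted

def provision(customer_id: str) -> dict:
    """Call the cloud service while explicitly requesting the pinned
    release, so the carrier, not the vendor, decides when to move."""
    request = urllib.request.Request(
        SERVICE_URL,
        data=json.dumps({"customer_id": customer_id}).encode(),
        headers={
            "Content-Type": "application/json",
            # The carrier opts in to upgrades instead of receiving
            # whatever version the vendor deployed overnight.
            "X-Service-Version": PINNED_VERSION,
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```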