The Return of Edge Computing

We just went through a decade in which most industry experts told us that the majority of our computing needs would move to the cloud. But that trend is starting to reverse somewhat, and in many applications we are seeing the return of edge computing. This trend will have big implications for broadband networks.

Traditionally everything we did involved edge computing – the use of local computers and servers. But a number of big companies like Amazon, Microsoft and IBM convinced corporate America that there were huge benefits to cloud computing. Cloud computing then spread to small businesses and homes, and almost every one of us now works in the cloud to some extent. These benefits are real and include such things as:

  • Reduced labor costs from not having to maintain an in-house IT staff.
  • Disaster recovery of data due to storing data at multiple sites.
  • Reduced capital expenditures on computer hardware and software.
  • Increased collaboration due to having a widely dispersed employee base on the same platform.
  • The ability to work from anywhere there is a broadband connection.

But we’ve also seen some downsides to cloud computing:

  • No computer system is immune from outages, and an outage in a cloud network can take an entire company out of service, not just a local branch.
  • A security breach into a cloud network exposes the whole company’s data.
  • Cloud networks are subject to denial-of-service attacks.
  • Loss of local control over software and systems – a conversion to the cloud often means abandoning valuable legacy systems, and functionality from those systems is often lost.
  • Cloud computing is not always as cheap as hoped for.

The recent move away from cloud computing comes from computing applications that need huge amounts of computing power delivered in real time. The most obvious example of this is the smart car. Some of the smart cars under development run as many as 20 servers onboard, making each car a driving data center. There is no hope of ever moving the brains of smart cars or drones to the cloud due to the huge amounts of data that must be passed quickly between a car’s sensors and its computers. Any external connection is bound to have too much latency to make true real-time decisions.
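To put a rough number on that latency concern, here is a back-of-the-envelope sketch. The speeds and round-trip times below are illustrative assumptions, not measurements of any real network:

```python
def distance_during_rtt(speed_mph: float, rtt_ms: float) -> float:
    """Meters a vehicle travels while waiting on one network round trip."""
    speed_m_per_s = speed_mph * 1609.34 / 3600  # mph -> meters per second
    return speed_m_per_s * (rtt_ms / 1000.0)

if __name__ == "__main__":
    # At 70 mph, even a fast 10 ms round trip means the car has moved
    # about a third of a meter before any cloud answer could arrive;
    # a 100 ms round trip means roughly 3 meters of blind travel.
    for rtt in (10, 50, 100):
        print(f"{rtt:>4} ms RTT -> {distance_during_rtt(70, rtt):.2f} m")
```

A car that has already traveled several meters before the "brake" decision comes back is exactly why the decision loop has to stay onboard.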

But smart cars are not the only edge devices that don’t make sense on a cloud network. Some other such applications include:

  • Drones have the same concerns as cars. It’s hard to imagine a broadband network that can be designed to always stay in contact with a flying drone or even a sidewalk delivery drone.
  • Industrial robots. Many new industrial robots need to make decisions in real-time during the manufacturing process. Robots are no longer just being used to assemble things, but are also being used to handle complex tasks like synthesizing chemicals, which requires real-time feedback.
  • Virtual reality. Today’s virtual reality devices need extremely low latencies in order to deliver a coherent image and it’s expected that future generations of VR will use significantly more bandwidth and be even more reliant on real-time communications.
  • Medical devices like MRIs also require low latencies in order to pass huge data files rapidly. As we build artificial intelligence into hospital monitors, the speed requirements for real-time decision making will become even more critical.
  • Electric grids. It turns out that it doesn’t take much of a delay to knock down an electric grid, and so local feedback is needed to make split-second decisions when problems pop up on grids.

We are all familiar with a good analogy for the impact of performing electronic tasks from a distance. Anybody my age remembers when you could pick up a telephone, get instant dialtone, and then hear a quick ring response from the phone at the other end. But as we’ve moved telephone switches farther from customers, it’s no longer unusual to wait seconds for a dialtone, and to wait even more agonizing seconds to hear the ringing start at the other end. Such delays are annoying for a telephone call but deadly for many computing applications.

Finally, one of the drivers toward more edge computing is the desire to cut down on the amount of bandwidth that must be transmitted. Consider a factory where thousands of devices are monitoring specific operations during the manufacturing process. The idea of sending these mountains of data to a distant location for processing seems almost absurd when local servers can handle the data at faster speeds with lower latency. But cloud computing is certainly not going away and is still the best network for many applications. In this factory example it would still make sense to send alarms and other non-standard data to a remote monitoring location even if the processing needed to keep a machine running is done locally.
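A quick sketch of the arithmetic shows why shipping everything offsite is unattractive. The factory below is hypothetical and every number is an assumption chosen only for illustration:

```python
def daily_sensor_volume_gb(sensors: int,
                           bytes_per_sample: int,
                           samples_per_sec: float) -> float:
    """Raw data produced per day, in gigabytes (10**9 bytes)."""
    seconds_per_day = 86_400
    return sensors * bytes_per_sample * samples_per_sec * seconds_per_day / 1e9

if __name__ == "__main__":
    # Assumed: 5,000 monitors, 100-byte readings, 10 readings per second.
    volume = daily_sensor_volume_gb(5_000, 100, 10)
    print(f"{volume:.0f} GB of raw readings per day")  # -> 432 GB
```

Hauling hundreds of gigabytes a day to a distant data center, when only a handful of alarm messages actually matter upstream, is the bandwidth argument for processing at the edge.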


KPMG’s Cloud Survey

Late last year KPMG published the results of a survey on cloud computing. You can see the results here. The survey was given to 500 CEOs, CIOs, and CFOs of large companies with annual revenues of over $100 million.

You might ask why these results matter much to anybody who is smaller than that. I think it matters because in the IT world, what the big companies do moves downhill to the rest of us. As an example, if the large companies, with all of their buying power, move away from enterprise level routers, then the rest of us will be dragged in that same direction as the market for enterprise routers stops evolving and dries up. The large companies collectively have the majority of the buying power in this market.

When cloud computing got started a few years back, the original sales pitch was all about cost savings. Cloud vendors touted that it was far cheaper to use computing resources in large data centers than to own your own computing resources and employ a dedicated staff to operate them. And while cost savings is still part of the reason to move to the cloud, it’s no longer the only reason. The survey found the following reasons given by large companies for using the cloud:

  • Cost savings – 49%
  • Enabling mobile work forces – 42%
  • Improving customer service and partner interfaces – 37%
  • Understanding corporate data better – 35%
  • Accelerating product development – 32%
  • Developing new business lines – 30%
  • Sharing data globally – 28%
  • Faster time to market – 28%

In a similar survey from 2012 the responses were primarily about cost savings. For example, enabling a mobile workforce was then given as a reason by only 12% of respondents. What brought about such a big shift in the way large companies think about the cloud in only a two-year period?

The reason is that the cloud was originally a hardware transition. It let companies stop having to buy and maintain expensive computer systems and a large staff to operate them. Executives were tired of constantly being told that their systems were obsolete (and in our fast changing world they usually were). More importantly, executives were tired of being told that it was too hard to accomplish whatever they most wanted to do and they felt that their IT functions were often holding back their company. Many executives thought of their IT department as a black box which they didn’t understand very well.

In the last few years it has become clear that the cloud is not just a substitute for hardware and staff, but is also a catalyst for changing software. Large corporations have often been locked into huge software systems from companies like Oracle or Microsoft. While these packages did some things very well, there were some functions where they were just adequate, and other functions for which they were downright horrible. But the computer systems and IT staff tended to make everything work with a few integrated software packages rather than support a lot of different programs for various functions.

At the same time that there has been a revolution in network hardware and a shift to the efficiencies of large data centers, a host of new software has come to market that is extremely good at just a few functions. Companies have found that while breaking free of the restrictions of an in-house IT network and staff, they have also been able to break the bundles of the large software packages.

And this can be seen by looking at the claims that the respondents to the survey made about what they have already been able to achieve through the cloud:

  • Improve business performance – 73%
  • Improve the level of service automation – 72%
  • Reduce costs – 70%
  • Better integration of systems – 68%
  • Introduce new features and functions – 68%
  • Enhance interaction with customers and partners – 67%
  • Rapidly deploy new solutions – 67%
  • Replace legacy systems – 66%

Most of these results reflect changes in software as much as they represent just changing computer platforms. This is not to say that a shift to the cloud is seamless. For example, there is a lot of corporate anxiety about the security of their data. But overall, the large corporations are so far very happy with the shift and most plan on transitioning more to the cloud. Smaller companies are going to feel the tug to move to the cloud for the same reasons. It’s likely that you can save money and begin using newer and better software after such a change.

The Battle of the Routers

There are several simultaneous forces tugging at companies like Cisco that make network routers. Cloud providers like Amazon and CloudFlare are successfully luring large businesses to move their IT functions from local routers to large data centers. Meanwhile, other companies like Facebook are pushing small, cheap routers using open source software. But Cisco is fighting back with its push for fog computing, which will place smaller, function-specific routers near the source of data at the edge.

Cloud Computing.

Companies like Amazon and CloudFlare have been very successful at luring companies to move their IT functions into the cloud. It’s incredibly expensive for small and medium companies to maintain an IT staff or outsourced IT consultants, and the cloud is reducing both hardware and people costs. CloudFlare alone announced last year that it was adding 5,000 new business customers per day to its cloud services.

There are several trends that are driving this shift to data centers. First, the cloud companies have been able to emulate with software what formerly took expensive routers at a customer’s location. This means that companies can get the same functions done for a fraction of the cost of doing IT functions in-house. The cloud companies are using simpler, cheaper routers that offer brute computing power and are also becoming more energy efficient. For example, Amazon has designed all of the routers used in its data centers and doesn’t buy boxes from the traditional router manufacturers.

Businesses are also using this shift as an opportunity to unbundle from the traditional large software packages. Businesses historically have signed up for a suite of software from somebody like Microsoft or Oracle and would live with whatever those companies offered. But today there is a mountain of specialty software that outperforms the big software packages for specific functions like sales or accounting. Both the hardware and the new software are easier to use at the big data centers and companies no longer need to have staff or consultants who are Cisco certified to sit between users and the network.

Cheap Servers with Open Source Software.

Not every company wants to use the cloud, and Cisco has new competition for businesses that want to keep local servers. Just during this last week both Facebook and HP announced that they are going to start marketing their cheaper routers to enterprise customers. Like most of the companies today with huge data centers, Facebook has developed its own hardware that is far cheaper than traditional routers. These cheaper routers are brute-force computers stripped of everything extraneous, with all of their functionality defined by free open source software; customers can run any software they want. HP’s new router is an open source Linux-based router from their long-time partner Accton.

Cisco and the other router manufacturers today sell a bundled package of hardware and software and Facebook’s goal is to break the bundle. Traditional routers are not only more expensive than the new generation of equipment, but because of the bundle there is an ongoing ‘maintenance fee’ for keeping the router software current. This fee runs as much as 20% of the cost of the original hardware annually. Companies feel like they are paying for traditional routers over and over again, and to some extent they are.
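A small worked example makes that "paying over and over" point concrete. The prices here are assumed round numbers for illustration, not actual Cisco list prices; the 20% maintenance rate comes from the text above:

```python
def router_lifetime_cost(hardware_price: float,
                         annual_maint_rate: float,
                         years: int) -> float:
    """Purchase price plus recurring maintenance fees over the service life."""
    return hardware_price * (1 + annual_maint_rate * years)

if __name__ == "__main__":
    # A hypothetical $100,000 router kept for five years at a 20% annual
    # maintenance fee costs $200,000 -- the buyer pays for the box twice.
    print(router_lifetime_cost(100_000, 0.20, 5))
```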

These are the same kinds of fees that were common in the telecom industry historically with companies like Nortel and AT&T / Lucent. Those companies made far more money off of maintenance after the sale than they did from the original sales. But when hungry new competitors came along with a cheaper pricing model, the profits of those two companies collapsed over a few years and brought down the two largest companies in the telecom space.

Fog Computing.

Cisco is fighting back by pushing an idea called fog computing. This means having limited-function routers on the edge of the network to avoid having to ship all data to some remote cloud. The fog computing concept is that most of the data that will be collected by the Internet of Things will not necessarily need to be sent to a central depository for processing.

As an example, a factory might have dozens of industrial robots, along with sensors that constantly monitor them to spot troubles before they happen. The local fog computing routers would process a mountain of data over time, but would only communicate with a central hub when they sense some change in operations. With fog computing the local routers would process data for the one very specific purpose of spotting problems, which would save the factory owner from paying to transmit terabits of data while still getting the advantage of being connected to a cloud.
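The filtering pattern described above can be sketched in a few lines. This is an illustrative toy, not any vendor’s actual API; the sensor names and thresholds are invented:

```python
def fog_filter(readings, low: float, high: float):
    """Yield only the readings that fall outside the normal operating band.

    Everything inside the band is processed and discarded locally;
    only the anomalies are forwarded upstream to the central hub.
    """
    for sensor_id, value in readings:
        if not (low <= value <= high):
            yield (sensor_id, value)

# Five local readings, of which only two are worth sending to the cloud.
stream = [("robot-1", 71.2), ("robot-2", 70.9), ("robot-3", 94.5),
          ("robot-1", 70.8), ("robot-2", 40.1)]
alarms = list(fog_filter(stream, low=60.0, high=80.0))
print(alarms)  # -> [('robot-3', 94.5), ('robot-2', 40.1)]
```

The ratio in this toy (2 messages upstream out of 5 readings) is the whole economic argument: at factory scale the discarded fraction is overwhelmingly larger.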

Fog computing also makes sense for applications that need instantaneous feedback, such as with an electric smart grid. When something starts going wrong in an electric grid, taking action immediately can save cascading failures, and microseconds can make a difference. Fog computing also makes sense for applications where the local device isn’t connected to the cloud 100% of the time, such as with a smart car or a monitor on a locomotive.

Leave it to Cisco to find a whole new application for boxes in a market that is otherwise attacking the boxes it has historically built. Fog computing routers are mostly going to be smaller and cheaper than the historical Cisco products, but there is going to be a need for a whole lot of them when the IoT becomes pervasive.

Beyond a Tipping Point

A few weeks ago I wrote a blog called A Tipping Point for the Telecom Industry that looked at the consequences of the revolution in technology that is sweeping our industry. In that blog I made a number of predictions about the natural consequences of drastically cheaper cloud services, such as the mass migration of IT services to the cloud, massive consolidation of switch and router makers, a shift to software-defined networks, and the consequent explosion in specialized cloud software.

I recently read an interview in Business Insider with Matthew Prince, the founder of CloudFlare. It’s a company that many of you will never have heard of, but which today is carrying 5% of the traffic on the web and growing rapidly. CloudFlare started as a cyber-security service for businesses and its primary product helped companies fend off hacker attacks. But the company has also developed a suite of other cloud services. The combination of services has been so effective that the company says it has recently been adding 5,000 new customers per day and is growing at an annual rate of 450%.

In that interview Prince pointed out two trends that show how quickly the traditional market is changing. The first is that functions traditionally served by hardware from companies like Cisco and HP are moving to the cloud, to companies like Amazon and CloudFlare. The second is that companies are quickly unbundling from traditional software packages.

CloudFlare is directly taking on the router and switching functions that have been served most successfully by Cisco. CloudFlare offers services such as routing and switching, load balancing, security, DDoS mitigation and performance acceleration. But by being cloud-based, the CloudFlare services are less expensive, nimbler and don’t require detailed knowledge of Cisco’s proprietary software. Cisco has had an amazing run in the industry and has had huge earnings for decades. Its model has been based upon performing network functions very well, but at a cost. Cisco sells fairly expensive boxes that then come with even more expensive annual maintenance agreements. Companies also need to hire technicians and engineers with Cisco certifications in order to operate a Cisco network.

But the same trends that are dropping the cost of cloud services exponentially are going to kill Cisco’s business model. It’s now possible for a company like CloudFlare to use brute computing power in data centers to perform the same functions as Cisco. Companies no longer need to buy boxes and only need to pay for the specific network functions they need. And companies no longer need to rely on expensive technicians with a Cisco bias. Companies can also be nimble and can change the network on the fly as needed without having to wait for boxes and having to plan for expensive network cutovers.

This change is a direct result of cheaper computing resources. The relentless exponential improvements in most of the major components of the computer world have resulted in a new world order where centralized computing in the cloud is now significantly cheaper than local computing. I summed it up in my last blog saying that 2014 will be remembered as the year the cloud won. It will take a few years, but a cloud that is cheaper today and that is going to continue to get exponentially cheaper will break the business models for companies like Cisco, HP, Dell and IBM. Where there were hundreds of companies making routers and other network components there will soon be only a few companies – those that are the preferred vendors of the companies that control the cloud.

The reverse is happening with software. For the last few decades large corporations have largely used giant software packages from SAP, Oracle and Microsoft. These huge packages integrated all of the software functions of a business – database, CRM, accounting, sales and operations. But these software packages were incredibly expensive. They were proprietary and cumbersome to learn. And they never exactly fit what a company wanted; it was typical for the company to bend to meet the limitations of the software instead of changing the software to fit the company.

But this is rapidly changing because the world is being flooded by a generation of new software that handles individual functions better than the big packages did. There are now dozens of different collaboration platforms available. There are numerous packages for the sales and CRM functions. There are specialized packages for accounting, human resources and operations.

All of these new software packages are made for the cloud. This makes them cheaper to use and for the most part easier to learn and more intuitive to use. They are readily customizable by each company to fit their culture and needs. For the most part the new world of software is built from the user interface backwards, meaning that the user interface is made as easy and intuitive as possible. The older platforms were built with centralized functions in mind first and ended up with a lot of training required for users.

All of this means that over the next decade we are going to see a huge shift in the corporate landscape. We are going to see a handful of cloud providers performing all of the network functions instead of hundreds of box makers. And in place of a few huge software companies we are going to see thousands of specialized software companies selling into niche markets and giving companies cheaper and better software solutions.

2014 Cyber Threats

Georgia Tech just released its annual Emerging Cyber Threats Report for 2014. It has been publishing these reports for several years, looking ahead to security issues with data and devices connected to the Internet. As usual, the report summarizes a number of threats that companies should be aware of.

Companies Assume the Cloud is Safer than it is. Most companies store their data in the cloud in exactly the same format as it would be stored on a local LAN. This means there is no additional security other than whatever is provided by the cloud provider.

While companies can add additional encryption to cloud-stored data, there is a trade-off between encryption and the accessibility of data by employees, so few firms add the additional encryption.
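The pattern at issue here – encrypting data before it ever reaches the cloud provider – can be sketched as below. This XOR-keystream toy is for illustration only and is not secure cryptography; a real deployment should use a vetted encryption library. The key name and file contents are invented:

```python
import hashlib

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data against a SHA-256 keystream.

    XOR is its own inverse, so calling this twice with the same key
    recovers the original bytes.  NOT suitable for production use.
    """
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

if __name__ == "__main__":
    cleartext = b"quarterly-results row data"          # hypothetical record
    blob = toy_encrypt(b"company-key", cleartext)      # what the cloud stores
    assert toy_encrypt(b"company-key", blob) == cleartext
```

The trade-off the report describes is visible even in this toy: the provider holds only the opaque blob, so provider-side features like search and sharing stop working unless employees hold and manage the key themselves.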

Unencrypted data can be compromised, as has been seen in some of the Chinese attacks on companies like Google. But aside from national cyberwar threats, data in the cloud can be hacked in much easier ways, including the next threat:

Employees are Accessing Corporate Data with Bring-Your-Own Devices. Many companies are allowing BYOD since it saves them the cost of buying every employee a smartphone and tablet, and it also lets each employee use devices they are comfortable with, meaning a lot less training.

BYO devices create an easy path for hacking into corporate data. For example, somebody hacking, or just coming into possession of, an employee’s phone might have wide-open access to corporate data.

Very Little Security for the Internet of Things. Today we are already starting to see the proliferation of devices that connect wirelessly to networks. This first generation of devices has not paid a lot of attention to security. I am not sure that I care that much if my coffee maker or smoke alarm or sprinkler system is not encrypted. It’s unlikely that anybody would take the time to hack them, and if they did, all I might get are some really wet fruit trees.

But the Internet of Things is advancing faster in areas of business automation than it is in the home. The Internet of Things in an industrial setting already includes things like security cameras, devices that sense the presence of various chemicals, thermostats and the equivalent timing devices used during the manufacturing process. And soon the Internet of Things is going to include medical devices and other things that none of us want to see hacked.

And I certainly care if somebody hacks into a heat sensor or water control valve at a nuclear reactor site or hacks into the manufacturing process at an oil refinery.

Mobile Devices Will Become the Focus of Hackers. Until now there has not been a lot of successful malware used against smart phones and other mobile-connected devices. However, these devices are no less susceptible to hacking than are PCs and network servers.

Georgia Tech sees an uptick in attempts to hack into cell phones in various ways. Obviously there will be malware distributed in the same manner as computer spam. But more insidious is the idea of hacking directly into apps so that millions of users download malware with a normal update of a popular application.

And of course, as mentioned above, hacking into cell phones is a lot scarier when those phones have access to work and government networks.

Expect Cyber Attacks Meant to Ruin Corporate Reputations. One thing that has been seen with attacks by foreign governments is that these attacks aren’t always aimed at government sites, but instead at the biggest and most popular companies in the country. The goal is to breach the data and security at big US companies in order to make the general population lose trust in using them. So we have seen attacks leveled at US banks and big companies like Google and Facebook.

Do the Cloud Guys Get It?

I just read an article this week that cites five reasons why cloud computing isn’t taking off as fast as the companies selling the solution were hoping for. The reasons unfortunately make me feel like the cloud industry folks are out of touch with the real world. This is not an uncommon phenomenon in that high-tech industries are run by innovators. Innovators often don’t understand why the rest of the world doesn’t see things with the same clarity as they do.

Following are the five reasons cited in the article about why cloud computing is not selling as fast as hoped, with my observations after each point.

The Organization. Organizations often are structured in a way that does not make the kind of shift to cloud easy. For instance, IT shops are often organized into separate groups for compute, network and storage.

Changes that affect people are never easy for companies. Going to the cloud is supposed to save a lot of labor costs for larger companies, but that is not necessarily the case for smaller companies.  But even larger companies are going to take a while to make sure they are not walking off a cliff. Every old-timer like me remembers a few examples of where major technology conversions went poorly, and nobody wants to be the one blamed if a big conversion goes wrong.

Security. Companies are afraid that the cloud is not going to be as safe as keeping all of their data in-house.

Everything I have read says that if done right that the cloud can be very secure. However, the fear is that not every conversion is going to be done right. You can place your bets with me now, but sometime in the next year or two there is going to be a major ugly headline about a company that converted to the cloud poorly which led to a major breach of customer records. The problem is that everybody is human and not every cloud company is going to do every conversion perfectly.

Legacy Applications. Cloud companies want you to get rid of legacy systems and upgrade to applications made for the cloud.

This is where cloud companies just don’t get it. First, almost every company uses a few legacy systems that are not upgradable and for which there is no cloud equivalent. Every industry has some quirky homegrown programs and applications that are important for their core business. When you tell a company to kill every legacy application most of them are going to rightfully be scared this is going to create more problems than it solves.

Second, nobody wants to be automatically upgraded to the latest and greatest software. It’s a company nightmare to come in on a Monday and find out that the cloud provider has upgraded everybody to some new version of Microsoft Office that is full of bugs, that everybody hates, and that brings productivity to a halt. Companies keep legacy systems because they work. I recently wrote about the huge number of computers still running on Windows XP. That is how the real world works.

Legacy Processes. In addition to legacy software, companies have many legacy processes that they don’t want to change.

Honestly this is arrogant. Companies buy software to make what they do easier. To think that you need to change all of your processes to match the software is really amazingly out of touch with what most companies are looking for. Where a cloud salesman sees ‘legacy system’ most companies see something that works well and that they took years to get the way they want it.

Regulatory Compliance. Companies are worried that the cloud is going to violate regulatory requirements. This is especially true for industries such as financial, health and the power industries.  

This is obviously a case-by-case issue, but if you are in one of the heavily regulated industries then this has to be a significant concern.

I hope this doesn’t make me sound anti-cloud, because I am not. But I completely understand why many companies are going to take their time considering this kind of huge change. Nobody selling a product should fail to take their customers’ concerns into consideration. When I see articles like this I feel annoyed, because the gist of the article is, “Why won’t these dumb customers see that what I have is good for them?” That is never a good way to get people to buy what you are selling.

The State of Cloud Services

My clients ask me all of the time how they can make money at cloud services. The fact is, for small carriers there are a few opportunities, but the industry still has a way to go before cloud services are a prime-time revenue opportunity for them.

I say this because my average client only has a handful of business customers who can really benefit from using cloud services, so the small volume they might be able to sell does not look like a profitable product line.

What is available today?

First, there is a very robust market in providing data storage and backup. But there is only money to be made in this from business customers, because residential customers can get mountains of free web storage if they look around. It’s possible for a residential customer to easily store a terabyte or more of data for free.

But businesses don’t want to, and probably should not, use cheap or free web storage. There are already horror stories of web storage services that have shut down and left people without access to the data they stored in the cloud. So a business needs to store its data where it knows it will always have access to it. This probably means storing it with a vendor that has multiple data centers, so that there is a duplicate backup copy of everything to guard against natural disaster.

And it’s not hard for a small carrier to get into this business itself and store some data in its own central office. If customers want a second backup copy, there are a number of reputable data centers around the country that are owned by other small carriers and seem pretty secure and safe. There is even a little money to be made as the middleman, sending your customers’ data on to somebody else.

The other thing that is widely available today as a cloud service is IP Centrex. There are a number of national companies that will sell this service to anybody that has a fast enough data connection.

But one of the catches to this service is that these nationwide sellers do not offer phone numbers everywhere. This means that when they go to sell in rural areas, they probably do not have the ability to do number porting to let customers keep their local numbers. This is a big deal for businesses. We have always assumed that the nationwide sellers work through some other nationwide CLEC to terminate traffic, and those CLECs, like a Level3, have gained the ability to do number portability in RBOC areas, but for the most part they do not have those agreements in place for rural areas.

But number portability aside, it is possible for anybody to resell the IP Centrex services. If you are competing in a neighboring larger town you could offer these services in the cloud as a reseller of one of the nationwide carriers. The margins are not nearly as good as if you offered this on your own switch, but they are okay.

Finally, the real promise of cloud services is offering the software a business uses everywhere, on any device. This ability is a huge advantage for large companies, and many of them have migrated their software to the cloud. But they have done so with a lot of effort. Most companies operate a unique set of programs. While a business may use standard nationwide software like Microsoft Office or QuickBooks, most companies also run a number of unique and homegrown programs. The real challenge for a company that wants to take its software to the cloud is not the big-name software, because most of that now has the option of cloud licenses. The issue is moving all of the home-grown and one-off software a company uses. As I mentioned yesterday, many companies still operate some PCs with Windows XP. Moving older legacy systems to the cloud is not as automatic as you would hope, and it takes some effort and trial and error to get some things to work in a cloud environment.

And there is no profitable product out there yet for the small carrier who wants to offer cloud software to customers. There are bits and pieces, but no easy platform that just lets you sweep your business customers into the cloud. This is probably coming, but it is not here yet.

So in summary, other than data storage and IP Centrex, there are not a lot of viable ways for a small provider to make money yet on cloud services. But I think the day when you can is fast approaching. There are bits and pieces already available for offering cloud-based software, and the options are growing all of the time.

Upsell Your Customers – What to Sell

One of the best strategies you can undertake to improve bottom line performance is to increase your average revenue per existing customer by getting those customers to buy more of the services that you already offer. These are customers who already know you and trust you and send you a monthly check, so there is no target market that has a higher potential for successful marketing.

Many of my clients have been very happy to sell basic packages to customers for years. But as I have discussed in other blog posts, the traditional products that many carriers sell are becoming commodities and now have market alternatives available. Households have been dropping voice lines for a decade and are starting to drop cable connections. Many of my clients are seeing significant customer losses in their traditional products and things like long distance have withered away. These same clients have a number of products and services available to them that they are not selling. If they are going to stay profitable and remain relevant to their customers for the coming decades they are going to have to find new products to replace the ones they are losing.

If you want to undertake an upsell program you need goals. Do the math, but most of my clients would be very happy if they could increase margins per existing customer by a few dollars a year. So set a specific goal each year and then develop a plan to get there. I will have some future blogs discussing the best ways to upsell, and in this first blog on the topic I will look at the products you can sell as part of this process.
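To make the goal-setting math concrete, here is a quick back-of-the-envelope sketch in Python. Every figure in it (the subscriber count, the per-customer goal, the product margin) is a hypothetical assumption for illustration, not data from any client:

```python
# Hypothetical upsell goal math; all figures below are illustrative assumptions.
subscribers = 5000           # assumed existing customer base
goal_per_customer = 3.00     # target added margin per customer per year ($)
product_margin = 4.00        # assumed monthly margin on one upsold product ($)

annual_goal = subscribers * goal_per_customer
takers_needed = annual_goal / (product_margin * 12)

print(f"Annual margin goal: ${annual_goal:,.0f}")
print(f"Customers who must take one such product: {takers_needed:.0f}")
```

At these assumed numbers, getting barely 6% of a 5,000-customer base to take a single $4-per-month product meets a $3-per-customer annual goal, which is why even a modest, well-run upsell program can hit its target.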

So, what are some of the products you can be selling today? The following is just a partial list that is intended to show you some of the possibilities. I have clients successfully selling all of these products:

Voice. Today, anybody with a softswitch has a score of communications tools that hardly anybody is selling. This includes such things as:

  • Unified Messaging. Almost everybody has this available on their switch, and yet hardly anybody sells it. It allows customers to move communications seamlessly across all of their devices, and once customers see how this works many want it. We are no longer talking just about toggling between a cell phone and a home phone, but also tablets, laptops and any other device capable of receiving an IP stream.
  • IP Centrex. Again, anybody with a softswitch can probably offer this service, and if not you can partner with somebody who offers it. This is becoming the new standard product for businesses and many home businesses will also be interested because it can allow them to act like a larger company.
  • Cheap Second Lines. A second line today can be little more than a phone number if you deliver the service over IP. So sell $5 or $10 second lines for teens or home businesses.
  • Other Advanced Features. Softswitches come with dozens of features that almost nobody sells. These include features like seamlessly integrating email and voice mail, integrating voice with computers, and advanced call screening and control. I have a few clients who have figured out how to sell these features, and they are almost 100% margin if you have already bought them with an existing switch.

Wireless. As long as there is good cell phone coverage in your area, you can now be in the cell phone business through an MVNO program where you resell somebody else’s wireless minutes. This is very different from resale in the past, where you resold a large carrier’s products at thin margins. With an MVNO you can repackage minutes into your own products, and if you match this up with household WiFi you can have very good margins.

Cable TV. On the cable TV side of the product line:

  • OTT Access. Add over-the-top programming to your channel line-up. Rather than risk losing customers to OTT, let them easily get OTT programming directly on your video line-up without needing to buy a Roku or Apple TV box. There are numerous vendors who have created channel line-ups for OTT programming.
  • Cable Portability. Enable your customers to watch the TV programming you sell them on portable devices around the home like computers, cell phones and tablets. If you buy programming from the NCTC coop this is now becoming available.
  • DVR Services. Provide whole-house DVRs, or even better offer centralized DVR where you do the recording on servers at your hub. Centralized DVR greatly reduces the bandwidth you have to send to customers while allowing them to easily record multiple shows at the same time. Centralized DVR also means you don’t have to invest in expensive set-top boxes.

Security. Many of my clients are doing well with security products:

  • Cameras. The simplest product is to sell and install security cameras and then set customers up to monitor them themselves from any Internet-connected device.
  • Safety Monitoring. Sell, set up and monitor safety detectors for things like fire, radon and carbon monoxide.
  • Burglar Alarms. I have many clients selling ‘traditional’ burglar alarms. This is now easier than ever to do since there are a number of vendors who offer the police monitoring and as a carrier you supply the equipment and get a monthly line rental.
  • Advanced Security. Many business customers will be interested in advanced security systems that can monitor all sorts of things in addition to traditional security.

Cloud Service. Everybody is talking about things moving to the cloud, but very few smaller carriers are marketing any cloud services yet. This is an area where a small carrier is going to have to break the mindset that you must own and control the back-office system behind the product. Instead, you need to find partners who offer cloud services and then repackage them for your customers. This will not be a static business, since these products are going to change a lot over the next decade. But you can’t wait for this market to ‘stabilize’ because it may never do that. So you should start looking for cloud partners today. Some of these services include:

  • Data Backup and Storage. While there is free back-up available on the web, many customers still prefer the safety of backing up for a fee, and there are many for-pay back-up services. We are seeing that many people would prefer to back up their data with somebody local rather than into the ‘cloud’.
  • Centralized Software. A lot of software like Windows, Microsoft Office and other popular products is now available at the cloud level, saving customers from having to keep buying it for every machine they want to operate.
  • Medical Monitoring. This will eventually be a huge business and most people will elect to get monitored. It’s just starting, but worth getting into early.
  • PC Replacement. Let customers use your storage in place of their hard drives, meaning they can get to their data from any device capable of using the software.

Home Automation. I have several clients who are successfully selling and installing home automation systems. These systems are commercially available, but only really geeky customers feel comfortable making this work on their own. So the product is selling / leasing the systems, making it work, and continuing to integrate future customer devices into the systems.

Geek Squad. I have a number of clients, particularly in rural markets, that are doing well offering the same sorts of services that the Geek Squad sells. They will go into customers’ homes and help customers manage their computers, TVs, energy management, and anything else that is electronics-based. All of this is sold on an hourly or an insurance-type basis.

Give the Customers What They Want

I have a friend Danny who is a CPA, and he is doing something that I think is brilliant. He has taken over the accounting practice from his 72-year-old father, and he also has a number of other older accountants who help him during tax season. (And I don’t use the term “older” accountant lightly, being one of them myself.)

For several years he has tried to push the older accountants into learning new tax and accounting software, and they have resisted vehemently. Their arguments are that they have multiple years of their clients’ tax returns in older legacy programs, and that they are simply not interested in learning yet another new program. In fact, his father told him that if he was forced to learn a new system he would just stop helping him. And the clients all love his father.

And so my friend Danny did a brilliant thing. He went out and set up his own private cloud network. He put into the cloud all of the new software that he and most of the staff use, but he also moved into the cloud the various older legacy software that the older accountants wanted to keep using. And he chose a cloud so that anybody could work with any of the software packages from anywhere.

He would have preferred to do this with an existing cloud computing service, but none of them were interested in helping him set up the legacy software, some so old that they are DOS systems. There are a number of cloud services that support new accounting software. In fact, one of the major selling points of most of the cloud service providers is that a customer will never again have to worry about having software that is out of date and the cloud providers tout how they will introduce every update from the software provider when it becomes available.


And the cloud providers are completely missing the point. Real-life people don’t want software that is always up to date. My worst nightmare is to log onto a cloud server, with a project deadline looming, and find out that the program I use every day has changed and that I will have to spend hours figuring out the differences. People don’t mind upgrading software over time, and we have all migrated through the many versions of Microsoft Office. But people are creatures of habit, and our relationship with software has become almost intimate. Danny’s father is a perfect example. He won’t use anything newer than Office 2007. And this is his right – he paid for it and it still works. Upgrading software you use every day can be unnerving at best and traumatic at worst, and it is always a bit disruptive.

And so the cloud providers have some big lessons to learn if they really want to be successful with the average customer. The cloud providers have chosen to stress the benefits of always having the most recent version of software. And from an operational perspective this makes sense for them. They only have to maintain one version of the software which makes it easier on them in a number of ways. But this doesn’t make sense from the perspective of what their customers want.

In the telecom business we have a long history of offering a handful of standard products to businesses. And from the perspective of the telcos this makes sense for the same reasons that the cloud providers want to push one version of software – it’s easier on the telco in terms of staff training, operations and billing. Selling standard products is what Ma Bell did for a century.

I would argue that selling only ‘standard’ products is not in the long-term best interest of a telco. If your company only sells standard products then you have turned those products into a commodity. In a competitive world, customers have no reason to be loyal to you if they can get that same commodity from somebody else for less. But if you are willing to listen to your customers and give them a custom product that they want, then you have created a loyal customer who is likely to stay with you for a long time. I don’t think most telecom providers add in the cost of churn when looking at profit margins. It is worth spending more up front to get a customer who will stay with you than to sell standard products to customers who will always be price shopping.

So, You Want to Get Into the Data Center Business?

Today’s guest blogger is Mike Fox. He was one of the founders of CCG and we still work together on a number of projects. He is working today for Fox Management Advisors.

Cloud computing, business disaster recovery and continuity, off-site data storage, co-location, managed services, mirror site operations, data warehousing … what does it all mean, and, more importantly, how can you get some of this business?

On the surface, data centers are closely related to telco operations – high capital costs, spiky investment, technology driven, and (hopefully) good long-term investments.  Furthermore, telephone companies are natural partners (or owners) of data center operations due to the requirement for robust (and redundant) bandwidth connectivity (preferably fiber based).  Site security is also a critical factor; something telcos are well acquainted with and can naturally support.  However, given the fast-moving nature of both businesses (telecommunications and data centers), there are several key aspects that must be considered.

Location, Location, Location

As with real estate, one of the most critical factors when considering whether or not to invest in or build a data center is location.  In addition to other issues, location impacts costs (e.g., power costs – which are one of the primary cost elements), vulnerability to natural and man-made disasters, access to qualified technical personnel and your sales opportunities.  Some of the key location-based factors include, but are not limited to:

  • Cost and availability of real estate – is there sufficient open space for expansion?
  • “Green” attributes – availability of green power and other low sulfur emission power sources
  • Cheap and abundant power from multiple sources
  • Climate – e.g., climate can affect HVAC cooling costs and design
  • Available education resources – colleges, universities and technical training
  • Access to redundant sources of broadband facilities – especially fiber optics
  • Local/State income tax rates – including any ‘incentives’ available for tech-based companies

Locating a data center near or in conjunction with a telephone company can be advantageous from the perspective of securing an anchor client (telcos house and store a lot of data), proximity to superior broadband connectivity, and access to technical expertise (e.g., it is not too far of a leap for telco IP technicians to be trained to be able to handle many of the technical needs of a data center).

Access to sufficiently trained technical personnel is very important.  While telco technicians have many of the same skill sets necessary to meet the needs of data center operations, it is likely that you will need more highly trained and certified employees than are necessary for traditional telco operations.  Therefore, locating close to colleges, universities and technical training centers, while not a requirement, can be a great advantage.  That said, I am personally aware of several data center operations that are several hundred miles from such educational facilities.  These operations were able to attract very qualified people due to their unique location – e.g., sometimes people would prefer to NOT live in the big city!

Not being located in the ‘big city’ is also positive from the perspective of site vulnerability.  Assuming you have sufficient broadband and power availability (again, redundant feeds, if possible), then locating in rural areas is very attractive for companies wishing to house their data in secure locations with very low risk profiles from both man-made (e.g., terrorism) and natural (e.g., hurricanes) disasters.  Coincidentally, most rural telcos are NOT in high risk areas; particularly with respect to terrorism and other man-made disasters.

The cost of power is also very location dependent.  In many rural states (e.g., Wyoming, where I reside), the cost of power is below the national average, often way below.  For example, the cost of power in Wyoming is consistently below $0.04/kWh (again, depending upon the exact location), which is less than half the national average of over $0.11/kWh.  Furthermore, availability of power is a location-based factor – is your location subject to issues such as rolling brownouts, which are common in some of our country’s more populated locations?
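The rate difference compounds quickly at data-center scale. A minimal sketch of the annual comparison, assuming a hypothetical 500 kW average facility load (the load figure is my own assumption; the per-kWh rates are the ones cited above):

```python
# Annual power-cost comparison at two rates; the 500 kW load is an assumed figure.
load_kw = 500          # hypothetical average facility load (IT gear plus cooling)
hours_per_year = 8760  # 24 hours x 365 days

for label, rate_per_kwh in [("Wyoming, ~$0.04/kWh", 0.04),
                            ("National average, ~$0.11/kWh", 0.11)]:
    annual_cost = load_kw * hours_per_year * rate_per_kwh
    print(f"{label}: ${annual_cost:,.0f} per year")
```

At these assumed numbers the rate difference alone is worth roughly $300,000 per year for a single modest facility, which is why power cost sits at the top of the site-selection list.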

Know Your Business; Know Your Market

Like many technology-based businesses, the data center business is rapidly evolving.  What might have been a great business model a year ago, may have no legs today.  Rates, services, packages, bundles and even target customers are changing daily.  However, there is no doubt that in one form or another data storage, remote site availability, business disaster recovery and ‘cloud computing’ (the meaning of which is also evolving daily) will be viable products for years to come.  The key is to know where to start and how to focus effectively to meet your customers’ needs for many years to come.  It’s not necessarily ‘rocket science’ but it’s also not child’s play.  As such, we are prepared to assist with the evaluation of these and many other similar opportunities and, if it looks positive, help you launch or expand existing data center operations.