The Future of Rural Broadband

Several events this week point to the future of rural broadband for rural subscribers, and it is a bleak picture.

First, at a Goldman Sachs conference on Tuesday, AT&T CEO Randall Stephenson said that he hoped the new FCC chairman, Tom Wheeler, would be receptive to AT&T's desire to begin retiring its copper network in favor of its wireless network. At the end of last year AT&T said in an FCC filing that it would seek to retire the copper plant serving 'millions of subscribers'.

In that filing AT&T asked to move from the copper network to an all-wireless, all-IP network. Stephenson said that the cost savings from retiring the copper network would be dramatic.

On the same day, Verizon CEO Lowell McAdam said that offering unlimited data plans to wireless customers was not sustainable and defied the laws of physics. Earlier this year Verizon ended all of its unlimited wireless data plans and now has caps on every plan.

Verizon already has a rural wireless-based landline surrogate product that it calls VzW. It uses the 4G network to deliver a landline phone and data anywhere Verizon doesn't have landline coverage. The base plan is $60 per month and includes voice and 10 gigabytes of data; every extra gigabyte costs $10. There is also a $90 plan that includes 20 gigabytes and a $120 plan with 30 gigabytes.
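
To put those tiers in perspective, here is a minimal sketch that computes the cheapest monthly bill for a given usage level under the plans just described. The tier prices come from this post; the assumption that overage is billed per whole gigabyte, and the sample usage figures, are mine:

```python
import math

def monthly_bill(gb_used):
    """Cheapest monthly bill under the capped plans described above:
    $60/10 GB, $90/20 GB, $120/30 GB, with every extra gigabyte
    billed at $10 (assumed here to round up to whole gigabytes)."""
    plans = [(60, 10), (90, 20), (120, 30)]
    costs = []
    for base_price, included_gb in plans:
        overage_gb = max(0, math.ceil(gb_used - included_gb))
        costs.append(base_price + 10 * overage_gb)
    return min(costs)

# A household that routes phones, tablets, and a PC through this
# connection can easily use 30+ GB per month.
for gb in (8, 15, 30, 50):
    print(f"{gb:>3} GB -> ${monthly_bill(gb)}")
# 8 GB -> $60, 15 GB -> $90, 30 GB -> $120, 50 GB -> $320
```

A 50 GB month costs $320 under these tiers, while a rural landline connection, slow as it is, carries unlimited usage for a flat price.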

Finally, at the same Goldman Sachs conference, the CFO of Time Warner said that the company saw more room for increasing data prices.

So what does all of this mean for rural subscribers? First, it means that if you are served by a large incumbent like AT&T, the company is going to be working hard to retire your copper and force you onto wireless. And we all know that wireless data coverage in rural America is not particularly fast, when you can get data at all. The data speeds delivered from a cell tower drop drastically with distance. In urban areas, where towers are a mile or less apart, this doesn't have much practical effect. But in a rural environment towers are many miles apart. People lucky enough to live near a cell tower can probably get okay data speeds, but those farther away will not.

And even if you can get wireless data, your usage is going to be capped. Rural landline data may be slow today, but it is unlimited. Customers have learned that if they put in a WiFi router they can channel all of the data usage from their cell phones and tablets onto their unlimited landline connection. But once the home connection is wireless, every byte of data leaving the home, whether directly from a device or through the WiFi router, will count against the data cap. So rural America can expect a future with data caps while people in urban areas have none.

Finally, one can expect the price of data to keep climbing; I have been predicting this for a decade. The large telcos and cable companies face a future where the old revenue streams of voice and cable TV are starting to decline. The only sustainable product they have is data. So as voice and cable continue to tumble, expect incumbents to get into the habit of raising data prices every year to make up for those declines. Competition won't help, because cellular data is already expensive, and the incumbent cable companies and telcos will be raising data prices together.

This is not a pretty picture for rural subscribers. Customers will be forced from copper to wireless, speeds are not likely to get much faster, data will be capped, and prices will probably rise year after year.

Opportunity Abounds

I am often asked about ideas for building a fiber network that can make money. Right now in this country there is a huge opportunity that almost nobody is taking advantage of. Tens of thousands of miles of middle-mile fiber have been built in the last five years using federal stimulus grants. There are other networks around the country funded by state and other grants. And fiber has also been built to thousands of rural cell phone towers.

These networks are largely rural, and in most cases they have only been used to connect small rural towns, to serve anchor institutions, or to reach cell towers. Look closely and you will see miles and miles of fiber running from county seat to small town to county seat, with a few spurs serving schools, health facilities, junior colleges, city halls and cell towers. But for the most part the fiber has not been used to serve anything else.

The whole stimulus grant program was cooked up quickly and was not a well-planned affair. Awards were made in every state and we ended up with a true hodgepodge of networks. In some cases it looks to me like networks to nowhere were built, but a large percentage of the stimulus routes run through rural areas where there are nice pockets of customers.

For years I have advocated a business plan that builds fiber in short spurs where success is guaranteed. For example, one might build to a single large business whose revenue will pay for the fiber route; these days that anchor is most likely to be a cell tower. Building to that single guaranteed customer can be a successful business plan on its own.

However, any carrier who stops with that one customer is missing the real profit opportunity in such a build. The best business plan I can find today is to build to an anchor tenant and then do everything possible to sign every customer passed on the way to that new tenant. In economic terms you can think of the cost of the fiber build as a sunk cost, and when a business makes a sunk-cost investment the goal is to maximize the revenue that investment generates.

And so, if the anchor tenant can justify the fiber build and pay for the sunk-cost investment, then adding more customers to that same fiber becomes an economic no-brainer. Extra customers can be added for the cost of a drop and a fiber terminal device, and in terms of return, a home or small business added this way might carry a higher margin than the original anchor tenant.
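
A minimal sketch of that logic, with all dollar figures hypothetical: once the anchor tenant has justified the route, each added customer only needs to recover its own incremental cost.

```python
def payback_months(drop_cost, monthly_revenue, monthly_expense):
    """Months for an added customer to recover the incremental cost of
    a drop and fiber terminal, treating the fiber route itself as sunk."""
    margin = monthly_revenue - monthly_expense
    if margin <= 0:
        return None  # the customer never pays back
    return drop_cost / margin

# Hypothetical numbers: $900 for a drop plus terminal device,
# $70/month for data-only service, $15/month incremental cost to serve.
months = payback_months(900, 70, 15)
print(f"Payback in roughly {months:.0f} months")  # roughly 16 months
```

After the payback period, nearly all of that customer's margin is profit against an asset that was already paid for.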

The key to making this business plan work is to keep it simple. You don't need to be in the triple-play business to add residential customers; offering a very high-speed data connection at a bargain price is enough to win a loyal long-term customer with very little effort by the carrier. If you already happen to be in the triple-play business and have all of the back-office support for such customers, then you can consider that as an option, but offering only data is a profitable business.

And so the business plan is to look around you and see where facilities have been built but are underutilized. The key to making this work is to get cheap transport to reach the new pocket of customers. By law, networks built with stimulus grants must give affordable access to anybody willing to build the last mile. And commercial network owners will also make you a good offer for transport if you bring them a new revenue opportunity in a place they didn't expect one. So first work with the network providers, and then look at specific opportunities.

And you possibly don't need much staff, if any, to do this. Somebody is already maintaining the backbone fibers and will probably be willing to support your fiber spurs, and it's quite easy today to outsource the whole ISP function. The only thing really needed is the cash to build fiber spurs and connect customers. The more you have the better you can do, but you could build a respectable little business with only a few hundred thousand dollars.

If you are in a rural area there are probably dozens, maybe hundreds, of these opportunities around you if you look with the right eye. As the header of this blog says, opportunities abound.

Are You Collaborating?

I am not talking about World War II movies and I hope none of you have decided to side with the enemy (whoever that is). Collaboration software is a tool that every business with employees who work at different locations ought to consider.

Collaborative software began several decades ago with Lotus Notes. That software allowed multiple users on the same network to work on the same spreadsheet or word-processing document at the same time. And Lotus Notes had the added feature of letting you link spreadsheets and documents, so that any change made to a spreadsheet would automatically flow into your document. But Lotus Notes required people to be on the same network, which in most companies meant being in the same building, so the concept never became widespread. Plus, Microsoft came along and kicked Lotus's butt in the marketplace.

And so collaborative software mostly died off for a while, although there were a few open-source programs that remained widely used by universities and others who love open-source software.

But collaborative software is back in a number of different variations and if your company has employees at more than one location, then one of these new software products is going to be right for you. Here are some of the features you can find in various collaborative software today:

  • Number one is the ability to let multiple people work on the same document simultaneously. And instead of just spreadsheets and word-processing documents, this has been extended to any software that all the users have rights to use. Most packages also keep a log showing who changed a document and when.
  • Support for multiple devices. Collaborative software is no longer just for PCs; employees using tablets and smartphones can share in many of the features. As an example, collaborative software is a great way to keep sales staff in the field fully engaged with everybody else in the company.
  • Communicate internally. Many collaborative software programs come with chat rooms, instant messaging and text messaging tools that make it fast and easy to communicate with other employees. Why send somebody an email or call them if you only have a quick question they can answer by IM?
  • Presence. Some systems let you know where people are and whether they are available to communicate right now. This stops you from calling people who are not in their office and lets you reach them a faster way.
  • Create a better communications history. In some packages each user gets a home page, much like Facebook, that shows everything they have done, meaning other employees can often find the information they need without bothering that person.
  • This can become the new way to structure corporate data. With a program like SharePoint you can quickly create folders specific to a topic or a project and then give access only to those you want to have access to that data. This used to require the intervention of somebody in the IT department but now can be done by almost anybody.
  • A great tool for working with your largest customers. You can give your contacts at your largest customers limited access to your systems so that they can quickly ask questions or reach the right person by chat or IM. This is a great new way to extend your customer service platform and make it real-time. You can easily isolate outsiders from corporate information while giving them access to the social networking aspects of the software.

So what are some of the collaborative software tools to consider? Here are a few (there are many others):

  • Podio. This software is free for up to five users, which might be a good way to see if you like the concept. After five users it's $9 per employee per month.
  • IBM (Lotus). The Lotus name is not dead; it is now the brand name of IBM's collaborative suite of products.
  • Intuit has a product called QuickBase, a collaborative software suite. One advantage is that it integrates with QuickBooks and other Intuit products you might already be using.
  • SharePoint is Microsoft's collaborative suite of products and has done very well in the large-business sector.

How Vulnerable is the Internet?

A question you hear from time to time is how vulnerable the US Internet backbone is: could we lose access if something happened to the major hubs? The architecture of the Internet has grown out of the ways carriers have chosen to connect to each other, and there has never been a master plan for the backbone infrastructure.

The Internet in this country is basically a series of hubs with spokes. There are a handful of large cities with major regional Internet hubs like Los Angeles, New York, Chicago, Dallas, Atlanta, and Northern Virginia. And then there are a few dozen smaller regional hubs, still in fairly large cities like Minneapolis, Seattle, San Francisco, etc.

Back in 2002 some scientists at Ohio State studied the structure of the Internet and concluded that taking out the major hubs would essentially cripple it. At that time almost all Internet traffic in the country routed through those hubs, and disabling a few of them would have wiped out a large share of the network.

In 2007 scientists at MIT looked at the web again and estimated that taking out the major hubs would wipe out about 70% of US Internet traffic, but that peering would allow about a third of the traffic to keep flowing. And at that time peering was still fairly new.

Since then there is a lot more peering, but one has to ask whether the Internet is any safer from catastrophic outage than it was in 2007. One thing to consider is that a lot of the peering today happens at the major Internet hubs. In those locations carriers hand traffic directly to each other rather than paying fees to send it through an 'Internet port', which is nothing more than a point where some carrier determines the best routing of the traffic for you.

And so peering at the major Internet hubs is a great way to save money, but it doesn't really change the way Internet traffic is routed. My clients are smaller ISPs, and I can tell you how they route Internet traffic. The smaller ones find a carrier who will transport it to one of the major Internet hubs. The larger ones can afford diversity, so they find carriers who can carry the traffic to two different major hubs. But by and large, every bit of traffic from my clients goes to and through the handful of major Internet hubs.

And this makes economic sense. The original hubs grew in importance because they were where the major carriers of the time, companies like MCI and Qwest, already had switching hubs. And because transport is expensive, every regional ISP sent its growing Internet traffic to the big hubs because that was the cheapest solution.

If anything, there might be more traffic routed through the major hubs today than there was in 2007. Every large fiber backbone and transport provider has arranged its transport network to get traffic to these locations.

In every region of the country my clients are completely reliant on these Internet hubs. If a hub like the one in Dallas or Atlanta went down for some reason, ISPs that send traffic to that location would be completely cut off from the world.

There was a recent report in the Washington Post saying that the NSA has agents working at only a handful of major US Internet POPs because that gives them access to most of the Internet traffic in the country. That seems to reinforce how important the major Internet hubs have become.

In theory the Internet is a disaggregated, decentralized system: if traffic can't go the normal way, it finds another path. But that only works if ISPs can get traffic to the Internet in the first place. A disaster that took out one of the major Internet hubs would cut off many of the towns in the surrounding region from any Internet access. A terrorist attack that took out more than one hub would wipe out far more.
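
A toy model makes the fragility concrete. The topology below is invented for illustration, not real backbone data: ISPs home to one or two hubs, the hubs are meshed together, and removing a single hub strands every ISP that reaches the Internet only through it.

```python
from collections import defaultdict

# Invented hub-and-spoke topology: hubs meshed together, ISPs
# homed to one or two hubs.
links = [
    ("Dallas", "Atlanta"), ("Dallas", "Chicago"), ("Atlanta", "Chicago"),
    ("isp_A", "Dallas"),                        # single-homed
    ("isp_B", "Dallas"), ("isp_B", "Atlanta"),  # dual-homed
    ("isp_C", "Atlanta"),                       # single-homed
]

def reachable(links, start, failed=frozenset()):
    """All nodes reachable from `start`, skipping failed nodes."""
    graph = defaultdict(set)
    for a, b in links:
        if a not in failed and b not in failed:
            graph[a].add(b)
            graph[b].add(a)
    seen, frontier = {start}, [start]
    while frontier:
        for neighbor in graph[frontier.pop()]:
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(neighbor)
    return seen

before = reachable(links, "Chicago")
after = reachable(links, "Chicago", failed={"Dallas"})
print(sorted(before - after - {"Dallas"}))  # ['isp_A'] is cut off
```

The dual-homed isp_B stays connected when Dallas fails; the single-homed isp_A, like the smaller ISPs described above, is cut off entirely.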

Unfortunately there is no grand architect behind the Internet looking at these issues, because no one company has any claim to deciding how the Internet works. Instead, the carriers involved have all migrated to the handful of locations where it is most economical to interconnect with each other. I sure hope, at least, that somebody has figured out how to make those hub locations as safe as possible.

Time for a New Spectrum Plan

The spectrum in this country is a mess. That is not entirely a complaint against the FCC, because much of the mess was not foreseeable. But the FCC has contributed at least some of it, and if we are going to march into the future we need to start from scratch and come up with a new plan.

Why is this needed? Because of the sheer volume of devices and uses we see coming for wireless spectrum. The spectrum the wireless carriers use today is already inadequate for the data they are selling to customers; the cellular companies only get by because a large percentage of wireless data is handed off to WiFi. But what happens when WiFi gets too busy, or there are simply too many devices?

As of early 2013 there were over half a billion Internet-connected devices in the US. This is something ISPs can count, so the number is fairly accurate. And the number of connected devices is growing quickly. We are not device nuts in my house and our usage is pretty normal, yet we have a PC, a laptop, a tablet, an e-reader and two cell phones connected wirelessly. And I am contemplating adding the TV and a new burglar alarm system, which would easily double our device count overnight.

A huge number of devices count on WiFi working adequately to handle everything that is needed. But we are headed for a time when WiFi will be higher powered and capable of carrying far more data, and with that comes the risk that the WiFi bands will become saturated in urban and suburban environments. If every home has a gigabit router running full blast, a lot of the bandwidth will be lost to interference.

What everybody seems to forget, and what we have already seen with other public spectrum, is that every frequency band has physical limits. Our giant conversion to the Internet of Things will come to a screeching halt if we ask more of the existing spectrum than it can physically deliver.

So let's jump back to the FCC and the way it has handled spectrum. Nobody saw the wireless data boom coming two decades ago; three decades ago the smartest experts in the country were still predicting that cell phones would be a market failure. But for the last decade we have known what was coming, and the use of wireless devices is growing faster than anybody expected, due in part to the success of smartphones. And we are on the edge of an Internet of Things that will need bandwidth that makes today's cell phone data usage look tiny.

One thing the FCC has done that hurts the way we use data is to chop almost every usable band into a number of small channels. There is an advantage to this: different users can grab different discrete channels without interfering with each other. But the downside is that a small channel can't carry much data. So one thing we need is usable spectrum with broader channels.
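
Channel width matters because the theoretical capacity of a channel scales linearly with its bandwidth, per Shannon's formula C = B * log2(1 + SNR). A quick illustration (the signal-to-noise figure is hypothetical):

```python
import math

def capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon capacity C = B * log2(1 + SNR), in Mbps for B in MHz."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Same 20 dB signal-to-noise ratio, progressively wider channels.
for width_mhz in (5, 20, 80, 160):
    print(f"{width_mhz:>4} MHz -> {capacity_mbps(width_mhz, 20):,.0f} Mbps")
# 5 MHz -> 33, 20 MHz -> 133, 80 MHz -> 533, 160 MHz -> 1,065 Mbps
```

Real radios deliver well below this theoretical ceiling, but the proportionality holds: double the channel width and you roughly double what it can carry.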

The other way out of the spectrum pinch is to reallocate more spectrum to wireless data and then let devices roam over a wide range of it. With software-defined radios we now have chips capable of using a wide variety of spectrum and changing bands on the fly. So a smart way to move into the future is to widen the spectrum available to our wireless devices; if one band is busy in a given area, the radio can find another that works.
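
In software, that frequency-agile behavior amounts to a simple selection loop: scan whatever bands the radio can tune and pick the least busy one. A toy sketch, with band names and utilization figures invented:

```python
def pick_band(utilization):
    """Choose the least-utilized band a software-defined radio can tune.
    `utilization` maps band name -> measured busy fraction (0.0 to 1.0)."""
    return min(utilization, key=utilization.get)

# Hypothetical scan results at one place and moment in time.
scan = {"900 MHz": 0.85, "2.4 GHz": 0.95, "3.5 GHz": 0.40, "5 GHz": 0.60}
print(pick_band(scan))  # -> "3.5 GHz"
```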

Anybody who has ever visited a packed football stadium knows what it's like when spectrum gets full: practically nobody can get a connection and everybody is frustrated. If we are not careful, every downtown and dense suburban neighborhood is going to look like a stadium in terms of frequency usage, and nobody is going to be happy. We need to fix the spectrum mess and have a plan for the transition before we get to that point. And that point is going to arrive a lot sooner than anybody hopes.

It’s All Up to the Courts

As often happens with controversial topics in our society, the fates of Aereo and its clone FilmOn X are now in the courts. These companies supply antenna receivers to customers and let them receive live, local, over-the-air television from the local network affiliates of ABC, CBS, FOX and NBC on Internet-connected devices including TVs, tablets and smartphones. The companies claim that since the signal goes from a single antenna directly to a single customer, they have no obligation to pay retransmission fees to the network affiliates for the programming.

Of course the large networks disagree vehemently with that interpretation and have sued both companies. In April, Aereo won a suit in New York, a ruling the Second Circuit upheld in July. The judge in the Aereo suit concentrated on the way Aereo transmits the signal rather than ruling on the copyright infringement claims brought by the networks.

The networks also sued FilmOn X using the same arguments they had used against Aereo. FilmOn X is an odd company in some ways: in the past it went by the names Aereokiller and BarryDriller.com, both digs at Barry Diller, Aereo's most prominent backer. In fact, there are conspiracy theories flying around the Internet that FilmOn X was secretly founded by the networks for the purpose of being sued and losing on the Aereo issue.

A week ago the federal district court in Washington DC ruled against FilmOn X, saying the company had violated the networks' copyrights, and issued an injunction barring FilmOn X from operating. A week later the same court refused to hear an appeal on the issue.

So now there are two district courts with differing opinions on the same topic, having heard essentially the same arguments. Generally the only way to resolve this kind of split is for the Supreme Court to hear the case and settle the issue.

But until then both companies are in legal limbo. Aereo came out this week and publicly advised FilmOn X to ignore the injunction. Aereo also took pains to distinguish its technology from FilmOn X's, but the differences are subtle. Meanwhile Aereo continues to expand to new markets and continues to face new lawsuits in each market it enters.

As somebody on the sidelines I really don't know how I hope this case resolves. Part of me says this suit is the result of the greed of the networks, which are now pushing for as much as $2 per month per subscriber in retransmission fees for each local channel. Everybody in the industry understands that we are starting to price cable TV out of the reach of many households, and yet the networks and every other programmer keep pushing for higher and higher fees. The industry is laying the foundation of its own decline, and if the fees weren't this high, Aereo wouldn't have a business plan.

But the other side of me says the networks are right, at least under the current FCC cable rules. Of course, those rules were written decades ago, when nobody contemplated that somebody could bypass the cable companies the way Aereo has. Certainly the FCC ought to take another look at cable regulations and update them for the realities of TV over the Internet.

But from what I understand, nothing is likely to happen while Washington is in gridlock. The FCC is not free to change the rules much without authority from Congress, and there seems to be no impetus for Congress to look at the cable rules. So, as often happens when policy makers don't make policy, it's all up to the courts.

Delivering Gigabit Speeds

There is a lot of talk about companies like Google and many municipal networks delivering gigabit speeds to homes. But what is not discussed is that no existing in-home wiring technology can deliver that much bandwidth over any significant distance. Most people are shocked when they find out how quickly data speeds drop over existing wiring.

Existing wiring is adequate to deliver gigabit speeds in smaller homes or small offices. Carriers have typically used Category 5 cabling to carry the data signal, and that technology can deliver 1 Gbps for roughly 100 meters from the fiber terminal. Beyond that, speeds drop off significantly.

Wiring technology was never a significant issue when we used it to deliver slower speeds. The same fall-off occurs regardless of the speed being delivered, but a customer notices it less when a 20 Mbps connection falls to a few Mbps than when a gigabit connection falls to the same very slow speed: a 20 Mbps subscriber still getting 5 Mbps receives a quarter of what he bought, while a gigabit subscriber getting 5 Mbps receives half a percent.

Many carriers are thinking of using the new 802.11ac WiFi technology as a surrogate for inside wiring. But WiFi speeds drop off with distance even faster than speeds over data cabling. So one has to ask whether a customer ought to bother paying extra for a gigabit if most of it never reaches his devices.

Below is a chart comparing the technologies used today for data wiring, along with a few that have been proposed, like WiGig. The speeds in the table are 'application layer' speeds, meaning theoretical maximums. They are the easiest numbers to chart because they are the speeds each technology touts in its marketing. But note that actual delivered speeds are significantly lower than these application-layer speeds for every technology listed, due to things like protocol overhead and modulation losses.

Speeds Chart

The technology that stands out on the chart is ultra-broadband from PulseLink of Carlsbad, California. PulseLink uses the radio frequency (RF) spectrum above 2 GHz on coaxial cable and can deliver data rates exceeding 1 Gbps. The company markets the technology under the name CWave. It uses a wide swath of RF spectrum in the 3 to 5 GHz range; as a result the RF signal is out-of-band (OOB) to both cable TV and satellite and will peacefully coexist with both. RF spectrum above 3 GHz on coax has typically been considered unusable, but due to the techniques used by PulseLink's CWave chipset, the technology reliably delivers gigabit data rates without disturbing the frequencies used by cable TV and cable modems. Effectively it adds a whole new Ethernet data path over existing coax, needing no new wires where coax is already present.

The differences between these technologies really matter when you are looking at delivering data in larger buildings like schools and hospitals. As was recently in the news, President Obama announced the ConnectED initiative, with the stated goal of bringing a minimum of 100 Mbps, and a target of 1 Gbps, to 99% of students within five years. But there does not seem to be any good reason to bring a gigabit to a school if only a tiny fraction of that bandwidth can be delivered to the classrooms. I think PulseLink's ultra-broadband technology might be the only reasonable way to get that bandwidth to our classrooms.

Watching Health Care Costs

I know that in my own company, and at every one of my clients, health care costs are a big concern. We have gone through a decade-long period where health care cost inflation has been in the double digits every year, much faster than any other cost we face.

The rate of inflation in health care costs nationwide has finally slowed. This year the nationwide increase is expected to be 5.5%; next year the prediction is 4.5%. Both rates are still higher than general inflation, but a welcome relief after years of very large increases. In my own firm, which has been around for fifteen years, health care costs per employee have nearly tripled since we opened for business.

And those current inflation rates do not tell the whole story for many firms. The 5.5% increase this year reflects the cost of health care to employers, not overall insurance costs. The slower rate of health care inflation is due in part to companies pushing higher deductibles and copays onto employees as a way to keep their own share of health insurance under control. Last year employee copays rose 13%, which shows that the overall increase in health insurance costs was a lot more than the published 5.5%.

There are some trends in the industry that hint at a possible slowing in the cost of health care. For example, there is now a large industry of out-patient health clinics that charge as much as two-thirds less than a traditional doctor's office. And there is hope that the large statewide pools being created under Obamacare will lower overall premiums by bringing more young (and healthier) people into the insurance pool.

There is also a growing emphasis in many health care plans on preventive care, meaning many ailments will be nipped in the bud before they become big problems. Over time preventive care should significantly lower overall health care costs.

And hidden underneath all of these numbers is the numbing statistic that 30% of our nationwide health care spending each year goes to the process of people dying in hospitals and hospices. In recent years about two-thirds of people have died in an institution rather than at home, although that share is down ten percent from a decade ago. Almost nobody wants to die in an institution, and perhaps as a country we will find a way to allow more people to die at home.

But for most of my clients, even if health care cost inflation slows to 4% to 5%, they face an ugly future. Trend those increases out ten years: even 4.5% per year compounds to an increase of more than 50%. Do that math and see if you aren't very concerned.

There is also something to keep in mind: in 2018 there will be a tax on 'Cadillac' health care plans, defined today as plans costing over $10,200 for an individual or $27,500 for a family. Those may sound like high caps, but the amounts count contributions made by both the company and the employee. The tax is a whopping 40%, charged to the employer, on everything over the cap.

The average health care insurance cost last year was $10,522, so there are already many plans that would be considered Cadillac plans. The caps will be increased over time with general inflation, but if health care costs keep climbing faster than inflation, more plans will cross the threshold each year and incur the tax.
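
To make the squeeze concrete, here is a minimal sketch that projects a plan's cost against the cap. The 2018 cap and the 40% rate come from this post; the starting premium reuses last year's average, and both growth rates are hypothetical assumptions:

```python
def cadillac_tax(premium, cap, rate=0.40):
    """Excise tax: 40% of whatever the plan costs above the cap."""
    return rate * max(0.0, premium - cap)

premium, cap = 10_522.0, 10_200.0   # figures from the post
for year in range(2018, 2024):
    tax = cadillac_tax(premium, cap)
    print(f"{year}: premium ${premium:,.0f}, cap ${cap:,.0f}, tax ${tax:,.0f}")
    premium *= 1.05   # assume 5% annual health care inflation
    cap *= 1.02       # assume the cap is indexed at ~2% general inflation

# Because the premium grows faster than the cap, the gap widens
# every year and the employer's tax bill grows with it.
```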

I know that all of my clients want to provide good health care to their employees, and the decision to increase copays or deductibles is a painful one for all of us. There are a few creative ideas some companies are trying that are worth considering. One of the most interesting is handing employees the money to buy their own health insurance; there are now ways to do this as a defined-contribution health plan. Since having health insurance becomes mandatory for employees in 2014, you need to require proof that an employee is really using the money for health insurance. But companies that have tried this say their employees find cheaper plans than the company could buy at the group level.

The bottom line is that health care costs are going to keep increasing faster than general inflation. Add to that the worry of crossing the Cadillac tax threshold, and it is going to get harder and harder to pay for your employees' health care. I wish I had a magic bullet to recommend, but for now the best I can offer is to do the math and see if there is anything you can do to keep these costs under control at your own company.

The Latest in Home Security

Anybody following this blog knows that I have been promoting the idea of telecom providers getting into the home security business. I see this as one of the ways to keep yourself relevant with the advent of the Internet of Things.

Modern home security centers are already a lot more than burglar alarms; they can also be the platform for home automation and energy management. There are numerous devices that function as a gateway to any Ethernet-capable device in the home, connected by wire or wirelessly. These main consoles then interface with the user through smartphones and other such devices.

Of course home security still does the basic stuff: you can set up your house with monitors on doors and windows that tell you when something changes. But modern security systems can do much more. Here are some examples:

  • Everything can be tied into your smartphone so that you have access to your security system at all times. You can use your phone to change settings, peek in on any of the cameras, or even speak with somebody at your front door when you are not at home.
  • You can tie normal security features to motion detectors. These will tell you if something is moving in a room that ought to be empty, but they can also do cool things like alert you when anybody approaches an external door. So rather than waiting until somebody has broken in, you can be alerted when somebody is at a door. It's not all that useful to know when the mailman comes every day, but it's very comforting to be alerted when somebody is at your back door at 2:00 in the morning.
  • The systems can be tied into a number of other kinds of monitors. Certainly smoke detectors, but you can also monitor whether the temperature changes drastically in any room, or watch carbon monoxide levels (and, if you are really paranoid, many other kinds of chemicals and gases).
  • New systems include voice recognition, so you can talk to your system and change settings on the fly. For example, you can tell your system that you will be working in a certain room and to stop monitoring that room for a while. Your security system can then help absent-minded people like me: if you turn off the security in an area for a while, you can set it to ask you later whether you still want it off.
  • Your system can get to know you. Sophisticated systems are starting to use face recognition and gait sensors, so your security system will know it's you walking around on the lawn at midnight and not a stranger.
  • And it's all cloud-based, meaning you can get an alert if the power goes out while you are not at home. Cutting the power has always been a common burglar technique for confounding a security system, but the system can be set to alert your smartphone every time the power goes out.
  • And of course there are cameras to view or record everything. You can give your cameras some smarts so they only record unusual events, or events of a certain kind, so you only store video that matters. The cameras also let you monitor pets or babysitters while you are away. With cheap cloud storage you can record a lot of video.
  • There are now smart door locks tied to the security system. These can use some combination of cell phone proximity, voice or face recognition to allow keyless entry.
  • For those times when you drive away and can't remember whether you set the alarm, your system can be tied into your smartphone's GPS and ask whether you want the alarm on once it senses you are away from home (see the sketch after this list). Side benefit: you are always tracking the location of your cell phones, if you want to see where your kids really are.
  • Your customers can monitor it all themselves. It's no longer necessary to tie the security system into a center that alerts the police; a customer who is never without a smartphone can take a more active role and receive all of the alerts directly if they choose.
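
To give a flavor of how simple the GPS-based auto-arm logic mentioned above can be, here is a toy sketch; the coordinates, geofence radius, and function names are all invented for illustration:

```python
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    r = 6371.0  # Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

HOME = (38.90, -77.03)   # hypothetical home coordinates
GEOFENCE_KM = 0.5        # how far away counts as "left home"

def should_prompt_to_arm(phone_lat, phone_lon, alarm_armed):
    """Prompt the user to arm the alarm once the phone leaves the geofence."""
    away = distance_km(*HOME, phone_lat, phone_lon) > GEOFENCE_KM
    return away and not alarm_armed

print(should_prompt_to_arm(38.95, -77.03, alarm_armed=False))  # True: ~5.6 km away
```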

Most of these capabilities have been introduced within the last few years, and one can imagine many more coming in the next decade. So the best platform is one that is software-driven and can be upgraded to accept new devices and new features as they hit the market.

Finding the Right Partner

Yesterday I talked about public/private partnerships, since those seem to be among the more common partnerships in telecom these days. Today I am going to talk about how to find a partner you can coexist with to build a viable long-term venture.

It's not as easy as it sounds. I had a partner once who said that finding a good business partner is as hard as finding a good marriage partner, and I think he was probably right. I have seen hundreds of telecom partnerships, and a lot of them turned fractious and contentious as the partners grew apart in goals, personalities or vision.

But I have been part of some very good partnerships and have seen other good ones, and having also seen unsuccessful partnerships I can talk about what seems to work and what doesn't. Here are some things to consider when forming a partnership:

  • Power and ownership share need to be equitable, based upon both up-front and ongoing contributions. I have seen partnerships decide up front to split things 50/50, only for resentment to grow when one partner does most of the work to make the business successful. It's okay for the contributions to be disparate, with one partner contributing more than the other, but that should then be recognized in ownership and profit distribution.
  • You must share the same goals. This is one of the biggest problems I see in public/private partnerships: a commercial company and a municipality have fundamentally different ideas of how things should work. As an example, a municipal venture is considered successful when it is cash-flow positive, but a commercial venture needs to earn more than that to meet return expectations. If the partners don't share the same goals, one of them is going to be disappointed with any outcome.
  • Make sure everybody understands their role. In a business not everybody can be the boss. Partners need to decide up front who will be responsible for what, and then insist that each partner fulfills those obligations.
  • You must be able to communicate with your partner. Partners must be able to tell each other the whole unvarnished truth. It takes every partner to make a business work, and a failure to communicate will always lead to trouble down the line.
  • You must be able to resolve differences. In a more traditional business the owners or the board are ultimately in charge and can resolve disputes by fiat. But if a two-partner firm can't resolve an issue, the business can become paralyzed. Being able to iron out differences is probably the single most important requirement for a good partnership.
  • You must trust each other. This goes back to the statement that a business partner is like a marriage partner in some respects. Generally all of the partners in a firm can spend the firm's money, commit resources, and make decisions. If you don't trust your partner fully – trust them to be honest, not to cheat you, to do what they say they will do, to tell you the truth – then the business is eventually going to get in trouble. Partners shouldn't spend their time watching each other, but furthering the goals of the business.
  • Finally, partners have to be flexible. In telecom a business rarely follows the original business plan, so the partners must be willing to change as the world changes around them. I have seen too many partnerships end up quibbling over goals, processes or procedures spelled out in the original partnership agreement rather than making the changes needed to make the actual business a success.

I hope this list makes it obvious that you have to spend a lot of time up front talking through these issues before leaping into a partnership. For example, you should discuss up front how you will go about resolving your first impasse when it pops up. Far too often I see businesses partner with somebody simply because they share the same general goal, but if they can't meet the kinds of tests I have discussed above they are going to have a very hard time sustaining a business.