A Few Lessons from Big Companies

I spend a lot of time reading about large corporations, and I think there are lessons to be learned from them that are relevant to small companies.

Selling Product versus Building Relationships. There are many large companies that sell products without developing relationships with their customers. In our industry the large cable companies and telcos come to mind. They are all rated among the worst corporations in delivering customer service, and they even antagonize many of their customers. This works fine for them until they get competition, and then the customers who don’t like them quickly jump ship to the new competitor.

But there are large businesses that go out of their way to build customer relationships because they believe that loyal customers are their most important asset. Consider car manufacturers. They realized a long time ago that they were not going to be good at customer service, so they created a network of dealers who are local businesses with ties in each community and these dealers have built trust over generations. And there are many other companies that deliver great customer service. Tech firms like Amazon, Apple, and Google have been consistently rated among the top ten in customer satisfaction for the last few years – showing that tech firms can put an emphasis on customers and still thrive.

My most successful clients build relationships with their customers and as a result have built a loyal customer base. Many of them are or were monopolies, and there was a time when most of my clients could not tell me who their ten largest customers were. But I rarely see that today and small telcos and cable companies have learned to build loyalty through building relationships.

Growing Fast versus Growing Deliberately. Many large companies need to grow fast to be successful. Once you have taken venture capital money or gone public, the pressure is on to grow profits quickly. But growing too fast almost always changes a company in negative ways. It’s really common to see companies go into growth mode and then forget who they are. Most tech companies, for example, started with a small core of people who worked hard as a team to build the core business. But when it’s time to grow and companies hire mountains of new people, it’s nearly impossible to maintain the original culture that made the company a great place to work.

Growth can be just as hard for small companies. It can be as hard economically and culturally for a small company to grow from 5,000 to 10,000 customers as it is for a large company to add millions. Small companies are often unprepared for the extra work involved with growth and find that they overwork and overstress their staff during a growth cycle. Growth creates a dilemma for small companies. If you hire the people needed to staff the growth period your company will be overstaffed when growth stops.

And so a lesson about growth can be learned from large companies. They will often staff growth through temporary employees, contractors, and consultants rather than take on people that they may not need later. Companies of any size are hesitant about hiring employees that they might not need a year from now.

High-Tech versus High-Touch. A lot of large businesses try to feign a good customer service experience by electronically ‘touching’ their customers often. I recall last year when Comcast introduced a texting system to communicate with customers. After they sent me half a dozen text messages in the same week, I disconnected the texting function because I really didn’t want to hear from them that often. But there are large companies who are convinced that if they electronically reach out to customers often, they are engaging in relationship building and proactive customer service.

And perhaps they are with some customers. But I am more appreciative of a business where I can talk to a person when it’s needed. Not that I mind electronic communications. I like to know that AT&T has auto-billed me and I like knowing when charges hit my credit cards. But I don’t want to be bothered by a business when they aren’t passing on information I want or need.

The important point here is that you have to touch your customers somehow, and reaching out electronically or in person is better than not talking to your customers at all. I know telecom companies that call every customer at least once a year to ask if they like the service and if everything is okay. Such calls are welcomed by most customers, and this is a great tool for building relationships. But be prepared: if you ask your customers how you are doing, you need to be ready to deal with negative feedback. That is how you build happy customers.

Have We Entered the Age of Robots?

I read a lot of tech news, journals, and blogs, and it recently dawned on me that we have already quietly entered the age of robots. Certainly we are not yet close to having C-3PO from Star Wars, or even Robby the Robot from Forbidden Planet. But I think that we have crossed the threshold that future historians will point to as the start of the age of robots.

There are research teams all over the world working to get robots to do the kinds of tasks that we want from a C-3PO. As the recent DARPA challenge showed, robots are still very awkward at doing simple physical tasks—but they are now able to get them done. There are research teams working out how to make robots move in the many subtle ways that humans move, and they will eventually get there.

The voice recognition used by robots still has a long way to go to be seamless and accurate. As you see when you use Apple’s Siri, there are still times when voice recognition just doesn’t understand us. But voice recognition is getting better all the time.

And robots still are not fabulous at sensing their surroundings, but this, too, is improving. Who would ever have thought that in 2015 we would have driverless cars? Yet they are seemingly now everywhere and a number of states have already made it legal for them to share the road with the rest of us.

The reason I think we might have already entered the Robot Age is that we can now make robots that are capable of doing each of the many tasks we want out of a fully functional robot. Much of what robots can do now is rudimentary but all that is needed to get the robots from science fiction to real life is more research and development and further improvements in computing power. And both are happening. There is a massive amount of robot research underway and computer power continues to grow exponentially. I would think that within a decade computing power will have improved enough to overcome the current limitations.

All of the components needed to create robots have already gotten very cheap. Sensors that once cost $1,000 can now be bought for $10. The various motors used for robot motion have moved from expensive to affordable. And as real mass production comes into play, the cost of building a robot is going to continue to drop significantly.

We already have evidence that robots can succeed. Driverless cars might be the best example. One doesn’t have to look very far into the future to foresee driverless cars being a major phenomenon. I can’t believe that Uber really expects to make a fortune by poorly paying and mistreating human drivers such that the average Uber driver lasts less than half a year. Surely Uber is positioning itself to have the first fleet of driverless taxis, which will be very profitable without a labor cost.

We see robots being integrated into the workplace more so than into homes. Amazon is working feverishly towards totally automating their distribution centers. I think this has been their goal for a decade, and once it’s all done with robots, the part of the business that has always lost money for Amazon will become quite profitable. There are now robots being tested in hospitals to deliver meals, supplies, and drugs. There are robot concierges in Japan. And almost every factory these days has a number of steel-collar workers. You have to know that Apple is looking forward to the day soon when they can make iPhones entirely with robots and avoid the bad publicity they keep getting from their factories today.

The average person will look at video from the recent DARPA challenge, see clumsy robots, and be convinced that robots are still a long way off. But almost every component needed to make robots better is improving at an exponential pace, and we know from history that things that grow exponentially always surprise people by ‘bursting’ onto the scene. I would not be at all surprised to see a workable home maid robot within a decade and a really awesome one within twenty years. I know that when there is a robot that can do the laundry, load the dishwasher, wash the floor, and clean the cat litter, then I am going to want one. Especially cleaning the cat litter—is somebody working on that?

Your Weakness Might be Your People

Today I’m going to talk about something that most company owners or general managers don’t want to be reminded of. I’ve read a number of things lately that remind me that the employees at a company, while its biggest resource, can also sometimes be its biggest weakness.

What do I mean by that? For one thing, I’ve read several industry security reports lately that all say that company employees are the largest single reason that networks are getting compromised. Many companies now have pretty good firewalls and so hackers are no longer trying to break directly into company networks. Instead they are using techniques that get your employees to let them inside.

One of the primary new hacker tools is spoofed email. Hackers will get valid email addresses for somebody inside the company, and then create fake, infected emails that appear to come from that person to others in the company. Their hope is that somebody inside the company will open an infected spoofed email and download a file containing a virus. Generally, once they see the structure of your email addresses, it’s not that hard to figure out other email addresses inside the company.
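As an illustration of one simple defense, here is a minimal Python sketch that flags messages whose display name claims an internal sender while the actual address sits outside the company domain. The domain, the employee list, and the function name are hypothetical assumptions for this example, not part of any real mail filter.

```python
# Hedged sketch: flag emails whose display name claims an internal sender
# but whose address is outside the company domain. COMPANY_DOMAIN and
# KNOWN_EMPLOYEES are illustrative assumptions.
from email.utils import parseaddr

COMPANY_DOMAIN = "example.com"                 # hypothetical company domain
KNOWN_EMPLOYEES = {"jane.doe", "john.smith"}   # hypothetical directory

def looks_spoofed(from_header: str) -> bool:
    """Return True if the From: header claims an employee identity
    but the address does not belong to the company domain."""
    display_name, address = parseaddr(from_header)
    _, _, domain = address.rpartition("@")
    claims_internal = any(
        emp.replace(".", " ") in display_name.lower() for emp in KNOWN_EMPLOYEES
    )
    return claims_internal and domain.lower() != COMPANY_DOMAIN

print(looks_spoofed('"Jane Doe" <jane.doe@example.com>'))   # False: legitimate
print(looks_spoofed('"Jane Doe" <jane.doe@evil.example>'))  # True: spoofed
```

Real mail filters lean on standards like SPF, DKIM, and DMARC rather than string matching, but the sketch shows the general kind of check that catches the lookalike addresses described above.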

The other way that hackers get in is with the older technique of luring somebody inside a network to an infected website. I reported in a recent blog that Menlo Security tested the top million websites (by traffic volume) and found that 6% of them contained malware of some sort. Much of this malware is just tracking spyware that isn’t too harmful, but some of it can be the deadliest malware on the web.

Cisco said last year that malware from web advertising is possibly the biggest new security threat. And malware is no longer just on suspicious web sites but can be found on very mainstream websites. This is due to the very odd system we have for getting advertising to websites. I discussed this in a blog earlier this year, and such malware is just as likely to come from a major news site as it is from someplace more suspicious.

The main defense against these kinds of problems is to continuously talk about these issues so that your employees are aware of them. The interesting thing is that employees are far likelier to open or download a file from an infected email at work than they are at home. For some reason employees are not as cautious with suspicious emails at work as they are on their home computers. If something is spoofed to look like it came from somebody inside the company they are likely to open it.

The other issue that brought this to mind recently is that I have several clients who have been the victims of embezzlement by employees. Of course, this is a crime that has been around forever and almost every time this happens people are shocked that it could happen to them. My first college degree is in accounting and I had several courses that dealt with these issues since it’s something that auditors are supposed to look for and uncover.

Accountants understand that there are two primary kinds of embezzlement. There is the loner who finds a way to write checks to themselves or to a bogus vendor they have created. This kind of embezzlement is almost always due to lax financial controls. If every check that is written must be approved by somebody who is going to make sure that a payment is legitimate, then it’s very hard for somebody to pull this off. Generally companies get into this kind of trouble when they have somebody with the sole authority to write checks or where people can somehow bypass the controls. Sadly, the temptation to steal is just too much for some people.
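To make the dual-control idea concrete, here is a minimal sketch of segregation of duties in Python. The Payment class and its rules are hypothetical, meant only to illustrate the principle that the person who requests a check can never be the one who approves it.

```python
# Minimal sketch of segregation of duties for check writing.
# All names here are illustrative, not from a real accounting system.
class Payment:
    def __init__(self, requester: str, vendor: str, amount: float):
        self.requester = requester
        self.vendor = vendor
        self.amount = amount
        self.approver = None

    def approve(self, approver: str) -> None:
        # The core control: no one may approve their own request.
        if approver == self.requester:
            raise PermissionError("requester cannot approve their own payment")
        self.approver = approver

    def can_be_paid(self) -> bool:
        # A check is only cut once a second person has vouched for it.
        return self.approver is not None

pmt = Payment(requester="alice", vendor="Acme Cable Supply", amount=12500.00)
pmt.approve("bob")         # a second person vouches for the payment
print(pmt.can_be_paid())   # True
```

Note that this control stops the loner but fails exactly where the next paragraph says it does: if the approver is colluding with the requester, the check sails through.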

The other kind of embezzlement is a lot harder to catch and comes when a number of employees collude to commit fraud. In that situation they are often able to bypass even the best internal controls. For instance, one employee can ask for a payment to a bogus vendor while a colluding employee vouches that it’s legitimate. I remember a huge case of this when I worked at Southwestern Bell many years ago, where a large group of employees colluded to buy huge amounts of telecom cable and electronics and have it delivered to a fake company warehouse.

It’s a shame that we live in a world where you have to worry about these sorts of things, but it happens to a lot of companies sometime during their corporate life. Almost invariably the person who is stealing from the company seems like the least likely candidate and surprises everybody.

I didn’t write this blog to cause you to be suspicious of your employees or staff. But it never hurts once in a while to think about these things because, sadly, one of your biggest weaknesses really can be your employees. And that can really hurt.

LTE-U

Recently, the NCTA asked the FCC to make sure that wireless carriers don’t interfere with WiFi spectrum. I wrote a blog a few weeks ago talking about all of the demands on WiFi, and the threat that the NCTA is warning about is another use of the already busy WiFi spectrum.

Cellular carriers today use LTE technology to deliver 4G voice and data over spectrum for which they have paid billions (at least in the US and Europe). But in urban areas the LTE spectrum is already stressed, and the demand for the existing spectrum is growing far faster than the carriers can find new spectrum to offload the extra demand.

The cellular carriers have had their eye on the 5 GHz unlicensed band of spectrum that is used for WiFi. This is a big swath of spectrum that in some markets is larger than the band that some carriers have for LTE. Recently, various carriers have been experimenting with using this public spectrum to deliver LTE. Huawei and NTT demonstrated this capability last August; Qualcomm showed the same capability at the CES show earlier this year. It’s rumored that T-Mobile plans to run a trial of this technology this year.

This new technology is being called LTE-U (for Unlicensed). The NCTA filed at the FCC on behalf of their cable company members, who use this WiFi spectrum for various purposes such as distributing data wirelessly around a home or bringing data to set-top boxes. They are worried that if the cellular companies start using the spectrum, they will swamp it and make WiFi useless for everybody else, particularly in urban areas where WiFi is under the most pressure.

That certainly is a valid concern. As my recent blog noted, the list of companies and technologies that are planning on using WiFi spectrum is large and growing. And there is already notable stress on WiFi around crowded places like large hotels, convention centers, and stadiums. The fear is that if cellular carriers start using the spectrum this same crowding will spread to more places, making the spectrum useless to everyone.

The cellular carriers argue that the swath of WiFi is large enough to allow them to use it without hurting other users. They argue that nobody can use all of the 400 MHz of spectrum in that band all at once. While that is true, it doesn’t take a huge pile of LTE-U customers at one time to locally overdraw the WiFi spectrum in the same manner that they are overloading the cellular spectrum today.

Engineers tell me that LTE uses the spectrum more efficiently today than most WiFi technologies do. This is because the LTE specifications neatly limit the bandwidth that any one customer can draw, while most WiFi applications will let a user grab all of the bandwidth if it’s available. This means you can fit a lot more LTE customers into the spectrum that might be assigned to one WiFi customer.

There is a characteristic of WiFi that makes it incompatible with the way that LTE works. WiFi has been designed to share spectrum. When one customer is using WiFi they can grab a huge swath of spectrum, but when another customer demands bandwidth, the system dynamically decreases the first customer’s share to make room for the second one. This is very different from how LTE works. LTE works more like a telephone network: if there is enough bandwidth available to handle a customer it will assign a band to that customer, or else deliver a ‘busy signal’ (no bars) if there is not enough bandwidth. The problem with these two different approaches is that LTE would continually grab spectrum until it’s all used and the WiFi users are shut out, much like what you might get in a busy hotel in the evening.

The LTE providers say they have handled this by introducing a new protocol called LAA (Licensed Assisted Access) which introduces the idea of coexistence into the LTE network. If it works properly, LAA ought to be able to coexist with WiFi in the same manner that multiple WiFi customers coexist. Without this change in protocol LTE would quickly gobble all of the free WiFi spectrum.

But this still doesn’t answer the concern that even with LAA there could be a lot of people trying to grab bandwidth in environments where the WiFi is already stressed. A WiFi network never shuts anybody out like an LTE system will, but rather keeps subdividing the bandwidth until the amount each customer gets is too small to use.
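A toy simulation makes the two sharing models easy to compare. The 400 MHz band size comes from the article; the fixed 20 MHz per-user LTE slice is an assumption chosen purely for illustration.

```python
# Toy comparison of WiFi-style sharing vs. LTE-style admission control.
TOTAL_MHZ = 400      # rough size of the unlicensed 5 GHz band (from the article)
LTE_SLICE_MHZ = 20   # assumed fixed per-user LTE assignment (illustrative)

def wifi_share(num_users: int) -> float:
    """WiFi keeps subdividing: nobody is refused, every slice shrinks."""
    return TOTAL_MHZ / num_users

def lte_admission(num_users: int) -> tuple:
    """LTE hands out fixed slices until the band is full, then newcomers
    get a 'busy signal' (no bars)."""
    admitted = min(num_users, TOTAL_MHZ // LTE_SLICE_MHZ)
    return admitted, num_users - admitted

for n in (10, 20, 40, 80):
    served, rejected = lte_admission(n)
    print(f"{n:>2} users: WiFi gives {wifi_share(n):5.1f} MHz each; "
          f"LTE serves {served} and rejects {rejected}")
```

With 80 users, WiFi hands everybody a 5 MHz sliver while LTE serves 20 at full speed and turns 60 away, which is exactly the clash of philosophies the NCTA is worried about.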

It will be interesting to see what the FCC says about this. This was discussed years ago and the FCC never intended to let licensed cellular holders snatch the public WiFi spectrum. I will also be curious to see if wireless carriers try to charge customers for data usage when that data is being delivered over a free, unlicensed swath of spectrum. And how will customers even know that is where they are getting their data?

I hope the FCC doesn’t let the wireless carriers run rampant with this, because I think it’s inevitable that this is going to cause huge problems. There are already places today where WiFi is overloaded, and this new kind of data traffic could swamp the spectrum in a lot more places. The wireless carriers can make promises all day about how this won’t cause problems, but it doesn’t take a huge number of LTE-U users at a cell site to start causing problems.

The Power of the Bundle

One of the primary ways that the cable companies have built their market dominance is through bundling. We are all aware of the many bundles of cable TV, telephone, and Internet access that they sell, and for which they give you a discount.

But you have to wonder how much longer those traditional bundles are going to make sense. According to the American Cable Association, by 2020 most cable companies are going to be seeing zero profit margins on their cable product. And for many companies it’s not even that far away. I’ve done the math and many of my clients are already losing money on cable when you consider all costs of supporting the cable product.

I’m sure the very largest cable companies do a little better, but they can’t be making a lot of money on cable TV. They have economy of scale, which has to help, but other than Comcast, which owns a number of the networks carried on its systems, the big companies can’t be making a very big margin on cable.

The normal bundle discount works by providing a discount for buying multiple products. It’s not unusual to see a bundle discount of $20 for somebody buying the full triple play. Nobody outside the cable companies knows which products are discounted or how the cable companies count the discount on their books. I’m not sure that really matters to the companies, but it really matters to customers.

Bundles today seem to have become a tool for penalizing a customer for dropping a service rather than a marketing tool to attract customers. Today most new customers at big cable companies are attracted by specials, and those specials eventually revert to the bundle prices when the promotional period ends. So most new customers don’t even know the bundle prices when they buy.

But you quickly learn the unbundled prices if you try to drop services. Let’s say you have a cable and Internet bundle and are paying $69 for it. If this bundle has a $15 bundling discount, the products would cost $84 if bought separately. If you want to cut the cord and cancel your cable, you will find that you lose the whole bundling discount, and your remaining Internet connection might still cost you $45 or more per month, meaning that you save $24 or less from cutting cable. If you had planned on dropping cable and buying Netflix and perhaps some other OTT service, you might easily find yourself paying more after cutting the cord than you paid before with the bundle.
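Working through that example in code shows how quickly the savings evaporate. The OTT prices below are assumptions for illustration.

```python
# The cord-cutting arithmetic from the example above.
bundle_price    = 69.00   # cable + Internet bundle
bundle_discount = 15.00   # discount for buying both products
a_la_carte      = bundle_price + bundle_discount   # $84 if bought separately

internet_alone  = 45.00   # standalone Internet price after dropping cable
savings = bundle_price - internet_alone            # $24/month saved, at best

ott_services = [9.99, 8.99]   # assumed prices for Netflix plus one other OTT service
net = savings - sum(ott_services)

print(f"Monthly savings from cutting cable: ${savings:.2f}")
print(f"After adding two OTT services: ${net:.2f}")   # about $5 left, or less
```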

That is the power of the bundle. It is no longer a marketing tool to capture customers, because the big cable companies only talk about their bundle prices in the very fine print of their advertising. Instead, the bundle is a way to penalize customers for cutting service. I see industry pundits wondering all the time why cord cutting isn’t happening faster. There are a lot of people in surveys who say they are going to cut the cord but then never do it. Since most cord cutters want to keep their Internet connection, I think a lot of them change their minds about cutting cable when they find out how paltry their savings would be.

If anything, cable companies are probably going to have more opportunities to bundle in the future than they do today. People have been steadily dropping voice lines for a long time. And while cable cord cutting is starting slowly, it is picking up steam. But to offset these losses the big cable companies are adding new products like security, energy management, home automation and IoT, and WiFi phones.

The cable companies are probably going to have the opportunity to sell OTT cable packages. It seems likely that the FCC is going to give anybody the ability to sell smaller OTT products over the web, and one has to think that they are going to let the cable companies compete with the same smaller products. Today, cable companies have a regulated set of rules for how they must build their programming tiers. But I suspect that there is going to be more profit for cable companies in selling a 40-channel package than in today’s big 300-channel packages.

And so we are probably going to see a lot more bundling, but rather than the triple play bundle it will be Internet access bundled with these other new products. And certainly the cable company is going to continue to use the bundling discount as a way to make it hard for customers to drop their service. So my guess is that bundling is not only here to stay but that it has a big future as a tool for cable companies to continue to strong-arm their customers to stay with them.

New Tech – July 2015

As I do periodically, I’ve compiled a roundup of some of the coolest new technology that might be affecting our industry in the near future.

Light-Based Computers: Researchers at Stanford University have found an efficient way to transmit data between computer chips using light. This might finally enable light-based computers.

Light-based computing has two advantages over electricity-based computing. First, light transmissions are faster, meaning that data can be moved more quickly to where it’s needed, which will vastly increase the capacity of a chip. The other big advantage is that it is greener and does not generate much heat. Today about 80% of the power poured into a chip is converted to heat, which is why you need a fan for your home computer and why data centers need huge amounts of power to keep them cool.

The breakthrough came from taking advantage of the tiny imperfections found in any chip. The researchers developed an algorithm that works with each unique chip to determine exactly where light gateways should be placed. This is a very different concept from today’s manufacturing, where the goal is to make each chip exactly the same. The architecture then uses many extremely thin layers of silicon, perhaps 20 layers in the width of a hair, and light gateways that work in three dimensions.

Faster Fiber: Scientists at the Qualcomm Institute in San Diego have been able to increase the power of optical signals in long-haul fiber by a factor of 20. They were able to send a signal 7,400 miles through fiber without amplification.

Long-haul fibers today use a whole range of different light wavelengths in order to carry more data. However, as you cram in additional light paths you also increase the interference between them, which we call crosstalk. This eventually distorts the signal and requires it to be regenerated and re-amplified. The Qualcomm scientists have found a technique they call ‘combing’ that conditions the light stream before it is sent, greatly reducing the crosstalk.

This breakthrough means that existing fiber signals can be sent a lot farther without regeneration in applications like undersea fibers. But in normal fiber applications this technique means that about twice as much data can be crammed into the same light path – effectively doubling the capacity of fiber.

Biodegradable Chips: Engineers at the University of Wisconsin at Madison have developed a chip made almost entirely out of wood cellulose. The advantage of this technology is that we can create chips for many uses that will be disposable or recyclable with other trash. We have a huge worldwide waste problem with current electronics, which should not be put into landfills because they contain heavy metals and other unhealthy compounds.

While these chips probably won’t be used for high-density computing like in data centers, this could become a standard way to make chips for the many things we use that are eventually disposable. One can certainly envision this as the basis of many chips for the Internet of Things.

A Replacement for GPS: DARPA is working on a replacement for GPS. GPS was developed by DARPA just a few decades ago, but there are already a lot of places where GPS doesn’t work, such as underground. Since GPS is satellite-based, DARPA is also worried about it being jammed in combat situations.

The new location system will be based upon self-calibrating gyroscopes that always ‘know’ where they are. This would create a location technology that is not satellite-based and not subject to outside interference. It also would work better than current GPS in three dimensions, meaning that it would more accurately measure changes in altitude.

While the technology doesn’t need an external reference to calibrate itself or know its location, DARPA is also building in what it calls ASPN (All Source Positioning and Navigation). This means a device would be able to pick up radio, television, or other signals to double-check its position and recalibrate as needed.

One Way to Cut Cable Bills

I just read that retransmission fees may climb to $6 per network in the next few years. At a recent industry summit Randy Bongarten, the CEO of Bonten Media Group and the owner of a number of broadcast stations, predicted that cable systems would soon be paying as much as $6 for each of the major broadcast networks – ABC, NBC, CBS, and FOX.

For those not familiar with the cable industry, every cable company must pay a retransmission consent fee to each of these major networks to compensate them for the right to carry their programming. This is a relatively new phenomenon in the industry, and following is a brief history:

  • In 1972 the FCC said that cable systems must carry stations that are within 60 miles of their service area.
  • In 1992 the FCC ruled that station owners could negotiate compensation for carriage of their signals.
  • Not much was done with this until the early 2000s when small payments for network content were negotiated in a few major metropolitan markets.
  • But within a decade every cable system was paying for local content and the networks increased rates aggressively with each new two-year contract. Most network stations today charge between $2 and $4 per customer to cable companies for carrying their content.

And now the networks want to keep increasing the payments for local content to $6 per customer per month. That means that soon $24 out of every cable TV bill in the country will be sent back to the four primary networks. With roughly 100 million cable subscribers that is nearly $29 billion per year and $288 per household.
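The arithmetic behind those figures is straightforward, as the quick check below shows; it uses the article’s own numbers.

```python
# Checking the retransmission math using the figures cited above.
fee_per_network = 6.00         # predicted monthly fee per major network
networks = 4                   # ABC, NBC, CBS, and FOX
subscribers = 100_000_000      # rough US cable subscriber count

per_household_month = fee_per_network * networks         # $24 per month
per_household_year  = per_household_month * 12           # $288 per year
industry_year       = per_household_year * subscribers   # ~$28.8 billion

print(f"${per_household_month:.0f}/month and ${per_household_year:.0f}/year per household")
print(f"roughly ${industry_year / 1e9:.1f} billion per year industry-wide")
```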

This situation is made worse by the fact that cable companies have little recourse but to carry this content. Customers would drop cable if they refused to carry the local stations. But cable companies’ hands are also tied because in order to carry advanced programming such as expanded basic or digital tiers, they are required to carry the basic tier – the tier that must be offered to every customer. The other problem faced by cable companies is that there is little real negotiation on the retransmission rates – it’s generally a take-it-or-leave-it price demanded by the network affiliates.

The FCC could return some fairness to the process and also give a break to consumers with one simple change in the rules. The FCC could let customers opt out of buying the basic channels from the cable company. Anybody who lives in a metro area can already get all of these networks for free with a pair of rabbit ears. If customers had the option of opting out of these channels from cable, then they could cut their cable bill significantly while still being able to watch the channels for free from rabbit ears. It’s relatively easy to install rabbit ears to work alongside your cable system.

Of course, the cable companies would have to ask for this kind of change, and so far none of them have gone this far. And this is because, as much as they hate passing on the big fees from programmers, the big cable companies are also complicit in the process. When their programming costs go up $3 in a year they will raise rates $4, so their profits keep climbing every year along with the programmers’.

But we are finally starting to see cracks in the system. Most cord cutters are doing it to save money, and I am positive that if people had the ability to opt out of paying for the local networks, many of them would. Today, if such programming costs $4 per network, then a customer could instantly cut their bill by $16 per month, or $192 per year.

So perhaps what we need is for individuals to start asking the FCC to allow them to opt out of paying for local channels that they could otherwise get with a cheap pair of rabbit ears. The cable companies might eventually come around to wanting this if cord cutting grows to be too significant, but right now they have no interest in looking out for the benefit of their customers.

I know many smaller cable operators who would love to have this option. They feel the local networks are holding them hostage by demanding bigger payments for local content every year. If a cable company were willing to work with its own customers to bypass the local stations, this might bring some balance back to the process and turn it into the negotiation that the FCC originally envisioned in 1992. I know smaller companies that would gladly provide every customer with rabbit ears and help integrate them with their TVs if that were allowed. But today a cable company could find itself in hot water if it actively helped customers bypass the local networks.

The runaway greed of the networks and station owners is ruining the cable market. Cable rates continue to skyrocket much faster than the rate of inflation. Households really love their TV, but more and more of them are finding cable to be unaffordable.

I hope the FCC wakes up to this and perhaps this blog can be the first tiny step towards planting this idea in people’s heads. Nobody really wants to pay $24 per month just to get ABC, CBS, NBC, and FOX. So let’s start asking the FCC to let us opt out of those payments.

Finally a Use for LMDS

Vivint provides a range of security and home monitoring products and has recently also become a wireless ISP in a few markets such as San Antonio, El Paso, and a few towns in northern Utah. What I found interesting is that they are using the LMDS spectrum.

LMDS stands for Local Multipoint Distribution Service and is licensed spectrum operating between 27.5 GHz and 31.3 GHz, close to the range of various microwave frequencies. The LMDS spectrum was sold as a very robust A band, an 1,150 MHz swath of bandwidth, and a B band of 150 MHz.

This was auctioned to the public in 1998. I know a number of companies that bought the spectrum then, and I know a few who created business plans using LMDS that all failed. There were two problems with using LMDS. The first was the chicken-and-egg issue that all new spectrum faces: a band can’t really be used commercially until somebody develops cheap gear for it, and vendors won’t develop cheap gear until they get a large buyer who will purchase enough of it to finance the R&D. After the spectrum hit the market there were a few beta tests of equipment that didn’t work well, but no big user emerged and the market died.

The other issue is the practical application of the spectrum. In 1998 this was touted as being able to deliver a wireless DS3, which is about 45 Mbps. That was a lot of bandwidth in 1998, but it is no longer particularly impressive. And the spectrum has real-life limitations. On a point-to-point basis it can go, at best, about 5 miles, and on a point-to-multipoint basis it can go, at best, about a mile and a half. The spectrum can achieve those distances only in areas without a lot of humidity (which is why Vivint is deploying it in the dry southwest). It also is easily blocked by trees and buildings, another reason to go to west Texas and Utah.

So this spectrum has gone mostly unused for a decade and a half. A lot of license holders have a few point-to-point links working on it just to preserve their licenses, but I am sure there are license holders who just let theirs go. Vivint is buying rights to the spectrum in these markets from XO Communications and Straight Path Communications.

It looks like Vivint has found a strategy for monetizing the spectrum. They obviously found radios that will work on it, which is not that unusual today now that we have software-tunable radios that can work across a wide range of spectrum (something we didn’t have in 1998).

Vivint is also dealing with the distance and bandwidth limitations in a very creative way. They are selling in urban/suburban areas, which gives them a decent density within the range of a given transmitter. They are then using point-to-point radios to bring bandwidth to what they call hub homes. They give these homes free Internet connectivity in exchange for housing and powering their equipment. From each of these homes they will serve up to 24 other homes. That small number of subscribers is what allows them to offer 100 Mbps bandwidth. If they served more homes the effective bandwidth would quickly drop.
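A rough sketch shows why capping each hub at 24 homes keeps a 100 Mbps offer workable. The backhaul figure below is an assumption for illustration, not a published Vivint number.

```python
# Why a small subscriber count per hub preserves the 100 Mbps offer.
backhaul_mbps   = 1000   # assumed point-to-point feed into a hub home
homes_per_hub   = 24     # from the article
advertised_mbps = 100

# Shared networks rely on statistical multiplexing: not everyone peaks at once.
worst_case = backhaul_mbps / homes_per_hub                    # if all 24 peak together
oversub    = homes_per_hub * advertised_mbps / backhaul_mbps  # oversubscription ratio

print(f"Worst case per home: {worst_case:.0f} Mbps")   # ~42 Mbps
print(f"Oversubscription ratio: {oversub:.1f}:1")      # modest by ISP standards
```

Double the homes per hub and the oversubscription ratio doubles too, which is why the effective bandwidth would quickly drop with more subscribers.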

Vivint prices 100 Mbps bandwidth at $59.95 per month. For the wireless customers they are also offering VoIP plus cloud storage. Plus Vivint has a wide range of security and other products they can sell to a household. It’s not a standard bundle, but it’s a pretty good one.

This doesn’t look like a bad business plan. With the range of services they sell, they are probably averaging at least $85 per customer per month, and maybe more. And they are gaining some economy of scale and report having over 15,000 customers.

This business plan certainly isn’t for everybody. It wouldn’t work well in places like humid Florida or Louisiana. It also wouldn’t work well in heavily wooded towns. This business plan takes a lot of discipline to be successful. Once they have established a hub home, the plan is only going to work if they can find other customers in the same local area, within 1.5 miles. I figure that they knock on doors to find customers around every hub home. The math would be terrible if they only got a few homes per hub.

They also have to find licensed LMDS spectrum holders, which they obviously have done in these markets, but that might not be possible everywhere. This business plan must be urban in order to have enough density, and it looks totally infeasible in rural areas.

I have to credit Vivint with finally finding a market use for this spectrum. In today’s marketplace it sounds like they have put together a very marketable suite of products including bandwidth at an affordable price. This is what competition looks like. While LMDS spectrum is only going to work this well in arid places, the idea of a non-traditional bundle is one that others ought to consider.

The Big City Bandwidth Dilemma

Seattle is like many large cities in that they badly want a gigabit fiber network everywhere. They were one of the earliest cities to want this, and they hired me back in 2005 to try to find a way to bring big bandwidth to the city. They still don’t have fiber, and they recently commissioned another study to see if there is a solution available today.

The study concentrated on the cost of bringing fiber everywhere and on how the city might be able to pay for it. After all, no city wants to build fiber if they don’t reasonably believe they can make the payments on the bonds used to pay for it. The report shows that it’s very hard for a large city to justify paying for a fiber network. And this highlights what I call the big city bandwidth dilemma. Should a city just wait to see what the incumbents do and hope that they eventually get gigabit broadband, or should they be like Seattle and keep pushing for a solution? There are three major aspects of the dilemma that every city is wrangling with:

The Incumbent Response. If a city does nothing they may never get fiber, or they might get fiber to some of the ‘best’ neighborhoods, but not everywhere. We see that in markets where somebody other than the incumbents brings fiber, the incumbents immediately step up their game and offer faster speeds. There is no better evidence for this than in Austin, where both AT&T and Time Warner quickly announced much faster speeds and competitive prices to offset Google’s entry into the market.

But everybody understands that the incumbents in Austin would not have increased speeds absent any competition, as can be seen in their many other markets. This creates a huge dilemma for a city. Should they decide to build fiber alone or with a commercial partner, that new venture will be met with stiff competition and will have a hard time getting the market penetration needed to ensure financial success. But should the city do nothing, then they get nothing.

Citywide Coverage. In large cities almost no commercial builder is willing to build fiber to every neighborhood. One doesn’t need a crystal ball to look at the consequences of this in the future. A city will become a patchwork of fiber haves and have-nots. The have-not neighborhoods probably already have some poverty and blight, but if they get walled off from having the same broadband as everybody else, then over time they are going to become even more isolated and poor. Every city that has Google coming to town is so thrilled to have them that nobody is looking forward ten and twenty years to imagine what will happen to the neighborhoods without fiber.

Cherry Picking. Google is selling a gigabit for a flat $70 per month. While that might be cheap for a gigabit, it is still a cherry-picking price that is too expensive for most households. It’s hard to imagine more than 30% to 40% of any market being willing to pay that much for broadband. A large number of homes will settle for something slower that they can afford.

And almost every other gigabit provider charges more than Google. For example, CenturyLink is now selling a gigabit in some markets at $79.95—but in order to get it you have to buy a $45/month phone plan. Before taxes that means it costs $125 per month to get the gigabit. As far as I can tell Comcast doesn’t have a gigabit product yet, but earlier this year they came out with a 2-gigabit fiber-fed product priced at $300 per month.

The problem with cherry picking is that it also creates a market of haves and have-nots. The incumbent cable company may not like the competition, but they know they are still going to be able to sell over-priced bandwidth to the majority of the market. Look at how Comcast has fared against Verizon FiOS and you will see that, while they hate competition, they still fare quite well in a competitive market.

A Possible Solution? The Seattle report did suggest one solution that could make this work. Cities not only want fiber, but they want fiber everywhere and at prices affordable to the vast majority of their citizens. Any city that can accomplish that understands that they will have a huge competitive advantage over cities without affordable fiber.

The report suggests that Seattle ought to ‘buy down’ the retail rate on a gigabit by paying for some of the network with property taxes. This is not a new idea, and there are a few small cities that have financed fiber this way. But nobody has ever tried it in a large city.

The report suggests buying the price of a gigabit down to $45 per month, a figure that is not cherry-picking and that a lot of homes can afford. That kind of price certainly would put a whole different set of competitive pressures on the incumbents. I can imagine them screaming and probably suing a city that tries this. But if this were done through a referendum and people voted for it, almost no court would overturn a vote of the people. I don’t know if this idea can work in a large city, but it’s the first idea I’ve heard that deals with the issues I’ve outlined above.
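For a feel of the mechanics, here is a back-of-the-envelope sketch of the buy-down idea. Every figure in it is hypothetical: a $600M citywide build, 30-year bonds at 4%, and 300,000 households sharing the cost through property taxes.

```python
# Back-of-the-envelope 'buy-down' math. All inputs are hypothetical.
def annual_bond_payment(principal: float, rate: float, years: int) -> float:
    """Standard level-payment amortization formula."""
    return principal * rate / (1 - (1 + rate) ** -years)

build_cost = 600_000_000        # assumed citywide fiber construction cost
payment    = annual_bond_payment(build_cost, 0.04, 30)   # ~$34.7M per year
households = 300_000            # assumed number of households sharing the tax

per_home_month = payment / households / 12               # ~$9.64 per month

print(f"Annual debt service: ${payment / 1e6:.1f}M")
print(f"Property-tax buy-down per household: ${per_home_month:.2f}/month")
```

Under these made-up numbers, spreading roughly $10 a month across every household is the kind of subsidy that could let a network charge subscribers $45 instead of $70.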

The Open Compute Project

I wrote recently about how a lot of hardware is now proprietary and that the largest buyers of network gear are designing and building their own equipment and bypassing the normal supply chains. My worry about this trend is that all of the small buyers of such equipment are getting left behind and it’s not hard to foresee a day when small carriers won’t be able to find affordable network routers and other similar equipment.

Today I want to look one layer deeper into that premise and look at the Open Compute Project. This was started just four years ago by Facebook and is creating the hardware equivalent of open source software like Linux.

Facebook found themselves wanting to do things in their data centers that could not be satisfied by Cisco, Dell, HP, or the other traditional vendors of switches and routers. They were undergoing tremendous growth, and their traffic was increasing faster than their networks could accommodate.

So Facebook followed the trend set by other large companies like Google, Amazon, Apple, and Microsoft, and set off to design their own data center and data equipment. Facebook had several goals. They wanted to make their equipment far more energy efficient, because data centers are huge generators of heat and Facebook was using a lot of energy to keep servers cool and wanted a greener solution. They also wanted to create routers and switches that were fast, yet simple and basic, and they wanted to control them with centralized software – which differed from the market norm of building the brains into each network router. This made Facebook one of the pioneers in software-defined networking (SDN).

And they succeeded; they developed new hardware and software that allowed them to handle far more data than they could have with what was on the market at the time. But then Facebook took an extraordinary step and decided to make what they had created available to everybody else. Jonathan Heiliger at Facebook came up with the idea of making their hardware open source. Designing better data centers was not a core competency for Facebook, and he figured that the company would benefit in the future if other companies joined them in searching for better data center solutions.

This was a huge contrast to what Google was doing. Google believes that hardware and software are their key differentiators in the market, and so they have kept everything they have developed proprietary. But Facebook had already been using open source software and they saw the benefits of collaboration. They saw that when numerous programmers worked together the result was software that worked better, with fewer bugs, and that could be modified quickly, as needed, by bringing together a big pool of programming resources. And they thought the same thing could happen with data center equipment.

And they were right. Their Open Compute Project has been very successful and has drawn in other large partners. Companies like Apple, HP, and Microsoft now participate in the effort. It has also drawn in large industry users like Wall Street firms who are some of the largest users of data center resources. Facebook says that they have saved over $2 billion in data center costs due to the effort and their data centers are using significantly less electricity per computation than before.

And a new supply chain has grown around the new concept. Any company can get access to the specifications and design their own version of the equipment. There are manufacturers ready to build anything that comes out of the process, meaning that all of the companies in this collaborative effort have bypassed the traditional telecom vendors and work directly with a factory to produce their gear.

This effort has been very good for these large companies, and good for the nation as a whole because through collaboration these companies have pushed the limits on data center systems to make them less expensive and more efficient. They claim that for now they have leapt forward past Moore’s law and are ahead of the curve.

But as I wrote earlier, this leaves out the rest of the world. Smaller carriers cannot take advantage of this process. Small companies don’t have the kind of staff that can work with the design specs, and no factory is going to make a small batch of routers. While the equipment designs and controlling software are open source, each large member is building different equipment and none of it is available on the open market. And small companies wouldn’t know what to do with the hardware if they got it, because it’s controlled by open source software that doesn’t come with training or manuals.

So smaller carriers are still buying from Cisco and the traditional switch and router makers. The small carriers can still find what they need in the market. But if you look ten years forward this is going to become a problem. Companies like Cisco have always funded their next generation of equipment by working with one or two large customers to develop better solutions. The rest of Cisco’s customers would then get the advantages of this effort as the new technology was rolled out to everybody else. But the largest users of routers and switches are no longer using the traditional manufacturers. That is going to mean less innovation over time in the traditional market. It also means that the normal industry vendors aren’t going to have the huge revenue streams from large customers to make gear affordable for everybody.