Making Money With Home Automation

At CCG we are always telling carriers that they need to find products to replace cable TV and voice, both of which are slowly losing customers. One of the products worth considering for carriers with a sizable residential base is home automation.

What we hear is that once homeowners learn what home automation can do for them they want this product. But there are a lot of moving parts to the product. There are hardware costs to cover along with numerous home visits needed, so it’s not a product that is automatically going to make money unless you do it right. Here are things to think about when considering the product:

  • You must be willing to cross the threshold. This product requires you to routinely go into customer homes. As an industry we have spent a decade looking for ways to reduce truck rolls and this product increases truck rolls as a routine part of the product. Your pricing must embrace that concept so that you are recovering a lot of your technician time.
  • One way to look at this product is that it gives you the opportunity to cross-sell other telecom products. We are told by some clients that the cross-sales are worth far more than the margin on home automation.
  • There are not likely to be any industry standards for a long time, if ever. This means that you need to decide what devices you will and will not support. We think the right strategy is to define the list of things that you will automate, and even then that you only deal with monitoring units that are part of your suite of products. Otherwise customers will always be buying crazy off-the-shelf things (like an egg tray that tells you how many eggs are left) and expect you to somehow tie them into your system.
  • You must recover equipment costs. You need an initial installation fee plus some portion of the monthly fee on a term contract aimed at recovering the cost of the equipment. Base hub units for home automation are going to cost you from $150 to $200, and there is a wide array of monitors that can be added to the system to automate things like watering systems, thermostats, fire detectors, music systems, lighting, etc.
  • This industry and the product are always going to be changing. Home automation is the first small step into the Internet of Things, and by becoming the trusted vendor today you have a leg up in that market when it gets more mature. The downside is that the technology will be changing quickly, so you are going to have to always be looking at different and newer monitors and devices to support with the product. But your customers will want many of the new things that will be coming along, and so you will have a continuous opportunity to upgrade and add on to customer systems.
  • You must manage customer expectations. There are three components to pricing the product – installation, equipment and ongoing maintenance. We think customers are going to get excited about this product. Once it’s installed and you have automated their sprinkler system and window shades they are going to want you to keep coming back to update more things over time. So your pricing needs to make very clear what is included in your base fee, and what costs extra. We suggest that you offer pricing plans that include some set number of visits. For instance, a base plan might mean that all future visits to the home are for a fee. But you might then also sell plans that include two, four or six visits a year, where the customers pay for these visits as part of their monthly fees. That kind of pricing will stop customers from calling you to visit every time they think of a new home automation device to add or a refinement to make with the existing equipment. Without managing expectations in this manner you will find yourself making a lot of unpaid trips to customers.
  • Bundle the product. It’s a natural to bundle home automation with home security, but you could bundle it with anything else your customers want. The whole point of this product is to use it as a platform to get your customers to buy multiple products from you.
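The cost-recovery bullet above is easy to sanity-check with a quick sketch. The figures below (hub cost, installation fee, monthly margin earmarked for equipment) are illustrative assumptions, not recommended prices:

```python
import math

def months_to_recover(hub_cost, install_fee, monthly_margin):
    """Months of monthly margin needed to cover the unrecovered hub cost."""
    remaining = hub_cost - install_fee
    return 0 if remaining <= 0 else math.ceil(remaining / monthly_margin)

# A $200 hub, a $50 install fee, and $10/month earmarked for equipment
# means a term contract should run at least this many months:
print(months_to_recover(200, 50, 10))  # 15
```

The point of the exercise is that the term contract, not the installation fee alone, is what makes the equipment economics work.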

If They Can Do It

I think everybody agrees that we don’t have enough last mile fiber infrastructure in this country to bring high-speed internet to the homes and businesses that want it. For various reasons the large telecom companies have decided to not invest in rural America, or even in any town or neighborhood where it costs more than average to build fiber.

Meanwhile we have a lot of municipalities that are interested in getting fiber to their communities but don’t know how to get it financed. Lastly, from what I read, we have tens and maybe hundreds of billions of dollars of potential investment money on the sidelines looking for solid investment opportunities. It seems like there ought to be an easier way to pull these things together because all of the components are there to make this work – the customer demand, the community desire and the money needed to get it done.

One has to look north to Canada to see an economic model that offers an answer – public private partnerships (P3s). For an example of a working investment model, consider Partnerships British Columbia. The venture was launched in 2002 and is wholly owned by the province. It’s operated by a Board made up of both government and private sector members.

The primary function of Partnerships BC is to analyze potential infrastructure projects and then attract private capital to get them constructed. The benefits to the province are tremendous. First, Partnerships BC stops the government from undertaking projects that don’t make financial sense. After all, governments often like to build ‘bridges to nowhere’. Partnerships BC will analyze a project and make certain that it will cash flow. They then attract private bidders to build and operate the most financially attractive projects.

They bring private money in to take the place of tax money and in British Columbia this is getting things like hospitals and roads built without putting all of the burden on the taxpayer. Additionally, projects built with private money cost less to build. We know all of the horror stories of cost overruns on government-built projects and that doesn’t happen when there are financial incentives for the private entities to build things efficiently. In fact, a few hospitals were built so far ahead of schedule in BC that the province was not ready for them.

One of the biggest complaints against government fiber projects in the US is that government has no business operating a competitive business. The P3 model eliminates that complaint as well since it attracts qualified operators to build and run projects. The telephone companies in the US should all be in favor of having a P3 structure here since it would help them to finance new fiber projects. Smaller and mid-sized telecoms have always had trouble finding capital and a P3 fund would bring money that might not be found elsewhere.

And of course, while I have a bias towards funding and building fiber projects, a state P3 fund would fund almost any infrastructure project that has a demonstrable cash flow such as hospitals, water systems, roads, railroads and bridges. I keep reading that we have a multi-trillion dollar infrastructure need in the country which is far greater than the combined borrowing capacity of governments. So we need to wake up and look north and use the P3 idea along with any other idea that will let us dig ourselves out of our infrastructure hole. America is crumbling and P3s are one part of the solution that could be implemented immediately.

The Gigabit Divide

We all know what the digital divide is – it’s when one place or demographic has broadband when those nearby do not. The term was originally coined after DSL and cable modems came to urban areas while rural America was left with dial-up access.

Over the years the definition has stayed the same but the circumstances have changed. For example, there are still millions of households in the country stuck with dial-up or satellite broadband. But most of the digital divide today is an urban / rural divide, where the telecom companies have invested in much newer and faster technology in urban areas and have ignored rural areas. Metropolitan areas all over the country now have at least some 100 Mbps cable modems while surrounding smaller towns often still get maybe 3 Mbps. And there is an economic digital divide within cities, where some neighborhoods, particularly richer ones, get better infrastructure than poorer ones.

But we are about to embark on the most dramatic divide of all, the gigabit divide. I spent last week in Austin and they are a good example of what I fear will be happening all over the country. There are three companies building gigabit fiber in Austin – Google, AT&T and Grande. None of them are going to build everywhere. For instance, Google will only build to a ‘fiberhood’ where enough people in an area pre-sign with Google. And the other two carriers are going to do something similar and carve out their parts of the market.

This is great for those who get fiber. They will end up with the fastest fiber connections in the world, and hopefully over time that will make a big difference in their lives. But my concern is that not everybody in Austin is going to get fiber. To see how this works we only have to look at Verizon FiOS. For years Verizon built to the neighborhoods with the lowest construction costs. That meant, for example, that they would favor an older community with aerial cable that could be over-lashed over a newer community where everything was buried and construction costs were high.

You find a real hodge-podge when you look closely at FiOS – it will be on one street and not the next, in one neighborhood and not the adjoining one. And Austin is going to be the same way. These three carriers are not going to all overbuild the same neighborhoods because in a competitive 3-way overbuild none of them will make money. Instead it is likely that Austin will get balkanized and chopped up into little fiberhoods for each of the three carriers.

But what about those that don’t get any fiber? There will likely be significant parts of the City where nobody builds. Those houses are going to be on the wrong side of the gigabit divide. Since most of the world is on the wrong side of the gigabit divide that doesn’t sound so bad. But think what it means. Who is going to buy a house in the future Austin that doesn’t have gigabit fiber? This is going to create a permanent and very tangible division of fiber haves and have-nots.

Cities used to protect their citizens against this sort of thing, which is why cable franchises were awarded locally – so that a City could make sure that everybody got served. But cities are embarrassingly falling over themselves for Google to the detriment of many of their own citizens. They are going to take care of the richer neighborhoods at the expense of the poorer ones. This is not what cities are supposed to do, since they represent all of their citizens. We have had processes in place for years to make sure that telecom companies don’t bully and divide our communities, and now City Hall is at the front of the line inviting them to do so.

I say shame on Austin if they wake up five years from now and find that 20% or 30% of their City doesn’t have fiber and is being left far behind. The houses and businesses in those neighborhoods will have lost value and will probably be the seeds of the slums of the future. When we look back twenty years from now I think we’ll see that this short-sighted policy to bow to Google cost the City more money than it gained.

The Battle for the OTT Box

Amazon this week finally announced the fireTV, an OTT set-top box. It’s been rumored for years that they would launch one, and considering how popular Amazon Prime is it’s surprising how long it took for them to do this. But this announcement highlights the giant battle going on in households for control of the OTT market.

And of course, boxes aren’t the only way for homes to get OTT content. In my house we don’t have a television and we watch our content on PCs, laptops, tablets and smartphones. This works for us. And then there are smart TVs. There are decent smart TVs from LG, Samsung, Sony, Toshiba, Philips and Panasonic. Each of these comes with a different system for giving access to web channels. The best of them offer a lot of customization to make your line-up what you want to watch. All come with some modest amount of web gaming.

But the big battle today is with the boxes, primarily between the new Amazon fireTV, Apple, Roku and Google’s Chromecast dongle. This wide array of options must have the average household scratching their heads. Every box is different in look and feel, price and features. They vary widely in what you can watch and in the ease of using their interfaces. And they are all hoping to control a large chunk of the market.

The Amazon fireTV is an interesting platform. They have built in 2 GB of RAM and a dedicated graphics processor. With an add-on $40 game controller this is going to give them the ability for higher quality gaming than the other boxes, although nowhere near the capability of the dedicated game platforms. For non-hardcore gamers who just want to play games on their big screen it should be a good alternative to buying an expensive gaming box.

Many of the boxes now have voice activation. With smart TVs you normally have to shout across the room to the TV and this is widely reported to be clunky. The fireTV puts the voice control in the remote. Roku 3 has taken the path of making their remote motion controlled.

The real competition between boxes comes with the programming choices they have built in to the channel line-up. For example, the Roku 3 line-up has grown to over 1,000 channels and apps. The Amazon fireTV is launching with only 165 and has some clear major omissions such as HBO Go. But one has to suspect those deals will all be made and that they will quickly catch up.

And perhaps the real winner will be the box and company that finally makes a deal for some regular programming to go along with the OTT content. The first one that can bring in the network channels, HBO without a landline subscription and popular programming like ESPN and Disney could be a major competitor to cable companies. Recently an email from Steve Jobs right before he died showed that Apple was hoping to add this kind of content when they release the next generation of Apple TV, and it might be the lack of such deals that has held off that release.

The Amazon fireTV was announced at a price of $99, the same as the Roku and the Apple TV, although both of those are widely available today for around $95. The Google Chromecast is available today for only $35. I have to be honest and say that if I buy a TV, which I am considering, I will have a hard time making a choice between these options. I read a lot more about this stuff than the average household and it makes me wonder how people make such a choice. They probably just go with the brand that they feel the most comfortable with rather than making the hard side-by-side comparisons.

More on White Space Wireless

Last July I wrote about the Google database that shows the availability of white space radio spectrum in the US. This is spectrum that has been used for years by UHF television stations. In some rural places it was never used and in others it has been freed up as stations have moved elsewhere.

I’ve been hearing about this spectrum a lot lately so I thought I’d talk a little more about it. There are now several trials of the spectrum going on in the US. The first test market was the City of Wilmington, NC, which implemented this in its municipal network in 2010. The city uses it to control traffic lights, for public surveillance cameras and for other municipal purposes. Probably the biggest US test so far is a campus-wide deployment at West Virginia University in Morgantown that launched in July 2013. There appear to be only a few dozen of these trials going on worldwide.

So what are the pros and cons of this technology and why isn’t it being deployed more? Consider some of the following:

  • It’s not available everywhere. That’s why Google and others have put together the maps. Where there are still TV stations using some of the bandwidth, only the unused portion of the spectrum is available. There are still large areas around most major metros where some of the spectrum is in use.
  • This is still provisional spectrum and the FCC has to approve each trial use. I’m not sure why this is taking so long, because the Wilmington test has been going on since 2010 and supposedly has no interference issues. But I guess the FCC is being very cautious about letting WISPs interfere with television signals.
  • We are at that awkward point that happens with all new uses of spectrum, where there is equipment that will work with the spectrum, but that equipment won’t get really cheap until there is a lot of demand for it. But until that demand is believed by a manufacturer, not much happens. It was this equipment cost barrier that killed the use of LMDS and MMDS spectrum in the 90s. There is no equipment on the market yet that would let white space be used by laptops, cell phones or tablets. Instead it must feed a traditional WiFi router.
  • One use of the spectrum is that it can make a better hotspot. I don’t think most people understand the short distances that can be achieved with hotspots today. A 2.4 GHz WiFi signal can deliver just under 100 Mbps out to about 300 feet. But it dies quickly after that – there may be 30 Mbps left at 600 feet and nothing much after that. If white space receivers were put into laptops, this spectrum could deliver just under 50 Mbps out to 600 feet and 25 Mbps out to 1,200 feet. And there is an additional advantage to white space in that it travels fairly freely through walls and other barriers.
  • The real potential for the spectrum is to extend point-to-multipoint radio systems. With white space you can deliver a little less than 50 Mbps up to about 6 miles from the transmitter. That’s easily twice as far as the distances that can be achieved today using unlicensed spectrum, and a 12-mile circle around a transmitter can make for viable economic returns on an investment. Physics limits this to about 45 Mbps of total bandwidth, meaning that a product of 40 Mbps download and 5 Mbps upload is possible. That is certainly not fiber speeds, but it would be a great rural product. The problem comes in the many places where part of the spectrum is still in use; in those places the radios have to work around the used spectrum and the speeds are correspondingly slower.
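The economics of that doubled reach can be sketched roughly: coverage area grows with the square of the radius, so doubling the distance quadruples the potential subscriber pool. The household density and take rate below are made-up assumptions for illustration only:

```python
import math

def potential_subscribers(radius_miles, homes_per_sq_mile, take_rate):
    """Rough subscriber estimate for a circular coverage area around one transmitter."""
    area = math.pi * radius_miles ** 2  # square miles covered
    return int(area * homes_per_sq_mile * take_rate)

# White space reaches ~6 miles vs ~3 miles for unlicensed spectrum,
# so the covered area (and subscriber pool) is roughly 4x larger.
rural_density = 10   # homes per square mile (assumption)
take_rate = 0.30     # share of homes that subscribe (assumption)
print(potential_subscribers(6, rural_density, take_rate))  # 339
print(potential_subscribers(3, rural_density, take_rate))  # 84
```

That 4x difference in addressable homes per transmitter is what can turn a marginal rural business case into a workable one.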

It seems like this is a spectrum with a lot of potential, especially in rural places where there are no existing uses of the spectrum. This could be used for new deployments or for supplementing existing WiFi deployments for WISPS. There is equipment that works on the spectrum today and I guess we are now waiting for the FCC here and regulatory bodies around the world to open this up to more use. The US isn’t the only place that used this spectrum for TV and much of the rest of the world shares the same interference concerns. But if this is ever released from the regulatory holds I think we would quickly hear a lot more about it.

A Great Time to Love Technology

I love science and technology and I read dozens of different on-line publications to see what is going on in the telecom and related science worlds. I find something interesting almost every day in my reading because there is so much R&D happening around the world. But last Sunday in a very short period of reading I was struck by the sheer volume of new stuff that scientists and engineers are working on. I want to share some of what I found in one short hour of reading as a way to highlight how amazing the world is right now. I read recently that all of human knowledge is now doubling every two years and I can believe it.

This article talks about how slime molds can be used as a more efficient way of designing complex circuits. I’ve done a lot of hiking in my life and slime molds are those veiny orange-yellow molds that you find growing in dark damp places in old woods. But it turns out that slime molds ‘move’ by following nutrients, and they can be tricked into tracing the most efficient paths to take in a circuit. Now that we are trying to make smaller, faster and more energy efficient chips and boards, every breakthrough like this helps.

There is also now a technology for building a circuit that can be applied to your skin like a heat-transfer tattoo. This means that almost any technology that can be put onto a small circuit could be cheaply adhered to your skin for personal use. The initial application is probably going to be medical, like the sensors people wear for sleep tests. But soon you may be able to apply a cell phone or a host of other electronics to your skin to wear while you go hiking or running.

How about a biodegradable battery that can melt inside your body? Currently, when you need a medical device that needs power it means an operation to put it in and another to take it out. But this concept means that a wider array of devices can be implanted for things like treating cancer that will just be absorbed by the body when they are done.

In Australia they have come up with a robotic kangaroo that can recapture much of the energy used to propel it. And that recapture of energy is the breakthrough because it means that we can build robots and other machines that can be made to need far less external power to operate. Obviously we can’t build a perpetual motion machine, but this is getting closer.

There is a new 3D printer that only costs $200 and that can print a coffee mug in half an hour. That is getting to the point where people can actually 3D print things they need like replacement parts for an appliance. You don’t hear a lot about this, but 3D printing frees the world from reliance on factories and might be the most transformational technology out there soon.

And finally, there is now a working model of a battery charger that uses biological semiconductors that can charge a smartphone in 30 seconds. This kind of technology could finally free us from worrying about keeping our portable devices charged since things could be recharged in the time it takes to go get a cup of coffee.

These are just the wow sort of technology things that I found in an hour of reading. In addition to these items I ran across a long list of more telecom-specific things that happened last week, including things like Sprint offering $650 to move large family plans. Xbox is going to be putting out interactive programming where you can see the show from the perspective of different characters. Qualcomm is coming out with really fast new chips that are going to make for blazingly faster smartphones next year (I made a note to hold off on buying a new phone this year). Leaked documents show that Google has plans for a major Android TV. It is a great time to be a techie, for sure.

The Skinny on U.S. 4G Data Speeds

I am a statistics freak and I read any and all statistics I can find about the telecom industry. A lot of statistics are interesting but require a lot of heavy lifting to see what is going on beneath the numbers. But I ran across one set of statistics that sums up the problems of wireless 4G data in this country in a few simple numbers.

A company called OpenSignal has an app that people can use to measure the actual download speeds they see on LTE 4G networks. This app is used worldwide and so we can also compare the US to other parts of the world. In 2014 the comparisons were made from readings from 6 million users of the app.

The first interesting statistic is that the US came in 15th in the world in LTE speeds. In 2014 the US average download speed was a paltry 6.5 Mbps across all US downloads using 4G. At the top of the chart was Australia at 24.5 Mbps, Hong Kong at 21 Mbps, Denmark at 20.1 Mbps, Canada at 19.3 Mbps, Sweden at 19.2 Mbps and South Korea at 18.6 Mbps. Speeds drop pretty significantly after that, and for example Japan was at 11.8 Mbps. So beyond all of the hype from AT&T and Verizon touting their network speeds, they have not done a very good job in the US.

But the second statistic is even more telling. The speeds in the US dropped from 9.6 Mbps in 2013 to 6.5 Mbps in 2014. The US was the only country on the list of the top fifteen countries that saw a significant percentage drop from one year to the next. Sweden did have a drop, but only from 22.1 Mbps to 19.2 Mbps.
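Put in percentage terms, the two declines quoted above are not even close. A quick check using the figures in the text:

```python
def pct_change(old, new):
    """Percentage change from old to new (negative = decline)."""
    return (new - old) / old * 100

# Year-over-year LTE speed changes from the OpenSignal figures above:
print(f"US:     {pct_change(9.6, 6.5):.1f}%")    # about -32.3%
print(f"Sweden: {pct_change(22.1, 19.2):.1f}%")  # about -13.1%
```

A one-third drop in average speed in a single year is the real headline in the data.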

So what does this all mean? First, the drop in speed can probably best be explained by the fact that so many people in this country are using wireless data. Large numbers of users are obviously overwhelming the networks, and as more people use the wireless data networks the speeds drop. Our wireless networks are all based upon the total bandwidth capacity at a given cell site, and so to the extent that more people want data than a cell site is designed for, the speeds drop as the cell site tries to accommodate everybody.
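The shared-capacity effect can be modeled crudely: a cell site has a fixed pool of usable bandwidth, and every active user divides it. The capacity figure below is a made-up assumption for illustration, not a real sector rating:

```python
def per_user_speed(site_capacity_mbps, active_users):
    """Average download speed when a cell site's fixed capacity is split evenly."""
    return site_capacity_mbps / max(active_users, 1)

# A hypothetical LTE sector with 150 Mbps of usable capacity:
for users in (5, 15, 25):
    print(f"{users:>2} active users -> {per_user_speed(150, users):.1f} Mbps each")
```

Even this toy model shows how the same network can deliver 30 Mbps to a lone late-night user and single-digit speeds at a busy hour.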

But for the average 4G speed for the whole year to only be 6.5 Mbps there has to be a whole lot more to the story. One might expect Canada to be faster than the US simply because we have a lot more large cities that can put strains on wireless networks. But you wouldn’t expect that to make the Canadian 4G experience three times faster than the US experience. And there are very few places on earth as densely populated as Hong Kong and they have the second fastest 4G networks in the world.

It’s obvious from these numbers that the US wireless carriers are not making the same kinds of investments per customer as other countries are doing. It’s one thing to beef up urban cell sites to 4G, but if those cell sites are too far apart then too many people are trying to use the same site. I would have to guess that our main problem is the number and spacing of cell sites.

But we also have a technology issue and regardless of what the carriers say, there are a lot of places that don’t even have 4G yet. I don’t have to drive more than 2 miles outside my own town to drop to 3G coverage and then only a few more miles past that to be down to 2G. A few weeks ago I was in Carlsbad California, a nice town halfway between LA and San Diego and right on I-5. I couldn’t even find a 2G network there at 5:00 in the evening, probably due to all of the traffic on the interstate.

I hope the FCC looks at these kinds of statistics because they debunk all of the oligopoly hype we get from the wireless carriers. I laugh when people tell me they are getting blazing fast speeds on 4G, because it’s something I look at all of the time when I travel and I have never seen it. When I hear of somebody who claims that they are getting 30 Mbps speeds I know that they must be standing directly under a cell tower at 3:00 in the morning. I like speed, but not quite that much.

See You in Austin

Derrel Duplechin of CCG and I will be in Austin this week at the Broadband Communities Summit. We will be putting on a seminar on Wednesday afternoon on the topic of Revenues Beyond the Triple Play. If you happen to be coming to the convention we’d love to see you at the session, or look us up.

I feel lucky to have gotten this topic to discuss. If you have been reading this blog you know that we at CCG feel strongly that every triple play provider should be putting energy into developing new products. The revenues we derive from voice are continuing to decline and cable TV is headed down the same path. The time to react to this eventual train wreck is now, while you still have the margins from those products, and not wait until your cash is squeezed.

Every triple play carrier is going to face a pretty simple choice at some time in the near future – either retrench and become an ISP that sells nothing more than fast data pipes to your customers, or else start implementing new products to replace the sinking triple play products. If you choose to become a dumb pipe provider your future is really simple. You’ll need to strip out employees and systems and become a pure ISP and do nothing but provide the fastest pipe you can create.

If you elect to remain a full-service provider you have a much more challenging task. No one or two or even three products is going to replace the revenues and margins you have been getting from voice and cable. Rather than have a few products that most of your customers buy, you are going to need a lot of products that only have a 5% to 10% penetration. There are no more big magic bullets. I offered to help one company look at their future and was told that they would pay to have me come see them if I could tell them what the next big product is. That is exactly the wrong question to ask, because there isn’t going to be one. The small carrier industry has frankly gotten a bit spoiled in that we had products that were relatively easy to sell. But those days are over and we are going to have to do what many other businesses do and scramble for every customer and every dime we can make.
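The arithmetic behind "no magic bullets" is worth making explicit. Every dollar figure below is invented purely for illustration:

```python
import math

def products_needed(lost_margin_per_cust, penetration, margin_per_product):
    """Niche products needed to replace lost per-customer monthly margin,
    when each product only reaches a fraction of the customer base."""
    expected_per_product = penetration * margin_per_product  # margin averaged over all customers
    return math.ceil(lost_margin_per_cust / expected_per_product)

# Replacing $20/month of lost voice/cable margin with products that each
# reach 8% of customers at $25/month of margin (all assumptions):
print(products_needed(20, 0.08, 25))  # 10
```

Ten products to replace what one or two used to deliver – that is the scale of the product-development treadmill a full-service provider is signing up for.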

Both choices I have laid out are probably valid ones, and both are very different from what we do today. For instance, if you choose to be nothing more than an ISP you are going to have to dismantle most of your company and staff to stay profitable. It can be done, and if you want a model of what that looks like, look at the many WISPs in the marketplace today.

But if you choose the full service provider route what will you sell? There are a number of potential products you can sell today and many more coming in the future. Today you can consider products like security, energy management, home automation, wireless MVNO, IP Centrex and OTT Video. You can also do what we call crossing the threshold, meaning that you make a product out of having your technicians do whatever businesses need in the telecom and computer space. We know companies doing each of these products and they can all be moderately successful.

There are also a lot of interesting things coming. Home automation is the very first step of getting into the Internet of Things. This is going to quickly grow into areas like medical monitoring, crop monitoring, and flock and herd monitoring. And mostly, the things that are coming are things we haven’t yet thought of as carrier products.

The biggest challenge of transitioning to many new products is to figure out a way to be efficient with new product development. You can no longer take a year or two to put together a new product. You have to roll them out quickly and learn how to sell them efficiently. You will have to do this in-house or collaborate with other carriers. If you can figure this out you will probably thrive and survive. But if you don’t do anything and stay blindly on today’s path, at some point you will no longer be viable and will fail. Our industry has never faced such a divergent set of options and this is both a scary time and an exciting time to be in the business.

Bringing the Internet Everywhere

Facebook has announced that they are working to create platforms that will deliver Internet access to the large parts of the planet that don’t have it. They are considering several options including flying drones and satellites, both using infrared lasers to beam connections to users. This is being done under the name of the Facebook Connectivity Lab and includes experts from NASA’s Jet Propulsion Lab, the Ames Research Center and the National Optical Astronomy Observatory.

Google announced something similar last year with its Project Loon, which is looking to bring Internet access using large balloons, and has already run tests with the technology in New Zealand. The balloons fly high in the stratosphere, far above the weather, and can be positioned where needed by moving them higher or lower to catch wind layers that travel in the desired direction.

I find it fascinating that these companies see enough benefit to themselves to spend a huge amount of money to bring Internet access to people who can’t get it today. There are varying estimates of how much of the world has Internet access, but I’ve seen several that say half of the world still doesn’t have it. So these two companies have decided that they want to be the first to reach the remaining billions of people.

Certainly these companies both will benefit financially by having large numbers of additional customers. And one can certainly expect that if Google or Facebook brings Internet access to some remote area that the Internet access they deliver will be highly flavored by their brand name.

Both companies are actively engaged in gathering data about the world and the people who use their services. One can only imagine how much more they could learn if they are able to add an additional billion people to that data gathering effort.

Nobody was surprised when Google took on this effort, since they are famous for tackling one-off and interesting ideas. But I think a lot of people were surprised to see Facebook tackle this. This is not Facebook’s first foray into hardware, though. They have spent a lot of effort redesigning the servers used in data centers and have developed a new generation of servers that are faster yet need less power and generate less heat.

And Facebook is not just going after the unserved and rural third world. They are working on a project that will use solar-powered high-altitude planes that stay aloft for months over the suburbs of big cities.

And recently Facebook bought Oculus, a gaming company that specializes in virtual reality headsets. They have in mind to create an immersive Facebook experience. I don’t know about you, but I am not sure I am quite ready to get too immersed with my Facebook friends!

The FCC Grants More Wireless Spectrum

In several actions on Monday the FCC granted more wireless spectrum for use for WiFi and cellular data.

In the WiFi arena the FCC freed up two new bands of spectrum for use as WiFi. The specific bands that are now available to the public are the spectrum between 5470 – 5725 MHz and another band between 5725 – 5850 MHz. These two bits of spectrum were already surrounded by other WiFi spectrum and were referred to in the past as WiFi potholes. Filling them creates a continuous band of WiFi spectrum between 5150 – 5850 MHz.

The two new bands together add 380 MHz of new WiFi spectrum and continue the process the Commission started in 2010 in the National Broadband Plan Order, when it said it would find 500 MHz of new public spectrum.

Of course, there are no devices on the market capable of using these two pieces of spectrum immediately, but one would expect that devices shipped fairly soon will have the capability. Because these two pieces of spectrum were islands within a larger band of WiFi spectrum, it will be easier to include them than it was to exclude them. The new spectrum will make it that much easier to satisfy our insatiable appetite for WiFi for cellular offload and other mobile computing needs like watching video.

The FCC also announced that there will be an auction for new bandwidth that will be available to cellular carriers for 4G wireless. The three bands sit between 1695 – 1710 MHz, 1755 – 1780 MHz and 2155 – 2180 MHz. The FCC has labeled this new spectrum Advanced Wireless Services 3 (AWS-3).
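As a quick sanity check, the width of each AWS-3 band can be computed directly from the band edges listed above (a small sketch; the edge frequencies are taken from the FCC announcement, the function name is my own):

```python
def band_width_mhz(low_mhz: int, high_mhz: int) -> int:
    """Width of a spectrum band in MHz, from its lower and upper edges."""
    return high_mhz - low_mhz

# The three AWS-3 bands as announced: (lower edge, upper edge) in MHz.
aws3_bands = [(1695, 1710), (1755, 1780), (2155, 2180)]

widths = [band_width_mhz(lo, hi) for lo, hi in aws3_bands]
print(widths)       # [15, 25, 25]
print(sum(widths))  # 65 -> 65 MHz of new AWS-3 spectrum in total
```

So the auction covers one 15 MHz band and two 25 MHz bands, or 65 MHz of new cellular spectrum in all.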

There is a catch, though, with this spectrum: any cellular company that uses it has to share it with existing government wireless systems. Nationwide there are over 3,100 registered uses of this spectrum, ranging from Department of Homeland Security border surveillance to US Army tactical communications. The FCC is not planning to move most of these uses out of the spectrum but will instead expect any carrier that buys it to somehow coexist with the existing uses.

The major wireless carriers aren’t nuts about the idea but have agreed to run tests to see how they might share the frequency with the government. Obviously they would not share networks, but in areas of contention each would have authorized use of the spectrum somehow. Obviously the wireless carriers would love the spectrum cleared for their exclusive use, and there will be many markets where the spectrum is clear or mostly clear.

The FCC announced that it is going to auction off this spectrum in a mixture of large and small blocks, and in large and small geographic areas. They hope this will entice smaller regional carriers to go after spectrum for use in rural markets, and they plan on auction rules that give an edge in these small markets to the small carriers plus T-Mobile. The proceeds from this auction are aimed at helping to pay for FirstNet, the nationwide emergency response network. Let’s just hope that the bandwidth-sharing requirement doesn’t keep the big carriers from showing up to bid. They don’t always do so, as evidenced by the last auction, where the only bidder was Dish Networks at the required minimum bid.

Certainly these announcements are good news for anybody with a cell phone, and having additional WiFi spectrum is going to make us that much more ready for the Internet of Things. As a country we have an insatiable demand for wireless spectrum and this is one more step towards making enough spectrum available to keep us humming along.