New and Better WiFi

There are two new WiFi standards that will hit the market in the next few years: 802.11ac and 802.11ad. The two standards use different spectrum, with 802.11ac at 5 GHz and 802.11ad at 60 GHz. Both new WiFi standards will be able to deliver up to 7 gigabits per second, compared to today's WiFi (802.11n), which tops out at 600 megabits per second.

Looking at basic spectrum characteristics, there are four major differences in the way these two standards will use the spectrum: available bandwidth, propagation characteristics, antenna size and interference.

The maximum data speed that can be delivered over any radio spectrum is limited by the amount of spectrum used and by the signal-to-noise ratio, a limit defined by the Shannon-Hartley theorem. 802.11ac at 5 GHz can use about 0.55 GHz of spectrum, while 802.11ad at 60 GHz can use up to 7 GHz. 802.11ac has channels that are 160 MHz wide while 802.11ad will have channels that are 2,160 MHz wide. But the channels in 802.11ac can be bonded, which will allow it to deliver almost as much bandwidth as 802.11ad.
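As a rough illustration of the Shannon-Hartley limit, here is a minimal sketch that computes the theoretical capacity C = B × log2(1 + S/N) for the two channel widths; the 30 dB signal-to-noise ratio is an assumption chosen purely for illustration:

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_db):
        """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
        snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear ratio
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Assumed 30 dB SNR, for illustration only.
    for name, width_hz in [("802.11ac 160 MHz channel", 160e6),
                           ("802.11ad 2,160 MHz channel", 2160e6)]:
        print(f"{name}: {shannon_capacity_bps(width_hz, 30) / 1e9:.1f} Gbps ceiling")

The wider channel raises the theoretical ceiling in direct proportion, which is why the 60 GHz band can promise so much more raw speed per channel.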

802.11ac will use the same 5 GHz spectrum that is used by today's WiFi and will have similar propagation characteristics. But the 802.11ad spectrum at 60 GHz will not pass through bricks, wood or even paint, and thus this technology will be most useful as an in-room technology.

For these standards to achieve their full potential they need to transmit multiple signals, meaning they need multiple antennas. Antenna size is proportional to the wavelength being transmitted. A 5 GHz antenna has to be about an inch long and spaced at least an inch from its neighbors to be effective, while a 60 GHz antenna only needs to be about 1/10 of an inch long and 1/10 of an inch apart. This is going to make it much easier to fit 802.11ad into handsets and other small devices.
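Those antenna sizes fall straight out of the wavelength arithmetic. Here is a quick sketch, assuming the common rule of thumb that an effective antenna element is about a half wavelength long:

    C = 299_792_458  # speed of light in meters per second

    def half_wavelength_inches(freq_ghz):
        """Half-wavelength element size for a given carrier frequency."""
        wavelength_m = C / (freq_ghz * 1e9)
        return (wavelength_m / 2) * 39.37  # meters to inches

    print(f"5 GHz:  {half_wavelength_inches(5):.2f} inches")   # ~1.2 inches
    print(f"60 GHz: {half_wavelength_inches(60):.2f} inches")  # ~0.1 inches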

Finally there is the issue of interference. There is already a lot of usage in the 5 GHz band today. In addition to being used for WiFi, the spectrum is used for Doppler weather radar, and a few other channels in the band have been allowed for other uses. And so 802.11ac will have to work around the other uses in the spectrum. The 60 GHz spectrum range is mostly bare today, and since signals travel such short distances there should be very few cases of interference. However, multiple 802.11ad devices in the same room will interfere with each other to some extent.

The 802.11ac standard is pretty much set but won't be fully certified until 2014. However, there are already devices being shipped that include some of the features of the standard. For example, it's included in the Samsung Galaxy S4 and in MacBooks. But today's version uses beamforming to send the signal to one device at a time. Beamforming means that the signal is sent to one device from each separate antenna in an array, but at slightly different times.
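Here is a minimal sketch of the timing idea behind beamforming, assuming a simple uniform line of antennas: each antenna fires slightly later than its neighbor so the wavefronts add up in phase in the direction of the target device:

    import math

    C = 299_792_458  # speed of light, m/s

    def beamforming_delays(num_antennas, spacing_m, steer_angle_deg):
        """Per-antenna transmit delays (seconds) that steer a uniform
        linear array toward steer_angle_deg off broadside."""
        angle = math.radians(steer_angle_deg)
        return [i * spacing_m * math.sin(angle) / C
                for i in range(num_antennas)]

    # Four antennas an inch (2.54 cm) apart, aimed 30 degrees off center.
    for i, d in enumerate(beamforming_delays(4, 0.0254, 30)):
        print(f"antenna {i}: fire {d * 1e12:.0f} picoseconds late")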

Still to come is the best feature of 802.11ac, which is the ability to support separate sessions with different devices, with different priorities and different power needs. This feature is called multi-user MIMO and it will revolutionize the way that WiFi is used. For example, you will be able to make a WiFi voice call while another device simultaneously downloads a video. Your WiFi router will determine the location of each device it is talking to and will initiate a prioritized session with each; in this example it can give priority to the voice call.

The fully deployed 802.11ac will be the first generation of wireless that is getting ready for the Internet of Things. It will be able to communicate with multiple devices in the environment at the same time. It will turn smartphones and tablets into workhorses able to gather data from sensors in the environment.

802.11ad is going to be far more limited due to its inability to pass through barriers. The most likely use for the spectrum will be to create very high-speed wireless data paths between devices, such as connecting a PC or laptop to a wireless network. It should be able to achieve speeds approaching 7 Gbps with only one device and one path in play.

One would expect devices using these two technologies to become widespread by 2016 or 2017. Certainly in the telecom industry an upgrade to 802.11ac will allow carriers to deliver more bandwidth around a home or office and handle multiple sessions with wireless devices. This new technology is a fork-lift upgrade and is not backwards compatible with earlier WiFi devices, which means it will take some time to break into the environment since all of the local wireless devices will need to be upgraded to the new standard. One would expect first generation 802.11ac routers to still include 802.11n capabilities.

The Battle of the Boxes

For years we’ve been told that the day was coming when we would be able to get rid of the settop boxes supplied by the cable company and instead use our own smart devices to receive cable TV. A number of years ago the FCC tried to promote this with its cable card order that said that customers must be allowed to bring their own devices and that the cable companies then had to give them a discount for doing so. But cable cards were a massive failure and only a very small percentage of customers went through the hassle of trying to use their own settop boxes.

And then we heard a lot of talk about how TVs were going to get smarter and that we would be able to plug our cable into the back of the TV and eliminate the settop box. And that actually worked for a few years. But then cable companies started converting their systems to all-digital to make more room for faster cable modems, and analog transmissions are quickly becoming a thing of the past.

So we are no closer today to being able to bring our own smart box to the game and almost every home still has a settop box or a DTA (Digital Television Adapter) for which the cable company charges them a fee of around $5 or more per month.

Meanwhile there are a host of new boxes in the world that are designed to help customers bring the Internet and its many programming options to the TV. Among these are Roku, Apple TV and the Sony PlayStation. There are a number of households that are using these boxes to replace the cable company altogether and are settling for the programming that can be found on the web. These boxes let people subscribe to services like Netflix, Hulu or Amazon Prime, which are much cheaper than the typical cable subscription.

Time Warner is taking an interesting approach to the battle of the boxes. In March they announced a deal to allow people to use a Roku box in place of a Time Warner settop box. In June they announced a deal that allows customers to use high-end Samsung TVs without a settop box. And it was reported last week that they are making a deal for people to use Apple TV in place of their settop box.

Time Warner is doing this by developing a specific app that works on each device. A customer can download an app that lets the Roku box mimic the Time Warner settop box and save the monthly fee. It's reported that the app is not as good as the real thing, with a more limited line-up and somewhat poorer reception than an actual settop box provides. But Time Warner sees some advantages to this arrangement. While they lose the typical $5 per month charge for the settop box, they also get out of all of the obligations that go with providing settop boxes. No cable provider likes being in the settop box business. The boxes require truck rolls to install and sometimes to retrieve. They break and must be replaced. And a surprising number of people pack up their boxes and take them along when they move. Cable companies are probably net winners by getting out of the settop box business.

But I see a few problems with Time Warner’s approach. First, Time Warner is headed down a path that is going to make their software life complicated over time. Soon they will have deals that require them to supply apps for three different boxes. But over time that number is going to mushroom. There will eventually be many generations of Roku and Apple TV and every other current box as they get updated and outdated. And over time there will be dozens, if not hundreds of devices that will be able to get TV signal onto a TV. Looking into the future five or ten years I see Time Warner’s strategy getting very complicated.

But the biggest danger I see is that Time Warner's strategy is inviting the fox into the henhouse. Do they really want to encourage customers to use boxes that bring Netflix and Hulu into the house and make it easier for customers to cancel or downgrade their Time Warner cable TV service? Obviously some people are going to buy these boxes anyway, but should the cable company be promoting a box that makes it easier to bypass them? It seems like a risky bet to me.

Even if Time Warner is onto something, this solution is not for everybody. Certainly the handful of other large cable companies could follow suit, but it's hard to see this working for smaller cable companies. And this solution won't work at all for companies that deliver IPTV over DSL or fiber like Verizon, AT&T, municipalities and hundreds of independent telephone companies and small CLECs. The IPTV stream requires a proprietary device to descramble the signal (and scrambling for IPTV is required in the contracts with the content owners), and so these providers cannot move customers to alternate boxes.

Time Warner’s approach is unique and we will have to see if any other cable companies follow them. This is a home run for the box makers, but I’m not so sure that Time Warner wins too.

Spectrum Winners and Losers

AT&T posted a short statement on their public policy blog called 'Inconvenient Facts and the FCC's Flawed Spectrum Screen'. In that blog post they complained that the FCC had failed to apply the spectrum screen to Softbank's acquisition of Sprint and Sprint's acquisition of the rest of Clearwire. And AT&T is right. The FCC has been incredibly inconsistent in the way it looks at wireless acquisitions and mergers.

So what is the spectrum screen? The spectrum screen is a set of internal rules at the FCC used to determine if any wireless carrier owns too much spectrum in a given market. Historically the FCC had a generic rule that said no one company could own more than one-third of the spectrum usable for wireless in a given geographic area. This screen was applied both to attempts by wireless carriers to buy new spectrum and to mergers between wireless carriers.
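As a minimal sketch of how such a screen works (the function and figures below are illustrative, not the FCC's actual methodology), the test boils down to comparing a carrier's post-deal holdings in a market against one-third of the usable spectrum:

    def fails_spectrum_screen(holdings_mhz, market_total_mhz, deal_mhz=0):
        """Flag a deal that pushes one carrier past one-third of the
        usable wireless spectrum in a market (illustrative only)."""
        return holdings_mhz + deal_mhz > market_total_mhz / 3

    # Example: 285 MHz of usable spectrum puts the screen at 95 MHz.
    print(fails_spectrum_screen(80, 285, deal_mhz=20))  # True: 100 > 95
    print(fails_spectrum_screen(80, 285, deal_mhz=10))  # False: 90 <= 95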

The FCC has been very inconsistent in the way they apply the existing screen. Last September they announced that they were going to look at the way the spectrum screen ought to work. But meanwhile, during the last year the screen has been applied (or ignored) in the following ways:

  • When the FCC looked at the proposed AT&T / T-Mobile merger they rejected it in part because they said the acquisition would fail the screen test in 274 CMAs covering 71 of the top 100 markets and 66% of the US population. However, the FCC fudged the spectrum screen in coming up with those numbers. At the time the spectrum screen set the maximum amount any one carrier could own in one market at 95 MHz, one-third of the spectrum available for wireless carriers, but in reaching their conclusion the FCC lowered that threshold to 90 MHz. That might not sound like a big difference, but had the proper 95 MHz threshold been used, the number of markets affected by the merger would have dropped by 84, shrinking the overall problem to less than 50% of the top 100 markets and 50% of the US population. That is still a lot of places where the proposed merger would have failed the spectrum screen, but AT&T had announced plans to divest spectrum as needed to meet the FCC test. The FCC made this change in the spectrum screen without any public input.
  • When Verizon acquired spectrum in the 1.7 to 2.1 GHz band the FCC counted it fully against their spectrum screen. They did the same when AT&T acquired 2.3 GHz spectrum.
  • And then there is the recently announced approval for Softbank to acquire Sprint and Clearwire spectrum. The Clearwire spectrum at 2.5 GHz is right next to the 2.3 GHz spectrum recently acquired by AT&T. While the FCC fully counted the spectrum AT&T purchased against the spectrum screen, in the Softbank acquisition the FCC counted only 55.5 MHz of the Clearwire spectrum against Softbank's screen, even though there is an average of 140 MHz available in most Softbank markets.

So AT&T has a legitimate gripe. The FCC seems to apply the spectrum screen to get the results they want. It looks a lot more like the FCC is picking market winners and losers than protecting the public. The spectrum screen was established in the first place to promote competition. The FCC wanted to make sure that a given carrier did not get so much spectrum in a major market that they could effectively close out competition. They also didn't want carriers to be able to hoard spectrum for future use. But the FCC no longer seems to be using market protection as the criterion for deciding who can and cannot merge.

It’s clear that the FCC didn’t want AT&T and T-Mobile to merge. They thought that it was bad for competition to lose one of the major carriers in the country. But it was wrong for them to fudge the spectrum screen as a way to justify their position rather than just oppose the merger on pure competitive grounds.

And in the case of Softbank they are going in the opposite direction. They obviously want a new competitor to AT&T and Verizon and they are ignoring the spectrum screen to make sure that happens.

Why does all of this matter? Like anything else, it's a matter of money. Wireless carriers have two ways to address congestion. They can add more cell sites, closer and closer to the old ones; spectrum is reusable, and each new cell site uses the original spectrum afresh. The other solution is to layer new spectrum onto a crowded area so that no new cell sites need to be constructed. That is much cheaper than building cell sites, and so carriers want more and different spectrum in major markets to meet the seemingly insatiable and rapidly growing demand for mobile data.

The issue is going to get a lot worse. President Obama announced a new policy that will release up to 500 MHz of new spectrum for wireless use over the next five years. So there is going to be a new land grab by all of the carriers and the FCC needs to get ready.

It just seems to me like the FCC needs to toss out the spectrum screen and come up with a new way to determine the right amount of competition. In the two biggest merger cases before them in the last few years they blatantly ignored their own spectrum screen rules to get the result they wanted. That is evidence enough that we need to stop having the fiction of a spectrum screen. If the FCC wants to be in the game of picking market winners and losers they just need to be upfront about it.

G.Fast

You are going to start hearing about a new technology that may breathe some life back into existing copper networks. The technology is being referred to as G.Fast, and it promises to deliver speeds up to a gigabit over copper for very short distances.

Some are referring to G.Fast as a last mile technology, but it is really a drop technology. The distances the technology supports are so short that it will require fiber to the curb, or as some are now calling it, fiber to the distribution point.

Alcatel-Lucent and Telekom Austria just announced a field trial of G.Fast. That trial achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters for brand new copper. On older copper the speed dropped to 500 Mbps for 100 meters.

Current copper technologies use only a small portion of the theoretical bandwidth available on a copper wire. For example, most VDSL2 systems deployed today use up to 17 MHz of spectrum on the copper. G.Fast can provide faster speeds by using more of the available spectrum, somewhere between 70 MHz and 140 MHz. Plus G.Fast will be more efficient. Today DSL works by dividing the data path into subchannels that each carry about 15 bits of data per symbol. Engineers are looking at coding and modulation techniques that will increase the bits per subchannel for G.Fast and thus increase speeds further.
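As a back-of-the-envelope sketch of why more spectrum and more bits per subchannel multiply together, consider the basic DMT arithmetic; the tone spacing, symbol rate and the 106 MHz band below are assumptions chosen for illustration, not published G.Fast parameters:

    def dmt_ceiling_mbps(bandwidth_hz, tone_spacing_hz, bits_per_tone, symbols_per_sec):
        """Rough DMT throughput ceiling: tones x bits per tone x symbol rate."""
        tones = bandwidth_hz / tone_spacing_hz
        return tones * bits_per_tone * symbols_per_sec / 1e6

    # Assumed VDSL2-style 4.3125 kHz tones at 4,000 symbols per second.
    print(f"17 MHz at 15 bits/tone:  {dmt_ceiling_mbps(17e6, 4312.5, 15, 4000):,.0f} Mbps")
    print(f"106 MHz at 15 bits/tone: {dmt_ceiling_mbps(106e6, 4312.5, 15, 4000):,.0f} Mbps")

These are theoretical ceilings; real loops carry fewer bits on most tones, which is why the field trial numbers above are lower.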

G.Fast will also benefit from an existing technique called vectoring. This technology is used today with VDSL2 and eliminates crosstalk interference between copper pairs. It does this by monitoring the noise on the copper and then creating an anti-noise signal that cancels it, in the same way as noise-canceling headphones.
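A minimal sketch of the vectoring idea, with a made-up 5% coupling coefficient: because the equipment knows exactly what it sent on the neighboring pair, it can inject the opposite of the estimated crosstalk before it corrupts the victim signal:

    import random

    random.seed(1)
    COUPLING = 0.05  # assumed: 5% of the disturber leaks onto the victim pair

    disturber = [random.uniform(-1, 1) for _ in range(5)]
    victim = [random.uniform(-1, 1) for _ in range(5)]

    # What the victim pair actually receives: its own signal plus crosstalk.
    received = [v + COUPLING * d for v, d in zip(victim, disturber)]

    # Vectoring: subtract the estimated crosstalk (the "anti-noise").
    corrected = [r - COUPLING * d for r, d in zip(received, disturber)]

    print(max(abs(c - v) for c, v in zip(corrected, victim)))  # ~0.0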

Right now Alcatel-Lucent is spending a lot of time on G.Fast because they see a big opportunity to make more money out of the old copper networks. So let’s look at the issues that a large telco like AT&T will face when considering the technology:

  • Because the distances G.Fast can cover are so short, the carrier is going to have to build fiber past every customer, just like in a FTTH network. A large carrier like AT&T has some advantages over a fiber overbuilder in that they can overlash fiber onto existing copper on pole lines, which is cheaper and faster than what a new provider faces in dealing with pole make-ready costs.
  • Copper drops are generally the worst copper in the network. These wires get banged around by wind, suffer repetitive water damage and are the weak point in the copper network. The promised savings from G.Fast come from lowering the cost of installation at the customer, and some of those savings disappear if too many homes need a new drop to make the technology work.
  • G.Fast will save the cost of getting into the house. Once connected to an existing telephone NID on the outside of the house, the signal can go anywhere in the home that is already wired for telephone. But the distance limits quickly kick in, and I would expect carriers to hand the signal off to a wireless router just inside the house.
  • Savings are going to depend on how inexpensive the G.Fast electronics are compared to FTTH electronics.
  • Large telcos have relied for years upon customer self-installation of DSL and they will need G.Fast to work the same way.

So the savings to somebody like AT&T come from 1) cheaper fiber installation costs because of the ability to overlash, 2) the ability in many cases to use existing drops and inside telephone wires, and 3) the ability to have customers self-install the product to avoid having to go into the home.

There are still a lot of technical issues to consider and overcome. Some issues that come to my mind include things like overcoming existing splices in the copper, and making sure there is no interference with existing DSL.

The expected time line for the deployment of G.Fast is as follows:

  • Standards finalized by spring of 2014.
  • Chip sets developed in 2015.
  • First generation hardware available in 2016 that probably won’t support vectoring.
  • Mature second generation equipment available in 2017.

Since a carrier has to build fiber everywhere for this to work, the technology is really competing against FTTH. By the time this is readily available there may be lower-cost units for FTTH deployment and I think any carrier would prefer an all-fiber network if possible.

Why Aren’t You in the Security Business?

The security business is booming. Both residents and businesses want security cameras and other monitoring devices to keep an eye on their property when they aren’t there. Everybody with a wireline network should be considering offering security services of some type. There are a number of different ways to approach the security business, as follows:

Security Cameras. Your customers are interested in security cameras. They may want them for the traditional purpose of watching their business. But they now want them for a whole lot of other reasons. Farmers want them to keep an eye on livestock and on expensive farm machinery. Residents want to keep an eye on the babysitter, the pets or the kids when they aren’t at home. People want to be able to see who is at the front door before they answer it.

Your customer can go to Walmart or Radio Shack and pick up a run-of-the-mill camera. But given a choice, your customers probably want a quality HD camera, professionally installed. There is a huge difference in the picture quality between an older analog security camera and the new HD cameras. It’s the difference between being able to see that there is somebody in your home and the ability to read the name tag on the pocket of their shirt.

Most of your customers are not going to be comfortable with, or have the knowledge needed for, installing an HD camera properly. Ideally cameras ought to be installed on coaxial cable rather than WiFi so that they keep working if the WiFi gets knocked out. To be effective a camera also ought to be on some kind of backup power if the customer wants to see what is happening when power to the premises is cut. And you will want to choose cameras that let the customer see what the camera sees from their cell phone.

Why is this a business opportunity? I have been advocating in this blog that telecom businesses need to decide if they are going to be full-service providers or dumb-pipe providers going into the future. If you are going to be a full-service provider then you should look for opportunities to get into customers' homes and businesses. Services like installing security cameras are not going to drive a lot of revenue. Instead, they will pay for a few hours of your installer's time while giving you a chance to get to know your customers better, to upsell them on other services and to create loyalty, since you are the provider who takes the time to visit and listen to them.

Recording. While there isn't a lot of money to be made in installing cameras, you can sell a monthly service to record what the cameras see. This requires you to establish a high-speed connection to the camera and to have recording devices capable of storing and retrieving video. Ideally you will only record a camera when there is something to record, which can be done by including a motion detector that triggers the recording. Any recordings you save should also include a time stamp so that you know when the recording was made.
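To make the trigger-and-timestamp idea concrete, here is a minimal sketch using the OpenCV library; the camera index, change threshold and file naming are illustrative assumptions, and a real deployment would use the off-the-shelf systems mentioned below:

    import datetime
    import cv2

    cap = cv2.VideoCapture(0)    # assumed camera index
    MOTION_PIXELS = 5000         # assumed trigger threshold

    _, previous = cap.read()
    previous = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Count pixels that changed noticeably since the last frame.
        diff = cv2.absdiff(gray, previous)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        if cv2.countNonZero(mask) > MOTION_PIXELS:
            # Stamp and save a frame only when something actually moved.
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            cv2.putText(frame, stamp, (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
            cv2.imwrite(f"motion-{stamp.replace(':', '-')}.jpg", frame)
        previous = gray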

There are off-the-shelf systems for recording video in this manner and you don’t have to reinvent the wheel. But investing in this kind of product line means that you will need to do the math and figure how many customers you will need to justify getting into the business. The normal pricing for this service would consist of a monthly fee to record the images plus a fee when customers want to retrieve recordings over some set limit of times.

Monitoring. The next level of security involves monitoring and this gets into the area of more traditional burglar alarms. There are a number of well-known nationwide brands of security monitoring like Frontpoint, ADT, Vivint, Pinnacle and Life Shield.

How can you compete against the nationwide firms? The burglar alarm business has two components: selling or leasing the hardware, and the monitoring function. You can buy the same security system hardware used by any of the nationwide firms, and there is a wide array of different systems available. The nationwide companies make a lot of money on the hardware and the installation; they generally advertise a low price but then quickly try to upsell customers on additional hardware. You should be able to compete with, and beat, the pricing that these firms offer on hardware. And you can do it without the somewhat sleazy sales tactics that many of them use. Tout yourself as the 'honest' firm and many people will be interested.

Second, you can now buy monitoring services on a wholesale basis. There are security monitoring centers that will act as your back office to monitor the alarms and dispatch fire and police as needed. You can easily mark up their fees and still make a nice monthly margin for monitoring a customer.

Many customers have been through the mill with the nationwide firms and their high-pressure sales tactics. Customers will prefer to go with somebody they know and trust, somebody who gives them what they need at an affordable price without the hard sell.

The Full Deal. There are also upper-end security systems available that come with the latest high-tech monitoring devices. There is a wide array of sensors available today that let a business test for all kinds of events. The upper-end systems are typically for businesses that want to do a better job of monitoring both security and safety at their premises.

Any carrier can obviously get into the high-end security business because anybody can buy the systems involved. But my word of caution is that this line of business requires a lot of research, and the companies you will compete with know what they are doing.

Google and Whitespace Radios

Last week Google received approval to operate a public TV whitespace database, becoming the third company after Telcordia and Spectrum Bridge to get this designation. The database at http://www.google.org/spectrum/whitespace/channel/index.html is open to the public, and with it you can see the whitespace channels that are available in any given market in the country.

The Google announcement stems from an FCC order of April 2012 (FCC Docket 12-36A1) that established the rules under which carriers can use whitespace spectrum. Having an authorized public spectrum database is the first step for a company to operate in the spectrum.

You may have seen recent press releases about how Google proposes to use tethered blimps to operate in the whitespace spectrum. They are calling this system 'SkyNet', a name that sends a few shivers up the spines of movie buffs, but the blimps are an interesting concept in that they will be able to illuminate a large area with affordable wireless spectrum. By having their database approved, Google is now able to test and deploy the SkyNet blimps.

The whitespace spectrum operates in the traditional television bands and consists of a series of 6-megahertz channels corresponding to TV channels 2 through 51, in four bands of frequencies in the VHF and UHF regions: 54-72 MHz, 76-88 MHz, 174-216 MHz and 470-698 MHz. Whitespace radios that work in this spectrum are referred to in the FCC order as TVBDs (TV band devices).
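The channel-to-frequency mapping is simple enough to sketch in a few lines (channel_to_mhz is a hypothetical helper built from the band edges listed above; every channel is 6 MHz wide):

    # (first channel, last channel, band start in MHz)
    BANDS = [(2, 4, 54), (5, 6, 76), (7, 13, 174), (14, 51, 470)]

    def channel_to_mhz(channel):
        """Return the (low, high) edge frequencies of a TV channel in MHz."""
        for first, last, start in BANDS:
            if first <= channel <= last:
                low = start + (channel - first) * 6
                return low, low + 6
        raise ValueError(f"channel {channel} is outside the whitespace bands")

    print(channel_to_mhz(2))   # (54, 60)
    print(channel_to_mhz(51))  # (692, 698)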

For a fixed radio deployment, meaning a radio that always sits at a home or business, a TVBD must check back with the whitespace database daily to make sure what spectrum it is allowed to use at its location. Mobile TVBDs have to check back more or less constantly. It is important for a radio to check with the database because there are licensed uses in these bands, and a whitespace operator must always yield to a licensed use of the spectrum as it arises.
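A minimal sketch of that compliance rule, assuming a hypothetical allowed_channels() stand-in for a query to an authorized database:

    import time

    def allowed_channels(latitude, longitude):
        """Hypothetical stand-in for querying an authorized whitespace
        database; returns the channels usable at this location."""
        return {21, 27, 33}  # placeholder result

    def fixed_radio_loop(latitude, longitude, retune):
        """A fixed TVBD re-checks daily; if a licensed user has claimed
        its channel, the radio must move off it."""
        current = None
        while True:
            channels = allowed_channels(latitude, longitude)
            if current not in channels:
                current = min(channels)  # pick any still-permitted channel
                retune(current)
            time.sleep(24 * 60 * 60)     # fixed radios: once a day

A mobile radio would run the same loop with a much shorter sleep, since it has to re-check more or less constantly as it moves.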

This means that TVBDs must be intelligent, in that they need to be able to change the spectrum they use according to where they are deployed. Whitespace radios are also a challenge from a radio engineering perspective in that they must somehow bond multiple paths across available but widely separated channels in order to create a coherent bandwidth path for a given customer.

There are whitespace radios on the market today, but my research shows that they are still not particularly affordable for commercial deployment. This is a fairly new radio market, though, and this is part of the normal evolution one sees after new spectrum rules hit the market. Vendors generally develop first generation devices that work in the spectrum, but the long-term success of any given spectrum generally depends upon at least one vendor finding a way to mass produce radios and drive down the unit costs. There have been some spectacular failures among spectrums released in the last few decades, such as MMDS, which failed because affordable devices never reached the market in volume.

But one might hope that Google will find a way to produce enough radios to make them affordable for the mass market. And then maybe we will finally get an inkling of Google's long-term plans. There has been a lot of speculation about Google's long-term plans as an ISP due to their foray into gigabit fiber networks in places like Kansas City and Austin. And now, with SkyNet, we see them making another deployment as an ISP in rural markets. If Google produces proprietary TVBD radios that they use only for themselves, then one has to start believing that Google plans to deploy broadband as an ISP in many markets as it sees opportunities. But if they make TVBD radios available to anybody who wants to deploy them, then we will all go back to scratching our heads and wondering what they are really up to.

I have a lot of clients who will be interested in whitespace radios if they become affordable (and if they happen to operate in one of the markets where there are enough whitespace channels available). Like others I will keep watching this developing market to see if there is an opportunity to build a business plan around the new spectrum.

Are You Spending too Much on Mailings?

When I look at clients' books, one expense that almost always strikes me as too high is what companies spend on mailings for billing and marketing. Every carrier has a tight budget these days, so it is important to get by with less; anything you cut from an unnecessary expense goes straight to the bottom line. There are some fairly easy ways to cut down on the postage, supplies and labor that go into the mailings you are doing today.

Simplify Billing

Go Paperless. One of the first questions I always ask is whether a company has given its customers a chance to go paperless for billing. I know in my personal life that I have been able to go paperless for every monthly bill I get except my electric bill, and even that utility lets me check my balance on-line. I am sure that a number of your customers now pay their bills by electronic check and never touch the return envelope you provide for them.

So you need to give your customers the option of going paperless. Most companies that have done this have been able to cut the number of bills they mail out by 50% to 70%. Some companies have carried this to an extreme and now charge extra for a paper bill as a further incentive for customers to go paperless.

Bank Debits / Credit Cards. You might also consider giving your customers the option to pay by bank debit or credit card as a way to get more of them to go paperless. This improves your cash flow significantly, but it also means you take on an extra obligation to keep their banking information very safe.

Typical credit card fees are around 3% of the bill, so take that cost into consideration when looking at this option. The credit card fee is a bargain compared to the cost of mailing for a customer with a $50 bill. It's not much of a bargain for a carrier paying for a DS3.
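The arithmetic behind that comparison is worth a quick sketch; the $1.50 mailing cost and $2,000 DS3 bill are assumptions for illustration:

    CARD_FEE_RATE = 0.03  # typical card processing fee
    MAILING_COST = 1.50   # assumed cost to print and mail one bill

    for label, bill in [("residential customer", 50.00),
                        ("carrier paying for a DS3", 2000.00)]:
        fee = bill * CARD_FEE_RATE
        print(f"{label}: ${fee:.2f} card fee vs ${MAILING_COST:.2f} mailing")

    # residential customer: $1.50 card fee vs $1.50 mailing -- a wash or better
    # carrier paying for a DS3: $60.00 card fee vs $1.50 mailing -- no bargain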

Portal. I recommended in an earlier blog that you create a portal so that customers can look up their current and past billing and payment history on-line and also can add or drop products. Such a portal makes it easier for customers to pay you electronically and will help you go paperless.

Simplify Your Products. I never miss a chance to say that you should simplify your product offering to make your billing easier. The easier the billing the easier it is to go paperless.

Marketing and Mailing

A lot of carriers still use mailings as their primary marketing tool. What they fail to recognize is that a large percentage of their customers never read bill stuffers. They end up spending a large portion of their marketing budget selling to one subset of their customers while another subset never even gets the message.

I am surprised by the number of companies that don’t take the time to measure how successful their mailing campaigns are. It’s easy to tie specific mailings to offer codes so that you know where customers got your message.

I am not saying you shouldn't conduct mail campaigns, but you should also email them to customers when possible. Emailing cuts down on postage and printed materials, which means you can run more campaigns for the same money. And you will reach more customers.

Bill Stuffers and Newsletters. Companies often say that they don’t want to go paperless because they want to continue to send bill stuffers and newsletters to their customers. But bill stuffers and other marketing materials can all be sent to a lot of your customers electronically.

One thing to remember is that a lot of people now read email on smartphones. Your web page and any advertising materials need to be published in both normal HTML and a phone-friendly format, or you will be ignoring a percentage of your customers. Your bills, newsletters, portal and everything marketing-related need to display at a readable size on a smartphone to be effective.

The National Broadband Map

Last Thursday the FCC voted to take over the data collection for the National Broadband Map. The Map was created as part of the broadband funding supplied a few years ago by the Stimulus package and has been administered by the NTIA (National Telecommunications and Information Administration) with input from the states; that funding is now running out.

Back when the Map was suggested I thought the concept was a good one. But as soon as I saw that the data gathered for the Map was to be self-reported by carriers, I knew there were going to be problems. And sure enough, when the first generation Map was produced it was full of errors – big errors.

I work with a lot of rural communities and I have reviewed the Map in many areas of the country and compared it to the actual deployment of broadband. Some communities have developed their own maps, and they did it the hard way: they sent people around to see where broadband was available. A lot of this can be done by somebody who knows how to look up at the cables. It's easy to know where cable modems are available by the presence of coaxial cable on the poles, and rural DSL generally has repeaters that can be spotted by an eagle-eyed observer. And it's not hard to look at your cell phone to see how many bars of data you can get. But the best test of where broadband exists is knocking on doors and asking people what they are able to buy.

As an example of what I found, consider the issues in just one county in Minnesota. The Map showed that most of the County had landline broadband available. The County is very typical of rural areas: the County Seat is the largest town, there are half a dozen much smaller towns, and everything else is rural. A large chunk of the rural area is a national forest where very few people live, and most people in the rural areas live in close proximity to the roads.

The reality in this County is that even in several of the smaller towns the DSL is so slow that it is hard to think of it as broadband; it's more like dial-up plus. There was no cable modem service from the cable company outside of the County Seat. And as is typical with DSL, the quality quickly degrades with distance from the DSL hub as one goes outside the towns. We've always called this the donut effect: large areas with no broadband surrounding rural towns that have DSL and/or cable modems.

The Map also showed that almost every populated area of this Minnesota County had 3G wireless data available. It’s a very hilly and rugged place and probably half of the county by area can’t even get cellular voice calls, let alone data. But even where voice is available there are many areas that can’t get cellular data. The Map was just wrong about this.

Everywhere that I have helped communities look at the Map we have seen the same thing. The Map shows broadband that isn’t there. It shows cellular data coverage that isn’t there. And it often shows providers that are supposedly serving the counties that nobody ever heard of.

And this is not just true for rural counties. I have helped two suburban counties near large cities look at the Map and they found the same situation: the Map showed areas that are supposed to have broadband where their citizens still have dial-up or satellite, and cellular coverage was exaggerated.

An obvious question is why this matters. The National Broadband Map has been around for only a few years and anybody who has ever looked at it knows it is full of inaccuracies. The problem is that the federal government now relies on the Map for several purposes. For instance, if you want federal money by loan or grant to deploy rural broadband, the assumption is that the Map is good, and it is your responsibility to show where the Map is wrong.

And the FCC uses the Map when it talks about the availability of broadband in rural America. The Map has been overlaid with Census data to count how many households can get broadband, which produces a very distorted picture of who has it. There are pockets of people without broadband in even some of the most populated counties in the country, and the Map simply misses them. And in rural areas the Map can be very wrong.

The FCC just took over responsibility for the Map, and from my perspective they either need to do it right or get out of the mapping business. It's not easy to get it right, but it can be done. One of the easiest steps they could take would be to give counties the authority to clean up the maps for their areas; many of them would be glad to do so. And broadband availability is not static: there are areas gaining or losing broadband all of the time. If the FCC won't take the time to get the Map right, they should just let it die as another impractical idea.

Who is a Cable Company?

There are regulatory battles that tackle issues of great importance, but there are also battles that, if brought to the public's attention, would leave people shaking their heads. Currently there is one such battle going on at the FCC.

The battle is a simple one: defining who is a cable company. This kind of regulatory battle comes up all of the time because of the way regulations are written. Traditional cable TV has been around since the 1950s, when it brought network channels to remote rural markets that had no over-the-air reception. But the industry as we now know it exploded in the 1970s when it was deregulated and new programming was created in the form of the many networks we all watch today.

As often happens, the FCC regulations concerning cable TV were written to be very technology-specific. For many decades there was only one way to be a cable television provider: stringing coaxial cable to deliver the cable signal to homes. The original cable technology got a major upgrade when fiber was brought into the network and most cable companies upgraded to hybrid fiber/coax (HFC) systems. But the new HFC technology still delivered the cable signal to the home over the same coaxial cables.

But then, as invariably happens with technology, something new came along. First were the satellite providers, who don't use any wires and instead beam the signal down from satellites to everybody underneath them. More recently came IPTV (IP-based delivery of cable signal using either DSL over copper wire or fiber). IPTV differs from traditional cable TV in that it typically sends the customer only the channel they are watching, while traditional cable transmits all of the channels all of the time. And other technologies have been used over the years, such as systems that beamed the signal to customers using spectrum referred to as MMDS.

One would think that as new technologies are developed to do the same things as older ones, regulations would simply be changed as needed. After all, the general public doesn't much care about the technology used to deliver their cable programming. I think most people would agree that a cable TV company is one that brings MTV and ESPN to their television.

And the technology is about to get a lot more complicated. Many cable companies are making their networks more digital, and there are already trials of cable companies delivering IPTV across their coaxial cables. They are doing this to free up bandwidth for faster cable modem service. Would that mean they are no longer cable companies? And then there is the whole issue of people getting programming over the Internet. If I watch The Daily Show on my cellphone, is that cable TV? My guess is that no matter how the FCC changes the definition of cable TV, it will be out of date in just a few years.

Technology differences are at the heart of a lot of FCC issues. For example, different rules now apply to traditional long distance telephone companies than to those who use IP and the Internet to deliver telephone calls. A lot of the reason is that the FCC doesn't get to make up its own rules in a vacuum. Many of the underlying rules that the FCC enforces derive from bills passed by Congress. The FCC has a certain amount of leeway to interpret such rules, but it is also restrained from stepping too far outside Congress's original language and intent in the various laws.

As is often the case, this current dispute boils down to money. The FCC charges a fee per cable customer to pay for the cost of operating its Media Bureau, which oversees cable TV providers. Currently this fee is assessed only on traditional cable TV operators that deliver their signal to customers over coaxial cable; it is not charged to the satellite and IPTV providers. And both of those groups are huge. If AT&T U-verse, which uses IPTV, were classified as a cable company, it would be the seventh largest cable provider. And the satellite companies are enormous, with over 34 million subscribers in 2012.

As usual, the various companies argue that there are differences that should keep them from being regulated as cable companies. For example, the satellite providers don't get involved in issues concerning hanging cables on poles. But honestly, those kinds of distinctions are silly. There are differences among companies in every regulated industry. For example, there are many FCC rules that apply to the very large telephone companies but not to tiny ones, and vice versa, and yet they are all considered to be telephone companies.

The similarities among cable providers are obvious. They all deliver a nearly identical product to consumers, they all pay a lot of money to programmers for the content they transmit, and they are all regulated by the Media Bureau. Common sense tells me that any company that delivers cable programming to homes is a cable company and ought to kick in for the cost of regulation. I am not sure I have ever seen a regulatory issue that so clearly makes me think, "If it quacks like a duck it must be a duck".