Is Wireless Really Better than Wires?

A rural area west of Route 41 and Lowell, Indiana. (Photo credit: Wikipedia)

It is clear that the FCC prefers wireless as the broadband solution for rural areas. It seems like they badly want every rural household in the country to get some kind of broadband just so they can take this issue off their plate. Just about every bit of policy decided in the last few years has a bias towards wireless.

For instance, the historic Universal Service Fund, which was used to promote rural telephony over copper, has been transitioned into a new CAF fund that will instead promote high-speed data in rural areas. There are several aspects of the CAF that will clearly ensure that the funds go mostly to wireless carriers. The bulk of the funding will eventually be distributed by a reverse auction. This is an auction where the broadband providers in a given area will be able to compete for the funding, and the one who bids for the lowest amount of subsidy per customer will receive the funds.

The first time I read the reverse auction rules my first thought was that this money is all going to wireless companies. The reverse auction rules strongly favor companies that can provide data over large areas. Any smaller company that wants to get CAF funds to help pay for a rural wired network can be undercut by the largest wireless companies. AT&T Wireless and Verizon Wireless are the two richest and most successful companies in the country. They pay many billions of dollars of dividends annually and they can afford to underbid any rural landline company for subsidy, simply because they do not need it. But of course, they will bid in the reverse auctions and take the subsidies because the rules allow them to.
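The mechanics of a reverse auction are simple enough to sketch in a few lines. The carrier names and subsidy figures below are hypothetical, purely to illustrate how a deep-pocketed bidder that doesn't need the money can win:

```python
# Hypothetical reverse auction: each carrier bids the subsidy per
# customer it says it needs to serve a rural area; lowest bid wins.
def run_reverse_auction(bids):
    """bids: dict of carrier name -> requested subsidy per customer."""
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

# Illustrative numbers only: a large wireless carrier can afford to
# bid far below what a small landline company actually needs.
bids = {
    "Rural Landline Co": 1800,  # needs this much to build wired plant
    "Big Wireless Co": 600,     # can underbid because it doesn't need the subsidy
}
winner, subsidy = run_reverse_auction(bids)
print(winner, subsidy)  # Big Wireless Co 600
```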

There are also parts of the CAF that can be used to build new broadband infrastructure and these funds also favor wireless companies. The funds get distributed by complicated rules that have a bias to get broadband to customers at the lowest cost per subscriber. And of course, there is no cheaper way to cover a large rural footprint than with wireless. Wireless companies are also going to get a lot of this infrastructure funding.

Meanwhile, AT&T recently told the FCC that they were going to introduce a plan to drop the copper for ‘millions’ of rural subscribers. And if they are successful then their rural subscribers can expect to be told to get cell phones rather than landlines. For voice telephony this might not be such a bad thing. But do we really want to relegate a large swath of US geography to only having cellular data?

Today there is clearly a broadband gap with some rural areas still stuck with dial-up Internet access. And so getting them some kind of faster data seems like a reasonable plan. The FCC has set the definition of broadband to be the capability of receiving 4 Mbps download. And it’s obvious that they set that limit with rural areas in mind.

And so over the next decade more and more of rural America will be getting cellular data that will meet, or come close to meeting the FCC’s definition of broadband. But meanwhile, the cities have already far surpassed those speeds. There are very few cities left where the average home can’t get speeds of between 10 Mbps and 20 Mbps. There are usually cheaper alternatives in the range of 5 Mbps to 7 Mbps, but the faster speeds are widely available. And many places have much faster speeds available.

The FCC itself has promoted the availability of gigabit bandwidth and companies are responding. Google is bringing this speed to Kansas City, Austin and Provo and AT&T has promised to match them in Austin. CenturyLink is bringing a gigabit to Omaha. And a number of smaller municipal and commercial providers have brought gigabit speeds to other towns and cities scattered across the country. And one can expect the gigabit movement to grow rapidly.

It’s common knowledge that household use of bandwidth has continued to grow and there is no end in sight for that growth. As networks can provide more data, households find ways to use it. Video has been the recent reason for the explosion in data usage, and now we can see that the Internet of Things will probably be the next big bandwidth driver.

Have we really solved the rural bandwidth gap if people in those areas are going to have 4 Mbps data speeds while urban areas have a gigabit? Obviously the rural areas will continue to be left behind, and they will fall even further behind than they are today. Just a few years ago the rural areas had dial-up and the cities had maybe 5 Mbps. But a gap between a rural world at single-digit megabit speeds and cities at gigabit speeds is a much larger gap, and the rural areas will not be able to share in the benefits that bandwidth will bring.

The only long-term solution is to build fiber to rural America. Obviously nobody is going to build fiber to single homes at the top of mountains or at the end of ten-mile dirt roads, but I have been working on business plans that show that fiber can make sense in the average rural county. But it is really hard to get rural fiber funding since such projects tend to just pay for themselves and are not wildly profitable.

It’s possible that the FCC’s universal service plans will work and that a lot of the 19 million rural people without broadband will get some sort of rudimentary broadband. But meanwhile, the rest of the country will be getting faster and faster bandwidth. And so, before the FCC declares ‘mission accomplished’ I think we need to have more of a debate about the definition of broadband and what is acceptable. I hate to tell the FCC, but the rural broadband issue is not going to go away even after rural areas all have cellular data.

Spectrum Winners and Losers

AT&T posted a short statement on their public policy blog called ‘Inconvenient Facts and the FCC’s Flawed Spectrum Screen’. In that blog post they complained that the FCC had failed to apply the spectrum screen to Softbank’s acquisition of Sprint and Sprint’s acquisition of the rest of Clearwire. And AT&T is right. The FCC has been incredibly inconsistent in the way it looks at wireless acquisitions and mergers.

So what is the spectrum screen? The spectrum screen is a set of internal rules at the FCC used to determine if any wireless carrier owns too much spectrum in a given market. Historically the FCC had a generic rule that said that no one company could own more than one-third of the spectrum usable for wireless in a given geographic area. This spectrum screen was applied both to attempts by wireless carriers to buy new spectrum and to mergers between wireless carriers.

The FCC has been very inconsistent in the way they apply the existing screen. Last September they announced that they were going to look at the way the spectrum screen ought to work. But meanwhile, during the last year the screen has been applied (or ignored) in the following ways:

  • When the FCC looked at the proposed AT&T / T-Mobile merger they rejected the merger in part because they said that the acquisition would fail the screen test in 274 CMAs that covered 71 of the top 100 markets and 66% of the US population. However, the FCC fudged the spectrum screen in coming up with those numbers. At that time the spectrum screen set the maximum amount that any one carrier could own in one market at 95 MHz, which was one-third of the spectrum available for wireless carriers. However, in coming up with their conclusion the FCC lowered that threshold to 90 MHz in judging the merger. That might not sound like a big difference, but using the proper 95 MHz threshold would have lowered the number of markets affected by the merger by 84 and reduced the overall problem to less than 50% of the top 100 markets and 50% of the US population. That is still a lot of places where the proposed merger would have failed the spectrum screen, but AT&T had announced plans to divest spectrum as needed to meet the FCC test. The FCC made this change in the spectrum screen without any public input.
  • When Verizon acquired spectrum in the 1.7 to 2.1 GHz band the FCC counted it fully against their spectrum screen. They did the same when AT&T acquired 2.3 GHz spectrum.
  • And then there is the recently announced approval for Softbank to acquire Sprint and Clearwire spectrum. The Clearwire spectrum at 2.5 GHz is right next to the 2.3 GHz spectrum recently acquired by AT&T. While the FCC fully counted the spectrum AT&T purchased against the spectrum screen, in the Softbank acquisition the FCC counted only 55.5 MHz of the Clearwire spectrum against the new Softbank spectrum screen even though there is an average of 140 MHz available in most of the Softbank markets.
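The screen test in those examples reduces to simple arithmetic: a carrier fails in a market when its holdings after an acquisition exceed the threshold. A minimal sketch, using the 95 MHz and 90 MHz thresholds cited above (the holdings figures are made up):

```python
# Spectrum screen check: a carrier fails the screen in a market when
# its total holdings after an acquisition exceed the threshold
# (historically one-third of usable spectrum, about 95 MHz).
def fails_screen(current_mhz, acquired_mhz, threshold_mhz=95):
    return current_mhz + acquired_mhz > threshold_mhz

# Quietly lowering the threshold from 95 MHz to 90 MHz flips every
# marginal market from passing to failing.
holdings, purchase = 80, 12
print(fails_screen(holdings, purchase, threshold_mhz=95))  # False: 92 <= 95
print(fails_screen(holdings, purchase, threshold_mhz=90))  # True: 92 > 90
```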

So AT&T has a legitimate gripe. The FCC seems to apply the spectrum screen to get the results they want. It looks a lot more like the FCC is picking market winners and losers than protecting the public. The spectrum screen was established in the first place to promote competition. The FCC wanted to make sure that a given carrier did not get so much spectrum in a major market that they could effectively close out competition. They also didn’t want carriers to be able to hoard spectrum for future use. But the FCC no longer seems to be using market protection as the criterion for deciding who can and cannot merge.

It’s clear that the FCC didn’t want AT&T and T-Mobile to merge. They thought that it was bad for competition to lose one of the major carriers in the country. But it was wrong for them to fudge the spectrum screen as a way to justify their position rather than just oppose the merger on pure competitive grounds.

And in the case of Softbank they are going in the opposite direction. They obviously want a new competitor to AT&T and Verizon and they are ignoring the spectrum screen to make sure that happens.

Why does all of this matter? Like anything else it’s a matter of money. Wireless carriers have two ways that they can address congested conditions. They can add more cell sites, closer and closer to the old ones. In effect spectrum is reusable, and each new cell site reuses the original spectrum. The other solution is to layer new spectrum onto a crowded area so that no new cell sites need to be constructed. That is much cheaper than building cell sites, and so carriers want more and different spectrum in major markets to meet the seemingly insatiable and rapidly growing demand for mobile data.

The issue is going to get a lot worse. President Obama announced a new policy that will release up to 500 MHz of new spectrum for wireless use over the next five years. So there is going to be a new land grab by all of the carriers and the FCC needs to get ready.

It just seems to me like the FCC needs to toss out the spectrum screen and come up with a new way to determine the right amount of competition. In the two biggest merger cases before them in the last few years they blatantly ignored their own spectrum screen rules to get the result they wanted. That is evidence enough that we need to stop having the fiction of a spectrum screen. If the FCC wants to be in the game of picking market winners and losers they just need to be upfront about it.

G.Fast

You are going to start hearing about a new technology that may infuse some life back into existing copper networks. The technology is being referred to as G.Fast. This technology promises to deliver very fast speeds, up to a gigabit, over copper for very short distances.

Some are referring to G.Fast as a last mile technology, but it is really a drop technology. The distances supported by the technology are so short that this is going to require fiber to the curb, or as some are now calling it, fiber to the distribution point.

Alcatel-Lucent and Telekom Austria just announced a field trial of G.Fast. That trial achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters for brand new copper. On older copper the speed dropped to 500 Mbps for 100 meters.

Current copper technologies use only a small portion of the theoretical bandwidth available on a copper wire. For example, most VDSL2 systems deployed today use up to 17 MHz of spectrum on the copper. G.Fast can provide higher speeds by using more of the available spectrum and will be able to use somewhere between 70 MHz and 140 MHz on copper. Plus G.Fast will be more efficient. Today DSL functions by dividing the data path into sub-channels which each carry about 15 bits of data. Engineers are looking at coding and modulation techniques that will increase the bits per sub-channel for G.Fast and thus increase speeds further.
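A back-of-the-envelope DMT (discrete multi-tone) throughput model shows why more spectrum and more bits per sub-channel multiply the speed. The tone spacing and symbol rate below are generic DSL-style placeholders, not actual G.Fast parameters, so the outputs are order-of-magnitude illustrations only:

```python
# Rough DMT throughput: sub-channels (tones) x bits-per-tone x symbol rate.
# Tone spacing and symbol rate are illustrative placeholders, not the
# real G.Fast parameters.
def dmt_throughput_mbps(spectrum_mhz, bits_per_tone,
                        tone_spacing_khz=4.3125, symbols_per_sec=4000):
    tones = spectrum_mhz * 1000 / tone_spacing_khz
    return tones * bits_per_tone * symbols_per_sec / 1e6

# Why G.Fast is faster: the same coding over ~6x the spectrum gives
# roughly 6x the raw throughput; better coding raises bits_per_tone too.
print(round(dmt_throughput_mbps(17, 15)))   # rough VDSL2-style ceiling
print(round(dmt_throughput_mbps(100, 15)))  # same coding over more spectrum
```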

G.Fast also will benefit from an existing technique called vectoring. This technology is used today with VDSL2 and eliminates crosstalk interference between copper pairs. It does this by monitoring the noise on the copper and then creating an anti-noise signal which cancels the noise in the same way as noise-canceling headphones.
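As a toy illustration of the vectoring idea, here is a sketch that models crosstalk as a simple linear coupling between two pairs and then subtracts the estimated interference. The coupling coefficient is invented for the example; a real vectoring engine estimates a full matrix of couplings across every pair in a binder:

```python
# Vectoring sketch: the victim pair receives its own signal plus a
# fraction of a neighboring pair's signal (crosstalk). Because the
# vectoring engine knows the disturber's transmitted signal, it can
# inject the opposite "anti-noise" -- like noise-canceling headphones.
def vectored_signal(victim, disturber, coupling=0.25):
    received = [v + coupling * d for v, d in zip(victim, disturber)]
    # Subtract the estimated crosstalk contribution.
    return [r - coupling * d for r, d in zip(received, disturber)]

victim = [1.0, -1.0, 1.0, 1.0]
disturber = [0.5, 0.5, -0.5, 0.5]
print(vectored_signal(victim, disturber))  # [1.0, -1.0, 1.0, 1.0]
```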

Right now Alcatel-Lucent is spending a lot of time on G.Fast because they see a big opportunity to make more money out of the old copper networks. So let’s look at the issues that a large telco like AT&T will face when considering the technology:

  • Because the distances to deploy G.Fast are so short, the carrier is going to have to build fiber past every customer, just like in a FTTH network. A large carrier like AT&T has some advantages over a fiber overbuilder in that they can overlash fiber onto existing copper on pole lines. This is cheaper and faster than putting up fiber for a new provider who has to deal with pole make-ready costs.
  • Copper drops are generally the worst copper in the network. These wires get banged around by wind, suffer from repetitive water damage and are the weak point in the copper network. Part of the promised savings from G.Fast comes from lowering the cost of installation at the customer. Some of those savings disappear if too many homes need a new drop to make it work.
  • G.Fast will save the cost of getting into the house. Once connected to an existing telephone NID on the outside of the house the signal can go anywhere in the home that is already wired for telephone. But the distance issue quickly kicks in and I would expect carriers to hand off to a wireless network right inside the house.
  • Savings are going to depend on how inexpensive the G.Fast electronics are compared to FTTH electronics.
  • Large telcos have relied for years upon customer self-installation of DSL and they will need G.Fast to work the same way.

So the savings to somebody like AT&T come from 1) cheaper fiber installation costs because of the ability to overlash, 2) the ability in many cases to use existing drop and inside telephone wires, and 3) the ability to have customers self-install the product to avoid having to go into the home.

There are still a lot of technical issues to consider and overcome. Some issues that come to my mind include things like overcoming existing splices in the copper, and making sure there is no interference with existing DSL.

The expected time line for the deployment of G.Fast is as follows:

  • Standards finalized by spring of 2014.
  • Chip sets developed in 2015.
  • First generation hardware available in 2016 that probably won’t support vectoring.
  • Mature second generation equipment available in 2017.

Since a carrier has to build fiber everywhere for this to work, the technology is really competing against FTTH. By the time this is readily available there may be lower-cost units for FTTH deployment and I think any carrier would prefer an all-fiber network if possible.

Who is a Cable Company?

RG-6 coaxial cable (Photo credit: Wikipedia)

There are regulatory battles that tackle issues of great importance, but there are also battles which, if brought to the public’s attention, would leave them shaking their heads. Currently there is one such battle going on at the FCC.

The battle is a simple one that defines who is a cable company. This kind of regulatory battle comes up all of the time because of the nature of the way that regulation is written. Traditional cable TV has been around since the 1950s when it brought network channels to remote rural markets which had no over-the-air reception. But the industry as we all now know it exploded in the 1970s when the industry was deregulated and new programming was created in the form of the many networks we now all watch.

As often happens, the FCC regulations concerning cable TV were written to be very technology specific. For many decades there was only one way to be a cable television provider, and that was to string coaxial cable to deliver cable signal to homes. The original cable technology got a major upgrade when fiber was brought into the network and most cable companies upgraded to hybrid fiber/coax (HFC) systems. But the new HFC technology still delivered the cable signal to the home using the same coaxial cables.

But then, as invariably happens with technology, something new came along. First were the satellite providers. They don’t use any wires and instead put satellites into orbit and beam the signal down to everybody under them. And more recently came IPTV (IP-based delivery of cable signal using either DSL over copper wire or fiber). IPTV differs from traditional cable TV in that it typically only sends the signal to the customer for the channel they are watching while traditional cable transmits all of the channels all of the time. And there have been other technologies used over the years, such as several cable systems that were developed that beamed the signal to customers using a spectrum referred to as MMDS.
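The bandwidth difference between the two delivery models is easy to illustrate. A rough sketch, with made-up channel counts and per-channel bit rates:

```python
# Traditional cable pushes every channel to every home all the time;
# IPTV sends only the streams actually being watched. The channel
# count and per-channel rate below are illustrative, not a real lineup.
def cable_bandwidth_mbps(channels, mbps_per_channel=4):
    return channels * mbps_per_channel        # all channels, all the time

def iptv_bandwidth_mbps(active_streams, mbps_per_channel=4):
    return active_streams * mbps_per_channel  # only what's being watched

print(cable_bandwidth_mbps(200))  # 800
print(iptv_bandwidth_mbps(2))     # 8
```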

One would think that as new technologies are developed that do the same things as older technologies that regulations would just be changed as needed. After all, the general public doesn’t much care about the technology used to deliver their cable programming. I think most people would agree that a cable TV company is one that brings MTV and ESPN to their television.

And the technology is about to get a lot more complicated. First, many cable companies are upgrading their networks to become more digital and there are already trials of cable companies that are upgrading to IPTV across their coaxial cables. They are doing this to free up bandwidth to provide faster cable modem service. Would this mean they are no longer cable companies? And then there is the whole issue of people getting programming over the Internet. If I watch The Daily Show on my cellphone, is that cable TV? My guess is that no matter what the FCC does to change the definition of cable TV it will be out of date in just a few years.

Technology differences are at the heart of a lot of FCC issues. For example, there are different rules now that apply to traditional long distance telephone companies versus those who use IP and the Internet to deliver telephone calls. A lot of the reason for these issues is that the FCC doesn’t get to make up its own rules in a vacuum. Many of the underlying rules that the FCC enforces are derived from bills passed by Congress. The FCC has a certain amount of leeway to interpret such rules, but they are also restrained to a great degree by stepping too far outside of Congress’s original language and intentions in the various laws.

As is often the case, this current dispute boils down to money. The FCC charges a fee per cable customer to pay for the cost of operating its Media Bureau, which oversees cable TV providers. Currently this fee is only assessed to traditional cable TV operators that deliver their signal to customers using coaxial cable. But the fee is not charged to the satellite and the IPTV providers. And both of those groups are huge. For instance, if AT&T U-verse, which uses IPTV, were classified as a cable company it would be the seventh largest cable provider. And the satellite companies are huge, with over 34 million subscribers in 2012.

As usual, the various companies argue that there are differences that should keep them from being regulated as cable companies. For example the satellite providers don’t get involved in issues concerning hanging cables on poles. But honestly those kinds of distinctions are silly. There are differences everywhere among companies in every regulated industry. For example, there are many FCC rules that apply to the very large telephone companies that don’t apply to tiny telephone companies, and vice versa. And yet they are all considered to be telephone companies.

The similarities among cable providers are obvious. They all deliver a nearly identical product to consumers and they all pay a lot of money to programmers to get the content they transmit. And they are all regulated by the Media Bureau. Common sense tells me that any company that delivers cable programming to homes is a cable company and ought to kick in for the cost of regulation. I am not sure that I have ever seen a regulatory issue that more clearly makes me think, “If it quacks like a duck it must be a duck”.

Switching in an IP Environment

FCC HQ (Photo credit: Wikipedia)

In this industry there are always interesting fights going on behind the scenes. In fact, it seems like a lot of the policies made by the FCC are in response to battles being waged between carriers. As the FCC intervenes in these fights they end up creating policy as they help solve issues.

This letter is correspondence with the FCC about a current dispute in which Verizon and AT&T are disputing the way they are being billed by Bandwidth.com and Level3. This fight is an interesting one because it asks the FCC to affirm that it supports a migration to an all-IP network.

The dispute is over what is called OTT (Over-the-top) VoIP. OTT in this case means that there are voice calls being made from a service provider’s network for which the service provider is not providing the switching. Instead the service provider is buying switching from a CLEC like Level3. And all of the calls involved are VoIP calls, meaning that they are being delivered from the customers to the switching CLEC using the IP network rather than the public switched telephone network.

Here is how this might happen, although there are other configurations as well. The network in question must be an IP network to the customer for this to be considered VoIP. That means it is either a fiber-to-the-home network, DSL over a copper network or a cable system that has been upgraded to send the voice over the data path. In a traditional TDM network the calls from customers are routed directly to a voice switch and that switch will decide what to do with the call based upon the numbers that were dialed. But in this scenario there is not a switch in the subscriber’s network. Instead, when a customer makes a call, a signal is sent to wherever the switch is located telling it where the customer wants to call. That remote voice switch then tells the network owner where to send the call. It is no longer necessary in a softswitch environment for the call to actually touch the switch, but the switch is still the device that decides how to route the call.

The parties are fighting about whether access charges ought to be charged for an OTT VoIP call. Access charges are fees that long distance carriers pay at both the originating and terminating end of a call to compensate the network owner at each end for processing the call. Verizon and AT&T don’t want to pay the switching component of the access charges for these calls. They are arguing that since there is not a physical switch in the originating network that such charges aren’t warranted.
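To see what is at stake financially, here is a toy model of how a per-minute access charge builds up from rate elements, with and without the disputed switching element. The rates are invented for illustration; real access rates come from tariffs:

```python
# Hypothetical per-minute access charge buildup. These rates are made
# up for illustration; actual rates are set by tariff.
RATE_ELEMENTS = {
    "transport": 0.0020,        # carrying the call to/from the end office
    "local_switching": 0.0035,  # the element in dispute for OTT VoIP
}

def access_charge(minutes, include_switching=True):
    rate = RATE_ELEMENTS["transport"]
    if include_switching:
        rate += RATE_ELEMENTS["local_switching"]
    return round(minutes * rate, 4)

# AT&T/Verizon's position: no physical switch in the originating
# network, so no switching element. Bandwidth.com/Level3's position:
# switching was performed, just remotely.
print(access_charge(10000, include_switching=True))   # 55.0
print(access_charge(10000, include_switching=False))  # 20.0
```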

Bandwidth.com and Level3 are arguing that the switching is being performed regardless of the location of that switch. They point out that for the FCC to rule otherwise would be counter to the FCC’s desire for the telephony world to migrate to an all-IP environment.

If the FCC rules that AT&T and Verizon are right, they will be saying that a carrier performing a switching function on legacy TDM technology can bill for performing that function but that somebody doing it more efficiently in an IP environment cannot. I just published a blog yesterday talking about ways to share a softswitch and that is exactly what is happening in this case. In an all-IP environment the network can be more efficient and not every carrier needs to buy and operate a switch. They can instead contract with somebody else to switch calls for them which is easy to make happen in an IP environment. Access charges are designed to compensate local carriers for the cost of performing certain functions and one has to think that the network owner in this case is still having to pay for the switching function and should get to recover some of that cost.

In fact, there has been switch sharing for years even in the TDM world. I know several rural LECs who lease switching from their neighbors and who have not owned a switch for decades, and they have always billed the switching access charge element. That element reimburses you for the cost of switching, and it really shouldn’t matter whether that cost is made up of the depreciation on a box you paid for or a fee you pay to use somebody else’s box. Cost is cost, and the key fact is that calls can’t be made or received from an area if somebody isn’t doing the switching.

I always find arguments by the large RBOCs to be interesting because they wear many hats. AT&T and Verizon are wireless carriers, LECs and long distance companies, and often when one part of these large companies makes a regulatory argument it will be contrary to the interest of one of the other branches of the company. In this case the long distance branches of the RBOCs are looking for a way to avoid paying access charges. But the LEC sides of both Verizon and AT&T share switching and no longer have a switch for every historic exchange area. So to some degree these companies are arguing against something that another branch of their company is doing. And this is often the case in many regulatory arguments since these companies do so many things.

Hopefully the FCC will agree with Bandwidth.com and Level3. If they rule otherwise they will be telling carriers that it is not a good idea to establish switch-sharing arrangements that are more efficient than having every carrier buy the same expensive boxes. If the FCC really wants the telco world to move to IP they need to get rid of any regulatory impediments that would make an IP network less desirable than a legacy network. Hopefully the FCC sides with efficiency.

Keep it Simple

I spend a lot of time looking at the products that carriers sell and one conclusion I reach is that simpler is better. I have found carriers with a multitude of options, with dozens of data products, many cable TV options and even many voice options. And I think I know where this came from. In the 90’s there was a movement to ‘give the customers more choice’ and I think that led some carriers down the path of customizing products for every customer who asked for something different.

But that does not seem to make sense for a variety of reasons. In probably the most extreme example, I know one carrier who has over forty Internet data products. This leads me to ask if a company really needs to be selling a 10 Mbps, a 15 Mbps, a 17 Mbps and a 20 Mbps data product? And the obvious answer is no. There is not enough practical difference between these products to justify having different ones.

It makes a lot more sense to have just a few data products. The companies that I see doing the best at selling data have three or four products, which can be characterized in terms of speed and price as low, medium and high, with maybe a fourth thrown in for a lifeline product. And they will have just a few cable TV options instead of the dozens of packages that I see at some companies. The same goes for voice: there might be a basic line and a line with unlimited long distance.

There are a number of reasons to keep it simple:

Customer service. It is important that all of your employees, from top to bottom in the company, know your products. To some extent every employee in your company is a salesperson when they talk to the general public at or away from work. The basic triple play products are the core of what most carriers sell for a living, and if your employees don’t know what you sell then they can’t talk about your products to the public. As an example, every employee at your company ought to be able to instantly quote the latest prices and speeds for your Internet data product. This is an easy challenge to test – go out today and ask the next few employees you see if they can cite the speeds and prices of your basic residential and business data products. I would venture to say that most companies are going to fail this simple test.

Let’s face it – the success of your business depends on being able to tell customers a convincing story of why your product is a better deal than the competition. For data products that difference is going to boil down to speed and price. Sales don’t just happen on the customer service lines; the opportunity is there every time one of your technicians is fixing something or an employee is standing in line at a grocery store. So make the products simple and make sure your employees can all cite your products and prices.

Sales and marketing. It’s much easier to market a simple product line. If you can summarize your pricing with a minimum of copy then you can spend your marketing efforts on talking about the benefits of your products and how you are a better deal than the competition.

And it’s certainly a lot easier to take an order from a customer when you don’t have to explain a ton of options. I can’t imagine the effort that is required in a company with dozens of data options when it is time to explain the product to a new customer or to discuss upgrading to an existing customer. Keeping it simple makes the whole sales process easier.

A simple product line also makes it a lot easier to build a customer portal so that customers can change products on their own. I just wrote last week how I recently went to AT&T wireless to change my voice plan and I was a bit overwhelmed by the number of options I had. I’m in the business and if I felt that choosing an option was a lot of work I wonder how somebody unfamiliar with the products in our industry must face these kinds of choices.

Provisioning. Whether you provision manually or have software that allows you to automatically provision products, having a simple product line is going to cut down on errors in provisioning. I talk to employees at carriers all of the time and a common problem I hear is that customers don’t get the products they thought they were signing up for. And when that happens you have started out on a sour note with a customer. With a simple product line, provisioning becomes a lot simpler because there are only a few options that customers can buy.

I do have a number of clients who have simple product lines. But even with those companies I will often see things like a phone product priced at $18.62 and it makes me wonder why it’s not priced at $18.99 or $18.49 or some number that everybody can remember. If you want your own folks to remember the prices, keep them simple as well.

Some companies seem to get this. I look at Google in Kansas City and their product line is downright sparse. They literally only have a tiny handful of products. I have written about them before and I think they have taken simplicity too far. But it’s easy to understand how much easier this has made their launch considering that they are new to the business.

So take a look at your product list with an eye to see if it’s simple and easy to understand. Or better yet, get some people outside of your staff to look at it. If the general public gets your products then you probably have it right.

Customer Portal

I have talked in other blog posts about how I believe that the successful residential service provider of the future is going to have a choice to make between being what I call a dumb pipe provider or a full service provider. And there are merits to both approaches.

But should you elect to take the service provider approach you will be selling many smaller and niche products to your customers instead of the handful of major products you sell today. It may be a decade until voice and cable TV become 100% commoditized, but every year there will be fewer and fewer customers buying those traditional products.

One of the tools that service providers are going to need for selling multiple services to customers is a customer portal. This is a website that allows customers to see a menu of what is available to them. Last week I wrote a blog entry about upselling your current products to your customers as a way to immediately affect the bottom line, and a well-designed portal is a great tool for enabling that process.

Here is what I envision as the perfect customer portal:

  • The ability for a customer to see what services they are already buying today.
  • An easy-to-use menu that shows what else is available, categorized to make it easy for a customer to browse your products.
  • Product descriptions that explain the benefits of each available product.
  • Ideally, a video or demo for more complex products showing how they work.
  • The ability to offer sales specials as a customer browses to entice them to try the product.
  • A tie-in to your provisioning system so that the customer can buy, or even just try the product as they shop.
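As a rough illustration of the list above, here is a minimal sketch of the data model that might sit behind such a portal. All of the product names, prices, and function names are hypothetical; a real portal would sit on top of your actual billing and provisioning systems:

```python
from dataclasses import dataclass, field

# Hypothetical catalog entries for a minimal customer portal.
@dataclass
class Product:
    code: str
    name: str
    price: float          # simple flat monthly price
    description: str
    demo_url: str = ""    # optional video or demo link
    promo: str = ""       # optional sales special shown while browsing

@dataclass
class Customer:
    name: str
    subscribed: set = field(default_factory=set)  # product codes

CATALOG = {
    "VOICE": Product("VOICE", "Home Phone", 18.99, "Unlimited local calling"),
    "BB50":  Product("BB50", "50 Mbps Internet", 49.99, "Broadband for the family"),
    "UM":    Product("UM", "Unified Messaging", 4.99, "Voicemail to email",
                     promo="First month free"),
}

def portal_view(customer: Customer) -> dict:
    """Split the catalog into what the customer already buys and what they could add."""
    current = [CATALOG[c] for c in sorted(customer.subscribed) if c in CATALOG]
    available = [p for c, p in sorted(CATALOG.items()) if c not in customer.subscribed]
    return {"current": current, "available": available}

def order(customer: Customer, code: str) -> None:
    """Hand the order to provisioning; here we just record the subscription."""
    if code not in CATALOG:
        raise ValueError(f"unknown product {code}")
    customer.subscribed.add(code)

cust = Customer("Jane Farmer", {"VOICE"})
print([p.name for p in portal_view(cust)["available"]])  # products Jane could add
order(cust, "UM")
```

The point of the sketch is the `portal_view` split: the customer always sees both what they have today and what else is available, and ordering feeds straight into provisioning so no person has to re-key the request.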

There are a number of customer portals in the telecom world today and I have yet to see one that works in this ideal way. Just last week I went in and changed several things on my AT&T Wireless bill. I found a lower-cost voice package and the portal let me easily change plans. But in doing so it deleted my text messaging plan and decided I wanted to pay 25 cents per text message. That took a call to fix. And I wanted to delete a feature that gave me lower-cost international calling, and that also took a phone call to fix. There is nobody bigger than AT&T and they don’t have their portal figured out correctly. But what they did have was a lot better than nothing, because it enabled me to familiarize myself with their various plans so I could decide what I wanted without having to involve a person in that process. I was glad to have the portal; I just wished I didn’t have to make two phone calls to finally complete what I wanted to change.

Some of the better portals I have seen are from the major cable companies. They often offer so many different programming packages that having them all explained on a portal is a great way for a customer to shop without tying up a customer service representative. But from what I can see, none of them yet give customers the ability to change products without talking to a live person before it is finished.

I think a lot of companies hesitate to build a portal because they don’t want to commit the resources needed to build the ideal one. But there is no reason to wait since even the largest carriers haven’t perfected the customer portal yet. There is nothing stopping you from starting your portal now to let your customers see the wide range of your existing products. Every one of my clients has a number of products that they barely sell. I believe that there are a lot more customers who would buy products like unified messaging if they understood what it could do for them and if they knew that you offered it. Think of building a portal as a way of communicating with your customers.

If you are going to start a portal or improve an existing one you should consider including some of the following functions:

  • Let customers check their bill on-line.
  • Let customers make a credit card or bank debit payment.
  • Let customers change product parameters like their Internet bandwidth.
  • Make it easy for customers to order Pay-per-view events.
  • Let customers place a tentative order even if that just prompts you to call them back.

So I recommend that you create a portal today that does some of these functions. There is probably not going to be some magic program available that is going to let you create the perfect customer portal all at once. Rather, this is likely to be an ongoing process. Because of that, do what you can for now, but do so in such a way that you are prepared to evolve your portal into a powerful tool for you and your customers.

Finally, I would note that there is an additional set of functions that are sometimes referred to as a customer portal. On smart switches you can build a web interface so that customers with advanced voice features can maintain the settings for those products. While this is certainly a portal function, this is more of an operational function and not a marketing function.

Is Wireless a Substitute for Wireline?

A cell phone tower in Palatine, Illinois.

Last week in GN Docket 13-5 the FCC issued an update that asked additional questions about its planned transition of the historic TDM telephone network to an all-IP network. The docket asked for comments on several topics, such as a trial for transitioning the TDM telephone network to all-IP, a trial for moving to enhanced 911, and making sure that a switch to IP would not adversely affect the nationwide telephone databases.

But the docket also asks for comments on whether the FCC should grant telephone companies the right to substitute wireless phones for wireline phones and abandon their copper networks. The docket mentioned two companies that want to do this. Verizon said they intend to put wireless on Fire Island off New York City as they rebuild it from the devastation of Hurricane Sandy. And AT&T has told the FCC that they are going to request permission to replace “millions of current wireline customers, mostly in rural areas, with a wireless-only product”.

Let me explain what this means. There are now traditional-looking telephone sets that include a cellular receiver. To replace a wireline phone, the telephone company would cut the copper wires and put one of these cellular handsets in place of your existing phones. They would not be making every family member get a cell phone; there would still be a telephone in the house, but it would work over the cellular network.

This makes good sense to me for Fire Island. It is mostly a summer resort and there are not many residents there in the winter. It’s a relatively small place and with one or two cell phone towers the whole island could have very good coverage. And if the cell phone tower is upgraded to 4G there would be pretty decent Internet speeds available, certainly much faster than DSL. One would have to also believe that the vast majority of visitors to the island bring along a cell phone when they visit and that there is not a giant demand for fixed phones any longer.

It is AT&T’s intentions, though, that bother me a lot. AT&T wants to go into the rural areas it serves and cut the copper and instead put in these same cellular-based phones. This is an entirely different situation than Fire Island.

Anybody who has spent time in rural areas as I do knows the frustration of walking around trying to find one bar of cellular service to make or receive a call. Cell phone coverage is so good today in urban areas that one forgets that this is not true in many places. I have a client, a consortium of the towns and rural areas of Sibley and Renville Counties in Minnesota. Let me talk about my experience working with them as an example of why this is a bad idea.

My primary contact works in the small town of Winthrop. I have AT&T cellular service and when I visit him my cellphone basically will not work. I sometimes can move around and find one bar and get a call through, but I can’t coax the phone to get a data connection so that I can check email. And if you go west from Winthrop the coverage gets even worse. AT&T’s coverage maps show that they serve this area, but they really don’t. There are places in the east end of Sibley County that have decent coverage. But there are also plenty of farms where you can get coverage outdoors, but you can’t get coverage in the house.

The traditional cellular network was not built to serve people, but rather cars. Cell phone coverage is so ubiquitous now that we already forget that cellular minutes used to be very expensive, particularly when you roamed away from your home area. The cell phone network was mostly built along roads to take advantage of that roaming revenue stream. If you happen to live near a tower you have pretty decent coverage. But you only need to go a few miles off the main highway to find zero bars.

And I use the Renville / Sibley County client as an example for a second reason. The people there want fiber – badly. They have been working on a plan for several years to get fiber to everybody in the area. The area is a typical farming community with small hub towns surrounded by farms. The towns have older cable systems and DSL and get broadband, although much slower than what is available in the Twin Cities an hour to the east. But you don’t have to go very far outside of a town to get to where there is no broadband. Many people have tried satellite and found it too expensive and too slow. There are many homes still using dial-up, and it is not nearly as good as the dial-up most of you probably remember. This is dial-up delivered to farms over old, long copper pairs, and it is being used to access an Internet that has migrated to video and heavy graphics. Dial-up is practically useless for anything other than reading email, and only as long as you don’t send or receive attachments.

Over 60% of the people in the rural areas of Renville and Sibley Counties have signed pledge cards to say that they would take service if fiber can be built to them. One would expect this to translate to at least a 70% penetration if fiber is built. They refer to the project locally as fiber-to-the-farm. A cooperative has been formed to look at ways to get the fiber financed. And any financing is going to require local equity, meaning the people in the counties are going to have to invest millions of their own dollars in the project – and they are certain they can raise that money. That is how much they want the fiber. And the same thing is true in rural areas all over the country. Most of rural America has been left behind and does not have the same access to the Internet that the rest of us take for granted.

AT&T’s idea is only going to work if they make a big investment in new rural cell towers. The current cell phone network in rural areas is not designed to do what they are proposing, even for delivering voice. And even if the existing rural cell phone towers are upgraded to 3G or 4G data (which almost none have been), most people live too far from the existing towers to get any practical use from cellular data. Cellular data speeds are a function of how close one is to the tower and, just like with DSL, the speeds drop off quickly as you get away from the hub.

I hope rural America notices this action at the FCC and files comments. Because as crappy as the rural copper wires are today, when the wireline network disappears many rural households are going to find themselves without telephone service. And forget about fast rural data. The AT&T plan is really just a plan for them to abandon and stop investing in rural communities.

Current Access Disputes

We are seeing more access charge disputes today than we have ever seen. For those who don’t know about access charges, they are the fees that an interexchange carrier (IXC, or long distance carrier) pays for accessing a local network. Most of the fees are quite minuscule, at fractions of a penny per minute, but since there are still a lot of long distance minutes they add up to substantial payments from long distance carriers to LECs and CLECs.
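To put some rough numbers on how fractions of a penny add up, here is a back-of-the-envelope calculation. The rate and minute volume are made-up illustrative figures, not actual tariff rates:

```python
# Illustrative only: hypothetical rate and traffic volume, not tariff figures.
rate_per_minute = 0.005         # half a cent per access minute
minutes_per_month = 2_000_000   # long distance minutes touching a small LEC's network

monthly_access_revenue = rate_per_minute * minutes_per_month
print(f"${monthly_access_revenue:,.2f} per month")      # $10,000.00 per month
print(f"${monthly_access_revenue * 12:,.2f} per year")  # $120,000.00 per year
```

Even at half a cent per minute, a small telco can have six figures of annual access revenue at stake, which is why these disputes matter so much.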

It seems that a number of IXCs have recently adopted a policy of disputing access charges in the hopes of getting out of paying what they should pay. They know that some local telcos won’t dispute their claims even if the dispute is wrong. They also know that the dispute process can be painful and they hope to wear telcos down into making compromises just to get paid something. In my view some IXCs are being bad citizens in that they know they can strong-arm smaller telcos into accepting less than they should be paid.

Over the last year, the following are the sorts of disputes we have been seeing:

  • IXCs are demanding a fully verifiable access bill. By that I mean that they expect every fact on the access bill to be correct. In the telephone industry there are several industry databases and the IXCs want every fact on the bill to match the information in these databases. This includes a lot of different facts such as the names of switching offices (CLLI codes), mileages, billing percent splits between various carriers, the company that should be billing (OCNs), and so on. There is nothing wrong with expecting the bills to be verifiable. But over time small errors creep into these databases as companies make changes to their networks. In the past the IXCs would see these kinds of issues as clerical rather than substantive, and they would often point them out and ask the carrier to fix them. But today the more aggressive carriers are refusing to pay bills until such problems are fixed.
  • NECA LATA issue. The NECA tariff, which most small telephone companies still use for their interstate tariff, has a prohibition in it that says a telco cannot carry its traffic to a tandem in a different LATA. This prohibition dates from 1984, when the RBOCs were all part of NECA for a few years. Judge Greene, in the order that divested the RBOCs from AT&T, prohibited the RBOCs from carrying voice traffic to another part of the country; that was left to the IXCs, which at the time mostly meant AT&T. However, when the RBOCs left NECA nobody changed the language in the NECA tariff, and so the prohibition is still there. There is no external law or rule that prohibits smaller telcos from carrying traffic to another LATA. Unfortunately, the language in a tariff overrides any industry rules, so if you use the NECA tariff and your tandem is in a different LATA your access bill can be successfully disputed. The only real fix is for NECA to change its tariff or for you to use a different tariff.
  • Traffic and mileage pumping. Last year the FCC banned traffic and mileage pumping. Traffic pumping is when a carrier generates bogus traffic simply for the purposes of generating access charges. Mileage pumping is when a carrier rearranges their network to bill extra miles of transport for the purposes of billing more access. Since that ruling I have seen a number of disputes that accused telcos of one of these types of pumping, but in each case the accusation was not true. Since traffic pumping is now a bad word, I believe the IXCs are trying to scare telcos into settling rather than taking a claim of traffic pumping to a regulatory body. If you are accused of this please talk to us, because the chances are high that you are not in violation of this prohibition.

All of these issues can be a problem for a telco since the IXCs are in the driver’s seat. They can withhold payments for access which gives them the upper hand in a dispute. They know it is a costly process for telcos to appeal an access dispute to the next level, which is normally done by filing a complaint at the state Commission. I don’t mean to sound cynical, but I think there are ruthless people in the access departments of some IXCs that are getting bonuses for reducing access payments by any means they can find. Even scarier, there is now a whole industry of access consultants who get paid a percentage of any savings they can find in access bills. Such consultants are highly motivated to use any tactic in the book to get a payday.

And so my warning to LECs and CLECs is to get your access bills into the best shape they can be. Do a careful reconciliation of your access bills against your actual network and the industry databases (the LERG and Tariff 4). Eliminate any easy reason for the IXCs to single you out, because fighting your way out of access disputes can be costly and time-consuming. CCG has done hundreds of access charge reviews, so don’t hesitate to call us if you want to do this and need help.
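Mechanically, the first pass of such a review is just a field-by-field comparison of billed facts against the reference data. Here is a small sketch of that idea; the field names and records are invented for illustration, and an actual review compares your bills against the LERG and Tariff 4:

```python
# Invented sample data: billed facts for two hypothetical end offices,
# and the corresponding reference-database entries keyed by CLLI code.
bill_records = [
    {"clli": "WNTHMNXA", "ocn": "1234", "miles": 12},
    {"clli": "GAYLMNXA", "ocn": "1234", "miles": 30},
]
reference_db = {
    "WNTHMNXA": {"ocn": "1234", "miles": 12},
    "GAYLMNXA": {"ocn": "1234", "miles": 27},  # database shows fewer miles
}

def find_mismatches(bills, database):
    """Return (clli, field, billed_value, database_value) for every discrepancy."""
    problems = []
    for rec in bills:
        ref = database.get(rec["clli"])
        if ref is None:
            problems.append((rec["clli"], "missing from database", None, None))
            continue
        for fld in ("ocn", "miles"):
            if rec[fld] != ref[fld]:
                problems.append((rec["clli"], fld, rec[fld], ref[fld]))
    return problems

print(find_mismatches(bill_records, reference_db))
# [('GAYLMNXA', 'miles', 30, 27)]
```

Every discrepancy this kind of check turns up is exactly the sort of item an aggressive IXC will use to dispute the whole bill, so it pays to find and fix them first.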

Who is Going to Pay for the IP Network?

Peninsular Telephone Company

Peninsular Telephone Company (Photo credit: Nick Suan)

Small telcos and most CLECs are waiting to see what will come from the changes due to converting to an all IP network for telephony. Today the telephony voice network utilizes TDM (time division multiplexing) technology that was originally developed for copper but that has been upgraded to use fiber. But the FCC has said that this old network is going to have to be upgraded to all-IP, meaning that voice will be carried by Ethernet similar to the way that data is transmitted.

I don’t think anybody disputes that this kind of shift makes sense. IP trunking is far more efficient in terms of carrying more calls in the same amount of bandwidth. And a lot of companies have already implemented some IP trunking.

The important issue for small telcos and CLECs is how this transition is going to change their costs. In order to understand the possible change, let’s look at how voice traffic gets to and from small telcos and CLECs today.

  • Independent telephone companies connect with larger companies and neighboring companies by physical interconnection at mutual meetpoints. Historically, most of the meetpoints are located at the physical border between two neighboring telephone companies with each company owning the fiber and electronics in their own territory. And each telco is responsible for the costs of their portion of the network. Historically local calls have been exchanged for free in both directions and there are access charges in place for all telcos to get paid by the long distance carriers for using their network and facilities for long distance calls.
  • The rules governing CLECs were established by the Telecommunications Act of 1996. That Act laid forth the basic rule that a CLEC can interconnect with a telco network at any technically feasible point. This idea was fought hard by the large telcos, who wanted CLECs to bring traffic to their tandems (regional hub offices). Once a CLEC has established a meetpoint, it works pretty much the same as normal telco interconnection in that both parties are responsible for costs on their side of the interconnection. Sometimes local calls are exchanged for a fee and sometimes they are free (called bill and keep); this is negotiated. The CLECs also bill access charges for carrying long distance calls.

There are a number of ways that IP trunking could be implemented, and each of them has financial consequences for small telcos and CLECs:

  • The IP network could be built to mimic the current PSTN. The routes would be roughly the same but the rules of interconnection would stay the same. But with IP trunking the network would be more efficient.
  • The large telcos could establish regional hubs and expect everybody else to somehow get their traffic to those locations. This would be a radical change for small telcos who would have to build or lease fiber from their rural location to the nearest regional hub. For CLECs this would completely undo the rules established by the Telecommunications Act of 1996 and would put all of the cost to get to the hubs onto them.
  • In the most extreme IP network there would be only a few large hubs to cover the whole US. This would be the most efficient in terms of the hubs, but it would require all telcos and CLECs to spend a lot of money to get their voice traffic to and from the hub.
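To see why the hub question matters financially, here is a toy calculation comparing monthly transport costs under the three architectures. The distances and the per-mile lease rate are assumptions chosen only to illustrate how cost shifts onto the rural telco as hubs move farther away:

```python
# Hypothetical distances and a made-up fiber lease rate, purely to illustrate
# how each interconnection architecture shifts transport cost onto the telco.
lease_per_mile_month = 100.0  # assumed monthly lease cost per fiber mile

scenarios = {
    "mimic the current PSTN (meetpoint at the border)": 5,    # miles of transport
    "regional hub": 75,
    "a few national hubs": 400,
}

costs = {name: miles * lease_per_mile_month for name, miles in scenarios.items()}
for name, cost in costs.items():
    print(f"{name}: ${cost:,.0f} per month in transport")
```

Under these assumed numbers the same telco goes from a few hundred dollars a month of transport cost to tens of thousands, which is the whole financial stakes of the hub debate in miniature.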

Since I have been working in the industry the RBOCs (now AT&T and Verizon) have tried several times to put the burden and the cost of transporting calls onto the small telcos. But regulators have always stepped in to stop this because they realize that it would greatly jack up the cost of doing business in rural areas. I certainly hope that as we move to a more efficient network that we don’t end up breaking a system that is working well.

The downside to any plan that shifts cost to small telcos is that the cost of providing local and long distance service will increase in rural areas. The consequence of changing the CLEC rules will be less competition. The current interconnection and compensation rules have served the country well. Every caller benefits by having affordable rates to call to and from rural areas. And there is no doubt that higher communications cost would be a major hindrance to creating and keeping jobs in rural areas.