Categories
The Industry What Customers Want

Trusting Our Digital Universe

I was thinking about the Volkswagen cheating scandal, where they had a computer chip change the emissions of their cars during testing. It got me thinking about how customers trust or don’t trust businesses. Volkswagen not only lied to regulators about their cars having low emissions, but they went and made that claim the centerpiece of their advertising campaign.

What made the Volkswagen scandal worse for the rest of us is that they cheated using software. Pretty much everything we do in the telecom industry these days involves software. The Volkswagen scandal, along with many others, might eventually make the public untrusting of everything that includes software.

There are already examples of telecom companies that have violated their customers’ trust. For example, Comcast has turned everybody’s WiFi router into a dual-purpose router that can serve people outside of your house. Comcast very quietly told the public about this once, but if I weren’t in the industry I probably wouldn’t have noticed the change, and I’m sure the average household has no idea that Comcast is using their routers that way. Security experts everywhere warn about how dangerous it is to let the public into your WiFi router.

I paused when considering buying a smart TV after it was revealed that Samsung TVs had the ability to watch whatever happens in front of them and to hear everything within earshot. Our PCs have had that same weakness ever since they started building cameras into every monitor.

And a lot of people now mistrust their ISPs who have been funneling all of their data to the NSA. Of course, your ISP already knows everything you do online anyway and there is no telling what some of them might be doing with the information.

We are about to enter an age where people are going to be filling their homes up with many more smart devices. We’ll obviously buy them because they will make our lives easier or more fun, but every one of these devices that is hooked to a network could end up being used to spy on us. You have to know that at least some of the makers of IoT devices will try to spy on us since there is a lot of money in selling data about us all.

I’m not quite sure how we as a society deal with this issue because we have entered uncharted legal waters. Almost all of our product liability laws concentrate on the mechanical nature of the things we buy. In the case of Volkswagen, the mechanical parts of their cars worked just fine; the fault was in software that had been deliberately manipulated to lie about the performance of the cars. It’s hard to think that anybody except the most technically sophisticated people will have any way to know if our devices are doing things we don’t want them to do. Once they get hooked up to a network, their software can spy on us in devious ways that are as hard to detect as the Volkswagen software.

Telecom companies have a particularly important obligation to the public. As the ISPs that most directly serve people, we must earn and keep their trust. This is why I am particularly dismayed to see the big carriers like AT&T and Verizon so willingly handing over customer data to the NSA. If the law makes a telecom company do something then they must obviously comply. But these companies chased the big bucks from the NSA as if it were just another customer and sold out everybody else who sends them a monthly check. And sadly, since AT&T controls a lot of the Internet hubs, the data from all of the little ISPs was handed over as well, without the consent or knowledge of the smaller companies or their customers.

I fully expect that some day we’ll have a terrible scandal or tragedy involving the ability of our new IoT devices to spy on us. And when that happens there might well be a backlash, with people ripping the devices out and abandoning them. The whole industry needs to realize that a few bad events can spoil the market for everyone, and so it’s my hope that companies that abuse the public trust get exposed by those who do not. Unfortunately, we don’t have a lot of history of that happening.

Categories
Regulation - What is it Good For?

Barriers to Broadband Deployment on Federal Lands

CenturyLink’s VP of Regulatory Affairs, Jeb Benedict, testified recently before the House Energy and Commerce subcommittee that there are a number of barriers to rural broadband deployment when fiber needs to pass through federal or tribal land. He said that CenturyLink would support legislation that would do the following:

  • Require that federal agencies give priority to rights-of-way applications and establish time frames in which they must respond to applications to build broadband.
  • Make it easier to put fiber into areas that were previously disturbed like the shoulder of a roadway.
  • Minimize or even eliminate permitting fees and leases for rights-of-way for fiber projects.
  • Require federal agencies to work together when necessary for fiber projects.

CenturyLink is right about all of these items, and I’ve seen projects get bogged down over these issues many times. For example, the process and paperwork required to build fiber through federal park land can be grueling and time consuming. There are different processes to follow for different kinds of federal land, so the requirements differ depending upon whether the land is a national park, a national forest, or just general federal land. And there are often numerous barriers to getting fiber through tribal lands as well.

What I’ve always found mystifying is that each new application to build on park land is treated as if it were the first time telecom had ever been built there. It’s no easier to put fiber where copper has already been run; you still have to start from scratch. What is particularly frustrating is that, as Mr. Benedict pointed out, a lot of hoops have to be jumped through to build in rights-of-way where the dirt was already disturbed when the road was constructed. There are often environmental and archaeological studies required to bury conduit in rights-of-way that were clearly fully excavated in the past when the road was built.

National parks are the hardest places to build. I have a client who found grant money to bring wireless service to the Channel Islands off the southern California coast, much of which falls within a national park. The area had cellular coverage in the past, but the carriers were removing their cell towers, which meant the islands would be cut off from communications. The park wanted basic services like the ability for park visitors to call 911, and wanted data for the park rangers and the few others who still live on the islands.

But the barriers to building there were so stringent that the project could never be made to work. The park wouldn’t allow the construction of any new buildings or enclosures of any kind. They would not allow any dirt on the islands to be disturbed, meaning no digging of any kind. And there were incredibly expensive environmental studies required, which as I recall cost $150,000. Even though the people who worked at the park wanted the new wireless service, and even though there would have been great public benefit, the National Park Service rules basically made it impossible to install telecom gear.

And I have similar stories from all over the country. Trying to get fiber through national forests is almost as hard as through national parks. Applications to build can be delayed seemingly forever. There are usually environmental studies required even to build in existing rights-of-way along existing roads, and there are numerous rules about how and when construction can be done. I’ve seen companies route fiber many miles out of the best path just to avoid the hassle of building through federal land.

The problem is that these federal lands are often surrounded by rural communities that badly need broadband. But it’s hard to build fiber, cellular towers, or any other kind of infrastructure if the parkland creates a barrier to reaching those areas.

It’s not just parklands that are a problem. Just trying to build under an interstate highway overpass or across a federal bridge can also be a very slow process. And those are found everywhere. As CenturyLink points out, there is no requirement that the agencies involved look at such requests in a timely manner. Sometimes such requests get processed quickly and sometimes they languish for a very long time.

If the federal government really wants to promote more rural fiber then they need to eliminate the barriers that they have created for their own lands, highways, and bridges.

Categories
The Industry

The Battle for Eyeballs

There is an interesting aspect of the web that happens behind the scenes and that doesn’t get a lot of press: the tracking and maximizing of web views on social media sites like Facebook and Twitter. Large content providers like the Huffington Post, BuzzFeed, and the New York Times very closely monitor how many shares they get on the various sites. The reason that shares matter is that the more eyeballs they get to look at their pages, the more they make from advertising. It’s easy to forget that advertising drives the web, but to these companies advertising is the major, and in some cases the only, source of revenue.

Following is a list from NewsWhip showing the 10 largest content providers, based on Facebook shares, for August 2015. Some of these are familiar names, but some post content under various names that a Facebook reader would more likely recognize.

Content providers are currently in a bit of a panic because the largest social media sites are working very hard to keep eyeballs on their own pages. When somebody clicks on a web article on Facebook they are sent away from Facebook and they often don’t return. Social media sites know that keeping eyeballs on their site increases their own ad revenues.

Twitter recently launched Moments, a space for content that stays inside the Twitter platform. Twitter directly creates content for Moments and has also invited partners to write and create content inside the Twitter platform. Facebook has been doing similar things through its Trending Topics pages that lead you to content within Facebook. They are also looking at a more aggressive platform they are calling Notify. LinkedIn probably started the trend and has enlisted heavy hitters from various industries to write content directly inside their site.

It’s a tough time to be a content creator. Content creators are already seeing a downward trend in revenue due to ad blockers. It will be that much harder to make money if they also have to compete with the social media sites directly for content. After all, the social media sites know a lot more about what each of us is interested in, and companies like Facebook can use that knowledge to entice us to view the content they think will interest us.

The content creators have a real concern. For example, the Huffington Post has lost about 2 million Facebook shares per month over the course of this year. The issue matters to web users, because it is the content creators that make the web worth visiting. I personally use Twitter as a way to find articles about various tech industries and I am not that much interested in personal tweets by the people I follow. I am sure that many other people use these platforms the same way – as a way to follow topics they are interested in. But whenever large sums of money are involved somebody is always going to be scheming to capture market share, and the tug of war for advertising eyeballs is in full force.

Categories
The Industry

The Disappearing Web

Someday you are going to click on the link to today’s blog and it will no longer be on the web. Let’s hope that it’s because I am retired and have stopped paying my annual fees to keep this blog on WordPress. But it also might be for another reason – that WordPress is sold, goes bankrupt, or just decides to get out of the web business.

We like to think of the web as a giant repository that is recording and storing everything that we are doing in this century – but nothing could be further from the truth. The vast majority of content on the web is going to disappear, and a lot sooner than you might imagine. There is very little of today’s content that will still be around even fifty years from now, and most of it will disappear long before that.

And this is because somebody has to spend money to put most content onto the web and keep it there. In the case of this blog I would have to keep paying WordPress. A lot of web content sits on private servers and is not dependent upon a larger company like WordPress to keep going. But somebody has to pay for the bandwidth to connect those servers, and to replace them and migrate the content somewhere else when the servers inevitably wear out and die.

I don’t know much about the company behind WordPress, but what is the likelihood that they will still be in business fifty years from now even if I somehow paid them to maintain this blog forever? I would think that over the next fifty years that most of today’s big web companies will be gone. It’s hard to think that even the largest content repositories like Facebook will last that long. In the fast moving world of the web, fifty years is forever and companies will be supplanted by something new as tastes and trends change.

And even should the platform that has your content survive for fifty years, what are the chances that the coding underlying your content will still be supported fifty years from now? In the short history of the web we have already obsoleted much of the earliest content due to its format.

Web content already disappears a lot faster than people might believe. I’ve seen several sources that suggest the average life of a web page is 100 days. And links die regularly; around 8% of links die every year for one of the many reasons I’ve mentioned.
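Putting those two numbers together gives a sense of scale. Here is a back-of-the-envelope sketch; treating the 8% figure cited above as a constant, compounding annual rate is my own simplifying assumption:

```python
# Back-of-the-envelope link-rot estimate, assuming links die independently
# at a constant 8% per year (the rate cited above).
annual_death_rate = 0.08

for years in (10, 25, 50):
    surviving = (1 - annual_death_rate) ** years
    print(f"After {years} years, roughly {surviving:.0%} of today's links would still work")
```

At that rate only a percent or two of today’s links would still resolve fifty years from now, and that ignores pages that disappear for other reasons.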

What is sad about all of this is that a lot of the content on the web doesn’t exist anywhere else. There are many blogs and news websites that are the main chroniclers of our times that don’t exist in any other format. It’s certainly possible that future historians will look back on this time as a big black hole of historical data.

Even should content be stored somewhere off the web, there really is no off-line electronic storage medium today that lasts very long. There are a few storage technologies that have the possibility of lasting longer, but there is very little web content that people value enough to convert into a long-term off-line format. And even if you bother to archive content, being able to read anything electronic years from now is likely to be a puzzle. No matter the technology used to store your content, that technology will be made obsolete by something better. It’s already getting hard to find somebody capable of reading content from as recently as twenty years ago.

A few years ago I read the correspondence between John and Abigail Adams. That correspondence provided a great peek into what it was like to be alive then. As a whole we are even more prolific today than people a few hundred years ago. People blog, email, and tweet in great volumes. But I find it a bit sad that nobody in the future is likely to be able to read this blog – because, gosh darnit, this is good stuff worthy of the ages!


Categories
The Industry What Customers Want

The Cherry Picking Dilemma

I ran across an article written by somebody in Provo, Utah who claims that the penetration rate for paying data customers on the Google network has fallen to around 20%, from a penetration rate of 30% back when the network was operated by the city. I have no idea if the 20% figure is accurate, but it is not surprising since Google also offers 5 Mbps service for free in the city as part of the deal for buying the network from the city. I’m sure that a lot of households and students are taking the free option.

But the article did prompt me to think about cherry picking – the phenomenon where telecom carriers tend to mostly pursue customers who spend the most money. This topic is of particular interest when talking about Provo because the network that Google now operates was once an open access network. And I think the pre- and post-Google situations are worth comparing.

Back when the city ran the network it was operated on an open access basis, as required by Utah law. This meant that the city was prohibited from being an ISP, but it could sell access on the fiber network to other last-mile service providers. Provo sold lit fiber loops on the network for roughly $30 per month. ISPs using the network were then free to sell any services that a customer wanted to buy.

An open access network leads to a form of cherry picking in that no ISP is going to buy a $30 fiber loop and then offer an inexpensive standalone data product. There is just not enough profit in selling a standalone $40 or $45 data product in that situation. Instead, an ISP on an open access network will either price standalone data high or bundle it with lots of other services. You can contrast this with Qwest, which would have competed against iProvo by selling low-priced DSL. I am sure that Qwest had data products in the $30 per month range. They would have been much slower than the iProvo fiber but would have been attractive to the budget-minded customer.

And then consider Google, which is definitely a cherry picker. They sell a gigabit of data for $70 per month. There are very few markets where a significant percentage of households are going to find that affordable, regardless of how attractive they might find the speed. I don’t know what Google’s target penetration rate is, but they can’t be shooting for the same overall penetration rate that a cable company can, since the cable companies have a full range of products from slow to fast and from cheap to expensive.

I work with hundreds of ISPs, and the one thing I have consistently seen in every market across the country is that when customers have a choice between a low-priced and a high-priced data product, the vast majority of them will take the lowest-priced product that gives them a speed they can live with. Cable companies don’t expect more than a few percent of households to buy their fastest and most expensive data product.

And so, even if the author of the article is right, I’m not sure that this is a negative thing for Google. If the city was selling broadband to 30% of households in an open access environment, then one has to imagine this represented a broad range of products at different prices and speeds. There would have been no really cheap products due to the $30 monthly loop rate, but there still was probably a range of packages between $50 and $200 with various combinations of data, video, and voice.

If Google has been able to get 20% of the people in Provo to pony up $70 per month for broadband they might be very happy with the results. They bought the network for $1, but obviously had to make some capital investments to get the network capable of gigabit everywhere. I see nothing automatically distressing about a 20% penetration rate of a very high margin product.
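One rough way to see why the lower penetration rate isn’t automatically bad news is to compare revenue per household passed under the two models. This is only a sketch built from the figures above; it ignores the video and voice revenue the open access ISPs kept for themselves and assumes the free 5 Mbps users generate no revenue:

```python
# Rough revenue per household passed per month, using only the figures cited above.

# Open access era: the city collected a $30/month loop fee on roughly 30% of households.
city_revenue = 0.30 * 30

# Google era: Google keeps the full $70/month gigabit price on roughly 20% of households.
google_revenue = 0.20 * 70

print(f"City (open access): ${city_revenue:.2f} per household passed per month")
print(f"Google:             ${google_revenue:.2f} per household passed per month")
```

By that crude measure the Google model actually brings in more revenue per home passed, even at the lower penetration rate.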

There are a lot of other new ISPs hitting various markets around the country today. A few of them, like Google, are peddling gigabit as their only product, but most competitive ISPs still sell a mix of products. Whenever I talk to these companies I always caution them that, given a choice, very few people are going to buy the gigabit if there is an affordable 100 Mbps alternative. I’ve seen a number of business plans that predict a high penetration rate for the fastest data product, but I’ve seen human nature rear its head in almost every market I’ve ever worked in. Given a choice, people will save money when they can, and all of the marketing in the world won’t get them to spend more than they are comfortable with.

Categories
Current News Regulation - What is it Good For?

Congress Considering Mandate for Conduit

There is a bill making its way through Congress that ought to be of interest to the carrier community. It’s called the Broadband Conduit Deployment Act of 2015, a bipartisan bill sponsored by Reps. Anna Eshoo (D-CA) and Greg Walden (R-OR).

In a nutshell this requires that all federally funded highway construction projects include the installation of empty fiber conduits in cases where it is determined that an area has a need for broadband in the fifteen years after the construction. I have no idea who makes this determination.

There are a number of cities and counties around the country that have had this policy in place and it works, albeit slowly. People don’t realize it, but most local roads get rebuilt to some degree every thirty years, and so every year about 3% to 4% of roads in an area ought to be getting rebuilt. That number varies according to weather conditions in different parts of the country and according to how heavily a road is used. Roads that carry a lot of overweight loads wear out a lot faster. But federal interstate highways are built to a higher standard and are expected in many parts of the country to last up to forty years. And there are now some stretches of interstate highways that are fifty years old.

One has to wonder how quickly there might be any benefit from such a policy. Certainly any conduit put into urban stretches of highway would probably be grabbed up. But in a lot of places it might be a decade or more until the new conduit provides any real benefit. Once you get out of urban areas conduit is mostly used for long-haul fiber, and having a patchwork of conduits here and there isn’t going to get many carriers excited.

But over time such a system will provide benefits as more and more stretches of highway get empty conduit. The same thing has happened in the cities that have this policy. They hoped for a quick broadband benefit when they introduced this kind of ordinance, but it often takes many years until there is enough conduit available to get any fiber provider excited. The place where almost any empty conduit is of immediate interest is where it runs through neighborhoods, because saving any construction cost on the last mile matters to a fiber builder.

The law is silent on how this conduit would be made available. I’ve worked with getting rights to government-owned fiber before and it has always been difficult. The government owner of a conduit doesn’t have the same sense of urgency as a carrier who is trying to build a fiber route. If you have to wait too long to get access to conduit you’re probably better off finding a different solution.

But it’s a step in the right direction and over time this will produce benefits in some places. I also don’t know exactly what kinds of roads qualify as receiving federal highway funding assistance. Obviously all interstate highways meet that test. But I’ve sat through many city council meetings where I’ve heard that state highway projects sometimes get some federal funding assistance. If so, then this greatly expands the scope and potential of the law.

Similar bills have been bouncing around in Congress since 2006 and never passed for one reason or another. The White House is in favor of this bill as one more piece of the puzzle in promoting more broadband. The White House tried to implement an abbreviated version of this idea a few years ago through executive order, but apparently the implementation of that has been very spotty.

Like many good ideas that work their way up to Congress, this bill is probably twenty years too late. If this had been implemented at the time of the Telecommunications Act of 1996 then we would already have conduit all over the country that would provide cheaper transport. But I guess you have to start somewhere, so I hope this bill becomes law.

Categories
Regulation - What is it Good For?

Lifeline and Rural America

Earlier this year Chairman Tom Wheeler of the FCC proposed to change the Lifeline program to support broadband in addition to voice. In that proposal he suggested that a household should get at least 10 Mbps download and 1 Mbps upload in order to qualify for a Lifeline subsidy.

Here is where it gets weird. Frontier has filed comments that the 10/1 Mbps threshold is too high and that using such a high standard will stop a lot of rural households from getting Lifeline assistance. They are right, of course, but their solution is to lower the Lifeline threshold to whatever level is necessary to meet actual speeds in a given rural market.

Meanwhile, Frontier has recently taken a huge amount of money from the Connect America Fund for the purpose of raising rural DSL up to the 10/1 Mbps level. But they have six years to get to those speeds, and most of us in the industry think that even after all of the upgrades a lot of the rural households in the upgraded areas still won’t get 10/1 speeds. It’s going to be very hard for Frontier to do that with DSL in rural settings where people live on scattered farms or down long lanes. I find it unlikely that Frontier, or any of the big telcos, will put enough fiber into rural areas to actually achieve that goal.

But far more importantly, 10/1 DSL is not broadband. It’s not broadband by today’s current FCC definition that says broadband must be at least 25/3 Mbps, and it’s not broadband for real life applications.

I use my own household as the first example. There are two adults and one teenager. We work at home and we are cord cutters and get all of our video online. We have a 50 Mbps cable modem, and as cable modems tend to do, sometimes it slows down. When our speed hits 25 Mbps we’re all asking what is wrong with the Internet. So our household needs something greater than 25 Mbps for normal functioning. If we get less than that we have to cut back on something.

I have a friend with two teenage boys who are both gamers. He has a 100 Mbps Verizon FiOS connection on fiber, and when there are multiple games running everything else in the house comes to a screeching halt. For his household even 100 Mbps is not enough speed to meet his normal expected usage.

And yet here we are having discussions at the federal level about setting up two major programs that use 10/1 Mbps as the standard goal for Internet speed. As a nation we are pouring billions of dollars into a project to improve rural DSL up to a speed that is already inadequate and that, by the time the work is finally finished in six years, will be massively below standard. It won’t take very many years for the average household to need 100 Mbps, and we are instead taking six years to bring a huge portion of rural America up to 10/1 DSL.

I know that the FCC is trying to help. But it’s sad to see them crowing about having ‘fixed’ the rural broadband problem when instead they are condemning millions of households to have nearly worthless broadband for the next couple of decades. Imagine if they had instead allowed those billions of dollars to become matching funds for communities willing to invest in real broadband? Communities wanting to do this are out there and many of them were hoping to get some federal help to bring broadband to their areas. Building rural fiber is expensive, and even a little federal help would be enough to allow many rural areas to find the rest of the funding needed to build their own solutions.

And the problems are going to get worse, not better. Verizon didn’t even bother to take the federal subsidies to improve DSL because they don’t want to invest anything in rural copper. AT&T has told the FCC repeatedly that they want to tear down copper to millions of households and put rural households on cellular data. And while Frontier is going to try to make their rural copper plant better, how much can they realistically accomplish with 50–70 year-old copper that was neglected for decades before they bought it?

I just shake my head when I see that Frontier and the FCC are going to be wrangling about households getting Lifeline subsidies for speeds slower than 10/1 Mbps. The FCC has already decided that they are going to throw billions at rural copper and call it job done. It’s about time that we instead start having a conversation about bringing real broadband to rural America.

Categories
The Industry

The Next Generation of 911

I’ve started noticing news articles talking about the next generation of 911 (NG911), so it seems the public is starting to become aware that there is a big change coming in the way that 911 works. We are in the process nationwide of migrating from traditional 911 to a fully IP-based system that will include a lot of new features. When fully implemented, NG911 will allow interactive text messaging and smart call routing based on caller location, taking into account factors such as the workload at the closest 911 center, current network conditions, and the type of call. The NG911 system will enable a data stream between callers and the 911 center so that there can be an exchange of pictures, videos (including support for American Sign Language), and other kinds of data that will enhance the ability of a 911 center to do its job, such as building plans or medical information.
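To make the smart routing idea a little more concrete, here is a toy sketch of the kind of decision an NG911 system might make. This is my own simplified illustration, not the actual NENA routing logic; the data fields and the weighting are hypothetical:

```python
# A toy illustration of location-based 911 call routing that weighs center
# workload, network conditions, and call type. The data model and scoring
# are hypothetical, not the actual NG911/NENA specification.
from dataclasses import dataclass

@dataclass
class PSAP:                      # Public Safety Answering Point (a 911 center)
    name: str
    distance_km: float           # distance from the caller's reported location
    calls_in_queue: int          # current workload at the center
    network_healthy: bool        # is the ESInet path to this center up?
    handles_text: bool           # can this center take a text/video session?

def route_call(psaps, is_text_call=False):
    """Pick the best 911 center for this call."""
    candidates = [p for p in psaps
                  if p.network_healthy and (p.handles_text or not is_text_call)]
    # Prefer nearby centers, but penalize ones that are already swamped.
    return min(candidates, key=lambda p: p.distance_km + 5 * p.calls_in_queue)

centers = [
    PSAP("County A", distance_km=4, calls_in_queue=6, network_healthy=True, handles_text=True),
    PSAP("County B", distance_km=9, calls_in_queue=0, network_healthy=True, handles_text=True),
]
print(route_call(centers, is_text_call=True).name)   # County B - the closer center is overloaded
```

In this example the nearest center is swamped, so the call gets delivered to the next closest center that can handle a text session, which is exactly the kind of flexibility that today’s fixed, number-based routing can’t provide.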

NG911 will be implemented in phases and many places are already experimenting with some of the new features like text messaging. But other parts of the final IP-based 911 are still under development.

NG911 is going to replace today’s circuit-switched 911 networks, which carry only voice and a very limited amount of data. Today each carrier that handles voice calls must provide dedicated voice circuits between its network and the various 911 centers that fall within its service area. For landlines, the 911 center that any given customer reaches is predetermined based upon their telephone number.

But number-based 911 has been having problems with some kinds of calls. There are numerous examples where 911 centers were unable to locate mobile callers because they had to rely on triangulation to estimate a caller’s location. And for a number of years it’s been possible to move a VoIP phone anywhere there is a data connection, and the current 911 systems have no way to identify or locate such callers. Identifying callers is going to get even harder as we start seeing huge volumes of WiFi-based VoIP from cellphones as cellular carriers dump voice traffic onto the landline data network in the same manner they have with other data. The promise is that NG911 will be able to handle the various flavors of VoIP.

There are a lot of new standards being developed to define the operating parameters of NG911. The standards are being driven through NENA, the National Emergency Number Association. Many of these standards are now in place, but the standards keep evolving as vendors try different market solutions. A lot of the new NG911 functionality is going to be driven by the creation and use of a number of new database systems. These systems will be used to manage functions like call validation, smart routing control, and the processing of new NENA-approved call records.

The new IP-based 911 networks are being referred to as ESInets (Emergency Services IP Networks). These are managed private networks that will be used to support not only 911 but also other types of public safety communications.

The overall goal is to do a much better job responding to emergencies. Today there are far too many examples of calls to 911 that never get answered, calls that are sent to the wrong 911 center, or calls where the 911 operators can’t determine the location of the caller. Further, the new system will allow for the public to summon 911 in new ways other than through voice calls. There will be a two-way process for sending pictures and videos to the 911 center or floor plans and medical advice back to callers. When fully implemented this should be a big leap forward resulting in reduced costs due to more efficient use of our emergency resources as well as more lives saved.

Categories
What Customers Want

Google’s Experiment with Cellular Service

As I’m writing this (a week before it posts), Google opened up the ability to sign up for its Project Fi phone service for a 24-hour period. Until now the service has been by invitation only, limited I think by the availability of the Google Nexus phones. But they are launching the new Nexus 5X phone and so they opened sign-up to everyone for a day.

The concept behind the Google phone plan is simple. They sell unlimited voice and text for $20 per month and sell data at $10 per gigabyte as it’s used. The Google phone works on WiFi networks and will use either the Sprint or T-Mobile network when a caller is out of range of WiFi. And there is roaming available on other carriers when a customer is not within range of any of the preferred networks.
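To make the pricing concrete, here is a minimal sketch of what a monthly bill looks like under the plan as described above; the usage amounts are just examples:

```python
# Monthly bill under the Project Fi pricing described above:
# $20 flat for unlimited voice and text, plus $10 per GB of data used.
def project_fi_bill(gb_used: float) -> float:
    return 20.0 + 10.0 * gb_used

for gb in (0.5, 2, 5):
    print(f"{gb} GB of data -> ${project_fi_bill(gb):.2f} for the month")
```

A light data user gets out the door for around $25, while a heavy user can easily end up spending as much as a traditional cellular plan.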

Cellular usage is seamless for customers, and Google doesn’t even tell a customer which network they are using at any given time. They have developed a SIM card that can choose between as many as 10 different carriers, although today they only have deals with the two cellular carriers. The main point of the phone is that a customer doesn’t have to deal with cellular companies any longer and just deals with Google. There are no contracts and you only pay for what you use.

Google still only supports this on their own Nexus phones for now although the SIM card could be made to work in numerous other phones. Google is letting customers pay for the phones over time similar to what the other cellular carriers do.

Google is pushing the product harder in markets where it has gigabit networks. Certainly customers that live with slow or inconsistent broadband won’t want their voice calls routing first to WiFi.

The main issue I see with the product is that it is an arbitrage business plan. I define as arbitrage anything that relies on a primary resource over which the provider has no control. Over the years a lot of my clients have become very familiar with arbitrage plans that came and went at the whim of the underlying providers. For example, there have been numerous wholesale products sold through Sprint, like long distance, dial tone, and cellular plans, that some of my clients built into a business plan, only to have Sprint eventually decide to pull the plug and stop supporting the wholesale product.

I am sure Google has tied down Sprint and T-Mobile for the purchase of wholesale voice and texting for some significant amount of time. But like with any arbitrage situation, these carriers could change their mind in the future and strand both Google and all of their customers. I’m not suggesting that will happen, but I’ve seen probably a hundred arbitrage opportunities come and go in the marketplace during my career and not one of them lasted as long as promised.

It’s been rumored that Apple is considering a similar plan. If they do, then the combined market power of both Google and Apple might make it harder for the underlying carriers to change their mind. But at the end of the day only a handful of companies own the vast majority of the cellular spectrum and they are always going to be the ones calling the shots in the industry. They will continue with wholesale products that make them money and will abandon things that don’t.

There are analysts who have opined that what Google is doing is the inevitable direction of the industry and that cellular minutes will get commoditized much in the manner as long distance in the past. But I think these analysts are being naive. AT&T and Verizon are making a lot of money selling overpriced cellular plans to people. These companies have spent a lot of money for spectrum and they know how to be good monopolists. I still laugh when I think about how households that used to spend $30 to $50 per month for a landline and long distance now spend an average of $60 per family member for cellphones. These companies have done an amazing job of selling us on the value of the cellphone.

Perhaps the analysts are right and Google, maybe with some help from Apple, will create a new paradigm where the carriers have little choice but to go along and sell bulk minutes. But I just keep thinking back to all of the past arbitrage opportunities where the buyers of the service were also told that the opportunity would be permanent – and none of them were.

Categories
Regulation - What is it Good For?

Special Access Rate Investigation

There is an investigation going on at the FCC, probably long overdue, looking at special access rates. Special access rates are what telephone companies charge for TDM data circuits such as T1s.

You might think that T1s and TDM technology would be fading away, but the large telcos are still making a fortune by requiring other carriers and large businesses to interface with them using TDM circuits and then charging a lot of money for the connections. As an example, the connections between a large telco like AT&T and CLECs or long distance carriers are still likely to be made up of DS-3s (each the equivalent of 28 T1s).

There are also still a lot of businesses that use T1s. Plenty of older phone systems sitting at small businesses need a T1 interface to connect back to the phone company. And in very rural markets where there is no last-mile fiber, the telcos are still selling T1 data connections to businesses and delivering the paltry 1.544 Mbps of data that such a circuit can carry.
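For readers who don’t work with TDM circuits every day, here is a quick reference sketch of how those circuit sizes relate to one another. These are the standard digital hierarchy figures, not anything specific to the FCC proceeding:

```python
# Standard TDM digital hierarchy figures referenced in this post.
DS0_KBPS = 64                  # one voice channel
T1_MBPS = 1.544                # a T1 (DS-1): 24 DS0 voice channels plus framing
T1S_PER_DS3 = 28               # a DS-3 multiplexes 28 T1s
DS3_MBPS = 44.736              # ~45 Mbps; a bit more than 28 x 1.544 due to overhead

print(f"T1: {T1_MBPS} Mbps ({24 * DS0_KBPS} kbps of voice channels plus framing)")
print(f"DS-3: {DS3_MBPS} Mbps, carrying {T1S_PER_DS3} T1s "
      f"({T1S_PER_DS3 * T1_MBPS:.3f} Mbps of payload plus multiplexing overhead)")
```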

The main thrust of the investigation is the prices being charged. In places with no competition the telcos might still charge between $400 and $700 per month for a T1 connection. And it’s not unusual for carriers to have to pay thousands of dollars per month to interface with the large carriers at a regional tandem switch.

There was a time when the prices charged for TDM circuits were somewhat cost-based, although as somebody who did some of the cost studies behind the rates, I can tell you that every trick in the book was used to justify the highest possible rates. But in a 100% copper network there was some logic behind the charges. For example, if a business bought a T1, the phone company had to dedicate two copper pairs throughout the network for that service plus provide fairly costly electronics to supply the T1. I remember when T1s first hit the market; they were a big step forward in telco technology.

But technology has obviously moved forward and we now live in an Ethernet world. The FCC has been working on a transition of the public switched telephone network from TDM to all-IP and this investigation is part of that process.

The prices for TDM special access are all included in tariffs, and for the most part the rates have not changed in years, or even decades. Where it used to be a big deal, for example, for a telco to send a DS3 between its offices, the bandwidth of a DS3, at 45 Mbps, is barely a blip today inside the huge Ethernet data pipes the phone companies use to connect their locations. Even if the cost for a DS3 was justified at some point in time, when those same circuits are carried today inside much larger data pipes the cost of transporting the data drops immensely.

It’s good that the FCC is investigating this, but to a large degree it’s their fault that the rates are so high. It’s been decades now since either the FCC or the state regulatory commissions required cost studies for TDM circuits. And without the prodding by the regulatory agencies the telcos have all let the old rates stand in place and have been happily billing them year after year. This investigation should have been done sometime soon after the Telecommunications Act of 1996, because the rise of competitive telecom companies created a boom in special access sales, all at inflated prices.

Special access rates matter a lot to small carriers. For example, special access is one of the largest expenses for any company that wants to provide voice services. It’s not unusual for a company to spend $100,000 or more per year buying special access services even if it delivers only a tiny volume of voice traffic. As would be expected, the high costs adversely affect small carriers to a much greater extent than large carriers, which can better fill up the pipes between themselves and the large telco.

For years the telcos have hidden behind the fact that these rates are in a tariff, meaning that they are not negotiable for other carriers. But at the same time, the telcos routinely drop rates significantly when selling special access circuits in a competitive market. The high special access rates apply only to those small carriers and businesses who are too small to negotiate or who do not operate in a competitive part of the network. It’s definitely time for these rates to be brought in line with today’s costs, which are a small fraction of what the telcos are charging. It would not be shocking for the FCC to determine that special access rates are 70% to 90% too high, particularly when you consider that most of the network and electronics that support them have been fully depreciated for years.
