Is the FCC Going to Nationalize the Media?

I ran across an article that just has me shaking my head. It’s by Kurt Nimmo at a site called Alex Jones’ InfoWars. The article claims that the FCC is doing a study as a precursor to nationalizing the private media sector, including the Internet.

I certainly understand why people feel a little paranoid about the government right now due to the gigantic data gathering being done by the NSA. But it takes an incredible level of paranoia to think that the US government is somehow planning to take over every TV station, radio station, newspaper and even the Internet. Of course, one gets the first clue that this article is a bit paranoid and biased when the very first sentence uses the word ‘sovietization’.

So what the heck is this guy talking about that would give him the idea that the FCC was ready to take over the communications world? It starts with a study that the FCC is undertaking through Social Solutions International. I have included here this Research Design Study from SSI so that you can see it for yourself. This document is not the results of the FCC study, but instead is a description of the study that is currently being undertaken, to be published sometime in 2014.

My firm CCG Consulting does market surveys, so I am pretty conversant with the kind of jargon used to talk about statistical sampling and market research. And this document is massively jargon-laden; it takes some reading between the lines to figure out exactly what they are doing.

So what is this study trying to find out? They are basically after two things. First, they want to better understand where people go to get their news. The study refers to this as ‘critical information needs’, but it basically boils down to where people go to find out what is happening in the world – and that is news.

Second, the study is looking at random local markets to do a qualitative analysis of information that is made available to the public. This is the part that has Mr. Nimmo so paranoid because it is going to look at local newspapers, radio broadcasts and TV and judge them according to accuracy, fairness, bias, etc. And somehow they are going to try to do the same thing with the Internet.

But it’s a long stretch to say that the FCC is using this study as a precursor to taking over media. That is a monstrous break in logic, out of touch both with reality and with the relatively weak nature of the FCC.

So what do I think of this study? It certainly is within the purview of the FCC to periodically look at how people communicate in the country. After all, they are in charge of monitoring and regulating those very industries.

But I don’t think this particular study is going to be very effective or turn up anything of much interest. Certainly it will give us a peek at where people go to get information today. But one would have to think that companies like Google already know far more about that than this study is going to uncover.

And I hold out little hope that the qualitative analysis is going to uncover anything that will be statistically valid and have any relevance for the whole country. It would make a lot more sense to study a tiny handful of markets in complete depth over a long period of time if somebody really wants to understand the barriers and misinformation in place today in local media. Those kinds of local studies are best done by academia. This study doesn’t look to me to be as thorough and rigorous as those kinds of studies can be.

And so my expectation is that this study will generate a few headlines next year highlighting whatever claims it makes, and then it will go on the shelf. It’s not likely to have much impact on FCC policy and it certainly is not going to be the catalyst for the FCC somehow taking over the US media (I’m not sure how they would do that even if they wanted to).

But unfortunately, in the Internet age people like Mr. Nimmo can stir up paranoia and animosity towards the government over what, in this case, looks more like an expensive boondoggle. There are certainly things that I don’t like about the FCC, being an industry person, but I am not too worried that they are out to conquer the world a la Pinky and the Brain.

Europe Has the Right Goals

The European Commission issued a press release yesterday that announced that 100% of the households in Europe now have access to broadband.

Most households have some sort of wired access with 96.1% of homes having access to copper, coax or fiber. Wireless coverage with 2G, 3G or 4G covers 99.4% of houses. And all remote homes are now covered by satellite broadband using a network of 148 satellites.

Before anybody argues that we have the same thing here in the US due to satellite, we need to distinguish between the satellite broadband that is available here and what is available in Europe. Basic satellite service in Europe is only $13 per month. I can’t find the speed for that product, but I assume it delivers a few Mbps of download speed. But customers there can get 20 Mbps download from satellite for $33 per month.

In the US there are two major satellite providers. ViaSat Exede offers a 12 Mbps download service. The amount you pay is based upon the usage cap you choose. For $50 per month you can get 10 GB per month, for $80 you can buy 15 GB and for $130 you can get 25 GB. Hughesnet offers 5 Mbps down and 1 Mbps up for $50 per month, 10 Mbps down and 1 Mbps up for $60, 10 Mbps down and 2 Mbps up for $80 and 15 Mbps down and 2 Mbps up for $130. The four Hughesnet products also have data caps of 10 GB, 20 GB, 30 GB and 40 GB respectively.

Speed isn’t everything and the caps matter. Just to put those data caps into perspective, a 2-hour HD movie will range between 3 and 4.5 GB. So homes in the US using satellite are very limited in using their satellite connection to view video.
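
The arithmetic behind that perspective is easy to check. Here is a quick sketch using the Exede tiers quoted above, where the 3.5 GB per movie figure is an assumption (the midpoint of the 3 – 4.5 GB range given):

```python
# Rough arithmetic on the US satellite caps quoted above. The 3.5 GB
# per-movie figure is an assumption (midpoint of the 3 - 4.5 GB range).
GB_PER_HD_MOVIE = 3.5

# (monthly price in $, data cap in GB) for the Exede tiers listed above
exede_tiers = [(50, 10), (80, 15), (130, 25)]

for price, cap in exede_tiers:
    movies = cap / GB_PER_HD_MOVIE
    cost_per_gb = price / cap
    print(f"${price}/mo with a {cap} GB cap: ~{movies:.1f} HD movies, "
          f"${cost_per_gb:.2f} per GB")
```

Even the $130 plan covers only about seven HD movies per month, which is why video viewing over US satellite connections is so constrained.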

The US satellite companies are also limited since they only have a few satellites capable of delivering the above products. If those satellites get oversubscribed then actual speeds will be slower than advertised in the same way that a cable modem system can bog down in the evening hours. But with more satellites in Europe the speeds can be faster and there is a lot less chance of congestion and oversubscription.

The Europeans also have goals for speeding up Internet access. Their goal is that by 2020 all citizens will have access to 30 Mbps download speeds, with at least half of them having access to 100 Mbps.

This is pretty easy to contrast with the US, where the current national definition of terrestrial broadband is 4 Mbps down and 1 Mbps up. Both stimulus grants and borrowing from the RUS have recently financed networks that are able to deliver just those speeds.

If we don’t set high goals in the US, and if we are content to finance rural broadband that delivers slow speeds when it is brand new, we are relegating the rural areas to having slow broadband for decades to come.

In the US we are more given to grand announcements that don’t come with any funding or mandates. For example, earlier this year the FCC set a goal of having a Gigabit City in every state of the country, meaning a network capable of delivering gigabit download speeds to customers.

Don’t get me wrong, I would love to live in one of those few places where you can get a gigabit. But this is a completely voluntary system, and a Gigabit City might only need to sell that much speed to a few customers to be given the designation. Rather than trying to get one city in each state to provide a few customers with gigabit download speeds, we ought instead to be concentrating on making our basic broadband a lot faster than 4 Mbps. When that lowly speed is our national goal, we are telling rural America not to expect anything better.

The Europeans have it right and we have it wrong. And a decade from now when we are far behind them in terms of productivity we can look back on the crappy national goals we set for ourselves.

The Infrastructure Crisis

infrastructure revealed (Photo credit: nicolasnova)

This country has an infrastructure crisis. A lot of my blog talks about the need for building fiber since I consider fiber as basic infrastructure in the same way that roads, bridges and sewers are infrastructure. Any town without adequate fiber is already starting to get bypassed in terms of opportunities for its citizens and businesses. And this is only going to get worse with the upcoming Internet of Anything, because only fiber is capable of carrying the vast amounts of data that are going to be generated.

But this country has a crisis with every kind of basic infrastructure. We are not spending enough money to keep our roads, bridges, power, water and other basic infrastructure from slowly deteriorating. The backlog of infrastructure upgrades needed just to get the country back to adequate is staggering.

It has historically been the purview of government to take care of a lot of this infrastructure – and while the federal government takes care of interstate highways and some bridges, the obligation for keeping up with infrastructure falls largely on state and local governments.

And those government entities do not have anywhere near the borrowing capacity to begin tackling the cost of fixing everything that needs fixing or updating. And local property and other taxes would have to be increased hugely to pay for it all. Even if there were an appetite for the needed upgrades, the recent economy has brought many local governments up against their borrowing limits. And we are starting to see municipal bankruptcies, small and large, a sign that the municipal borrowing system is cracking around the edges.

And borrowing could get much harder for municipal entities. The recent Detroit bankruptcy is just the tip of the iceberg in terms of large cities buckling under accumulated pension costs. And the nonsense going on in Washington with the federal debt ceiling might drive up interest rates.

Given all of these factors, one has to ask if government financing is the best way to build infrastructure. There is certainly a mountain of evidence that municipally funded projects cost more than similar projects constructed by private firms. And while municipal bond interest rates sound cheap, bond money is extremely expensive due to the add-ons to bond borrowing such as capitalized interest and debt service reserve funds.

If this country has any hope of putting a dent in the huge infrastructure hole we find ourselves in, it is going to have to come from bringing private capital to bear on the problem. While there is a financial crunch in the public sector today, there is a huge amount of private equity sitting on the sidelines waiting to be invested in good projects.

The trick to attracting private money for infrastructure is to find a good way to forge public / private partnerships. Unfortunately, there is one key missing component that is making it hard to bring private money into infrastructure deals. And that is development capital.

Development capital is the money that is spent up front in a project to take it from concept to working plan. This includes such things as creating business plans, doing basic engineering, identifying hurdles and solutions – all of those early steps that private equity expects to be done before they will consider a project. In layman’s terms, private equity investors expect somebody else to have done the legwork to prove the feasibility of a project before they will consider it.

We have a development capital gap in this country. There are very few entities today that are willing to tackle spending the development capital needed to prove infrastructure projects. And so hundreds, even thousands of worthy projects are going undone because nobody is willing to spend that first 1% of a project needed to get it started.

What we need is a person or a group of people to step up to provide development capital. This could be government. For instance, for the cost of building one bridge, a government could instead provide the development capital needed to get one hundred bridges started. So state governments might be a great place to get this done.
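
The one-bridge-versus-one-hundred claim follows directly from the 1% figure mentioned earlier. A minimal sketch, where the $50 million bridge cost is a made-up round number for illustration:

```python
# Illustrative only: the ~1% development-capital share comes from the
# text above; the bridge cost is a hypothetical round number.
bridge_cost = 50_000_000   # assumed cost to build one bridge
dev_share = 0.01           # development capital as ~1% of project cost

dev_capital_per_bridge = bridge_cost * dev_share
bridges_seeded = bridge_cost / dev_capital_per_bridge
print(f"One bridge's construction budget funds development work "
      f"for about {bridges_seeded:.0f} bridges")
```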

It could also be done privately, meaning that somebody needs to create funds that are strictly development capital. Such funds could produce fantastic returns. But this is a concept that is alien to US investors.

But somebody needs to figure out how we get development capital or our infrastructure is going to continue to deteriorate until we have no choice but to fix it directly with tax dollars.

Grasping the Internet of Things

Internet of Things IoT13 Forum June 2013 040 (Photo credit: marklittlewood1)

I have written several blog entries about the Internet of Things, but I have not really defined it very well. I read as many articles about the topic as I can find since I find it personally fascinating. To me this is mankind finally using computer technology to affect everyday life, and it goes far beyond the things you can do with a PC or tablet.

I recently saw an article that summarized the direction of the Internet of Things into three categories – and this is a good description of where this is all headed. These categories are:

Knowledge of Self. This part of the Internet of Things is in its infancy. But the future holds the promise that the Internet can be used to help people with self-control, mindfulness, behavior modification and training.

Today there are gimmicky things people are doing with sensors, such as counting the number of times you have opened the refrigerator as a way to remind you to lose weight. But this can be taken much further. We are not far from a time when people can use computers to help them change their behavior effectively, be that in losing weight or in getting your work done on time. Personal sensors will get to know you intimately and will be able to tell when you are daydreaming or straying from your tasks and can bring you back to what you want to accomplish. Computers can become the good angel on your shoulder should you choose that.

Probably the biggest promise in this area is that computers can be used to train anybody in almost anything they want to know. The problem with the Internet today is that in a lot of cases it is nearly impossible to distinguish fact from fiction. But it ought to be possible to have the needed facts at your fingertips in real-time. If you have never changed a tire, your own personal computer assistant will lead you through the steps and even show you videos of what to do as you do it for the first time. Such training could bring universal education to everybody in the world, which would be a gigantic transformation of mankind – and would eliminate the widespread ignorance and superstitions that still come today from lack of education.

Knowledge of Others. Perhaps the two most important developments in this area will be virtual presence and remote health care.

With virtual presence you will be able to participate almost anywhere as if you were there. This takes the idea of video conferencing and makes it 3D and real-time. This is going to transform the way we do business, hire employees and seek help from others.

But perhaps the biggest change is going to come in health care. Personal medical sensors are going to be able to monitor your body continuously and will alert you to any negative change. For instance, you will know when you are getting the flu at the earliest possible time so that you can take medicine to mitigate the symptoms.

There is also great promise that medical sensors will make it possible for people to live in their own homes for longer as we all age, something just about everybody wants. Sensors might even change the way we die. Over 80% of people say they want to die at home, but in 2009 only 33% did so. Medical monitoring and treatment tied to sensors ought to let a lot more of us die in the peace of our own beds.

Perhaps the biggest promise of personal monitors is the ability to detect and treat big problems before they get started. Doctors are saying that it ought to be possible to monitor for pre-cancerous cells and kill them when they first get started. If so, cancer could become a disease of the past.

Knowledge of the World. The Internet of Things promises to eventually have sensors throughout the environment. More detailed knowledge of our surroundings will let us micromanage our environment. Those who want a different amount of humidity in the air will be able to have this done automatically in rooms where they are alone.

But remote sensors hold the most promise in areas of things like manufacturing and food production. For instance, sensors can monitor a crop closely and can make sure that each part of a field gets the right amount of water and nutrition and that pests are controlled before they get out of hand. Such techniques could greatly increase the production of food per acre.

And we can monitor anything. People living near to a volcano, for example, will know far ahead of time when there has been an increase in activity.

Monitoring the wide world is going to be the last part of the Internet of Things to be implemented because it is going to require drastic new technologies in terms of small sensors and the ability to interpret what they are telling us. But a monitored world is going to be a very different world – probably one that is far safer, but also one where there is far less personal freedom – at least the freedom to publicly misbehave.

How Vulnerable is the Internet?

OLPC: XO internet access (Photo credit: Wikipedia)

A question you hear from time to time is how vulnerable the US Internet backbone is in terms of losing access if something happens to the major hubs. The architecture of the Internet has grown in response to the way that carriers have decided to connect to each other and there has never been any master plan for the best way to design the backbone infrastructure.

The Internet in this country is basically a series of hubs with spokes. There are a handful of large cities with major regional Internet hubs like Los Angeles, New York, Chicago, Dallas, Atlanta, and Northern Virginia. And then there are a few dozen smaller regional hubs, still in fairly large cities like Minneapolis, Seattle, San Francisco, etc.

Back in 2002 some scientists at Ohio State studied the structure of the Internet of the time and concluded that taking out the major hubs would have essentially crippled the Internet. At that time almost all Internet traffic in the country routed through the major hubs, and knocking out a few of them would have wiped out a lot of the Internet.

Later, in 2007, scientists at MIT looked at the web again and estimated that taking out the major hubs would wipe out about 70% of the US Internet traffic, but that peering would allow about 33% of the traffic to keep working. And at that time peering was somewhat new.

Since then there is a lot more peering, but one has to ask if the Internet is any safer from catastrophic outage than it was in 2007. One thing to consider is that a lot of the peering happens today at the major Internet hubs. In those locations the various carriers hand traffic off to each other rather than paying fees to send the traffic through an ‘Internet Port’, which is nothing more than a point where some carrier will determine the best routing of the traffic for you.

And so peering at the major Internet hubs is a great way to save money, but it doesn’t really change the way Internet traffic is routed. My clients are smaller ISPs, and I can tell you how they decide to route Internet traffic. The smaller ones find a carrier who will transport it to one of the major Internet hubs. The larger ones can afford diversity, and so they find carriers who can carry the traffic to two different major Internet hubs. But by and large every bit of traffic from my clients goes to and through the handful of major Internet hubs.

And this makes economic sense. The original hubs grew in importance because that is where the major carriers of the time, companies like MCI and Qwest, already had switching hubs. And because transport is expensive, every regional ISP sent their growing Internet traffic to the big hubs because that was the cheapest solution.

If anything, there might be more traffic routed through the major hubs today than there was in 2007. Every large fiber backbone and transport provider has arranged their transport networks to get traffic to these locations.

In each region of the country my clients are completely reliant on the Internet hubs. If a hub like the one in Dallas or Atlanta went down for some reason, ISPs that send traffic to that location would be completely isolated and cut off from the world.

There was a recent report in the Washington Post that said that the NSA had agents working at only a handful of major US Internet pops because that gave them access to most of the Internet traffic in the US. That seems to reinforce the idea that the major Internet hubs in the country have grown in importance.

In theory the Internet is a disaggregated, decentralized system: if traffic can’t go the normal way, it finds another path to take. But this idea only works if ISPs can get traffic to the Internet in the first place. A disaster that takes out one of the major Internet hubs would leave a lot of towns in the region around it without any Internet access. Terrorist attacks that take out more than one hub would wipe out a lot more places.

Unfortunately there is no grand architect behind the Internet looking at these issues, because no one company has any claim to deciding how the Internet works. Instead the carriers involved have all migrated to the handful of locations where it is most economical to interconnect with each other. I sure hope, at least, that somebody has figured out how to make those hub locations as safe as possible.

Watching Health Care Costs

President Barack Obama’s signature on the health insurance reform bill at the White House, March 23, 2010. The President signed the bill with 22 different pens. (Photo credit: Wikipedia)

I know in my own company, and at every one of my clients, that health care costs are a big concern. We have gone through a decade-long period where the inflation in health care costs has been in the double digits each year, much faster than any of the other costs we face.

The rate of inflation of health care costs nationwide has finally slowed. This year the nationwide rate of health care cost increase is expected to be 5.5%. Next year the prediction is 4.5%. Both of these rates are still higher than general inflation, but a welcome relief after years of really large increases. I know that in my own firm, which has been around for fifteen years, health care costs per employee have nearly tripled since we opened for business.

And those current inflation rates do not tell the true story for many firms. The 5.5% increase in health care costs this year reflects the cost of health care to employers, not overall insurance costs. The slower rate of health care inflation is due in part to companies pushing higher deductibles and copays onto employees as a way to keep their share of health insurance under control. Last year the amount of copays paid by employees rose 13%, which shows that the overall increase in health insurance costs was a lot more than the published 5.5%.

There are some trends in the industry that hint at a possible slowing in the cost of health care. For example, there is a large industry now of out-patient health clinics that charge as much as two-thirds less than a normal doctor. There is hope that the large statewide pools that are being created under Obamacare will lower overall insurance premiums by bringing more young people (and healthier people) into the insurance pool.

There is also a growing emphasis in many health care plans on preventive care, meaning that many ailments will be nipped in the bud before they become big problems. Over time preventive care will significantly lower overall health care costs.

And hidden underneath all of these numbers is the very numbing statistic that 30% of our nationwide health care spending each year goes to the process of people dying in hospitals and hospices. In recent years just about two-thirds of people have died in an institution rather than at home, though this is down ten percent from a decade ago. Almost nobody wants to die in an institution, and perhaps as a country we will be able to find a way to allow more people to die at home.

But for most of my clients, even if health care cost inflation slows to 4% – 5% they are facing an ugly future. Trend those increases out ten years and see if you aren’t very concerned.
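
That trend-it-out exercise is simple compound growth. A quick sketch, using a hypothetical $10,000 annual premium as the starting point:

```python
# Compounding 4% and 5% annual health care cost inflation over ten
# years. The $10,000 starting premium is a hypothetical round number.
start_cost = 10_000

for rate in (0.04, 0.05):
    future = start_cost * (1 + rate) ** 10
    print(f"At {rate:.0%} per year, ${start_cost:,} today becomes "
          f"${future:,.0f} in ten years")
```

Even at the low end of that range, costs grow by nearly half in a decade.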

There is also something to keep in mind: in 2018 there is going to be a tax on ‘cadillac’ health care plans. These are plans that today would cost over $10,200 for an individual or $27,500 for a family. Those may sound like high caps, but the amounts count the contributions made by both the company and the employee. The tax is a whopping 40%, charged to the employer on anything over the cap.

The average health care insurance cost last year was $10,522, so there are already many plans that would be considered Cadillac. The caps will be increased over time with inflation, but if health care costs continue to climb faster than the rate of inflation, then more plans each year will fall into the Cadillac category and incur the tax.
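
The tax math is worth seeing with the numbers above. A sketch of the calculation as described, using the individual cap and last year’s average plan cost:

```python
# The 2018 'cadillac' tax as described above: a 40% excise charged to
# the employer on any premium amount over the cap.
INDIVIDUAL_CAP = 10_200
TAX_RATE = 0.40

def cadillac_tax(total_premium, cap=INDIVIDUAL_CAP, rate=TAX_RATE):
    """Tax owed on the portion of a plan's cost above the cap."""
    return max(0.0, total_premium - cap) * rate

# Last year's average individual plan cost already exceeds the cap:
print(cadillac_tax(10_522))  # 40% of the $322 excess, about $129
```

The per-plan tax is small today, but the excess (and the tax) grows every year that premiums outpace the inflation adjustment to the cap.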

I know that all of my clients want to provide good health care to their employees. The decision to increase copays or deductibles is a painful one for all of us. There are a few creative ideas that some companies are trying that are worth considering. One of the most interesting is the idea of handing your employees the money to buy their own health insurance. There are now ways to do this as a defined-contribution health care plan. Since having health insurance is mandatory for employees in 2014, you need to demand proof that an employee is really using the money for health insurance. But companies who try this say that their employees are finding ways to get cheaper plans than they could buy at the company level.

The bottom line is that health care costs are going to continue to increase faster than the rate of inflation. Add to that the worry of crossing the premium tax threshold and it is going to get harder and harder for you to pay for your employee’s health care costs. I wish I had some magic bullet to recommend, but for now the best I can offer is to do the math and see if there is anything you can do to keep this under control at your own company.

Finding a Broadband Partner

Logo of the United States National Telecommunications and Information Administration, an agency in the Department of Commerce. (Photo credit: Wikipedia)

The NTIA issued a notice last week that asks if they should continue the BroadbandMatch website tool. This tool was created during the stimulus grant process and the original goal was to connect partners for applying or implementing the broadband grants. And the process worked. One of the components of the grants was the requirement for matching funds and there were many grant applicants with a great idea who had to find a partner to supply the matching funds. A significant percentage of the stimulus grants involved multiple parties and many of them found their partners using this tool.

On the NTIA tool a company would describe what they were trying to do and the kind of partner they were looking for. And the main reason this worked was that the government was giving away billions of dollars for fiber construction, and so a lot of companies were looking for a way to get in on the action. Many of the companies involved in the grant process were new companies formed just to get the grants. The NTIA tool gave companies who were not historically in the telecom business a way to find potential partners.

The NTIA asks if they should keep this service going, and if so how it ought to work. I will be the first to say that I was surprised that the tool was even still around since it was clearly designed to put together people to make stimulus grants work. The only way a tool like this can work now is if everybody in the industry knows about it and thinks to look there when they are interested in making an investment.

But I am going to guess that if I didn’t know this tool was still active, hardly anybody else does either. It was great for the purpose it was designed for, but one has to ask if this is going to be a place where companies look when they are seeking a partner. It has been my experience that outside of that grant process, which was very public, most people want to keep the process of forming new ventures as quiet as possible to avoid tipping off the competition too early. And so, without the billions of public dollars that made the grants attractive, I can’t see this tool being of much interest.

But this leads me to ask: how can a company find a partner for a new telecom venture? The most common type of partnership I see is one made between people with technical expertise looking for investors and people with cash looking for opportunities. So how do these kinds of partners find each other?

At CCG we have helped numerous carriers find partners and the following, in our experience, is what has worked and not worked:

  • Put out a formal request for a partner. This may mean issuing an RFP or an RFI or advertising somewhere to find interested parties. I have not found this process to be particularly fruitful, because it normally doesn’t uncover any potential partners that you didn’t already know.
  • Get to know your neighbors better. I have found that most partnerships end up being made by people in the same geographic area. It is not uncommon for the parties to not know each other well before the partnership, and sometimes they are even competitors. But there is a lot more chance that people in your region will best understand the potential for local opportunities.
  • Don’t be afraid to cross the line. Commercial CLECs and independent telephone companies are usually dismayed by municipalities that get into the telecom business. But generally those cities are just hungry for broadband and in almost every case they would prefer that a commercial provider come and build the infrastructure in their community. So crossing the line and talking to municipalities might uncover the best local partnership opportunities. If a town wants broadband badly enough (and many of them do) then they might be willing to provide concessions and cash to make it work.

Of course, this doesn’t even begin to answer the question of how to make a partnership work, which I will address in later blogs this week.

The FCC’s Data Collection Effort

Character for children of FCC “Broadband” (Photo credit: Wikipedia)

The FCC just changed the way that they are going to gather data from carriers about voice and data usage in the US. To some degree they seem to be throwing in the towel and just giving up.

I have blogged before about the massive inadequacies of the National Broadband Map. This is an FCC-sponsored effort to show the availability of broadband on a geographic basis. This sounds like a laudable goal, but the carriers decide what information they want to supply to the mapping process, and so the map is full of what can only be described as major lies from the largest carriers. They claim to have broadband where they don’t and at speeds far greater than they actually deliver.

The FCC announced new rules for their data collection process that is done using FCC Form 477. This revised effort by the FCC is going to make their data gathering more like the process that is used to collect data for the National Broadband Map. They are no longer going to try to collect actual data speeds in tiers, but instead will be collecting only advertised speeds for data – the fastest advertised speed for landline providers and the slowest advertised speeds for wireless providers. For the life of me I can’t imagine how this data can be of the slightest use to anybody.

I just recently worked with a client in a small town in Oregon. The incumbent providers there are the biggest telephone company and cable company in the state. In both cases, they advertise the same speeds in this small town that they advertise in Portland. But in this town, as in most of rural America, the actual speeds delivered are far slower. The fastest cable modem speeds in the town run 3 – 5 Mbps download and the fastest DSL is not much over 1.5 Mbps. And yet both carriers advertise products at many times those speeds.

This would just be a big annoyance if it wasn’t for the fact that the FCC and others use the data gathered to talk about what a great job the carriers are doing in this country to supply broadband. I recently saw an announcement that 98% of households now have broadband availability. And since the FCC’s definition of broadband is now a download speed of 4 Mbps and an upload speed of 1 Mbps, this makes it sound like the country’s broadband problems are being solved. But announcements of this sort are based upon lies and exaggerations by the carriers.
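As a rough illustration, the FCC definition can be checked against delivered rather than advertised speeds. The function name and the assumed 1 Mbps upload are my own; the download numbers come from the Oregon example above:

```python
# Sketch: compare delivered speeds against the FCC's broadband definition
# (4 Mbps download / 1 Mbps upload, as noted above).
FCC_DOWN_MBPS = 4.0
FCC_UP_MBPS = 1.0

def meets_fcc_broadband(down_mbps, up_mbps):
    """True if the actual delivered speeds satisfy the FCC definition."""
    return down_mbps >= FCC_DOWN_MBPS and up_mbps >= FCC_UP_MBPS

# Fastest delivered speeds in the small Oregon town (upload assumed 1 Mbps):
print(meets_fcc_broadband(5.0, 1.0))   # best-case cable modem -> True
print(meets_fcc_broadband(1.5, 1.0))   # fastest DSL -> False
```

By this measure the town's DSL doesn't count as broadband at all, even though the advertised speeds would sail past the definition.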

And since the whole point of this data gathering effort is to formulate policies to spur the carriers to do better, letting the carriers self-report whatever they want is like asking the fox to go count the eggs in the henhouse every morning. There is no practical penalty against a carrier advertising any speed they want or reporting falsely to the FCC. And it’s a lot easier, as it is with the Oregon example, for the incumbent providers to gear all of their advertising in a state around the urban markets. I have no idea if those incumbents in Oregon can actually deliver the advertised speeds in Portland, but I know for a fact that they do not do so outside of Portland.

The FCC is also changing the way that they gather information about VoIP lines. But I think the day for them to be able to gather any meaningful data about business phones in the country is over. There is such a proliferation of IP Centrex and other VoIP technologies that the carriers don’t even know what is being delivered. Consider this:

  • It’s now possible to use one number for a thousand lines in a call center or instead to give a thousand numbers to one phone.
  • There is a proliferation of resellers in the market who buy numbers and 911 service from larger carriers so that they don’t have to become a CLEC. And these resellers can then deliver a wide variety of business voice services over anybody’s data connection. These carriers will not be reporting what they are doing to the FCC because most of them are not certified as carriers but rely on the certification of the CLEC that gave them numbers. Nobody in the FCC reporting chain is going to know about or report these kinds of customers and lines. And it gets worse, because I now know of many cases of resellers of these resellers. Literally almost anybody can become a carrier overnight by reselling these services. It’s back to the wild west days we used to see with long distance resale. I’m expecting to go to a telecom convention soon and see the shark-skin suits again.

Should You Be Peering?

Google 貼牌冰箱 (Google Refrigerator) (Photo credit: Aray Chen)

No, this is not an invitation for you to become peeping toms, dear readers. By peering I am talking about the process of trading Internet traffic directly with other networks to avoid paying to transport all of your Internet traffic to the major Internet POPs.

Peering didn’t always make a lot of sense, but a major consolidation of web traffic among a few large players has changed the game. In 2004 there were no dominant players on the web and Internet traffic was distributed among tens of thousands of websites. By 2007 about 15,000 networks accounted for about half of all of the traffic on the Internet. But by 2009 Google had taken off, and it was estimated that they accounted for about 6% of all web traffic that year.

And Google has continued to grow. A number of industry experts estimated at the beginning of this year that Google carried 25% to 30% of all of the traffic on the web. But on August 16 Google went down for about 5 minutes and we got a look at the real picture. A company called GoSquared tracks traffic on the web worldwide, and when Google went down they saw an instant 40% drop in overall web traffic.

And so, when Google went dead for a few minutes, they seem to have been carrying about 40% of the web’s traffic at the time. Of course, the percentage carried by Google varies by country and by time of day. For example, Sandvine, a company that sells Internet tracking systems, estimates that Netflix accounts for about one-third of US Internet bandwidth between 9 p.m. and midnight in each time zone.

Regardless of the exact percentages, it is clear that a few networks have grabbed enormous amounts of web traffic. And this leads me to ask my clients whether they should be peering. Should they be trying to hand traffic directly to Google, Netflix or others to save money?

Most carriers have two major cost components to deliver their Internet traffic – transport and Internet port charges. Transport is just that: a fee, often mileage-based, that pays for crossing somebody else’s fiber network to get to the Internet. The port charges are the fees charged at the Internet POP to deliver traffic into and out of the Internet. For smaller ISPs these two costs might be blended together in the price you pay to connect to the Internet. So the answer to the question is: anything that produces a net lowering of one or both of these charges is worth considering.
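To make those two components concrete, here is a minimal sketch of the monthly cost calculation. All of the rates are hypothetical and the function name is my own:

```python
def monthly_internet_cost(transport_miles, per_mile_rate, port_mbps, per_mbps_rate):
    """Total monthly cost: transport (often mileage-based) plus Internet port charges."""
    transport = transport_miles * per_mile_rate  # fee to cross someone else's fiber
    port = port_mbps * per_mbps_rate             # fee at the Internet POP
    return transport + port

# Example: 100 miles of transport at $2/mile plus a 1 Gbps port at $0.50 per Mbps
print(monthly_internet_cost(100, 2.00, 1000, 0.50))  # 700.0
```

Any peering arrangement is worth considering if it lowers either term of that sum by more than it costs to set up.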

Following is a short list of ways that I see clients take advantage of peering arrangements to save money:

  • Peer to Yourself. This is almost too simple to mention, but not everybody does this. You should not be paying to send traffic to the Internet that goes between two of your own customers. This is sometimes a fairly significant amount of traffic, particularly if you are carrying a lot of gaming or have large businesses with multiple branches in your community.
  • Peer With Neighbors. It also sometimes makes sense to peer with neighbors. These would be your competitors or somebody else who operates a large network in your community, like a university. Again, there is often a lot of traffic generated locally because of local commerce. And the amount of traffic between students and a university can be significant.
  • Peering with the Big Data Users. And finally is the question of whether you should try to peer with Google, Netflix or other large users you can identify. There are several ways to peer with these types of companies:
    • Find a POP they are at. You might be able to find a Google POP or a data center somewhere that is closer than your Internet POP. You have to do the math to see if buying transport to Google or somebody else costs less than sending it on the usual path.
    • Peer at the Internet POP. The other way to peer is to go ahead and carry the traffic to the Internet POP, but once there, split your traffic and take traffic to somebody like Google directly to them rather than pay to send it through the Internet port. If Google is really 40% of your traffic, then this would reduce your port charges by as much as 40% and that would be offset by whatever charges there are to split and route the traffic to Google at the POP.

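The break-even math for the POP-splitting approach can be sketched like this. The 40% share comes from the Google outage estimate earlier; the dollar rates and the function name are hypothetical:

```python
def peering_savings(total_mbps, peer_share, port_rate, peer_cost):
    """Port charges avoided by handing peer_share of traffic directly to a
    large network at the POP, minus the cost of splitting and routing it."""
    avoided_port_charges = total_mbps * peer_share * port_rate
    return avoided_port_charges - peer_cost

# If Google is 40% of a 1 Gbps load and port charges are $0.50 per Mbps,
# peering pays off as long as the added routing cost stays under $200/month.
print(peering_savings(1000, 0.40, 0.50, peer_cost=150.0))  # 50.0
```

A positive result means the split saves money; a negative one means you are better off pushing everything through the port.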
I don’t think you have to be a giant ISP anymore to take advantage of peering. Certainly make sure you are peeling off traffic between your own customers, and investigate local peering if you have a significant amount of local traffic. It just takes some investigation to see if you can do the more formal peering with companies like Google. Whether peering will save you money is mostly a matter of math, but I know of a number of carriers who are making peering work to their advantage. So do the math.

Will the Real 4G Please Stand Up?

English: 4G LTE single mode modem by Samsung, ...

English: 4G LTE single mode modem by Samsung, operating in the first commercial 4G network by Telia (Photo credit: Wikipedia)

We are all aware of grade inflation, where teachers give out more high grades than are deserved. US cellular marketers have been doing the same thing to customers, inflating the performance of their data products by calling every new development the next generation. Earlier this year the International Telecommunication Union (ITU) approved the final standards for 4G cellular data. One of the features of the final standard is that a 4G network must be able to deliver at least 100 Mbps of data to a phone in a moving vehicle and up to 1 Gbps to a stationary phone.

Meanwhile in the US we have had cellular networks marketed as 4G for several years. In the US the earliest deployments of 3G networks happened just after 2001. That technology was built to a standard that had to deliver at least 200 kbps of data, which was more than enough when we were using our flip phones to check sports scores.

But since then there have been a number of incremental improvements in the 3G technology. Switching to 64-QAM modulation and adding multi-carrier techniques pushed speeds up, and by 2008 3G networks were pretty reliably delivering download speeds up to 3 Mbps. Around the rest of the world this generation of 3G improvements was generally referred to as 3.5G. But in the US the marketers started calling it 4G. It certainly was a lot faster than the original 3G, but it is still based on the 3G standard and is nowhere close to the 4G standard.

And since then there have been other big improvements using LTE and HSPA. For example, LTE is an all-packet technology, which allows it to send voice traffic over the data network, gaining efficiency by not having to switch between voice and data. One of the biggest improvements was the introduction of MIMO (multiple input, multiple output), which uses multiple antennas to send and receive several data streams at once, multiplying throughput over the same spectrum.

For a while WiMAX looked like a third competitor to LTE, but it’s pretty obvious now that LTE has won the platform battle in the US. All of the major carriers have deployed significant amounts of LTE and most of them say these deployments will be done by the end of this year in metropolitan markets. Speeds on LTE are certainly much faster than earlier speeds using 3.5G technology. But this is still not 4G, and around the rest of the world this technology is being referred to as 3.9G or Pre-4G.
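To summarize the labels used in this post, here is a toy classifier that maps a measured download speed onto the informal generation names. The thresholds are simplified from the figures quoted above (200 kbps for original 3G, roughly 3 Mbps for 3.5G, and the ITU's 100 Mbps moving-vehicle requirement for true 4G); this is an illustration, not a standard:

```python
def generation_label(down_mbps, moving=True):
    """Informal generation label for a measured download speed in Mbps."""
    if moving and down_mbps >= 100:
        return "4G"      # actually meets the ITU requirement for a moving phone
    if down_mbps > 3:
        return "3.9G"    # LTE-era speeds, called "Pre-4G" outside the US
    if down_mbps > 0.2:
        return "3.5G"    # improved 3G, reliably ~3 Mbps by 2008
    return "3G"          # original standard: at least 200 kbps

print(generation_label(0.2))    # 3G
print(generation_label(3.0))    # 3.5G
print(generation_label(20.0))   # 3.9G
print(generation_label(150.0))  # 4G
```

By this yardstick, none of the networks marketed as 4G in the US today would earn the label.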

But to date there are very few phones that have been deployed that use the LTE network to its fullest. There have been a few handsets, like the HTC Thunderbolt that have been designed to use the available LTE speeds. And Verizon says it will roll out smartphones in 2014 that will only work on the LTE network.

There is a big trade-off in handsets between power consumption and the ability to switch between multiple cellular technologies. A typical cell phone today needs to be able to work on 3G networks, 3.5G networks and several variations of the latest networks including the different flavors of LTE as well as the HSPA+ used by T-Mobile. So, interestingly, the most popular phones like the iPhone and the Galaxy S4 will work on LTE, but don’t come close to achieving the full speeds available with LTE. And of course, nobody tells this to customers.

Starting in September, South Korea will see the deployment of another incremental improvement in cellular data speeds, a technology called LTE-A (LTE Advanced). It is achieving data speeds of about twice those of current US LTE deployments. This is done by layering in a technology called carrier aggregation (CA) that bonds two different spectrum bands into one data path.

And the US carriers have talked about deploying the LTE-A technology starting sometime in 2014. No doubt when this is deployed in the US some marketer is going to call it 5G. And yet, it is still not up to the 4G standard. Maybe this is now 3.95G. Probably by the time somebody actually deploys a real 4G phone in the US it is going to have to be called 8G.