Categories
Technology

Grasping the Internet of Things

Internet of Things IoT13 Forum June 2013 040 (Photo credit: marklittlewood1)

I have written several blog entries about the Internet of Things, but I have not really defined it very well. I read as many articles about the topic as I can find, since I find it personally fascinating. To me this is mankind finally using computer technology to affect everyday life, and it goes far beyond the things you can do with a PC or tablet.

I recently saw an article that summarized the direction of the Internet of Things into three categories – and this is a good description of where this is all headed. These categories are:

Knowledge of Self. This part of the Internet of Things is in its infancy. But the future holds the promise that the Internet can be used to help people with self-control, mindfulness, behavior modification and training.

Today there are gimmicky things people are doing with sensors, such as counting the number of times you have opened the refrigerator as a way to remind you to lose weight. But this can be taken much further. We are not far from a time when people can use computers to help them change their behavior effectively, be that in losing weight or in getting your work done on time. Personal sensors will get to know you intimately and will be able to tell when you are daydreaming or straying from your tasks and can bring you back to what you want to accomplish. Computers can become the good angel on your shoulder should you choose that.

Probably the biggest promise in this area is that computers can be used to train anybody in almost anything they want to know. The problem with the Internet today is that in a lot of cases it is nearly impossible to distinguish between fact and fiction. But it ought to be possible to have the needed facts at your fingertips in real-time. If you have never changed a tire, your own personal computer assistant will lead you through the steps and even show you videos of what to do as you do it for the first time. Such training could bring universal education to everybody in the world, which would be a gigantic transformation of mankind – and would help eliminate the widespread ignorance and superstitions that still come today from lack of education.

Knowledge of Others. Perhaps the two most important developments in this area will be virtual presence and remote health care.

With virtual presence you will be able to participate almost anywhere as if you were there. This takes the idea of video conferencing and makes it 3D and real-time. This is going to transform the way we do business, hire employees and seek help from others.

But perhaps the biggest change is going to come in health care. Personal medical sensors are going to be able to monitor your body continuously and will alert you to any negative change. For instance, you will know when you are getting the flu at the earliest possible time so that you can take medicine to mitigate the symptoms.

There is also great promise that medical sensors will make it possible for people to live in their own homes for longer as we all age, something just about everybody wants. Sensors might even change the way we die. Over 80% of people say they want to die at home, but in 2009 only 33% did so. Medical monitoring and treatment tied to sensors ought to let a lot more of us die in the peace of our own beds.

Perhaps the biggest promise of personal monitors is the ability to detect and treat big problems before they get started. Doctors are saying that it ought to be possible to monitor for pre-cancerous cells and kill them when they first get started. If so, cancer could become a disease of the past.

Knowledge of the World. The Internet of Things promises to eventually have sensors throughout the environment. More detailed knowledge of our surroundings will let us micromanage our environment. Those who want a different level of humidity in the air will be able to have it adjusted automatically in rooms where they are alone.

But remote sensors hold the most promise in areas of things like manufacturing and food production. For instance, sensors can monitor a crop closely and can make sure that each part of a field gets the right amount of water and nutrition and that pests are controlled before they get out of hand. Such techniques could greatly increase the production of food per acre.

And we can monitor anything. People living near to a volcano, for example, will know far ahead of time when there has been an increase in activity.

Monitoring the wide world is going to be the last part of the Internet of Things to be implemented because it is going to require drastic new technologies in terms of small sensors and the ability to interpret what they are telling us. But a monitored world is going to be a very different world – probably one that is far safer, but also one where there is far less personal freedom – at least the freedom to publicly misbehave.

Categories
Improving Your Business Technology What Customers Want

Are Telephone Features Still a Product?

English: Original Caller Identification receiver installed at Boeing/PEC facility in Huntsville Alabama (Photo credit: Wikipedia)

I just read another blog that asked why businesses bother paying for voice mail any longer. And it got me to thinking. Why are companies still offering and charging a lot for telephone features in general?

Let’s face it, traditional TDM voice is dying. It may take a few decades for people like my mother to give up their home phones, but residential voice is going to continue to shrink and shrink until it becomes a pretty marginal product. Don’t get me wrong about the need to still offer voice – 60% of homes and most businesses still have voice – so a full-service carrier needs to have a voice product.

But a lot of homes only carry a phone line to have a 911 phone or to be able to fax. And more and more businesses are going mobile instead of pouring big dollars into fixed phone systems.

What no longer makes sense is to have a large pile of very expensive features and options that people must add to basic telephone service. You are going to drive your voice customers away even faster than they are already leaving if you are still selling voice mail and caller ID as expensive additives to a voice line.

The blog I read asks why any businessman even bothers to use voice mail today, and it’s a great question. The people who know you can always find some other way to reach you, like text or email. I know for myself that the very last and worst way to find me is to leave me a voice mail. And when I get somebody else’s voice mail I rarely leave a message any more, but instead hang up and send them an email or sometimes a text (I am slow with my thumbs). And so voice mail is turning into the way that you communicate with strangers. Unless you are a salesperson, you are not going to find this all that useful.

The cell phone companies have it right. They don’t even offer features – everything is included with the basic line. Granted they have convinced people to pay a huge dollar premium for mobility, but the line they sell you includes all the features. And with smart phones people can easily customize their calling all they want by adding phone apps.

For over a decade I have been advising my clients to include most features automatically in their base product. Telephone carriers are competing against cell phones or the cable company, both of which give away most features. The goal today with a voice product is to keep your customers as long as possible – not to nickel and dime them to death with features priced from $1 to $5 each per month. So look at your product line in terms of being customer friendly and competitive – and stop thinking that features are a way to make money.
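For anyone who wants to see the math behind that advice, here is a minimal back-of-the-envelope sketch in Python. All of the numbers – the $3 feature price, the $30 line, the churn percentages – are hypothetical, and it assumes every customer who stays buys the feature; the point is only how quickly a small add-on charge is wiped out if it pushes even a modest share of customers to a competitor.

```python
# Hypothetical churn math: is a $3/month add-on worth charging for if it
# drives some customers to a competitor?  All figures are illustrative.
feature_price = 3.0     # $/month charged for the add-on feature
line_revenue = 30.0     # $/month for the basic voice line
customers = 1000

for churn_pct in (0, 2, 5, 10):   # share of customers lost over the feature
    remaining = customers * (1 - churn_pct / 100)
    # compare against the baseline of giving the feature away and keeping everyone
    net_change = remaining * (line_revenue + feature_price) - customers * line_revenue
    print(f"{churn_pct}% churn: {net_change:+,.0f} dollars/month vs. bundling the feature for free")
```

With these assumptions the add-on stops paying for itself once roughly one customer in ten walks away over it.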

There might be one feature package that might still make sense as a standalone product – a robust unified messaging platform. But this can’t just be glorified voice mail or nobody is going to pay extra for it. It needs to include the whole suite of tools that make voice usable across all platforms – follow-me service, voice to text, text to voice.

And rather than charge extra for features for your business customers, you should be offering them IP Centrex that has all of the traditional features plus all of the features of cell phones built in. If you can’t give your customers a business line that will do everything they want, they will eventually bypass you in favor of somebody who will. Note that there are now a few dozen companies that are selling IP Centrex lines over any web connection. So you cannot count on keeping your business customers just because they have always used you.

Categories
Current News Technology

The Future of Rural Broadband

Verizon Wireless “Rule the Air” Ad Campaign (Photo credit: Wikipedia)

There were several events this week that tell rural subscribers where rural broadband is headed. It is a bleak picture.

First, at a Goldman Sachs conference on Tuesday, the CEO of AT&T said that he hoped that the new FCC chairman Tom Wheeler would be receptive to AT&T’s desire to begin retiring its copper network in favor of its wireless network. At the end of last year AT&T had said in an FCC filing that they were going to be seeking to retire the copper plant from ‘millions of subscribers’.

In that filing AT&T had asked to move from the copper network to an all-wireless all-IP network. Stephenson said that cost savings from getting rid of the copper network would be dramatic.

On that same day, Verizon CEO Lowell McAdam said that the idea of offering unlimited data plans for wireless customers was not sustainable and defied the laws of physics. Earlier this year Verizon had ended all of its unlimited wireless data plans and now has caps on every plan.

Verizon already has a rural wireless-based landline surrogate product that it calls VzW. This uses the 4G network to deliver a landline phone and data anywhere that Verizon doesn’t have landline coverage. The base plan is $60 per month and includes voice and 10 gigabytes of data. Every extra gigabyte costs $10. There is an option to buy a $90 plan that includes 20 gigabytes or $120 for 30 gigabytes.
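To put those tiers in perspective, here is a small Python sketch of the plan math as described above. The tier prices and the $10-per-extra-gigabyte overage come straight from the plans; the household usage figures are hypothetical.

```python
# Cost of the wireless landline-replacement plans described above.
# Tier prices and the $10/GB overage are from the text; usage is hypothetical.
def monthly_cost(gb_used, included_gb, base_price, overage_per_gb=10):
    overage_gb = max(0, gb_used - included_gb)
    return base_price + overage_gb * overage_per_gb

plans = {"10 GB / $60": (10, 60), "20 GB / $90": (20, 90), "30 GB / $120": (30, 120)}

for usage in (10, 25, 50):   # hypothetical GB used by a household per month
    name, terms = min(plans.items(), key=lambda p: monthly_cost(usage, *p[1]))
    print(f"{usage} GB/month -> cheapest option is the {name} plan at ${monthly_cost(usage, *terms):,.0f}")
```

Even a household using a fairly modest 50 gigabytes a month ends up paying over $300, which is the heart of the affordability problem.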

Finally, at the same Goldman Sachs conference mentioned above, the CFO of Time Warner said that they saw more room for increasing data rates.

So what does all of this mean for rural subscribers? First, it means that if you are served by a large incumbent like AT&T, they are going to be working hard to retire your copper and force you onto wireless. And we all know that the wireless data coverage in rural America is not particularly fast, when you can even get data. The data speeds delivered from a cell tower drop drastically with distance. In urban areas, where towers are only a mile or less apart, this doesn’t have much practical effect. But in a rural environment, where towers are spread miles apart, many homes sit a long way from the nearest tower. People lucky enough to live near a cell tower can probably get okay data speeds, but those further away will not.
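The distance effect is not just anecdotal; radio physics guarantees it. Below is a minimal sketch using the standard free-space path loss formula. The 700 MHz frequency and the 1-mile and 5-mile distances are illustrative assumptions, and real rural terrain, foliage and buildings make the falloff worse than this best-case model.

```python
import math

# Free-space path loss in dB for a distance in km and a frequency in MHz.
# This is a best-case floor; real terrain adds more loss on top of it.
def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

near = fspl_db(1.6, 700)   # roughly 1 mile from a 700 MHz LTE tower
far = fspl_db(8.0, 700)    # roughly 5 miles out
extra_loss = far - near
print(f"Extra loss at 5 miles vs. 1 mile: {extra_loss:.1f} dB "
      f"(about {10 ** (extra_loss / 10):.0f}x weaker signal)")
```

A signal roughly 25 times weaker translates directly into lower modulation rates and slower data for the homes at the edge of the cell.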

And even if you can get wireless data, your usage is going to be capped. Rural landline data today may be slow, but it is unlimited. Customers have learned that if they put in WiFi routers they can channel all of the data usage on their cell phones and tablets to their unlimited landline data connections. But once those connections are wireless, then every byte of data leaving your home, whether directly from a device or through the WiFi router, is going to count against the data caps. So rural America can expect a future where they will have data caps while people in urban areas will not.

Finally, one can expect the price of data to keep climbing. I have been predicting this for a decade. The large telcos and cable companies are facing a future where the old revenue streams of voice and cable TV are starting to decline. The only sustainable product they have is data. And so as voice and cable continue to tumble, expect incumbents to get into the habit of raising data prices every year to make up for those declines. Competition won’t help because the cell company data is already expensive, and both the incumbent cable companies and telcos will be raising data rates together.

This is not a pretty picture for a rural subscriber. Customers will be forced from copper to wireless. Speeds are not likely to get much faster. Data is going to be capped and prices will probably be increased year after year.

Categories
Improving Your Business Technology The Industry

Are You Collaborating?

I am not talking about World War II movies and I hope none of you have decided to side with the enemy (whoever that is). Collaboration software is a tool that every business with employees who work at different locations ought to consider.

Collaborative software began several decades ago with Lotus Notes. That software allowed multiple users on the same LAN to work on the same spreadsheet or Word document at the same time. And Lotus Notes had the added feature of letting you link spreadsheets and Word documents, so that any change made to a spreadsheet would automatically populate your Word document. But Lotus Notes required people to be on the same LAN, which in most companies meant being in the same building, and so the concept never became very popular – plus Microsoft came along and kicked Lotus’s butt in the marketplace.

And so collaborative software mostly died off for a while, although there were a few open source programs that were widely used by universities and others who love open source software.

But collaborative software is back in a number of different variations and if your company has employees at more than one location, then one of these new software products is going to be right for you. Here are some of the features you can find in various collaborative software today:

  • Number one is the ability to simultaneously let multiple people work on the same document. And instead of just spreadsheets and word documents, this has been extended to any software that users all have rights to use. Most software also creates a log showing who made changes to a document and when.
  • Supports multiple devices. Collaborative software is no longer just for PCs, and employees using tablets and smartphones can share in many of the features. As an example, collaborative software is a great way to keep the sales staff in the field fully engaged with everybody else in the company.
  • Communicate internally. Many collaborative software programs come with chat rooms, instant messaging and text messaging tools that make it fast and easy to communicate with other employees. Why send somebody an email or call them if you only have a quick question that they can answer on an IM?
  • Some systems let you know where people are and whether they are available to communicate right now. This stops you from calling people who are not in their office and lets you reach them in a faster way instead.
  • Create better communications history. In some software each user gets a home page, much like Facebook, that shows everything they have done, meaning that other employees can often go find information they need without bothering that person.
  • This can become the new way to structure corporate data. With a program like SharePoint you can quickly create folders specific to a topic or a project and then give access only to those you want to have access to that data. This used to require the intervention of somebody in the IT department but now can be done by almost anybody.
  • Gives you a great tool to work with your largest customers. You can give your contacts at your largest customers limited access to your systems so that they can quickly ask questions or talk to the right person by chat or IM. This is a great new way to extend your customer service platform and make it real time. You can easily isolate outsiders from corporate information while giving them access to the social networking aspects of the software.

So what are some of the collaborative software tools to consider? Here are a few (and there are many others).

  • Podio. This is software that is free for up to five users. It might be a good way to see if you like the concept. After five users it’s $9 per employee per month.
  • IBM (Lotus). The Lotus name is not dead and is now the brand name of the IBM collaborative suite of products. Check them out here.
  • Intuit has a product called QuickBase that is a collaborative suite of software. One good thing about this is that it will integrate with QuickBooks and other Intuit products that you might already be using. Check it out here.
  • SharePoint is Microsoft’s collaborative suite of products and has done very well in the large business sector. See it here.
Categories
Current News Technology

How Vulnerable is the Internet?

OLPC: XO internet access (Photo credit: Wikipedia)

A question you hear from time to time is how vulnerable the US Internet backbone is in terms of losing access if something happens to the major hubs. The architecture of the Internet has grown in response to the way that carriers have decided to connect to each other and there has never been any master plan for the best way to design the backbone infrastructure.

The Internet in this country is basically a series of hubs with spokes. There are a handful of large cities with major regional Internet hubs like Los Angeles, New York, Chicago, Dallas, Atlanta, and Northern Virginia. And then there are a few dozen smaller regional hubs, still in fairly large cities like Minneapolis, Seattle, San Francisco, etc.

Back in 2002 some scientists at Ohio State studied the structure of the Internet at the time and concluded that taking out the major hubs would have essentially crippled the Internet. At that time almost all Internet traffic in the country routed through the major hubs, and crippling a few of them would have wiped out a lot of the Internet.

Later, in 2007, scientists at MIT looked at the web again and estimated that taking out the major hubs would wipe out about 70% of US Internet traffic, but that peering would allow about 33% of the traffic to keep working. And at that time peering was somewhat new.

Since then there is a lot more peering, but one has to ask if the Internet is any safer from catastrophic outage than it was in 2007. One thing to consider is that a lot of the peering happens today at the major Internet hubs. In those locations the various carriers exchange traffic directly with each other rather than paying fees to send the traffic through an ‘Internet port’, which is nothing more than a point where some carrier will determine the best routing of the traffic for you.

And so peering at the major Internet hubs is a great way to save money, but it doesn’t really change the way the Internet traffic is routed. My clients are smaller ISPs, and I can tell you how they decide to route Internet traffic. The smaller ones find a carrier who will transport it to one of the major Internet hubs. The larger ones can afford diversity, and so they find carriers who can carry the traffic to two different major Internet hubs. But by and large every bit of traffic from my clients goes to and through the handful of major Internet hubs.

And this makes economic sense. The original hubs grew in importance because that is where the major carriers at the time, companies like MCI and Qwest, already had switching hubs. And because transport is expensive, every regional ISP sent their growing Internet traffic to the big hubs because that was the cheapest solution.

If anything, there might be more traffic routed through the major hubs today than there was in 2007. Every large fiber backbone and transport provider has arranged their transport networks to get traffic to these locations.

In each region of the country my clients are completely reliant on the Internet hubs. If a hub like the one in Dallas or Atlanta went down for some reason, ISPs that send traffic to that location would be completely isolated and cut off from the world.
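That isolation risk is easy to see with a toy model. The sketch below builds a hypothetical hub-and-spoke topology in Python (the hub names echo the ones above, but the links and ISP names are made up) and checks which spoke ISPs can still reach the rest of the backbone after one hub is removed.

```python
from collections import defaultdict, deque

# Hypothetical hub-and-spoke topology; the links and ISP names are illustrative.
links = [
    ("Dallas", "Atlanta"), ("Dallas", "Chicago"),
    ("Atlanta", "N_Virginia"), ("Chicago", "N_Virginia"),
    ("Dallas", "ISP_TX_1"), ("Dallas", "ISP_OK_1"),   # spokes homed only to Dallas
    ("Atlanta", "ISP_GA_1"), ("Chicago", "ISP_MN_1"),
]

def reachable_from(start, links, removed_hub):
    """Breadth-first search of the topology with one hub taken out of service."""
    graph = defaultdict(set)
    for a, b in links:
        if removed_hub not in (a, b):
            graph[a].add(b)
            graph[b].add(a)
    seen, queue = {start}, deque([start])
    while queue:
        for neighbor in graph[queue.popleft()]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

still_up = reachable_from("N_Virginia", links, removed_hub="Dallas")
spokes = sorted({n for link in links for n in link if n.startswith("ISP_")})
for isp in spokes:
    print(f"{isp}: {'online' if isp in still_up else 'cut off'}")
```

Any ISP that only hands traffic to the failed hub goes dark, which is exactly the position most smaller ISPs are in today.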

There was a recent report in the Washington Post that said that the NSA had agents working at only a handful of major US Internet POPs because that gave them access to most of the Internet traffic in the US. That seems to reinforce the idea that the major Internet hubs in the country have grown in importance.

In theory the Internet is a disaggregated, decentralized system, and if traffic can’t go the normal way, then it finds another path to take. But this idea only works assuming that ISPs can get traffic to the Internet in the first place. A disaster that takes out one of the major Internet hubs would cut off a lot of towns in the region around it from any Internet access. Terrorist attacks that take out more than one hub would wipe out a lot more places.

Unfortunately there is no grand architect behind the Internet who is looking at these issues, because no one company has any claim to deciding how the Internet works. Instead the carriers involved have all migrated to the handful of locations where it is most economical to interconnect with each other. I sure hope, at least, that somebody has figured out how to make those hub locations as safe as possible.

Categories
Regulation - What is it Good For? Technology

Time for a New Spectrum Plan

The spectrum in this country is a mess. And this is not necessarily a complaint against the FCC because much of the mess was not foreseeable. But the FCC has contributed at least some to the mess and if we are going to be able to march into the future we need to start from scratch and come up with a new plan.

Why is this needed? It’s because of the sheer volume of devices and uses that we see coming for wireless spectrum. The spectrum that the wireless carriers are using today is already inadequate for the data that they are selling to customers. The cellular companies are only making it because a large percentage of the wireless data is being handed off to WiFi today. But what happens when WiFi gets too busy or if there are just too many devices?

As of early 2013 there were over half a billion Internet-connected devices in the US. This is something that ISPs can count, so we know the number is fairly accurate. And the number of devices being connected is growing really quickly. We are not device nuts in my house and our usage is pretty normal, yet we have a PC, a laptop, a tablet, a reader and two cell phones connected to wireless. And I am contemplating adding the TV and putting in a new burglar alarm system, which would easily double our devices overnight.

A huge number of devices are counting on WiFi to work adequately to handle everything that is needed. But we are headed for a time when WiFi is going to be higher power and capable of carrying a lot more data, and with that comes the risk that the WiFi waves will get saturated in urban and suburban environments. If every home has a gigabit router running full blast a lot of the bandwidth is going to get cancelled out by interference.

What everybody seems to forget, and which has already been seen in the past with other public spectrum, is that every frequency has physical limits. And our giant conversion to the Internet of Things will come to a screeching halt if we ask more of the existing spectrum than it can physically handle.

So let’s jump back to the FCC and the way it has handled spectrum. Nobody saw the upcoming boom in wireless data two decades ago. Three decades ago the smartest experts in the country were still predicting that cell phones would be a market failure. But for the last decade we have known what was coming – and the use of wireless devices is growing faster than anybody expected, due in part to the success of smartphones. But we are on the edge of the Internet of Things needing gigantic bandwidth, which will make cell phone data usage look tiny.

One thing the FCC has done that hurts the way we use data is to chop almost every usable band of spectrum into a number of small channels. There are advantages to this in that different users can grab different discrete channels without interfering with other users, but the downside to small channels is that any given channel doesn’t carry much data. So one thing we need is some usable spectrum with broader channels.
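Shannon’s capacity formula, C = B·log2(1 + SNR), is the physics behind that point: for a given signal quality, the data a channel can carry scales directly with its width. Here is a quick illustration in Python; the 20 dB signal-to-noise ratio and the channel widths are assumptions chosen only to show the scaling.

```python
import math

# Shannon capacity C = B * log2(1 + SNR) -- an upper bound on channel throughput.
snr_db = 20                       # assumed signal-to-noise ratio
snr = 10 ** (snr_db / 10)         # convert dB to a linear ratio

for bandwidth_mhz in (5, 20, 100):
    capacity_mbps = bandwidth_mhz * math.log2(1 + snr)   # MHz x bits/Hz = Mbps
    print(f"{bandwidth_mhz} MHz channel at {snr_db} dB SNR -> ~{capacity_mbps:.0f} Mbps ceiling")
```

No amount of clever engineering gets past that ceiling, which is why broader channels, or the freedom to roam across more of them, are the only real fix.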

The other way we can get out of the spectrum pinch is to reallocate more spectrum to wireless data and then let devices roam over a large range of spectrum. With software defined radios we now have chips that are capable of using a wide variety of spectrum and can change on the fly. So a smart way to move into the future is to widen the spectrum available to our wireless devices. If one spectrum is busy in a given local area the radios can find something else that will work.

Anybody who has ever visited a football stadium knows what it’s like when spectrum gets full. Practically nobody can get a connection and everybody is frustrated. If we are not careful, every downtown and suburban housing area is going to look like a stadium in terms of frequency usage, and nobody is going to be happy. We need to fix the spectrum mess and have a plan for a transition before we get to that condition. And it’s going to be here a lot sooner than anybody hopes.

Categories
Improving Your Business Technology The Industry

Delivering Gigabit Speeds

English: A gigabit HP-ProCurve network switch in a nest of Cat5 cables. (Photo credit: Wikipedia)

There is a lot of talk about companies like Google and many municipal networks delivering Gigabit speeds to homes. But what is not discussed is the fact that there are no existing wiring technologies that can deliver that bandwidth for any significant distance. Most people are shocked when they find out how quickly data speeds drop with existing wiring technologies.

Existing wiring is adequate to deliver Gigabit speeds to smaller homes or to small offices. Carriers have typically used category 5 wiring to deliver the data signal, and that technology can deliver 1 Gigabit for about 100 feet from the fiber terminal. But after that the speeds drop off significantly.

Wiring technology was never a significant issue when we were using the wiring to deliver slower data speeds. The same fall-off occurs regardless of the data speeds being delivered, but a customer won’t notice as much when a 20 Mbps data connection falls to a few Mbps as when a Gigabit connection falls to the same very slow speed.

Many carriers are thinking of using the new 802.11ac WiFi technology as a surrogate for inside wiring. But the speeds on WiFi drop off faster than speeds on data cabling. So one has to ask whether a customer ought to bother paying extra for a Gigabit if most of it doesn’t get delivered to their devices.

Below is a chart that compares the different technologies used today for data wiring along with a few that have been proposed, like WiGig. The speeds in this table are at the ‘application layer’. These are theoretical speeds, but they are the easiest numbers to use in a chart because they are the speeds each technology touts when being promoted. But you must note that actual delivered data speeds are significantly lower than these application-layer speeds for every technology listed, due to such things as protocol overhead and the modulation techniques used.
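As a concrete example of the gap between advertised and delivered speeds, here is a rough TCP-over-Gigabit-Ethernet goodput estimate in Python. The frame and header sizes are standard Ethernet/IP values; treating the whole link as a single TCP stream is a simplifying assumption.

```python
# Rough goodput estimate for TCP over Gigabit Ethernet.
link_rate_mbps = 1000        # the advertised "application layer" rate
tcp_payload = 1460           # data bytes per 1500-byte MTU (40 bytes of IP/TCP headers)
wire_bytes = 1500 + 38       # MTU + preamble(8) + MAC header(14) + FCS(4) + inter-frame gap(12)

goodput_mbps = link_rate_mbps * tcp_payload / wire_bytes
print(f"~{goodput_mbps:.0f} Mbps of usable data on a {link_rate_mbps} Mbps link")
# ~949 Mbps -- and that is before any loss from long cable runs or WiFi contention.
```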

The technology that stands out on the chart is ultra-broadband from PulseLink of Carlsbad, California. PulseLink uses the radio frequency (RF) spectrum on coaxial cable above 2 GHz and can deliver data rates exceeding 1 Gbps. They are marketing the technology under the name CWave. This technology uses a wide swath of RF spectrum in the 3 to 5 GHz range. As a result the RF signal is out-of-band (OOB) to both cable TV and satellite and will peacefully co-exist with both. Typically RF spectrum above 3 GHz on coax has been considered unusable, but due to the unique techniques used in PulseLink’s CWave chipset the technology reliably delivers Gigabit data rates while not disturbing existing frequencies used by cable TV and cable modems. Effectively it adds a whole new Ethernet data path over existing coaxial cable and needs no new wiring where coax is already present.

The differences in the various technologies really matter when you are looking at delivering data to larger buildings like schools and hospitals. As was recently in the news, President Obama announced a ConnectED initiative that has the stated goal of bringing a minimum of 100 Mbps, and a goal of 1 Gbps, to 99% of students within five years. But there does not seem to be any good reason to bring a gigabit to a school if only a tiny fraction of that bandwidth can be delivered to the classrooms. I think that the PulseLink ultra-broadband technology might be the only reasonable way to get broadband to our classrooms.

Categories
Technology

The Latest in Home Security

Home security (Photo credit: Wikipedia)

Anybody following this blog knows that I have been promoting the idea of telecom providers getting into the home security business. I see this as one of the ways that you are going to keep yourself relevant with the advent of the Internet of Things.

Modern home security centers are already a lot more than simple alarm panels, and they can also be the platform used for home automation and energy management. There are numerous devices being made that function as the gateway to any Ethernet device in your home that can be connected with wires or with wireless technologies. These main consoles can then interface with the user through smart phones or other such devices.

Of course home security still does the basic stuff. You can set up your house with monitors on doors and windows that will tell you when something changes. But modern security systems can do so much more. Here are some examples:

  • Everything can be tied into your smart phone so that you have access to your security system at all times. You can use your phone to change settings, to peek in on any of the cameras or even to speak with somebody who is at your front door even if you are not at home.
  • You can tie normal security features in with motion detectors. This will tell you if something is moving in a room that ought to be empty. But it can also do cool stuff like alert you when anybody approaches the external doors in your house. So rather than wait until somebody has broken in you can be alerted when somebody is at one of your doors. It’s not all that useful to know when the mailman comes every day, but it’s very comforting to know that you can be alerted when somebody is at your back door at 2:00 in the morning.
  • The systems can be tied into a number of other kinds of monitors. Certainly you can tie this into smoke detectors, but you can also monitor if the temperature changes drastically in any room. You can monitor for carbon monoxide levels (and if you are really paranoid, for many other kinds of chemicals and gases).
  • New systems include voice recognition and you can talk to your system. This allows you to change settings on the fly. For example, you can just tell your system that you will be working in a certain room and to ignore monitoring that room for a while. But your security system can then help with those absent-minded people like me. If you turn off the security in an area for a while, you can set it to ask you later if you still want it off.
  • Your system can get to know you. Sophisticated systems are starting to use things like face recognition and gait sensors so that your security system will know it’s you walking around on the lawn at midnight and not a stranger.
  • And it’s all cloud based, meaning that you can get an alert if the power goes out on your system while you are not at home. Turning off the power to a home has always been a common burglar technique for confounding a security system, but the system can be set to alert your smart phone every time the power goes out.
  • And of course, there are cameras to view or record everything. You can set your cameras up with some smarts to only view unusual events or events of a certain kind so that you are only storing views of things that matter. But the cameras give you the ability to monitor pets or babysitters while you are not at home. With cheap cloud storage you can record a lot of video.
  • There are now smart door locks that are tied to the security systems. These can use some combination of proximity to cell phone, voice or face recognition to allow keyless entry.
  • For those times when you drive away from home and can’t remember if you set the alarm a certain way, your system can be tied into your smart phone’s GPS and it can ask you if you want the alarms on once it senses you are away from the home. Side benefit – you are always tracking the location of your cell phones if you want to see where your kids really are.
  • Your customers can monitor it all themselves. It’s no longer necessary to have the security system tied into some center that will alert the police. A customer who is never without their smart phone can take a more active role and get all of the alerts if they so choose.

Most of these changes have been introduced within the last few years and one can imagine that many more changes will be coming in the next decade. So the best platform is one that is software driven and that can be upgraded to accept new devices and new features as they hit the market.
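To make the ‘software driven’ point concrete, here is a minimal sketch of the kind of rule engine such a platform might run. The event names, the rule structure and the notify() stub are all hypothetical and do not reflect any particular vendor’s API.

```python
from datetime import datetime

def notify(message):
    # Stand-in for pushing an alert to the homeowner's smart phone.
    print(f"PUSH ALERT: {message}")

def handle_event(event, now=None):
    """Apply simple, user-editable rules to incoming sensor events."""
    now = now or datetime.now()
    overnight = now.hour >= 22 or now.hour < 6
    if event["type"] == "motion" and event["zone"] == "back_door" and overnight:
        notify("Motion at the back door overnight")
    elif event["type"] == "power_loss":
        notify("Power to the security panel was cut")
    elif event["type"] == "smoke":
        notify(f"Smoke detected in {event['zone']}")

handle_event({"type": "motion", "zone": "back_door"}, now=datetime(2013, 9, 20, 2, 0))
handle_event({"type": "power_loss", "zone": "panel"})
```

Because the rules live in software, adding a new sensor type or a new behavior is an update rather than a hardware swap.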

Categories
Improving Your Business Technology

Should You Be Peering?

Google-branded refrigerator (Photo credit: Aray Chen)

No, this is not an invitation for you to become peeping toms, dear readers. By peering I am talking about the process of trading Internet traffic directly with other networks to avoid paying to transport all of your Internet traffic to the major Internet POPs.

Peering didn’t always make a lot of sense, but there has been a major consolidation of web traffic to a few major players that has changed the game. In 2004 there were no major players on the web and internet traffic was distributed among tens of thousands of websites. By 2007 about 15,000 networks accounted for about half of all of the traffic on the Internet. But by 2009 Google took off and it was estimated that they accounted for about 6% of the web that year.

And Google has continued to grow. There were a number of industry experts who estimated at the beginning of this year that Google carried 25% to 30% of all of the traffic on the web. But on August 16 Google went down for about 5 minutes and we got a look at the real picture. A company called GoSquared Engineering tracks traffic on the web worldwide, and when Google went down they saw an instant 40% drop in overall web traffic, as shown in their graph titled ‘Google’s downtime caused a 40% drop in global traffic’.

And so, when Google went dead for a few minutes, they seem to have been carrying about 40% of the web traffic at the time. Of course, the percentage carried by Google varies by country and by time of day. For example, in the US a company called Sandvine, which sells Internet tracking systems, estimates that Netflix uses about 1/3 of US Internet bandwidth between 9 P.M. and midnight in each time zone.

Regardless of the exact percentages, it is clear that a few networks have grabbed enormous amounts of web traffic. And this leads me to ask my clients whether they should be peering. Should they be trying to hand traffic directly to Google, Netflix or others to save money?

Most carriers have two major cost components to deliver their Internet traffic – transport and Internet port charges. Transport is just that: a fee, often mileage-based, that pays for getting across somebody else’s fiber network to reach the Internet. The port charges are the fees that are charged at the Internet POP to deliver traffic into and out of the Internet. For smaller ISPs these two costs might be blended together in the price you pay to connect to the Internet. So the answer to the question is: anything that can produce a net lowering of one or both of these charges is worth considering.

Following is a short list of ways that I see clients take advantage of peering arrangements to save money:

  • Peer to Yourself. This is almost too simple to mention, but not everybody does this. You should not be paying to send traffic to the Internet that goes between two of your own customers. This is sometimes a fairly significant amount of traffic, particularly if you are carrying a lot of gaming or have large businesses with multiple branches in your community.
  • Peer With Neighbors. It also makes sense sometimes to peer with neighbors. These would be your competitors or somebody else who operates a large network in your community, like a university. Again, there is often a lot of traffic generated locally because of local commerce. And the amount of traffic between students and a university can be significant.
  • Peering with the Big Data Users. And finally there is the question of whether you should try to peer with Google, Netflix or other large users you can identify. There are several ways to peer with these types of companies:
    • Find a POP they are at. You might be able to find a Google POP or a data center somewhere that is closer than your Internet POP. You have to do the math to see if buying transport to Google or somebody else costs less than sending it on the usual path.
    • Peer at the Internet POP. The other way to peer is to go ahead and carry the traffic to the Internet POP, but once there, split your traffic and take traffic to somebody like Google directly to them rather than pay to send it through the Internet port. If Google is really 40% of your traffic, then this would reduce your port charges by as much as 40% and that would be offset by whatever charges there are to split and route the traffic to Google at the POP.
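To make the math in that last bullet concrete, here is a simple cost comparison in Python. The 40% Google share comes from the discussion above; the traffic volume and the per-Mbps prices for the port and the peering connection are purely hypothetical placeholders for your own numbers.

```python
# Hypothetical comparison: buy a full Internet port vs. split off the
# Google-bound share to a direct peering connection.  Prices are placeholders.
total_mbps = 10_000        # traffic handed off at the Internet POP
google_share = 0.40        # share of traffic bound for the big peer (from the text)
port_price = 2.00          # $/Mbps/month for the Internet port (assumption)
peer_price = 0.50          # $/Mbps/month for the cross-connect/peering port (assumption)

all_via_port = total_mbps * port_price
with_peering = (total_mbps * (1 - google_share) * port_price
                + total_mbps * google_share * peer_price)

print(f"Everything through the Internet port: ${all_via_port:,.0f}/month")
print(f"With a direct peer for that 40%:      ${with_peering:,.0f}/month")
```

If the peering connection really is cheaper per megabit than the port, the savings scale with the share of traffic you can divert, which is why it only takes some investigation and arithmetic to know whether it is worth doing.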

I don’t think you have to be a giant ISP any more to take advantage of peering. Certainly make sure you are peeling off traffic between your own customers, and investigate local peering if you have a significant amount of local traffic. It just takes some investigation to see if you can do the more formal peering with companies like Google. It’s going to be mostly a matter of math whether peering will save you money, but I know of a number of carriers who are making peering work to their advantage. So do the math.

Categories
Regulation - What is it Good For? Technology

FCC Makes Changes to 60 GHz Spectrum

United States radio spectrum frequency allocations chart as of 2003 (Photo credit: Wikipedia)

On August 12, 2013 the FCC, in [ET Docket No 07-113], amended the outdoor use rules for the 60 GHz spectrum. The changes were prompted by the industry to make the spectrum more useful. This spectrum is more commonly known as millimeter wave spectrum, meaning it has a very short wavelength, and it operates between 57 GHz and 64 GHz. Radios at high frequencies like this have very short antennas which are typically built into the unit.

The spectrum is used today in two applications: a) as outdoor short-range point-to-point systems used in place of fiber, such as connecting two adjacent buildings, and b) as in-building transmission of high-speed data between devices for functions such as transmitting uncompressed high-definition (HD) video between devices like Blu-ray recorders, cameras, laptops and HD televisions.

The new rules modify the outdoor usage to increase power and thus increase the distance of the signal. The FCC is allowing an increase in emissions from 40 dBm to 82 dBm, which will increase the outdoor distance for the spectrum up to about 1 mile. The order further eliminates the need for outside units to send an identifying signal, which now makes this an unlicensed application. This equipment will be available to be used by anybody, with the caveat that it cannot interfere with existing in-building uses of the spectrum.
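For a sense of what those numbers mean, dBm converts to watts with P = 10^((dBm - 30)/10). The conversion below is just arithmetic; my reading that the 82 dBm figure is an effective radiated power (EIRP) achieved through a very narrow, high-gain antenna beam is an assumption, not something stated in the order.

```python
# Convert the dBm limits in the order to watts.
def dbm_to_watts(dbm):
    return 10 ** ((dbm - 30) / 10)

for level_dbm in (40, 82):
    print(f"{level_dbm} dBm = {dbm_to_watts(level_dbm):,.1f} W")
# 40 dBm = 10 W; 82 dBm is roughly 158,000 W of effective radiated power, which
# is presumably only reachable as EIRP through an extremely tight beam.
```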

One of the uses of these radios is that multiple beams can be sent from the same antenna site due to the very tight confinement of the beams. One of the drawbacks of this spectrum is it is susceptible to interference from heavy rain, which is a big factor in limiting the distance.

Radios in this spectrum can deliver up to 7 Gbps of Ethernet (minus some for overhead), and so this is intended as an alternative to fiber drops to buildings needing less bandwidth than that limit. A typical use for this might be to connect multiple buildings in a campus or office park environment rather than having to build fiber. The FCC sees this mostly as a technology to be used to serve businesses, probably due to the cost of the radios involved.

Under the new rules the power allowed by a given radio is limited to the precision of the beam created by that radio. Very precise radios can use full power (and get more distance) while the power and distance are limited for less precise radios.

The FCC also sees this as an alternative for backhaul to 4G cellular sites, although the one-mile limitation is a rather short one – most 4G sites that are within a mile of fiber have already been connected.

This technology will have a limited use, but there will be cases where using these radios could be cheaper than installing fiber and/or dealing with inside wiring issues in large buildings. I see the most likely use of these radios to get to buildings in crowded urban environments where the cost of leasing fiber or entrance facilities can be significant.

The 60 GHz spectrum has also been allowed for indoor use for a number of years. The 60 GHz band, when used indoors, has a lot of limitations related to both cost and technical issues. The technical limitations are that 60 GHz must be line-of-sight and the spectrum doesn’t go through walls. The transmitters are also very power hungry and require big metal heat sinks and high-speed fans for cooling. Even if a cost-effective 60 GHz solution were to be available tomorrow, battery-operated devices would need a car battery to power them.

One issue that doesn’t get much play is the nature of the 60 GHz RF emissions. 60 GHz can radiate up to 10 watts under the spectrum mask currently in place for indoor operation. People are already concerned about the 500 mW from a cell phone or WiFi, and it is a concern in a home environment to have constant radiation at 10 watts of RF energy. That’s potentially 1/10 the power of a microwave oven radiated in your house and around your family all of the time.

Maybe at some point in the distant future there may be reasonable applications for indoor use of 60 GHz in some vertical niche market, but not for years to come.
