Categories: Current News, What Customers Want

We Don’t Have Enough Bandwidth

I read three different articles Friday that have a common theme – we just don’t have enough bandwidth in this country.

The first article, from the Fiber to the Home Council, reports on a recent survey. The survey found that video viewing over the Internet is growing faster than expected, led by the viewing habits of the young. One third of young viewers watch video on a cell phone or tablet at the same time that they watch TV, and 12% of viewers under 35 report watching all of their content over the Internet.

The article also points to a recent report from Conviva, a web optimization company, which sampled 22.6 billion video streams and found that 60% of them suffered some degradation due to inadequate bandwidth.

The gist of the article is that demand keeps growing while many parts of the Web are near or at a breaking point in terms of capacity and quality. It’s also evidence that homes don’t just want to watch one streaming video; they want to watch several at once.

In another article, Time Warner announced that it would roll out significantly faster Internet service, but only in competitive markets. The upgrades will come in markets where they face fast competitors, such as places where Verizon has built FiOS, where AT&T has its relatively fast U-verse and where municipalities have built fiber networks. The company says that they will upgrade to DOCSIS 3 and also install much faster wireless routers. They also will upgrade the DVRs in these markets and roll out apps that are designed for the faster Internet.

But Time Warner also made it clear that they have no plans to upgrade markets where there is no fast competition. My takeaway from this article is that a lot of the incumbent providers are still only doing upgrades in response to direct competition. Otherwise they are quite satisfied with the status quo and only make investments under duress.

Finally, the citizens of Bergen County, New Jersey have started a petition to ask their politicians to offer whatever is necessary to attract Google fiber to the county. Bergen County is the most populous county in the state.

I find this somewhat surprising because most of the people in this county already have Verizon FiOS available, and Verizon recently said they plan to have all of New Jersey covered by FiOS. Most of the rest of the country would be thrilled to be upgraded to the kinds of speeds available in Bergen County. FiOS speeds differ by market, but most markets have speeds available from 15 Mbps to 150 Mbps download, and a few markets have 300 Mbps and 500 Mbps speeds available. Of course, Google would bring 1 Gbps speeds for a little more than what people are paying for 50 Mbps from Verizon.

My takeaway from this is that people are beginning to realize how important very fast Internet service is. Even those who already have some of the fastest Internet speeds in the country do not see what they have as a good value.

Unfortunately for the citizens of Bergen County, I find it highly unlikely that Google will ever build to compete against another fiber network. Verizon could easily upgrade their network to compete with Google on speed and price, and the conventional wisdom is that nobody is going to build a second fiber network to homes, because both fiber owners would go broke competing against each other.

But all of these articles are indicative of the daily articles I see that continue to highlight the big gap between the bandwidth people want and what they are being offered in the market. We just don’t have enough bandwidth in the country, at least according to consumers.

Categories: Current News, Technology

Europe Has the Right Goals

The European Commission issued a press release yesterday announcing that 100% of the households in Europe now have access to broadband.

Most households have some sort of wired access, with 96.1% of homes having access to copper, coax or fiber. Wireless coverage with 2G, 3G or 4G reaches 99.4% of homes. And all remote homes are now covered by satellite broadband using a network of 148 satellites.

Before anybody argues that we have the same thing here in the US due to satellite, we need to distinguish between the satellite broadband available here and what is available in Europe. Basic satellite service in Europe is only $13 per month. I can’t find the speed for that tier, but I assume it delivers a few Mbps download. But customers can get 20 Mbps download from satellite for $33 per month.

In the US there are two major satellite providers. ViaSat Exede offers a 12 Mbps download service priced by the usage cap you choose:

  • $50 per month for 10 GB
  • $80 per month for 15 GB
  • $130 per month for 25 GB

HughesNet offers four tiers, each with its own data cap:

  • 5 Mbps down / 1 Mbps up for $50 per month (10 GB cap)
  • 10 Mbps down / 1 Mbps up for $60 per month (20 GB cap)
  • 10 Mbps down / 2 Mbps up for $80 per month (30 GB cap)
  • 15 Mbps down / 2 Mbps up for $130 per month (40 GB cap)

Speed isn’t everything, and the caps matter. Just to put those data caps into perspective, a 2-hour HD movie runs between 3 and 4.5 GB. So homes in the US using satellite are very limited in using their connection to view video.
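
To put numbers on that, here is a quick back-of-the-envelope sketch in Python. The caps come from the plans listed above and the 3 to 4.5 GB movie size is the range just cited; nothing else is assumed:

    # How many 2-hour HD movies fit under each US satellite data cap?
    # Caps (GB) are from the plans above; movie sizes use the 3 to 4.5 GB
    # range cited in this post.
    caps_gb = {
        "Exede $50": 10, "Exede $80": 15, "Exede $130": 25,
        "HughesNet $50": 10, "HughesNet $60": 20,
        "HughesNet $80": 30, "HughesNet $130": 40,
    }

    for plan, cap in caps_gb.items():
        low, high = cap / 4.5, cap / 3.0
        print(f"{plan}: about {low:.0f} to {high:.0f} movies per month")

Even the most expensive plans top out at around a dozen movies a month, with nothing left over for the rest of a household’s traffic.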

The US satellite companies are also limited since they only have a few satellites capable of delivering the above products. If those satellites get oversubscribed then actual speeds will be slower than advertised in the same way that a cable modem system can bog down in the evening hours. But with more satellites in Europe the speeds can be faster and there is a lot less chance of congestion and oversubscription.

The Europeans also have goals to speed up Internet access. Their goal is that by 2020 every citizen will be able to get 30 Mbps download speeds, with at least half having access to 100 Mbps.

This is easy to contrast with the US, where the current national definition for terrestrial broadband is 4 Mbps down and 1 Mbps up. Both stimulus grants and borrowing from the RUS have recently financed networks built to deliver just those speeds.

If we don’t set high goals in the US, and if we are content to finance rural broadband that delivers slow speeds when it is brand new, we are relegating the rural areas to having slow broadband for decades to come.

In the US we are more given to grand announcements that don’t come with any funding or mandates. For example, earlier this year the FCC set a goal of having a Gigabit City in every state of the country. That means a network that is capable of delivering a gigabit of download speeds to customers.

Don’t get me wrong, I would love to live in one of those few places where you can get a gigabit. But this is a completely voluntary system, and a Gigabit City might only need to sell that much speed to a handful of customers to earn the designation. Rather than trying to get one city in each state to provide a few customers with gigabit download speeds, we ought to concentrate on making our basic broadband a lot faster than 4 Mbps. When that lowly speed is our national goal, we are telling rural America not to expect anything better.

The Europeans have it right and we have it wrong. And a decade from now when we are far behind them in terms of productivity we can look back on the crappy national goals we set for ourselves.

Categories: Current News, Improving Your Business, Regulation - What is it Good For?

Another Idea for Rural Broadband

The Fiber-to-the-Home Council (FTTHC) has asked the FCC to consider a new way to fund rural broadband. Their proposal asks the FCC to make unused portions of the Universal Service Fund available as grants to build gigabit fiber networks. The grants would be awarded under a competitive process, meaning that the networks that could build most efficiently would be at the top of the grant list.

It’s an intriguing idea. I have often talked in this blog about the state of broadband in rural America. Consider some of the following rural broadband issues:

  • About a year and a half ago the FCC estimated that there were still about 14 million rural households with no access to any kind of terrestrial broadband. There have been some projects in the last year that now serve some of these customers, but the number is probably not much smaller.
  • In the FCC’s last three Broadband Progress Reports the agency said that incumbent carriers were not upgrading their networks to meet the FCC’s minimum definition of broadband fast enough. Those speeds are currently 4 Mbps download and 1 Mbps upload. The FCC has promised to revisit that definition every few years, and supposedly will increase it over time.
  • There is often a big difference between advertised speeds and actual speeds. Getting 4 Mbps download is barely enough bandwidth for a household to participate in today’s web, and if the actual speeds delivered are less than that, then it’s hard to call the service broadband by today’s standards.
  • The availability of rural broadband depends upon where a customer lives. If they live in a large enough rural town then they might have broadband available from either the telco or the cable company, and sometimes from both. But cable networks rarely extend much past the borders of these small towns, and DSL rarely carries more than a mile or two from the center of town. So there are many rural counties that have some broadband in the towns but practically none outside them.
  • Most urban areas now have cable modem service that is advertised at between 10 Mbps and 20 Mbps. And urban broadband keeps improving. Rural areas are permanently falling behind and the gap is going to widen over time. This has huge implications for the long-term economic viability of rural America.

Of course, there are some organizations that have opposed this idea, mostly organizations funded by the incumbent telcos and cable companies. This always has me scratching my head. For the most part the large telcos and cable companies have ignored rural America for one or even two decades. They have not poured new capital into these areas to bring them up to the needed speeds, and they spend as little as possible to keep these areas operating. I contrast this with the small independent telcos, who generally do an excellent job in rural America, but there are still large swaths of rural America that have been largely ignored. And even while ignoring these areas the large telcos want to protect their revenue streams.

I guess that is good business, but it is poor policy. In my mind broadband is basic infrastructure, and homes and businesses need adequate broadband in order to take part in modern society. And this is about to become much more important as we move into the Internet of Things. It’s one thing to not provide enough broadband for a rural home to watch streaming video. But when our healthcare is monitored over the Internet, broadband becomes an essential component of every home’s life.

The rural broadband crisis is already here and the broadband gap is already unacceptable. The FTTHC’s proposal is creative and doesn’t ask for any additional government funds. They are asking that the FCC make an investment today in rural areas as a down-payment to help those areas stay viable as places for people to live in the future. I would assume that any awards of funds are also going to expect the rural communities to chip in a lot of matching funds, and so all that is being asked is to help these communities help themselves. I think it is an idea worthy of FCC consideration.

Categories: Improving Your Business, Technology, The Industry

Delivering Gigabit Speeds

There is a lot of talk about companies like Google and many municipal networks delivering gigabit speeds to homes and businesses. But what is not discussed is the fact that no existing wiring technology can deliver that bandwidth for any significant distance. Most people are shocked when they find out how quickly data speeds drop with existing wiring technologies.

Existing wiring is adequate to deliver gigabit speeds to smaller homes or small offices. Carriers have typically used category 5 wiring to deliver the data signal, and that technology can carry a gigabit for about 100 meters from the fiber terminal. But beyond that the speeds drop off significantly.

Wiring technology was never a significant issue when we were using the wiring to deliver slower data speeds. The same fall-off occurs regardless of the data speeds being delivered, but a customer won’t notice as much when a 20 Mbps data connection falls to a few Mbps as when a Gigabit connection falls to the same very slow speed.

Many carriers are thinking of using the new 802.11ac WiFi technology as a surrogate for inside wiring. But speeds on WiFi drop off even faster than speeds on data cabling. So one has to ask whether a customer ought to bother paying extra for a gigabit if most of it never gets delivered to their devices.

Below is a chart that compares the different technologies used today for data wiring, along with a few that have been proposed, like WiGig. The speeds in this table are at the ‘application layer’. These are theoretical speeds, but they are the easiest numbers to use in a chart because they are the speeds each technology touts when being promoted. But note that actual delivered speeds are significantly less than these application-layer speeds for every technology listed, due to things like protocol overhead and bandwidth lost to modulation techniques.
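
As a rough illustration of the gap between application-layer and delivered speeds, here is a toy calculation in Python. The efficiency factors are my own illustrative assumptions, not measured values for any product:

    # Illustrative only: the efficiency factors are assumptions, not
    # measurements. Delivered throughput is the advertised application-layer
    # rate times what remains after protocol overhead and modulation losses.
    advertised_mbps = {"Gigabit Ethernet": 1000, "802.11ac WiFi": 1300}
    assumed_efficiency = {"Gigabit Ethernet": 0.94, "802.11ac WiFi": 0.50}

    for tech, rate in advertised_mbps.items():
        delivered = rate * assumed_efficiency[tech]
        print(f"{tech}: advertised {rate} Mbps, roughly {delivered:.0f} Mbps delivered")

The exact numbers vary by product and environment, but the pattern holds: wireless technologies in particular deliver far less than the headline rate.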

The technology that stands out on the chart is ultra-broadband from PulseLink of Carlsbad, California. PulseLink uses the radio frequency (RF) spectrum on coaxial cable above 2 GHz and can deliver data rates exceeding 1 Gbps. They are marketing the technology under the name CWave. The technology uses a wide swath of RF spectrum in the 3 to 5 GHz range. As a result the RF signal is out-of-band (OOB) to both cable TV and satellite and will peacefully coexist with both. RF spectrum above 3 GHz on coaxial cable has typically been considered unusable, but due to the unique techniques used in PulseLink’s CWave chipset the technology reliably delivers gigabit data rates while not disturbing the frequencies used by cable TV and cable modems. Effectively it adds a whole new Ethernet data path over existing coaxial cable and needs no new wires where coax is already present.

The differences between the various technologies really matter when you are looking at delivering data to larger buildings like schools and hospitals. As was recently in the news, President Obama announced the ConnectED initiative, which has the stated goal of bringing a minimum of 100 Mbps, and a target of 1 Gbps, to 99% of students within five years. But there does not seem to be any good reason to bring a gigabit to a school if only a tiny fraction of that bandwidth can be delivered to the classrooms. I think that PulseLink’s ultra-broadband technology might be the only reasonable way to get that bandwidth to our classrooms.

Categories: Current News, Regulation - What is it Good For?

The FCC’s Data Collection Effort

The FCC just changed the way that they are going to gather data from carriers about voice and data usage in the US. To some degree they seem to be throwing in the towel and just giving up.

I have blogged before about the massive inadequacies of the National Broadband Map. This is an FCC-sponsored effort to show the availability of broadband on a geographic basis. This sounds like a laudable goal, but the carriers decide what information they want to supply to the mapping process, and so the map is full of what can only be described as major lies from the largest carriers. They claim to have broadband where they don’t and at speeds far greater than they actually deliver.

The FCC announced new rules for their data collection process that is done using FCC Form 477. This revised effort by the FCC is going to make their data gathering more like the process that is used to collect data for the National Broadband Map. They are no longer going to try to collect actual data speeds in tiers, but instead will be collecting only advertised speeds for data – the fastest advertised speed for landline providers and the slowest advertised speeds for wireless providers. For the life of me I can’t imagine how this data can be of the slightest use to anybody.

I recently worked with a client in a small town in Oregon. The incumbent providers there are the biggest telephone company and cable company in the state. In both cases, they advertise the same speeds in this small town that they advertise in Portland. But in this town, as in most of rural America, the actual speeds delivered are far slower. My client thinks the fastest cable modem speeds in the town are 3 to 5 Mbps download and the fastest DSL is not much over 1.5 Mbps. And yet both carriers advertise products at many times those speeds.

This would just be a big annoyance if it wasn’t for the fact that the FCC and others use the data gathered to talk about what a great job the carriers are doing in this country to supply broadband. I recently saw an announcement that 98% of households now have broadband availability. And since the FCC’s definition of broadband is now a download speed of 4 Mbps and an upload speed of 1 Mbps, this makes it sound like the country’s broadband problems are being solved. But announcements of this sort are based upon lies and exaggerations by the carriers.

And since the whole point of this data gathering effort is to formulate policies to spur the carriers to do better, letting the carriers self-report whatever they want is like asking the fox to go count the eggs in the henhouse every morning. There is no practical penalty against a carrier advertising any speed they want or reporting falsely to the FCC. And it’s a lot easier, as it is with the Oregon example, for the incumbent providers to gear all of their advertising in a state around the urban markets. I have no idea if those incumbents in Oregon can actually deliver the advertised speeds in Portland, but I know for a fact that they do not do so outside of Portland.

The FCC is also changing the way that they gather information about VoIP lines. But I think the days when they could gather meaningful data about business phones in this country are over. There is such a proliferation of IP Centrex and other VoIP technologies that the carriers don’t even know what is being delivered. Consider this:

  • It’s now possible to use one number for a thousand lines in a call center or instead to give a thousand numbers to one phone.
  • There is a proliferation of resellers in the market who buy numbers and 911 access from larger carriers so that they don’t have to become a CLEC. These resellers can then deliver a wide variety of business voice services over anybody’s data connection. They will not report what they are doing to the FCC because most of them are not certified as carriers, but instead rely on the certification of the CLEC that gave them the numbers. Nobody in the FCC reporting chain is going to know about or report these kinds of customers and lines. And it gets worse, because I now know of many cases of resellers of these resellers. Literally almost anybody can become a carrier overnight by reselling these services. It’s back to the wild west days we used to see with long distance resale. I’m expecting to go to a telecom convention soon and see the shark-skin suits again.

Categories: Regulation - What is it Good For?, Technology

FCC Makes Changes to 60 GHz Spectrum

On August 12, 2013, in ET Docket No 07-113, the FCC amended the rules for outdoor use of the 60 GHz spectrum. The changes were prompted by industry requests to make the spectrum more useful. This spectrum is commonly known as millimeter-wave spectrum, meaning it has a very short wavelength; the band runs from 57 GHz to 64 GHz. Radios at such high frequencies have very short antennas, which are typically built into the unit.

The spectrum is used today in two applications: a) outdoor short-range point-to-point systems used in place of fiber, such as connecting two adjacent buildings, and b) in-building transmission of high-speed data between devices, such as sending uncompressed high-definition (HD) video between Blu-ray recorders, cameras, laptops and HD televisions.

The new rules modify outdoor usage to allow more power and thus more distance. The FCC is allowing an increase in emissions from 40 dBm to 82 dBm, which will extend the outdoor reach of the spectrum to about one mile. The order further eliminates the requirement for outdoor units to send an identifying signal, which makes this a fully unlicensed application. The equipment can be used by anybody, with the caveat that it cannot interfere with existing in-building uses of the spectrum.

Because the beams are so tightly confined, multiple beams can be sent from the same antenna site. One drawback of this spectrum is that it is susceptible to attenuation from heavy rain, which is a big factor in limiting the distance.
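
A back-of-the-envelope link budget shows why the distance is so limited. The free-space path loss formula below is the standard one; the oxygen-absorption and rain figures are rough textbook values I am assuming for illustration, not numbers from the FCC order:

    import math

    # Free-space path loss in dB, with distance in km and frequency in GHz.
    def fspl_db(distance_km, freq_ghz):
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    d_km = 1.6  # about one mile
    print(f"Free-space loss at 60 GHz: {fspl_db(d_km, 60):.0f} dB")  # ~132 dB
    print(f"Oxygen absorption (~15 dB/km near 60 GHz): {15 * d_km:.0f} dB")
    print(f"Heavy rain (assumed 15 dB/km): {15 * d_km:.0f} dB")

A heavy rainstorm can eat tens of dB of margin on top of an already steep path loss, which is why rain, and not just the power limit, caps these links at about a mile.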

Radios in this spectrum can deliver up to 7 Gbps of Ethernet (minus some for overhead), so this is intended as an alternative to fiber drops for buildings needing less bandwidth than that limit. A typical use might be connecting multiple buildings in a campus or office park environment rather than having to build fiber. The FCC sees this mostly as a technology to serve businesses, probably due to the cost of the radios involved.

Under the new rules the power allowed for a given radio is tied to the precision of the beam it creates. Very precise radios can use full power (and get more distance), while power and distance are limited for less precise radios.

The FCC also sees this as an alternative for backhaul to 4G cellular sites, although one mile is a rather short reach. Most 4G sites that are within a mile of fiber have already been connected.

This technology will have a limited use, but there will be cases where using these radios could be cheaper than installing fiber and/or dealing with inside wiring issues in large buildings. I see the most likely use of these radios to get to buildings in crowded urban environments where the cost of leasing fiber or entrance facilities can be significant.

The 60 GHz spectrum has also been allowed for indoor use for a number of years. Used indoors, the band has a lot of limitations related to both cost and technical issues. The technical limitations: 60 GHz must be line-of-sight, and the spectrum doesn’t go through walls. The transmitters are also very power-hungry and require big metal heat sinks and high-speed fans for cooling. Even if a cost-effective 60 GHz solution were available tomorrow, battery-operated devices would need a car battery to power them.

One issue that doesn’t get much play is the nature of the 60 GHz RF emissions. Under the spectrum mask currently in place for indoor operation, 60 GHz gear can radiate up to 10 watts. People are already concerned about the 500 mW from a cell phone or WiFi, and it is a concern in a home environment to have constant radiation at 10 watts of RF energy. That’s potentially around 1/100 the power of a microwave oven, radiated in your house and around your family all of the time.

Maybe at some point in the distant future there will be reasonable applications for indoor use of 60 GHz in some vertical niche market, but not for years to come.

Categories: The Industry, What Customers Want

The DSL TV Market

I find it surprising that DSL TV providers have been the fastest growing segment of the cable TV industry. And my surprise is due to the fact that these companies are delivering TV over the smallest data pipe of any of the comparable technologies. Over the last year the companies using DSL and fiber to deliver cable TV have grown in customers while the traditional cable companies have lost customers.

Cable TV is delivered over DSL using a bonded pair of telephone wires running either ADSL2 or VDSL. In theory these technologies can deliver speeds up to about 40 Mbps. But depending upon the gauge, the age and the condition of the copper, many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps. The bandwidth left over after the TV signal is used to deliver voice and data.

The DSL providers make cable TV work by using a technology called IPTV, which sends to the home only the channels the customer is asking to see. You can always tell you are on an IPTV system by the small pause that occurs every time you change channels.

The DSL cable industry is composed of AT&T U-verse, CenturyLink Prism and a whole slew of smaller telephone companies. Not every telco has taken the bonded DSL path. For example, a number of mid-sized telcos like Frontier, Fairpoint and TDS have elected to partner with a satellite provider in order to have a TV product in the bundle. But last year TDS ventured into the DSL TV market in Madison, Wisconsin.

AT&T is by far the most successful DSL TV provider as one would expect from their large customer base. AT&T has made the product available to over 24 million homes. At the end of the first quarter of 2013 they reported having 5 million cable customers on U-verse and 9.1 million data customers.

The biggest problem with using DSL is the distance limitation. The speeds on DSL drop significantly with distance, so customers have to be on a relatively short copper path for it to work. The DSL that AT&T is using can support the U-verse product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs. And the key word in that description is good copper, because older copper and copper with problems will degrade the speed of the product significantly.

I really don’t know who is in second place. CenturyLink announced that they had 120,000 TV customers on their Prism product at the end of the first quarter of 2013. There may be other telcos out there with more DSL cable customers. But CenturyLink is fairly new to the product line, having launched it just a few years ago. They still offer it in only a few markets but are adding new markets all of the time. So if they are not in second place, they soon will be.

In researching this article I came across some web sites that carry customer complaints about Prism. Look at the Yelp pages for CenturyLink in Las Vegas. I’ve always suspected that unhappy customers are more likely to post an online review than happy ones, but some of the stories there are extraordinarily bad. Obviously CenturyLink is having some growing pains and has a serious disconnect between their marketing and sales departments and their customer service. But some of the policies described there, such as charging people a large disconnect fee even though there is no contract, are surprising in a competitive environment. And yet, even with these kinds of issues, the company has added over 100,000 customers in just a few years.

I have to wonder how this industry segment is going to handle where the cable business is going. How much can they squeeze out of a 20 Mbps data pipe when customers want to watch several TVs at the same time, record shows while watching another show, and also stream video to tablets and laptops, all simultaneously? Yesterday I noted the new trend in large TVs of splitting the screen into four parts, each showing something different. Most reviews of the performance of TV over DSL are pretty good, but how will DSL handle the guy who wants to watch four HD football games at the same time while surfing the Internet?
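
Here is a minimal sketch of that bandwidth budget. The per-stream rate is my own assumption, in the range commonly cited for compressed HD video, not a figure from any provider:

    # Toy bandwidth budget for a 20 Mbps bonded-DSL pipe.
    # The per-stream rate is an assumed, illustrative value.
    pipe_mbps = 20
    hd_stream_mbps = 6.0  # assumed rate per HD IPTV stream

    for hd_streams in range(5):
        remaining = pipe_mbps - hd_streams * hd_stream_mbps
        print(f"{hd_streams} HD streams -> {remaining:+.0f} Mbps left for everything else")

Under these assumptions the fourth HD stream pushes the pipe into the red before anyone even opens a browser.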

Categories: Improving Your Business, Technology

Do You Understand Your Chokepoints?

Almost every network has chokepoints. A chokepoint is some place in the network that restricts data flow and that degrades the performance of the network beyond the chokepoint. In today’s environment where everybody is trying to coax more speed out of their network these chokepoints are becoming more obvious. Let me look at the chokepoints throughout the network, starting at the customer premise.

Many don’t think of the premise as a chokepoint, but if you are trying to deliver a large amount of data, then the wiring and other infrastructure at the location will be a chokepoint. We are always hearing today about gigabit networks, but there are actually very few wiring schemes available that will deliver a gigabit of data for more than a very short distance. Even category 5 and 6 cabling is only good for short runs at that speed. There is no WiFi on the market today that can operate at a gigabit. And technologies like HPNA and MoCA are not fast enough to carry a gigabit.

But the premise wiring and customer electronics can create a chokepoint even at slower speeds. It is a very difficult challenge to bring speeds of 100 Mbps to large premises like schools and hospitals. One can deliver fast data to the premise, but once the data is put onto wires of any kind the performance decays with distance, and generally a lot faster than you would think. I look at the recently announced federal goal of bringing a gigabit to every school in the country and I wonder how they plan to move that gigabit around the school. The answer mostly is that with today’s wiring and electronics, they won’t. They will be able to deliver a decent percentage of the gigabit to classrooms, but the chokepoint of wiring is going to eat up a lot of the bandwidth.

The next chokepoint in a network, for most technologies, is the neighborhood node. Cable TV HFC networks, fiber PON networks, cellular data networks and DSL networks all rely on creating neighborhood nodes of some kind, a node being the place where the network hands off the data signal to the last mile. These nodes are often chokepoints in the network due to what is called oversubscription. In the ideal network there would be enough bandwidth so that every customer could use all of the bandwidth they have been sold simultaneously. But very few network operators want to build that network because of the cost, and so carriers oversell bandwidth to customers.

Oversubscription is the process of bringing the same bandwidth to multiple customers since we know statistically that only a few customers in a given node will be making heavy use of that data at the same time. Effectively a network owner can sell the same bandwidth to multiple customers knowing that the vast majority of the time it will be available to whoever wants to use it.
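
Here is a minimal sketch of the arithmetic, with purely hypothetical numbers chosen only for illustration:

    # Hypothetical node: all numbers are illustrative, not from a real network.
    node_capacity_mbps = 1000  # shared bandwidth feeding the node
    subscribers = 200          # homes served by the node
    speed_sold_mbps = 25       # speed sold to each home

    ratio = subscribers * speed_sold_mbps / node_capacity_mbps
    print(f"Oversubscription ratio: {ratio:.0f}:1")  # 5:1

    # At 10% concurrency the node holds up; at 25% demand exceeds capacity
    # and everyone on the node slows down.
    for concurrency in (0.10, 0.25):
        demand = concurrency * subscribers * speed_sold_mbps
        print(f"Demand at {concurrency:.0%} concurrency: {demand:.0f} Mbps")

The design question is always what concurrency to expect at peak; a node sized for 10% usage falls over the first evening that a quarter of the homes stream video at once.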

We are all familiar with the chokepoints that occur in oversubscribed networks. Cable modem networks have been infamous for years for bogging down each evening when everybody uses the network at the same time. And we are also aware of how cell phone and other networks get clogged and unavailable in times of emergency. These are all chokepoints caused by oversubscription at the node. Oversubscription is not a bad thing when done well, but many networks end up, through success, with more customers per node than they had originally designed for.

The next chokepoint in many networks is the backbone fiber electronics that deliver bandwidth from the hub to the nodes. Data usage has grown at a very rapid pace over the last decade, and it is not unusual to find backbone data feeds where today’s usage exceeds the original design parameters. Upgrading the electronics is often costly because in some networks you have to replace the electronics at all nodes in order to fix the ones that are full.

Another chokepoint in the network can be hub electronics. It’s possible to have routers and data switches that are unable to smoothly handle all of the data flow and routing needs at the peak times.

Finally, there can be a chokepoint in the data pipe that leaves a network and connects to the Internet. It is not unusual to find Internet pipes that hit capacity at peak usage times of the day which then slows down data usage for everybody on the network.

I have seen networks that have almost all of these chokepoints and I’ve seen other networks that have almost no chokepoints. Keeping a network ahead of the constantly growing demand for data usage is not cheap. But network operators have to realize that customers recognize when they are getting shortchanged and they don’t like it. The customer who wants to download a movie at 8:00 PM doesn’t care why your network is going slow because they believe they have paid you for the right to get that movie when they want it.

Categories: Technology

Is There any Life Left in Copper?

Copper is still a very relevant technology today, and when looked at on a global scale nearly 2/3 of all broadband subscribers are still served by copper. That percentage is smaller in the US, but this country has a far more widely deployed cable TV system than most of the rest of the world.

The most widely deployed DSL technologies today are ADSL2 and VDSL. In theory these technologies can get speeds up to about 40 Mbps. But depending upon the gauge, the age and the condition of the copper many actual deployments are closer to 20 Mbps than the theoretical 40 Mbps.

ADSL2 and VDSL technology has been widely deployed by AT&T in its U-verse product, which serves over 7 million data customers and over 4.5 million cable customers. AT&T has made the product available to over 24 million homes. AT&T can support the product up to about 3,500 feet on a good single copper pair and up to 5,500 feet using two bonded copper pairs.

And ADSL2 is a pretty decent product. It can deliver IPTV and still support an okay data pipe. However, as the cable companies are finding ways to get more bandwidth out of their coaxial cable and as new companies are deploying fiber, these DSL technologies are going to again fall behind the competition.

So what is out there that might resurrect copper and deliver speeds faster than ADSL2? Not too long ago I wrote a blog post about G.Fast, which is Alcatel-Lucent’s attempt to find a way to get more speed out of legacy copper networks. In recent field tests ALU achieved a maximum speed of 1.1 Gbps over 70 meters and 800 Mbps over 100 meters on brand new copper. On older copper the speed dropped to 500 Mbps over 100 meters.

However, the G.Fast distance limitations are far shorter than ADSL2’s, and G.Fast is really more of a drop technology than a last mile technology; it would require a telco like AT&T to build a lot more fiber to get even closer to houses. You have to wonder if it makes any sense to rebuild the copper network to get up to 500 Mbps out of copper when fiber could deliver many gigabits.

There are other technologies that have been announced for copper. Late last year Genesis Technical Systems announced a scheme to get 400 Mbps out of copper using a technology they are calling DSL Rings. This technology would somehow tie 2 to 15 homes into a ring and bridge them with copper. Details of how the technology works are still a little sketchy.

In 2011 the Chinese vendor Huawei announced a new technology that will push up to 1 gigabit for 100 meters. This sounds very similar to G.Fast and sounds like a way to use existing copper within a home rather than rewiring.

There is one new technology that is finally getting wider use: bonded VDSL pairs that use vectoring. Vectoring is a noise cancellation technology that works much the way noise-cancelling headphones eliminate sound interference; it cancels most of the noise between bonded pairs of copper. Alcatel-Lucent hit the market in late 2011 with bonded pair VDSL2 that can deliver up to 100 Mbps. However, in real deployments speeds are reported to be 50 Mbps to 60 Mbps on older copper. That is probably enough speed to give DSL another decade, although doing so requires a full replacement of older DSL electronics with VDSL2. One has to wonder how many times the telcos will keep upgrading their copper electronics to get a little more speed rather than taking the leap to fiber like Verizon did.

One only has to look at the growth rate of data usage in homes to ask how long copper can remain relevant. Within a few short decades we have moved from homes getting by on dial-up to homes finding a 20 Mbps connection too slow. Looking just a few years forward we see the continued growth of video sharing and a lot of new traffic from cellular femtocells and the Internet of Things. It’s hard to think that it won’t be long until people are bemoaning the inadequacy of their 50 Mbps connections. That day is coming and probably is not more than a decade away.
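
A simple compounding exercise shows why. The 30% annual growth rate below is my assumption for illustration; published estimates of household bandwidth demand growth vary, but most fall in that neighborhood:

    # Compound growth of household bandwidth demand (illustrative).
    demand_mbps = 20.0    # a household that needs 20 Mbps today
    annual_growth = 1.30  # assumed 30% growth per year

    for year in range(10):
        demand_mbps *= annual_growth
    print(f"Demand after 10 years: about {demand_mbps:.0f} Mbps")  # ~275 Mbps

At that rate, a 50 Mbps connection will feel as cramped in a decade as a few Mbps feels today.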

Categories: Current News, Regulation - What is it Good For?

Is Wireless Really Better than Wires?

It is clear that the FCC prefers wireless as the broadband solution for rural areas. It seems like they badly want every rural household in the country to get some kind of broadband just so they can take this issue off their plate. Just about every bit of policy decided in the last few years has a bias towards wireless.

For instance, the historic Universal Service Fund, which was used to promote rural telephony over copper, has been transitioned into the new Connect America Fund (CAF), which will instead promote high-speed data in rural areas. Several aspects of the CAF clearly ensure that the funds will go mostly to wireless carriers. The bulk of the funding will eventually be distributed by a reverse auction, in which the broadband providers in a given area compete for the funding, and the one who bids the lowest subsidy per customer receives the funds.

The first time I read the reverse auction rules my first thought was that this money is all going to wireless companies. The reverse auction rules strongly favor companies who can provide data over large areas. Any smaller company who wants to get CAF funds to help pay for a rural wired network can be undercut by the largest wireless companies. AT&T Wireless and Verizon Wireless are the two richest and most successful companies in the country. They pay many billions of dollars of dividends annually and they can afford to underbid any rural landline company for subsidy, simply because they do not need it. But of course, they will bid in the reverse auctions and take the subsidies because the rules allow them to.

There are also parts of the CAF that can be used to build new broadband infrastructure and these funds also favor wireless companies. The funds get distributed by complicated rules that have a bias to get broadband to customers at the lowest cost per subscriber. And of course, there is no cheaper way to cover a large rural footprint than with wireless. Wireless companies are also going to get a lot of this infrastructure funding.

Meanwhile, AT&T recently told the FCC that they were going to introduce a plan to drop the copper for ‘millions’ of rural subscribers. And if they are successful then their rural subscribers can expect to be told to get cell phones rather than landlines. And for voice telephony this might not be such a bad thing. But do we really want to relegate a large bunch of the US geography to only having cellular data?

Today there is clearly a broadband gap with some rural areas still stuck with dial-up Internet access. And so getting them some kind of faster data seems like a reasonable plan. The FCC has set the definition of broadband to be the capability of receiving 4 Mbps download. And it’s obvious that they set that limit with rural areas in mind.

And so over the next decade more and more of rural America will be getting cellular data that will meet, or come close to meeting the FCC’s definition of broadband. But meanwhile, the cities have already far surpassed those speeds. There are very few cities left where the average home can’t get speeds of between 10 Mbps and 20 Mbps. There are usually cheaper alternatives in the range of 5 Mbps to 7 Mbps, but the faster speeds are widely available. And many places have much faster speeds available.

The FCC itself has promoted the availability of gigabit bandwidth and companies are responding. Google is bringing this speed to Kansas City, Austin and Provo and AT&T has promised to match them in Austin. CenturyLink is bringing a gigabit to Omaha. And a number of smaller municipal and commercial providers have brought gigabit speeds to other towns and cities scattered across the country. And one can expect the gigabit movement to grow rapidly.

It’s common knowledge that household use of bandwidth has continued to grow and there is no end in sight for that growth. As networks provide more data, households find ways to use it. Video has been the recent reason for the explosion in data usage, and now we can see that the Internet of Things will probably be the next big bandwidth driver.

Have we really solved the rural bandwidth gap if people in those areas are going to have 4 Mbps data speeds while urban areas have a gigabit? Obviously the rural areas will continue to be left behind, and they will fall further behind than they are today. Just a few years ago the rural areas had dial-up and the cities had maybe 5 Mbps. But a gap between a rural world at single-digit megabit speeds and cities at gigabit speeds is a much larger gap, and the rural areas will not be able to share in the benefits that bandwidth will bring.
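
The arithmetic makes the point starkly. This quick sketch uses 56 kbps for dial-up, the standard modem rate, alongside the speeds discussed above:

    # Comparing the old and the emerging urban/rural speed gaps (in kbps).
    old_gap = 5_000 / 56         # ~5 Mbps city vs 56 kbps rural dial-up
    new_gap = 1_000_000 / 4_000  # 1 Gbps city vs 4 Mbps rural
    print(f"Old gap: about {old_gap:.0f}x; new gap: about {new_gap:.0f}x")

The urban advantage grows from roughly 90-fold to 250-fold.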

The only long-term solution is to build fiber to rural America. Obviously nobody is going to build fiber to single homes at the top of mountains or at the end of ten-mile dirt roads, but I have been working on business plans that show that fiber can make sense in the average rural county. But it is really hard to get rural fiber funding since such projects tend to just pay for themselves and are not wildly profitable.

It’s possible that the FCC’s universal service plans will work and that a lot of the 19 million rural people without broadband will get some sort of rudimentary broadband. But meanwhile, the rest of the country will be getting faster and faster bandwidth. And so, before the FCC declares ‘mission accomplished’ I think we need to have more of a debate about the definition of broadband and what is acceptable. I hate to tell the FCC, but the rural broadband issue is not going to go away even after rural areas all have cellular data.
