Categories
The Industry

Google Fiber to Push Speed Limits Again

Dinni Jain, the CEO of Google Fiber, posted a blog last week saying the company plans to dramatically increase the top speeds available on its fiber network. He says a specific announcement expanding Google Fiber's gigabit offerings will come in the coming months.

The blog gives a hint at what might be coming. Included in the blog is a speed test from the home of a Google Fiber employee in Kansas City who is receiving 20.2 Gbps. I think this might be a signal to other ISPs that Google Fiber is prepared to surpass the capability of the XGS-PON technology the industry is adopting. That technology delivers up to 10 gigabits symmetrically to a cluster of homes, depending on the electronics vendor. It's obvious that Google Fiber is using something faster for the test than the currently available XGS-PON. There has long been speculation that Google has developed its own customer electronics, but the company has always been mum on the issue.

It's not easy for most current fiber providers to upgrade to 20-gigabit speeds, even if there is a PON solution faster than 10 Gbps. An upgrade hits every portion of a network. It means a faster connection to neighborhoods. It means faster core routers and switches. It means a more robust pipe to the Internet – here Google Fiber has an edge, since it has built or leased dark fiber to many markets to support YouTube peering. It means all new ONTs and customer modems capable of receiving 20-gigabit speeds, and it means much faster WiFi within homes. Ultimately, it means computers and devices capable of handling faster speeds.

Google Fiber was the first to make a national splash in 2010 with gigabit fiber for $70 per month – a price it has never increased. At that time, there were a handful of municipalities, cooperatives, and small telcos that offered gigabit speeds – but all of them I know about charged significantly more than $70. It sounds like Google Fiber is getting ready to recalibrate the top of the speed market again. An affordable 20-gigabit product would certainly do that.

The most interesting thing said in the blog is that speed isn't everything. The blog hints at having products that benefit from much faster speeds – something the industry has been searching for since the introduction of gigabit speeds. There are still very few uses that can fully utilize a gigabit connection in a home, let alone a much faster connection. There are some. I have a friend with several competitive gamers in the house who tax his gigabit FiOS connection. There are doctors with direct connections to hospitals who can use a gigabit to view complex imaging files. There are specialty engineers, data scientists, animators, and others who could use a gigabit or more when working from home. But most homes don't use services that need that much bandwidth.

The product on the near horizon that could use multiple gigabits of bandwidth is 3D holograms as part of immersive virtual reality and telepresence. I keep waiting for somebody to offer such a product. It wouldn't be hard to imagine hundreds of thousands of homes trying this almost immediately. My guess is that the roadblock to much faster services is the underlying middle-mile backbones. I don't think most local ISPs have nearly enough backbone bandwidth to support multiple customers using a dedicated gigabit of bandwidth.
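
To see why the middle mile is the likely choke point, here is a back-of-the-envelope sketch in Python. Every number in it – the subscriber count, the oversubscription ratio, the backbone size – is a hypothetical chosen for illustration, not data about any real ISP:

```python
# Back-of-the-envelope middle-mile math (all inputs are hypothetical).
# ISPs buy far less backbone than the sum of the speeds they sell,
# counting on statistical sharing among subscribers.

subscribers = 5_000          # hypothetical small-market ISP
sold_speed_mbps = 1_000      # every home on a gigabit tier
oversubscription = 50        # rules of thumb often range from 20:1 to 100:1

backbone_mbps = subscribers * sold_speed_mbps / oversubscription
print(f"Provisioned middle-mile: {backbone_mbps / 1000:.0f} Gbps")

# Now suppose telepresence needs a dedicated gigabit per session.
sessions = backbone_mbps / sold_speed_mbps
print(f"Dedicated-gigabit sessions supported: {sessions:.0f}")
```

Under these assumptions the ISP has a 100 Gbps middle mile, which sounds enormous but supports only about 100 simultaneous dedicated-gigabit sessions – 2% of the customer base. A popular telepresence product would blow through that almost immediately.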

The other impediment to superfast broadband products is upload bandwidth. Telepresence is a 2-way service, and even if it can work on a gigabit download connection, there is no chance of such a service working on cable company networks where upload speeds are a minuscule fraction of download speeds. According to OpenVault, over 14% of homes now buy a gigabit download connection, but I have to imagine a large percentage of these connections are on cable companies.

It's easy to write off fast broadband speeds as vanity purchases, and to some degree that's true. But the industry is now facing a classic chicken-and-egg dilemma. Nobody will develop broadband products that need much faster speeds until there is some critical mass of homes ready to use them – and homes have little reason to buy those speeds until such products exist.

The blog says that Google will be discussing some of these issues in the coming weeks, including the upgrade of networks and maximizing speeds inside homes.

Categories
The Industry Uncategorized

Counting Gigabit Households

I ran across a website called the Gigabit Monitor that is tracking the population worldwide that has access to gigabit broadband. The website is sponsored by VIAVI Solutions, a manufacturer of network test equipment.

The website claims that in the US over 68.5 million people have access to gigabit broadband, or 21% of the population. That number gets sketchy when you look at the details. The claimed 68.5 million people includes 40.3 million served by fiber, 27.2 million served by cable company HFC networks, 822,000 served by cellular and 233,000 served by WiFi.

Each of those numbers is highly suspect. For example, the fiber numbers don't include Verizon FiOS or the FiOS properties sold to Frontier. Technically that's correct since most FiOS customers can buy maximum broadband speeds in the range of 800-900 Mbps. But there can't be 40 million other people outside of FiOS who can buy gigabit broadband from fiber providers. I'm also puzzled by the cellular and WiFi categories and can't imagine there is anybody who can buy a gigabit product of either type.

VIAVI makes similar odd claims for the rest of the world. For example, they say that China has 61.5 million people that can get gigabit service. But that number includes 12.3 million on cellular and 6.2 million on WiFi.

Finally, the website lists the carriers that it believes offer gigabit speeds. I have numerous clients that own FTTH networks offering gigabit speeds that are not listed – I stopped counting after finding 15 of my clients missing from the list.

It's clear this website is flawed and doesn't accurately count gigabit-capable households. However, it raises the question of how to count the number of people who have access to gigabit service. Unfortunately, the only way to do that today is by accepting claims made by ISPs. We've already seen with the FCC broadband maps how unreliable ISPs are when reporting broadband capabilities.

As I think about each broadband technology there are challenges in defining gigabit-capable customers. The Verizon situation is a great example. It’s not a gigabit product if an ISP caps broadband speeds at something lower than a gigabit – even if the technology can support a gigabit.

There are challenges in counting gigabit-capable customers on cable company networks as well. The cable companies are smart to market all of their products as ‘up to’ speeds because of the shared nature of their networks. The customers in a given neighborhood node share bandwidth and the speeds can drop when the network gets busy. Can you count a household as gigabit-capable if they can only get gigabit speeds at 4:00 AM but get something slower during the evening hours?
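
The arithmetic behind that question is simple. In the sketch below, the node capacity and node size are assumed numbers picked only to illustrate how a shared node behaves; real nodes vary widely:

```python
# Hypothetical cable node: why an 'up to' gigabit sags in the evening.
node_capacity_mbps = 5_000   # assumed downstream capacity for one node
homes_on_node = 200          # assumed number of homes sharing the node

for active_share in (0.02, 0.20, 0.50):  # 4:00 AM vs. evening vs. peak
    active = max(1, round(homes_on_node * active_share))
    per_home = node_capacity_mbps / active
    print(f"{active:3d} active homes -> ~{per_home:,.0f} Mbps each")
```

With only a few homes active in the middle of the night each one can see gigabit speeds, but once a fifth or half of the node is busy, the same 'gigabit' customers are sharing their way down to 125 Mbps or 50 Mbps.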

It's going to get even harder to count gigabit capability when there are reliable cellular networks using millimeter-wave spectrum. That spectrum is only going to be able to achieve gigabit speeds outdoors when in direct line-of-sight to a nearby cell site. Can you count a technology as gigabit-capable when the service only works outdoors and drops when walking into a building or walking a few hundred feet away from a cell site?

It's also hard to know how to count apartment buildings. There are a few technologies being used today in the US that bring gigabit speeds to the front of an apartment building. However, by the time the signal has suffered packet losses from inside wiring and been diluted by sharing among multiple apartments, nobody gets a true gigabit product. But ISPs routinely count these as gigabit customers.

There is also the issue of how not to double-count households that can get gigabit speeds from multiple ISPs. There are urban markets with fiber providers like Google Fiber, Sonic, US Internet, EPB Chattanooga, and others where customers can buy gigabit broadband on fiber and also from the cable company. There are even a few lucky customers in places like Austin, Texas and the Research Triangle in North Carolina where some homes have three choices of gigabit networks after the telco (AT&T) also built fiber.

I’m not sure we need to put much energy into accurately counting gigabit-capable customers. I think everybody would agree an 850 to 950 Mbps connection on Verizon FiOS is blazingly fast. Certainly, a customer getting over 800 Mbps from a cable company has tremendous broadband capability. Technically such connections are not gigabit connections, but the difference between a gigabit connection and a near-gigabit connection for a household is so negligible as to not practically matter.

Categories
Technology

A New Technology for MDU Broadband

A Canadian company recently announced a new device that promises the ability to deliver gigabit speeds inside of MDUs using existing copper or coaxial wiring. The company is Positron Access Solutions and I talked to their CTO and president, Pierre Trudeau at the recent Broadband Communities event in Washington DC. Attached is an article and a PowerPoint talking about the new technology.

The technology is built upon the framework of the G.hn standards. You might remember this as the standard supporting powerline carrier, which was used before WiFi to distribute broadband around the home over the electrical wiring. G.hn over powerline was sufficient when broadband speeds were slow but didn't scale up to support faster speeds. In thinking back, I recall that the biggest limitation was that dozens of different types of electrical wire have been used in homes over the last century, and it was hard for the technology to work as promised over so many sizes and types of in-home wiring.

Positron has been around for many years and manufactures IP PBX systems and DSL extenders. They are referring to the new technology as GAM, which I take to mean G.hn Access Network.

The company says that the technology will deliver a gigabit signal about 500 feet over telephone copper wires and over 4,000 feet on coaxial cable. Large MDUs delivering the technology using telephone copper might require spacing a few devices throughout parts of the network.

The technology operates on unused frequency bands on the copper cables. For example, on telephone copper, the technology can coexist on a telephone wire that’s already carrying telephone company voice. On coaxial cable, the Positron device can coexist with satellite TV from DirecTV or Dish Networks but can’t coexist with a signal from a traditional cable company.

Positron says they are a natural successor to G.Fast which has never gotten a lot of traction in the US. Positron says they can deliver more bandwidth with less noise than G.Fast. The Positron GAM spits out Ethernet at the customer apartment unit and can be used with any existing CPE like WiFi routers, computers, TVs, etc.

This is a new technology and the company currently has only a few test units at clients in the field. Like all new technology, a company should consider this as a beta technology where the vendor will be working out field issues. But this technology has a lot of promise if perfected. There are a lot of older MDUs where the cost of rewiring is prohibitive or where the building owners don’t want fiber strung through hallways. Getting to apartment units through existing copper wiring should be less disruptive, less expensive and faster to market.

I always caution all of my clients about using first-generation technology. It’s bound to suffer from issues that aren’t discovered until deployed in real-world situations. First-generation equipment is always a risk since many vendors have abandoned product lines that have too many field problems. The supply chain is often poorly defined, although in the case of Positron the company has been providing technical support for many years. My main concern with beta technology is that it’s never comfortable using end-user customers as guinea pigs.

However, an MDU might be the perfect environment to try new technology. Many MDUs have been unable to attract better broadband due to high rewiring costs and might be willing to work with an ISP to test new technology. If this technology operates as touted it could provide a cost-effective way to get broadband into MDUs, particularly older ones where rewiring is a cost barrier.

Categories
The Industry

Broadband and Unemployment

Economists at the University of Tennessee at Chattanooga and Oklahoma State University conducted a study that correlates broadband speeds to unemployment. They concluded that unemployment rates are 0.26% lower in counties with faster broadband. They further concluded that broadband has a bigger impact on jobs in rural areas than in metropolitan ones.

The lead economist on the project, Bento J. Lobo, lives in Chattanooga and began the investigation because of the high-speed municipal fiber network in the city. He was curious if that network had contributed in a measurable way to jobs. The study drew on FCC data from the National Broadband Map dataset and also looked at 95 other counties in Tennessee.

The study looked at broadband availability and unemployment over the period from 2011 to 2016. It measured broadband availability by defining places with more than one landline broadband provider as served, with the rest either unserved or underserved. The authors concluded that Tennessee looks a lot like the rest of the country in that urban areas have decent broadband while broadband options in rural parts of the state are limited.

I have to wonder to what extent poor FCC broadband mapping data suppressed the findings of the study. I wrote a blog earlier this week highlighting a Penn State study that showed the inadequacies of the FCC data in Pennsylvania, where broadband availability was overstated in every county in the state. For example, the Penn State study showed that there are counties the FCC considers fully covered by broadband in which the actual average download speed is half of the FCC's 25/3 Mbps definition of broadband. That kind of mapping error has to be affecting the results of this unemployment study by overstating the rural areas that have good broadband.

The fact that the authors found a correlation is impressive after understanding the nature of the FCC dataset. The authors of this report say that the topic is worthy of more granular studies looking at specific counties that get broadband for the first time. At CCG we work with such counties and we’ve gathered a lot of anecdotal evidence over the years that broadband brings jobs to rural America.

Everywhere we go we see evidence that rural people are hungry for good-paying jobs. In one rural county we studied in Minnesota we saw that every single farm in the county had an incorporated home-based business that was separate from farming. Somebody at every farm was trying to supplement farming income. The rural folks in that county hoped that they could find better-paying jobs after getting broadband. In this particular county, the farms didn’t even have rudimentary DSL, and their broadband options were limited to satellite broadband or cellular data.

Parts of this county have since gotten wireless broadband that is advertised at speeds between 25 Mbps and 50 Mbps. I agree with the researchers that more granular study ought to be done, and it would be illuminating to study the rural households in this county before and after the introduction of broadband. My guess is that broadband has a bigger impact than this study calculated.

Good broadband enables rural residents to find home-based online jobs – an exploding part of the new economy. In this particular county the unemployment rate might not change due to broadband – but household incomes are likely to increase as farm family members find better-paying jobs online to replace the ones they are tackling today without broadband. That kind of job upgrade would not be measured by looking at the unemployment rate, but would be discovered in more granular analysis.

The impacts in bigger cities like Chattanooga must be a lot harder to quantify. Again, we know anecdotally that programmers and other high-tech folks moved to Chattanooga due to the ubiquitous gigabit fiber network. It has to be very hard to pinpoint those fiber-related new jobs within a diverse big-city economy – particularly when unemployment rates fell nationwide during the whole study period.

Categories
What Customers Want

Why Offer Fast Data Speeds?

A commenter on an earlier blog asked a great question. They observed that most ISPs say that customer usage doesn't climb when customers are upgraded to speeds faster than 50 Mbps – so why does the industry push for faster speeds? The question was prompted by the observation that the big cable companies have unilaterally increased speeds in most markets to between 100 Mbps and 200 Mbps. There are a lot of different answers to that question.

First, I agree with that observation and I’ve heard the same thing. The majority of households today are happy with a speed of 50 Mbps, and when a customer that already has enough bandwidth is upgraded they don’t immediately increase their downloading habits.

I’ve lately been thinking that 50 Mbps ought to become the new FCC definition of broadband, for exactly the reasons included in the question. This seems to be the speed today where most households can use the Internet in the way they want. I would bet that many households that are happy at 50 Mbps would no longer be happy with 25 Mbps broadband. It’s important to remember that just three or four years ago the same thing could have been said about 25 Mbps, and three or four years before that the same was true of 10 Mbps. One reason to offer faster speeds is to stay ahead of that growth curve. Household bandwidth and speed demand has been doubling every three years or so since 1980. While 50 Mbps is a comfortable level of home bandwidth for many today, in just a few years it won’t be.
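
That doubling claim is easy to turn into numbers. The sketch below assumes a clean doubling every three years from a 50 Mbps baseline – a simplification of a trend that has never been perfectly smooth:

```python
# Household demand growth: demand(t) = demand(0) * 2**(t / 3),
# assuming a doubling every three years from a 50 Mbps baseline.
base_mbps = 50

for years in (0, 3, 6, 9, 12):
    demand = base_mbps * 2 ** (years / 3)
    print(f"In {years:2d} years: ~{demand:,.0f} Mbps")
```

Under that assumption a household comfortable at 50 Mbps today would want roughly 200 Mbps in six years and 800 Mbps in twelve – which is why a definition of broadband pegged to today's comfort level goes stale so quickly.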

It’s also worth noting that there are some households who need more than the 50 Mbps speeds because of the way they use the Internet. Households with multiple family members that all want to stream at the same time are the first to bump against the limitations of a data product. If ISPs never increase speeds above 50 Mbps, then every year more customers will bump against that ceiling and begin feeling frustrated with that speed. We have good evidence this is true by seeing customers leave AT&T U-verse, at 50 Mbps, for faster cable modem broadband.

Another reason that cable companies have unilaterally increased speeds is to help overcome customer WiFi issues. Customers often don't care about the speed in the room with the WiFi modem, but they care about what they can receive in the living room or a bedroom several rooms away from the modem. A faster download speed provides headroom, so that even after the WiFi signal degrades passing through internal walls, the speed delivered to a far room is still adequate. The big cable companies know that increasing speeds cuts down on customer calls complaining about speed issues. I'm pretty sure the cable companies would say that increasing speeds saves them money due to fewer customer complaints.

Another important factor is customer perception. I always tell people that if they have the opportunity, they should try a computer connected at gigabit speeds. A gigabit product 'feels' faster, particularly if the connection is on fiber with low latency. Many of us are old enough to remember the day we got our first 1 Mbps DSL or cable modem and got off dial-up. The increase in speed felt liberating, which makes sense because a 1 Mbps DSL line is twenty times faster than dial-up and also has lower latency. A gigabit connection is twenty times faster than a 50 Mbps connection, and seeing it for the first time has that same wow factor – things appear on the screen almost instantaneously as you hit enter. The human eye is really discerning, and it can see a big difference between loading the same website at 25 Mbps and at 1 Gbps. The actual time difference isn't very much, but the eye tells the brain that it is. I think the cable companies have figured this out – why not give faster speeds if it doesn't cost anything and makes customers happy?
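
The raw transfer times behind that perception are easy to work out. Assuming a 3 MB web page – a made-up but plausible page weight – and ignoring latency and server response time:

```python
# Transfer time for one web page at different speeds (latency ignored).
page_megabytes = 3                    # assumed page weight
page_megabits = page_megabytes * 8    # convert bytes to bits

for speed_mbps in (25, 50, 1000):
    ms = page_megabits / speed_mbps * 1000
    print(f"{speed_mbps:5d} Mbps -> {ms:6.0f} ms")
```

The page arrives in roughly a second at 25 Mbps and in about 24 milliseconds at a gigabit. Neither wait is long in absolute terms, but the difference between 'under a second' and 'instant' is exactly what the eye notices.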

While customers might not immediately use more broadband, I think increasing the speed invites them to do so over time. I’ve talked to a lot of people who have lived with inadequate broadband connections and they become adept at limiting their usage, just like we’ve all done for many years with cellular data usage. Rural families all know exactly what they can and can’t do on their broadband connection. For example, if they can’t stream video and do schoolwork at the same time, they change their behavior to fit what’s available to them. Even non-rural homes learn to do this to a degree. If trying to stream multiple video streams causes problems, customers quickly learn not to do it.

Households with fast and reliable broadband don't give a second thought to adding another broadband application. It's not a problem to add a new broadband device or to install a video camera at the front door. It's a bit of a chicken-and-egg question – do fast broadband speeds promote greater broadband usage, or does the desire to use more applications drive the desire for faster speeds? It's hard to know anymore, since so many homes have broadband speeds from cable companies or fiber providers that are set faster than what they need today.

Categories
The Industry

Using Gigabit Broadband

Mozilla recently awarded $280,000 in grants from its Gigabit Communities Fund to projects that are finding beneficial uses of gigabit broadband. This is the latest set of grants, and the company has awarded more than $1.2 million to over 90 projects in the last six years. For any of you not aware of Mozilla, they offer a range of open-standard software that promotes privacy. I've been using their Firefox web browser for years, and as an avid reader of web articles I use their Pocket app daily for tracking the things I've read online.

The grants this year went to projects in five cities: Lafayette, LA; Eugene, OR; Chattanooga, TN; Austin, TX; and Kansas City. Grants ranged from $10,000 to $30,000. At least four of those cities are familiar names. Lafayette and Chattanooga are two of the largest municipally-owned fiber networks. Austin and Kansas City have fiber provided by Google Fiber. Eugene is a newer name among fiber communities and is in the process of constructing an open access wholesale network, starting in the downtown area.

I'm not going to recite the list of projects – a synopsis of them is on the Mozilla blog. The awards this year have a common theme of promoting the use of broadband for education. The awards went mostly to school districts and non-profits, although for-profit companies are also eligible for the grants.

The other thing these projects have in common is that they are developing real-world applications that require robust broadband. For example, several of the projects involve using virtual reality. There is a project that brings virtual reality to several museums and another that shows how soil erosion from rising waters and sediment mismanagement has driven the Biloxi-Chitimacha-Choctaw band of Indians from the Isle de Jean Charles in Louisiana.

I clearly remember getting my first DSL connection at my house after spending a decade on dial-up. I got a self-installed DSL kit from Verizon and it was an amazing feeling when I connected it. That DSL connection provided roughly 1 Mbps, which was 20 to 30 times faster than dial-up. That speed increase freed me up to finally use the Internet to read articles, view pictures and shop without waiting forever for each web site to load. I no longer had to download software updates at bedtime and hope that the dial-up connection didn’t crap out.

Gigabit broadband brings that same experience. I remember when Google Fiber first announced they were going to build gigabit networks for households. At the time, most cable networks had maximum speeds of perhaps 30 Mbps – and Google was bringing more than a 30-times increase in speed.

Almost immediately we heard from the big ISPs who denigrated the idea saying that nobody needs gigabit bandwidth and that this was a gimmick. Remember that at that time the CEO of almost every major ISP was on the record saying that they provided more than enough broadband to households – when it was clear to users that they didn’t.

Interestingly, since the Google Fiber announcement the big cable companies have decided to upgrade their own networks to gigabit speeds and ISPs like AT&T and Verizon rarely talk about broadband without mentioning gigabit. Google Fiber reset the conversation about broadband and the rest of the industry has been forced to pay heed.

The projects being funded by Mozilla are just a few of the many ways that we are finding applications that need bigger broadband. I travel to communities all over the country and in the last year I have noticed a big shift in the way that people talk about their home broadband. In the past people would always comment that they seemed to have (or not have) enough broadband speed to stream video. But now, most conversations about broadband hit on the topic of using multiple broadband applications at the same time. That’s because this is the new norm. People want broadband connections that can connect to multiple video streams simultaneously while also supporting VoIP, online schoolwork, gaming and other bandwidth-hungry applications. I now routinely hear people talking about how their 25 Mbps connection is no longer adequate to support their household – a conversation I rarely heard as recently as a few years ago.

We are not all going to grow into needing gigabit speeds for a while. But the same was true of my first DSL connection. I had that connection for over a decade, and during that time my DSL got upgraded once, to 6 Mbps. But even that eventually felt slow, and a few years later I was the first one in my area using the new Verizon FiOS and a 100 Mbps connection on fiber. ISPs are finally facing up to the fact that households expect a lot of broadband speed. The responsive ISPs are meeting that demand, while others bury their heads in the sand and try to convince people that their slower broadband speeds are still all that people need.

Categories
Current News

Charter Upgrading Broadband

We are now starting to see the results of cable companies upgrading to DOCSIS 3.1. Charter, the second-biggest ISP in the country, recently announced that it will be able to offer gigabit speeds to virtually its whole footprint of over 40 million passings.

DOCSIS 3.1 is the newest protocol from Cable Labs and allows bonding an essentially unlimited number of spare channel slots for broadband. A gigabit data path requires roughly 24 channels on a cable network using the new DOCSIS protocol. In bigger markets this replaces DOCSIS 3.0, which was limited to maximum download speeds in the range of 250 Mbps. I know there are Charter markets with even slower speeds that either operate under older DOCSIS standards or are slow for some other reason.
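
The channel arithmetic behind that 24-channel figure is simple. A commonly cited number is roughly 40 Mbps of throughput per bonded 6 MHz downstream channel – the exact figure depends on modulation and overhead, so treat the per-channel rate below as an approximation:

```python
# Rough cable channel-bonding arithmetic (per-channel rate approximate).
mbps_per_channel = 42.9   # ~6 MHz downstream channel at 256-QAM

for channels in (4, 8, 16, 24, 32):
    total = channels * mbps_per_channel
    print(f"{channels:2d} bonded channels -> ~{total:,.0f} Mbps")
```

At about 43 Mbps per channel, 24 bonded channels yield just over a gigabit, which is where the 'roughly 24 channels' rule of thumb comes from.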

Charter has already begun the upgrades and is now offering gigabit speeds to 9 million passings in major markets like Oahu, Hawaii; Austin, Texas; San Antonio, Texas; Charlotte, North Carolina; Cincinnati, Ohio; Kansas City, Missouri; New York City; and Raleigh-Durham, North Carolina. It's worth noting that those are all markets with fiber competition, so it's natural they would be upgraded first.

The new increased speed won't actually be a gigabit – it will be 940 Mbps download and 35 Mbps upload. (It's hard to think there is anybody who will really care about that distinction.) Cable Labs recently came out with a DOCSIS upgrade that can increase upload speeds, but there's been no talk from Charter about making that upgrade. Like the other big cable companies, Charter serves businesses that want faster upload speeds with fiber.

Along with the introduction of gigabit broadband, the company also says it's going to increase the speed of its minimum broadband product. In the competitive markets listed above, Charter has already increased the speed of its base product to 200 Mbps download, up from 100 Mbps.

It's going to be interesting to find out what Charter means by the promise to cover 'virtually' its whole footprint. Charter grew by purchasing systems in a wide range of conditions. I know of smaller Charter markets where customers don't get more than 20 Mbps. There is also a well-known lawsuit against Charter in New York State claiming that a lot of households in upstate New York are getting speeds far slower than advertised due to outdated cable modems.

The upgrade to DOCSIS 3.1 can be expensive in markets that have not yet been upgraded to DOCSIS 3.0. An upgrade might mean replacing power taps and other portions of the network, and in some cases might even require replacing the coaxial cable. My guess is that the company won't rush the DOCSIS 3.1 upgrade in those markets this year. I'm sure the company will look at them on a case-by-case basis.

The company has set a target price for a gigabit at $124.95. But in competitive markets like Oahu, the company was already selling introductory packages for $104.99. There is also a bundling discount for cable subscribers.

The pricing list highlights that they still have markets with advertised speeds as low as 30 Mbps – and the company's price for the minimum-speed product is the same everywhere, regardless of whether that product is 30 Mbps or 200 Mbps. And as always with cable networks, these are 'up to' speeds and, as I mentioned, there are markets that don't meet these advertised speeds today.

Overall, this ought to result in a lot of homes and businesses getting faster broadband than today. We saw something similar back when the cable companies implemented DOCSIS 3.0 and the bigger companies unilaterally increased speeds without increasing prices. Like other Charter customers, I will be interested in what they do in my market. I have the 60 Mbps product and I'll be interested to see if my minimum speed is increased to 100 Mbps or 200 Mbps and if I'm offered a gigabit here. With the upgrade time frame they are promising, I shouldn't have to wait long to find out.

Categories
Technology The Industry

Do We Really Need Gigabit Broadband?

I recently read an article in Light Reading titled "All That's Gigabit Doesn't Glitter." The article asks whether the industry really needs to make the leap to gigabit speeds. It argues that the industry has other options that can satisfy broadband demand, but that telco executives get hooked on the gigabit advertising and want to make the gigabit claim. A few of the points made by the article are thought-provoking, and I thought today I'd dig deeper into a few of those ideas.

The big question, of course, is whether telco providers need to be offering gigabit speeds, and it's a great question. I live in a cord-cutter family, and I figure that my download needs vary between 25 Mbps and 50 Mbps at any given time (look for a blog soon that demonstrates this requirement). I can picture homes with more than our three family members needing more, since the amount of download speed needed is largely a factor of the number of simultaneous downloads. And certainly there are people who work at home in data-intensive jobs and need far more than this.
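
Since the needed speed is largely the sum of simultaneous uses, the arithmetic is easy to sketch. The per-application bitrates below are assumptions for illustration, drawn from typical streaming-service recommendations:

```python
# A household's needed speed is roughly the sum of simultaneous uses.
# Per-application bitrates are illustrative assumptions (Mbps).
evening_use = {
    "4K video stream": 25,
    "HD video stream": 5,
    "video call": 3,
    "gaming and web browsing": 5,
}

total = sum(evening_use.values())
print(f"Simultaneous evening demand: ~{total} Mbps")
```

A busy evening in a three-person household lands right in that 25-50 Mbps range – and each additional simultaneous user or stream pushes the total higher.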

There is no doubt that a gigabit is a lot more broadband than I need. If we look at my maximum usage need of 50 Mbps then a gigabit is 20 times more bandwidth capacity than I am likely to need. But I want to harken back to our broadband history to talk about the last time we saw a 20-fold increase in available bandwidth.

A lot of my readers are old enough to remember the agony of working on dial-up Internet. It could take as much as a minute at 56 kbps just to view a picture on the Internet. And we all remember the misery that came when you would start a software update at bedtime and pray that the signal didn’t get interrupted during the multi-hour download process.

But then along came 1 Mbps DSL. This felt like nirvana – it was 20 times faster than dial-up. We were all so excited to get a T1 to our homes. And as millions quickly upgraded to the new technology, the services on the web upped their game. Applications became more bandwidth-intensive, program downloads grew larger, and websites were suddenly filled with pictures that you didn't have to wait to see.

And it took a number of years for that 1 Mbps connection to be used to capacity. After all, this was a 20-fold increase in bandwidth and it took a long time until households began to download enough simultaneous things to use all of that bandwidth. But over time the demand for web broadband kept growing. As cable networks upgraded to DOCSIS 3.0 the web started to get full of video and eventually the 1 Mbps DSL connection felt as bad as dial-up a decade before.

And this is perhaps the major point the article misses – you can't just look at today's usage to pick the best technology. Since 1980 we've experienced a doubling of the download speed needed by the average household every three years. There is no reason to think that growth is stopping, so any technology that is adequate for a home today is going to feel sluggish in a decade and obsolete in two decades. We've now reached that point with older DSL and cable modems that have speeds under 10 Mbps.

The other point made by the article is that there are technology steps between today’s technology and gigabit speeds. There are improved DSL technologies and G.Fast that could get another decade out of embedded copper and could be competitive today.

But it's obvious that the bigger telcos don't want to invest in copper. I get the impression that if AT&T found an easy path to walk away from all copper, they'd do so in a heartbeat. None of the big companies have done a good job of maintaining copper, and most of it is in miserable shape. So these companies are not going to be investing in G.Fast, although as a fiber-to-the-curb technology it would be a great first step toward modernizing their networks to all-fiber. CenturyLink, AT&T and others are considering G.Fast as a technology to boost speeds in large apartment buildings, but none of them are giving serious consideration to upgrading residential copper plant.

It's also worth noting that not all companies with fiber bit on the gigabit hype. Verizon always had fast products on FiOS and for many years offered the fastest speed in the industry at 250 Mbps. They only recently decided to finally offer a gigabit product.

And this circles back to the question of whether homes need gigabit speeds. The answer is clearly no, and almost everybody offering a gigabit product will tell you that it’s still largely a marketing gimmick. Almost any home that buys a gigabit would have almost the same experience on a fiber-based 100 Mbps product with low fiber latency.

But there are no reasonable technologies in between telephone copper and fiber. No new overbuilder or telco is going to build a coaxial cable network and so there is no other choice than building fiber. While we might not need gigabit speeds today for most homes, give us a decade or two and most homes will grow into that speed, just as we grew from dial-up to DSL. The gigabit speed marketing is really not much different than the marketing of DSL when it first came out. My conclusion after thinking about this is that we don’t need gigabit speeds, but we do need gigabit capable networks – and that is not hype.

Categories
What Customers Want

What’s the Right Price for a Gigabit?

I often get asked how to price gigabit service by clients that are rolling it out for the first time. For an ISP already in the broadband business, layering in a super-fast Internet product on top of an existing product line can be a real challenge.

Google certainly reset the bar for the whole industry when it priced a gigabit at $70. And that is the real price, since Google doesn't charge extra for the modem. I think the Google announcement recalibrated the public's expectations, and anybody else that offers a gigabit product is going to be compared to that price.

There are a few other large companies marketing a gigabit product in multiple markets. CenturyLink has a gigabit connection for $79.95 per month. But it's hard to know if that is really the price, since it is bundled with CenturyLink's Prism TV. The cheapest Prism TV product offered on the web costs $39.99 per month, includes 150 channels of programming, and comes with an additional settop box fee of $9.99 per month – the highest box fee I've seen. I don't know exactly what kind of bundle discount is available, but on the web I've seen customers claiming that the cheapest price for the gigabit bundle is around $125 per month. That's a far cry from Google's straight $70. And for customers who want to use a gigabit to cut the cord, a forced bundle feels a bit like blackmail.

Verizon FiOS has not yet given in to the pressure to offer a gigabit product. Looking at their website, their fastest product is still a symmetrical 500 Mbps connection at $270 per month, plus an added fee for a modem, with a required 2-year commitment. A 1-year commitment is $280 per month.

Comcast will soon offer a gigabit in more markets than anybody else. In Atlanta, where Comcast is competing against Google Fiber, a gigabit is $70 per month with a 3-year contract that includes an early termination fee (meaning that if you leave, you pay for the remaining months). This package also requires an additional modem charge. Without a contract the price for the gigabit is $140. It's unclear if Comcast is offering the same lower-priced deal in other markets with newly upgraded DOCSIS 3.1, like Chicago. The word on the Internet is that customers are unable to sign up for the lower-priced option in these markets, but the company says it's available. I'm sure the availability will soon become clear.

One thing that happens to any company that offers a gigabit is that the prices for slower speeds are slashed. If a gigabit is $70 – $80 then slower products must become correspondingly less expensive. Google offers a 100 Mbps product for $50 and each of the other companies listed above has a range of slower bandwidth products.

The first question I always ask an ISP is whether they are offering gigabit speeds for the public relations value or because they really want to sell a lot of it. There are plenty of ISPs that have gone for the first option and priced a gigabit north of $100 per month. But for somebody who hopes to sell the product, the dilemma is that the majority of their customers will buy the least expensive product that provides a comfortable speed. The rule of thumb in the industry is that, in most markets, at least 80% of customers will buy the low- or moderate-priced options. But if the choice is between a gigabit product and a 100 Mbps product, the percentage buying the slower product is likely to be a lot higher.

The issue that small ISPs face when recalibrating their speeds is that they end up increasing speeds for most existing customers. If they migrate from a scale today where 50 Mbps or 100 Mbps is the fastest product up to a new scale topped by a gigabit, then they have to increase speeds across the board to accommodate the new gigabit product.

This is a hard mental block for many small ISPs to get over. If a company offers a range of products today from 6 Mbps to 75 Mbps, it's mentally a challenge to reset the slowest speed to 50 Mbps or faster. They often tell me that doing so feels like giving something away for free. If a company has been an ISP since the dial-up days, it often has a number of customers that have been grandfathered with slow but inexpensive broadband. It's a real dilemma when rebalancing speeds and rates to know what to do with households that are happy with a very cheap 1 Mbps or 2 Mbps connection.

For the last ten years I have advised clients to raise speeds. ISPs that have raised speeds tell me that they generally only see a tiny bump in extra traffic volume after doing so. And I’ve always seen that customers appreciate getting faster speeds for the same price. Since it doesn’t cost much to raise speeds it’s one of the cheapest forms of marketing you can do, and it’s something positive that customers will remember.

I think most ISPs realize that the kick-up to gigabit speeds is going to be a change that lasts for a long time. There are not many customers in a residential market that need or can use gigabit speeds. What Google did was to leap many times over the natural evolution of speeds in the market, and I think this is what makes my clients uneasy. They were on a path to have a structure more like Verizon with a dozen products between slow and fast. But the market push for gigabit speeds has reduced the number of options they are able to offer.
