Are Cable Companies Winning the Speed War?

The latest news about Google Fiber slowing their metropolitan fiber builds got me wondering if perhaps the cable companies are starting to win the speed wars. Are we getting to a time when a fiber overbuilder is going to have trouble competing with them?

After many years of being stingy with bandwidth, the cable companies have largely adopted the opposite strategy and now increase household speeds over time without raising prices. I can remember quotes from several big cable companies a few years ago claiming they were giving households all the speed they needed. And this was at a time when they were experiencing a significant amount of network congestion during the peak evening hours. But my reading of many customer reviews tells me that the cable companies have largely solved the congestion issue.

This is not to say that every cable network is up to snuff, but compared to ten years ago a lot more of them seem to be delivering the speeds that customers want. There are still plenty of small towns where the rural cable networks lag behind, but metropolitan areas seem to have improved a lot.

The FCC reported in its 2015 Measuring Broadband America Fixed Report that Comcast customers got between 109% and 119% of the speeds that they paid for. I know personally that my speed tests often show at least 5 Mbps better performance than what I am paying for with Comcast.

But the question that has been nagging me is whether a new fiber provider can really thrive in a metropolitan area. Can they get enough customers to be profitable? It’s been widely reported that Google and other fiber overbuilders need at least a 30% market share to succeed, and that’s a tall order in a city where everybody already has broadband.

People need a compelling reason to change providers, because it’s a process that nobody enjoys. It means staying at home to meet an installer, returning settop boxes and modems, and worrying about the billing transition.

I have some anecdotal evidence about the way at least one group of people buy broadband. I’ve been a member of several active Maryland sports message boards for over two decades and broadband is a periodic topic of conversation since sports fans these days care about watching sports on the Internet. The majority of the people on these boards happen to live in neighborhoods that have both Verizon FiOS and a cable company – mostly Comcast, but sometimes somebody else. These are folks who have had the choice between fiber and coaxial cable networks for a long time.

What I’ve seen over the years is that there are a few people who are big fans of either the cable company or Verizon. But the vast majority don’t seem to really care as long as the broadband works well enough to watch their sports and handle the other things their families do on the Internet. Probably half of the people on these boards have moved back and forth between the providers during the last decade. I’ve seen evidence that content matters more than speed: over the years there were occasions when one provider or the other did not carry a Maryland football or basketball game. At least among this one large group of people I don’t see any major affinity for fiber over coaxial cable networks. These folks just want something that will work.

A new fiber provider has to offer a compelling reason for people to change. Certainly having lower prices could be a compelling reason, but most metropolitan fiber providers are not much cheaper than the cable company (and sometimes they are more expensive). And while a fiber provider might offer gigabit speeds, I wonder if that is enough to get people to change if they are happy with the speeds they have had for the last few years.

I’ve always said that there is some percentage of any community that will change to a new provider because they dislike the current provider for some reason. But those are rarely enough customers to justify a business plan, and so being successful with fiber also means persuading customers that are not unhappy to change. And perhaps, as Google has found out, that is not as easy as fiber proponents have assumed. Certainly, the cable company tactic of greatly improving the performance of their data products is making it harder and harder for a new overbuilder to thrive.

Technology Hype

I find it annoying when I read short articles proclaiming that a new technology that can deliver faster data speeds is right around the corner. This has most recently happened with 5G cellular, but in the past there have been spates of such articles about cable modem speeds with DOCSIS 3.1 and faster copper speeds with G.Fast.

It’s always easy to understand where such articles come from. Some vendor or large ISP will announce a technical breakthrough in a lab, and then soon thereafter there are numerous articles written by non-technical people proclaiming that we will soon be seeing blazing speeds at our homes or on our cell phones.

But these articles are usually premature, and sadly there are real-life consequences to this kind of lazy press. Politicians and policy makers see these articles, accept them as gospel, and make decisions based upon them. It then falls to people like me to come along behind and explain why the public claims are not true.

This is happening right now with talk about blazingly fast millimeter wave radios to replace fiber loops. Even if this technology were ready for market tomorrow (which it isn’t), like any technology it will have limits. There are places where wireless loops might be a great solution but other places where it may never be financially or technically feasible. Yet a whole lot of the country now believes that our future broadband is dependent upon gigabit wireless, and this is quashing plans for building fiber networks.

One recent set of these kinds of articles proclaimed that DOCSIS 3.1 is going to bring everybody gigabit speeds over cable company networks. And there is some truth to that, but the nuances are never explained. There are a lot of changes needed in a cable network to bring gigabit speeds to all of their customers. What is really happening in the first upgrade is that cable networks will have limited gigabit capabilities. The companies will be able to deliver gigabit speeds to perhaps hundreds of people in a market. Their networks would have problems if they tried to deliver it to thousands, and their networks would crash if they tried to give fast speeds to everybody.

Consider the list of issues that must be overcome to use a cable network to bring gigabit speeds to the masses:

  • First a cable company has to free up enough empty channels to make room for the gigabit data channels. For many cable systems this will require upgrading the overall bandwidth of the cable network, which can be very expensive. In the most extreme cases it can mean replacing all of the network amplifiers and power taps and even sometimes replacing some of the coaxial cable.
  • Cable bandwidth is shared by all of the customers in a neighborhood (called a node). If a cable company only sells a few gigabit products in a given node there will be some small degradation of bandwidth performance for everybody else. But if enough customers want to buy a gigabit the cable company will be forced to ‘split’ the nodes so that there are fewer homes sharing the bandwidth. Cable companies today have nodes of 200 – 300 customers, compared to fiber network nodes that generally range between 16 and 32 customers per node. A cable company has to build more fiber and install more electronics to get nodes as small as fiber systems.
  • Every network has chokepoints, or places where only a set amount of bandwidth can be handled at the same time. There are several of these chokepoints in a cable network – at the node, on the data pipe serving the node, at several data concentration points within the headend, and with the pipe to the outside Internet. You can’t upgrade speeds without upgrading these chokepoints, and that can be expensive.
  • At some point if enough customers want fast speeds the network would need to be fundamentally reconfigured to a new technology. This might mean converting the whole headend and electronics to IPTV. It might mean moving the CMTS (the device that communicates with the cable modems at each node) into the field, similar to a fiber network. And it would mean building a lot more fiber, to the point where there would be almost as much fiber as in a fiber-to-the-premise network.
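The node-splitting arithmetic behind the list above can be sketched with a few lines of code. The capacity and oversubscription figures below are illustrative assumptions, not vendor specifications, but they show why a 200 – 300 home node that comfortably supports a 100 Mbps tier has to shrink dramatically before gigabit service can be sold to everybody:

```python
# Rough sketch of why cable nodes must shrink to support gigabit tiers.
# The capacity and oversubscription numbers are illustrative assumptions.

def max_homes_per_node(node_capacity_mbps: float,
                       tier_mbps: float,
                       oversubscription: float) -> int:
    """How many subscribers a shared node can carry at a given speed
    tier, given an assumed peak-hour oversubscription ratio."""
    return int(node_capacity_mbps * oversubscription / tier_mbps)

node_capacity = 5_000   # assume roughly 5 Gbps of usable downstream per node
oversub = 20            # assume a 20:1 oversubscription ratio at peak

for tier in (100, 1_000):
    homes = max_homes_per_node(node_capacity, tier, oversub)
    print(f"{tier} Mbps tier -> roughly {homes} homes per node")
```

Under these assumed numbers a node can support about 1,000 homes at 100 Mbps but only about 100 homes at a gigabit, which is why selling gigabit broadly forces node splits, more fiber, and more field electronics.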

There is always some truth in these technological pronouncements. But these articles are way off base when they imply that a given breakthrough is the end-all solution to broadband. Yes, cable systems can be faster now, which is great. But DOCSIS 3.1 does not make a cable network equivalent to a Google Fiber network that can already deliver a gigabit to everybody. And yes, there is great promise in wireless local loops. But even after all of the issues with deploying wireless in a real-life environment are solved, the technology is only going to work where there is fiber fairly close to customers and when a number of other factors are just right. These kinds of nuances matter and I really wish that non-techie writers would stop telling us that the solution to all of our broadband speed problems is right around the corner. Because it’s not.

Thoughts on the Google Fiber News

It was recently reported that Larry Page, the CEO of Alphabet, told Google Fiber to cut its staff in half from 1,000 to 500 and to also cut the cost of building new fiber. That is certainly going to slow or even stop Google Fiber’s expansion plans. There are a few lessons to be learned from that announcement for all fiber start-ups.

It’s expensive to build in cities. Building new fiber networks from scratch is expensive. It’s very doubtful that Google has found any magic that would let them build fiber networks for much less than anybody else. Verizon always said they stopped expanding FiOS due to the construction costs. Yet Verizon built most of their network in places where the construction was relatively inexpensive. They lashed fiber to existing phone cables where there were poles or installed underground fiber where there was existing conduit. Google Fiber didn’t have either of those advantages and so was spending a lot more than Verizon.

The company is now looking at wireless technologies to cut construction costs. Unless the company has some really good millimeter wave technology ready to roll out soon, this could cause a big delay on expansion. Lots of companies are thinking about wireless loops, but we know from past experience that there is a big transition with any wireless technology when moving from a lab into the real world.

You have to sell what customers want to buy. The news articles I saw say that Google Fiber had 200,000 customers at the end of 2014. One would expect that they have up to twice that by now. But considering their initial goal to have millions of customers those numbers are low.

My guess is that the company has had trouble convincing enough households to buy their $70 broadband. I would buy that in a flash, but I expect a lot of homes found the product to be out of reach for their budgets. I see that in Atlanta that Google Fiber has introduced a 100 Mbps broadband connection for $50, and that has to open up a lot more sales opportunities for them.

This highlights one of the biggest challenges for a fiber overbuilder. The biggest expense by far of getting into the business is to build the fiber network. Once you’ve sunk that money the goal is to get as many paying customers as possible onto the network, and that means having a wide enough array of products (which are still profitable) to generate revenue.

Delays are to be expected. There has been news over the last few years of delays in Google Fiber construction, much of it having to do with access to poles. Google Fiber learned the same lesson that has bedeviled many of my clients: when you need to get onto poles owned by the company you will be competing with, they are going to use every legal trick in the book to slow down the process. I’ve written about pole horror stories before in this blog.

Labor costs can be your enemy. The announcement said that Google Fiber had to slash their work force from 1,000 to 500. They face the same dilemma as other broadband start-ups – there is a lot of work to do in launching new markets compared to the number of people needed to operate them once they are mature.

For many years I have used a general rule of thumb for the number of employees that a telecom company ought to have. For example, a medium-sized carrier with 20,000 to 50,000 customers ought to have roughly 1 employee for every 350 customers. As companies get larger and more efficient that ratio increases, and companies with up to about 250,000 customers ought to have around 1 employee for every 400 customers. Bigger companies should get even more efficient, but there seems to be a natural cap reached at some size by very large companies.

It’s impossible to judge whether Google Fiber has too many employees without knowing how many customers they have today. I also have no idea if the company uses contractors in addition to full-time employees, because those would count in this calculation. But if the company has 400,000 customers today and doesn’t use contractors then the 1,000 employee count wouldn’t be too bad. But if they have fewer customers and also use contractors then they are likely overstaffed.
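The rule of thumb above is easy to turn into a quick sanity check. This is a sketch of my rough ratios, not an industry standard, and the 400,000-customer figure is the hypothetical from the paragraph above:

```python
# A sketch of the staffing rule of thumb described above; the ratio
# breakpoints are the author's rough figures, not industry standards.

def expected_employees(customers: int) -> float:
    """Expected headcount for a telecom company of a given size."""
    if customers <= 50_000:
        ratio = 350   # ~1 employee per 350 customers for mid-sized carriers
    else:
        ratio = 400   # larger companies get somewhat more efficient,
                      # with efficiency assumed to plateau around here
    return customers / ratio

# A hypothetical Google Fiber with 400,000 customers:
print(round(expected_employees(400_000)))   # -> 1000
```

At an assumed 400,000 customers the rule of thumb yields about 1,000 employees, which is why that headcount would not look out of line; at half the customer count, the same math suggests significant overstaffing.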

One issue that can justify a larger staff than normal is the need to have a staff that is working ahead on future markets. Those employees would not be counted when looking at the size of the needed staff.

Also, the ratios I cited are more than a decade old, and if I put some thought into them I would probably revise them higher. There are numerous improvements in customer service tools and other efficiencies that reduce the number of times a customer has to talk to or see a live employee. With the upcoming AI revolution one would imagine that a lot of customer service is going to be handled by bots, lowering labor costs further.

The bottom line of all of these issues is that Google Fiber hit the same wall hit by every other overbuilder. There comes a time when you have to show profitability or the money dries up. And it’s really hard to show profitability when you are still growing rapidly.

What’s the Right Price for a Gigabit?

I often get asked how to price gigabit service by clients that are rolling it out for the first time. For an ISP already in the broadband business, layering in a super-fast Internet product on top of an existing product line can be a real challenge.

Google certainly lowered the bar for the whole industry when they priced a gigabit at $70. And that is the real price since Google doesn’t charge extra for the modem. I think the Google announcement recalibrated the public’s expectations and anybody else that offers a gigabit product is going to be compared to that price.

There are a few other large companies marketing a gigabit product in multiple markets. CenturyLink has a gigabit connection for $79.95 per month. But it’s hard to know if that is really the price since it is bundled with CenturyLink’s Prism TV. The cheapest Prism TV product offered on the web costs $39.99 per month and includes 150 channels of programming and also comes with an additional settop box fee of $9.99 per month – the highest box fee I’ve seen. I don’t know exactly what kind of bundle discount is available, but on the web I’ve seen customers claiming that the cheapest price for the gigabit bundle is around $125 per month. That’s a far cry from Google’s straight $70. And for customers who want to use a gigabit to cut the cord, a forced bundle feels a bit like blackmail.

Verizon FiOS has not yet given in to the pressure to offer a gigabit product. In looking at their web site their fastest product is still a symmetrical 500 Mbps connection at $270 per month plus an added fee for a modem, and with a required 2-year commitment. A 1-year commitment is $280 per month.

Comcast will soon offer a gigabit in more markets than anybody else. In Atlanta, where Comcast is competing against Google Fiber, a gigabit is $70 per month with a 3-year contract, including an early termination fee (meaning that if you leave you pay for the remaining months). This package also requires an additional modem charge. Without a contract the price for the gigabit is $140. It’s unclear if Comcast is offering the same lower-price deal in other markets with newly upgraded DOCSIS 3.1 like Chicago. The word on the Internet is that customers are unable to sign up for the lower-price option in these markets, but the company says it’s available. I’m sure the availability will soon become clear.

One thing that happens to any company that offers a gigabit is that the prices for slower speeds are slashed. If a gigabit is $70 – $80 then slower products must become correspondingly less expensive. Google offers a 100 Mbps product for $50 and each of the other companies listed above has a range of slower bandwidth products.

The first question I always ask an ISP is whether they are offering gigabit speed for the public relations value or they really want to sell a lot of it. There are plenty of ISPs that have gone for the first option and have priced a gigabit north of $100 per month. But for somebody that hopes to sell the product, the dilemma is that they know the majority of their customers will buy the least expensive product that provides a comfortable speed. The rule of thumb in the industry is that, in most markets, at least 80% of customers will buy the low or moderate priced options. But if the choice is between a gigabit product and a 100 Mbps product, the percentage buying the slower product is likely to be a lot higher.

The issue that small ISPs face when recalibrating their speeds is that they end up increasing speeds for most existing customers. If they migrate from a scale today where 50 Mbps or 100 Mbps is the fastest product up to a new scale topped by a gigabit, then they have to increase speeds across the board to accommodate the new gigabit product.

This is a hard mental block for many small ISPs to get over. If a company offers a range of products today from 6 Mbps to 75 Mbps, it’s mentally a challenge to reset their slowest speed to 50 Mbps or faster. They often tell me that in doing so it feels like they are giving away something for free. If a company has been an ISP since the dial-up days they often have a number of customers that have been grandfathered with slow but inexpensive broadband. It’s a real dilemma when rebalancing speeds and rates to know what to do with households that are happy with a very cheap 1 Mbps or 2 Mbps connection.

For the last ten years I have advised clients to raise speeds. ISPs that have raised speeds tell me that they generally only see a tiny bump in extra traffic volume after doing so. And I’ve always seen that customers appreciate getting faster speeds for the same price. Since it doesn’t cost much to raise speeds it’s one of the cheapest forms of marketing you can do, and it’s something positive that customers will remember.

I think most ISPs realize that the kick-up to gigabit speeds is going to be a change that lasts for a long time. There are not many customers in a residential market that need or can use gigabit speeds. What Google did was to leap many times over the natural evolution of speeds in the market, and I think this is what makes my clients uneasy. They were on a path to have a structure more like Verizon with a dozen products between slow and fast. But the market push for gigabit speeds has reduced the number of options they are able to offer.

Getting Access to Conduit

There is an interesting case at the California Public Utilities Commission where Webpass is fighting with AT&T over access to conduit. You may have seen that Webpass was just recently bought by Google Fiber and I would think this case will be carried forward by Google.

The right for competitive providers to get access to conduit comes from the Telecommunications Act of 1996. In that Act, Congress directed that competitive telecom providers must be provided access to poles, ducts, conduits, and rights-of-way by utilities. A utility is defined as any company, except for electric cooperatives and municipalities, which owns any of those facilities that are used in whole or in part for communications by wire. Under this definition telcos, cable companies, commercial electric companies, gas companies, and others are required by law to make spare conduit available to others.

If a utility allows even one pole or piece of conduit to be used for communications, including for its own internal purposes, then the whole system must be made available to competitors at fair prices and conditions. About half of the states have passed specific rules governing those conditions while states without specific rules revert to the FCC rules.

Webpass tried to get access to AT&T conduits in California and ran into a number of roadblocks. It seems there are a few situations where AT&T has provided conduit to Webpass, but AT&T denied the majority of the requests for access.

This is not unusual. Over the years I have had several clients try to get access to AT&T and Verizon conduit and none of them were successful. AT&T, Verizon, and the other large telcos generally have concocted internal policies that make it nearly impossible to get access to conduit. When a competitor faces that kind of intransigence their only alternative is to take the conduit owner to court or arbitration – and small carriers generally don’t have the resources for this kind of protracted legal fight.

But even fighting the telcos is no guarantee of success, because the FCC rules provide AT&T with several reasons to deny access. A utility can deny access on the basis of safety, reliability, or operational concerns. So even when a conduit owner is ordered to provide access after invoking one of these reasons, they can just invoke one of the other exceptions and begin the whole fight again. It takes a determined competitor to fight through such a wall of denial.

Trying to get conduit reminds me of the battles many of my clients fought in trying to get access to dark fiber fifteen years ago. I remember that AT&T and Verizon kept changing the rules of the dark fiber request process so often that a competitor had a difficult time even formulating a valid request for dark fiber. Even when Commissions ordered the telcos to comply with dark fiber requests, the telcos usually found another reason to deny the requests.

This is a shame because getting access to conduits might be one of the best ways possible to promote real competition. AT&T and Verizon both claim to have many hundreds of thousands of miles of fiber, much of it in conduit. I am sure there are many cases where older conduit is full. But newer conduits contain multiple empty tubes, and one would have to think that there is a huge inventory of empty conduit in the telco networks. The same is true for the cable companies and the large electric companies, and I can’t recall any small carrier that has ever gotten access to any of this conduit. I think some of the large carriers like Level3 or XO have probably gotten some access to conduit, but I would imagine even they had to fight very hard to get it.

I remember talking to a colleague the day that we first read the Telecommunications Act of 1996 that ordered the telcos to make conduit available to competitors. We understood immediately that the telcos would adopt a strategy of denying such access – and they have steadfastly said no to conduit requests over the years. I am glad to see Webpass renewing this old fight and it will be interesting to see if they can succeed where others have failed.

Fiber for Everyone?

Just a few days ago I wrote about the two cities that are considering having citizens pay for their fiber networks through utility fees and pledges to support the fiber financing. After writing about Ammon, Idaho I heard back from several people in the industry pointing out that the proposed Ammon utility fee is a pledge intended to support bonds. The fees, which are supposed to be about $16.50 per month for about twenty years, would total nearly $4,000 over that period and would be used to secure, and then pay for, the bonds needed to build the system.
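The pledge total is simple arithmetic worth making explicit, since a homeowner signing the lien is committing to the full stream of payments. Using the approximate figures above:

```python
# Checking the total of the proposed Ammon pledge; the fee and term
# are the approximate figures reported above.
monthly_fee = 16.50
years = 20

total = monthly_fee * 12 * years
print(f"Total pledge over {years} years: ${total:,.2f}")  # -> $3,960.00
```

That $3,960 is the "nearly $4,000" figure, and it is also roughly the amount a homeowner might have to settle at closing if they sell before the bonds are paid off.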

That raises an issue that I have raised before: how important is it that everybody in a community get access to broadband? Every community that thinks about finding a fiber solution faces this issue. They can look for an approach that will get fiber to every household or they can settle for something less. This choice is sometimes a philosophical decision, but it often comes down to the difference in cost between the two choices.

Ammon has clearly chosen a solution that will benefit homeowners who are able and willing to pledge a lien on their homes. To be able to make the pledge a resident must own a home that can be pledged, so this eliminates renters. Interestingly it might also make it a challenge for anybody who doesn’t think they’ll be in their home for long. According to the US Census, the average time that families stay in an owned home is 13 years. And fewer than 40% of homeowners stay even 10 years. So anybody that thinks they are going to move out of their home in Ammon in the next few years probably ought not to pledge since they are likely to have to cover the remaining amount of the lien when they sell their home.

I don’t want to sound like I’m coming down negative on Ammon, because they have come up with a creative solution to get fast broadband to at least part of their city. And that is exactly what a whole lot of other cities have done. Ammon is unique because of their creative financing solution, but a whole lot of other cities have settled for broadband to less than everybody.

For instance, almost every city getting Google Fiber is going to end up with fiber built to only parts of the city. Only cities willing to step up with a lot of city dollars like Huntsville, Alabama are going to get fiber everywhere. And the vast majority of cities that got Verizon FiOS years ago now think they made a mistake, since they have fiber in some neighborhoods and not others. They are now seeing a big difference between neighborhoods with fiber and those without. This difference is likely to grow since both Verizon and AT&T have made noises about tearing down copper in older city neighborhoods. We might end up with more urban households without affordable landline broadband than we have today in rural areas.

Fifteen years ago I worked for several cities that wanted to get Verizon’s attention to get onto the FiOS list. At that time these cities were so ecstatic to get some fiber that they didn’t insist that Verizon eventually build their whole city. But it probably would not have mattered if they had – because there are cities that got that agreement from Verizon but which still don’t have fiber everywhere.

I don’t want to make Google and Verizon sound like bad actors, because almost every large fiber overbuilder is doing the same thing, building only to the most profitable parts of cities. The returns from building only to the best neighborhoods are dramatically better than from building everywhere – I’ve created dozens of business plans that quantify the difference. This is also the approach being taken by CenturyLink, Aspire, and half a dozen other fiber overbuilders – they are simply making the best financial decision for their company.

This is a tough philosophical issue for a city. Do they take the high ground and hold out for a solution that gets fiber everywhere or do they take the practical approach and get some fiber built? The risk of holding out for a whole-city solution might mean that nobody gets fiber. But the flip side of this is that building to only parts of a city probably means there will be neighborhoods that will be cut off from fiber for decades to come – talk to any city that has FiOS if you don’t believe that.

It’s almost impossible to build a reasonable business plan today to fill in fiber where Verizon didn’t build, because they built where the construction costs were the lowest. So Ammon is not at all unique, and in fact they are joining the majority of the cities in the US that have elected a solution that will result in something less than 100% fiber coverage. My primary reaction to this issue is a personal one – I know how I’d feel if I was in one of the neighborhoods that didn’t get fiber. I think that any city that elects to build less than 100% fiber ought to expect to hear an outcry from the rest of the city for many years to come.

Google Looking at Wireless Drops

In an interview with Re/code, Craig Barrett, the CEO of Access for Alphabet, said that Google is looking at wireless last mile technologies. Google is not the only one looking at this. The founder of Aereo has announced a new wireless initiative to launch this summer in Boston under the brand name Starry. And Facebook says it is also investigating the technology.

The concept is not new. I remember visiting an engineer in Leesburg, Virginia back in the 90s who had developed a wireless local loop technology. He had working prototypes that could beam a big data pipe for the time (I’m fuzzily remembering a hundred Mbps back when DSL was still delivering 1 Mbps). His technology was premature in that there wasn’t any good technology at the time for bringing fast broadband to the curb.

As usual there will be those that jump all over this news and declare that we no longer need to build fiber. But even should one of these companies develop and perfect the best imaginable wireless technology there is still going to have to be a lot of fiber built. All of these new attempts to develop wireless last mile technologies share a few common traits that are dictated by the nature of wireless spectrum.

First, to get the kind of big bandwidth that Google wants to deliver, the transmitter and the customer have to be fairly close together. Starry is talking about a quarter-mile delivery distance. One characteristic of any wireless signal is that it weakens with distance. And the higher the frequency of the spectrum used, the faster the signal deteriorates.

Second, unless there is some amazing breakthrough, a given transmitter will have a fixed and limited number of possible paths that can be established to customers. This makes it very difficult to connect to a lot of customers in a densely populated area and is one of the reasons that wireless today is more normally used in less densely populated places.

Third, the connection for this kind of point-to-multipoint network must be line of sight. In an urban environment every building creates a radio ‘shadow’ and blocks access to customers sitting behind it. This can be overcome to a small degree with technologies that bounce the signal from one customer to another, but such retransmission cuts both the strength of the signal and the associated bandwidth.
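The first point, that signal strength falls off with both distance and frequency, can be illustrated with the standard free-space path loss formula. This is a simplified model: real urban links suffer additional losses from rain, foliage, and blockage that are not captured here, and the quarter-mile distance is just Starry's stated figure from above:

```python
import math

# Free-space path loss in dB, illustrating how signal loss grows with
# both distance and frequency. This is the idealized textbook model;
# real-world links see extra losses from rain, foliage, and obstructions.

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (dB) for distance in meters, frequency in Hz."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / 299_792_458))  # c = speed of light

quarter_mile = 402.3  # meters, roughly Starry's stated delivery distance
for ghz in (2.4, 28, 60):
    loss = fspl_db(quarter_mile, ghz * 1e9)
    print(f"{ghz:>4} GHz at a quarter mile: {loss:.1f} dB of path loss")
```

Even in free space, moving from 2.4 GHz up to millimeter wave frequencies like 28 or 60 GHz adds more than 20 dB of loss over the same quarter mile, which is a big part of why these systems need short distances and clear line of sight.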

However, Google has already recognized that there are a lot of people unwilling or unable to buy a gigabit of bandwidth from them on fiber. In Atlanta the company is not just selling a gigabit connection; it is hitting the street with a 100 Mbps connection for $50. A good wireless system with access to the right kind of spectrum could deliver that kind of bandwidth to a fairly reasonable number of customers around a given transmitter. But it would be technically challenging to do the same with gigabit bandwidth unless each transmitter served fewer customers (and was even closer to them). A gigabit wireless network would start looking a lot like the one I saw years ago in Virginia, with a transmitter for just a few nearby customers: essentially fiber to the curb with gigabit wireless local loops.

If Starry can do what they are shooting for, the delivery of a few hundred Mbps of bandwidth at an affordable price will be very welcome today and would provide real competition to the cable companies that have monopolies in most urban neighborhoods. But, and here is where many might disagree with me, the time is coming in a decade or two when 200 Mbps of bandwidth will be just as obsolete as first generation DSL has become in the twenty years since it was developed.

Over the next twenty years we can expect the full development of virtual and augmented reality so that real telepresence is available – holographic images of people and places brought to the home. This kind of technology will require the kind of bandwidth that only fiber can deliver. I think we’ll start seeing this just a few years from now. I can already imagine a group of teenagers gathering at one home, each with their own headset to play virtual reality games with people somewhere else. That application will very easily require a gigabit pipe just a few years from now.

I welcome the idea of the wireless last mile if it serves to break the cable monopoly and bring some real price competition into broadband. It’s a lot less appealing if the wireless companies decide instead to charge the same high prices as the incumbents. It sounds like the connections that Starry is shooting for are going to be fast by today’s standards, but I’m betting that within a few decades the technology will fall by the wayside – like every technology that doesn’t bring a fast wire to the home.

Getting Access to Poles

Google Fiber is having problems getting onto poles in many parts of the Bay Area and the issues they are having make for a good primer on the very confusing rules for regulating different kinds of entities.

Google Fiber has only publicly announced that they are bringing service to parts of San Francisco. But they have also been talking to Palo Alto, Santa Clara, San Jose, Mountain View and Sunnyvale. Google has no significant pole issues in Palo Alto where the poles are owned by the City, nor in Santa Clara where the poles are mostly owned by the City and a few by AT&T.

The problems come in the other cities. In California a lot of poles are owned by what is called the Northern California Joint Pole Association which is owned by Comcast, Time Warner and AT&T. That group is disputing Google’s right to get on their poles.

The issue is purely a regulatory one. Google claims they are a cable TV company. The kind of company you are matters when it comes to poles. Many years ago the FCC and the industry worked out very specific rules for attachments to poles. Poles are divided into specific zones where various kinds of companies can place cables. The telephone incumbent has the lowest space. At the top is the power company, and historically the cable company fit between telco and power lines. Anybody else who gets on a pole has to fit somewhere in the middle, and in different parts of the country this is sometimes between the cable company and the power company and sometimes between the telco and the cable company.

The first problem Google faces is that by declaring themselves a cable company, they run into pole rules that assume there is only one such company. So they can’t claim the ability to get into the cable space, which in all of these cities is already taken by the incumbent cable provider.

Google has always said that they don’t want to register as a CLEC, or competitive telephone company. And until the company announced a trial for voice service a few weeks ago they didn’t offer voice anywhere. But from a regulatory perspective, if Google was a CLEC they would have the right under law to connect to poles, which was guaranteed in the Telecommunications Act of 1996. But I don’t believe there is any similar law that would provide a second cable company the same right, and that has to be the basis for the pole owners to deny access to Google.

Of course, the companies in the association have a very vested interest in delaying Google Fiber from getting into their markets, so it’s only natural they would fight this. It’s actually somewhat rare for cable companies to own any substantial number of poles, but in this consortium two of the owners are cable companies.

AT&T has argued to the California PUC that they don’t believe that Google Fiber qualifies as a cable company and is using that distinction to deny Google access to these poles. There are generally two ways for a company to become certified as a cable company. They have to register with the FCC, which is a very rubber-stamp process, or they have to get a local cable TV franchise from the city where they want to provide service.

But California added a twist to that process. In 2006 the legislature passed a bill that allows companies to get a statewide cable franchise, which is the reason that the California PUC is involved in this dispute. That original law was passed for the benefit of Verizon and AT&T, so that they could provide a competitive cable TV alternative to the incumbents. Under the statewide rules a company only needs to notify a city 10 days before they first are going to offer cable TV service and there are no more regulatory requirements at the city level. A competitive cable TV provider has no obligation to serve an entire community and can serve only where they choose.

Early indications are that the California PUC is siding with the pole owners and may not accept Google Fiber’s claim to be a cable company. But even if they are a cable company, I don’t know that this gets them access to poles. When AT&T and Verizon became statewide cable providers they already had access to poles. If Google Fiber was a CLEC they would automatically have the right to pole access, but Google apparently doesn’t want to take on the other obligations that come with being a CLEC. The dispute is going to be resolved in one of two ways – either a court will decide this if Google wants to pursue it, or Google will just walk away from those markets and pursue some of the other hundreds of markets that want their fiber.

Comcast and Real Competition

It’s really interesting to see how Comcast is reacting to Google Fiber in Atlanta. The company has had competition from fiber in the past in the form of Verizon FiOS. But the footprint for that competition hasn’t changed for years. Comcast and Verizon have competed with very similar data speeds and there was not a lot to distinguish one from the other from a product standpoint. Each company has bested the other in some markets, although Verizon seems to have gotten the upper hand in more places.

But now Comcast is facing Google Fiber for the first time and their reaction is interesting. From what I can see they are doing the following:

  • Comcast is offering a gigabit of speed for $70 per month. But it comes with a very ugly 3-year contract. For those that don’t take the 3-year contract the price will be $139.95 per month, plus Comcast will impose a 300 gigabyte monthly data cap that could add up to $35 per month for anybody that actually uses the data.
  • Comcast is using negative advertising against Google’s WiFi router, saying that Google’s WiFi speeds are 30 Mbps while its own are 725 Mbps.
  • And Comcast is widely distributing flyers that tell people in Atlanta not to fall for the Google hype.
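The arithmetic behind that first bullet is worth spelling out. A rough three-year cost comparison, using only the prices quoted above and assuming a customer hits the full $35 overage every month (the actual overage schedule will vary by usage):

```python
# Illustrative 3-year cost comparison of Comcast's two gigabit options.
# Prices come from the bullets above; the worst-case $35/month overage
# is assumed to apply every month.
months = 36

contract_total = 70.00 * months                 # $70/month on the 3-year contract
no_contract_total = (139.95 + 35.00) * months   # $139.95/month plus worst-case overage

print(f"With contract:    ${contract_total:,.2f}")
print(f"Without contract: ${no_contract_total:,.2f}")
print(f"Premium for avoiding the contract: ${no_contract_total - contract_total:,.2f}")
```

Even under less extreme usage assumptions, the no-contract price is more than double, which is why the contract terms matter as much as the headline $70.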

So how do these claims stack up and will they be effective?

I think Comcast’s speed comparison is quite silly and that the public will see through it. The general public has been trained for a decade that fiber is better. Not that upload speeds matter to most people, but Google’s speeds are symmetrical while Comcast will have a relatively slow, perhaps 35 Mbps upload. On a fiber network it’s not too hard to engineer to deliver a true gigabit download almost all of the time. But Comcast is going to have the same issues it’s always had with its HFC network. If it sells too many gigabit customers, then its nodes will slow down for everybody on the node. I don’t believe that there are many homes today that really need a gigabit, but once Google is up and running it ought to win the speed test battle in the market.

There is some truth to Comcast’s claim about WiFi, although their numbers are quite skewed. For some reason Google Fiber is still using an 802.11n WiFi router. At best their WiFi routers are going to deliver about 300 Mbps – but in Kansas City the Google routers are reported on consumer websites to deliver about 80 Mbps on average. Comcast is offering 802.11ac routers, and while they are theoretically capable of the speeds they tout, in real life use they deliver between 200 Mbps and 300 Mbps.

The fact is that both companies (and most ISPs) are doing a very poor job with WiFi. Almost all of them offer a one-WiFi-router solution which is not acceptable in today’s big bandwidth homes. I have a Comcast WiFi router and it delivers really low speeds to our offices, which are at opposite ends of the house from the central router. Until a carrier is willing to cross the threshold and install a WiFi network with multiple linked WiFi routers in a home, all of their solutions are going to be poor in real life practice.

It appears that Comcast is relying on negative advertising against Google, and I seriously doubt this is going to work. Comcast has one of the most hated customer service experiences in the country and Google has been touted – so far – for offering outstanding customer service. It seems like a bad tactic to advertise negatively about somebody that will have a better network product and a better customer experience.

I think Comcast is really missing the point. It seems like they are spending their energy advertising against Google’s gigabit product. But Google announced that it is entering Atlanta with two data products – the gigabit at $70 and a 100 Mbps product at $50. My bet is that the slower product is the one most likely to cut into Comcast’s penetration rate unless they decide to scrap the 300 gigabyte monthly data cap. While Comcast says that only a small percentage of customers use more data than that per month, my clients tell me otherwise. Once any customer has been charged extra for a data cap overage on Comcast they are most likely to change to Google and never come back.

Google Fiber and the Triple Play

There is some interesting news from Google Fiber lately about new product offerings. It was reported at the end of January that Google is testing a voice product for its fiber customers. And in early February Google announced that it was adding a 100 Mbps data product in the Atlanta roll-out.

News leaked out that Google is experimenting with Fiber Phone with members of its Trusted Tester Program. Google offered phone service to those customers and wrote the following:

With Fiber Phone, you can use the right phone for your needs, whether it’s your mobile device on the go or your landline at home. No more worrying about cell reception or your battery life when you’re home… Spam filtering, call screening and do-not-disturb make sure the right people can get in touch with you at the right time.

Google is installing the needed equipment for test customers and is at the beta stage of testing. There has been no news about possible pricing or when this might be made available to all customers.

In early February Google announced it is now offering a 100 Mbps data product for $50 to go along with the $70 gigabit offering. In Atlanta the company has eliminated the ‘free’ Internet product where customers paid a one-time fee of $300 and got a 5 Mbps product for 7 years with no additional fees.

With these changes Google is looking more and more like a typical triple-play provider. It’s not hard to understand why they would make these changes. It’s very expensive to build a fiber network and the best way to pay for it is to get as many high-margin customers as possible onto the network.

As exciting as the $70 gigabit product is, there are a huge number of households that just can’t afford that price. So by adding a $50 product that is still blazingly fast, Google will make broadband affordable to a lot more people in each market.

There is one interesting market dynamic that Google is probably going to soon see. In looking at the customer penetration rates for many of my client ISPs I’ve almost always seen that the fastest Internet product (assuming it isn’t priced too high) will get 10% to 15% of the customers in a given market. Given a choice, the rest of the customers will take something slower if it saves them money. This is not something that’s true only for fast fiber networks, but I’ve seen this same relationship hold true for cable companies with HFC networks and for DSL networks. There are only a few markets where a higher percentage of customers buy the premium data product.

If Google goes back and introduces the 100 Mbps product in their older markets they will probably see two things. First, they will add customers who find the $50 price affordable. But they are also going to see gigabit customers downgrade to 100 Mbps to save $20 per month. Overall I would guess this change will produce a significant net increase in total revenues in Google’s older markets. In Atlanta I predict they will get a lot more 100 Mbps customers than gigabit customers.
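A toy model shows why the downgrades need not hurt overall revenue. Everything here is an assumption for illustration (the market size, the 25% starting penetration, the one-in-five downgrade rate, the 10% of new homes taking the cheaper tier); only the $70 and $50 price points come from the post:

```python
# Toy model: effect of adding a $50/100 Mbps tier to an existing market.
# All counts and take rates below are illustrative assumptions.
homes = 10_000

# Before: only the $70 gigabit product, assumed 25% penetration.
gig_before = int(homes * 0.25)
revenue_before = gig_before * 70

# After: assume one in five gigabit customers downgrades to save $20/month,
# and the cheaper tier attracts another 10% of homes that found $70 too steep.
downgrades = gig_before // 5
new_100mbps = int(homes * 0.10)
revenue_after = (gig_before - downgrades) * 70 + (downgrades + new_100mbps) * 50

print(f"Before: ${revenue_before:,}/month")
print(f"After:  ${revenue_after:,}/month")
```

Under these assumptions the new customers more than offset the downgrade losses, which is the dynamic the paragraph above predicts.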

And Google ought to do okay with voice. My experience is that they will have a hard time selling voice to existing customers but that they will do okay with new customers as they add them. The FCC reported that residential voice penetration just fell under 50% nationwide, and that is still a lot of potential customers. I see clients still doing surprisingly well with residential voice and still doing extremely well with business voice.

It’s interesting to see that after a few years in the market Google is morphing into a more normal triple play provider. I’ve expected this from the start because my take is that a large majority of households still want the double play or triple play, and if you want to get a lot of customers you have to provide what customers want to buy. Anybody that expects customers to buy from more than one vendor to get what they want is going to drive away a lot of potential customers.