Categories
The Industry

The Downside to Cloud Services

I have a client who has been buying a cloud service for about two years and has a number of reasons to be unhappy with the service. I’m not going to name the specific service because there aren’t many vendors in this particular space. But the issues my client is experiencing look to be common with a lot of cloud services.

My client buys this service to resell along with other services to his own customers. His number one complaint is that he never knows what service he is going to wake up to each day. The cloud software will be working great one day and then the vendor will implement a software change and all of a sudden he gets calls from his end-user customers that things have gone wrong. And inevitably the problem turns out to be something that the cloud service vendor has introduced into the service as a supposed upgrade or improvement.

This is not a new phenomenon, and anybody who purchased a voice switch, a cable TV headend, or a complex billing system can remember the days when this same sort of thing happened all of the time. Carriers would shudder each time they got a software update from a vendor because it often caused more harm than good.

And the industry learned how to deal with this problem. First, carriers started to insist that vendors build test labs and try out new software in the lab first rather than foisting it onto end users. Second, they insisted that software vendors release updates in discrete versions so that each carrier could decide whether or not to install a given update.

I remember a time in the late 90s when CCG routinely recommended that our clients not install software updates. There were so many problems with new software releases that we found it was better to let an update hit the world and let other carriers debug it first. My clients would purposely fall numerous releases behind, but as long as they weren’t experiencing end-user issues they were happy.

But now my clients are starting to buy services in the cloud, and in doing so we have gone back to the 90s all over again. The biggest problem with most cloud vendors is that they only run one version of their software – the latest. The vendor will update the cloud software and every one of their customers will have the same version. This certainly makes life easier on the cloud vendor.

But unless the vendor has amazing software coders who never make mistakes (and that is never going to happen), the vendor will eventually release an update with dreadful bugs in it, and the test lab for those bugs becomes the end-user customers of the carriers. A carrier might not even realize there was a software update until they get complaints from their own customers. And the situation is now much worse than in the old days, because the most common way to fix this sort of problem used to be to reinstall an earlier version of the software that was known to work right – an option that disappears when the vendor runs only one version.

I guess that cloud service providers need to learn the same lessons that other vendors in the carrier industry learned a few decades ago. Just because software is in the cloud doesn’t change good software practices – in fact it makes them even more important. A software vendor that uses end users as its testing lab is going to get a horrible reputation and in the long run is not going to keep customers in the carrier space.

And so I hope that cloud software vendors will implement the same kinds of changes that the industry forced vendors to make decades ago:

  • It should be the carrier’s choice about accepting a software upgrade. Updates should never be automatic. This means that the cloud vendor needs to keep multiple versions of their software available online.
  • Software vendors need to maintain a test lab of some kind. Most software ultimately controls hardware and the vendor ought to have a lab on which to make certain that changes in the software do what they are intended to do while not screwing up something else.
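The first bullet amounts to letting each carrier pin a release. As a minimal sketch, assuming a purely hypothetical vendor interface (none of these names or version numbers come from any real product), carrier-controlled version pinning might look like this:

```python
# Hypothetical sketch of carrier-controlled version pinning.
# The vendor keeps several releases available online; nothing
# upgrades automatically when a new release is published.

AVAILABLE_RELEASES = ["4.1.2", "4.2.0", "4.3.0"]

# Each carrier pins the release it has tested and trusts.
carrier_pins = {
    "carrier-a": "4.1.2",   # several releases behind, by choice
    "carrier-b": "4.2.0",
}

def upgrade(carrier: str, release: str) -> None:
    """Move a carrier to a new release only on explicit request."""
    if release not in AVAILABLE_RELEASES:
        raise ValueError(f"{release} is not an available release")
    carrier_pins[carrier] = release

# The vendor publishing 4.3.0 changes nothing for existing carriers;
# each one moves only when it decides the release is safe.
upgrade("carrier-b", "4.3.0")
```

The point of the sketch is the default: a new release changes nothing for anybody until a carrier explicitly asks for it.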
Categories
The Industry

Another Municipal Model

City officials in San Francisco recently issued a report that takes a very different stance than most other cities that are looking at broadband issues. The city essentially rejects the normal demand-based commercial model for broadband and looks at a new structure that will bring broadband to everybody.

The report is authored by the office of supervisor Mark Farrell and reflects some of the recommendations from the San Francisco Municipal Fiber Advisory Panel. The report very correctly observes that today’s commercial broadband model leaves a lot of citizens without broadband. Numerous nationwide surveys have shown that the majority of households without broadband access today feel they cannot afford the market prices for service.

So the San Francisco report recommends that the City institute a $26 per month fee on all households – with a higher fee on businesses – to help pay for broadband to everyone. They further recommend a public private partnership model to operate the business and assume that tiered pricing will still allow for profitability for a commercial partner.

The numbers are based upon an estimate that it would cost $867.3 million to build a fiber network in the city and $231.7 million per year to maintain the network. In my experience looking at other large cities, both numbers feel very high. One has to assume that in an open access network where fiber was built to everybody, the ongoing maintenance expenses would be far lower than that, since many of those costs would accrue to the ISPs and not to the city.
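A quick back-of-the-envelope check shows the tension in the report’s numbers. The household count here is my own rough assumption, not a figure from the report; the fee and cost figures are the ones cited above:

```python
# Back-of-the-envelope check on the San Francisco numbers.
households = 360_000              # assumed SF household count (my estimate)
monthly_fee = 26                  # proposed fee per household, $/month

build_cost = 867_300_000          # estimated one-time fiber build, $
annual_maintenance = 231_700_000  # estimated annual network cost, $

annual_residential_revenue = households * monthly_fee * 12
print(f"Residential fee revenue: ${annual_residential_revenue:,.0f}/year")
print(f"Share of annual maintenance covered: "
      f"{annual_residential_revenue / annual_maintenance:.0%}")
```

Under that assumption the residential fee raises about $112 million per year, less than half of the projected $231.7 million annual cost – the rest would have to come from the higher business fees and tiered retail pricing, which is part of why the maintenance estimate feels high to me.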

The city is not the first place that has looked at paying for fiber using taxpayer money, but it is by far the largest. A few small communities like Leverett, Massachusetts have paid for fiber construction with tax levies. The city of North Kansas City built a network and essentially is giving free service to residents for the next ten years. And the Utopia system in Utah recently looked at the taxpayer funding model, although it looks like a lot of the communities involved are rejecting the idea.

It’s a very interesting concept with a bunch of pros and cons. On the plus side, this would certainly solve the digital divide if every household in a community had a fiber connection. There would still be the issue of making sure that everybody has a computer, but that seems like an easier problem to solve than getting the fiber network built to everybody.

But I can foresee a few major hurdles in implementing such an idea in an NFL city, such as the following:

  • The City probably doesn’t have the right to insist that they can bring fiber into apartment buildings. The FCC has made it clear that building owners have the right to control the wiring and the access to services on their own property. Many of the apartment owners will already have made a long-term contractual arrangement and be doing revenue sharing with the local cable company or some other service provider.
  • One can envision multiple lawsuits from citizens and businesses that wouldn’t want the city solution or who won’t want to pay the fee. It’s one thing to do this in a tiny town like Leverett, MA where there was no existing broadband, but in a large city there are bound to be many who don’t want the city doing this.
  • This is such a drastic solution that it surely would invite legislative action and multiple lawsuits from the incumbent providers. California is one of the states that allows for municipal competition, but using direct tax revenues to compete against the existing broadband providers would raise legitimate concerns about unfair competition. One can envision attempts to pass state or national legislation that would outlaw the proposed business plan. ISPs would use every tool at their disposal to fight this for fear that it might work and could spread elsewhere.

As the report points out, cities have a broadband dilemma today. Even where there is fiber or good broadband today there are a lot of households that can’t afford broadband. The report estimates there are over 100,000 people in San Francisco that can’t afford the market price for broadband and another 50,000 that still use dial-up.

There is also the issue of carriers building to just some parts of a city. One only has to look at all of the east coast cities that have Verizon FiOS to see the result of allowing commercial broadband providers to cherry-pick in markets. These cities have some neighborhoods with fast fiber broadband and competition between the telco and the cable company (which many observe is not real competition). But they have many neighborhoods without fiber and none of these cities can formulate a business plan that can justify bringing fiber to the neighborhoods that Verizon bypassed as too expensive to build.

The San Francisco report was a little fuzzy on a few of the details, which is natural since those details can only be made clear through negotiations with carriers willing to operate on such a network. You have to give the city kudos for creativity, but I foresee a big uphill battle if they try to implement this. Still, it’s an idea that should work if it can overcome the opponents that will spend huge money trying to prevent it.

Categories
The Industry

TV a Decade from Now


I recently heard another consultant say that traditional cable TV as we know it will still be a very strong product a decade from now and that it’s far too soon for small cable providers to worry about the future of cable TV. That got me thinking about everything that is going on in the industry, and I come to a very different conclusion. I think TV a decade from now is going to be very different than it is today. There are so many major changes happening today, and while it’s hard to see through it all, I can’t imagine TV still being anything like what we have today a decade from now.

Skinny Bundles. While talking about cord cutting is interesting, last year new cord cutters were at most something like 2% of all cable viewers – that is not yet a revolution. The really big change in the industry is going to come from skinny bundles. These are the small packages that cable TV providers are assembling as an alternative to the 200-channel bundles. The cable companies are assembling packages of the most popular channels and are pricing them at $30 and $40.

I think skinny bundles are going to be wildly successful. Assuming that the skinny bundles contain a lot of what people want to watch, they will be a better option for most people than going to a pure OTT product like Sling TV. It’s easy to forget that people pay a huge penalty for breaking the cable company bundle – becoming a cord cutter can cost a family a $10 to $20 increase in their broadband price.

Skinny bundles are going to be significant because they mean that far fewer viewers will be paying for the less-popular cable channels like the Tennis Channel or Discovery Health. I don’t think we should underestimate how much Wall Street is going to punish programmers if they start losing customers and revenues. We saw a little bit of this recently when it was reported that ESPN had lost 7 million viewers and Disney stock took a beating. I predict that as skinny bundles take off we are going to see a number of lesser-viewed networks disappear.

Mega-Bundles. I think the OTT industry is going to have to consolidate in some way to be successful in the long term. Already today it can cost more to buy the individual OTT offerings you want than to just stick with a traditional cable package. In the long run, if each OTT package stands alone then many of them will fail from lack of viewers.

But I already see talk about the creation of the OTT bundle – a service that brings various OTT offerings together under one umbrella package. There are an incredible number of companies now making or planning to make original content. As somebody who only watches OTT content, I already find it confusing and hard to find what I want to watch. So I expect there will be bundlers that bring original content from many sources together under one search engine – sort of like a TiVo for OTT content. I don’t really care if content is created by Netflix, Apple, YouTube or somebody else. If I could buy a service that bridged the current OTT content into one package I’d buy it today.

Drop in Live Viewing. The continuing trend of people watching less live television is going to feed into the above two trends. People are being retrained to be less loyal to networks and are instead becoming fans of specific series or types of programming. Binge watching (or even just delayed watching) is becoming the norm and there will be less and less programming that people insist on watching live.

The Trend. All of these trends together mean that people are going to become less loyal to a given network and more loyal to specific content. And that is the change that will transform the industry. The programmers today have all of the power because they can force the cable providers to buy all of their networks. But if people elect options that avoid much of that content then the driving power in the industry will shift from the programmers to the viewers.

Programmers are not going to be able to sell things that people won’t pay to watch. The amount of new original content available today already provides an amazing alternative to traditional programming. And when you look at just the original content planned for the next year or two you can see that quality content might become the new driver in the industry.

The cable companies are not going to resist these changes, which might be a shock to the programmers. There is a decent chance that cable companies can make as much margin from skinny bundles as they make today from the huge bundles. And no cable company is going to be sorry to see the power of the big programmers get diluted by the change in people’s viewing preferences.

Categories
Meet CCG The Industry

Three Years and Counting

Today is the three year anniversary of this blog. I started writing this blog as a way to force myself to keep up with industry news. During the first month of writing the blog I worried that I would quickly run out of topics. But I underestimated then how dynamic our industry has become. The changes from just three years ago are amazing. Instead of running out of topics I often have to toss away topics because I can’t get to them fast enough.

I mostly write about the topics in the industry that I find most interesting, but I must be striking a chord because I pick up new readers daily. I now know that I am the only one writing daily about broadband and related topics, and it makes me happy to see that others also find these topics of interest. Just since I’ve started this blog we’ve seen the following changes in the industry (and this is a short list):

An Activist FCC. The current FCC has waded into more new topics than any other FCC in my memory. The most significant one is the net neutrality decision that reclassifies broadband as a regulated service. But there have been many other rulings from this FCC. There was a time a few years ago when industry pundits predicted that regulation was dying, but it has done just the opposite.

Exploding Demand for Broadband. The penetration rates for broadband have continued to grow and in urban areas it seems like we are getting close to the time when everybody that can afford broadband has it. But there are still huge numbers of rural homes and businesses without broadband and they are starting to stridently demand it.

Growth of the OTT Industry. While Netflix has been streaming content a little longer than I have been writing this blog, the whole OTT phenomenon has really taken off in the last few years. Netflix now claims over 75 million customers and there is now a growing host of other OTT providers. Online video has completely transformed the Internet and video is by far the majority of online traffic.

New Products from the IoT. There are new products available to carriers for the first time in many years. I have a number of clients who are now successfully selling security and a number of them are getting into home automation and the many other related services associated with the Internet of Things.

Use of WiFi instead of Wires. It’s become recently obvious that the large ISPs have abandoned home wiring for delivering data. They now bring bandwidth into the home to a central WiFi router and don’t install wires to anything else. But a single WiFi router is already not sufficient for high-bandwidth homes and the next trend in this area is going to be the networking of multiple WiFi routers.

Services in the Cloud. More and more services are moving to the cloud. Carriers can buy voice and cable TV programming from the cloud today, something that was unimaginable just a few years ago. It was always assumed that expensive bandwidth made cloud cable TV impractical, but as bandwidth prices continue to tumble it makes more sense to buy programming from the cloud instead of building and maintaining a cable headend.

Public Private Partnerships. There were very few Public Private Partnerships a few years ago and now it’s something that everybody talks about. This is particularly relevant in rural America where communities are willing to kick in money to find a broadband solution. But we are even seeing this in urban areas, such as the deal just announced between Google Fiber and Huntsville.

Erosion of Landline and Cable Customers. Landline penetration rates are now under 50% nationwide and we are starting to see the erosion of traditional cable customers. The challenge for the next few years will be for triple play providers to find ways to replace these shrinking revenues and margins.

Massive Realignment of Rural Subsidies. We’ve seen subsidies shrink for small telcos. Access charges are being phased lower and the Universal Service Fund is being redirected from telephone to broadband. This has put a lot of pressure on some small carriers, but anybody who survives the end of this shift will probably be ready to succeed in the long run.

Categories
Regulation - What is it Good For?

FCC Looks at Consumer Data Security

The FCC will be voting on March 31 to release a Notice of Proposed Rulemaking (NPRM) concerning customers’ rights over their data on the Internet. More specifically, the NPRM looks at the relationship between a customer and their ISP. It’s assumed that FCC Chairman Tom Wheeler already has the votes to get this passed.

The premise of the NPRM is that an ISP knows more about what a customer does than anybody else. They know what web sites you connect to and for how long, and even if you encrypt everything they know a lot about you. Most people don’t realize that an ISP has total knowledge of everything a customer does that is not encrypted. If they care to do so an ISP can record every keystroke made online.

And so the NPRM will be asking what rights customers should have as far as allowing their ISP to use or monetize the knowledge it gains about them. The proposed rules would apply the same sorts of privacy rights to broadband that have long been in place for telephone service. The privacy rules would not apply to social media sites, browsers or search engines, just to ISPs. The FCC’s reasoning is that customers voluntarily give their data to these edge services but have not done so freely to their ISP.

The NPRM starts with the premise that consumers ought to have control over how their data is used by their ISP. Telephone customers have had similar rights for years. Here are the primary areas that will be covered by the NPRM:

Transparency. The FCC wants ISPs to inform people about the information they collect about them. They want ISPs to further tell customers how they use this data and if and how the data might be sold to others. And the FCC wants all of this written in plain English (good luck with that!)

Security. The FCC believes that ISPs have the responsibility to protect customer data. The NPRM wants to require ISPs to take reasonable steps to protect customer data.

  • This would mean new rules for ISPs. They would have to institute training practices for employees, adopt strong customer authorization practices, identify to the FCC the senior manager(s) responsible for data security, and take responsibility of customer data when it’s shared with a third party.
  • There would also be new rules about data breaches. Customers would have to be notified of data breaches within 10 days of discovery. The ISP would need to notify the FCC within 7 days of any breach. ISPs would have to notify the FBI and the US Secret Service of any breach of more than 5,000 customers.
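The timing rules in these bullets are mechanical enough to express directly. This is just a sketch of the deadlines as summarized above – the rules actually adopted, if any, would carry many more conditions:

```python
from datetime import date, timedelta

def breach_deadlines(discovered: date, customers_affected: int) -> dict:
    """Compute notification deadlines per the NPRM's proposed rules."""
    deadlines = {
        "fcc": discovered + timedelta(days=7),         # FCC within 7 days
        "customers": discovered + timedelta(days=10),  # customers within 10 days
    }
    # Breaches of more than 5,000 customers also go to the FBI and
    # the US Secret Service (the summary gives no specific deadline).
    deadlines["notify_fbi_and_secret_service"] = customers_affected > 5_000
    return deadlines

# Example: a breach of 6,000 customer records discovered March 1, 2016.
d = breach_deadlines(date(2016, 3, 1), 6_000)
print(d["fcc"])        # 2016-03-08
print(d["customers"])  # 2016-03-11
```

Note that under these rules the FCC would hear about a breach before the affected customers do.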

Choice. The NPRM suggests that customers be given a choice to say what kind of data their ISP may use and under what conditions it can be shared with others. The FCC wants to categorize customer data into three categories:

  • First is the data that an ISP must have in order to serve customers. This would be things like name, address and other data needed to bill a customer. And because the product is broadband the FCC believes that an ISP has the inherent right to do things like measure your total data usage and other related network information.
  • Second, the FCC thinks that an ISP ought to be able to use a customer’s data to market other telecom products to them. But, like with telephone service, the FCC thinks customers should have the right to opt-out of ISP marketing activity.
  • Third, the FCC is then suggesting that customers would need to opt-in to give an ISP the right to use their data for any other purposes.

The FCC wants these to be rules about customer permission and protection of data; they are not prohibiting ISPs from gathering and using data as long as the customer approves of it. As is usual with this kind of NPRM we can expect a lot of comments both for and against the proposal. What I find most unusual about this NPRM is that it largely assumes that the FCC is going to prevail in its order to regulate broadband under Title II rules. If that order gets overturned then protection of customer data would probably revert to the FTC.

Categories
The Industry

The Power of the Programmers

I heard from several different sources on the same day that the quantity of choices on Netflix has been declining outside of their original programming. Netflix now carries almost a third fewer shows than it did just a few years ago. When Netflix got started it carried a significant number of network TV series, and those have largely dried up. A lot of the series on Netflix today are older and have been off the air for a while. It seems like there has been a major shift of TV series from Netflix to Hulu.

I’m also reading almost daily about companies trying to put together OTT packages for the Internet that are getting nowhere in their attempts to gain programming rights. One of the few companies able to do this is Dish Network, which was able to craft Sling TV out of its existing programming contracts. But companies like Apple have been trying for years to put together a package and have gotten nowhere.

Finally, in the news a lot lately are cable providers that have changed their channel line-ups or are putting together skinny bundles and are getting push-back from the programmers. For example, Verizon had to change its skinny bundle package to accommodate a lawsuit from Disney over the placement of ESPN.

All of this points to the fact that programmers still have most of the power in the industry today. There is nothing stopping anybody from putting together OTT packages of original content, and we see a lot of movement to do just that. But getting permission to include normal cable networks in OTT packages is still incredibly difficult. Unless that changes, the OTT movement is liable to remain small and very fragmented.

What is probably the most interesting in all of this is the subtle shift of programming from Netflix to Hulu. It’s something that Netflix talks about a little, but not too loudly because they don’t want to hurt their stock prices. But over time Netflix has lost the ability to show a lot of the current content that people want to watch. They are not getting very much fresh network programming and not nearly as many hot movies as they did in the past.

Hulu is now getting most of that content. For those that don’t know it, Hulu is a joint venture between Disney (ABC Television Group), Fox Broadcasting, and NBC Universal (Comcast). The relationship between Hulu and its owners is an interesting one. For a number of years the programmers were hesitant to make Hulu too good since they saw it as a threat to their much larger sales of programming to cable TV providers. But as it’s become obvious that there is no putting the OTT genie back into the bottle these companies and other programmers are starting to see Hulu as their best way to combat Netflix, YouTube, and the other OTT start-ups. And so these network owners and others have been providing more content to Hulu while also giving them the right to show network shows the day after they air live. They have also developed commercial-free subscriptions and made Hulu easier to use.

The FCC has had an open docket now for almost a year that is supposedly looking at OTT content and programming issues. But that docket has been relatively quiet and I’ve seen no speculation on when – if ever – the FCC will make any decisions in this area. As many of the filers in that docket have noted, the FCC could be on shaky ground if they try to regulate content provided on the Internet. In order to do so they would probably have to somehow classify web video providers as cable companies – something they may be unwilling or even unable to do.

But until such time that the FCC can step into the fray in some manner the content providers continue to hold most of the power in the industry. While they probably can’t bring down Netflix (which is going to spend over $5 billion this year on original content) they can withhold other content from Netflix making them less attractive. And the programmers clearly have the ability to stop or slow anybody else from putting together meaningful OTT programming packages. They are still in a position to pick the winners and losers in the OTT industry.

Categories
Regulation - What is it Good For?

Getting Access to Poles

Google Fiber is having problems getting onto poles in many parts of the Bay Area and the issues they are having make for a good primer on the very confusing rules for regulating different kinds of entities.

Google Fiber has only publicly announced that they are bringing service to parts of San Francisco. But they have also been talking to Palo Alto, Santa Clara, San Jose, Mountain View and Sunnyvale. Google has no significant pole issues in Palo Alto where the poles are owned by the City, nor in Santa Clara where the poles are mostly owned by the City and a few by AT&T.

The problems come in the other cities. In California a lot of poles are owned by what is called the Northern California Joint Pole Association which is owned by Comcast, Time Warner and AT&T. That group is disputing Google’s right to get on their poles.

The issue is purely a regulatory one. Google claims they are a cable TV company. The kind of company you are matters when it comes to poles. Many years ago the FCC and the industry worked out very specific rules for attachments to poles. Poles are divided into specific zones where various kinds of companies can place cables. The telephone incumbent has the lowest space. At the top is the power company, and historically the cable company fit between telco and power lines. Anybody else who gets on a pole has to fit somewhere in the middle, and in different parts of the country this is sometimes between the cable company and the power company and sometimes between the telco and the cable company.

The first problem Google faces is that by declaring themselves a cable company, they run into pole rules that assume there is only one such company. So they can’t claim the ability to get into the cable space, which in all of these cities is already taken by the incumbent cable provider.

Google has always said that they don’t want to register as a CLEC, or competitive telephone company. And until the company announced a trial for voice service a few weeks ago they didn’t offer voice anywhere. But from a regulatory perspective, if Google was a CLEC they would have the right under law to connect to poles, which was guaranteed in the Telecommunications Act of 1996. But I don’t believe there is any similar law that would provide a second cable company the same right, and that has to be the basis for the pole owners to deny access to Google.

Of course, the companies in the association have a very vested interest in delaying Google Fiber from getting into their markets, so it’s only natural they would fight this. It’s actually somewhat rare for cable companies to own any substantial number of poles, but in this consortium two of the owners are cable companies.

AT&T has argued to the California PUC that they don’t believe that Google Fiber qualifies as a cable company and is using that distinction to deny Google access to these poles. There are generally two ways for a company to become certified as a cable company. They have to register with the FCC, which is a very rubber-stamp process, or they have to get a local cable TV franchise from the city where they want to provide service.

But California added a twist to that process. In 2006 the legislature passed a bill that allows companies to get a statewide cable franchise, which is the reason that the California PUC is involved in this dispute. That original law was passed for the benefit of Verizon and AT&T, so that they could provide a competitive cable TV alternative to the incumbents. Under the statewide rules a company only needs to notify a city 10 days before they first are going to offer cable TV service and there are no more regulatory requirements at the city level. A competitive cable TV provider has no obligation to serve an entire community and can serve only where they choose.

Early indications are that the California PUC is siding with the pole owners and might not be buying the argument that Google Fiber is a cable company. But even if they are a cable company I don’t know that this gets them access to poles. When AT&T and Verizon became statewide cable providers they already had access to poles. If Google Fiber was a CLEC they would automatically have the right to pole access, but Google apparently doesn’t want to take on the other obligations that come with being a CLEC. The dispute is going to be resolved in one of two ways – either a court will decide this if Google wants to pursue it, or Google will just walk away from those markets and pursue some of the other hundreds of markets that want their fiber.

Categories
Regulation - What is it Good For?

Government and the Digital Divide

There were two interesting announcements from politicians in the last week concerning the digital divide. First, there was an announcement from President Obama saying that he wants to connect 20 million more Americans to broadband by 2020. Then Greg Abbott, the governor of Texas, announced that he wants to connect all of the 5.2 million schoolchildren in Texas to the Internet by 2018.

President Obama’s announcement was accompanied by a plan called ConnectALL. The plan was prompted in part by a recent study that shows that households making less than $25,000 per year are half as likely to have broadband as households that make more. The plan makes a number of specific proposals for things the federal government can do to increase broadband penetration rates:

  • The primary tool proposed is to revise the Lifeline program that subsidizes telephone service for low-income households and to redirect the $1.2 billion spent annually on that program to subsidize broadband connections instead. This is something that is already underway at the FCC, and the proposed rules on how this might work are expected out later this year.
  • The plan also includes an initiative to improve digital literacy skills. The plan would engage a number of volunteer and non-profit organizations to make this a priority. This would include AmeriCorps volunteers as well as organizations like the Corporation for National and Community Service and the Institute of Museum and Library Services. The plan would also promote more computer skill training at two-year colleges.
  • The plan would also promote the reuse of computers and similar equipment no longer needed by the federal government.
  • The plan would also direct the NTIA to get more involved in supporting community broadband planning. It would also bring in a number of non-profits and philanthropic groups to help with this effort.
  • The plan also calls for ISPs to offer more low-priced products for low-income households.

The Texas governor has not yet released any details of how he might go about connecting all school children to broadband in such a short period of time. The only solution I can imagine that could happen that quickly would be some sort of cellular plan just for kids to get connected to school servers. 2018 is practically right around the corner in terms of solving broadband issues.

These kinds of announcements always sound great. Certainly both politicians have identified real issues. It’s becoming quite clear that poor households are increasingly finding broadband unaffordable. But one has to ask how much success the federal plan might really have. Certainly subsidizing internet connectivity for low-income households will bring some new households onto the Internet. But you need to ask how much of an incentive $10 per month is for a home that can’t afford broadband today.

Certainly the $1.2 billion per year in Lifeline funding can reach 20 million people – at $10 per month, or $120 per year per household, that amount will provide cheaper broadband to 10 million homes. But you would have to think that a lot of those homes are already receiving this same subsidy today for their home phone, and when a household swaps a phone subsidy for a broadband subsidy they are no better off in terms of total telecom spending. They will just have swapped a $10 per month discount from one bill to another.
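The budget arithmetic behind those numbers is simple to check. Here is a quick sketch using only the figures cited above – the $1.2 billion annual fund, the $10 monthly subsidy, and a rough assumption of two people per subsidized home:

```python
# Lifeline budget math, using the figures from the text above.
ANNUAL_FUND = 1_200_000_000   # dollars per year
MONTHLY_SUBSIDY = 10          # dollars per household per month

annual_subsidy_per_home = MONTHLY_SUBSIDY * 12           # $120 per year
homes_covered = ANNUAL_FUND // annual_subsidy_per_home   # 10,000,000 homes

# At roughly two people per home, this matches the 20 million
# Americans in the ConnectALL goal.
people_reached = homes_covered * 2

print(homes_covered)   # 10000000
print(people_reached)  # 20000000
```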

And all of the other proposed solutions sound wonderful on paper – but will they work to get more people on the Internet? I know that computer literacy training can work well if done right. I have one client who has been holding training sessions for customers for well over a decade, and over the years they have brought a lot of elderly residents in their community onto the Internet. But they say that it takes a major time commitment for each potential customer and a concentrated effort for this to work – they often will work with a given customer for many months before that person is comfortable enough to buy Internet at their home.

And none of the federal ideas really fix the underlying problem of affordability. The Lifeline program will reduce broadband bills by $10 per month, but in homes that are surviving on jobs that pay $12 per hour or less, broadband at any price is hard to afford. I certainly don’t have an answer to this problem, but there are other ideas that I think ought to be considered as well. For example, $1.2 billion per year could supply a lot of broadband by building a huge number of neighborhood WiFi transmitters that could bring cheap or free Internet to many homes at the same time. I’ve always thought that the cities that are looking to provide free WiFi broadband are on the right track because that brings broadband to the neediest households without the paperwork and expense that comes with subsidy programs.

The last item on the list above has the most promise. A lot of good could come from pushing the major ISPs to offer a $10 or $20 broadband alternative. But this was forced onto Comcast a number of years ago and they largely shirked the responsibility and provided low-price broadband to very few homes.

I’ve been skeptical for years that the Lifeline program makes a lot of difference. It probably did when the program first started in 1985 and the typical phone bill was under $20. But the $10 discount that was a lot in 1985 is worth a lot less now. It just doesn’t feel like enough of an incentive to make the difference the government is hoping for.

Categories
The Industry

Comcast and Real Competition

It’s really interesting to see how Comcast is reacting to Google Fiber in Atlanta. The company has had competition from fiber in the past in the form of Verizon FiOS. But the footprint for that competition hasn’t changed for years. Comcast and Verizon have competed with very similar data speeds and there was not a lot to distinguish one from the other from a product standpoint. Each company has bested the other in some markets, although Verizon seems to have gotten the upper hand in more places.

But now Comcast is facing Google Fiber for the first time and their reaction is interesting. From what I can see they are doing the following:

  • Comcast is offering a gigabit of speed for $70 per month. But it comes with a very ugly 3-year contract. For those that don’t take the 3-year contract the price will be $139.95 per month, plus Comcast will impose a 300 gigabyte monthly data cap that could add up to $35 per month for anybody that actually uses the data.
  • Comcast is using negative advertising against Google’s WiFi router, claiming that Google’s WiFi speeds are 30 Mbps while its own router delivers 725 Mbps.
  • And Comcast is widely distributing flyers that tell people in Atlanta not to fall for the Google hype.
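To put the first bullet in dollar terms, here is a rough yearly comparison of the two Comcast options – a sketch using only the prices quoted above, with the data-cap overage shown at its stated $35-per-month worst case rather than modeled per gigabyte:

```python
# Yearly cost of Comcast's gigabit with and without the contract,
# using the prices from the bullets above.
CONTRACT_PRICE = 70.00       # $/month with the 3-year contract
NO_CONTRACT_PRICE = 139.95   # $/month without a contract
MAX_OVERAGE = 35.00          # worst-case monthly data-cap charge

contract_year = CONTRACT_PRICE * 12
no_contract_year = (NO_CONTRACT_PRICE + MAX_OVERAGE) * 12

print(round(contract_year, 2))     # 840.0
print(round(no_contract_year, 2))  # 2099.4
```

A no-contract customer who regularly hits the cap would pay roughly two and a half times as much per year, which is presumably the point of the contract.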

So how do these claims stack up and will they be effective?

I think Comcast’s speed comparison is quite silly and that the public will see through it. The general public has been trained for a decade that fiber is better. Not that upload speeds matter to most people, but Google’s speeds are symmetrical while Comcast will have a relatively slow, perhaps 35 Mbps upload. On a fiber network it’s not too hard to engineer to deliver a true gigabit download almost all of the time. But Comcast is going to have the same issues it’s always had with its HFC network. If it sells too many gigabit customers, then its nodes will slow down for everybody on the node. I don’t believe that there are many homes today that really need a gigabit, but once Google is up and running it ought to win the speed test battle in the market.

There is some truth to Comcast’s claim about WiFi, although their numbers are quite skewed. For some reason Google Fiber is still using an 802.11n WiFi router. At best their WiFi routers are going to deliver about 300 Mbps – but in Kansas City the Google routers are reported on consumer websites to deliver about 80 Mbps on average. Comcast is offering 802.11ac routers, and while they are theoretically capable of the speeds they tout, in real life use they deliver between 200 Mbps and 300 Mbps.

The fact is that both companies (and most ISPs) are doing a very poor job with WiFi. Almost all of them offer a single-WiFi-router solution, which is not acceptable in today’s big-bandwidth homes. I have a Comcast WiFi router and it delivers really low speeds to our offices, which are at opposite ends of the house from the central router. Until a carrier is willing to cross the threshold and install a WiFi network with multiple linked WiFi routers in a home, all of their solutions are going to be poor in real-life practice.

It appears that Comcast is relying on negative advertising against Google, and I seriously doubt this is going to work. Comcast has one of the most hated customer service experiences in the country and Google has been touted – so far – for offering outstanding customer service. It seems like a bad tactic to advertise negatively about somebody that will have a better network product and a better customer experience.

I think Comcast is really missing the point. It seems like they are spending their energy advertising against Google’s gigabit product. But Google announced that it is entering Atlanta with two data products – the gigabit at $70 and a 100 Mbps product at $50. My bet is that the slower product is the one most likely to cut into Comcast’s penetration rate unless Comcast decides to scrap the 300 gigabyte monthly data cap. While Comcast says that only a small percentage of customers use more data than that per month, my clients tell me otherwise. Once a customer has been charged extra for a data cap overage by Comcast they will most likely change to Google, and they are likely to never come back.

Categories
The Industry

Google Fiber and the Triple Play

There is some interesting news from Google Fiber lately about new product offerings. It was reported at the end of January that Google is testing a voice product for its fiber customers. And in early February Google announced that it was adding a 100 Mbps data product in the Atlanta roll-out.

News leaked out that Google is experimenting with Fiber Phone with members of its Trusted Tester Program. Google offered phone service to those customers and wrote the following:

With Fiber Phone, you can use the right phone for your needs, whether it’s your mobile device on the go or your landline at home. No more worrying about cell reception or your battery life when you’re home… Spam filtering, call screening and do-not-disturb make sure the right people can get in touch with you at the right time.

Google is installing the needed equipment for test customers and is at the beta stage of testing. There has been no news yet about possible pricing or about when this might be made available to all customers.

In early February Google announced it is now offering a 100 Mbps data product for $50 to go along with the $70 gigabit offering. In Atlanta the company has eliminated the ‘free’ Internet product where customers paid a one-time fee of $300 and got a 5 Mbps product for 7 years with no additional fees.

With these changes Google is looking more and more like a typical triple-play provider. It’s not hard to understand why they would make these changes. It’s very expensive to build a fiber network, and the best way to pay for it is to get as many high-margin customers as possible onto the network.

As exciting as the $70 gigabit product is, there are a huge number of households that just can’t afford that price. So by adding a $50 product that is still blazingly fast, Google will make its broadband affordable to a lot more people in each market.

There is one interesting market dynamic that Google is probably going to soon see. In looking at the customer penetration rates for many of my client ISPs I’ve almost always seen that the fastest Internet product (assuming it isn’t priced too high) will get 10% to 15% of the customers in a given market. Given a choice, the rest of the customers will take something slower if it saves them money. This is not something that’s true only for fast fiber networks, but I’ve seen this same relationship hold true for cable companies with HFC networks and for DSL networks. There are only a few markets where a higher percentage of customers buy the premium data product.

If Google goes back and introduces the 100 Mbps product in their older markets they will probably see two things. First, they will add customers who find the $50 price affordable. But they are also going to see gigabit customers downgrade to 100 Mbps to save $20 per month. Overall I would guess this change will produce a significant net change upward in total revenues in Google’s older markets. In Atlanta I predict they will get a lot more 100 Mbps customers than gigabit customers.
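To illustrate why total revenue can rise even as some gigabit customers downgrade, here is a hypothetical market sketch. All of the take rates are invented for the illustration; only the $70/$50 prices and the 10%–15% premium-tier pattern come from the text above:

```python
# Hypothetical market: does adding a $50 tier raise total revenue
# even though some gigabit customers downgrade? All take rates are
# assumptions for the sketch, not reported figures.
MARKET_HOMES = 10_000

# Gigabit-only: assume a 30% take rate, everyone at $70/month.
before = (MARKET_HOMES * 30 // 100) * 70          # 3,000 customers

# Two tiers: assume the $50 price lifts the take rate to 40%,
# with 15% of data customers staying on the gigabit tier.
customers = MARKET_HOMES * 40 // 100              # 4,000 customers
gig = customers * 15 // 100                       # 600 on gigabit
after = gig * 70 + (customers - gig) * 50

print(before)  # 210000 ($/month)
print(after)   # 212000 ($/month)
```

Even with most data customers on the cheaper tier, the larger customer base more than makes up for the lower average price in this scenario.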

And Google ought to do okay with voice. My experience is that they will have a hard time selling voice to existing customers but that they will do okay with new customers as they add them. The FCC reported that nationwide landline voice penetration just fell under 50%, and that is still a lot of potential customers. I see clients still doing surprisingly well with residential voice and still doing extremely well with business voice.

It’s interesting to see that after a few years in the market Google is morphing into a more normal triple-play provider. I’ve expected this from the start because my take is that a large majority of households still want the double play or triple play, and if you want to get a lot of customers you have to provide what customers want to buy. Anybody that expects customers to buy from more than one vendor to get what they want is going to drive away a lot of potential customers.
