Categories
The Industry

Google Fiber to Push Speed Limits Again

Dinni Jain, the CEO of Google Fiber, posted a blog last week announcing plans to dramatically increase the top speeds available on its fiber network. He says the specific announcements will come in the coming months as Google Fiber expands its gigabit offerings.

The blog gives a hint at what might be coming. Included in the blog is a speed test from the home of a Google Fiber employee in Kansas City who is receiving 20.2 Gbps. I think this might be a signal to other ISPs that Google Fiber is prepared to surpass the capability of the XGS-PON technology that the industry is adopting. That technology delivers up to 10 gigabits symmetrical to a cluster of homes, depending on the electronics vendor. It’s obvious that Google Fiber is using something faster for the test than the currently available XGS-PON. It’s always been speculated that Google has developed its own customer electronics, but the company has always been mum on the issue.

It’s not easy for most current fiber providers to upgrade to 20-gigabit speeds, even if there is a PON solution faster than 10 Gbps. An upgrade hits every portion of a network. It means a faster connection to neighborhoods. It means faster core routers and switches. It means a more robust pipe to the Internet – though Google Fiber has an edge there, since it has built or leased dark fiber to many markets to support YouTube peering. It means all new ONTs and customer modems capable of receiving 20-gigabit speeds – and it means much faster WiFi within homes. Ultimately, it means computers and devices capable of handling faster speeds.

Google Fiber was the first to make a national splash in 2010 with gigabit fiber for $70 per month – a price it has never increased. At that time, there were a handful of municipalities, cooperatives, and small telcos that offered gigabit speeds – but all of them I know about charged significantly more than $70. It sounds like Google Fiber is getting ready to recalibrate the top of the speed market again. An affordable 20-gigabit product would certainly do that.

The most interesting thing said in the blog is that speed isn’t everything. The blog hints at having products that benefit from much faster speeds – something the industry has been searching for since the introduction of gigabit speeds. There are still very few uses that can fully utilize a gigabit connection in a home, let alone a much faster connection. There are some. I have a friend with several competitive gamers in the house who tax his gigabit FiOS connection. There are doctors with direct connections to hospitals who can use a gigabit to view complex imaging files. There are specialty engineers, data scientists, animators, and others who could use a gigabit and more if working from home. But most homes don’t use services that can use that much bandwidth.

The product on the near horizon that could use multi-gigabit bandwidth is 3D holograms as part of immersive virtual reality and telepresence. I keep waiting for somebody to offer such a product. It wouldn’t be hard to imagine hundreds of thousands of homes trying this almost immediately. My guess is that the roadblock to much faster services is the underlying middle-mile backbones. I don’t think most local ISPs have nearly enough backbone bandwidth to support multiple customers using a dedicated gigabit of bandwidth.

The other impediment to superfast broadband products is upload bandwidth. Telepresence is a 2-way service, and even if it can work on a gigabit download connection, there is no chance of such a service working on cable company networks where upload speeds are a minuscule fraction of download speeds. According to OpenVault, over 14% of homes now buy a gigabit download connection, but I have to imagine a large percentage of these connections are on cable companies.

It’s easy to write off fast broadband speeds as vanity purchases, and to some degree, that’s true. But the industry is now facing a classic chicken-and-egg dilemma. We won’t get applications that need faster broadband until there is some critical mass of homes ready to use them – and homes have little reason to buy faster broadband until those applications exist.

The blog says that Google will be discussing some of these issues in the coming weeks, including the upgrade of networks and maximizing speeds inside homes.

Categories
The Industry

Multi-gigabit Broadband

AT&T recently announced multi-gigabit broadband plans on its fiber connections. The company has priced 2-Gbps broadband at $110 per month and 5-Gbps broadband at $180. AT&T isn’t the first company to offer multi-gigabit broadband speeds and joins other large ISPs:

  • Google Fiber has the most affordable 2-Gbps plan that I can find at $100 per month.
  • Ziply Fiber, which purchased and is upgrading the former Frontier properties in the northwest, is selling 2-Gbps broadband for $120 and 5-Gbps broadband for $300.
  • Comcast has priced a 3-Gbps broadband connection at $300. The 3-Gbps product is likely only available where Comcast has built fiber.
  • There are smaller ISPs, municipalities, and cooperatives offering speeds faster than 1 Gbps.

For now, multi-gigabit broadband is mostly a marketing gimmick. It’s a way for an ISP to tell the public that its networks are fast. But the same thing was said about Google Fiber in 2012 when the company introduced one-gigabit fiber at a time when the primary broadband products provided by cable companies were at 30 Mbps and 60 Mbps. In the decade since the Google Fiber announcement, the gigabit broadband product has been embraced by the public. OpenVault reported that at the end of the third quarter of 2021, 11.4% of all U.S. households were subscribed to gigabit broadband products.

I hear from skeptics often who say that no home needs a gigabit broadband connection, let alone something faster. But the market is telling us that people are willing to pay for gigabit speeds. Subscriptions to gigabit broadband leaped during the pandemic. My guess is that a lot of homes using cable companies upgraded to faster speeds to find a broadband product that would allow them to better work from home. People found the upload speeds on normal cable products to be limiting and upgraded to faster broadband packages to get better performance. I’ve always wondered if that worked, because from many of the speed test results I’ve seen, even the gigabit products on cable companies often have measured upload speeds of only 20 Mbps, with the fastest I’ve ever seen at 40 Mbps.

Gigabit products on fiber are a totally different broadband product than what is offered by cable companies. Most fiber broadband products have symmetrical upload and download speeds – and even the ones that aren’t symmetrical are far faster than products offered by the cable companies. Fiber has lower latency and jitter, so data transmissions are clean and fast. I’ve always wondered why homes with a symmetrical 250 Mbps or 400 Mbps fiber connection would upgrade to something faster – but ISPs tell me that people are ponying up for a gigabit.

There is one benefit of fast broadband speeds that we don’t talk about enough. A lot of homes have serious challenges in deploying WiFi. There can be major issues in propagating WiFi in homes with multiple stories, older homes built with plaster walls, or homes that want WiFi to reach nearby sheds and barns. A stronger broadband input means that the WiFi signal will be stronger throughout the house.

It’s unlikely, for now, that ISPs will be selling very many subscriptions to multi-gigabit broadband. The most likely to succeed is Google Fiber, which has priced 2 gigabits at $100. It’s obvious that companies that set the price at $300 per month don’t expect many folks to buy. But I have to wonder if, in ten years, 2-gigabit broadband will be a common product.

Categories
The Industry

Mediacom and West Des Moines

In 2020, the City of West Des Moines, Iowa announced it was building a fiber conduit network to pass all 36,000 residences and businesses in the city. It was a unique business model that can best be described as open-access conduit. What sets this arrangement apart is that conduit will be built along streets and into yards and parking lots to reach every home and business. The City is spending the money up front to cross the last hundred feet.

The City’s announcement also said that the conduit network is open access and is available to all ISPs. Google Fiber was announced as the first ISP tenant and agreed to serve everybody in the city. This means that Google Fiber will have to pay to pull fiber through the conduit system to reach customers.

Mediacom, the incumbent cable company in the city, sued West Des Moines and argued that the City had issued municipal bonds for the benefit of Google Fiber. The suit also alleges that the City secretly negotiated a deal with Google Fiber to the detriment of other ISPs. The suit claims Google Fiber had an advantage since one of the City Commissioners was also the primary Google Fiber lobbyist in the state.

As is usual with such suits, outsiders have no idea of the facts, and I’m not taking sides with either of the parties. A recent article said the two sides are nearing a settlement, and if so, we might never learn the facts. I find the lawsuit interesting because it raises several thorny issues.

A lot of cities are considering open-access networks. Politicians and the public like the idea of having a choice between multiple ISPs. But this suit raises an interesting dilemma that cities face. If a city launches an open-access network with only one ISP, like in this case, that ISP gets a huge marketing advantage over any later ISPs. On an open-access network, no ISP has a technological advantage – every ISP that might come to West Des Moines will be providing fiber broadband.

If Google Fiber is first to market, it has an opportunity to sign everybody in the city who prefers fiber broadband over cable broadband. In the case of West Des Moines, each future ISP would also have to pay to pull fiber through the network, and a second ISP might have a hard time justifying this investment if Google Fiber already has a large market share.

From my understanding of the West Des Moines business model, the City needs additional ISPs to recover the cost of building the network – the City clearly intends to bring the benefits of open access to its citizens. It’s hard to believe the City would intentionally give an unfair advantage to Google Fiber. But did it inadvertently do so by giving Google Fiber the chance to gain a lock-down market share by being first?

Another interesting question this suit raises is whether Mediacom considered moving onto the fiber network. When somebody overbuilds a market with fiber, the cable company must be prepared to compete against a fiber ISP. But in West Des Moines and a few other open-access networks like Springfield, Missouri, the cable company has a unique option – it could also jump onto the fiber network.

It would be interesting to know if Mediacom ever considered moving to fiber. The company already has most of the customers in the market, and one would think it could maintain a decent market share if it went toe-to-toe with Google Fiber or another ISP by also competing using fiber. It would be a huge decision for a cable company to make this leap because it would be an admission that fiber is better than coaxial networks – and this switch probably wouldn’t play well in other Mediacom markets. I also think that cable companies share a characteristic with the big telcos – it’s probably challenging for a cable company to swap to a different technology in only a single market. Every back-office and operational system of the cable company is geared towards coaxial networks, and it might be too hard for a cable company to make this kind of transition. I’m always reminded that when Verizon decided to launch its FiOS business on fiber, the company decided that the only way to do this was to start a whole new division that didn’t share resources with the copper business.

Finally, one issue this suit raises for me is to wonder what motivates ISPs to join an open-access network in today’s market. I understand why small ISPs might do this – they get access to many customers without making a huge capital investment. But there is a flip side: there can be a huge financial penalty for an ISP that pursues open access rather than building a network. In the last few years, we’ve seen a huge leap in the valuation multiple applied to facility-based fiber ISPs. When it comes time for an ISP to sell a market, or even to leverage an existing market for borrowing money, a customer on a fiber network that is owned by an ISP might easily be worth ten times more than that same customer on a network owned by somebody else.

That is such a stark difference in value that it makes me wonder why any big ISP would join an open-access network. Open-access is an interesting financial model for an ISP because it can start generating positive cashflow with only a few customers. But is the lure of easy cash flow a good enough enticement for an ISP to forego the future terminal value created by owning the network? This obviously works for some ISPs like Google Fiber, which seems to only want to operate on networks owned by others. But consider a small rural telco that might be located outside of West Des Moines. The telco could generate a much higher value by building to a few thousand customers in a market outside West Des Moines than by adding a few thousand customers on the open-access network.

The giant difference in terminal value might explain why open-access networks have such a hard time luring ISPs. It probably also answers the question of why a cable company like Mediacom is not jumping to join somebody else’s network. It’s an interesting financial debate that I’m sure many ISPs have had – is it better to go for the quick and easy cash flow from open access, or to take more risk and hope for the much bigger valuation that comes from building and owning the network and the customers?

Categories
What Customers Want

Being Stingy with Broadband Speeds

I’ve never understood ISPs that build fiber networks and then sell small-bandwidth products. The fiber technologies in place today can easily provide gigabit speeds to every customer without straining the network. The cost of providing 10-gigabit electronics keeps dropping and is now only a few hundred dollars extra per customer. Why would a fiber-based ISP have speed tiers that provide 50 Mbps?

I’ve found that it’s not unusual for an ISP with a low-bandwidth product on fiber to also charge a lot for gigabit bandwidth. There are a number of ISPs that charge $150 to $200 for a residential gigabit bandwidth product.

Bandwidth pricing philosophies differ around the industry. There are ISPs, like Google Fiber and Ting, that only offer gigabit broadband. These ISPs are declaring that their fiber is a faster technology, and they are marketing based upon that technology advantage.

The big ISPs in the country have trained the public to believe that extra bandwidth is expensive. The cellular companies are the kings of this game and will sell an extra gigabyte of usage for as much as $10. The big ISPs like Comcast and AT&T charge a lot to any customer that exceeds an arbitrary data cap. These pricing philosophies make a lot of money for the big ISPs, but they convey the false message that extra customer usage drives up costs for an ISP, and that customers who use more data ought to pay more.

Anybody who understands how ISPs operate realizes that there is little or no incremental cost for a given customer to use more bandwidth. It seems counterintuitive, but a household that uses a terabyte of download for a month doesn’t cost the ISP any more than the customer that uses 200 gigabytes per month – all due to the way that ISPs pay for wholesale broadband. ISPs buy enough bandwidth to satisfy the busiest hour of the day – and the rest of the day a lot of the bandwidth sits unused. There was zero incremental cost to ISPs for wholesale broadband when their customers started downloading and uploading a lot more data during the daytime due to COVID-19 – because the daytime usage still didn’t exceed the evening busy hour.
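
To make the busy-hour point concrete, here’s a minimal sketch – a toy model with made-up hourly demand figures of my own, not real ISP data – showing how the capacity an ISP must purchase tracks the peak hour rather than total volume:

```python
# Toy model: an ISP must buy enough wholesale capacity for the busiest hour;
# every other hour rides on capacity that is already paid for.
# The hourly demand figures below are illustrative, not real measurements.

hourly_demand_gbps = [2, 1.5, 1, 1, 1, 1.5, 2, 3, 3, 3, 3, 3,
                      3, 3, 3, 4, 4, 7, 9, 10, 10, 9, 6, 3]

busy_hour = max(hourly_demand_gbps)
daily_volume_tb = sum(g * 3600 / 8 / 1000 for g in hourly_demand_gbps)  # Gb -> TB

print(f"Capacity to purchase: {busy_hour} Gbps (the evening busy hour)")
print(f"Total volume carried: {daily_volume_tb:.1f} TB per day")

# Double all daytime usage (hours 8-16), as happened during COVID-19:
with_wfh = [g * 2 if 8 <= hour <= 16 else g
            for hour, g in enumerate(hourly_demand_gbps)]
print(f"Busy hour after doubling daytime usage: {max(with_wfh)} Gbps")
# The peak is still the evening hour, so the added volume costs nothing extra.
```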

An argument can be made that faster speeds are more efficient for an ISP. Consider the difference between two customers that each download a 1-gigabyte data file. A customer with a 50 Mbps product is using the network, and potentially interfering with other traffic, for twenty times longer than the customer with a gigabit product. Faster speeds reduce collisions between data streams, and a fiber network with fast customer speeds is significantly more efficient than one with slower speeds.
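
A quick worked example – my own arithmetic, not figures from the blog – shows where the twenty-times number comes from:

```python
# How long does a 1-gigabyte file occupy the network at different speeds?
FILE_BITS = 1 * 8e9  # 1 gigabyte expressed in bits

for speed_mbps in (50, 1000):
    seconds = FILE_BITS / (speed_mbps * 1e6)
    print(f"{speed_mbps:>4} Mbps: {seconds:5.0f} seconds")

# 50 Mbps -> 160 seconds; 1,000 Mbps (a gigabit) -> 8 seconds.
# The slower connection occupies the network 20 times longer for the same file.
```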

This is not to say that I’m advocating that ISPs should sell only the gigabit product since that is the most efficient use of a network. I think ISPs with only a gigabit product are leaving revenues on the table – unless the product is priced low enough to be affordable for everybody. I think companies that offer only a gigabit product at $70 or $80 are pricing broadband out of the financial reach of many homes.

I get to peek behind the curtain of a lot of ISPs, and I know that an ISP with a smart tier of products can have more customers and more revenues than the ISP with only one product. I’m positive that an ISP with a $60, a $70, and an $80 product will do better than an ISP with only a $70 gigabit product.

The COVID-19 pandemic has finally forced the industry to confront broadband affordability. Even in markets where there is fast broadband available, we found out during the pandemic that there are a lot of homes without broadband for students because their parents can’t afford it. ISPs have not cared much about the homes that can’t afford broadband – and in most markets, that’s anywhere from 10% to 40% of the market. ISPs have been happy selling expensive broadband to those that can afford their prices and have given little thought to those that can’t.

I am truly puzzled why ISPs with fiber networks have broadband products between 25 Mbps and 75 Mbps. That’s like buying a race car and driving it on the freeway at 25 miles per hour. A cable company is not afraid of a competitor that wants to fight the market battle at speeds the cable company can match. The cable companies are afraid of ISPs offering affordable symmetrical data products they can’t match.

Categories
Current News

Google Fiber Comes to Iowa

The City of West Des Moines recently announced a deal with Google Fiber to bring fiber past all 36,000 residences and businesses in the city. This is a unique business model that can best be described as open-access conduit.

The city says that the estimated cost of the construction is between $35 million and $40 million and that the construction of the network should be complete in about two-and-a-half years. The full details of the plan have not yet been released, but the press is reporting that Google Fiber will pay $2.25 per month to the city for each customer that buys service from Google Fiber.

What is most unique about this arrangement is that conduit will be built along streets and into yards and parking lots to reach every home and business. I know of many cities that lease out some empty conduit to ISPs and carriers, but the big limitation of most empty conduit is that it doesn’t provide easy access to get from the street to reach a customer. West Des Moines will be spending the money to build the conduit to serve the last hundred feet.

This business arrangement will still require Google Fiber to pull fiber throughout the entire empty conduit network – but that is far cheaper for the company than building a network from scratch. The big cost of building any fiber network is the labor needed to bring the fiber along every street – and the city has absorbed that cost. The benefit of this arrangement for Google Fiber is obvious – the company saves the cost of building a standalone fiber network in the City. It’s the cost of financing expensive networks up-front that makes ISPs hesitant to enter new markets.

From a construction perspective, I’m sure that the City is building the network with some form of innerduct – conduit with multiple interior tubes that can each accommodate fiber (as shown in the picture accompanying this blog). This would allow additional ISPs to coexist in the same conduits. If the conduits built through yards also include innerduct, it would make it convenient for a customer to change fiber ISPs – disconnect the fiber from ISP A and connect to the fiber from ISP B.

The City is banking on other ISPs using the empty conduit because Google Fiber fees alone won’t compensate the city for the cost of the conduit. The press reported that Google Fiber has guaranteed the City a minimum payment of at least $4.5 million over 20 years. I’m sure the City is counting on Google Fiber to perform a lot better than that minimum, but even if Google Fiber connects to half of all of the customers in the City, the $2.25 monthly fee won’t repay the City’s cost of the conduit.
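
The arithmetic behind that claim is simple. Here’s a rough sketch using the reported figures plus an assumed 50% take rate – my assumption, not anything from the City:

```python
# Back-of-the-envelope: can the $2.25 monthly fee repay the conduit network?
passings = 36_000    # residences and businesses passed (from the announcement)
take_rate = 0.50     # assumed share of the market buying from Google Fiber
monthly_fee = 2.25   # per-customer payment to the City
years = 20           # horizon of the reported minimum guarantee

total_fees = passings * take_rate * monthly_fee * 12 * years
print(f"20-year fee revenue at a 50% take rate: ${total_fees:,.0f}")
# -> $9,720,000, far short of the $35-$40 million construction cost -
# hence the City's need for additional ISPs on the network.
```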

This business model differs significantly from the typical open-access network model. In other open-access networks, the city pays for 100% of the cost of the network and the electronics up to the side of a home or business. The typical monthly fee for an ISP to reach a customer in these open-access networks ranges between $30 and $45 per month. Those high fees invariably push ISPs into cherry-picking and only pursuing customers willing to pay high monthly rates. The $2.25 fee in West Des Moines won’t push ISPs to automatically cherry-pick or charge a lot.

Any ISP willing to come to the city has a few issues to consider. They avoid the big cost of constructing the conduit network. But a new ISP will still need to pay to blow fiber through the conduit. Any new ISP will also be competing against Google Fiber. One of the most intriguing ISPs already in the market is CenturyLink. The company has shown in Springfield, Missouri that it is willing to step outside the traditional business model and use somebody else’s network. I would have to imagine that other ISPs in the Midwest perked up at this announcement.

In announcing the network, the City said it hoped this network would bring fiber to everybody in the City. Google Fiber doesn’t typically compete on price. Earlier this year, Google Fiber discontinued its 100 Mbps broadband connection for $50. Many homes are going to find the $70 gigabit product from Google Fiber to be unaffordable. It will be interesting over time to see how the City plans on getting broadband to everybody. Even municipalities that own their own fiber networks are struggling with the concept of subsidizing fiber connections below cost to make them affordable.

One thing this partnership shows is that there are still new ideas to try in the marketplace. To be effective, an open-access conduit system needs to attract multiple ISPs, so this idea isn’t going to work in markets much smaller than West Des Moines. But this is another idea for cities to consider if the goal is to provide world-class broadband for citizens and businesses.

Categories
Regulation - What is it Good For? The Industry

Setting the Definition of Broadband

One of the commenters on my blog asked a good question – can’t we set the definition of broadband by looking at the broadband applications used by the typical household? That sounds like a commonsense approach to the issue and is exactly what the FCC did when they set the definition of broadband to 25/3 Mbps in 2015. They looked at combinations of applications that a typical family of four might use in an evening, with the goal that a household ought to have enough broadband to comfortably do those functions at the same time. This might best be described as a technical approach to defining broadband – look at what households are really using and make sure that the definition of broadband is large enough to cover the expected usage for a typical household.

Taking this approach raises the bigger question – what should the policy be for setting the definition of broadband? I don’t know that I have any answers, but I ask the following questions:

  • The FCC largely conducted a thought experiment when setting the 25/3 definition of broadband – they didn’t try to measure the bandwidth used in the scenarios they considered. If the FCC had measured real homes doing those functions, they likely would have found that bandwidth needs were different than they had estimated. Some functions use less bandwidth than they had supposed. But usage also would have been larger than they had calculated, because the FCC didn’t account for WiFi overheads and machine-to-machine traffic. As a household makes use of multiple simultaneous broadband functions, the WiFi networks we all use bog down when those applications collide with each other inside the home network. The busy-hour behavior of our home networks needs to be part of a mathematical approach to measuring broadband (see the sketch after this list).
  • The FCC could have gotten a better answer had they hired somebody to measure evening broadband usage in a million homes. We know that broadband usage is like anything else, and there are households that barely use broadband and others that use it intensely. The idea of pinpointing the usage of a typical family is a quaint idea when what’s needed is to understand the curve of broadband usage – what’s the percentage of homes that are light, average, and heavy users. I’m sure that one of the big companies that track broadband usage could measure this somehow. But even after making such measurements, we need a policy. Should the definition of broadband be set to satisfy the biggest broadband users, or something else like the median speed used by households? Analytics can only go so far, and at some point there has to be a policy. It’s not an easy policy to establish – if the definition of broadband is set anywhere below the fastest speeds used by households, then policymakers are telling some households that they use too much broadband.
  • If we are going to use measurements to determine the definition of broadband, then this also has to be an ongoing effort. If 25/3 was the right definition of broadband in 2015, how should that definition have changed when homes routinely started watching 4K video? I don’t think anybody can deny that households use more broadband each year, and homes use applications that are more data intensive. The household need for speed definitely increases over time, so any policy for setting a definition of broadband needs to recognize that the definition must grow over time.
  • One fact that is easy to forget is that the big cable companies now serve two-thirds of the broadband customers in the country, and any discussion we have about a definition of broadband is only considering how to handle the remaining one-third of broadband users. There is a good argument to be made that the cable companies already define the ‘market’ speed of broadband. The big cable companies all have minimum broadband speeds for new customers in urban markets today between 100 Mbps and 200 Mbps. The companies didn’t set these speeds in a vacuum. The cable companies have unilaterally increased speeds every 3-4 years in response to demands from their customers for faster speeds. I think there is a valid argument to be made that the market speeds used to serve two-thirds of the customers in the country should be the target broadband speed for everybody else. Any policymaker arguing that 25/3 Mbps should still be the definition of broadband is arguing that one-third of the country should settle for second-class broadband.
  • In a related argument, I harken back to a policy discussion the FCC used to have when talking about broadband speeds. I can remember a decade or more ago when the FCC generally believed that rural broadband customers deserved access to the same speeds as urban customers. That policy was easy to support when cable networks and telco copper networks both delivered similar speeds. However, as cable broadband technology leaped ahead of copper and DSL, these discussions disappeared from the public discourse.
  • When looking at grant programs like the upcoming RDOF program, where the funded networks won’t be completed until 2027, any definition of broadband for the grants needs to look ahead to what the speeds might be like in 2027. Unfortunately, since we can’t agree on how to set the definition of broadband today, we have no context for talking about future speeds.
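
As referenced in the first bullet, a household’s busy-hour need can be approximated by summing simultaneous applications and adding WiFi overhead. Here’s a minimal sketch of that kind of thought experiment – all the application figures are my own illustrative estimates, not FCC numbers:

```python
# Sum the applications a family of four might run at once in the evening
# busy hour, then add WiFi overhead. All figures are illustrative estimates.
evening_apps_mbps = {
    "4K video stream": 15,
    "HD video stream": 5,
    "video call": 3,
    "online gaming": 3,
    "schoolwork / web browsing": 2,
    "smart-home / background traffic": 2,
}

raw_demand = sum(evening_apps_mbps.values())
WIFI_OVERHEAD = 1.3  # assume ~30% lost to WiFi contention and protocol overhead

print(f"Simultaneous application demand: {raw_demand} Mbps")
print(f"Delivered speed needed with WiFi overhead: {raw_demand * WIFI_OVERHEAD:.0f} Mbps")
# 30 Mbps of applications becomes ~39 Mbps of needed capacity - already past
# a 25 Mbps download definition before measuring any real homes.
```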

These are not easy questions. If the FCC was doing its job we would be having vigorous discussions on the topic. Sadly, I don’t foresee any real discussions at the FCC about the policy for setting the definition of broadband. The FCC has hunkered down and continues to support the 25/3 definition of broadband even when it’s clear that it’s grown obsolete. This FCC is unlikely to increase the definition of broadband, because in doing so they would be declaring that millions of homes have something less than broadband. It seems that our policy for setting the definition of broadband is to keep it where it is today because that’s politically expedient.

Categories
What Customers Want

Setting Broadband Rates

One of the more interesting things about being a consultant is that I often get to work with new ISPs. One of the topics that invariably arises is how to set rates. There is no right or wrong answer, and I’ve seen different pricing structures work in the marketplace. Most rate structures fit into one of these categories:

  • Simple rates with no discounts or bundling;
  • Rates that mimic the incumbent providers;
  • High rates, but with the expectation of having discounts and promotions;
  • Complex rates that cover every imaginable option.

Over the years I’ve become a fan of simple rate structures for several reasons:

  • Simple rates make it easy for customer service reps and other employees.
  • It’s easy to advertise simple rates: “Our rates are the same for everybody – no gimmicks, no tricks, no hidden fees”.
  • It’s easy to bill simple rates. Nobody has to keep track of when special promotions are ending. Simple rates largely eliminate billing errors.
  • It eliminates the process of having to negotiate prices annually with customers. That’s an uncomfortable task for customer service reps. There are customers in every market who chase the cheapest rates and the latest special. Many facility-based ISPs have come to understand that such customers are not profitable if they only stay with the ISP for a year before chasing a cheaper rate elsewhere.
  • It’s easier for customers. Customers appreciate simple, understandable bills. Customers who don’t like to negotiate rates don’t get annoyed when their neighbors pay less than them. Simple rates make it easy to place online orders.

As a consumer I like simple rates. When Sling TV first entered the market they had two similar channel line-ups to choose from, with several additional options on top of each basic package. Since they were the only online provider at the time, I waded through the process of comparing the packages. But I was really annoyed that they made me do so much work to buy their product, and when a simpler provider came along I jumped ship. To this day I can’t figure out what Sling TV gained from making it so hard to compare their options.

ISPs can be just as confusing. I was looking online the other day at the packages offered by Cox. They must have fifty or sixty different triple and double play packages online and it’s virtually impossible for a customer to wade through the choices unless they know exactly what they want.

There are fiber overbuilders who are just as confusing. I remember looking at the pricing list of one of the earliest municipal providers. They had at least a hundred different speed combinations of upload and download speeds. I understand the concept of giving customers what they want, but are there really customers in the world who care about the difference between speed combinations like 35/5 Mbps, 38/5 Mbps, or 35/10 Mbps? I know several smaller ISPs who have as many options as Cox and have a different product name for each unique combination of broadband, video, and voice.

There is such a thing as being too simple. Google Fiber launched in Kansas City with a single product, $70 gigabit broadband. They were surprised to find that a lot of customers wouldn’t consider them since they didn’t offer video or telephone service. Over a few years, Google Fiber introduced simple versions of those products and now also offers a 100 Mbps broadband product for $50. Even with these product additions, they still have one of the simplest product lineups in the industry – and they are now attractive to a lot more homes.

I know ISPs with complicated rates that have achieved good market penetration. But I have to wonder if they would have done even better had they used simpler rates and made it easier on their staffs and the public.

Categories
Technology The Industry

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures with their fiber network. They are giving customers two months of free service and sending them back to the incumbent ISPs in the city. The company used a construction technique called micro-trenching, where they cut a tiny slit in the road, one inch wide and a few inches deep, to carry the fiber. Only a year after construction, the fiber is popping out of the micro-trenches all over the city.

Everybody I’ve talked to is guessing that it’s a simple case of ice heaving. While a micro-trench is sealed, it’s likely that small amounts of moisture seep into the sealed micro-trench and freeze when it gets cold. The first freeze would create tiny cracks, and with each subsequent freeze the cracks would get a little larger until the trench finally fills up with water, fully freezes, and ejects the fill material. The only way to stop this would be to find a permanent seal that never lets in moisture. That sounds like a tall task in a city like Louisville that might freeze and thaw practically every night during the winter.

Nobody other than AT&T or Charter can be happy about this. The reason that Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block Google Fiber from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google’s favor, while the Charter suit is still in court. Perhaps Google Fiber should have just waited out the lawsuits – but the business pressure was there to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson learned is not to launch a new network using an untried and untested construction technique. In this case, the micro-trenches didn’t just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix for the problem would be to build the network again from scratch, which makes no financial sense.

Certainly, the whole industry is now going to be extremely leery about micro-trenching, but there is a larger lesson to be learned from this. For example, I’ve heard from several small ISPs who are ready to leap into the 5G game and build networks using millimeter-wave radios installed on poles. This is every bit as new and untested a technology as micro-trenching was. I’m not predicting that anybody pursuing that business plan will fail – but I can assuredly promise that they will run into unanticipated problems.

Over my career, I can’t think of a single example where an ISP that took a chance on a cutting-edge technology didn’t have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. For example, I can remember half a dozen companies that tried to deploy broadband networks using the LMDS spectrum. I remember one case where the radios literally never worked and the venture lost their $2 million investment. I remember several others where the radios had glitches that caused major customer outages and were largely a market disaster.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generations of other technologies, including DSL, WiFi mesh networks, PON fiber-to-the-home, and IPTV. The companies that were the first pioneers deploying these technologies had costly and sometimes fatal problems. So perhaps the lesson learned is that pioneers pay a price. I’m sure that this failure of micro-trenching will result in changing or abandoning the technique. Perhaps we’ll learn not to use micro-trenches in certain climates. Or perhaps somebody will find a way to seal the micro-trenches against humidity. But none of those future solutions will make up for Google Fiber’s spectacular failure.

The real victims of this situation are the households in Louisville who had changed to Google Fiber – and everybody else in the City. Because of Google Fiber’s lower prices, both Charter and AT&T lowered prices everywhere in the city. You can bet it’s not going to take long to get the market back to full prices. Any customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no real incentive to give them a low-price deal. As a whole, every household in the City is going to be spending $10 or $20 more per month for broadband – which is a significant penalty on the local economy.

Categories
Regulation - What is it Good For?

Are There Any Level Playing Fields?

If you follow regulatory filings, one of the most common arguments you will encounter from the big ISPs is the concept of a level playing field. The idea behind the level playing field is that every competitor in the industry should be working from the same set of rules and nobody should have a market advantage due to regulatory rules. AT&T and Verizon have both rolled out the argument many times when arguing to tighten rules against potential competitors.

There are several good examples of the level playing field argument anywhere that the big ISPs fight to keep municipal entities from building fiber networks. They argue, for example, that municipal entities have an unfair market advantage because they don’t pay state and federal income taxes. But this argument falls apart quickly under examination. First, many municipal ventures such as electric or broadband entities make payments in lieu of taxes. This is a tax-like fee that the local government charges to a municipal business. While it’s not really a tax, the fees act like taxes and can be substantial.

Even more importantly, I can remember many years when AT&T or Verizon made the news for paying no federal income taxes. Big corporations have numerous tax shelters that allow them to shield income from taxes, and the telcos have gotten numerous favorable rules into the tax code that allow them to walk away from most of their expected tax obligations. You can’t really fault a big corporation for legally avoiding taxes (unless you fault them for the lobbying that slanted the tax codes in their favor to begin with). It’s dishonest for these big ISPs to claim that a municipality has an advantage due to its tax-free status when they pay little or no taxes themselves. Under deeper examination, a municipal fiber venture paying 5% of revenues in payments in lieu of taxes is often paying a larger percentage of taxes than the big ISPs.

The big ISPs also claim that municipalities have an unfair advantage due to being able to finance fiber networks with municipal bonds. While it’s true that bonds often have a lower interest rate, I have compared bond and bank financing side-by-side many times, and for various reasons that are too long to discuss in a blog, bond financing is usually more expensive than commercial loans. It’s also incredibly difficult for a municipality to walk away from a bond obligation, while we have numerous examples, such as the Charter bankruptcy a few years back, of big ISPs walking away from repaying the debt used to build their networks.

The big ISPs don’t only use this argument against municipal competitors. AT&T is using the argument as a way to justify hanging 5G wireless devices on poles everywhere. They think there should be a level playing field for pole access, although at this early stage they are one of the few companies looking to deploy 5G small cells. Interestingly, while AT&T wants the right to easy and cheap pole access everywhere, in those places where they own the poles they fight vigorously to keep competitors from getting access. They effectively stopped Google Fiber plans to build in Silicon Valley by denying them access to AT&T poles.

Every time I hear the level playing field argument my first thought is that I would love it if we really had a level playing field. I look at the way that the current FCC is handing the big ISPs their wish list of regulatory rule changes and wish that my clients could get the same kind of favorable regulatory treatment.

A good case in point is again the 5G small cell deployment issue. The FCC has already said that it is in favor of making it cheap and easy for wireless carriers to deploy 5G cell sites. It seems likely that the FCC is going to pass rules to promote 5G deployments unless Congress beats them to the punch. Yet these regulatory efforts to make it easier to deploy 5G conveniently ignore making it easier to deploy fiber. If things go in favor of the big ISPs, they will have a market advantage where it’s easier to deploy last-mile 5G than last-mile fiber. This will give them a speed-to-market advantage that will let them try to squash anybody trying to compete against them with an FTTP network.

The FCC is supposedly pro-competition, and so if we really had a level playing field they would be passing rules to make it easier to deploy all broadband technologies. They have had decades to fix the pole attachment issues for fiber deployment and have not done so. But now they are in a rush to allow for 5G deployments, giving 5G ISPs a market advantage over other technologies. The consequences for this will be less competition, not more, because we’ve already seen how AT&T and Verizon don’t really compete with the cable companies. In markets where we have both Verizon FiOS and Comcast cable networks both companies charge high prices and are happy with high-margin duopoly competition. There is no reason to think these big ISPs won’t do the same with 5G.

I look around and I don’t see any level playing fields – particularly not any that give small competitors any advantages over the big ISPs. I do, however, see scads of regulatory rules that provide unequal protection for the big ISPs, and with the current FCC that list of advantages is expanding quickly. The big ISPs don’t really want a level playing field because they don’t want actual competition. There are many reasons why other countries have far more last-mile fiber deployed than the US – but one of the biggest is the set of regulatory rules here that protect the big ISPs.

Categories
The Industry

Using Gigabit Broadband

Mozilla recently awarded $280,000 in grants from its Gigabit Communities Fund to projects that are finding beneficial uses of gigabit broadband. This is the latest set of grants, and the company has awarded more than $1.2 million to over 90 projects in the last six years. For any of you not aware of Mozilla, they offer a range of open-standard software that promotes privacy. I’ve been using their Firefox web browser and other software for years. As an avid reader of web articles, I use their Pocket app daily for tracking the things I’ve read online.

The grants this year went to projects in five cities: Lafayette, LA; Eugene, OR; Chattanooga, TN; Austin, TX; and Kansas City. Grants ranged from $10,000 to $30,000. At least four of those cities are familiar names. Lafayette and Chattanooga are two of the largest municipally-owned fiber networks. Austin and Kansas City have fiber provided by Google Fiber. Eugene is a newer name among fiber communities and is in the process of constructing an open access wholesale network, starting in the downtown area.

I’m not going to recite the list of projects and a synopsis of them is on the Mozilla blog. The awards this year have a common theme of promoting the use of broadband for education. The awards were given mostly to school districts and non-profits, although for-profit companies are also eligible for the grants.

The other thing these projects have in common is that they are developing real-world applications that require robust broadband. For example, several of the projects involve using virtual reality. There is a project that brings virtual reality to several museums and another that shows how soil erosion from rising waters and sediment mismanagement has driven the Biloxi-Chitimacha-Choctaw band of Indians from the Isle de Jean Charles in Louisiana.

I clearly remember getting my first DSL connection at my house after spending a decade on dial-up. I got a self-installed DSL kit from Verizon and it was an amazing feeling when I connected it. That DSL connection provided roughly 1 Mbps, which was 20 to 30 times faster than dial-up. That speed increase freed me up to finally use the Internet to read articles, view pictures and shop without waiting forever for each web site to load. I no longer had to download software updates at bedtime and hope that the dial-up connection didn’t crap out.

Gigabit broadband brings that same experience. When Google Fiber first announced it was going to build gigabit networks for households, most cable networks had maximum speeds of perhaps 30 Mbps – and Google was bringing more than a 30-times increase in speed.

Almost immediately we heard from the big ISPs who denigrated the idea saying that nobody needs gigabit bandwidth and that this was a gimmick. Remember that at that time the CEO of almost every major ISP was on the record saying that they provided more than enough broadband to households – when it was clear to users that they didn’t.

Interestingly, since the Google Fiber announcement the big cable companies have decided to upgrade their own networks to gigabit speeds and ISPs like AT&T and Verizon rarely talk about broadband without mentioning gigabit. Google Fiber reset the conversation about broadband and the rest of the industry has been forced to pay heed.

The projects being funded by Mozilla are just a few of the many ways that we are finding applications that need bigger broadband. I travel to communities all over the country and in the last year I have noticed a big shift in the way that people talk about their home broadband. In the past people would always comment that they seemed to have (or not have) enough broadband speed to stream video. But now, most conversations about broadband hit on the topic of using multiple broadband applications at the same time. That’s because this is the new norm. People want broadband connections that can connect to multiple video streams simultaneously while also supporting VoIP, online schoolwork, gaming and other bandwidth-hungry applications. I now routinely hear people talking about how their 25 Mbps connection is no longer adequate to support their household – a conversation I rarely heard as recently as a few years ago.

We are not all going to grow into needing gigabit speeds for a while. But the same was true of my first DSL connection. I had that connection for over a decade, and during that time my DSL got upgraded once to 6 Mbps. But even that eventually felt slow, and a few years later I was the first one in my area using the new Verizon FiOS and a 100 Mbps connection on fiber. ISPs are finally facing up to the fact that households are expecting a lot of broadband speed. The responsive ISPs are meeting this demand, while others bury their heads in the sand and try to convince people that their slower broadband speeds are still all that people need.
