Generations Matter

Nielsen recently published its Total Audience Report for Q1 2017. It’s the best evidence I’ve seen yet that there is a huge difference between generations when it comes to video viewing habits. Compared to most surveys that look at a few thousand people, these statistics are based on almost 300,000 households.

The report examined in detail the viewing habits of the different US generations – Generation Z (ages 2 – 20), Millennials (ages 21 – 37), Generation X (ages 38 – 52), Baby Boomers (ages 53 – 70) and the Greatest Generation (ages 71+). What might surprise a lot of people is that Generation Z and the Millennials together now make up 48% of the US population – and that means their viewing habits are rapidly growing in importance to the cable TV industry.

The report outlines how the various generations own or use various devices and services. Note that these responses represent the entire household – so, for example, when Nielsen sought answers from somebody in Generation Z, the answers likely represent what is owned by their parents, who are probably Millennials or in Generation X. Here are a few interesting statistics:

  • The broadband penetration rate between generations is about the same, ranging from 82% to 85% of households. It wasn’t too many years ago when the baby boomer households lagged in broadband adoption.
  • There is a significant difference in the use of OTT services like Netflix. 73% of homes representing Generation Z subscribe to an OTT service, compared to only 51% of baby-boomer-only households.
  • Baby boomers also lag in smartphone adoption at 86% with the younger generations all between 95% and 97% adoption.
  • Baby boomers also lag in the adoption of an enabled smart TV (meaning it’s connected to the web). 28% of baby boomers have an enabled smart TV while younger households are at about 39%.

The biggest difference highlighted in the report is the daily time spent using various entertainment media, including TV, radio, game consoles, and surfing the Internet.

The big concern for the cable industry is the time spent watching cable content. For example, the average monthly TV viewing for those over 65 is 231 hours of live TV and 34 hours of time-shifted TV. But for people aged 12-17 it’s only 60 hours live and 10 hours time-shifted. For ages 18-24 it’s 72 hours live and 12 hours time-shifted. For ages 25-34 it’s 101 hours live and 19 hours time-shifted. This is probably the best proof of how much less invested younger generations now are in traditional TV.
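To make that gap concrete, here’s a quick back-of-the-envelope tally of the Nielsen figures above (a sketch in Python – the totals just add live and time-shifted hours):

```python
# Monthly Nielsen viewing hours cited above: (live, time-shifted)
viewing = {
    "65+":   (231, 34),
    "12-17": (60, 10),
    "18-24": (72, 12),
    "25-34": (101, 19),
}

base = sum(viewing["65+"])  # 265 total hours for those over 65
for group, (live, shifted) in viewing.items():
    total = live + shifted
    print(f"{group}: {total} hours/month, {base - total} fewer than the 65+ group")
# The 18-24 group totals 84 hours - just over 180 hours less per month
# than the 265 hours watched by those over 65.
```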

This drastic difference for TV stands out because other kinds of media don’t show such a stark divide. For example, those over 65 spend about 67 hours per month using apps on smartphones, while those 18-24 use 77 hours and those 25-34 use 76 hours.

There wasn’t even a drastic difference in the number of hours spent monthly watching video on a smartphone, with those over 65 watching 2 hours per month compared to 7 hours for those 18-24 and 6 hours for those 25-34.

The only other medium with a stark difference is video game consoles, with those over 65 using 13 hours per month while those 18-24 use 49 hours per month. Other things like listening to the radio or using a multimedia device (like Roku or Apple TV) are similar across generations.

The drastic difference in TV viewing has serious repercussions for the industry. For example, TV is no longer an effective medium for reaching those aged 18-24, since they watch over 180 hours less TV per month than those over 65. We’re seeing a big shift in advertising dollars, and during the last year the amount spent on web advertising surpassed TV advertising for the first time. When you trend this forward a decade it spells bad news for the broadcasting and cable industries. For many years there was a big hope that as people got older they would revert to the usage patterns of their parents. But the evidence shows that the opposite seems to be true – kids keep their viewing habits as they grow older.

When you compare this report to earlier ones it’s obvious that the difference between generations is widening. Comparing to 2016 alone, those over 65 are watching more TV each month while the youngest generations are cutting back over time – Generation Z watched 15 minutes less TV per day just since 2016.

Pent-up Customer Demand

I’ve recently read several articles discussing how the new ‘unlimited’ cellular plans have increased data demands. One article quoted analyst Chetan Sharma, who pointed to research done by Opanga Networks showing that Verizon’s daytime data traffic has doubled since the introduction of the unlimited cellular data plans.

These plans aren’t really unlimited, but they have increased the monthly data caps to much higher levels of 20 gigabytes or more per month. For the average cellular user this is a large enough increase to let them stop self-limiting their cellular data usage. It finally frees customers to use their cellphones the way they want.
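To get a rough sense of what a cap at that level buys, consider HD video, the hungriest everyday application. This is just a sketch – the 5 Mbps HD stream rate is an assumption in line with common streaming rates:

```python
# How many hours of HD video fit under a 20 GB monthly cap?
# Assumes a 5 Mbps HD stream - an illustrative rate, not carrier data.
hd_stream_mbps = 5
gb_per_hour = hd_stream_mbps * 3600 / 8 / 1000   # 5 Mbps ~= 2.25 GB/hour
cap_gb = 20

print(f"A {cap_gb} GB cap allows ~{cap_gb / gb_per_hour:.0f} hours of HD video per month")
# ~9 hours - enough that typical users stop rationing everyday usage,
# even though heavy video watchers would still hit the ceiling quickly.
```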

This phenomenon was expected and is familiar to any network owner who has ever done a major broadband network upgrade. I’ve worked with a number of companies over the years that have improved customer broadband and they always see a similar surge in customer broadband usage. For example, companies that have made the transition from DSL to fiber have seen this same immediate surge in customer use of the network.

But it doesn’t take a network upgrade to experience this kind of surge. I’ve had clients operating fiber networks that saw the same phenomenon when they increased network speeds. When one of my clients moved their basic broadband product from 10 Mbps to 50 Mbps they experienced almost the same thing as Verizon.

This surge comes from freeing pent-up customer demand for broadband. Customers limit their data usage when their broadband connection isn’t fast enough. For example, with a slow broadband connection they quickly learn that they can’t watch two different video streams simultaneously. Or parents might not let their kids game online while somebody else is watching streaming video. Customers quickly understand that slow download speeds impede their ability to do multiple things at the same time. And they learn to curtail their broadband usage accordingly.

But when customers find they can do multiple things at the same time, they do so. They begin to use their broadband for anything they want to do and they stop curtailing usage. When a lot of customers discover they are no longer throttled, the network owner experiences an immediate surge in broadband usage. Customers will use broadband in multiple ways simultaneously in the evenings. They will begin watching HD video instead of SD video. They will subscribe to OTT video services for the first time.

But speed is not the only thing that curtails customer usage. In the case of the unlimited wireless data plans it is the fear of exceeding a costly data cap that curtails usage. The same thing can happen for home broadband usage that has data caps – customers consciously don’t use bandwidth to avoid getting higher monthly bills.

There is an interesting thing that always happens following these data surges when customers are freed to do what they want. The amount of usage surges higher, like Verizon’s doubling, and then it flattens out at a higher usage level.

It’s been well known that home broadband usage, both in terms of desired speeds and total monthly downloads, has been doubling every three years for decades. Any time customer broadband usage is capped or curtailed, customers will catch up to this original curve once the restriction is lifted and will start looking like customers who never had broadband restrictions.
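Here’s a minimal sketch of that catch-up effect. The 50 GB/month starting point and the flat cap are illustrative assumptions, not measured data:

```python
# Household demand that doubles every three years. A customer held at a
# flat cap falls behind this curve, then snaps back to it when freed.
def demand_curve(base_usage_gb, years, doubling_period_years=3):
    """Project usage along the doubling-every-three-years curve."""
    return base_usage_gb * 2 ** (years / doubling_period_years)

capped_at = 50  # GB/month allowed under a hypothetical cap
for years in (0, 3, 6, 9):
    uncapped = demand_curve(50, years)
    print(f"year {years}: curve ~{uncapped:.0f} GB/month vs. {capped_at} GB capped")
# By year 9 the curve sits near 400 GB/month - an 8x gap that shows up
# as a sudden usage surge the moment the restriction is removed.
```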

ISPs need to be aware of this phenomenon. I still know of numerous fiber-to-the-home networks that have base data products of 10 Mbps or 20 Mbps. The owners of these networks are squelching their customers’ usage and they are dictating to customers what they can and cannot do.

The bigger ISPs understand this. The cable companies have kept ahead of the customer broadband demand curve by unilaterally increasing data speeds. In many markets the base broadband product is now at least 60 Mbps – higher than the FCC definition of broadband and higher than what most customers need today. Cable companies have learned that giving customers a little more broadband than they need stops most complaints about broadband.

Little ISPs and fiber network owners need to understand this as well. There is little excuse on a gigabit-capable network for a fiber owner to limit customers to speeds under 25 Mbps. Their base product ought to be at least as fast as what the big cable companies offer.

I know it is the fear of a surge in network usage that stops a lot of network owners from increasing speeds. I think a lot of them also don’t fully grasp the real implications of broadband demand constantly growing geometrically. When a network owner first set speeds at 10 Mbps that might have been a great speed – but it’s now holding customers back from using the data product they are paying for. I always ask network owners the question – why did you build a fiber network if you don’t want customers to use all of the broadband they want?

Decommissioning Rural Copper, Part 2

In the last blog I wrote about my belief that AT&T and Verizon want out of the rural wireline business. They both have plans to largely walk away from their rural copper networks and replace landline copper services with cellular service. Today I want to talk about what regulators ought to do with those networks.

When these two giant telcos walk away from rural copper they will inevitably harm rural America. While many homes will get the ‘privilege’ of buying high-priced cellular-based broadband, other homes are going to find themselves without telephone service if they happen to live in one of the many cellular dead zones. Such homes will not only be unable to benefit from cellular broadband, but if they have poor cell service they will find themselves cut off from voice communications as well.

As somebody who has traveled extensively in rural America I can tell you that there are a lot more cellular dead zones than people realize. And it’s not only farms – there are county seats in rural America where it’s difficult to get a working cellphone signal inside buildings.

As part of this transition both companies are going to walk away from a huge amount of existing copper cable. I think this copper cable is an incredibly valuable asset and that regulators ought not to allow them to tear it down.

The copper wire network today goes almost everywhere in rural America. Congressional laws and FCC policies led to most homes in the country getting access to the copper network. These copper wires occupy a valuable space on existing telephone poles – on the majority of rural poles the only two sets of wires are the power lines at the top and the telephone wires at the bottom.

If these copper wires are kept in place they could greatly reduce the cost of building rural fiber. It is far cheaper when building fiber to ‘lash’ the fiber onto an existing set of cables than to hang fiber from scratch. It was this construction technique that allowed Verizon to build a lot of its FiOS fiber network – they lashed fiber onto existing telephone wires. And my guess is that when Verizon decommissions urban copper they are still going to leave a lot of the copper wires in place as a guidewire for their fiber.

If these telcos are going to walk away from these copper wires, then they ought to be required to keep them in place for use by somebody else to hang fiber. Many states might force the big telcos to tear down the copper wires since they will eventually create safety hazards as they break away from poles if they aren’t maintained. But if somebody else is willing to take over that maintenance then it shouldn’t be an issue.

I can picture a regulatory process whereby some other carrier is allowed to come in and ‘claim’ the abandoned wires once they are empty of customers. That would allow fiber overbuilders or rural communities to claim this copper as an asset.

There is some salvage value to copper wires, and it’s possible, though not probable, that the value of the copper could exceed the cost to tear it down. So I can see the telcos fighting such an idea as a confiscation of their assets. But these rural wires have been fully depreciated for decades and the telcos have earned back the cost of these copper lines many times over. I believe that by abandoning the wires and depriving some homes of wireline service the big telcos will have forfeited any rights they might have to the remaining assets.

Anybody claiming the abandoned copper could use it in two ways. First, in many cases there is still life left in the copper, as witnessed by Frontier and CenturyLink rehabbing old rural copper with upgraded DSL. Local communities or small carriers could use the copper to bring the better services that the big telcos have refused to provide over the last few decades.

But more importantly these wires represent the cheapest path forward for building rural fiber. Anybody taking over the old copper can save a lot of fiber construction costs by lashing fiber onto the existing copper. If our nationwide goal is really to get better broadband to rural America, then offering abandoned copper to fiber builders might be one of the easiest tools available to help the process along.

The big telcos abandoned rural America decades ago. They stopped doing routine maintenance on rural copper and slashed the number of rural technicians. They now want to walk away from that copper and instead force rural America to buy cellular services at inflated prices. We owe it to the folks who paid for this copper many times over to get some benefit from it and to have an alternative to the new rural cellular monopolies.

Decommissioning Rural Copper

I’ve been watching AT&T and Verizon since I’ve been in the industry (including a short stint at Southwestern Bell in the early 80s). We are about to see both of these companies unravel their rural telco properties.

Verizon got ahead of the curve and has been selling off rural properties for a few decades, many of which ended up with Frontier. Verizon still serves some rural areas but has probably shed half of its rural customers. There are still big swaths of rural Verizon customers in Pennsylvania, New York, Maryland and other northeastern states. Verizon benefitted from these sell-offs by selling completely depreciated and poorly maintained networks at high prices – as evidenced by how much Frontier is struggling to cover its massive debts. AT&T has sold almost no rural properties and still serves gigantic rural areas in dozens of states.

Both companies are clearly on a path to tear down the remaining rural copper networks and replace them with cellular wireless networks. There are both pros and cons for these transitions for rural customers.

On the plus side, many of these rural areas have never had broadband since these big telcos never extended their DSL to their rural service areas. We know that they could have extended DSL, because we have hundreds of examples of independent telephone companies that brought DSL to all of their customers, no matter how remote. But the big companies stopped spending money on rural properties decades ago. The remaining copper is now in terrible shape and one has to imagine that cellular voice is probably often as good or better than voice over these old copper lines.

There will now be many customers who can buy fixed cellular broadband. This uses the same frequencies as the broadband for smartphones, but the cellular companies are pricing it to be a little less expensive. For many households fixed-cellular broadband will be the first real broadband alternative they have ever had.

But there are also big downsides to this shift from old copper to cellular networks. First, cellular networks are effective for only a few miles from any given cell site. Anybody who has driven in rural America knows that there are cellular dead spaces everywhere. Any customers living in the cellular dead spaces are going to be left with no communications to the outside world. They’ll lose their copper and they won’t have cellular voice or data. This will be a huge step backwards for many homes.

The big telcos will be taking advantage of the fact that, as cellular providers, they have no obligation to try to serve everybody. One of the reasons we had nearly ubiquitous telephone coverage in this country is that telcos were the carriers of last resort in their service areas. They were required by law to extend telephone service to all but extremely remote customers. But that obligation doesn’t apply to a cellular carrier. We already have tons of evidence that the cellular carriers make no apologies to homes that happen to be out of range of their cellular towers. With no copper landlines left we will have rural communications dead zones. It will be hard for anybody living in these dead zones to stay there, and certainly nobody is going to build a new home in a place that doesn’t have cellular service.

There is a downside even for those households that get fixed-cellular broadband. The speeds on this service are going to be slow by today’s standards, in the range of 10 – 15 Mbps for those that live relatively close to a cellular tower, but considerably slower for customers at greater distances. The real downside to getting cellular data is that the speeds are not likely to get better in rural America for many years, even decades. The whole industry is abuzz with talk about 5G cellular making a big difference, but it’s hard to see that technology making much impact in rural areas.

I think this transition away from copper is going to catch a lot of rural people by surprise. These two big telcos have already started the process of decommissioning copper, and once that gets full FCC approval the pace of decommissioning is likely to accelerate. I think a lot of homes are going to be surprised when they find out that the telcos no longer have an obligation to serve them.

Regulating Online Video Content

The Kommission für Zulassung und Aufsicht der Medienanstalten (ZAK) – the German equivalent of our FCC – recently concluded that OTT services ought to be regulated the same way as other broadcast radio and television networks. Specifically they were looking at Twitch.tv, the web gaming service, but the ruling could have far-reaching consequences.

I think the ruling raises two questions. First, should any regulatory body be regulating video content on the Internet? Second, why are we still heavily regulating cable TV?

The European press is lambasting the order as nothing more than a money grab. One of the benefits of regulating anything is charging fees for that regulation. Like many regulatory bodies around the world, the ZAK is largely funded by fees charged to the companies it regulates (which is largely true for the FCC as well). This means that regulators have a perverse incentive to regulate things, even if they don’t need to be regulated.

The idea of regulating a worldwide web ‘channel’ like a TV station is absurd. For those of you that may not know about Twitch.tv, it’s the primary gaming network for worldwide gamers. It’s owned by Amazon. It’s a huge platform and works like YouTube where over 17,000 ‘partners’ post gaming content into ‘channels.’ The platform averages 625,000 simultaneous viewers at any given time, making it one of the most popular web platforms in the world.

So regulating Twitch.tv would be the same as regulating YouTube. It’s a platform where virtually all of the content is created by others. Other than extracting fees from the platform for the privilege of regulating it, it’s hard to understand what else the ZAK could regulate. Twitch.tv and YouTube are open platforms and only function because they allow anybody to post content. Both platforms will take down offensive content or content that violates copyrights if asked to do so. But the platforms, by the very way they operate, have no control over the content that is posted. I’m at a total loss as to what the ZAK thinks it can regulate.

You also have to wonder how effective any regulation would be. There are a huge number of smaller web platforms that might fall into the same category as Twitch.tv. It’s hard to imagine anybody being able to launch a new platform if they are expected to comply with different rules in a hundred countries. And it’s hard to envision the ZAK doing anything other than trying to ban a noncompliant platform from the whole country. I don’t think the ZAK understands the political ramifications of banning a platform used by all the young tech-savvy programmers (and hackers) in their country!

But thinking about this makes me ask why we are still regulating cable companies in the US. There are slews of FCC rules that dictate things like channel line-ups. It’s FCC rules that force cable companies to still offer basic, expanded basic, and premium tiers of service. It’s now pretty clear that few consumers are happy with this structure. The average household only watches about a dozen channels monthly regardless of the size of the tiers they purchase. It is the requirement for these tiers that has allowed the programmers to force programs onto cable companies that they don’t really want.

It is the cable tiers that have forced up the price of cable. Households pay huge monthly bills to watch a dozen channels – all because the regulations force channel line-ups containing a hundred or more channels that the household isn’t interested in.

And cable companies are now competing against companies that don’t have these same restraints. Companies like SlingTV can put together any channel line-up they want with no regulatory constraints telling them what they can or can’t offer. Surveys have always shown that people would rather buy just those channels that they want to watch. And yet cable companies in the US are not allowed to compete head-on with OTT providers.

It would be easy to blame the FCC for not keeping up with the times. However, the most draconian cable rules come directly from Congress, and the FCC’s hands are tied from deviating from rules that are embedded in law. We are now at a time when we really need to reconsider these old rules. The cable companies are being forced to sell programming that customers don’t want to pay for. The whole industry would benefit if cable companies were free to pursue packages that people actually want to buy. Freeing up all video providers to offer what customers want is a far better solution than trying to drag web companies into becoming regulated cable companies.

How Much Speed Do We Really Need?

There is a lot of buzz floating around the industry that the FCC might lower the official definition of broadband, currently 25 Mbps down and 3 Mbps up. Two of the current FCC commissioners, including the chairman, opposed setting that definition a few years back. Lowering the speeds would let the FCC off the hook for the statutory requirement to make sure that the whole country can get broadband. If they lower the definition then, voila, millions more Americans would be declared to have adequate broadband.

So today I thought I’d take a look at the download speeds we really need in our homes. You may recall that back when the FCC set the 25/3 Mbps definition they made a list of the broadband speeds needed for typical activities. In doing so they tried to create profiles of some typical American households. That attempt was awkward, but it was a good starting point for examining household bandwidth needs. I’m updating their list a bit for the things people do today, which are already different than just a few years ago. Consider the following web activities:

  • Web Background 5 Mbps
  • Web Browsing 1 – 2 Mbps
  • Online Class 1 – 2 Mbps
  • Social Media 1 – 2 Mbps
  • Streaming Music 3 Mbps
  • Voice over IP 2 Mbps
  • SD Video stream 1 – 3 Mbps
  • HD Video Stream 4 – 6 Mbps
  • 4K Video Stream 15 – 20 Mbps
  • Gaming 1 – 3 Mbps
  • Skype / Video Conference 1 – 3 Mbps
  • Big File Downloader 50 Mbps

Not everybody will agree with these listed speeds, because there are no standards for how the web works. For example, by using different compression schemes a video stream from Netflix is not identical to one from Amazon. And even from one source there is variation, since an action movie takes more bandwidth than something like a stand-up comedy routine.

It’s important to remember that broadband demand can come from any device in your house – desktop, laptop, smartphone, tablet, etc. It’s also important to note that these are speed requirements for a single user. If two people in the house are watching separate videos, then you have to double the above numbers.

What the FCC failed to consider back when they set the speed definition is that households need enough bandwidth to handle the busiest times of the day. What matters is the number of simultaneous activities a home engages in, with most families being busiest in the evenings. There might be somebody on social media, somebody watching an HD movie, and somebody else doing homework while also using a smartphone to swap pictures.

There is another issue to consider when trying to do simultaneous tasks on the Internet – packet loss. The connection between the ISP and a customer gets more congested when it’s trying to process multiple data streams at the same time. Engineers describe this as packet collision – which sounds like some kind of bumper-car ride, but it’s an apt way to describe the phenomenon. Most home routers are not sophisticated enough to handle too many streams at once. Packets get misdirected or lost, and the router asks the originator to resend the missing packets. The busier the router, the more packet interference. This is also sometimes called ‘overhead’ in the industry, and this overhead can easily grow to 15% or more of the total traffic on a busy connection – meaning it takes 15% more bandwidth to complete a task than if that task was the only thing occurring on the broadband connection.

There is another kind of interference in homes with a WiFi network, and it has to do with the way WiFi works. When a WiFi router gets service requests from many devices at once, it gets overwhelmed, shuts down briefly, then reinitiates and sends packets to the first device that gets its attention. In a busy environment the WiFi router will shut down and restart constantly as it tries to satisfy all the demanding devices. This behavior was designed into the WiFi specification as a way to ensure that WiFi could serve multiple devices, and this WiFi overhead can also easily add 15% or more to the network demand.
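Putting the activity list and these two overhead factors together shows how quickly a modest evening adds up. This is a sketch built on an assumed household profile, not FCC or Nielsen data:

```python
# An illustrative evening household: simultaneous activities drawn from
# the speed list above (Mbps values from that table).
activities_mbps = {
    "HD video stream": 5,
    "social media": 2,
    "online class / homework": 2,
    "smartphone photo swapping": 2,
    "web background": 5,
}

raw_demand = sum(activities_mbps.values())        # 16 Mbps of simultaneous use
with_packet_overhead = raw_demand * 1.15          # ~15% collision/retransmission
with_wifi_overhead = with_packet_overhead * 1.15  # ~15% more for WiFi contention

print(f"Raw simultaneous demand: {raw_demand} Mbps")
print(f"With both overheads:     {with_wifi_overhead:.1f} Mbps")
# ~21 Mbps for a fairly ordinary evening - add one 4K stream (15-20 Mbps)
# and the household blows well past a 25 Mbps connection.
```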

Anybody who lives in a home with active users understands how networks can get overwhelmed. How many of you have been frustrated trying to watch a movie when others in the house are using the Internet? Even big bandwidth can be overwhelmed. I have a friend who has a 100 Mbps fiber connection on Verizon FiOS. He went to watch a video and it wouldn’t stream. He found that his two teenage sons were each using half a dozen gaming streams at the same time and had basically exhausted his fast bandwidth pipe.

The FCC can tinker with the official definition of broadband since that is their prerogative. But what they can’t do is define how much bandwidth any given home really needs. The funny thing is that the big ISPs all understand this issue. The cable companies have unilaterally increased speeds across the board to urban customers several times in recent years, and in most markets offer speeds considerably faster than the current FCC definition of broadband. These ISPs know that if they were only delivering 25 Mbps they would be overwhelmed with customers complaining about the connection. Those complaints are the real proof of how much bandwidth many homes need. If the FCC lowers the definition of broadband then they have put on blinders and are ignoring how homes really use broadband today. If they lower the speed definition it’s hard to see it as anything other than a political move.

Another Comcast Bundle

Comcast just announced that they will be bundling solar panels with their other services in select markets. This adds to what is already the largest bundle of products in the industry, and it’s one that many competitors will have a problem keeping up with.

Comcast has been doing a trial with Sunrun, a solar panel maker from San Francisco. Comcast found during this test that their customer satisfaction and customer retention rates rose significantly with customers who bought the solar panels. Comcast has now entered into an exclusive 40-month marketing deal with the company. It’s been reported that Comcast will get 10% of Sunrun’s stock if they can install 60,000 solar customers. Comcast has committed to spend $10 million on sales and marketing for the solar panels and will get a share of the customer revenue from the product.

Sunrun currently has about 150,000 solar installations in 22 states. Comcast has over 27 million potential solar customers. The cable company also has over 1 million home automation customers, which Comcast believes will be their best market for the new solar product.

Even before this announcement Comcast had become a fierce competitor. Comcast’s CEO Brian Roberts recently said that as he looked around the industry he didn’t see any products of interest that the company doesn’t already have – a claim no other ISP can make.

This announcement comes on the heels of Comcast’s decision to get into the cellular business. They are now marketing in a few markets with prices lower than Verizon and AT&T, and plan to eventually roll this out to their whole footprint. They also just bought a pile of spectrum that will help them increase margins on cellular service. Analysts say that over five years Comcast could capture as much as 30% of the cellphone business in their markets.

Comcast says it is tackling both of these product lines to reduce churn and to increase customer stickiness. They understand that long-time customers are their most profitable customers and they are putting together bundle options that ought to please a lot of households.

All of this effort looks to be paying off. Comcast is the only cable company that gained cable TV customers for the year ending in the second quarter. They gained 120,000 customers while the rest of the industry is bleeding cable customers at an average rate of 2.5% of total customers per year. The bundles are probably not the only reason for that, but it’s hard to argue with this success.

Comcast has done a lot of other things to increase customer satisfaction. They created Comcast Labs (similar to Bell Labs), a group of scientists and engineers concentrated largely on developing products that improve the customer experience. This group developed the X1 settop box, which gets rave reviews from customers. It’s so popular that Comcast is now selling the box to other monopoly cable providers. The settop box has an ever-growing number of features and can be voice-activated. Comcast has also integrated Netflix and Sling TV into the settop box to keep customers on their box and platform.

Comcast has also found great success with their smart home product. This is probably the most robust such product on the market and includes things like security and burglar alarms, smart thermostats, watering systems, smart blinds for energy control, security cameras, smart lights, smart door locks, etc. The product suite can be easily monitored from the settop box or from a smartphone app. The press release from the Sunrun announcement is the first time in a while that we’ve heard about their success and the million-plus customers using these products.

The company still has a lousy reputation for customer service and most of their customers dread having to call them. But they are supposedly putting a lot of money into making their customer service better. They recently began moving a lot of customer service back to the US, finally understanding that the cost savings of using foreign reps is not worth the customer dissatisfaction.

The flip side to making customers more sticky is that it makes it that much harder for a competitor to take their customers. Somebody buying a solar panel on a long-term payment plan is not likely to leave them for a competitor, particularly if there are financial penalties for doing so. Customers with a suite of home automation products become locked in unless they are willing to yank all of the monitors out and start over. Bit by bit Comcast is shielding their most lucrative customers from being poached by others.

Do We Really Need Gigabit Broadband?

I recently read an article in LightReading titled “All That’s Gigabit Doesn’t Glitter.” The article asks whether the industry really needs to make the leap to gigabit speeds. It argues that the industry has other options that can satisfy broadband demand, but that telco executives get hooked on the gigabit advertising and want to make the gigabit claim. A few of the points made by the article are thought-provoking and I thought today I’d dig deeper into a few of those ideas.

The big question of course is whether telco providers need to be offering gigabit speeds, and it’s a great question. I live in a cord-cutter family and I figure that my download needs vary between 25 Mbps and 50 Mbps at any given time (look for a blog soon that demonstrates this requirement). I can picture homes with more than our three family members needing more, since the amount of download speed needed is largely a factor of the number of simultaneous downloads. And certainly there are people who work at home in data-intensive jobs who need far more than this.

There is no doubt that a gigabit is a lot more broadband than I need. If we look at my maximum usage need of 50 Mbps then a gigabit is 20 times more bandwidth capacity than I am likely to need. But I want to harken back to our broadband history to talk about the last time we saw a 20-fold increase in available bandwidth.

A lot of my readers are old enough to remember the agony of working on dial-up Internet. It could take as much as a minute at 56 kbps just to view a picture on the Internet. And we all remember the misery that came when you would start a software update at bedtime and pray that the signal didn’t get interrupted during the multi-hour download process.

But then along came 1 Mbps DSL. This felt like nirvana and it was 20 times faster than dial-up. We were all so excited to get a T1 to our homes. And as millions quickly upgraded to the new technology, the services on the web upped their game. Applications became more bandwidth-intensive, program downloads grew larger, and web sites were suddenly filled with pictures that you didn’t have to wait to see.

And it took a number of years for that 1 Mbps connection to be used to capacity. After all, this was a 20-fold increase in bandwidth and it took a long time until households began to download enough simultaneous things to use all of that bandwidth. But over time the demand for web broadband kept growing. As cable networks upgraded to DOCSIS 3.0 the web started to get full of video and eventually the 1 Mbps DSL connection felt as bad as dial-up a decade before.

And this is perhaps the major point the article misses – you can’t just look at today’s usage to pick the best technology. Since 1980 we’ve experienced a doubling of the download speeds needed by the average household every three years. There is no reason to think that growth is stopping, so any technology that is adequate for a home today is going to feel sluggish in a decade and obsolete in two decades. We’ve now reached that point with older DSL and cable modems that have speeds under 10 Mbps.
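To put rough numbers on that, take my 50 Mbps peak need from above and apply the same doubling arithmetic – a sketch, not a forecast:

```python
# How a speed that is adequate today ages if household demand keeps
# doubling every three years (illustrative arithmetic only).
def future_need_mbps(current_need, years, doubling_period=3):
    return current_need * 2 ** (years / doubling_period)

for years in (0, 10, 20):
    print(f"in {years:2d} years: ~{future_need_mbps(50, years):,.0f} Mbps")
# in  0 years: ~50 Mbps
# in 10 years: ~504 Mbps
# in 20 years: ~5,080 Mbps - which is why a gigabit-capable network
# matters even if nobody needs a gigabit today.
```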

The other point made by the article is that there are technology steps between today’s technology and gigabit speeds. There are improved DSL technologies and G.Fast that could get another decade out of embedded copper and could be competitive today.

But it’s obvious that the bigger telcos don’t want to invest in copper. I get the impression that if AT&T found an easy path to walk away from all copper they’d do so in a heartbeat. None of the big companies have done a good job of maintaining copper and most of it is in miserable shape. So these companies are not going to be investing in G.Fast, although as a fiber-to-the-curb technology it would be a great first step toward modernizing their networks to all-fiber. CenturyLink, AT&T and others are considering G.Fast as a technology to boost speeds in large apartment buildings, but none of them are giving any serious consideration to upgrading residential copper plant.

It’s also worth noting that not all companies with fiber bit on the gigabit hype. Verizon always had fast products on its FiOS network and had the fastest speed in the industry, at 250 Mbps, for many years. They only recently decided to finally offer a gigabit product.

And this circles back to the question of whether homes need gigabit speeds. The answer is clearly no, and almost everybody offering a gigabit product will tell you that it’s still largely a marketing gimmick. Almost any home that buys a gigabit would have almost the same experience on a fiber-based 100 Mbps product with low fiber latency.

But there are no reasonable technologies in between telephone copper and fiber. No new overbuilder or telco is going to build a coaxial cable network and so there is no other choice than building fiber. While we might not need gigabit speeds today for most homes, give us a decade or two and most homes will grow into that speed, just as we grew from dial-up to DSL. The gigabit speed marketing is really not much different than the marketing of DSL when it first came out. My conclusion after thinking about this is that we don’t need gigabit speeds, but we do need gigabit capable networks – and that is not hype.

What’s the Next FTTP Technology?

There is a lot of debate within the industry about the direction of the next generation of last mile fiber technology. There are three possible technologies that might be adopted as the preferred next generation of electronics – NG-PON2, XGS-PON or active Ethernet. All of these technologies are capable of delivering 10 Gbps streams to customers.

Everybody agrees that the current widely deployed GPON is starting to get a little frayed around the edges. That technology delivers 2.4 Gbps downstream and 1.2 Gbps upstream for up to 32 customers, although most networks I work with are configured to serve at most 16 customers per PON. All the engineers I talk to think this is still adequate technology for residential customers and I’ve never heard of a neighborhood PON being maxed out for bandwidth. But many ISPs already use something different for larger business customers that demand more bandwidth than a PON can deliver.

The GPON technology is over a decade old, which generally is a signal to the industry to look for the next-generation replacement. This pressure usually starts with vendors who want to make money pushing the latest and greatest new technology – and this time it’s no different. But after taking all of the vendor hype out of the equation, it’s always been the case that a new technology is only accepted once it achieves an industry-wide economy of scale. And that almost always means being accepted by at least one large ISP. There are a few exceptions to this, like the first generation of telephone smart switches that found success with small telcos and CLECs first – but most technologies go nowhere until a vendor is able to mass-manufacture units to get the costs down.

The most talked-about technology is NG-PON2 (next-generation passive optical network). It works by using tunable lasers that can function at several different light frequencies. This allows more than one PON to be transmitted over the same fiber simultaneously, each at a different wavelength. But it also makes this a complex technology, and the key issue is whether it can ever be manufactured at price points that match the alternatives.

The only major proponent of NG-PON2 today is Verizon which recently did a field trial to test the interoperability of several different vendors including Adtran, Calix, Broadcom, Cortina Access and Ericsson. Verizon seems to be touting the technology, but there is some doubt if they alone can drag the rest of the industry along. Verizon seems enamored with the idea of using the technology to provide bandwidth for the small cell sites needed for a 5G network. But the company is not building much new residential fiber. They announced they would be building a broadband network in Boston, which would be their first new construction in years, but there is speculation that a lot of that deployment will use wireless 60 GHz radios instead of fiber for the last mile.

The big question is whether Verizon can create an economy of scale to get prices down for NG-PON2. The whole industry agrees that NG-PON2 is the best technical solution because it can deliver 40 Gbps to a PON while also allowing great flexibility in assigning different customers to different wavelengths. But the best technological solution is not always the winning solution, and the concern for most of the industry is cost. Today the early NG-PON2 electronics are priced at 3 – 4 times the cost of GPON, due in part to the complexity of the technology, but also due to the lack of economy of scale without any major purchaser of the technology.

Some of the other big fiber ISPs, like AT&T and Vodafone, have been evaluating XGS-PON. This technology can deliver a symmetrical 10 Gbps – a big step up in bandwidth over GPON. The major advantage of the technology is that it uses fixed lasers, which are far less complex and costly. And unlike Verizon, these two companies are building a lot of new FTTH network.

And while all of this is being debated, ISPs today are already delivering 10 Gbps data pipes to customers using active Ethernet (AON) technology. For example, US Internet in Minneapolis has been offering 10 Gbps residential service for several years. Active Ethernet uses lower-cost electronics than most PON technologies, but can still cost more than GPON because there is a dedicated pair of lasers – one at the core and one at the customer site – for each customer. A PON network instead uses one core laser to serve multiple customers.

It may be a number of years until this is resolved, because most ISPs building FTTH networks are still happily buying and installing GPON. One ISP client told me that they are not worried about GPON becoming obsolete because they could double the capacity of their network at any time simply by cutting the number of customers on a neighborhood PON in half. That would mean installing more cards in the core without having to upgrade customer electronics.
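The arithmetic behind that upgrade path is simple. Here’s a sketch using the nominal downstream capacities cited in this piece – these are average shares, and since PON bandwidth is shared dynamically the actual per-customer experience varies:

```python
# Average downstream share per customer on one PON at different splits,
# using the nominal capacities discussed above.
def per_customer_mbps(shared_gbps, split):
    return shared_gbps * 1000 / split

for name, gbps in (("GPON", 2.4), ("XGS-PON", 10.0)):
    for split in (32, 16):
        share = per_customer_mbps(gbps, split)
        print(f"{name} at a 1:{split} split: ~{share:.0f} Mbps per customer")
# GPON: 1:32 -> ~75 Mbps, 1:16 -> ~150 Mbps. Halving the split doubles
# each customer's share with no change to the customer electronics.
```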

From what everybody tells me, GPON networks are not experiencing any serious problems. But it’s obvious that as household demand for broadband keeps doubling every three years, the day will come when these networks experience blockages. But creative solutions like splitting the PON could keep GPON working great for a decade or two. And that might make GPON the preferred technology for a long time, regardless of the vendors’ strong desire to get everybody to pay to upgrade existing networks.

The Louisville Pole Attachment Lawsuit

There has been a major legislative push lately to make it easier for wireless companies to get onto poles in order to deploy the small cell sites needed for 5G deployment. AT&T and Verizon have been leading the fight for easier access, and there have been attempts at both the federal and state level to enact ‘one-touch’ rules. Proposed legislation not only sets a low price for compensating pole owners, but also removes the ability of pole owners or municipalities to slow down wireless deployments.

There is a lot of debate in the industry about the one-touch issue. As I have discussed in various blogs, getting onto poles is still one of the major roadblocks to many fiber deployments. And judging from the examples cited by the cellular carriers, they are seeing huge delays in deploying urban small cell sites.

Like any debate, there are legitimate issues to be considered on both sides. Proponents of one-touch cite the extraordinary costs of wading through the paperwork-heavy pole attachment process as well as the dollars-and-cents costs of delaying construction projects.

But on the other side are the pole owners and the carriers with networks already hanging on the poles. Carriers are legitimately worried about safety issues for their technicians if boxes the size of refrigerators are hung on poles without constraint. They legitimately worry about how such devices could cause problems during repairs from storm damage. And carriers are also worried about network outages if a new attacher is allowed to move their wires without their knowledge or permission.

A court decision a few weeks ago might be a first step toward putting some clarity to the issue. In that suit AT&T had sued the City of Louisville to stop it from passing a one-touch make-ready ordinance. The ordinance was aimed at making it easier for Google Fiber and other competitive providers to get onto poles in the city. The City of Louisville owns most of the poles in the city and has been working with Google Fiber to deploy a fiber network to everybody in the city.

You have to let the irony of AT&T’s lawsuit sink in for a minute. This is a company that is spending millions right now lobbying for one-touch rules. AT&T not only wants to deploy small cell sites, but they are also in the process of building a huge amount of fiber to support those sites. And yet AT&T felt compelled to fight against the very kind of ordinance they are promoting because it would help one of their competitors.

It turns out that not all one-touch ordinances are the same. The ordinances that AT&T and Verizon are pushing are crafted very carefully to help them while still not making it quite so easy for their competitors. The Louisville ordinance made it easier for any new attacher to get onto poles, including AT&T.

The US District Court judge in Kentucky completely rejected all of AT&T’s claims and tossed the lawsuit. The court basically said that all of AT&T’s claims in the suit were false. It’s ironic that many of the issues raised by the City in defense of the suit sound the same as the claims AT&T makes elsewhere when lobbying for one-touch legislation.

I’ve always said that being in the regulatory department at AT&T has to be the hardest job in our industry. It’s a company that wears too many hats. AT&T owns a huge monopoly landline network and wants to protect itself from competitors. In some markets AT&T is a major pole owner. AT&T is also a huge wireless company that now wants access to poles. And AT&T is a huge builder of fiber, much of it now outside of its monopoly telco territory.

Any regulatory position the company takes to benefit one of these business lines is likely to not be in the best interest of other parts of the company. When looking at the big picture one has to think that AT&T will get far more benefit than harm from one-touch rules. Such rules will make it a lot easier to build more fiber and to deploy cell sites. And yet, a company with this many tentacles in the industry could not restrain itself from filing a lawsuit that probably was not in its own best long-term interest. The monopoly side of the company felt it could not sit back and let a competitor like Google Fiber build without the company taking steps to slow them down.