ATSC 3.0 – More Spectrum for Broadband?

This past February the FCC approved the voluntary adoption of ATSC 3.0, the new over-the-air standard for television stations. There will be around twenty different standards included within the final protocol, defining such things as better video and audio compression, picture improvement using high dynamic range (HDR), a wider range of colors, the ability to use immersive sound, better closed captioning, an advanced emergency alert system, better security through watermarking and fingerprinting, and the ability to integrate IP delivery.

The most interesting feature of the new standard is that it allows programmers to tailor their TV transmission signal in numerous ways. The one of most interest to the telecom world is that the standard will allow a TV broadcaster to compress the existing TV transmission into a tiny slice of the spectrum, which would free up about 25 Mbps of wireless bandwidth per TV channel.
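
To put the 25 Mbps figure in context, here’s a back-of-the-envelope sketch. The per-channel capacity and HEVC bitrate below are my own illustrative assumptions, not numbers from the standard:

```python
# Rough estimate of how much capacity an ATSC 3.0 broadcaster could free up.
# All figures are illustrative assumptions, not official ATSC numbers.

CHANNEL_CAPACITY_MBPS = 28.0  # assumed usable payload of one 6 MHz channel
                              # at a mid-range ATSC 3.0 modulation mode
HD_STREAM_HEVC_MBPS = 3.0     # one HD stream with modern HEVC compression (assumed)

def freed_capacity(hd_streams: int) -> float:
    """Capacity left over after carrying the broadcaster's own HD streams."""
    return max(CHANNEL_CAPACITY_MBPS - hd_streams * HD_STREAM_HEVC_MBPS, 0.0)

print(freed_capacity(1))  # 25.0 Mbps left to lease, matching the figure above
print(freed_capacity(4))  # 16.0 Mbps if the station runs four subchannels
```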

A TV station could use that extra frequency themselves or could sell it to others. Broadcasters could use the extra bandwidth in a number of ways. For example, it’s enough bandwidth to transmit their signal in 4K. Stations could also transmit their signal directly to cellphones and other mobile devices. TV stations could instead use the extra bandwidth to enhance their transmissions with immersive sound and virtual reality. They could also use the extra bandwidth to transmit additional digital channels inside one slice of spectrum.

But my guess is that a lot of TV stations are going to lease the spectrum to others. This is some of the most desirable spectrum available. The VHF bands range from 30 MHz to 300 MHz and the UHF bands from 300 MHz to 3 GHz. The spectrum has the desirable characteristics of being able to travel for long distances and of penetrating easily into buildings – two characteristics that benefit TV or broadband.

The first broadcasters that have announced plans to implement ATSC 3.0 are Sinclair and Nexstar. Together they own stations in 97 markets, including 43 markets where both companies have stations. The two companies are also driving a consortium of broadcasters that includes Univision and Northwest Broadcasting. This spectrum consortium has the goal of being able to provide a nationwide bandwidth footprint, which they think is essential for maximizing the economic value of leasing the spectrum. But getting nationwide coverage is going to require adding a lot more TV stations to the consortium, which could be a big challenge.

All this new bandwidth is going to be attractive to wireless broadband providers. One has to think that the big cellular companies will be interested in the bandwidth. This also might be an opportunity for new cellular players like Comcast and Charter to increase their spectrum footprint. But the spectrum could be used in other ways – for instance, a new provider could use it to communicate with vehicles or to monitor and interface with IoT devices.

The spectrum could provide a lot of additional bandwidth for rural broadband. It’s likely that in metropolitan areas the extra bandwidth is going to get gobbled up to satisfy one or more of the uses listed above. But in rural areas this spectrum could be used to power point-to-multipoint radios and could add a huge amount of bandwidth to that effort. The channels are easily bonded together and it’s not hard to picture wireless broadband of a few hundred Mbps.
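
As a quick sketch of that claim, bonding even a modest number of freed channels adds up fast. The 25 Mbps figure comes from above; the channel counts are hypothetical deployment choices:

```python
# Bonded-channel arithmetic for a rural point-to-multipoint deployment.
MBPS_PER_CHANNEL = 25  # freed capacity per TV channel, per the estimate above

for channels in (4, 8, 12):
    print(f"{channels} bonded channels -> {channels * MBPS_PER_CHANNEL} Mbps of shared capacity")
# 4 -> 100, 8 -> 200, 12 -> 300: 'a few hundred Mbps' is plausible arithmetic.
```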

But this may never come to pass. Unlike WiFi, which is free, or 3.65 GHz, which can be cheaply licensed, this spectrum is likely to be costly. And one of the major benefits of the spectrum – the ability to travel for long distances – is also a detriment for many rural markets. Whoever is using this spectrum in urban areas is going to worry about interference from rural uses of the spectrum.

Of course, there are other long-term possibilities. As companies upgrade to the new standard they will essentially have reduced their need for spectrum. Since the TV stations were originally given this spectrum to transmit TV signals I can’t think of any reason that they should automatically be allowed to keep and financially benefit from the freed spectrum. They don’t really ‘own’ the spectrum – it was provided to them originally by the FCC to launch television technology. There are no other blocks of spectrum I can think of that are granted in perpetuity.

TV station owners like Sinclair and Nexstar are salivating over the huge potential windfall that has come their way. I hope, though, that the FCC will eventually see this differently. One of the functions of the FCC is to equitably allocate spectrum to best meet the needs of all users of spectrum. If the TV stations keep the spectrum then the FCC will have ceded its spectrum management authority, and it will be TV stations that determine the future spectrum winners and losers. That can’t be in the best interests of the country.

Local, State or Federal Regulation?

Last week the FCC clarified its intentions for the Broadband Deployment Advisory Committee (BDAC). This group was tasked with exploring a wide range of topics with the goal of finding ways to lower barriers for broadband deployment.

The BDAC was divided into subgroups, with each examining issues such as speeding up access to poles and conduits, or how to streamline the morass of local regulations of such things as rights-of-way that can slow down fiber deployment.

There has been a huge amount of buzz in the industry because the expectation has been that the FCC would act to impose federal rules that ‘fix’ some of the most important impediments to competition. That expectation was bolstered on several occasions by speeches made by new FCC Chairman Ajit Pai hinting that the FCC was willing to take steps to lower barriers to broadband deployment.

But FCC Senior Counsel Nicholas Degani just clarified that the FCC’s intention is not to create new regulations, but rather to create ‘model codes’ that they hope cities and states around the country will use to make it easier to deploy broadband.

We’ll have to wait a while to see if the FCC really can refrain from issuing new regulations. Chairman Pai has said many times that he is in favor of ‘light touch’ regulation and the agency is in the process of relaxing or undoing many of the regulations from the past. But one thing that I have repeatedly seen from regulators over the years is that they love to regulate. It will take major restraint for the FCC to not try to ‘fix’ the many problems that the BDAC is highlighting. This will be the ultimate test to see if they really are anti-regulation.

Frankly, some of the issues that the BDAC has been exploring cry out for some sort of regulatory relief. For example, in some parts of the country it takes so long and is so expensive to get onto poles that it’s nearly impossible to implement a business plan that needs pole access. And it is extremely frustrating for a national company that deploys fiber everywhere to work with local rules that vary widely from city to city.

Part of what is pushing this effort is the fact that everybody expects a massive investment in new fiber over the next decade as fiber is built to bring bandwidth to homes and as we deploy 5G networks. Everybody recognizes that there are impediments that add delays and costs to those deployments.

At the same time that the FCC has been looking at these issues there are numerous attempts to create state regulatory rules to fix some of these problems. A number of states have already created regulations aimed at making it easier to do things like get access to poles. But state efforts vary widely in the motivation for new regulations. There are some states that are looking hard at imposing statewide rules that balance the needs of competitors, network owners and municipalities.

But there are other attempts prompted by the big cellular companies and ISPs to run roughshod over the rights of pole owners and municipalities. These efforts are being driven, in part, by model legislation developed by ALEC and funded by the big companies. Many of these rules are attempting to set low nationwide rates for pole attachments and also to force streamlined timelines that ignore local conditions.

Finally, there are efforts being made by many cities to make it easier to deploy broadband. Most cities understand that they need fiber everywhere to remain competitive with other cities. Yet these efforts are often ineffective because cities, by definition, have a lot of stakeholders to satisfy. When a city looks at changing local rules it ends up having to give a lot of weight to issues such as the environment, aesthetics, historic preservation, safety, and unions, which makes it impossible to create rules that favor fiber deployment over these other concerns.

Fixing these issues is a problem that may never find the right solution. We live in a country where cities across the board have been granted varying degrees of control over things like the rights-of-way that affect network deployments. Fiber deployment is not the first issue that has pitted federal, state and local regulators against each other when trying to solve the same problems. It’s likely that if either the FCC or the states try to strongarm cities we will see a pile of lawsuits challenging any egregious decisions. And that just leads to delays, since disputed laws don’t go into effect. I hope we can find solutions that don’t lead to those lawsuits, because the worst kind of regulation is one that is in limbo in some court for years. Nobody is likely to make any significant new investment in that environment.

Availability of Existing Fiber

I work with rural communities all over the country and one of the first things I am usually asked is to help them figure out how much fiber is already in their community. There seems to be a natural assumption that all fiber is the same and that existing fiber can somehow be used to get better broadband in their area. I always hate to have to tell them that existing fiber is rarely of any benefit to them.

I also have to tell them that it’s usually exceedingly difficult to find the location of existing fibers. Incumbent telcos, cable companies and electric companies will rarely provide that kind of detail to anybody. This is partly due to security concerns, and partly because anything told to a government entity ends up in the public domain.

But even if these companies were willing to provide details of all of their existing fiber, the chances are pretty high that the existing fiber cannot benefit communities in the ways the communities hope. There are a number of reasons for this:

  1. The primary reason existing fiber doesn’t benefit communities is that most of it is part of a private network. This private network could be connecting two telco central offices. It could be connecting a cable TV headend with a neighborhood node. It could be a school network connecting schools. It might be used by the electric company to connect to neighborhood substations. It could be used by a railroad along its tracks. Or it might be used by the highway department to control hazard signs or sensors. And today we are finally seeing a lot of private fiber routes being built to reach cellular towers.
  2. Most fiber in private networks was built for a very specific function, like those listed above. The builders designed the fiber for that purpose and are generally not willing to use it for anything else. Sharing a fiber adds significant risk and brings responsibilities that few private network owners are willing to tackle. They are generally content to use the fiber for their own purposes without having to worry about how their use of the fiber might affect somebody else.
  3. Private network owners are also extremely protective about who can have physical access to their fiber. I can’t think of a private network owner that will allow outside technicians to have direct access to their fibers. And this means that if they allow somebody to share their fiber they also have to take on all of the work to connect and maintain those connections. Companies that own rural fiber networks often have labor forces that are already stretched thin and they don’t want to take on this extra burden.
  4. There is another access issue that might be the most important reason to not share a fiber route. It’s likely that the party that wants to share an existing fiber wants to get onto and off of the fiber at different locations than the fiber owner. It’s not cheap or easy to gain access to existing fibers if it wasn’t designed with the needed access points. It is not unusual for a private fiber to be designed with no access points between the two ends of the fiber. It’s fairly easy and economical to add access points during the initial construction process. Handholes or other access points can be added to the fiber to provide future easy connectivity to the fiber. But adding access points to an existing fiber, particularly a buried fiber, can be costly and even risky since you have to dig to gain access to the already-buried fiber. Adding new connections also might mean adding pedestals or even something larger if power is needed at the new fiber junction.
  5. Often the fibers that pass through rural areas are long-haul fibers. These fibers are part of some larger network that connects large geographic areas or creates fiber rings. Long-haul fiber owners will rarely allow local connections to a fiber, because once a fiber is used for a local connection it can no longer be used to create a path around the larger fiber ring.
  6. The reason that communities find the most frustrating is when they find they are not allowed to use government-built fibers. I’ve often come across school or state government networks that were built with funding that prohibits sharing. I know of a number of state and county government networks that are not allowed to be shared for any commercial purpose – usually a requirement imposed by the funding that built the network. These kinds of prohibitions often stem from laws in states that don’t want government networks competing with commercial networks. These networks often have large numbers of usable pairs that sit idle and that can’t be used by anybody but the government entity that built the fiber.

My message about the lack of benefits for existing local fiber is often met with incredulity. I have been hired a number of times just to prove that the fibers are not available. It generally only takes a few calls to the typical fiber owners to find that they have no desire to share fiber. But there are exceptions. For example, the large telcos will offer to share fiber if they have the capacity – but this is generally expensive and is part of a pricing scheme the telcos refer to as special access. And once communities understand the cost of special access they are rarely interested in that fiber.

I certainly understand the frustration that comes from finding out that a community might be fairly fiber rich, but that none of that fiber can be used to bring broadband to homes. It seems to fly against logic, but it is usually the market reality.

Broadband and Gaming

I recently saw an interesting statistic showing that the most popular worldwide video content is gaming. The worldwide gaming video content industry has more than 665 million viewers, which makes it bigger than HBO, Netflix, ESPN and Hulu combined. This is a segment of the video industry that I was only peripherally aware of, which I suspect is true for many of you as well.

The GVC (Gaming Video Content) industry is distinct from the on-line playing of games. The GVC content consists of watching others play games along with content that talks about gaming. The industry is estimated to generate $4.6 billion in revenues in 2017. One third of that will come from subscriptions to GVC content along with other direct consumer spending. The rest comes from advertising. There is a whole industry that has sprung up around the GVC content including big conventions and merchandise.

While you can’t generalize about such a large group of people, the gamers and GVC viewers are often what you might think of as tech-savvy. In the US the average GVC viewer is around 30, has more education than average and makes a higher than average income of around $58,000. And while you might expect the viewers of GVC content to be largely male, a surprising 46% of GVC viewers are female.

Around the world there are numerous video platforms that have been created for gaming content. In the US and Europe the biggest content provider is Twitch. This is a platform that was originally known as Justin.tv. The platform was created in 2007 by Justin Kan and Emmett Shear. The platform allowed users to post live video streams that could be watched by anybody else on the platform. The platform was often used to show pirated live sports feeds, but over time the majority of the content centered around gaming.

Justin.tv was a large content generator and in 2013 – before Netflix really took off – the service said it had 45 million unique viewers and was the fourth largest source of peak Internet traffic in the US. When the biggest competitor to Justin.tv shut down, the platform had a near monopoly on gaming content.

The company was renamed Twitch Interactive and was acquired at the end of 2014 by Amazon. Amazon beefed up the underlying delivery platform, which increased the quality of the streams. Since then Twitch has grown significantly. Amazon reports that the service has over 100 million unique viewers per month, nearly 10 million per day. The average number of simultaneous viewers at any given time on the platform is about 622,000.

Amazon has grown the service by opening up the platform to ‘partners’ much as it has done with OTT content. Twitch now has over 17,000 partners – those that stream unique content. 35% of the content is viewed on cellphones, with the rest on landline broadband connections.

Twitch viewers are loyal. Over half watch the service more than 20 hours per week – and for many of them this is their primary source of video content. The average Twitch user watches the service for 1 hour 46 minutes per day.

While the Twitch platform is free (and I recommend taking a few minutes to check it out), many of the channel partners charge monthly subscriptions.

I find it interesting that Twitch is not counted in the universe of OTT providers, and Twitch viewers and statistics are kept separate from, and not counted with, viewers of Amazon Prime. Perhaps this is not considered OTT content since a lot of it is viewer-generated. But that is still largely true for YouTube, which is now counted among the OTT providers. Many of the channels on Twitch are now professionally produced and are certainly hard to distinguish from other OTT content.

The GVC industry is worth noting because it is a big source of video content on our broadband networks. The video watched on the web doesn’t just come from sources like Netflix; more and more of it is coming from platforms like Twitch that carry a mountain of viewer-generated content. This is just one more example of how the major programmers are going to be in trouble as the generations turn. Younger viewers are not watching traditional programming to nearly the degree of older generations.

Stats on OTT Viewing

A recent study by comScore examined OTT usage in detail across the country. They studied the OTT viewing habits in 12,500 homes over time across all devices. They looked at 52 OTT services, which collectively account for virtually all of the OTT content available. Their study is the most comprehensive study of OTT that I’ve seen to date.

Not surprisingly Netflix is the largest OTT provider and accounted for 40% of all viewing hours of OTT content. I must admit with all of the hype about Netflix that I thought they would be larger. They were followed by YouTube at 18%, Hulu at 14%, Amazon at 7% and all of the other OTT sources sharing 21%.

When it came to consumer engagement, measured by the amount of time that people watch a given service, the leader is Hulu with the average Hulu household watching over 2.9 hours of their content per day. This was followed by Netflix at 2.2 hours, YouTube at 2.1 hours and Amazon at 2.0 hours per day.

Here are some other interesting statistics generated by the survey:

  • 51 million homes in the US watched OTT content this past April. That is 41% of all homes.
  • The growth of OTT watching is exploding – in October 2016 only 44 million homes watched OTT.
  • As you would expect, there is a substantial number of cord-cutters that watch OTT. The types of OTT viewers include 44% that also have a landline cable subscription, 22% that also have a satellite TV subscription, 18% that are pure cord-cutters, and 16% that mix OTT content with free content received through rabbit ears.
  • The average home watched OTT content 49 hours in a month. That viewing was spread on average across 15 viewing days – meaning that most homes don’t watch OTT content every day.
  • As you would expect, cord-cutter households watch OTT for more hours monthly than other households. For example, cord cutters watched Hulu 37 hours per month while other households watched 29. Cord cutters watched Netflix for 36 hours per month compared to 27 hours for other households.
  • OTT viewing largely matches regular TV viewing in that there is a big spike of viewing in the evening prime time hours.
  • However, OTT viewing differs from traditional cable with big spikes on weekends, largely due to binge-watching.
  • The survey shows that 10.1 million households use network TV apps (apps from a programmer such as HBO or ESPN).
  • There is an interesting correlation between the size of a household, the amount of OTT viewing, and whether a family has cut the cord. For cord-cutting families, the smaller the household the greater the amount of OTT viewing. But for families that still have a paid-cable subscription it’s the reverse.
  • Single-member households are almost 50% more likely than average to be a cord cutter and 24% more likely than average to be a cord-never.
  • The cost of cable subscriptions has always been shown in other surveys to be a factor in cord cutting, and this survey shows a strong correlation between income and cord-cutting. Households making less than $40,000 per year are cutting the cord at a rate 19% above average, while households making between $75,000 and $100,000 cut the cord at only 87% of the average rate.
  • Their survey also was able to detail the devices used to watch OTT content on television screens. Of the 51 million homes that watched OTT in April, 38 million homes used a streaming stick / box like Roku, and 28 million homes used a smart TV.
  • The study also detailed penetration rates of streaming boxes / sticks for homes using WiFi: 16% own a Roku, 14% have Amazon Fire, 8% own Google Chromecast and 6% have AppleTV.
  • Samsung and Vizio are the big players in the smart TV market, with shares in WiFi-connected homes of 33% and 30%. LG and Sony were next with 10% and 7% penetration, with all other manufacturers sharing the remaining 20% of the market.
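
A couple of the figures above can be cross-checked with simple arithmetic. This quick sketch derives the implied household base and the hours per viewing day – derived numbers, not figures reported by comScore:

```python
# Cross-checking two of the comScore figures quoted above.

ott_homes_m, ott_share = 51, 0.41
print(f"Implied base: {ott_homes_m / ott_share:.0f} million homes")  # ~124M

monthly_hours, viewing_days = 49, 15
print(f"~{monthly_hours / viewing_days:.1f} OTT hours per viewing day")  # ~3.3
```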

The survey also analyzed skinny bundles. They show that 3.1 million homes now have a skinny bundle. 2 million of those homes have SlingTV, with DirecTV Now and PlayStation Vue having most of the other customers. The survey shows that homes with one of these services watch the skinny bundle an average of 5.3 hours per day.

The main takeaway from this survey is a demonstration that OTT viewing has become mainstream behavior. OTT viewing is now part of the viewing habits of a little over half of the homes in the nation that have an in-home WiFi connection.


Quad Bundling

Since Comcast and Charter are now embarking on the cellular business we are soon going to find out if there is any marketing power in a quad bundle. Verizon and, to a lesser degree, AT&T have had the ability to create bundles including cellular service, but they never really pushed this in the marketplace in the way that Comcast is considering.

Comcast has said that the number one reason it is entering the cellular business is to make customers “stickier” and to reduce churn. That implies offering cellular service cheaper than competitors like Verizon, or at least creating bundles that give the illusion of big savings on cellular. For now, the preliminary pricing Comcast has announced doesn’t seem to be low enough to take the industry by storm. But I expect that as it gains customers the company will find more creative ways to bundle the product.

The Comcast pricing announced so far shows only a few options. Comcast is offering a $45 per month ‘unlimited’ cell plan (capped at 20 GB of data per month) that is significantly less expensive than any current unlimited plan from Verizon or AT&T. But this low price is only available to customers who buy one of the expensive full Comcast triple play bundles. The alternative is a $65 per month unlimited plan that is $5 per month lower than the equivalent Verizon plan. Comcast also plans to offer family plans that sell a gigabyte of data for $12 that can be used by any phone on the plan – for many families this might be the best bargain.
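
Whether the $12-per-gigabyte plan really is the best bargain depends entirely on usage. Here’s a hedged sketch of the break-even math using the prices quoted above; the household usage levels are assumptions:

```python
# Break-even between Comcast's announced by-the-GB plan and per-line unlimited.
# Prices come from the announcement above; usage levels are assumptions.

PER_GB_RATE = 12.0         # $12 per shared gigabyte
UNLIMITED_PER_LINE = 45.0  # $45/month per line for triple-play customers

def compare(household_gb: float, lines: int) -> str:
    by_the_gb = household_gb * PER_GB_RATE
    unlimited = lines * UNLIMITED_PER_LINE
    winner = "by-the-GB" if by_the_gb < unlimited else "unlimited"
    return (f"{household_gb:>4.0f} GB / {lines} lines: "
            f"${by_the_gb:.0f} vs ${unlimited:.0f} -> {winner}")

for gb in (2, 10, 15, 25):
    print(compare(gb, lines=4))
# A four-line family breaks even at 15 GB/month; light users win on $12/GB.
```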

One interesting feature of the Comcast plan is that it will automatically offload data traffic to the company’s WiFi network. Comcast has a huge WiFi network with over 16 million hotspots. This includes a few million outdoor hotspots but also a huge network of home WiFi routers that also act as a public hotspot. That means that customers sitting in a restaurant or visiting a home that has a Comcast WiFi connection will automatically use those connections instead of using more expensive cellular data. Depending on where a person lives or works this could significantly lower how much a consumer uses 4G data.

There are still technical issues to be worked out to allow for seamless WiFi-to-WiFi handoffs. Comcast has for a few years provided the ability for customers to connect to its WiFi hotspots. I used to live in a neighborhood that had a lot of the Comcast home hotspots. When walking my dog it was extremely frustrating if I let my cellphone use the Comcast WiFi network, because as I went in and out of hotspots my data connections would be interrupted and generally reinitiated. I always had to turn off WiFi when walking and use only cellular data. It will be interesting to see how, and if, Comcast has overcome this issue.

A recent survey done by the investment bank Jefferies has to be of concern to the big four cellular companies. In that survey 41% of respondents said that they would be ‘very likely’ to consider a quad play cable bundle that includes cellular. Probably even scarier for the cellular companies was the finding that 76% of respondents who were planning to shop for a new cell plan within the next year said they would be open to trying a cellular product from a cable company.

I wrote recently about how the cellular business has entered the phase of the business where cellular products are becoming a commodity. Competition between the four cellular companies is already resulting in lower prices and more generous data plans. But when the cable companies enter the fray in all of the major metropolitan areas the competition is going to ratchet up another notch.

The cable companies will be a novelty at first and many customers might give them a try. But it won’t take long for people to think of them as just another cellular provider. One thing that other surveys have shown is that people have a higher expectation for good customer service from a cellular provider than they do for the cable companies. If Comcast is going to retain cellular customers then they are either going to have to make the bundling discounts so enticing that customers can’t afford to leave, or they are going to have to improve their customer service experience.

Even if Comcast and Charter have only modest success with cellular, say a 10% market share, they will hurt the other cellular companies. The number one driver of profits in the cellular business is economy of scale – something you can see by looking at the bottom line of Sprint or T-Mobile compared to Verizon or AT&T. If Comcast is willing to truly use cellular to help hang on to other customers, and if that means they don’t expect huge profits from the product line, then they are probably going to do very well with a quad play product.

And of course, any landline ISP competing against Comcast or Charter has to be wary. If the cellular products work as Comcast hopes then it’s going to mean it will be that much harder to compete against these companies for broadband. Bundled prices have always made it hard for customers to peel away just one product and the cable companies will heavily penalize any customers that want to take only their data product elsewhere.

Broadband Shorts – July 2017

Today I’m going to talk about a few topics that relate to broadband, but that are too short for a separate blog.

Popularity of Telehealth. The Health Industry Distributors Association conducted a follow-up survey of people who had met with a doctor via a broadband connection instead of a live office visit. The survey found that a majority of people were very satisfied with the telehealth visit and 54% said that they thought the experience was better than a live office visit.

Interestingly over half of the telehealth users were under 50 and they preferred telehealth because of the convenience. Many said that once they found their doctor would allow telehealth visits that they requested them whenever possible. Of course, many telehealth users live in rural areas where it can be a long drive to make a routine doctor office visit. The doctors involved in telehealth also like it for routine office visits. They do complain, however, that not enough insurance companies have caught up with the concept and that they often encounter reimbursement problems.

Explosion of Mobile Data Usage. Ericsson, the company that supplies a lot of electronics for the cellular industry, has warned cellular companies to prepare for explosive growth in cellular data traffic over the next five years. They warn that within five years the average cellphone user’s monthly usage will grow from today’s 5 gigabytes to 26 gigabytes. They say usage will reach 6.9 gigabytes just by the end of this year – 40% growth over last year.
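
Those two forecasts are internally consistent, as a quick compound-growth check shows (all figures from the Ericsson numbers above):

```python
# Implied annual growth rate in Ericsson's five-year forecast.
start_gb, end_gb, years = 5.0, 26.0, 5

cagr = (end_gb / start_gb) ** (1 / years) - 1
print(f"Implied growth: {cagr:.0%} per year")    # ~39% per year

print(f"This year: {6.9 / 5.0 - 1:.0%} growth")  # ~38%, close to the 40% cited
```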

They say that several factors will contribute to the strong growth. Obviously video drives a lot of the usage, but there is also huge annual growth from social media usage as those platforms incorporate more video. They also predict that by 2022, as we start to meld 5G cellular into the network, users will feel more comfortable using data on their cellphones.

New Satellite Broadband. ViaSat recently launched a new satellite, with a throughput of 300 gigabits per second, that will allow for data speeds up to 200 Mbps. The satellite is expected to be placed into service in early 2018 and will boost the company’s Excede broadband product.

The new satellite, dubbed ViaSat 2, will originally augment and eventually replace the company’s current ViaSat 1 satellite. The company currently serves 659,000 customers from the ViaSat 1 satellite plus some customers it acquired with the purchase of WildBlue in 2009. The new satellite will allow an expansion of the customer base.

The company expects that the majority of customers will continue to buy data products with speeds up to 25 Mbps, like those already offered by Excede. This tells me that the faster speeds, while available, are going to be expensive. This satellite will still be in a high earth orbit, which means the continued high latency that makes satellite service incompatible with any real-time applications. And there is no word if the larger capacity will allow the company to raise the stingy data caps that customers seem to universally hate.

Growth of Music Streaming. Nielsen released statistics showing that streaming audio is growing at an explosive rate and seems to have crossed the threshold to become the primary way that most people listen to music. Audio streams in 2017 are 62% higher than just a year ago. The industry has grown from 113.5 billion annual streams to 184 billion in just one year.

Nielsen estimates that total listens to music from all media, including albums and music downloads, will be 235 billion this year, meaning that streaming audio now accounts for 78% of all music listened to.
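
Both percentages check out against the raw counts quoted above:

```python
# Verifying the Nielsen music-streaming figures quoted above.
streams_2016_b = 113.5   # billions of annual audio streams
streams_2017_b = 184.0
total_listens_b = 235.0  # all media: streams, albums, downloads

print(f"Year-over-year growth: {streams_2017_b / streams_2016_b - 1:.0%}")      # ~62%
print(f"Streaming share of listening: {streams_2017_b / total_listens_b:.0%}")  # ~78%
```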

And this growth has made for some eye-popping numbers. For example, Drake’s release of More Life in March saw 385 million streams in the week after release. Those kinds of numbers swamp the audience a new artist could reach through older media.

The Consequences of Killing Network Neutrality

It looks almost certain that the FCC is going to kill Title II regulation, and with it net neutrality. Just as happened the last go-round, the FCC has already received millions of comments asking it not to kill net neutrality. And if you read all of the press you find dire predictions of the consequences that will result from the death of net neutrality. But as somebody who has a decent understanding of the way that broadband and the associated money flows work in the industry, I don’t think it will be as dire as critics predict, and I think there will also be unanticipated consequences.

Impact on Start-ups – the Cost of Access. One of the dire predictions is that a new start-up company that uses a lot of broadband – the next Netflix, Vine or Snapchat – won’t be able to gain the needed access from carriers, or that their access will be too expensive. Let me examine that conjecture:

  • Let me follow the flow of money that a start-up needs to spend to be on the web. Their largest direct cost is the cost of uploading their content onto the web through an ISP. The pricing for bulk access has always favored the bigger players, and it’s more expensive per unit today for a company that wants to upload a gigabyte per day compared to somebody that uploads a terabyte.
  • The normal web service doesn’t pay anything to then deliver their content to customers. Customers buy various speeds of download and use the product at will. Interestingly, it’s only the largest content providers that might run into issues without net neutrality. The big fights a few years ago on this issue were between Netflix and the largest ISPs. The Netflix volumes had grown so gigantic that the big ISPs wanted Netflix to somehow contribute to the big cost of electronics the ISPs were expending to distribute the service. The only way that there would be some cost to start-ups to terminate content would be if the ISPs somehow created some kind of access fee to get onto their network. But that sounds largely impractical. Bytes are bytes and they don’t exactly contain the name and billing address of the party that dumped the traffic on the web.
  • Some content like live video is a complicated web product. You can’t just dump it on the web at one location in the country and hope it maintains quality everywhere it ends up. There are already companies that act as the intermediary for streaming video to carry out the caching and other functions needed to maintain video quality. Even the big content providers like SlingTV don’t tackle this alone.
  • Finally, there will arise new vendors that will assist start-ups by aggregating their traffic with others. We already see that today with Amazon which is bundling the content of over 90 content providers on its video platform. The content providers benefit by taking advantage of the delivery mechanisms that Amazon has in place. This is obviously working and it’s hard to see how the end of net neutrality would stop somebody like Amazon from being a super-bundler. I think wholesalers like Amazon would fill the market gap for start-ups.

Paid Prioritization. The other big worry voiced by fans of Title II regulation is that it stops paid prioritization, or Internet fast lanes. There are both good and bad possible consequences of that.

  • It’s silly to pretend that we don’t already have significant paid prioritization – it’s called peering. The biggest content providers like Google, Netflix and Amazon have negotiated peering arrangements where they deliver traffic directly to ISPs in specific markets. The main benefits of this for the content providers is that it reduces latency and delay, but it also saves them from buying normal uploads into the open Internet. For example, instead of dumping content aimed at Comcast in Chicago onto the open web these big companies will directly deliver the Chicago-bound traffic to Comcast. These arrangements save money for both parties. And they are very much paid prioritization since smaller content providers have to instead route through the major Internet POPs.
  • On the customer side of the network, I can envision ISPs offering paid prioritization as a product to customers. Customer A may choose to have traffic for a medical monitoring company always get a priority, customer B might choose a gaming service and customer C might choose a VoIP connection. People have never had the option of choosing what broadband connections they value the most and I could see this being popular – if it really works.
  • And that leads into the last big concern. The big fear about paid prioritization is that any service that doesn’t have priority is going to suffer in quality. But will that really happen? I have a fairly good broadband connection at 60 Mbps. That connection can already deliver a lot of different things at the same time. Let’s say that Netflix decided to pay my ISP extra to get guaranteed priority to my house. That might improve my Netflix reception, although it already seems pretty good. But on my 60 Mbps connection would any other service really suffer if Netflix has priority? From what I understand about the routing of Internet traffic, any delays caused by such prioritization would be minuscule, probably in microseconds, which would be nearly imperceptible to me. I can already crash my Internet connection today if I try to download more content than it can handle at the same time. But as long as a customer isn’t doing that, I have a hard time seeing how prioritization will cause much of a problem – or even why somebody like Netflix would pay an ISP extra for it. They are already making sure they have a quality connection through peering and other network arrangements, and I have a hard time understanding how anything at the customer end of the transaction would make much difference. This could be important for those on slow broadband connections – but their primary problem is lack of broadband speed and they are already easily overwhelmed by too much simultaneous traffic.
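
To make the ‘microseconds’ intuition concrete, here is a minimal sketch of the worst-case wait a prioritized packet imposes on other traffic at a single queue on a 60 Mbps link; the packet size and queue depth are my assumptions:

```python
# Serialization delay on a 60 Mbps link: the time one full-size packet
# occupies the wire, which bounds the per-packet wait that prioritization
# imposes on lower-priority traffic at a single queue.

LINK_MBPS = 60
PACKET_BYTES = 1500  # typical Ethernet MTU (assumed)

delay_s = (PACKET_BYTES * 8) / (LINK_MBPS * 1_000_000)
print(f"{delay_s * 1e6:.0f} microseconds per packet")  # 200 µs

queue_depth = 50  # assumed burst of prioritized packets ahead of yours
print(f"{delay_s * queue_depth * 1000:.0f} ms behind a {queue_depth}-packet burst")  # 10 ms
```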

I am not as fearful of the end of net neutrality as many because I think the Internet operates differently than most people imagine. I truly have a hard time seeing how ending net neutrality will really change the way I receive broadband at my home. However, I do have big concerns about the end of Title II regulation and fear things like data caps and my ISP using my personal information. I think most folks’ real concern is about Title II regulation, but that’s too esoteric for most people, and we all seem to be using the term ‘network neutrality’ as a substitute for it.

The Need for Fiber Redundancy

I just read a short article that mentioned that 30,000 customers in Corvallis, Oregon lost broadband and cable service when a car struck a utility pole and cut a fiber. It took Comcast 23 hours to restore service. There is nothing unusual about this outage and such outages happen every day across the country. I’m not even sure why this incident made the news other than that the number of customers that lost service from a single incident was larger than normal.

But this incident points to the issue of network redundancy – the ability of a network to keep working after a fiber gets cut. Since broadband is now becoming a necessity and not just a nice-to-have thing we are going to be hearing a lot more about redundancy in the future.

Lack of redundancy can strike anywhere, in big cities or small – but the effects in rural areas can be incredibly devastating. A decade ago I worked with Cook County, Minnesota, a county in the far north of the state. The economy of the county is driven by recreation and they were interested in getting better broadband. But what drove them to get serious about finding a solution was an incident that knocked out broadband and telephone to the entire county for several days. The County has since built its own fiber network that includes redundant route diversity to the rest of the world.

We used to have this same concern about the telephone networks and smaller towns often got isolated from making or receiving calls when there was a cable cut. But as cellphones have become prevalent the cries about losing landline telephone have diminished. But the cries about lack of redundancy are back after communities suffer the kinds of outages just experienced by Corvallis. Local officials and the public want to know why our networks can’t be protected against these kinds of outages.

The simple answer is money. It often means building more fiber, and at a minimum it takes a lot more expensive electronics to create network redundancy. The way that redundancy works is simple – there must be separate fiber or electronic paths into an area in order to provide two broadband feeds. This can be created in two ways. On larger networks it’s created with fiber rings. In a ring configuration two sets of electronics are used to send every signal in both directions around the ring. In that configuration, when a fiber is cut the signal is still being received from the opposite direction. The other (and even more expensive) way to create diversity is to lay two separate fiber networks to reach a given location.
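
The ring logic is easy to demonstrate. This toy sketch (with hypothetical node names) removes each span of a four-node ring in turn and confirms that every node can still reach the hub the other way around:

```python
# Toy demonstration of ring redundancy: any single cut leaves a path.
ring = [("hub", "A"), ("A", "B"), ("B", "C"), ("C", "hub")]

def reachable(spans, start="hub"):
    """Set of nodes reachable from `start` over undirected spans."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for a, b in spans:
            nxt = b if a == node else a if b == node else None
            if nxt and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

for cut in ring:
    still_up = reachable([span for span in ring if span != cut])
    print(f"cut {cut}: reachable = {sorted(still_up)}")
# Every single cut still reaches all four nodes; two simultaneous cuts
# would isolate whatever sits between the breaks.
```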

Route redundancy tends to diminish as a network gets closer to customers. In the US we have many different types of fiber networks. The long-haul fiber networks that connect the NFL cities are largely on rings. From the major cities there are then regional fiber networks that are built to reach surrounding communities. Some of these networks are also on fiber rings, but a surprising number are not and face the same kind of outages that Cook County had. Finally, there are local networks built of fiber, telephone copper, or coaxial cable that are built to get to customers. It’s rare to see route diversity at the local level.

But redundancy can be added anywhere in the network, at a cost. For example, it is not unusual for large businesses to seek local route diversity. They most often achieve this by buying broadband from more than one provider. But sometimes this doesn’t work if those providers are sharing the same poles to reach the business. I’ve also seen fiber providers create a local ring for large businesses willing to pay the high price for redundancy. But most of the last mile that we all live and work on has no protection. We are always one local disaster away from losing service like happened in Corvallis.

But the Corvallis outage was not an outage where a cut wire knocked out a dozen homes on a street. The fiber that got cut was obviously one that was being used to provide coverage to a wide area. A lot of my clients would not design a network where an outage could affect so many customers. If they served a town the size of Corvallis they would build some local rings to significantly reduce the number of customers that could be knocked out by an outage.

But the big ISPs like Comcast have taken shortcuts over the years and they have not spent the money to build local rings. But I am not singling out Comcast here because I think this is largely true of all of the big ISPs.

The consequences of a fiber cut like the one in Corvallis are huge. That outage had to include numerous businesses that lost their broadband connection for a day – and many businesses today cannot function without broadband. Businesses that are run out of homes lost service. And the cut disrupted homework, training, shopping, medical monitoring, security alarms, banking – you name it – for 30,000 homes and businesses.

There is no easy fix for this, but as broadband continues to become essential in our lives these kinds of outages are going to become less acceptable. We are going to start to hear people, businesses, and local governments shouting for better network redundancy, just as Cook County did a decade ago. And that clamor is going to drive some of these communities to seek their own fiber solution to protect from the economic devastation that can come with even moderate network outages. And to some degree, if this happens the carriers will have brought this upon themselves due to pinching pennies and not making redundancy a higher priority in network design.

Shaking Up the FTTP Industry

Every once in a while I see something in the equipment market that surprises me. One of my clients recently got pricing for building a gigabit PON FTTP network from the Chinese company ZTE. The pricing is far under the market price for other brands of equipment, and it makes me wonder if this is not going to put downward price pressure on the rest of the industry.

There are two primary sets of electronics in a PON network – the OLT and ONTs. The OLT (Optical Line Terminal) is a centrally located piece of equipment that originates the laser signal headed towards customers. The OLT is basically a big bay of lasers that talk to customers. The ONT (Optical Network Terminal) is the device that sits at a customer location that has the matching laser that talks back to the OLT.
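
Since one OLT laser is shared among many homes, the split ratio sets the worst-case bandwidth per customer. A minimal sketch of that arithmetic, assuming the common GPON downstream rate of roughly 2.4 Gbps:

```python
# Shared-bandwidth arithmetic for one PON port (GPON figures assumed).
DOWNSTREAM_MBPS = 2400  # ~2.4 Gbps GPON downstream, shared by each split

for split in (16, 32, 64):
    print(f"1:{split} split -> {DOWNSTREAM_MBPS / split:.0f} Mbps per home at full load")
# 1:32 -> 75 Mbps each if every home peaks at once; in practice homes
# rarely do, which is why oversubscription works.
```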

ZTE’s pricing is industry shaking. They have priced OLTs at almost a third of the price of their competition. They have been able to do this partially by improving the OLT cards that hold the lasers and each of their cards can connect to twice as many customers as other OLTs. This makes the OLT smaller and more energy efficient. But that alone cannot account for the discount and their pricing is obviously aimed at gaining a foothold in the US market.

The ONT pricing is even more striking. They offer a gigabit Ethernet-only indoor ONT for $45. That price is so low that it almost turns the ONT into a throwaway item. This is a very plain ONT. It has one Ethernet port and does not have any way to connect to existing inside wiring for telephone or cable TV. It’s clearly meant to work with WiFi at the customer end to deliver all services. The pricing is made even more affordable by lower-than-normal industry prices for the software needed to activate and maintain the equipment in future years.

This pricing is going to lead companies to reexamine their planned network design. A lot of service providers still use traditional ONTs that contain multiple Ethernet ports and that also have ports for connection to both telephone copper and cable company coaxial wiring. But those ONTs are still relatively expensive and the most recent quotes I’ve seen put these between $200 and $220.
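
Here’s a hedged per-home cost comparison of the two designs. The $45 and $200–$220 ONT figures come from the quotes above; the router and voice-adapter prices are illustrative assumptions:

```python
# Per-home electronics cost under the two ONT approaches (sketch only).
traditional_ont = 210    # midpoint of the $200-$220 quotes above
ethernet_only_ont = 45   # ZTE's gigabit Ethernet-only ONT
wifi_router = 60         # assumed; typically deployed in either design
voip_adapter = 30        # assumed; only needed if voice is offered

print(f"Traditional ONT + router: ${traditional_ont + wifi_router}")  # $270
print(f"Ethernet-only ONT + router + voice adapter: "
      f"${ethernet_only_ont + wifi_router + voip_adapter}")           # $135
# Since the router appears in both designs, the real saving is the ONT
# swap itself: $210 vs $75 including a voice adapter, roughly a third.
```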

Using an Ethernet-only ONT means dumping the bandwidth into a WiFi router and using that for all services. That means having to use voice adapters to provide telephone service, similar to what’s been used by VoIP providers for years. But these days I have clients that are launching fiber networks without a voice product, and even if they want to support VoIP the adapters are relatively inexpensive. This network design also means delivering only IPTV if there is a cable product and this ONT could not be used with older analog-based cable headends.

ZTE is an interesting company. They are huge in China and are a $17 billion company. They make a lot of cellphones, which is their primary product line. But they also make many different kinds of telecom gear like this PON equipment. They claim their FTTP equipment is widely used in China and that they have more FTTP customers connected than most US-based vendors.

This blog is not a blanket endorsement of the company. They have a questionable past. They have been accused of bribery in making sales in Norway and the Philippines. They also were fined by the US Commerce Department for selling technology to North Korea and Iran, both under sanctions. And to the best of my knowledge they are just now trying to crack into the US market, which is always something to consider.

But this kind of drop in FTTP pricing has been needed. It is surprising that OLTs and ONTs from other manufacturers still cost basically the same as they did years ago. We generally expect that as electronics are mass produced the prices will drop, but we have never seen this in the PON market. One can hope that this kind of pricing will shake other manufacturers into sharpening their pencils. Larger fiber ISPs already get pricing cheaper than what I mentioned above on today’s equipment. But most of my clients are relatively small and have little negotiating power with equipment vendors. I hope this shakes the industry a bit – something that’s needed if we want to deploy fiber everywhere.