Decommissioning Rural Copper, Part 2

In the last blog I wrote about my belief that AT&T and Verizon want out of the rural wireline business. They both have plans to largely walk away from their rural copper networks and replace landline copper services with cellular service. Today I want to talk about what regulators ought to do with those networks.

When these two giant telcos walk away from rural copper they will inevitably harm rural America. While many homes will get the ‘privilege’ of buying high-priced cellular-based broadband, other homes are going to find themselves without telephone service if they happen to live in one of the many cellular dead zones. Such homes will not only be unable to benefit from cellular broadband – with poor cell service they will be cut off from voice communications as well.

As somebody who has traveled extensively in rural America I can tell you that there are a lot more cellular dead zones than people realize. And it’s not only farms – there are county seats in rural America where it’s difficult to get a working cellphone signal inside of buildings.

As part of this transition both companies are going to walk away from a huge amount of existing copper cable. I think this copper cable is an incredibly valuable asset and that regulators ought not to allow them to tear it down.

The copper wire network today goes almost everywhere in rural America. Congressional laws and FCC policies led to most homes in the country getting access to the copper network. These copper wires occupy a valuable space on existing telephone poles – on the majority of rural poles the only two wires are the power lines at the top and the telephone wires at the bottom.

If these copper wires are kept in place they could greatly reduce the cost of building rural fiber. It is far cheaper when building fiber to ‘lash’ the fiber onto an existing set of cables than to hang fiber from scratch. It was this construction technique that allowed Verizon to build a lot of its FiOS fiber network – they lashed fiber onto existing telephone wires. And my guess is that when Verizon decommissions urban copper they are still going to leave a lot of the copper wires in place as a guidewire for their fiber.

If these telcos are going to walk away from these copper wires, then they ought to be required to keep them in place for use by somebody else to hang fiber. Many states might force the big telcos to tear down the copper wires since they will eventually create safety hazards as they break away from poles if they aren’t maintained. But if somebody else is willing to take over that maintenance then it shouldn’t be an issue.

I can picture a regulatory process whereby some other carrier is allowed to come in and ‘claim’ the abandoned wires once they are empty of customers. That would allow fiber overbuilders or rural communities to claim this copper as an asset.

There is some salvage value to copper wires, and it’s possible, though not probable, that the value of the copper could exceed the cost to tear it down. So I can see the telcos fighting such an idea as a confiscation of their assets. But these rural wires have been fully depreciated for decades and the telcos have earned back the cost of these copper lines many times over. I believe that by abandoning the wires and depriving some homes of wireline service the big telcos will have forfeited any rights they might have to the remaining assets.

Anybody claiming the abandoned copper could use it in two ways. First, in many cases there is still life left in the copper, as witnessed by Frontier and CenturyLink rehabbing old rural copper with upgraded DSL. Local communities or small carriers could use the copper to bring the better services that the big telcos have refused to provide over the last few decades.

But more importantly these wires represent the cheapest path forward for building rural fiber. Anybody taking over the old copper can save a lot of fiber construction costs by lashing fiber onto the existing copper. If our nationwide goal is really to get better broadband to rural America, then offering abandoned copper to fiber builders might be one of the easiest tools available to help the process along.

The big telcos abandoned rural America decades ago. They stopped doing routine maintenance on rural copper and slashed the number of rural technicians. They now want to walk away from that copper and instead force rural America to buy cellular services at inflated prices. We owe it to the folks who paid for this copper many times over to get some benefit from it and to offer an alternative to the new rural cellular monopolies.

Decommissioning Rural Copper

I’ve been watching AT&T and Verizon since I’ve been in the industry (including a short stint at Southwestern Bell in the early 80s). We are about to see both of these companies unravel their rural telco properties.

Verizon got ahead of the curve and has been selling off rural properties for a few decades, many of which ended up with Frontier. Verizon still serves some rural areas but has probably shed half of its rural customers. There are still big swaths of rural Verizon customers in Pennsylvania, New York, Maryland and other northeastern states. Verizon benefited from these sell-offs by selling completely depreciated and poorly maintained networks at high prices – as evidenced by how much Frontier is struggling to cover its massive debts. AT&T has sold almost no rural properties and still serves gigantic rural areas in dozens of states.

Both companies are clearly on a path to tear down the remaining rural copper networks and replace them with cellular wireless networks. There are both pros and cons for these transitions for rural customers.

On the plus side, many of these rural areas have never had broadband since these big telcos never extended their DSL to their rural service areas. We know that they could have extended DSL, because we have hundreds of examples of independent telephone companies that brought DSL to all of their customers, no matter how remote. But the big companies stopped spending money on rural properties decades ago. The remaining copper is now in terrible shape, and one has to imagine that cellular voice is often as good as or better than voice over these old copper lines.

There will now be many customers who can buy fixed cellular broadband. This uses the same frequencies as smartphone broadband, but the cellular companies are pricing it to be a little less expensive. For many households fixed-cellular broadband will be the first real broadband alternative they have ever had.

But there are also big downsides to this shift from old copper to cellular networks. First, cellular networks are effective for only a few miles from any given cell site. Anybody who has driven in rural America knows that there are cellular dead spaces everywhere. Any customers living in the cellular dead spaces are going to be left with no communications to the outside world. They’ll lose their copper and they won’t have cellular voice or data. This will be a huge step backwards for many homes.

The big telcos will be taking advantage of the fact that, as cellular providers, they have no obligation to try to serve everybody. One of the reasons that we had nearly ubiquitous telephone coverage in this country is that telcos were the carriers of last resort in their service areas. They were required by law to extend telephone service to all but extremely remote customers. But that obligation doesn’t apply to a cellular carrier. We already have tons of evidence that the cellular carriers make no apologies to households that happen to be out of range of their cellular towers. With no copper landlines left we will have rural communications dead zones. It will be hard for anybody living in these dead zones to stay there, and certainly nobody is going to build a new home in a place that doesn’t have cellular service.

There is a downside even for those households that get fixed-cellular broadband. The speeds on this service are going to be slow by today’s standards, in the range of 10 – 15 Mbps for those that live relatively close to a cellular tower, but considerably slower for customers at greater distances. The real downside to getting cellular data is that the speeds are not likely to get better in rural America for many years, even decades. The whole industry is abuzz with talk about 5G cellular making a big difference, but it’s hard to see that technology making much impact in rural areas.

I think this transition away from copper is going to catch a lot of rural people by surprise. These two big telcos have already started the process of decommissioning copper, and once that gets full FCC approval the pace of decommissioning is likely to accelerate. I think a lot of homes are going to be surprised when they find out that the telcos no longer have an obligation to serve them.

Regulating Online Video Content

Recently the Kommission für Zulassung und Aufsicht der Medienanstalten (ZAK) – the German equivalent of our FCC – concluded that OTT services ought to be regulated the same way as other broadcast radio and television networks. Specifically they were looking at Twitch.tv, the web gaming service, but the ruling could have far-reaching consequences.

I think the ruling raises two questions. First, should any regulatory body be regulating video content on the Internet? Second, why are we still heavily regulating cable TV?

The European press is lambasting the order as nothing more than a money grab. One of the benefits of regulating anything is charging fees for that regulation. Like many regulatory bodies around the world, the ZAK is largely funded by fees charged to the companies that it regulates (which is largely true for the FCC as well). This means that regulators have a perverse incentive to regulate things, even if they don’t need to be regulated.

The idea of regulating a worldwide web ‘channel’ like a TV station is absurd. For those of you that may not know about Twitch.tv, it’s the primary gaming network for worldwide gamers. It’s owned by Amazon. It’s a huge platform and works like YouTube where over 17,000 ‘partners’ post gaming content into ‘channels.’ The platform averages 625,000 simultaneous viewers at any given time, making it one of the most popular web platforms in the world.

So regulating Twitch.tv would be the same as regulating YouTube. It’s a platform where virtually all of its content is created by others. Other than extracting fees from the platform for the privilege of regulating it, it’s hard to understand what else the ZAK could regulate. Twitch.tv and YouTube are open platforms and only function because they allow anybody to post content. Both platforms will take down offensive content or content that violates copyrights if they are asked to do so. But the platforms, by definition of the way they operate, have no control over the content that is posted. I’m at a total loss as to what the ZAK thinks they can regulate.

You also have to wonder how effective any regulation would be. There are a huge number of smaller web platforms that might fall into the same category as Twitch.tv. It’s hard to imagine anybody being able to launch a new platform if they are expected to comply with different rules in a hundred countries. But it’s also hard to envision the ZAK doing anything other than trying to ban from the whole country any platform that refuses to comply with their regulations. I don’t think the ZAK understands the political ramifications of banning a platform used by all the young tech-savvy programmers (and hackers) in their country!

But thinking about this makes me ask why we are still regulating cable companies in the US. There is a slew of FCC rules that dictate things like channel line-ups. It’s FCC rules that force cable companies to still offer basic, expanded basic, and premium tiers of service. It’s now pretty clear that few consumers are happy with this structure. The average household only watches about a dozen channels monthly regardless of the size of the tiers they purchase. It is the requirement for these tiers that has allowed the programmers to force programs onto cable companies that they don’t really want.

It is the cable tiers that have forced up the price of cable. Households pay huge monthly bills to watch a dozen channels – all because the regulations force channel line-ups that contain a hundred or more channels that the household isn’t interested in.

And cable companies are now competing against companies that don’t have these same restraints. Companies like SlingTV can put together any channel line-up they want with no regulatory constraints telling them what they can or can’t offer. Surveys have always shown that people would rather buy just those channels that they want to watch. And yet cable companies in the US are not allowed to compete head-on with OTT providers.

It would be easy to blame the FCC for not keeping up with the times. However, the most draconian cable rules come directly from Congress and the FCC’s hands are tied from deviating from rules that are embedded in law. We are now at a time when we really need to consider these old rules. The cable companies are being forced to sell programming that customers don’t want to pay for. The whole industry would benefit if cable companies were free to pursue packages that people actually want to buy. Freeing up all video providers to offer what customers want is a far better solution than trying to drag web companies into becoming regulated cable companies.

How Much Speed Do We Really Need?

There is a lot of buzz floating around the industry that the FCC might lower the official definition of broadband from the current 25 Mbps down and 3 Mbps up. Two of the current FCC commissioners, including the chairman, opposed setting that definition a few years back. Lowering the speeds would let the FCC off the hook for its legal requirement to make sure that the whole country can get broadband. If they lower the definition, then voila, millions more Americans would be declared to have adequate broadband.

So today I thought I’d take a look at the download speeds we really need at our homes. You may recall that back when the FCC set the 25/3 Mbps definition that they made a list of the broadband speed needed to do typical activities. And in doing so they tried to create profiles of some typical American households. That attempt was awkward, but it was a good starting point for examining household bandwidth needs. I’m updating their list a bit for things that people do today, which is already different than just a few years ago. Consider the following web activities:

  • Web Background 5 Mbps
  • Web Browsing 1 – 2 Mbps
  • Online Class 1 – 2 Mbps
  • Social Media 1 – 2 Mbps
  • Streaming Music 3 Mbps
  • Voice over IP 2 Mbps
  • SD Video stream 1 – 3 Mbps
  • HD Video Stream 4 – 6 Mbps
  • 4K Video Stream 15 – 20 Mbps
  • Gaming 1 – 3 Mbps
  • Skype / Video Conference 1 – 3 Mbps
  • Big File Downloader 50 Mbps

Not everybody will agree with all of these listed speeds, because there are no standards for how the web works. For example, by using different compression schemes a video stream from Netflix is not identical to one from Amazon. And even from one source there is variation, since an action movie takes more bandwidth than something like a stand-up comedy routine.

It’s important to remember that broadband demand can come from any device in your house – desktop, laptop, smartphone, tablet, etc. It’s also important to note that these are speed requirements for a single user. If two people in the house are watching separate videos, then you have to double the numbers above.

What the FCC failed to consider back when they set the speed definition is that households need enough bandwidth to handle the busiest times of the day. What matters is the number of activities a home can do on the web simultaneously, with most families being busiest in the evenings. There might be somebody on social media and somebody watching an HD movie, while somebody else is doing homework while also using a smartphone to swap pictures.
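To make that evening scenario concrete, here is a minimal sketch that totals simultaneous demand using the upper ends of the per-activity estimates from the list above (the photo-sharing figure is my own assumption, since phone photo syncing isn’t on the list):

```python
# Per-activity download estimates in Mbps, taken from the upper ends
# of the ranges listed above.
ACTIVITY_MBPS = {
    "web background": 5,
    "social media": 2,
    "hd video": 6,
    "online class": 2,
    "photo sharing": 2,  # assumed figure, not from the list above
}

def peak_demand(activities):
    """Total download speed (Mbps) needed to run activities simultaneously."""
    return sum(ACTIVITY_MBPS[a] for a in activities)

busy_evening = ["web background", "social media", "hd video",
                "online class", "photo sharing"]
print(peak_demand(busy_evening))  # 17 Mbps, before any router or WiFi overhead
```

Even this modest evening already approaches the 25 Mbps definition once you add the overhead factors discussed below – and it assumes only one video stream.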

There is another issue to consider when trying to do simultaneous tasks on the Internet – packet loss. The connection between the ISP and a customer gets more congested when it’s trying to process multiple data streams at the same time. Engineers describe this as packet collision – which sounds like some kind of bumper-car ride – but it’s an apt way to describe the phenomenon. Most home routers are not sophisticated enough to handle many streams at once. Packets get misdirected or lost, and the router asks the originator to resend the missing packets. The busier the router, the more packet interference. This is also sometimes called ‘overhead’ in the industry, and this overhead can easily grow to 15% or more of the total traffic on a busy connection – meaning it takes 15% more bandwidth to complete a task than if that task was the only thing occurring on the broadband connection.
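As a back-of-the-envelope sketch, that 15% figure simply grosses up the bandwidth a task consumes (the function and the numbers are illustrative, not a measured model of any particular router):

```python
def bandwidth_with_overhead(task_mbps, overhead=0.15):
    """Bandwidth actually consumed once lost and retransmitted packets
    are counted; overhead is the fraction of extra traffic on a busy link."""
    return task_mbps * (1 + overhead)

# A 6 Mbps HD stream on a congested connection really consumes:
print(round(bandwidth_with_overhead(6), 1))  # 6.9 Mbps
```

Apply the same factor to every stream running at once and a household’s real need climbs well past the simple sum of its activities.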

There is another kind of interference in homes that have a WiFi network, and it has to do with the way that WiFi works. When a WiFi network gets multiple requests for service – meaning that many devices in the home are asking for packets – the WiFi router easily gets overwhelmed and shuts down. It then reinitiates and sends packets to the first device that gets its attention. In a busy network environment the WiFi router will shut down and restart constantly as it tries to satisfy the many devices that need attention. This behavior was designed into the WiFi specification as a way to ensure that WiFi could satisfy the needs of multiple devices. This WiFi overhead can also easily add 15% or more to the network demand.

Anybody who lives in a home with active users understands how networks can get overwhelmed. How many of you have been frustrated trying to watch a movie when others in the house are using the Internet? Even big bandwidth can be overwhelmed. I have a friend who has a 100 Mbps fiber connection on Verizon FiOS. He went to watch a video and it wouldn’t stream. He found that his two teenage sons were each using half a dozen gaming streams at the same time and had basically exhausted his fast bandwidth pipe.

The FCC can tinker with the official definition of broadband since that is their prerogative. But what they can’t do is define for any given home how much bandwidth it really needs. The funny thing is that the big ISPs all understand this issue. The cable companies have unilaterally increased speeds across the board to urban customers several times in recent years, and in most markets offer speeds considerably faster than the current FCC definition of broadband. These ISPs know that if they were only delivering 25 Mbps they would be overwhelmed with customers complaining about the connection. Those complaints are the real proof of how much bandwidth many homes need. If the FCC lowers the definition of broadband then they have blinders on and are ignoring how homes really use broadband today. If they lower the speed definition it’s hard to see it as anything other than a political move.

Another Comcast Bundle

Comcast just announced that they will be bundling solar panels with their other services in select markets. This adds to the already-largest bundle of products in the industry and is one that many competitors will have a hard time matching.

Comcast has been doing a trial with Sunrun, a solar panel maker from San Francisco. Comcast found during this test that their customer satisfaction and customer retention rates rose significantly with customers who bought the solar panels. Comcast has now entered into an exclusive 40-month marketing deal with the company. It’s been reported that Comcast will get 10% of Sunrun’s stock if they can install 60,000 solar customers. Comcast has committed to spend $10 million on sales and marketing for the solar panels and will get a share of the customer revenue from the product.

Sunrun currently has about 150,000 solar installations in 22 states. Comcast has over 27 million potential solar customers. The cable company also has over 1 million home automation customers, which Comcast believes will be their best market for the new solar product.

Even before this announcement Comcast had become a fierce competitor. Comcast’s CEO Brian Roberts recently said that as he looked around the industry he didn’t see any products of interest that the company doesn’t already have – a claim no other ISP can make.

This announcement follows on the heels of Comcast’s decision to get into the cellular business. They are now marketing in a few markets with prices lower than Verizon and AT&T and plan to eventually roll this out to their whole footprint. They also just bought a pile of spectrum that will help them increase margins on cellular service. Analysts say that over five years Comcast could capture as much as 30% of the cellphone business in their markets.

Comcast says it is tackling both of these product lines to reduce churn and to increase customer stickiness. They understand that long-time customers are their most profitable customers and they are putting together bundle options that ought to please a lot of households.

All of this effort looks to be paying off. Comcast is the only cable company that gained cable TV customers over the year ending with the most recent second quarter. They gained 120,000 customers while the rest of the industry is bleeding cable customers at an average rate of 2.5% of total customers per year. The bundles are probably not the only reason for that, but it’s hard to argue with this success.

Comcast has done a lot of other things to increase customer satisfaction. They created Comcast Labs (similar to Bell Labs), a group of scientists and engineers concentrated largely on developing products that improve the customer experience. This group developed the X1 settop box, which gets rave reviews from customers. It’s so popular that Comcast is now selling the box to other cable providers. The settop box has an ever-growing number of features and can be voice-activated. Comcast has also integrated Netflix and Sling TV into the settop box to keep customers on their box and platform.

Comcast has also found great success with their smart home product. This is probably the most robust such product on the market and includes such things as security and burglar alarms, smart thermostats, watering systems, smart blinds for energy control, security cameras, smart lights, smart door locks, etc. The product suite can be easily monitored from the settop box or from a smartphone app. The press release for the Sunrun announcement is the first time in a while that we’ve heard about their success and the million-plus customers using these products.

The company still has a lousy reputation for customer service and most of their customers dread having to call them. But they are supposedly putting a lot of money into making their customer service better. They recently began moving a lot of customer service back to the US, finally understanding that the cost savings of using foreign reps is not worth the customer dissatisfaction.

The flip side to making customers more sticky is that it makes it that much harder for a competitor to take their customers. Somebody buying a solar panel on a long-term payment plan is not likely to leave them for a competitor, particularly if there are financial penalties for doing so. Customers with a suite of home automation products become locked in unless they are willing to yank all of the monitors out and start over. Bit by bit Comcast is shielding their most lucrative customers from being poached by others.

Do We Really Need Gigabit Broadband?

I recently read an article in LightReading titled “All That’s Gigabit Doesn’t Glitter.” The article asks whether the industry really needs to make the leap to gigabit speeds. It argues that the industry has other options that can satisfy broadband demand, but that telco executives get hooked on the gigabit advertising and want to make the gigabit claim. Several of the points made by the article are thought-provoking and I thought today I’d dig deeper into a few of them.

The big question of course is whether telco providers need to be offering gigabit speeds, and it’s a great question. I live in a cord-cutter family and I figure that my download needs vary between 25 Mbps and 50 Mbps at any given time (look for a blog soon that demonstrates this requirement). I can picture homes with more than our three family members needing more, since the amount of download speed needed is largely a factor of the number of simultaneous downloads. And certainly there are people who work at home in data-intensive jobs that need far more than this.

There is no doubt that a gigabit is a lot more broadband than I need. If we look at my maximum usage need of 50 Mbps then a gigabit is 20 times more bandwidth capacity than I am likely to need. But I want to harken back to our broadband history to talk about the last time we saw a 20-fold increase in available bandwidth.

A lot of my readers are old enough to remember the agony of working on dial-up Internet. It could take as much as a minute at 56 kbps just to view a picture on the Internet. And we all remember the misery that came when you would start a software update at bedtime and pray that the signal didn’t get interrupted during the multi-hour download process.

But then along came 1 Mbps DSL. This felt like nirvana and it was 20 times faster than dial-up. We were all so excited to get a T1 to our homes. And as millions quickly upgraded to the new technology the services on the web upped their game. Applications became more bandwidth-intensive, program downloads grew larger, and web sites were suddenly filled with pictures that you didn’t have to wait to see.

And it took a number of years for that 1 Mbps connection to be used to capacity. After all, this was a 20-fold increase in bandwidth and it took a long time until households began to download enough simultaneous things to use all of that bandwidth. But over time the demand for web broadband kept growing. As cable networks upgraded to DOCSIS 3.0 the web started to get full of video and eventually the 1 Mbps DSL connection felt as bad as dial-up a decade before.

And this is perhaps the major point the article misses – you can’t just look at today’s usage to pick the best technology. Since 1980 the download speed needed by the average household has doubled about every three years. There is no reason to think that growth is stopping, and so any technology that is adequate for a home today is going to feel sluggish in a decade and obsolete in two decades. We’ve now reached that point with older DSL and cable modems that have speeds under 10 Mbps.
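That three-year doubling compounds quickly. A quick sketch of the projection (the 25 Mbps starting point and the horizon years are illustrative assumptions, not data from the article):

```python
def projected_demand(base_mbps, years_out, doubling_years=3):
    """Project household demand assuming it doubles every doubling_years
    years, per the historical trend cited above."""
    return base_mbps * 2 ** (years_out / doubling_years)

# If an average home needs roughly 25 Mbps today, the same trend gives:
print(round(projected_demand(25, 10)))  # ~252 Mbps in a decade
print(round(projected_demand(25, 20)))  # ~2540 Mbps in two decades
```

Under those assumptions an average home crosses the gigabit line in well under two decades – which is the argument for gigabit-capable networks even if nobody needs a gigabit today.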

The other point made by the article is that there are technology steps between today’s technology and gigabit speeds. There are improved DSL technologies and G.Fast that could get another decade out of embedded copper and could be competitive today.

But it’s obvious that the bigger telcos don’t want to invest in copper. I get the impression that if AT&T found an easy path to walk away from all copper they’d do so in a heartbeat. And none of the big companies have done a good job of maintaining copper and most of it is in miserable shape. So these companies are not going to be investing in G.Fast, although as a fiber-to-the-curb technology it would be a great first step towards modernizing their networks to be all-fiber. CenturyLink, AT&T and others are considering G.Fast as a technology to boost the speeds in large apartment buildings, but none of them are giving any serious consideration of upgrading residential copper plant.

It’s also worth noting that not all companies with fiber bit on the gigabit hype. Verizon always had fast products on FiOS and for many years offered the fastest speed in the industry at 250 Mbps. They only recently decided to offer a gigabit product.

And this circles back to the question of whether homes need gigabit speeds. The answer is clearly no, and almost everybody offering a gigabit product will tell you that it’s still largely a marketing gimmick. Almost any home that buys a gigabit would have almost the same experience on a fiber-based 100 Mbps product with low fiber latency.

But there are no reasonable technologies in between telephone copper and fiber. No new overbuilder or telco is going to build a coaxial cable network and so there is no other choice than building fiber. While we might not need gigabit speeds today for most homes, give us a decade or two and most homes will grow into that speed, just as we grew from dial-up to DSL. The gigabit speed marketing is really not much different than the marketing of DSL when it first came out. My conclusion after thinking about this is that we don’t need gigabit speeds, but we do need gigabit capable networks – and that is not hype.

What’s the Next FTTP Technology?

There is a lot of debate within the industry about the direction of the next generation of last mile fiber technology. There are three possible technologies that might be adopted as the preferred next generation of electronics – NG-PON2, XGS-PON or active Ethernet. All of these technologies are capable of delivering 10 Gbps streams to customers.

Everybody agrees that the current widely deployed GPON is starting to get a little frayed around the edges. That technology delivers 2.4 Gbps downstream and 1 Gbps upstream for up to 32 customers, although most networks I work with are configured to serve 16 customers at most. All the engineers I talk to think this is still adequate technology for residential customers and I’ve never heard of a neighborhood PON being maxed out for bandwidth. But many ISPs already use something different for larger business customers that demand more bandwidth than a PON can deliver.
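To see why a neighborhood PON rarely gets maxed out, here is the worst-case arithmetic – the shared downstream divided across the split, assuming (unrealistically) that every customer on the PON downloads at full speed at the same moment:

```python
GPON_DOWNSTREAM_MBPS = 2400  # the shared 2.4 Gbps per GPON

def worst_case_share(split):
    """Guaranteed per-customer downstream (Mbps) if everyone peaks at once."""
    return GPON_DOWNSTREAM_MBPS / split

print(worst_case_share(32))  # 75.0 Mbps at the maximum 32-way split
print(worst_case_share(16))  # 150.0 Mbps at the 16-way split most networks use
```

Since real customers almost never peak simultaneously, the typical experience is far better than the worst case – and halving the split doubles the guaranteed share, which is the headroom trick ISPs lean on.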

The GPON technology is over a decade old, which generally is a signal to the industry to look for the next-generation replacement. This pressure usually starts with vendors who want to make money pushing the latest and greatest new technology – and this time it’s no different. But after taking all of the vendor hype out of the equation, it’s always been the case that a new technology is only accepted once it achieves an industry-wide economy of scale. And that almost always means being accepted by at least one large ISP. There are a few exceptions to this, like what happened with the first generation of telephone smart switches that found success with small telcos and CLECs first – but most technologies go nowhere until a vendor is able to mass-manufacture units to get the costs down.

The most talked about technology is NG-PON2 (next generation passive optical network). This technology works by having tunable lasers that can function at several different light frequencies. This would allow more than one PON to be transmitted simultaneously over the same fiber, but at different wavelengths. But that makes this a complex technology and the key issue is if this can ever be manufactured at price points that can match other alternatives.

The only major proponent of NG-PON2 today is Verizon which recently did a field trial to test the interoperability of several different vendors including Adtran, Calix, Broadcom, Cortina Access and Ericsson. Verizon seems to be touting the technology, but there is some doubt if they alone can drag the rest of the industry along. Verizon seems enamored with the idea of using the technology to provide bandwidth for the small cell sites needed for a 5G network. But the company is not building much new residential fiber. They announced they would be building a broadband network in Boston, which would be their first new construction in years, but there is speculation that a lot of that deployment will use wireless 60 GHz radios instead of fiber for the last mile.

The big question is whether Verizon can create an economy of scale to get prices down for NG-PON2. The whole industry agrees that NG-PON2 is the best technical solution because it can deliver 40 Gbps to a PON while also allowing great flexibility in assigning different customers to different wavelengths. But the best technical solution is not always the winning solution, and the concern for most of the industry is cost. Today the early NG-PON2 electronics are priced at 3 – 4 times the cost of GPON, due in part to the complexity of the technology, but also to the lack of economy of scale without any major purchaser of the technology.

Some of the other big fiber ISPs, like AT&T and Vodafone, have been evaluating XGS-PON. This technology can deliver 10 Gbps downstream and 2.5 Gbps upstream – a big step up in bandwidth over GPON. The major advantage of the technology is that it uses a fixed laser, which is far less complex and costly. And these two companies are building a lot more FTTH network than Verizon.
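To put the capacity figures above side by side, here is a minimal sketch of the raw downstream bandwidth each PON flavor leaves per home. The 32-way split ratio is an assumption for illustration, not a figure from any vendor – real networks also oversubscribe, since not every home peaks at once.

```python
# Nominal aggregate downstream capacity per PON, in Gbps.
# GPON ~2.4 Gbps; XGS-PON 10 Gbps; NG-PON2 four 10 Gbps wavelengths.
TECHNOLOGIES = {
    "GPON": 2.4,
    "XGS-PON": 10.0,
    "NG-PON2": 40.0,
}

def gbps_per_subscriber(technology: str, split_ratio: int = 32) -> float:
    """Raw downstream capacity per home if the PON is fully loaded."""
    return TECHNOLOGIES[technology] / split_ratio

for tech in TECHNOLOGIES:
    mbps = gbps_per_subscriber(tech) * 1000
    print(f"{tech}: {mbps:.0f} Mbps per home at a 32-way split")
```

At a 32-way split GPON works out to 75 Mbps of raw capacity per home, which is why it still comfortably outruns today's demand, while NG-PON2 leaves over a gigabit per home.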

And while all of this technology is being discussed, ISPs today are already delivering 10 Gbps data pipes to customers using active Ethernet (AON) technology. For example, US Internet in Minneapolis has been offering 10 Gbps residential service for several years. Active Ethernet uses lower-cost electronics than most PON technologies, but can still have higher costs than GPON because it requires a dedicated pair of lasers – one at the core and one at the customer site – for each customer. A PON network instead uses one core laser to serve multiple customers.

It may be a number of years until this is resolved because most ISPs building FTTH networks are still happily buying and installing GPON. One ISP client told me that they are not worried about GPON becoming obsolete because they could double the capacity of their network at any time by simply cutting the number of customers on a neighborhood PON in half. That would mean installing more cards in the core without having to upgrade customer electronics.

From what everybody tells me, GPON networks are not experiencing any serious problems. But it's obvious, as household demand for broadband keeps doubling every three years, that the day will come when these networks experience congestion. But creative solutions like splitting the PON could keep GPON working well for a decade or two. And that might make GPON the preferred technology for a long time, regardless of the vendors' strong desire to get everybody to pay to upgrade existing networks.
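The split-the-PON arithmetic is worth spelling out: if per-home demand doubles every three years, then halving the number of homes on a PON doubles the capacity per home, which buys roughly one more doubling period (about three years). The starting demand figure below is an assumption purely for illustration.

```python
import math

def years_until_congested(capacity_mbps_per_home: float,
                          demand_mbps_today: float,
                          doubling_years: float = 3.0) -> float:
    """Years until per-home peak demand grows to meet per-home capacity."""
    doublings = math.log2(capacity_mbps_per_home / demand_mbps_today)
    return doublings * doubling_years

# GPON at a 32-way split: 2400 Mbps / 32 homes = 75 Mbps per home
print(years_until_congested(75, 10))    # ~8.7 years at an assumed 10 Mbps peak
# After splitting to 16 homes per PON: 150 Mbps per home
print(years_until_congested(150, 10))   # ~11.7 years - one extra doubling period
```

Each halving of the split buys exactly one doubling period of headroom, which is why an ISP can defer a technology upgrade simply by adding cards in the core.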

The Louisville Pole Attachment Lawsuit

There has been a major legislative push lately to make it easier for wireless companies to get onto poles in order to deploy the small cell sites needed for 5G. AT&T and Verizon have been leading the fight for easier access, and there have been attempts at both the federal and state level to enact ‘one-touch’ rules. The proposed legislation not only sets a low price for compensating pole owners, but also removes the ability of pole owners or municipalities to slow down wireless deployments.

There is a lot of debate in the industry about the one-touch issue. As I have discussed in various blogs, the difficulty of getting onto poles is still one of the major roadblocks to many fiber deployments. And judging from the examples cited by the cellular carriers, they are seeing huge delays in deploying urban small cell sites.

Like any debate there are legitimate issues to be considered on both sides. Proponents of one-touch cite the extraordinary costs of wading through the paperwork-heavy pole attachment process as well as the dollars-and-cents cost of delaying construction projects.

But on the other side are the pole owners and the carriers whose networks already hang on those poles. Carriers are legitimately worried about safety issues for their technicians if boxes the size of refrigerators can be hung on poles without constraint. They legitimately worry about how such devices could cause problems during repairs from storm damage. And carriers are also worried about network outages if a new attacher is allowed to move their wires without their knowledge or permission.

A court decision a few weeks ago might be a first step toward putting some clarity to the issue. In that suit AT&T had sued the City of Louisville to overturn its one-touch make-ready ordinance. The ordinance was aimed at making it easier for Google Fiber and other competitive providers to get onto poles in the city. The City of Louisville owns most of the poles in the city and has been working with Google Fiber to deploy a fiber network to everybody there.

You have to let the irony of AT&T’s lawsuit sink in for a minute. This is a company that is spending millions right now lobbying for one-touch rules. AT&T not only wants to deploy small cell sites, but they are also in the process of building a huge amount of fiber to support those sites. And yet AT&T felt compelled to fight against the very kind of ordinance they are promoting because it would help one of their competitors.

It turns out that not all one-touch ordinances are the same. The ordinances that AT&T and Verizon are pushing are crafted very carefully to help them while still not making it quite so easy for their competitors. The Louisville ordinance made it easier for any new attacher to get onto poles, including AT&T.

A US District Court judge in Kentucky completely rejected all of AT&T’s claims and tossed the lawsuit, basically finding that the claims in the suit were without merit. It’s ironic that many of the issues raised by the City in its defense sound the same as the claims AT&T makes elsewhere when lobbying for one-touch legislation.

I’ve always said that being in the regulatory department at AT&T has to be the hardest job in our industry. It’s a company that wears too many hats. AT&T owns a huge monopoly landline network and wants to protect itself from competitors. In some markets AT&T is a major pole owner. AT&T is also a huge wireless company that now wants access to poles. And AT&T is a huge builder of fiber, much of it now outside of its monopoly telco territory.

Any regulatory position the company takes to benefit one of these business lines is likely not to be in the best interest of other parts of the company. Looking at the big picture, one has to think that AT&T will get far more benefit than harm from one-touch rules. Such rules will make it a lot easier to build more fiber and to deploy cell sites. And yet a company with this many tentacles in the industry could not restrain itself from filing a lawsuit that probably was not in its own long-term interest. The monopoly side of the company felt it could not sit back and let a competitor like Google Fiber build without taking steps to slow them down.

The Worst Broadband in America

I recently read an article by Clare Malone of FiveThirtyEight titled, “The Worst Internet in America.” The article discussed Saguache County, Colorado, which was identified by researchers at the University of Iowa and Arizona State University as having the lowest broadband penetration in the US. Only 5.6% of households there have broadband that meets the FCC definition of 25 Mbps down / 3 Mbps up. It’s an article worth reading and highlights the problems caused by lack of broadband.

As you might imagine, it’s a rural farming community. Slow broadband has historically been offered by CenturyLink and Fairpoint, the two incumbent telcos serving the county. Like much of rural America, the county now has a WISP offering fixed wireless broadband. Ralph Abrams, the former mayor of Crestone, CO, founded the WISP in 2011 as a reaction to the poor DSL service in the county.

My main takeaway from the article is that this same article could be written about almost any pocket of rural customers in the country. We are a nation of broadband haves and have-nots. In most of rural America there is a clear line that defines who has broadband. If a county is lucky enough to have a cable TV company in some of its towns, there is always a place at the edge of town where the coaxial cable stops. The dividing line with DSL is always a little fuzzier, but there is always some distance from town where the DSL is too slow to be of any use – and that’s generally not more than a mile or two from a town.

People who live outside the broadband boundary have three options – satellite broadband, cellular broadband or no broadband. I have never met anybody who was satisfied with satellite broadband. Some of the services today deliver speeds as fast as 17 Mbps. But the satellite plans are expensive and have two major drawbacks. First are the small monthly data caps, which average around 10 gigabytes of downloaded data. Unlike cellphone plans, where you pay more for extra data, most satellite plans kick you off for the rest of the month when you hit your cap. And satellites have dreadful latency – as much as twenty times higher than fiber. Latency is a measure of the time delay for a data packet to reach a customer, and high latency means that real-time applications don’t work. With a high-latency connection you can’t make a phone call over the Internet. You can’t watch live-streaming video. You can’t connect to services that require real-time connections like online classes. And you can’t hold a connection to a corporate server to work at home.
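The latency problem is not a matter of sloppy engineering – it falls straight out of physics. A geostationary satellite sits about 35,786 km up, and every request-and-reply makes four trips across that gap at the speed of light. A quick back-of-envelope check:

```python
C_KM_PER_S = 299_792      # speed of light in vacuum, km/s
GEO_ALTITUDE_KM = 35_786  # geostationary orbit altitude above the equator

def geo_round_trip_ms() -> float:
    """Minimum physics-bound round-trip time through a geostationary satellite."""
    # up + down for the request, then up + down for the reply = 4 one-way hops
    return 4 * GEO_ALTITUDE_KM / C_KM_PER_S * 1000

print(f"{geo_round_trip_ms():.0f} ms minimum")  # ~477 ms before any processing delay
```

Since typical fiber round trips run in the tens of milliseconds, "twenty times higher" is, if anything, an understatement for geostationary service – and no amount of better electronics can fix it.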

And cellular data is no better. Rural customers use their cellphones as hot spots. Since cellular data speeds decrease with distance from a cell tower, rural customers are likely to get poor speeds on their cellphones if they can find a broadband connection at all. And unlike satellite broadband, the cellular companies will let you buy unlimited extra gigabytes of data – at a high price. I think US cellular data is probably the most expensive in the developed world, priced at $8 – $10 per gigabyte. I have talked to numerous rural households that pay $500 or more a month for cellphone data so that their kids can do homework.

Rural customers are all highly aware of the hot spots in their region and it’s not unusual to see cars gathered around a library, restaurant or other place that offers public WiFi. Folks drive school kids into town regularly to sit in the car and do homework. People trying to work at home must drive to a hotspot to send or retrieve big data files.

The article asks the same question I asked a few months ago – is broadband an American right? People have very strong opinions about this idea because of all the political overtones. But one only has to look back at our past to see other times when the US government thought that providing utilities in rural America was good for the country as a whole. There were major government programs to help push electricity into rural America, including cheap long-term loans for places that formed local cooperatives to get this done. The same thing happened with rural telephone service, and most of rural America got connected to the voice network.

And I ask myself why this is any different. We found ways to string poles and wires to farms for both electricity and telephone service. When you look at the cost of that effort adjusted for inflation, it’s hard to think that it was any cheaper to do this back then than it is to string fiber today. As a country we found a way to get electricity and telephone everywhere for the simple reason that we knew the whole country was better off when we didn’t leave parts of the economy behind. I have no idea if there was debate a century ago about whether electricity to farms was a right. But it seems it was obvious to the country that it was a necessity.

Two Tales on the Privacy Front

Protecting customer data has been in the news a lot recently and today I’m going to discuss two different news stories concerning the privacy of customer data.

The first story involves a case that will be decided soon by the U.S. Supreme Court. The case, Carpenter v. United States, contemplates the rules for how the government can access historical cellphone call records (and, one assumes, all other telecom records for calls and emails).

Without discussing all of the details of the case, the short version is that police had asked MetroPCS for the complete cellphone records of sixteen people suspected of robbing cellphone stores. MetroPCS supplied the details of all of the calls to and from each suspected cellphone as well as information about the location of the cell sites serving each phone for the duration of the calls. The legal question is whether this represented a warrantless search – specifically, as framed by government attorneys, “Whether the government’s acquisition, pursuant to a court order issued under 18 U.S.C. 2703(d), of historical cell-site records created and maintained by a cellular-service provider violates the Fourth Amendment rights of the individual customer to whom the records pertain.”

Recently fourteen companies including Google, Apple, Facebook, and Microsoft filed an amicus brief in the case that argues that the government is relying on outdated privacy laws from the 1970s that allow for the government to ask for telephone records without a warrant. Interestingly, Verizon joined in this argument.

Most small carriers are aware of this issue because local police often ask them for call records without a warrant. I can’t recall a time when a telco hasn’t responded to such requests, but I’ve talked to many companies that are often uncomfortable with the process. The fourteen companies get similar requests for call records, but also for email records, web search results and other kinds of customer information. They argue that such requests should only be made with a warrant that reflects some level of probable cause. Court experts are calling this the biggest Fourth Amendment case in years because it will consider the issues involved in searching digital records.

The second news story is a different take on privacy. The Electronic Privacy Information Center (EPIC) has asked the Federal Trade Commission (FTC) to investigate how Google tracks customers. Specifically they say that Google analyzes credit card data to understand the in-store shopping habits of customers. They then sell this data to retailers. EPIC is asking the FTC to investigate the actual practices being deployed as well as to provide some sort of mechanism for people to opt out of this kind of tracking program.

If the FTC takes up this investigation it could also be groundbreaking. This is the first case that specifically asks the government to create some boundaries for such tracking and to allow people to opt out of being tracked.

There are many companies other than Google that are now using ‘big data’ to compile detailed profiles of people. These profiles are marketed to vendors of products and services, but there is a great fear among privacy advocates that the same profiles can be used for nefarious purposes by governments and others. For instance, scam artists would probably love to know the identity of every household in the country where somebody suffers from early-stage dementia.

Anybody who is getting involved in selling smart home products needs to be concerned about these issues. Recently researchers Ming Jin, Ruoxi Jia and Costas Spanos of the University of California at Berkeley examined routine data collected by smart electric meters and were surprised at how much they could figure out about the occupants of a home from that data alone. For example, they were able to understand the patterns of when homes were occupied and could fairly easily tell when a given residence was empty.
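To see how little sophistication this kind of inference actually requires, here is a toy sketch – not the researchers' method, just an illustration with made-up readings and thresholds: flag any hour where electric usage sits near the home's idle baseline (refrigerator, standby loads) as likely unoccupied.

```python
def likely_unoccupied(hourly_kwh: list, baseline_kwh: float,
                      margin: float = 0.1) -> list:
    """True for each hour whose usage is within `margin` kWh of the idle baseline."""
    return [usage <= baseline_kwh + margin for usage in hourly_kwh]

# One simulated day: heavier overnight load (heat, sleep-time devices),
# near-idle draw while the family is out 8am-5pm, heavy evening usage.
day = [0.5] * 8 + [0.25] * 9 + [1.2] * 7
flags = likely_unoccupied(day, baseline_kwh=0.25)
print(sum(flags), "of 24 hours look unoccupied")
```

Even this crude threshold picks out the workday absence from a single day of hourly readings – which is exactly why aggregated smart-device data deserves real privacy protections.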

As we get more smart devices in homes the combination of the data collected by the various devices will be able to paint a detailed picture of the occupants of a home. This case could be the first step towards defining customer rights for control of their personal data.