AT&T’s Fiber Strategy

On the most recent earnings call with investors, AT&T's EVP and CFO John Stephens reported that AT&T has only 800,000 customers nationwide remaining on traditional DSL. That's down from 4.5 million DSL customers just four years ago. The company has clearly been working hard to migrate customers off the older technology.

The company has 15.8 million total broadband customers overall, including a net gain of 82,000 customers in the first quarter. That compares to net growth of only 114,000 customers for all of 2017. The company has obviously turned the corner and, after years of stagnant numbers, is adding broadband customers again. The overall count had barely moved for a long time – go back nearly a decade and the company had 15 million broadband customers, with 14 million of them on traditional DSL.

The 15 million customers not served by traditional DSL are served either by fiber-to-the-premises (FTTP) or fiber-to-the-node (FTTN) – the company doesn't disclose the number on each technology. AT&T's FTTN customers are served with newer DSL technologies that bond two copper pairs. This technology generally relies on relatively short copper drops of less than 3,000 feet and can deliver download speeds above 40 Mbps. AT&T still has a goal to pass 12.5 million potential customers with fiber by the end of 2019, with an eventual goal of passing around 14 million.
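
To put some rough math behind that claim, here's a quick sketch of how pair bonding clears the 40 Mbps mark – the single-pair rates in it are illustrative placeholders, not AT&T figures:

```python
# Rough sketch of why two bonded copper pairs can clear 40 Mbps on short drops.
# The single-pair rates below are illustrative placeholders, not AT&T data.

single_pair_rate_mbps = {1000: 60, 2000: 40, 3000: 25}  # assumed rate by drop length (feet)

def bonded_rate(drop_feet: int, pairs: int = 2, bonding_efficiency: float = 0.9) -> float:
    """Approximate bonded rate: pairs x single-pair rate, less some bonding overhead."""
    return single_pair_rate_mbps[drop_feet] * pairs * bonding_efficiency

for feet in sorted(single_pair_rate_mbps):
    print(f"{feet} ft drop: ~{bonded_rate(feet):.0f} Mbps with two bonded pairs")
# Even at the 3,000-foot limit, the bonded pair stays above 40 Mbps in this example.
```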

The AT&T fiber buildout differs drastically from Verizon's FiOS buildout. Verizon built to serve large contiguous neighborhoods to enable mass marketing. AT&T instead is concentrating on three different customer segments to reach its desired passings: building fiber to business corridors, building fiber to apartment complexes, and offering fiber to homes and businesses that are close to its many existing fiber nodes. Homes close enough to one of these nodes can get fiber while those only a block away probably can't. It's an interesting strategy that doesn't lend itself to mass marketing, which is probably why the press has not been flooded with stories of the company's fiber expansion. With this buildout strategy I assume the company has a highly targeted marketing effort that reaches out only to locations it can easily reach with fiber.

To a large degree AT&T's entire fiber strategy is one of cherry picking. They are staying disciplined and extending fiber to locations that are near their huge existing fiber networks, which were built to reach large businesses, cell sites, schools, etc. I work across the country and I've encountered small pockets of AT&T fiber customers in towns of all sizes. The cherry picking strategy makes it impossible to map their fiber footprint since it consists of an apartment complex here and a small cluster of homes there. Interestingly, when AT&T reports these various pockets they end up distorting the FCC's broadband maps, since those maps count a whole census block as having gigabit fiber speeds if even one customer in the block can actually get fiber.
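
To see how one fiber customer can color a whole census block, here's a minimal sketch of that block-level counting rule – the blocks and household counts are made up for illustration:

```python
# Minimal sketch of how block-level reporting inflates coverage numbers.
# The blocks, household counts and fiber counts below are made up for illustration.

blocks = [
    {"block": "A", "households": 400, "households_with_fiber": 1},
    {"block": "B", "households": 250, "households_with_fiber": 0},
    {"block": "C", "households": 350, "households_with_fiber": 30},
]

# Block-level counting: a block counts as served if even one location can get the service.
reported_covered = sum(b["households"] for b in blocks if b["households_with_fiber"] > 0)
actual_covered = sum(b["households_with_fiber"] for b in blocks)

print(f"Households reported as having gigabit fiber available: {reported_covered}")  # 750
print(f"Households that can actually order fiber:              {actual_covered}")    # 31
```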

Another part of AT&T’s strategy for eliminating traditional DSL is to tear down rural copper and replace DSL with cellular broadband. That effort is being funded to a large extent by the FCC’s CAF II program. The company took $427 million in federal funding to bring broadband to over 1.1 million rural homes and businesses. The CAF II program only requires AT&T and the other telcos to deliver speeds of 10/1 Mbps. Many of these 1.1 million customers had slow DSL with typical speeds in the range of 1 Mbps or even less.

AT&T recently said that they are not pursuing 5G wireless local loops. They've looked at the technology, which uses 5G wireless links to reach from poles to nearby homes, and concluded that they can't make a reasonable business case for it. They say that it's just as affordable in their expansion model to build fiber directly to customers. They also know that fiber provides a quality connection, while the quality of a 5G wireless connection is still unproven. That announcement takes some of the wind out of the sails of the FCC and legislators who are pressing hard to mandate cheap pole connections for 5G. There are only a few companies with the capital dollars and footprint to pursue widespread 5G, and if AT&T isn't pursuing the technology then the whole argument that 5G is the future of residential broadband is suspect.

This is one of the first times that AT&T has clearly described its fiber strategy. Over the last few years I wrote blogs wondering where AT&T was building fiber, because outside of a few markets where they are competing with companies like Google Fiber it was hard to find any evidence of fiber construction. Instead of large fiber roll-outs across whole markets, it turns out the company has been quietly building a fiber network that adds pockets of fiber customers across its whole footprint. One interesting aspect of this strategy is that those who don't live close to an AT&T fiber node are not likely to ever get AT&T fiber.

The Flood of New Satellite Networks

I wrote a blog a few months ago about SpaceX and Elon Musk's plan to launch a massive network, starting with over 4,400 low-orbit satellites, to blanket the world with better broadband. SpaceX has already launched the first few satellites to test the technology. It seems like a huge logistical undertaking to get that many satellites into orbit, and SpaceX is not the only company with plans for satellite broadband. Last year the FCC received applications seeking approval for almost 9,000 new communications satellites. Some are geared to provide rural broadband like SpaceX, but others are pursuing IoT connectivity, private voice networks and the creation of space-based backhaul and relay networks.

The following companies are targeting the delivery of broadband:

Boeing. Boeing plans a network of 2,956 satellites that will concentrate on providing broadband to government and commercial customers worldwide. They intend to launch 1,396 satellites within the next six years. This would be the aerospace company’s first foray into being an ISP, but they have experience building communications satellites for over fifty years.

OneWeb. The company is headquartered in Arlington, Virginia and was founded by Greg Wyler. It would be a direct competitor to SpaceX for rural and residential broadband and plans a network of over 700 satellites. OneWeb has arranged launches through Virgin Galactic, the company founded by Richard Branson, and plans to launch its first satellite next year.

O3b. The company’s name stands for the ‘other 3 billion’ meaning those in the world with no access to broadband today. This company is also owned by Greg Wyler. They already operate a few satellites today that provide broadband to cruise ships and to third-world governments. Their plan is to launch 24 additional satellites in a circular equatorial orbit. Rather than launching a huge number of small satellites they plan an interconnected network of high-capacity satellites.

ViaSat. The company already provides rural broadband today and plans to add an additional 24 satellites at an altitude of about 4,000 miles. The company launched the ViaSat-2 satellite this year to augment its existing broadband satellite service across the western hemisphere and is promising speeds of up to 100 Mbps. In addition to rural broadband customers, the satellites will target broadband delivery to cruise ships and airplanes.

Space Norway. The company wants to launch two satellites that specifically target broadband delivery to the Arctic regions of Europe, Asia and Alaska.

The business plans of the following companies vary widely and show the range of opportunities for space-based communications:

Kepler Communications. This Canadian company, headquartered in Toronto, is proposing a network of up to 140 tiny satellites, each about the size of a football, which will be used to provide private phone connectivity for shipping, transportation fleets and smart agriculture. Rather than providing broadband, the goal is to provide private cellphone networks to companies with widely dispersed fleets and locations.

Theia Holdings. The company is proposing a network of 112 satellites aimed at telemetry and data gathering for services such as weather monitoring, agricultural IoT, natural resource monitoring, general infrastructure monitoring and security systems. The network will consist almost entirely of machine to machine communication.

Telesat Canada. This Canadian company already operates satellites today that provide private voice communications networks for corporate and government customers. The company is launching two new satellites to supplement the 15 already in orbit and has plans for a network consisting of at least 117 satellites. The company’s largest targeted customer is the US Military.

LeoSat MA. The company is planning a worldwide satellite network that can speed a transmission around the globe about 1.5 times faster than terrestrial fiber networks. Their market will be large businesses and governments that need real-time communication around the globe for applications like stock exchanges, business communications, scientific applications and government communications.

Audacy Corp. The company wants to provide the first satellite network aimed at providing communications between satellites and spacecraft. Today there is a bandwidth bottleneck between terrestrial earth stations and satellites, and Audacy proposes to create a space-only broadband relay network to enable better communications between satellites – making it the first space-based backbone network.

Can the FTC Regulate Broadband?

When the FCC wrote itself out of the regulation of broadband, one of the primary arguments made by Chairman Ajit Pai was that the Federal Trade Commission (FTC) would still be empowered to step in to stop any ISP abuses of broadband customers. The FTC has a general mandate to stop large corporations from engaging in unfair or abusive practices, and Pai argued that ISPs are no different than other large corporations and that FTC oversight is sufficient.

There are several reasons why this argument is full of holes and why the FTC cannot be an adequate replacement for the FCC. First, the FTC is not structured to regulate monopolies. We are now watching cable companies become a virtual broadband monopoly for residential service in most markets. The FCC loves to point out that there is usually still a telco DSL option, but when Comcast increases minimum broadband speeds to 150 Mbps while DSL delivers a small fraction of that speed, cable broadband and DSL are no longer equivalent services. The cable companies are winning the broadband war and becoming broadband monopolies as DSL disappears from the conversation.

Regulating monopolies is one of the natural roles of government. Electric companies are heavily regulated by state commissions and FERC. The FCC itself was originally created to deal with the monopoly power of the old Ma Bell, which served over 95% of the country's telephony needs. The government also regulates industries where a few players hold most of the power, like airlines and banks.

The government has always dealt with monopolies in one of two ways – regulate them to curtail abuse of monopoly power, or break them up to create competition. The government forced the divestiture of the Bell System when it became apparent that its continued existence was a natural barrier to competition. It seems ironic that the FCC would wash its hands of regulating broadband at the very moment cable companies are becoming classic monopolies.

The other primary reason the FTC cannot regulate broadband is that it regulates purely by exception. The agency is empowered to pursue specific abuses by a specific corporation and can sanction and fine a given company for bad behavior. This puts the FTC in the role of corporate policeman – it can go after an ISP for a bad business practice, but that doesn't directly prohibit other ISPs from engaging in the same behavior. The FTC's powers pale in comparison to the ability of a regulatory agency like the FCC to make a ruling that instantly applies to every ISP in the industry. Ajit Pai's argument that the FTC can take the FCC's place is faulty because policing is not regulating.

As weak as the FTC’s power is over regulating broadband there is a chance they will lose even that ability. The FTC sued AT&T in 2014 because the company throttled data usage by unlimited customers to try to get them to drop their unlimited data plans. AT&T challenged that lawsuit and argued that the FTC had no authority over the company. Recall that this was at a time when the FCC still claimed jurisdiction over broadband issues.

The Ninth Circuit Court of Appeals recently ruled against AT&T and in favor of the FTC. AT&T has until May 29 to appeal that ruling to the Supreme Court. If the company appeals, it will be asking the Supreme Court directly whether the FTC has jurisdiction over it. A ruling in AT&T's favor would remove the last vestige of broadband regulation and would make broadband a completely unregulated industry.

It’s not hard to imagine how a truly unfettered broadband industry would react over time if not regulated. We will see big price increases, data caps, the free use and abuse of customer personal data and a violation of all of the principles of net neutrality. This would push broadband in the wrong direction by making it too expensive for many households while degrading the online experience for all broadband customers. The Internet as we know it can be broken if the ISPs are allowed to ignore customers and answer only to Wall Street.

We are already near this point even if the AT&T suit doesn't end with an AT&T victory at the Supreme Court. Now that the FCC has washed its hands of broadband regulation, the only oversight of the industry comes from the FTC, which can only tackle bad behavior at a single ISP on a single topic. Mass bad behavior by all of the big ISPs will quickly swamp the FTC, and within a few years higher prices and bad ISP behavior will likely become the industry norm.

The fact that only a few companies own the wires of the broadband network makes this industry a natural monopoly, just like electricity, water and natural gas delivery. Nobody likes to be regulated and I can't quite believe I am advocating for more regulation. Even before the FCC withdrew from broadband regulation this was one of the most lightly regulated monopoly industries in the country. Big ISPs have always fought against being regulated, but I don't think even they expected that all broadband regulation would be removed in one fell swoop. We are going to have to somehow put regulations back in place or watch our industry go down a very ugly path.

Future Technology – May 2018

I've seen a lot of articles recently that promise big improvements in computer speeds, power consumption, data storage, etc. Here are a few of the developments that caught my eye:

Smaller Transistors. There has been an assumption that we are near the end of Moore's Law because we are reaching the limit on how small a transistor can be made. The smallest commercially available transistors today have features of around 10 nanometers. The smallest theoretical size for silicon transistors is around 7 nm, since below that size the transistor can't contain the electron flow due to a phenomenon called quantum tunneling.

However, scientists at the Department of Energy's Lawrence Berkeley National Laboratory have developed a 1 nanometer transistor gate, an order of magnitude smaller than today's silicon transistors. The scientists used molybdenum disulfide, a lubricant commonly used in auto shops. Combining this material with carbon nanotubes allows electrons to be controlled across the 1 nm gate. Much work is still needed to go from lab to production, but this is the biggest breakthrough in transistor size in many years, and if it pans out it will provide a few more turns of Moore's Law.

Better Data Storage. A team of scientists at the National University of Singapore has developed a technology that could be a leap forward in data storage. The breakthrough uses skyrmions, which were first identified in 2009. The scientists combined cobalt and palladium into a film that is capable of housing the otherwise unstable skyrmions at room temperature.

Once stabilized, the skyrmions, at only a few nanometers in size, can be used to store data. If these films can be stacked they would provide data storage with 100 times the density of current storage media. We need better storage since the amount of data we want to store is immense and is expected to increase 10-fold over the next decade.

Energy Efficient Computers. Ralph Merkle, Robert Freitas and others have created a theoretical design for a molecular computer that would be 100 billion times more energy efficient than today's most efficient computers. The design is a mechanical computer built from tiny physical gates at the molecular level that open and close to form circuits. This structure would allow the creation of the basic components of computing – AND, NAND, NOR, NOT, OR, XNOR and XOR gates – without electronic components.
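
The claim that a handful of gate types is all you need for computing is easy to demonstrate in software. The sketch below is ordinary Python, not a model of the molecular design – it just shows that every gate listed above can be composed from NAND alone:

```python
# Every gate listed above can be built from NAND alone - plain software,
# not a model of the mechanical molecular design.

def NAND(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def NOR(a, b):  return NOT(OR(a, b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))
def XNOR(a, b): return NOT(XOR(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "XOR:", XOR(a, b), "XNOR:", XNOR(a, b))
```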

Today's computers create heat due to the electrical resistance in components like transistors, and it's this resistance that drives the huge electricity bills needed to operate and then cool big data centers. A mechanical computer creates far less heat from the process of opening and closing logic gates, and that friction can be nearly eliminated by making the gates tiny, at the molecular level.

More Powerful Supercomputers. Scientists at Rice University and the University of Illinois at Urbana-Champaign have developed a process that makes supercomputers more efficient while significantly lowering their power requirements. The process uses a mathematical technique developed in the 1600s by Isaac Newton and Joseph Raphson that cuts down on the number of calculations a computer must perform. Computers normally carry every calculation out to the seventh or eighth decimal place, but using the Newton-Raphson tool the calculations can stop at the third or fourth decimal place while the accuracy of the results improves by three orders of magnitude (1,000 times).
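
For a flavor of the technique, here's a generic Newton-Raphson sketch (my own illustration, not the researchers' code). Each iteration roughly doubles the number of correct digits, which is why stopping after a few iterations trades only a sliver of precision for far less computation:

```python
import math

# Generic Newton-Raphson sketch: approximate sqrt(a) by iterating
# x_next = (x + a / x) / 2. Not the Rice/Illinois code - just the core idea
# that a handful of iterations already gives several correct digits.

def newton_sqrt(a: float, iterations: int) -> float:
    x = a  # crude starting guess
    for _ in range(iterations):
        x = 0.5 * (x + a / x)
    return x

a = 2.0
for n in range(1, 6):
    approx = newton_sqrt(a, n)
    print(f"{n} iteration(s): {approx:.10f}  (error {abs(approx - math.sqrt(a)):.1e})")
# The error shrinks quadratically, so truncating the precision early costs very little.
```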

This method drastically reduces the amount of time needed to process data, which makes the supercomputer faster while greatly reducing the energy needed to perform a given calculation. This has huge implications for running complex simulations such as weather forecasting models that require crunching huge amounts of data. Such programs can be run much more quickly while producing significantly more accurate results.

Who’s Pursuing Residential 5G?

I've seen article after article over the last year talking about how 5G is going to bring gigabit speeds to residences and give people an alternative to the cable companies. But most of the folks writing these articles are confusing the different technologies and business cases that are all being characterized as 5G.

For example, Verizon has announced plans to aggressively pursue 5G for commercial applications starting later this year. The technology they are talking about is a point-to-point wireless link, reminiscent of the radios that have been in common use since MCI deployed microwave radios to disrupt Ma Bell's monopoly. The new 5G radios use higher frequencies in the millimeter wave range and promise to deliver a few gigabits of speed over distances of a mile or so.

The technology requires a base transmitter with enough height to have a clear line-of-sight to the customer, likely sited on cell towers or tall buildings. Each link runs between the transmitter and a single customer. Verizon can use the technology to bring gigabit broadband to buildings not served with fiber today or to provide a second, redundant broadband feed to buildings that already have fiber.

The press has often confused this point-to-point technology with the technology that will be used to bring gigabit broadband to residential neighborhoods. That application requires a different technology, best described as wireless local loops. The neighborhood application is going to require pole-mounted transmitters that can serve homes within perhaps 1,000 feet – meaning a few homes from each transmitter. In order to deliver gigabit speeds the pole-mounted transmitters must be fiber fed, which realistically means fiber must be strung along each street that is going to get the technology.
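
Here's a back-of-the-envelope sketch of what that implies per mile of street – the lot frontage and homes-per-transmitter figures are my own illustrative assumptions, not carrier data:

```python
# Back-of-the-envelope look at a wireless local loop build, per mile of street.
# All inputs are my own illustrative assumptions, not figures from any carrier.

street_feet = 5280               # one mile of residential street
lot_frontage_feet = 80           # assumed typical lot width
homes_per_transmitter = 6        # "a few homes from each transmitter," per the discussion above

homes_per_mile = street_feet / lot_frontage_feet                  # ~66 homes passed
transmitters_per_mile = homes_per_mile / homes_per_transmitter    # ~11 pole-mounted units

print(f"Homes passed per street mile:  ~{homes_per_mile:.0f}")
print(f"Transmitters per street mile:  ~{transmitters_per_mile:.0f}")
# Every one of those transmitters needs a fiber feed, so the mile of street
# still has to be strung (or trenched) with fiber to make the wireless work.
```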

Verizon says it is investigating wireless local loops and hopes to eventually use the technology to target 30 million homes. The key word there is eventually, since this technology is still in the early stages of field trials.

AT&T has said that it is not pursuing wireless local loops. On a recent call with investors, CFO John Stephens said that AT&T could not see a business case for the technology. He called the business case for wireless local loops tricky and said that in order to be profitable a company would have to have a good grasp on who was going to buy service from each transmitter. He said that AT&T is going to stick to its current network plans, which involve edging out from existing fiber, and that serving customers on fiber provides the highest quality product.

That acknowledgement is the first I've heard from one of the big telcos about the challenges of operating a widespread wireless network. We know from experience that fiber-to-the-home is an incredibly stable technology. Once installed it generally needs only minor maintenance and requires far less maintenance labor than competing technologies. We also know from many years of experience that wireless technologies require a lot more tinkering. Wireless technology is a lot more temperamental, and it might take a decade or more of continuous tweaking until wireless local loops become as stable as FTTH. Whoever deploys the first big wireless local loop networks better have a fleet of technicians ready to keep them working well.

The last of the big telcos is CenturyLink, and its new CEO Jeff Storey has made it clear that the company is going to focus on high-margin enterprise business opportunities and stop deploying slow-payback technologies like residential broadband. I think we've seen the end of CenturyLink investing in any last-mile residential technologies.

So who will be deploying 5G wireless local loops? We know it won't be AT&T or CenturyLink. We know Verizon is considering it but has made no commitment. It won't be done by the cable companies, which have already upgraded to DOCSIS 3.1. There are no other candidates willing or able to spend the billions needed to deploy the new technology.

Every new technology needs to be adopted by at least one large ISP to become successful. Vendors won't do the needed R&D or crank up the production process until they have a customer willing to place a large order for electronics. We've seen promising wireless technologies like LMDS and MMDS die in the past because no large ISP embraced them and ordered enough gear to push them into the mainstream.

I look at the industry today and I just don't see a clear path to success for 5G wireless local loop electronics. The big challenge faced by wireless local loops is to become less expensive than fiber-to-the-home. Until the electronics go through a few rounds of improvements – the kind that only come after field deployment – the technology is likely to require more technician time than FTTH. It's hard to foresee anybody taking the chance on this in any grand way.

Verizon could make a leap of faith and sink big money into an untried technology, but that's risky. We're more likely to keep seeing press releases talking about field trials and the potential of the 5G technology. But unless Verizon or some other big ISP commits to sinking billions of dollars into the gear, it's likely that 5G local loop technology will fizzle, as has happened to other wireless technologies in the past.

Comcast Broadband Bundles

Comcast recently announced unilateral broadband speed increases for some customers. Customers with 60 Mbps service today are being increased to 150 Mbps, those with 150 Mbps are moving up to 250 Mbps, and those with 250 Mbps are being bumped to 400 Mbps or 1 Gbps depending upon their cable package.

The Houston Chronicle reported that the speed upgrades are only available to customers who have a cable package and an X1 settop box. This article has spawned a number of outraged reactions from customers and industry journalists.

This is not news – in my experience it has been a long-term practice of the company, and an event like this speed increase just makes the practice percolate back up to the surface. The company has been reserving its fastest broadband speeds for customers who buy cable TV for years. When I moved to Florida five years ago Comcast would not sell me standalone broadband any faster than 20 Mbps unless I purchased a cable package.

That speed was not adequate for my family and home office and so I was corralled into buying their basic TV package in order to get 100 Mbps broadband. They wouldn’t let me buy the faster standalone broadband at any price. The cable settop box went immediately into my closet and was never plugged in. The $20 basic TV package ended up costing me over $40 per month after layering on the settop box and local programming fees. I felt like I was being extorted every time I paid my Comcast bill. I called periodically to try to drop the cable package but was always told that would mean reducing my broadband speed.

The articles I've read assume that this pricing structure is intended to hurt cord cutters. But when this happened to me five years ago there were very few cord cutters. I've always assumed that Comcast wanted to maintain its cable customer counts to please Wall Street and was willing to strongarm customers to do so. I counted as a cable customer, but I never watched any of the TV I was forced to buy. I always wondered how many other people were in the same position. For the last few years Comcast has lost fewer cable customers than the other big cable companies, and perhaps this one policy is a big part of the reason.

Today it’s easier to make the argument that this is to punish cord cutters. This policy clearly harms those who refuse to buy the company’s cable products by forcing them into the company’s smallest bandwidth data products. Last year Comcast declared that they are now a broadband company and not just a traditional cable company – but this policy challenges that assertion.

Comcast is further punishing cord cutters by enforcing its data caps. Due to public outcry a few years ago the company raised the monthly data limit to one terabyte. While that sounds generous, it's a number that is not hard to hit for a house full of cord cutters. Over time more households will hit that limit and have to pay even more for their broadband.
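
To see how quickly a cord-cutting household creeps up on a terabyte, here's a quick worked estimate – the viewing hours and streaming bitrates are illustrative assumptions, not Comcast numbers:

```python
# Quick estimate of monthly usage for a cord-cutting household against a 1 TB cap.
# Viewing hours and streaming bitrates are illustrative assumptions, not Comcast figures.

GB_PER_HOUR_HD = 3.0      # rough HD streaming consumption
GB_PER_HOUR_4K = 7.0      # rough 4K streaming consumption

hd_hours_per_day = 6      # streaming that replaces traditional cable viewing
uhd_hours_per_day = 1
other_gb_per_month = 100  # gaming, software updates, video calls, cloud backup, browsing

monthly_gb = 30 * (hd_hours_per_day * GB_PER_HOUR_HD
                   + uhd_hours_per_day * GB_PER_HOUR_4K) + other_gb_per_month

print(f"Estimated monthly usage: ~{monthly_gb:.0f} GB against a 1,000 GB cap")
# Roughly 850 GB - close enough that a heavier viewing month pushes the household over.
```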

This policy is a clear example of monopolist behavior. I’m positive that this policy is not invoked in those markets where Comcast is competing with a fiber overbuilder. There is no better way to identify the monopolist policies than by seeing what gets waived in competitive markets.

Unfortunately for the public there is no recourse against monopolistic behavior. The FCC has largely washed its hands of broadband regulation and is going to turn a deaf ear to issues like this. Comcast and the other big ISPs are now emboldened to implement any policies that will maximize their revenues at the expense of customers.

It's not hard to understand some of the ramifications of this policy. My 100 Mbps connection from Comcast was costing me over $100 per month – a ridiculous price, and one that is unaffordable for many homes. The scariest thing about these kinds of policies is that the cable company monopoly is strengthening as the cable companies chase out the last remnants of DSL. There will be huge numbers of markets where Comcast and the other large cable companies are the only realistic broadband option.

I've noted in a few blogs that there seems to be a consensus on Wall Street that the big ISPs are going to significantly increase broadband prices over the next few years. They also continue to bill outrageous rates for cable modems and slap on hidden fees to further jack up prices. When you layer in policies like this one and data caps, it's clear that Comcast cares about profits a whole lot more than whether households can afford broadband. I know that's inevitable monopoly behavior, and in an ideal world the federal government would step in to stop the worst monopoly abuses.

The False 5G Narrative in DC

The FCC and some members of Congress have adopted a false narrative about our need for the rapid deployment of 5G. The narrative says that the rest of the world is already ahead of the US with 5G deployment and warns of huge downsides to our economy should we not sweep aside all barriers to deploying 5G.

This is the narrative being used to justify giving wireless carriers cheap and ubiquitous access to poles for 5G transmitters. The FCC and others want to sweep away all state and local rules for pole-related issues. They want rules that will allow wireless carriers to deploy electronics first and straighten out the paperwork later. They argue that all of this is needed so that the country can keep up with the rest of the world in 5G deployment, with some horrific yet unspecified disaster to follow should we fail to make this happen.

The big problem with this narrative is that it's based upon false premises. The narrative is nothing more than a fairy tale spun by the wireless industry to justify bypassing the normal regulatory process and handing carriers fast and cheap pole connections for wireless devices.

First, there is no big impending need to deploy huge numbers of 5G devices, because the technology doesn't yet exist. There are two distinct 5G technologies – 5G cellular and 5G millimeter wave broadband. The industry agrees that it's going to take a decade until we have a 5G-compliant cellular technology available. There are thirteen key aspects of the new 5G standard that must now be tackled by engineers and then woven into the next generation of electronics. We made numerous incremental improvements in technology to evolve from 3G to 4G, and it was only last year that we finally saw the first deployments of 4G technology that meet most of the original 4G specifications. There is no reason to think that we are going to progress any faster towards 5G; we will upgrade over time to 4.1G, 4.2G, etc. until, a decade from now, we finally have a 5G cellular network. By then we will no doubt start over and begin implementing 6G.

There is similarly no pressing need to deploy millimeter wave 5G. This is the technology that promises to offer a gigabit alternative in residential neighborhoods, and we have a long way to go before we see widespread deployments. We are just now seeing the first trials of the technology, and it's going to take years before the electronics are widely available and affordable. Further, this technology is going to require a lot of concurrent fiber deployment, and that fiber – not getting onto poles – is likely to be the biggest cost barrier to deployment. I even have to wonder who is going to deploy 5G millimeter wave radios on a big scale: every one of the big telcos has made it clear that they are backing away from residential broadband, and the big cable companies have, or will soon have, gigabit-capable networks. We might never see the gigabit wireless networks that are the bait being used to tout 5G, because there might not be any deep-pocketed ISPs willing to tackle such a large infrastructure investment.

What the wireless carriers are starting to deploy today are 4G small cell sites. These cell sites are being used to supplement and boost the existing cellular networks. The original big-tower cellular network was built to provide voice service, and that cell site spacing is terrible for delivering broadband, which uses frequencies that don't carry as far as the lower frequencies used for voice. The exploding demand for cellular broadband is driving the need for more cell sites just to accommodate the number of users and the amount of bandwidth that can be delivered in a given neighborhood.

The existing cellular networks are clearly under stress in urban areas. But the real issue we should be talking about is how to bolster 4G networks, not how we are already behind in the mythical 5G race. The cellular carriers are crafty and are using the 5G race narrative as a way to get politicians to support their demands. They are promising gigabit cellular speeds in just the next few years and cheap wireless gigabit broadband soon coming into every home. They have created a feigned panic that the current regulatory rules will stop this progress dead in its tracks unless carriers get fast and cheap pole access.

If this 5G narrative were true we'd be seeing a collapse of cable company stock prices. Cable companies have the most to lose if they are suddenly faced with gigabit cellular and gigabit wireless to the home. We are probably decades away from seeing cellular speeds approaching anything close to a gigabit – that's the biggest myth in this narrative. And even when the technology for wireless gigabit to the home is finally developed, one has to ask which ISPs are going to spend the many billions needed to build that network to compete against the entrenched cable companies.

I don't want to minimize some of the barriers faced by wireless companies when trying to get onto poles today, and wireless carriers have cited a few horror stories in FCC filings. But like anything brand new, most pole owners aren't yet sure how to respond to requests for wireless attachments. There are a lot of issues to work through, including safety, pricing, aesthetics and the long-term impact on the real estate space on poles. These are all issues that need solutions, but I can't find one reason why we need to tackle this at breakneck speed or why we need to give the wireless carriers everything on their wish list. It's important to bolster the stressed 4G network, and we will want to be ready for 5G technology when it is finally available. We have the time to make the needed regulatory changes in a deliberative manner that makes sure all aspects of the issues are considered. We don't need a fast knee-jerk response to a false 5G narrative that might create more problems than it solves.

Light Poles and 5G

There is a lot of regulatory activity right now concerning wireless providers adding small cell sites and 5G electronics to poles. A few states have adopted legislation setting low prices for such connections and similar bills are moving through many state legislatures. There is discussion at the FCC about mandating nationwide rules on some of the issues, and one of the FCC's BDAC advisory groups was created to look at these specific issues.

One topic I haven't seen covered in any of these efforts is how to deal with light poles – that is, poles that don't carry wires. I think this is a germane issue for many reasons. There are many poles that have been built solely for the purpose of providing street lights, and I don't think these poles are automatically covered by any of these regulatory or legislative efforts.

I've recently looked again at the various pole attachment rules to see if I'm right. One of the primary laws affecting pole attachments is the Pole Attachment Act of 1978, which established a price structure for pole attachments and authorized the FCC to develop specific rules for pole make-ready – rules now found in Section 224 of the FCC's rules. The right of carriers to use poles was bolstered significantly by the Telecommunications Act of 1996, which granted carriers the ability to use the poles, conduits and rights-of-way of existing utilities. That act defined poles as structures that carry telecommunications wires.
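
For context on the price structure that Act created, here's a worked sketch of the FCC's cable-rate pole attachment formula as I understand it. The 1-foot and 13.5-foot figures are the FCC's standard presumptions; the pole cost and carrying charge are made-up inputs:

```python
# Sketch of the FCC cable-rate pole attachment formula as I understand it:
#   maximum annual rate = (space occupied / usable space) x net cost of a bare pole x carrying charge rate
# The 1-foot and 13.5-foot figures are standard FCC presumptions; the pole cost and
# carrying charge below are made-up inputs for illustration.

space_occupied_feet = 1.0     # FCC presumption for a single attachment
usable_space_feet = 13.5      # FCC presumption of usable space on a pole
net_cost_bare_pole = 300.00   # illustrative net investment per pole ($)
carrying_charge_rate = 0.30   # illustrative carrying charges (maintenance, depreciation, taxes, return)

space_factor = space_occupied_feet / usable_space_feet
max_annual_rate = space_factor * net_cost_bare_pole * carrying_charge_rate

print(f"Space factor: {space_factor:.3f}")
print(f"Maximum annual rate per pole: ${max_annual_rate:.2f}")  # in the $6-$7 per pole range
```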

In many cases light poles fall naturally into this definition. In my neighborhood the streetlights are placed at the top of existing utility poles that carry wires for the various utilities. Clearly such light poles are covered by the FCC rules. One has to wonder how useful these poles are for 5G, since the light fixtures occupy the coveted top space that wireless carriers want to use, but from a regulatory perspective such poles are covered.

There are a lot of light poles that don't fit into the current regulatory regime. Many light poles have been erected in neighborhoods where the other utilities are buried. These poles are not designed to carry wires – they are connected to the buried power lines to provide electricity for the street lights, but otherwise have no connection to other utility wires. A similar class of poles is ornamental ones. The last neighborhood I lived in had street lights that looked like they came straight out of a Sherlock Holmes story – metal poles with a big light globe at the top.

I've read the FCC rules several times this week and I can't see where poles that aren't intended to carry wires fall under FCC jurisdiction. Such poles often can't easily accommodate attachments and might be made of metal or concrete.

Cities of all sizes have required utilities to bury wires. The regulatory question is whether the FCC will try to claim jurisdiction over poles that were built in such neighborhoods only to support street lights. That would pull millions of light poles under FCC jurisdiction, something that shouldn't be done without deliberation.

The 5G legislation I've seen doesn't recognize these issues. Some of these laws grant carte blanche authority to wireless carriers to deploy 5G networks without regard to local oversight. This could result in 5G transmitters being added to ornamental poles. It might mean constructing new poles in neighborhoods where the other utilities are buried. It could even allow wireless carriers to string fiber between such new poles, even though the other utilities are buried. 5G networks are also going to want an unobstructed line-of-sight to buildings, and wireless carriers might use aggressive tree trimming to get the paths they want. Such deployments are going to be wildly unpopular with homeowners and local governments.

None of this is going to happen without a big fight. Current federal pole attachment rules derive from acts of Congress, and anything short of a new federal law can't easily change what has been done in the past. It's questionable whether the FCC can preempt state and local laws concerning pole attachments without a new federal law, since earlier legislation allowed states to elect to claim jurisdiction over pole issues.

One thing that is clear to me is that any new laws need to carefully consider all of the issues. A law that just gives carte blanche authority for wireless carriers to do whatever they want is going to be widely unpopular and will eventually get huge pushback. Even the idea of expanding regulatory authority over standalone light poles would likely be challenged as a state versus federal issue, meaning big court fights. I'm seeing a mad regulatory rush to give wireless carriers the ability to deploy 5G, but there are numerous issues involved that demand careful deliberation if we want to do this right.

CenturyLink and Residential Broadband

CenturyLink is in the midst of a corporate reorganization that is going to result in a major shift in the focus of the company. The company agreed to merge with Level 3 in 2016, and the management team from Level 3 will soon be in charge of the combined business. Long-time CEO Glen Post is being pushed out of day-to-day management of the company and Jeff Storey, the former CEO of Level 3, will become the new CEO of CenturyLink. Storey was originally slated to take the top spot in 2019, but the transition has been accelerated and will happen this month.

It's a shift that makes good financial sense for the company. Mr. Storey had huge success at Level 3, dramatically boosting earnings and the stock price over the last four years. Mr. Storey and CenturyLink CFO Sunit Patel have both made it clear that they are going to focus on the more profitable enterprise business opportunities and will judge any investments in last-mile broadband by the expected returns. This differs drastically from Mr. Post, who comes from a background as an independent telephone company owner. As recently as a year ago Mr. Post publicly pledged to make the capital investments needed to improve CenturyLink's last-mile broadband networks.

This is going to mean a drastic shift in the way that CenturyLink views residential broadband. The company lost 283,000 broadband customers in the year ending December 2017, dropping it to 5.7 million broadband customers. The company blames the losses on the continued success of the cable companies in wooing away DSL customers.

The size of the customer losses is a bit surprising. CenturyLink said at the end of 2017 that it was roughly 60% through its CAF II upgrades, which are bringing better broadband to over 1.1 million rural households. Additionally, the company built FTTP past 900,000 potential business and residential customers in 2017. If the company was having even a modest amount of success with those two new ventures, it's hard to understand how it lost so many broadband customers.

What might all of this mean for CenturyLink broadband customers? For rural customers it means that any upgrades being made with CAF II funding are likely the last upgrades they will ever see. Customers in these rural areas are already used to being neglected, and their copper networks are in lousy condition due to decades of neglect by former owner Qwest.

CenturyLink is required by the CAF II program to upgrade broadband speeds in the rural areas to at least 10/1 Mbps. The company says that over half of the upgraded customers are seeing speeds of at least twice that. I've always had concerns about any of the big telcos reaching their whole CAF II footprint, and I suspect that when the CAF II money is gone, anybody who was not upgraded as promised will never see upgrades. I've also always felt that the CAF II program was money wasted – if CenturyLink walks away from the cost of maintaining these newly upgraded DSL networks they will quickly slide back into poor condition.

There is already speculation on Wall Street that CenturyLink might try to find a buyer for its rural networks. After looking at the problems experienced by Frontier and Fairpoint after buying rural telco copper networks, one has to wonder if there is a buyer for these properties. But in today's world of big-deal corporate finance it's not impossible to imagine some group of investors willing to tackle this. The company could also take a shot at selling rural exchanges to independent telcos – something US West did over twenty years ago.

It’s also likely that the company’s foray into building widespread FTTP in urban areas is done. This effort is capital intensive and only earns infrastructure returns that are not going to be attractive to the new management. I wouldn’t even be surprised to see the company sell off these new FTTP assets to raise cash.

The company will continue to build fiber, but with the emphasis on enterprise opportunities. It is likely to adopt a philosophy similar to AT&T's, which has been building residential fiber only to large apartment complexes and to households that are within short distances of existing fiber POPs. This might bring fiber broadband to a lucky few, but mostly the new management team has made it clear that it is deemphasizing residential broadband.

This management transition probably closes the book on CenturyLink as a last-mile ISP. If the company is unable to find a buyer for these properties it might take a decade or more for the broadband business to quietly die. This is bad news for existing broadband customers because the company is unlikely to invest in keeping the networks in operational shape. The only ones who might perceive this as good news are those who have been thinking about overbuilding the company – they are not going to see much resistance.

“But I Live Close to Fiber”

I often hear from people who are excited that fiber is coming to their neighborhood. They see work crews installing fiber and they hope this means that they are finally getting fiber to their homes. But unless folks are in one of the lucky neighborhoods where some ISP is making the big investment in last mile fiber-to-the-home, the chances are good that the new fiber that is tantalizingly close is not going to reach them.

There are a lot of fiber networks in the country that are being used for purposes other than serving homes. Consider some of the following reasons why fiber might be close to you, but unavailable:

  • Electric companies have private fiber networks to connect substations and other electric company facilities. In the last few years we’ve seen some of the biggest electric companies pull back from sharing fiber with others because of security concerns for the electric grid. It’s not uncommon for the electric company to be the only tenant on such fibers.
  • Telcos have fiber networks that connect their central offices in various towns. They have more extensive local fiber networks that are built to supply neighborhood DSL cabinets. If your neighborhood has DSL speeds greater than 15 Mbps, the chances are good that there is telco fiber close to you.
  • Cable companies have fiber for similar reasons. Cable networks are subdivided into neighborhood nodes. These nodes used to be large and served upwards of a thousand homes, but cable companies have reduced node sizes to eliminate the problem of their broadband slowing down in the evenings. Nodes might now be as small as a hundred homes – and since each node is fiber fed, there is cable company fiber somewhere near every cluster of homes.
  • A large number of cities have built fiber networks to connect city hall, libraries, firehouses, water utility facilities and other city locations. This has largely been done to reduce the high payments to ISPs to connect these locations with broadband. While many municipal FTTH projects got started by expanding these networks, the vast majority of the municipal fiber networks serve only the city. There’s a decent chance that there is fiber at the library, firehouse or other city facility near your neighborhood.
  • Similarly there are a number of states that have built state-wide fiber networks to connect their own facilities. These networks are often shared with anchor institutions like city halls and other local and state government buildings. Most of these networks are prohibited by state law from sharing the fiber with last-mile fiber builds, even municipal ones.
  • Many school districts have fiber networks to connect schools to provide gigabit speeds. While some of these networks can be shared with other providers, the majority of these networks are used only for the school district.
  • Various companies including telcos, cable companies, and big ISPs build fiber to reach large businesses or industrial parks. The larger downtown buildings in most cities now also have fiber.
  • There is now a major push for building fiber to large apartment complexes. For example, much of AT&T's push to pass millions of locations with fiber is being accomplished by reaching apartment complexes.
  • Today every cell tower is fed with fiber. There will be a lot of new fiber built to reach the smaller cell sites we’ll see on utility and light poles.
  • There are long-haul fiber networks that only function to connect cities and major markets. These networks rarely allow any connections to the network other than at major network nodes.
  • Many cities now have fiber networks that feed traffic signals and traffic cameras. Because of the way that these networks are funded with highway money, these fiber networks are often inexplicably separate from other municipal fiber networks.
  • State highway departments also now operate a lot of fiber networks for their own use to feed the signs that provide traffic information and to feed cameras that are used to monitor traffic.

The chances are that if you live in any kind of populated area, even in a rural county, there are several of these fiber networks close to you. If you live in a city it's likely that you can easily walk to half a dozen different fiber networks – none of which are being used to bring fiber to your home. The chances are high that the new fiber you see being built is not being built for you.