Prices for Wireless Pole Attachments

The FCC’s advisory BDAC group looking at pole attachment issues gathered prices for wired and wireless pole attachments. Its goal was to understand the range of attachment prices and to see if the group could come up with a consensus pricing recommendation for the FCC. A wired pole attachment fee is the annual fee that the owner of a wired network (telco, cable company, fiber provider) pays for each place one of its wires connects to a pole. Wireless pole attachment fees are charged for putting some kind of wireless transmitter on a pole.

There were no surprises for wired pole attachments. The group looked at 577 different attachments and found that the average price was $17.58 per year for each wired pole attachment while the median was $15.56. These are similar to the prices I see all over the country.

Wireless attachments varied a lot more. The BDAC group looked at 407 samples of wireless pole attachment prices from around the country; the average price was $505.56 while the median price was $56.60. For the median to be that much lower than the average, the sample had to be stacked with low-priced readings.

That’s easy to understand if you look at wireless pole attachment rules around the country. Three states – Arizona, Indiana and North Carolina – have capped the annual price of a wireless pole attachment at $50 per year, while Texas capped it at $20. Other states like Colorado, Delaware and Virginia cap rates at actual cost. For the median price to be that low, close to half of the 407 sampled prices were likely from this group of states. And this means that no conclusions can be drawn from the results of the BDAC’s sampling – it was definitely not a random or representative sample – yet the BDAC group summarized the results as if it was, and even calculated a standard deviation.
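
The mean/median gap is easy to reproduce. Here is a sketch with entirely hypothetical numbers, showing how a 407-price sample in which roughly half the prices come from rate-capped states produces a low median even while a minority of very high uncapped prices drives up the average:

```python
import statistics

# Hypothetical 407-price sample: about half the prices come from
# rate-capped states ($20 and $50 caps), the rest from uncapped markets.
capped = [20] * 60 + [50] * 150         # 210 low, capped-state prices
uncapped = [600] * 140 + [2500] * 57    # 197 higher, uncapped prices
sample = capped + uncapped

mean = statistics.mean(sample)      # pulled up by the big uncapped prices
median = statistics.median(sample)  # sits in the capped group, at $50
```

With a skewed distribution like this, the median tells you about the capped states and the mean tells you about the outliers, which is exactly why neither number alone supports a pricing conclusion.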

Thirteen states have already acted to limit the cost for wireless attachments, mostly through legislation. Florida and Rhode Island have capped the cost of a wireless pole attachment at $150; Minnesota set the rate at $175 and Ohio set the maximum rate at $200. Kansas says the rate must be ‘competitively neutral’ and Iowa caps the rate at the FCC rate.

One of the biggest issues with arbitrarily setting wireless pole attachment rates is that the wireless devices being put onto poles vary in size and can use between 1 and 10 feet of pole space. Regulators have traditionally allocated costs by the amount of usable space taken by a given attachment, and in fact use the term ‘pole real estate’ to describe the relationship between space used and price paid. Any attachment that uses more of the pole real estate should expect to pay more for the attachment – in a largely linear relationship.
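
The ‘pole real estate’ concept reduces to a simple proportional allocation. A minimal sketch follows; the 13.5 feet of usable space and the $240 annual pole cost are illustrative assumptions, not figures from any rate order:

```python
def attachment_rate(feet_used: float,
                    usable_feet: float = 13.5,
                    annual_pole_cost: float = 240.0) -> float:
    """Allocate the annual pole cost linearly by share of usable space."""
    return annual_pole_cost * (feet_used / usable_feet)

wired = attachment_rate(1.0)      # a wire occupying 1 foot of space
wireless = attachment_rate(10.0)  # a large wireless device occupying 10 feet
# under a linear allocation, the 10-foot device pays 10x the 1-foot wire
```

This is why a one-size-fits-all cap on wireless attachments sits uneasily with traditional ratemaking – a device using 10 feet of pole real estate and one using 1 foot would pay the same capped rate.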

The results of the sample might have been more valid had the group not included prices from places where legislators have capped or limited rates. Also, the big wireless companies are part of the BDAC group, and I have to suspect that they brought in the worst-case examples they could find, where they are paying the highest prices. This exercise proved nothing other than that prices for wireless attachments are higher in states where rates are not capped.

It’s not surprising that the BDAC group was unable to reach a consensus on prices or pricing methodology for the FCC. The network operators – those who attach to poles – think rates should be cost-based. Pole owners think rates ought to be market-based.

There are, of course, many other factors to consider in setting pole attachment rates. In the case of wireless connections there are concerns about the safety of working near the wireless devices after storm damage. There are also significant concerns in cities about aesthetics.

The battle over setting these rates is still heating up. A number of additional states – AK, CA, CT, GA, HI, IL, ME, MO, NE, NM, PA, WA and WI – have considered pole attachment legislation that didn’t pass. There is the possibility of the FCC trying to set rates, and drafts of several bills in Congress have considered the idea. Since this seems to be a primary focus of the wireless companies, there will be a lot of lobbying on the issue.

Eliminating Unbundled Network Elements (UNEs)

At the urging of USTelecom, the lobbying arm for the big telcos, the FCC has opened WC Docket No. 18-141 that is seeking to eliminate the requirement for the big telcos to offer unbundled network elements (UNEs) or resale for their products. Comments are due in this docket by June 7, with reply comments due June 22.

The requirement for the big telcos to unbundle their networks was one of the primary features of the Telecommunications Act of 1996. The Congress at that time recognized that there was little competition in the telecom market and decided that allowing competitors to resell or use components of the big telco networks would help to jump-start competition. The idea worked, and within just a few years there were giant CLECs that used resale and UNEs to build large competitive telecoms. I recall at least six different CLEC salespeople visiting the CCG offices located just inside the Washington DC Beltway. Most of those big competitive companies imploded spectacularly in the big telecom collapse in 2000, but there are still numerous companies utilizing the unbundled elements of the big telco networks.

The docket talks about forbearance, which in this case means ceasing a regulatory requirement. Specifically, this docket asks the FCC to forbear from:

  • Sections 251 and 252 of the FCC’s rules that require the big telcos to offer resale or unbundled network elements to competitors;
  • Section 272 of the FCC’s rules that specify timelines for the telcos to negotiate or respond to requests for service from competitors;
  • Section 271 of the FCC’s rules that lay out the rights for competitors to gain access to poles, ducts, conduits and rights-of-way.

This forbearance would be devastating to a number of competitive carriers. Consider just a few examples of how the industry still uses these sections of the FCC’s rules:

  • There is still a lot of resale of telco products. I know one Northwest rural area where a competitor resells nearly 90% of the rural DSL provided by CenturyLink. This reseller gained the business by knocking on doors and selling DSL to homes that didn’t even know it was available from the telco. In much of rural America the big telcos have almost no employees, no marketing and no customer service and resellers are making big telco products work even where the telcos don’t make any effort.
  • There are still numerous DSL providers that collocate their own DSL electronics in telco central offices and then use the unbundled telco loops to provide decent DSL to customers. These competitors offer newer generation and faster DSL where the telcos are often still only offering slow first generation DSL from twenty years ago.
  • Facility-based fiber overbuilders regularly use unbundled network elements to operate in areas where they have not yet built fiber. Or they use UNEs to serve distant branches of a fiber customer – for instance they might use UNEs to create a private network between locations of a bank with branches in several communities.
  • Any competitor that wants to offer facility-based long distance in a metropolitan market must have a physical connection to the primary big telco switching locations (tandems) in that market. These connections are needed due to requirements that the telcos have forced upon competitors since the 1996 Act to try to make it more expensive to compete. Nobody would build the massive network needed to connect these offices just to provide voice, and so competitors satisfy this requirement using UNEs.
  • Competitors routinely want to make connections between carriers located at the big telco hubs. They make this happen by buying UNEs that reach between carrier A and B within these hubs (might only require a few feet of fiber).

All of these situations, and the many other uses of resale and UNEs, would disappear if the FCC sides with the big telcos. The big telcos set to work to neuter the requirements of the Telecommunications Act of 1996 right after it passed. Over the years they have eliminated many forms of resale. They have made it virtually impossible for a competitor to gain access to their dark fiber. They have routinely made it harder each year for competitors by introducing changes in their contracts.

This forbearance would be a huge victory for the telcos and would have a chilling impact on competition and customer choice. It would mean that the only way to compete with the telcos would be to overbuild 100% of a network to reach customers and to interconnect. Numerous competitive providers would be quickly bankrupted and disappear. Huge numbers of customers, primarily businesses, would lose their vendor of choice as competitive carriers became unable to serve them. This could even kill wholesale VoIP, since the underlying carriers providing that service rely heavily on interconnection UNEs within their networks.

The big telcos argue that they shouldn’t have to continue to unbundle their networks because these requirements deal mostly with legacy TDM technology. But these rules cover more than copper technology, and many of the UNEs used for interconnection ride on fiber. Even where this is being done on copper, it makes sense for the FCC to allow competitors to use that copper for as long as it exists. Copper UNEs will die a natural death as the copper disappears, but until then, if a competitor can use that copper better than the telco, they should be allowed to do so. Competitors have used UNEs and resale to build thriving businesses that benefit consumers by providing choice and lower prices. Forbearing on resale and UNEs would be another giveaway by the FCC to the big telcos at the expense of competition and customer choice.

If you are a small carrier that relies on resale or UNEs, you need to file comments in this docket by June 7. The FCC needs to hear the real-life stories of small carriers and the customers you serve, and to hear why it should not kill UNEs. You don’t need to be a lawyer to tell the FCC your story, especially if you have a good story to tell.

Carrier of Last Resort

Every once in a while I see a regulatory requirement that makes me scratch my head. One of the requirements of the current CAF II reverse auction is that every winner must become an Eligible Telecommunications Carrier (ETC) before receiving the funding – and I wonder why this is needed. This requirement is coupled with another puzzling requirement that anybody taking this funding must provide telephone service in addition to broadband. Since the purpose of the CAF II program is to expand rural broadband, these requirements seem incongruous with the purpose of the program.

The ETC regulatory status was created by the Telecommunications Act of 1996. Congress created this new class of carriers to cover any carrier willing to provide basic services within a specified geographic area (in 1996 this was specified as providing voice service); in exchange for that willingness to serve, an ETC is eligible to receive any available subsidies.

While this is not in writing anywhere, I’m guessing that these requirements are part of the ongoing plan to erode the rural carrier of last resort obligation (COLR) for the big telcos. Carrier of last resort is a regulatory concept that is applied by regulators to utility infrastructure providers including telco incumbents, electric, gas and water providers. The textbook definition of carrier of last resort is a utility provider that is required by law to serve customers within a defined service area, even if serving that customer is not economically viable. Further a COLR is required to charge just and reasonable prices and generally has legal hurdles that make it difficult to withdraw from serving within the defined service area.

We are seeing rural carrier of last resort obligations eroding all over the place. For example, the FCC is proposing rules that will allow copper providers to tear down copper networks with no obligation to replace them with some alternate technology.

I think this requirement in the CAF II reverse auction is in the same vein. All of the areas covered by this auction are within the historic regulated footprint of one of the large telcos, except for the Verizon service areas, where Verizon did not accept the original CAF II funding. The customers in this auction are the most remote and are in very rural areas – these are the customers at the far ends of long copper lines who have no broadband, and likely no quality telephone service.

Anybody accepting the CAF II reverse auction funding must file for ETC status for the census blocks where they are getting funding. This is a requirement even if the auction winner is only going to serve one or two people within a census block. Census blocks are areas that generally include 600 – 800 homes. In cities a census block might be as small as a block or two, but in rural areas a census block can be large.

My bet is that the large telcos are going to claim that they no longer have carrier of last resort obligations in any rural area where there is now a second ETC. They will ask regulators why they should have to serve a new home built in one of these areas if there is another carrier with similar obligations. If regulators agree, this reverse auction is going to remove huge chunks of rural America from having a carrier of last resort provider.

I have an even more nagging worry. ETC status is something that is granted by state regulatory commissions. In granting this status it’s possible that some states are going to interpret this to mean that a new ETC might have some carrier of last resort obligations. If the incumbent telco tears down the copper network, it’s not unreasonable to think that state regulators might turn to the new ETC in an area to serve newly constructed homes and businesses.

I would caution anybody seeking ETC status as part of getting this funding to make sure they are not unknowingly picking up carrier of last resort obligations along with that status. If I were making such a filing myself, I would query the regulators directly to get their response on the record.

I will be the first to tell you that I could be off base on this – but this feels like one of those regulatory requirements that could have hidden consequences. I can’t think of any reason why this program would require a new provider to supply telephone service other than for letting the large telcos off the hook to do so. I know that many companies going after this funding would think twice about taking it if it means they become the carrier of last resort.

Migrating the Voice Switch

We’re seeing switch vendors end support for the first generation of softswitches, and this is forcing a lot of companies to consider how to continue to safely operate their telephone products. Almost all of my clients are seeing an erosion of their voice customer base, which makes it hard to justify investments in new switch hardware.

There are alternatives to buying new hardware that should be considered before ponying up for a new switch. Some of the options include:

Migrating to the Cloud. There are numerous options for migrating some or all of your voice services to the cloud.

The simplest migration path is to use a call feature server in the cloud. This is the device that supplies all of the voice features and is the core of a soft switch. A migration to the cloud will eliminate the call feature server hardware and software at the carrier end and replace it with a lease for the capability from a cloud-based vendor like Alianza / Level 3.

At the other extreme you can abandon the whole voice switch and move everything to a cloud-based VoIP service. This eliminates the switching hardware and software, and also eliminates the cost of interconnection and the purchase of things like SS7. There are options between the two extremes and it’s possible to outsource only some switch functions.

Sharing with a Neighbor. I’ve been preaching for years that neighboring carriers ought to partner for voice services. The typical voice switch is capable of processing huge volumes of calls and there can be significant savings when companies share the cost of the core hardware, software, interconnection costs and technician labor associated with a softswitch.

What to Watch Out For. There are possible gotchas to look out for in any switch migration. For example, a carrier that still relies on access charge revenues needs to be careful that a transition doesn’t drastically change access billing. Obviously losing access revenue is a negative, but a migration that drives access charges higher can also be negative and can draw challenges and possibly even bypass by long distance carriers.

Another wrinkle to be aware of is the ability to maintain special switching arrangements like EAS or regional long distance plans that are mandated by regulatory bodies. With good planning such arrangements can be accommodated – but address them up front.

Traditional ILECs also need to be aware of changes in settlements. Switching subsidies and related access charges have largely been phased out, but any change to rate base and access billing is something that should always be run past settlement consultants.

If planned properly and with a little creativity a carrier can save money by outsourcing switching while still meeting all regulatory requirements including network structures like host/remote complexes and even the tandem function. But if done poorly a carrier can put related revenues at risk while possibly messing up the ability of customers to make calls.

I don’t normally use this blog to directly market CCG Consulting services – I know I rarely read marketing blogs from others. But this kind of migration has hidden perils for those who aren’t careful – if you are going to do it, make sure it’s done right. There are so many moving parts in a switch migration, and often so many dollars at stake, that you must get it right the first time. The CCG staff has been migrating and upgrading switches for decades, and we can help you save money on your switching function while maintaining cash flows and meeting regulatory requirements.

 

Skyrocketing Retransmission Fees

I just saw that the Sinclair Broadcast Group, one of the largest owners of local television stations, now gets nearly 50% of its revenues from retransmission fees. This is an extraordinary number considering that a decade ago it would have been zero. What is most telling about that number is that Sinclair is growing profits while its core business is sinking. Its advertising revenues are down, and the cable companies that carry its signals continue to lose customers to cord cutting.

Retransmission fees are probably the leading factor in the escalating cost of buying cable TV. Retransmission fees are billed by local over-the-air (OTA) network stations like ABC, CBS, NBC and Fox to allow cable systems to carry their programming. FCC rules make it mandatory for cable companies to carry local OTA content. A decade ago there were almost no retransmission fees outside of a few major markets like New York City. Traditionally cable companies added local programming to their line-up without having to pay a fee – and everybody was happy because the local networks affiliates still got the eyeballs for their advertising.

But today every local network affiliate charges cable companies to carry its content. The average fee per station has climbed into the range of $3 per network per month, meaning that roughly $12 of every cable subscription is being sent directly to the local network affiliates – not into the pockets of the cable companies.

This is a rigged game and cable subscribers are funding huge profits for the network affiliates. The cable companies are forced to carry the content and the OTA network affiliates raise rates each year, essentially printing money. There is supposedly a negotiation process for the rates, but the OTA network affiliates have all of the power in the relationship and generally are inflexible on prices.

I suspect that the average cable subscriber has no idea that they are paying close to $12 per month to get the same content they could get for free with a TV antenna. We know from numerous surveys that cost is the leading factor that convinces households to cut the cord, and the escalating retransmission fees have been driving up cable rates by $2 to $3 per year.

We know from looking around the industry that many folks are rebelling against the high prices stemming from the retransmission fees. Over the last three years it’s estimated that there are almost 11 million new homes that are using antennas to get local content. Some of the most successful online video sources like Sling TV and Playstation Vue offer their lowest-cost packages of programming without any local programming.

Sling TV has gone so far as to offer a box it calls Air-TV that includes an antenna to get local content that is then integrated with the other Sling TV content. The CEO of Sling TV says that the company makes no money on this product, but it has proven popular and helps convince households to drop traditional cable by making the switch easy.

Most of the large cable companies have responded to the retransmission fees by creating a fee separate from the cost of the cable product. They label these fees with names such as Local Programming Fee and hope subscribers think it’s a tax. This is a deceptive billing practice because it lets cable companies advertise the price of cable without the local fees, yet every customer must pay these fees. I was looking at a customer bill from a big cable company last week that had a $40 rate for cable and an $11 cost for the local networks, meaning that the real price for the product was $51 yet they advertise the $40 rate when trying to attract customers.

Congress is the only body that can fix this issue, since it is rules established by Congress that make it mandatory for cable providers to carry the network affiliates. I think almost every small cable provider I know would gladly hand out free rabbit ears to customers if they could get out of paying these fees.

Of course, there is another side to the issue to consider. If these fees ended today, a lot of local TV stations would likely fold. This raises the question of whether these businesses should be propped up by this clear subsidy. The original concept behind the Congressional rules was not to establish a revenue stream for local affiliates, but rather to make sure that cable subscribers had access to local programming. But some smart industry consultants talked local stations into charging these fees, and it’s now as close as you can get to a business that prints money.

The retransmission fees probably should not go to zero – but there needs to be a balance between network affiliates and cable companies. Today there is nearly zero negotiation on these fees. There are several possible fixes I can think of. One easy fix would be to allow cable companies to make these stations optional for customers. Then people who want to use an antenna could avoid the fees. This would put some balance into the negotiations, since the networks would be less likely to gouge once they saw the impact of customers dropping their programming. This is not the only possible fix – but we need to try something, because this is badly broken.

Predicting Broadband Usage on Networks

One of the hardest jobs these days is being a network engineer who is trying to design networks to accommodate future broadband usage. We’ve known for years that the amount of data used by households has been doubling every three years – but predicting broadband usage is never that simple.
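
The doubling-every-three-years rule of thumb implies a compound annual growth rate of 2^(1/3) - 1, about 26% per year. A short sketch, where the 250 GB/month starting point is an arbitrary illustration:

```python
# Doubling every 3 years implies an annual growth factor of 2^(1/3)
annual_growth = 2 ** (1 / 3) - 1   # ~0.26, i.e. ~26% per year

# Project a hypothetical 250 GB/month household forward
usage = 250.0
for year in range(6):
    usage *= 1 + annual_growth
# six years is two doubling periods, so usage ends up 4x the start
```

The curve is smooth on paper, but as the examples below show, real networks see the growth arrive in lumps rather than in a straight line.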

Consider the recent news from OpenSource, a company that monitors usage on wireless networks. They report a significant shift in WiFi usage by cellular customers. Over the last year AT&T and Verizon have introduced ‘unlimited’ cellular plans and T-Mobile has pushed their own unlimited plans harder in response. While the AT&T and Verizon plans are not really unlimited and have caps a little larger than 20 GB per month, the introduction of the plans has changed the mindset of numerous users who no longer automatically seek WiFi networks.

In the last year the percentage of WiFi usage on the Verizon network fell from 54% to 51%; on AT&T from 52% to 49%; and on T-Mobile from 42% to 41%. Those might not sound like major shifts, but for the Verizon network it means the cellular network saw an unexpected additional 6% growth in data volume in one year over what the company might normally have expected. For a network engineer trying to make sure that all parts of the network are robust enough to handle the traffic this is a huge change, and it means that chokepoints in the network will appear a lot sooner than expected. In this case the change to unlimited plans was cooked up by marketing folks, and it’s unlikely that the network engineers knew about it any sooner than anybody else.
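
The roughly 6% figure follows directly from the share shift. If WiFi’s share of a phone’s total traffic falls from 54% to 51%, the cellular share rises from 46% to 49%; holding each device’s total traffic constant (an assumption for this sketch), that alone grows cellular volume by 49/46 - 1:

```python
def cellular_growth_from_offload_shift(wifi_before: float,
                                       wifi_after: float) -> float:
    """Extra cellular volume implied by a drop in WiFi offload share,
    holding a device's total traffic constant."""
    return (1 - wifi_after) / (1 - wifi_before) - 1

verizon = cellular_growth_from_offload_shift(0.54, 0.51)  # ~6.5%
att = cellular_growth_from_offload_shift(0.52, 0.49)      # ~6.3%
tmobile = cellular_growth_from_offload_shift(0.42, 0.41)  # ~1.7%
```

And that extra load lands on top of whatever underlying traffic growth the engineers had already planned for.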

I’ve seen the same thing happen with fiber networks. I have a client who built one of the first fiber-to-the-home networks and uses BPON, the first generation of electronics. The network was delivering broadband speeds of between 25 Mbps and 60 Mbps, with most customers in the range of 40 Mbps.

Last year the company started upgrading nodes to the newer GPON technology, which upped the potential customer speeds on the network to 1 gigabit. The company introduced both a 100 Mbps product and a gigabit product, but very few customers immediately upgraded. The upgrade meant changing the electronics at the customer location, but also involved a big boost in the size of the data pipes between neighborhood nodes and the hub.

The company was shocked to see data usage in the nodes immediately spike upward by between 25% and 40%. After all, they had not arbitrarily increased customer speeds across the board, but had just changed the technology in the background. For the most part customers had no idea they had been upgraded – so the spike can’t be attributed to a change in customer behavior like the one the cellular companies saw after introducing unlimited data plans.

However, I suspect that much of the increased usage still came from changed customer behavior. While customers were not notified that the network had been upgraded, I’m sure that many of them noticed the change. The biggest trend we’ve seen in household broadband demand over the last two years is the desire to use multiple big data streams at the same time. Before the upgrade, households were likely restricting their usage – not letting the kids game or do other large-bandwidth activities while the household was streaming video or doing work. After the upgrade they probably found they no longer had to self-monitor and restrict usage.

In addition to this likely change in customer behavior the spikes in traffic also were likely due to correcting bottlenecks in the older fiber network that the company had never recognized or understood. I know that there is a general impression in the industry that fiber networks don’t see the same kind of bottlenecks that we expect in cable networks. In the case of this network, a speed test on any given customer generally showed a connection to the hub at the speeds that customers were purchasing – and so the network engineers assumed that everything was okay. There were a few complaints from customers that their speeds bogged down in the evenings, but such calls were sporadic and not widespread.

The company decided to make the upgrade because the old electronics were no longer supported by the vendor and because they wanted to offer faster speeds to increase revenues. They were shocked to find that the old network had been choking customer usage. The change really shook the engineers at the company, who feared that broadband usage was now going to grow at the faster rate. Luckily, within a few months each node settled back down to the historic growth rate. However, the company found itself with network usage it hadn’t expected for at least another year, putting it that much closer to the next upgrade.

It’s hard for a local network owner to predict the changes that are going to affect network utilization. For example, they can’t predict that Netflix will start pushing 4K video. They can’t know that the local schools will start assigning homework that involves watching a lot of videos at home. Even though we all understand the overall growth curve for broadband usage, it doesn’t grow in a straight line, and there are periods of faster and slower growth along the curve. It’s enough to cause network engineers to go gray a little sooner than expected!

Broadband in Eugene, Oregon

Today is the first of hopefully many blogs that look at stories about ISPs, municipalities and cooperatives that have built fiber networks. My goal is to highlight the wide range of business plans being tackled, to show that there are many different paths to success. Not all of these stories will be about CCG clients, and today’s story is about Eugene, Oregon, which is not a client. If you have an interesting business plan model, I’d love to hear from you and feature you in a future blog.

The City of Eugene, Oregon identified a lack of big bandwidth as a barrier to its economic success. The city has over 400 tech-related firms, and many of them told the city that they did not have enough bandwidth to fulfill their potential. Neither the incumbents nor Google expressed interest in upgrading Eugene’s infrastructure, and the city worried that it was going to be left behind.

As is often the case, the project got started by the municipally owned electric and water utility, EWEB. The project quickly grew into a partnership that also involved the city, the Lane Council of Governments (LCOG, a regional planning and coordination organization) and the Technology Association of Oregon.

The project kicked off with a pilot project that brought fiber to five buildings in Eugene. This first effort was largely funded by the city and used the utility’s existing underground electrical conduit. LCOG contributed a local internet exchange location where carriers could easily interconnect. The utility pulled the fiber to the buildings.

The business model selected was to lease dark fiber to ISPs. A number of ISPs showed interest in using the fiber, and from their competition the market price for gigabit speed has settled at an affordable $59 for residents in these buildings and $79 for businesses. The pilot project was deemed a success, and one of the first buildings connected grew from 60% occupancy to 90%.

The consortium decided to move to the next stage and expand the network to about 120 other downtown buildings in the city. One major source of funding for that effort was to be a $1.9 million grant from the Economic Development Administration (EDA). But after a year and a half of trying that grant fell through and the consortium has regrouped and raised the money for the expansion from a mix of local funds, including funding from the downtown Urban Renewal district. This is money that was available to benefit the economically distressed downtown area and is now being pointed at fiber expansion.

Currently the project has connected 30 buildings, with plans to get to as many as 120 in the near future. The city asks for a $2,000 deposit from a building owner to demonstrate interest, and was pleasantly surprised when a low-tech downtown donut shop ponied up the deposit – demonstrating that companies of all types value fast broadband.

The benefits to the city from this venture are significant. With the dark fiber model, they expect the leases from the dark fiber to cover operating expenses. They had originally estimated that bringing fiber downtown would allow many tech companies to expand and they were hoping the fiber would bring 215 new jobs with average salaries of $74,000. The early successes show that they should easily surpass that goal, and the benefit to the city from more tech jobs is immense.

There are some interesting lessons that can be learned from this venture:

  • Without government intervention it seems unlikely that many of these downtown buildings would ever have gotten gigabit broadband. We always hear how the incumbent private sector will take care of broadband in business districts – but in Eugene it wasn’t happening.
  • The government involvement is bringing affordable gigabit broadband by creating competition between multiple ISPs selling services on the dark fibers. A single broadband provider would likely charge much higher prices.
  • An interesting lesson is how hard it is to get federal government funding. The consortium feels like they wasted almost two years by pursuing the EDA grant when it turns out that the project was never going to qualify. There are often hidden hurdles in federal funding that are impossible to overcome.
  • The city is seeing immediate economic development with new firms locating in the downtown and a number of existing tech companies now hiring – firms have been able to take on new projects due to the availability of broadband. Rather than spending a lot of effort to attract new businesses, sometimes the best economic development plan is to invest in basic infrastructure that supports businesses that are already in a community.
  • And finally, I think the city is discovering that once you solve broadband for part of the community that you create that same demand and expectation everywhere. The city regularly receives requests from the rest of the city to bring faster broadband, and my bet is that they won’t be finished with broadband expansion when this project is complete.

Tearing Down Rural Copper

In his FCC blog, FCC Chairman Ajit Pai is touting the June 7 open FCC meeting as his own version of “Avengers: Infinity War”. He says the FCC is taking on familiar headliners like “freeing up spectrum, removing barriers to infrastructure buildout, expanding satellite services, modernizing outdated rules, eliminating waste, improving accessibility, protecting consumers—and rolling them into one, super-sized meeting.”

I want to focus on the agenda item “removing barriers to infrastructure buildout”. The Chairman goes on in his blog to say the following about that topic:

Removing regulatory barriers to encourage the deployment of next-generation networks and close the digital divide certainly fits that bill. That’s something that consumers strongly support; as I’ve traveled from the Mountain West to the Gulf Coast, I’ve heard many of them say that they want to benefit from modern, more resilient technologies like optical fiber instead of limping along with slower services like DSL provided over old, often-degraded copper. To respond to that desire, I’ve shared an order with my colleagues that would make it easier for companies to discontinue outdated, legacy services and transition to the networks of the future. These reforms would enable the private sector to stop spending scarce dollars propping up fading technologies of the past and promote investment in technologies of the future. They will also make it easier to restore service in the aftermath of natural disasters and other catastrophic and unforeseen events. 

The Chairman’s rhetoric sounds great and anybody in rural America would love for the FCC to help them “benefit from modern, more resilient technologies like optical fiber”. However, this is another false narrative coming from the Chairman. Rather than promoting fiber or fast broadband, the FCC will be voting on the attached order which authorizes the following:

  • Expedites the ability of telcos to discontinue broadband services slower than 25/3 Mbps;
  • Streamlines the process for discontinuing legacy voice services;
  • Eliminates the notice periods that telcos must give to customers before discontinuing legacy services or tearing down copper;
  • Extends the streamlined notice period during force majeure events, meaning telcos can walk away from a legacy network that gets damaged in a natural disaster, as happened a few years ago on Fire Island after Hurricane Sandy.

This order makes it a lot easier for AT&T and the other giant telcos to walk away from their copper technology, their DSL networks and their legacy copper services. This comes straight from the wish list of the big telcos and is another example of how this FCC is handing the reins to the big ISPs.

The premise behind the Chairman’s rhetoric is that we must be able to discontinue the old copper networks if we are to make the investments in newer broadband technologies. This sounds like a reasonable premise except for one thing: the big telcos are not going to be bringing fiber or technologies like 5G to rural America today, tomorrow or ever.

This docket does nothing more than make it easier for the big telcos to kill copper and DSL networks and walk away from rural America. We all know those networks are dying and eventually have to come down. What bothers me about the Chairman’s rhetoric is that he is hiding the truth about this agenda item behind a lie – that tearing down the old networks somehow makes it easier to build new networks. There will be many rural households hurt by this docket. The farm with no broadband and no cellular coverage is going to see their copper lines torn down and will lose their landlines, their last remaining connection to the outside world, and the Chairman doesn’t want to publicly say that he thinks that is okay. The big telcos would like nothing more than to completely wash their hands of rural markets and this FCC is making it easier for them to walk away.

The Chairman is painting a picture that killing copper is the first step towards getting faster broadband in rural America, and that’s the big lie. The FCC has it within its authority to force the big telcos to invest some of their profits back into rural America, but it is instead letting them walk away. Once the copper lines are down there will be nothing to replace them, and future regulators will have zero leverage over the telcos after the copper networks are gone.

I find it disturbing that we have regulators without the courage to tell the American public the truth. If this FCC believes that it’s time to start tearing down rural copper, then they should say so. They know there is nothing to replace rural copper and so they are sugarcoating the topic to avoid the wrath of angry citizens. It’s disingenuous to paint the picture that this FCC is going to bring better broadband to rural America when we all know that’s not true.

AT&T’s Fiber Strategy

On the most recent earnings call with investors, AT&T’s EVP and CFO John Stephens reported that AT&T has only 800,000 customers nationwide remaining on traditional DSL. That’s down from 4.5 million DSL customers just four years ago. The company has been working hard to migrate away from the older technology.

The company overall has 15.8 million total broadband customers, including a net gain of 82,000 customers in the first quarter. This compares to overall net growth for all of 2017 of only 114,000 customers. The company has obviously turned the corner and after years of stagnant growth is adding broadband customers again. The overall number of AT&T broadband customers has been stagnant for many years, and if you go back nearly a decade the company had 15 million broadband customers, with 14 million on traditional DSL.

The 15 million customers not served by traditional DSL are served by either fiber-to-the-premises (FTTP) or fiber-to-the-node (FTTN) – the company doesn’t disclose the number on each technology. The FTTN customers at AT&T are served with newer DSL technologies that bond two copper pairs. This technology generally requires relatively short copper drops of less than 3,000 feet and can deliver download speeds above 40 Mbps. AT&T still has a goal to pass 12.5 million possible customers with fiber by the end of 2019, with an eventual goal to pass around 14 million customers.

The AT&T fiber buildout differs drastically from that done by Verizon FiOS. Verizon built to serve large contiguous neighborhoods to enable mass marketing. AT&T instead is concentrating on three different customer segments to reach the desired passings. They are building fiber to business corridors, building fiber to apartment complexes and finally, offering fiber to homes and businesses that are close to their many existing fiber nodes. Homes close enough to one of these nodes can get fiber while those only a block away probably can’t. It’s an interesting strategy that doesn’t lend itself to mass marketing, which is probably why the press has not been flooded with stories of the company’s fiber expansion. With this buildout strategy I assume the company has a highly targeted marketing effort that reaches out only to locations it can easily reach with fiber.

To a large degree AT&T’s entire fiber strategy is one of cherry picking. They are staying disciplined and are extending fiber to locations that are near their huge existing fiber networks that were built to reach large businesses, cell sites, schools, etc. I work across the country and I’ve encountered small pockets of AT&T fiber customers in towns of all sizes. The cherry picking strategy makes it impossible to map their fiber footprint since it consists of an apartment complex here and a small cluster of homes there. Interestingly, when AT&T reports these various pockets they end up distorting the FCC’s broadband maps, since those maps count a whole census block as having gigabit fiber speeds if even one customer can actually get fiber.
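To see how much block-level reporting can exaggerate coverage, here is a minimal sketch using entirely made-up toy numbers (not actual FCC or AT&T figures): if even one location in a census block has fiber, the whole block is counted as served on the map.

```python
# Toy illustration (hypothetical data) of census-block coverage reporting.
# A block counts as "served" on the map if even one location in it has fiber.

def block_level_coverage(blocks):
    """blocks: dict of block id -> (locations_with_fiber, total_locations).
    Returns (actual_share, reported_share) of locations counted as served."""
    total = sum(n for _, n in blocks.values())
    actual = sum(f for f, _ in blocks.values())
    # Map-style reporting: every location in a block counts if any one is served
    reported = sum(n for f, n in blocks.values() if f > 0)
    return actual / total, reported / total

# Hypothetical town: one wired apartment complex, one home near a fiber node,
# and one block with no fiber at all
blocks = {
    "block_a": (40, 200),   # apartment complex with fiber
    "block_b": (1, 150),    # a single home close to an existing fiber node
    "block_c": (0, 120),    # no fiber
}
actual, reported = block_level_coverage(blocks)
print(f"actual: {actual:.0%}, reported on the map: {reported:.0%}")
# → actual: 9%, reported on the map: 74%
```

In this invented example only about 9% of locations can actually buy fiber, yet the map shows roughly three quarters of the town as covered – the same distortion, in miniature, that scattered pockets of AT&T fiber create at census-block granularity.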

Another part of AT&T’s strategy for eliminating traditional DSL is to tear down rural copper and replace DSL with cellular broadband. That effort is being funded to a large extent by the FCC’s CAF II program. The company took $427 million in federal funding to bring broadband to over 1.1 million rural homes and businesses. The CAF II program only requires AT&T and the other telcos to deliver speeds of 10/1 Mbps. Many of these 1.1 million customers had slow DSL with typical speeds in the range of 1 Mbps or even less.

AT&T recently said that they are not pursuing 5G wireless local loops. They’ve looked at the technology that uses 5G wireless links to reach from poles to nearby homes and said that they can’t make a reasonable business case for the technology. They say that it’s just as affordable in their expansion model to build fiber directly to customers. They also know that fiber provides a quality connection but are unsure of the quality of a 5G wireless connection. That announcement takes some of the wind out of the sails for the FCC and legislators who are pressing hard to mandate cheap pole connections for 5G. There are only a few companies that have the capital dollars and footprint to pursue widespread 5G, and if AT&T isn’t pursuing this technology then the whole argument that 5G is the future of residential broadband is suspect.

This is one of the first times that AT&T has clearly described their fiber strategy. Over the last few years I wrote blogs that wondered where AT&T was building fiber, because outside of a few markets where they are competing with companies like Google Fiber it was hard to find any evidence of fiber construction. Instead of large fiber roll-outs across whole markets, it turns out that the company has been quietly building a fiber network that adds pockets of fiber customers across their whole footprint. One interesting aspect of this strategy is that those who don’t live close to an AT&T fiber node are not likely to ever get AT&T fiber.

The Flood of New Satellite Networks

I wrote a blog a few months ago about SpaceX and Elon Musk’s plan to launch a massive network starting with over 4,400 low-orbit satellites to blanket the world with better broadband. SpaceX has already launched the first few satellites to test the technology. It seems like a huge logistical undertaking to get that many satellites into orbit, and SpaceX is not the only company with plans for satellite broadband. Last year the FCC got applications for approval of almost 9,000 different new communications satellites. Some are geared to provide rural broadband like SpaceX, but others are pursuing IoT connectivity, private voice networks and the creation of space-based backhaul and relay networks.

The following companies are targeting the delivery of broadband:

Boeing. Boeing plans a network of 2,956 satellites that will concentrate on providing broadband to government and commercial customers worldwide. They intend to launch 1,396 satellites within the next six years. This would be the aerospace company’s first foray into being an ISP, but they have been building communications satellites for over fifty years.

OneWeb. The company is headquartered in Arlington, Virginia and was founded by Greg Wyler. The company would be a direct competitor to SpaceX for rural and residential broadband and plans a network of over 700 satellites. They have arranged launches through Virgin Galactic, the company founded by Richard Branson. The company plans to launch its first satellite next year.

O3b. The company’s name stands for the ‘other 3 billion’ meaning those in the world with no access to broadband today. This company is also owned by Greg Wyler. They already operate a few satellites today that provide broadband to cruise ships and to third-world governments. Their plan is to launch 24 additional satellites in a circular equatorial orbit. Rather than launching a huge number of small satellites they plan an interconnected network of high-capacity satellites.

ViaSat. The company already provides rural broadband today and plans to add an additional 24 satellites at an altitude of about 4,000 miles. The company launched its new ViaSat-2 satellite this year to augment its existing broadband satellite service across the western hemisphere. The company is promising speeds of up to 100 Mbps. In addition to targeting rural broadband customers, the satellite is targeting broadband delivery to cruise ships and airplanes.

Space Norway. The company wants to launch two satellites that specifically target broadband delivery to the Arctic region in Europe, Asia and Alaska.

The business plans of the following companies vary widely and show the range of opportunities for space-based communications:

Kepler Communications. This Canadian company headquartered in Toronto is proposing a network of up to 140 tiny satellites the size of a football which will be used to provide private phone connectivity for shipping, transportation fleets and smart agriculture. Rather than providing broadband, the goal is to provide private cellphone networks to companies with widely dispersed fleets and locations.

Theia Holdings. The company is proposing a network of 112 satellites aimed at telemetry and data gathering for services such as weather monitoring, agricultural IoT, natural resource monitoring, general infrastructure monitoring and security systems. The network will consist almost entirely of machine to machine communication.

Telesat Canada. This Canadian company already operates satellites today that provide private voice communications networks for corporate and government customers. The company is launching two new satellites to supplement the 15 already in orbit and has plans for a network consisting of at least 117 satellites. The company’s largest targeted customer is the US Military.

LeoSat MA. The company is planning a worldwide satellite network that can speed a transmission around the globe about 1.5 times faster than terrestrial fiber networks. Their market will be large businesses and governments that need real-time communication around the globe for applications like stock exchanges, business communications, scientific applications and government communications.

Audacy Corp. The company wants to provide the first satellite network aimed at providing communications between satellites and spacecraft. Today there is a bandwidth bottleneck between terrestrial earth stations and satellites, and Audacy proposes to create a space-only broadband relay network to enable better communications between satellites, making it the first space-based backbone network.