Rising Broadband Speeds

For the second year in a row, the M-Lab coalition measured broadband speeds in 200 countries. The coalition includes New America’s Open Technology Institute, Google Open Source Research, Princeton University’s PlanetLab and others, with the results compiled by Cable in the UK. The statistics are based upon over 163 million speed tests. The results are available in a spreadsheet and are worth a look for those who love numbers.

Because the results come from speed tests, the vagaries of those tests must be factored into the results. Hopefully all of the readings use the same speed test, because each speed test on the market uses a different algorithm to calculate speed. For example, the algorithm for speedtest.net, operated by Ookla, discards the fastest 10% and the slowest 30% of the results obtained. Speed tests are overinflated in many instances when ISPs use a burst technology that provides a faster broadband speed for the first minute or two of any web connection. The results are also lowered by any network issues at the customer’s end, such as an underperforming WiFi network. The bottom line is that any given speed test number must be taken with a grain of salt, but comparing millions of speed test results ought to make for a valid relative comparison.
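
As a rough illustration of why the methodology matters, here is a minimal Python sketch of a trimmed average along the lines described above. The sample values and the exact trimming math are illustrative assumptions of mine, not Ookla’s published code.

    # Illustrative sketch: how trimming changes a reported speed test result.
    # The sample values below are made up; the trim percentages follow the
    # description above, but this is not Ookla's actual implementation.
    def trimmed_speed(samples_mbps, drop_fastest=0.10, drop_slowest=0.30):
        """Average the samples after dropping the fastest 10% and slowest 30%."""
        ordered = sorted(samples_mbps)
        n = len(ordered)
        low = int(n * drop_slowest)        # slowest samples to drop
        high = n - int(n * drop_fastest)   # index just past the kept samples
        kept = ordered[low:high]
        return sum(kept) / len(kept)

    samples = [4.1, 8.8, 9.0, 9.2, 9.3, 9.5, 9.6, 9.8, 10.1, 24.0]  # Mbps
    print(f"Plain average:   {sum(samples) / len(samples):.1f} Mbps")
    print(f"Trimmed average: {trimmed_speed(samples):.1f} Mbps")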

Overall, the tests show a worldwide increase in broadband speeds of 23% in just one year. To put that in perspective, however, that’s a worldwide increase from only 7.4 Mbps to 9.1 Mbps. It’s more interesting to look at the results from the countries with the fastest and slowest broadband. The top 25 fastest broadband countries on the list increased speeds by 28.9% while the bottom 25 increased by only 7.4%.

The US moved up one slot, from number 21 to number 20 this year – increasing average speeds from 25.0 Mbps to 25.9 Mbps. This is a substantial increase that I think can be attributed to three factors. The most significant is probably that several large cable companies have unilaterally increased base speeds due to the introduction of DOCSIS 3.1. Average speeds also continue to climb as several million customers per year migrate from DSL to cable modems. Finally, we are slowly building fiber to residences and probably added a few million fiber passings last year.

The worldwide broadband leader is Singapore with an average speed of 60.4 Mbps. They are followed by Sweden, Denmark, Norway, Romania, Belgium and the Netherlands. Romania is interesting because they rose 13 places with a jump in speed from 21.3 Mbps to 38.6 Mbps – they obviously have been implementing a lot of fiber. The biggest drop on the chart is Hong Kong, which fell 10 places on the list as their broadband speeds dropped slightly from 27.2 Mbps to 26.5 Mbps. It wasn’t too many years ago when Hong Kong was way ahead of the US, but that gap has completely closed.

One of the more important things this research shows is that good broadband can be found in North America, most of Europe and some of southeast Asia. Broadband speeds everywhere else are far behind. The gap between the haves and have-nots is growing. The increase in average speeds for the top 100 countries on the list was 5.4 Mbps in one year while the increase for the bottom countries was only 0.4 Mbps.

It’s also worth remembering that speeds differ within each country. In this country we still have millions of rural homes that have no Internet access or access at third-world speeds. The same is likely true around the world, with better broadband in urban areas than in rural areas. It’s also worth remembering that only about 4.1 billion people, or 54% of the population of the world, have access to broadband.

These kinds of statistics are useful because they probably act as a goad to governments that are far down the list to find ways to improve broadband. We know that good Internet brings a huge number of economic and other advantages, and countries with good broadband are implementing new technologies that aren’t going to be available in countries with slow broadband networks.

There is hope for those areas with little or no broadband. Several groups are proposing satellites that can bring broadband everywhere. Endeavors like Google’s Loon are looking at bringing broadband to rural areas across the globe. Hopefully we will see speeds in the third world increasing significantly over the next decade. While only in its second year, the work being done by M-Lab is another good yardstick for governments to measure their progress.

A Deeper Look at 5G

The consulting firm Bain & Company recently looked at the market potential for 5G. They concluded that there is an immediate business case to be made for 5G deployment. They go on to conclude that 5G ‘pessimists’ are wrong. I count myself as a 5G pessimist, but I admit that I look at 5G mostly from the perspective of the ability of 5G to bring better broadband to small towns and rural America. I agree with most of what Bain says, but I take the same facts and am still skeptical.

Bain says that the most immediate use for 5G deployment is in urban areas. They cite an interesting statistic I’ve never seen before: it will cost $15,000 – $20,000 to upgrade an existing cell site with 5G, but between $65,000 and $100,000 to deploy a new 5G node. Until the cost for new 5G cell sites comes way down it’s going to be hard for anybody to justify deploying them except in those places that have enough potential business to support the high investment cost.

Bain recommends that carriers deploy 5G quickly in those places where it’s affordable in order to be first to market with the new technology. Bain also recommends that cellular carriers take advantage of improved mobile performance, but also look hard at the fixed 5G opportunities to deliver last-mile broadband. They say that an operator that maximizes both opportunities should be able to see a fast payback.

A 5G network deployed on existing cell towers is going to create small circles of prospective residential broadband customers – and those circles aren’t going to be very big. Delivering significant broadband would mean serving customers within roughly 1,000 to 1,500 feet of a transmitter. Cell towers today are much farther apart than those distances, and this means a 5G delivery map consisting of scattered small circles.
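
To put those distances in perspective, here is a quick back-of-the-envelope calculation of the area covered by such a circle; the radii come from the paragraph above and the rest is simple geometry.

    import math

    FEET_PER_MILE = 5280

    # Area of the broadband circle around a 5G transmitter for the radii
    # discussed above.
    for radius_ft in (1000, 1500):
        radius_mi = radius_ft / FEET_PER_MILE
        area_sq_mi = math.pi * radius_mi ** 2
        print(f"{radius_ft} ft radius covers about {area_sq_mi:.2f} square miles")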

There are not many carriers willing to tackle that business plan. It means selectively marketing only to those households within range of a 5G cell site. AT&T is the only major ISP that already uses this business plan. AT&T currently offers fiber to any homes or businesses close to their numerous fiber nodes. They could use that same sales plan to sell fixed broadband to customers close to each 5G cell site. However, AT&T has said that, at least for now, they don’t see a business case for 5G similar to their fiber roll-out.

Verizon could do this, but they have been walking away from a lot of their residential broadband opportunities, going so far as to sell a lot of their FiOS fiber customers to Frontier. Verizon says they will deploy 5G in several cities starting next year but has never talked about the number of potential households they might cover. This would require a major product roll-out for T-Mobile or Sprint, but in the document they filed with the FCC to support their merger they said they would tackle this market. Neither company currently has the fleet of technicians or the back office needed to support the fixed residential broadband market.

The report skims past the question of the availability of 5G technology. Like any new technology, the first few generations of field equipment are going to have problems. Most players in the industry have learned the lesson of not widely deploying any new technology until it’s well-proven in the field. Verizon says their early field trials have gone well, and we’ll have to wait until next year to see how much 5G they are ready to deploy with first-generation technology.

Bain also says there should be no ‘surge’ in capital expenditures if companies deploy 5G wisely – but the reverse is also true, and bringing 5G small cells to places without existing fiber is going to be capital intensive. I agree with Bain that, technology concerns aside, the only place where 5G makes sense for the next few years is urban areas, and mostly on existing cell sites.

I remain a pessimist about 5G being feasible in more rural areas. The cost of the electronics will need to drop to a fraction of today’s cost. There are always going to be pole issues for deploying small cells in rural America – even should regulators streamline the hanging of small cell sites, those 5G devices can’t be placed onto the short poles we often see in rural America. While small circles of broadband delivery might support an urban business model, the low density in rural America might never make economic sense.

I certainly could be wrong, but I don’t see any companies sinking huge amounts of money into 5G deployments until the technology has been field-proven and until the cost of the technology drops and stabilizes. I hope I am proven wrong and that somebody eventually finds a version of the technology that will benefit rural America – but I’m not going to believe it until I can kick the tires.

Do People Really Want a la Carte TV?

We just got a glimpse of a la carte TV and it makes me wonder if this is what people really want. Poll after poll over the years has shown that people would like to pick their own channels. I’m not sure that many people will really want a la carte channels once they see the market reality of the product.

Sling TV just started offering a number of a la carte channels and they are available to anybody. Subscribers don’t need to buy another Sling TV package and can buy just one channel. The company says they are planning on offering more a la carte channels.

For now the a la carte line-up is small. It includes Showtime for $10 per month, which is also available elsewhere online. The other channels available now include:

  • Dove Channel for $5 per month. This channel is not carried on any cable systems and is marketed direct to consumers. It carries a library of Christian-based programming.
  • CuriosityStream for $6 per month. This is an ad-free network that delivers documentaries and shows about science, technology and nature.
  • Stingray Karaoke for $7 per month. This network carries a big library of karaoke songs, streaming both the music and lyrics.
  • Outside TV Features for $5 per month. This network carries a big library of outdoor adventure sports films, with dramatic footage of surfing, skiing, skydiving and numerous other adventure sports.
  • UP Faith & Family for $5 per month. This carries original content and movies that are family-based and faith-friendly.
  • Pantaya for $6 per month. This network carries Spanish movies.
  • NBA League Pass for $28.99 per month. This network carries all NBA games and related content.

Sling TV is not the first to offer a la carte channels; a la carte programming is also a big part of Amazon Prime. Amazon carries many of these same networks, plus over 100 others. However, you must subscribe to the Amazon Prime service for $119 per year in order to buy the a la carte channels. Amazon has taken the approach of being the biggest bundler of content and has become the portal to a huge array of content.

The only other service with any real a la carte characteristics is the new package offered by Charter, only to their own customers. They provide the local networks in a market and then let a subscriber choose 10 out of 65 networks. This is supposedly priced at $21.99, but the fine print shows there will be other fees, typical of a cable company, and I’m guessing this will cost around $30.

What strikes me most about the Sling TV offering is the monthly fee of between $5 and $7 per channel. How many people are willing to spend $60 to $84 per year for one channel? Surveys by Nielsen have shown that the average family regularly watches about a dozen networks. At $5 per channel that would mean $60 per month to get the networks a household wants. But local network channels, movie networks and sports networks would likely cost more than $5, and it wouldn’t be hard to see a bill of $75 to $100 to buy only the channels a family regularly watches.
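
The arithmetic behind that concern is straightforward; here is a minimal sketch using the figures from the paragraph above.

    # A la carte bill math using the figures discussed above.
    channels_watched = 12      # roughly what Nielsen says a family watches
    low_price, high_price = 5, 7

    print(f"Monthly bill at ${low_price}/channel: ${channels_watched * low_price}")
    for price in (low_price, high_price):
        print(f"One channel at ${price}/month costs ${price * 12} per year")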

I don’t think this is what households want. When people respond to surveys about buying channels individually, they are not thinking of paying $5 each. I recall a Nielsen survey from a few years ago in which people suggested they would be willing to pay less than $2 per channel if they could buy them individually.

I saw a Google article that said that the Dove Channel had over 100,000 customers. Even if they now have twice that, at $5 per month per subscriber the network would have a monthly income of $1 million. That might sound like a lot, but it’s not enough to support a staff, buy the needed content and also try to fund original programming.

Contrast this with a network that sits today on the traditional line-ups of cable systems. At the current nationwide cable TV penetration rate of 69%, a network that charges the cable companies only a nickel per subscriber would make $4.4 million per month. A network like the Dove Channel would need nearly 900,000 subscribers at $5 per month to perform as well as a traditional cable network that charges only a nickel. You can see why most cable networks are scared of the a la carte model, because very few of them could survive as standalone online providers.
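
For anyone who wants to check that math, here is the rough calculation. The count of US households is an approximation of my own; the penetration rate and prices are the ones cited above.

    # Back-of-the-envelope: a nickel-per-subscriber cable network versus a
    # $5/month a la carte channel. The US household count is my own rough
    # approximation, not a figure from the post.
    us_households = 127_000_000
    cable_penetration = 0.69
    nickel_rate = 0.05                      # dollars per subscriber per month

    cable_revenue = us_households * cable_penetration * nickel_rate
    print(f"Nickel network on cable: ${cable_revenue / 1e6:.1f} million per month")

    a_la_carte_price = 5.00
    print(f"Subscribers needed at $5/month to match: {cable_revenue / a_la_carte_price:,.0f}")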

Shrinking Cellular Backhaul Revenues

There are a few carriers that rely on cellular backhaul as a major part of their revenue stream, but there are many more carriers that provide transport to a handful of cell sites. In all cases these are some of the highest-margin and most lucrative products sold on the market today, and a business line that every carrier wants to keep. However, there are big changes coming in the cellular market, and today I will look at the trends that are going to affect this market over the next decade.

Increasing Bandwidth Demand. The growth in bandwidth demand at many cell sites is explosive, with overall cellular data doubling every 18 months. This growth is not the same everywhere, with most of it coming at cell sites serving residential customers rather than at older cell sites built to satisfy highway phone coverage.
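
To show what doubling every 18 months implies for backhaul planning, here is a quick compound-growth sketch; the 1 Gbps starting point is an arbitrary illustrative figure.

    # Compound growth of cell-site bandwidth demand, doubling every 18 months.
    # The 1 Gbps starting point is arbitrary and only for illustration.
    doubling_period_months = 18
    start_gbps = 1.0

    for years in (1, 3, 5):
        multiplier = 2 ** (years * 12 / doubling_period_months)
        print(f"After {years} year(s): {start_gbps * multiplier:.1f} Gbps "
              f"({multiplier:.1f}x today's demand)")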

The demand growth is being driven by several factors. First, it’s becoming far more prevalent for customers to use cellphones to watch video. Part of that growth in demand comes directly from the big cellular companies which are bundling in access to content as part of the service. But a more important reason for the growth in demand is that the historic reluctance of customers to use cellular data is eroding as the cellular companies push ‘unlimited’ data plans.

Demands for Lower Transport Costs. Cellular service has become a commodity. The industry is no longer adding many new customers since almost everybody has a cellphone. This has led to price wars between cellular providers, and lower average customer prices are driving the cellular companies to look for cost reductions. At least in urban areas they are also starting to lose significant numbers of customers to Comcast, with Charter just entering the fray.

Recently I’ve seen cellular companies ask for lower prices as contracts get renewed or else demand greater bandwidth for the prices already in place. This means that fiber owners are not likely to see increases in revenues even as the bandwidth they are delivering grows.

Cellular Carriers Building Fiber. I’ve had several clients tell me recently that Verizon or AT&T is building fiber in their area. While this construction might be to reach a new large customer, the most likely reason these companies are building right now is to eliminate leased transport at cell sites. This is not just happening in urban areas; one of my clients who serves a market of 10,000 homes tells me that Verizon is building fiber to all of the cell sites in the area.

Verizon made headlines last year when they ordered $1 billion in fiber. AT&T is also building furiously. If you believe the claims made by T-Mobile and Sprint as part of the proposed merger – they also will be expanding their own fiber.

I also expect the cellular carriers to make reciprocal deals to swap fiber connections at cell sites where they now own fiber. If Verizon and AT&T each build to 2,000 cell sites they could easily swap transport and both gain access to 4,000 cell sites – that’s a huge nationwide decrease in transport revenues for others.

Growth of Small Cells. Layered on top of all of this is the predicted growth of small cell sites. I don’t think anybody knows how big this market might grow. I’ve seen optimistic predictions that small cell sites will be everywhere and other predictions that the business case for small cell sites might never materialize. Many of my clients are seeing the deployment of a few small cell sites to relieve 4G congestion, but it’s hard to predict in smaller markets if this will ever expand past that.

One thing we can know for sure is that the cellular carriers will not be willing to pay the same prices for connection to small cell sites that they’ve been paying for the big cell tower sites. By definition, a smaller cell site is going to serve a smaller number of customers and the pricing must be reduced accordingly for it to make sense for the cellular providers.

Conclusion. My best guess is that cellular transport will be hit and miss depending upon the specific local situation. There are many who will lose all cell site transport where the cellular carriers decide to build their own fiber. But even where they don’t build fiber, I would expect the cellular carriers to bring the threat of physical bypass into price negotiations to drive transport prices far below where they are today.

This is a natural economic consequence of cellular becoming a commodity. As the cellular industry tightens its belt it’s going to demand lower costs from its supply chain. Transport is one of the major costs of the cellular industry and the most natural place to look for savings. The big cellular companies already understand this future, which is one of the primary reasons they are furiously building fiber today while they have the cash to do so.

Simultaneous Data Streams

By working all over the country I get to hear a lot of stories about how people use broadband. I’ve noticed over the last few years that the household expectation for broadband performance has changed.

As recently as three or four years ago most households seemed to judge the adequacy of their broadband connection by how well it would handle a video stream from Netflix or other streaming service. Households that couldn’t stream video well were unhappy, but those that could generally felt that their broadband connection was good enough.

Interestingly, much of the perceived improvement in the ability to stream video was not due to better broadband performance. Streaming services like Netflix took steps to improve the performance of their product. Netflix had always buffered their content, meaning that a customer would load the video stream a few minutes ahead of viewing to smooth out the variation in customer broadband connections. They subsequently built some brains into the service so that the compression used for a given stream would vary according to the broadband connection of the customer. They also began caching their content with ISPs so that their signal would be delivered from the ISP’s local network and not from somewhere in the distant cloud.
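
As a rough sketch of how that kind of adaptive streaming logic works in general, consider the snippet below. This is a simplified illustration, not Netflix’s actual algorithm; the bitrate ladder and headroom factor are assumptions of mine.

    # Simplified adaptive-bitrate selection: pick the highest rung of a bitrate
    # ladder that fits within the measured connection speed, keeping some
    # headroom. The ladder and headroom values are illustrative assumptions.
    BITRATE_LADDER_MBPS = [0.5, 1.0, 1.75, 3.0, 5.0, 7.0, 15.0]
    HEADROOM = 0.8   # keep 20% of measured throughput in reserve

    def pick_bitrate(measured_mbps):
        usable = measured_mbps * HEADROOM
        candidates = [b for b in BITRATE_LADDER_MBPS if b <= usable]
        return candidates[-1] if candidates else BITRATE_LADDER_MBPS[0]

    for speed in (1.5, 6.0, 25.0):
        print(f"{speed:5.1f} Mbps connection -> stream at {pick_bitrate(speed)} Mbps")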

Streaming quality then became an issue again with the introduction of live streaming sports and other content, and many of the flaws in the video stream became more apparent. I remember trying to watch ESPN online when it was first offered by Sling TV and the experience was miserable – the stream would crash a number of times during a football or basketball game. Live-streaming services have subsequently improved their product to work better with a variety of broadband connections.

Over the last two years I’ve noticed a big change in how households talk about their broadband performance. I haven’t heard anybody mention streaming a single video in a few years, and the expectation for a broadband connection now is that it can handle multiple data streams at the same time.

This tells me two things. First, as mentioned above, video streaming has improved to the point where you don’t get interruptions on most broadband connections. But more importantly, households have changed how they use broadband. I think my household is a typical example. The only broadband need we have that is different from many families is that my wife and I both work from home. But other than that, we don’t have atypical broadband demands.

If you go back five years we probably had perhaps half a dozen devices in our home capable of connecting to the Internet. We rarely demanded a lot of simultaneous broadband. Today we have over 40 Internet-capable devices in our house. While some of them use little or no broadband, we’ve changed how we use broadband. We are cord cutters and routinely stream several videos at the same time while also using the Internet for gaming and schoolwork. We often stream music. Our computers automatically upload files to the cloud and download software updates. Cellphones are connected to the WiFi and there is regular use of FaceTime and other apps that include video streams.

Interestingly, when the FCC established 25/4 Mbps as the definition of broadband they justified the speed by looking at simultaneous uses of multiple broadband services. At that time a lot of critics derided the FCC’s justification since it wasn’t realistic for how most households really used broadband. Perhaps the staff at the FCC was prescient, because their illustrative examples are exactly how a lot of homes use broadband today.

If anything, the FCC’s method was conservative because it didn’t account for the interference that arises in a home network that is processing multiple data streams at the same time. The more streams, the more interference, and it wouldn’t be unusual for a home like ours to experience 20% to 30% overhead in our WiFi network while processing the numerous simultaneous streams.
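
A quick sketch of what that overhead does to a 25 Mbps connection, using the rough percentages above:

    # Effective throughput of a 25 Mbps connection once in-home WiFi overhead
    # from many simultaneous streams is accounted for. The overhead range is
    # the rough estimate given above.
    subscribed_mbps = 25

    for overhead in (0.20, 0.30):
        effective = subscribed_mbps * (1 - overhead)
        print(f"{overhead:.0%} WiFi overhead leaves about {effective:.1f} Mbps usable")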

Unfortunately, many policy makers are still stuck in the old paradigm. This is the only way they can justify something like the CAF II program that will provide data speeds in the 10 Mbps range. They still talk about how that connection will allow a household to watch video or do homework, but they ignore all of the other ways that homes really use broadband today. I know for my home that a 25 Mbps broadband connection is not sufficient and will bog down at various times of the day – so I buy something faster. It’s hard to imagine stepping back to a 10 Mbps connection, because doing so would force us to make hard choices about curtailing our broadband usage.

Comcast Dismantles Data Throttling

On June 11 Comcast announced they had dismantled a congestion management system that had been in place since 2008. This system was used to throttle data speeds for large users of residential data. The company says that their networks are now robust enough that they no longer need to throttle users and that the system hasn’t been used for the last year.

Comcast implemented the congestion management system in 2008 after it had been caught throttling traffic to and from BitTorrent. The FCC said the throttling was discriminatory and ordered Comcast to cease the practice. Comcast responded to the FCC with the introduction of the congestion management system that cut back usage for all large residential data users on what Comcast said was a non-discriminatory basis.

At the time Comcast claimed that large data users, who back then were exchanging video files, were slowing down their network – and they were probably right. The ISP industry has been blindsided twice in my memory by huge increases in demand for bandwidth. The first time was in the 1990s when Napster and many others promoted the exchange of music MP3 files. The same thing happened a decade ago when people started sharing video files – often pirated copies of the latest movies.

To be fair to Comcast, a decade ago the number one complaint about cable company broadband was that speeds bogged down during the evening prime time hours – the time when most customers wanted to use the network. The Comcast throttling was an attempt to lower the network congestion during the busiest evening hours. Comcast says the throttling system is no longer needed since the widespread implementation of DOCSIS and improvements in backhaul have eliminated many of the network bottlenecks.

Comcast now offers gigabit download speeds in many markets. I suspect that they are relying on the fact that only a small percentage of their customers will buy and use this big bandwidth in a given neighborhood, because a significant number of gigabit users could still swamp an individual neighborhood node. I wonder if the company would reinstitute the throttling system should their network become stressed by some future unexpected surge in broadband traffic. It’s possible that some big bandwidth application such as telepresence could go viral and swamp their data networks, as happened in the past with music files and then video.

Interestingly, the company still maintains customer data caps. Any customer that uses more than 1 terabyte in a month must pay $10 for each extra 50 gigabytes or pay $50 extra to get unlimited data. Comcast never directly said that the data caps were for congestion management, although they often hinted that was the reason for the caps.
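
Here is a quick sketch of how that overage pricing plays out. The cap and prices are the ones described above; the usage levels are illustrative, and the sketch assumes a customer buys the $50 unlimited add-on whenever it is cheaper than the overage.

    import math

    # Overage math as described above: roughly 1 TB included, $10 per extra
    # 50 GB block, or $50 flat for unlimited data.
    CAP_GB = 1000
    BLOCK_GB = 50
    BLOCK_PRICE = 10
    UNLIMITED_PRICE = 50

    def extra_charge(usage_gb):
        over = max(0, usage_gb - CAP_GB)
        blocks = math.ceil(over / BLOCK_GB)
        # Assume a rational customer switches to unlimited when it's cheaper.
        return min(blocks * BLOCK_PRICE, UNLIMITED_PRICE)

    for usage in (900, 1100, 1400):    # GB used in a month (illustrative)
        print(f"{usage} GB used -> ${extra_charge(usage)} extra")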

The official explanation of the data caps has been that heavy users need to pay more since they use the network more. Comcast has always said that they use the revenues from data caps to pay for needed upgrades to the network. But this seems a little disingenuous from a company that generated $21.4 billion in free cash in 2017 – nearly $1.8 billion per month.

Comcast is not the only ISP that has been throttling Internet traffic. All four major wireless carriers throttle big data users at some point. T-Mobile is the most generous and starts throttling after 50 GB of monthly usage, while the other three big wireless carriers throttle after 20 – 25 GB per month.

A more insidious form of data throttling is the use of bursting technology that provides faster broadband speeds for the first minute or two of any given broadband session. During this first minute customers will get relatively fast speeds – often set at the level of their subscription – but if the session is prolonged past that short time limit then speeds drop significantly. This practice fools customers into thinking that they get the speeds they have subscribed to – which is true for the short duration of the burst – but is not true when downloading a large file or streaming data for more than a minute or two. The carriers boast about the benefits of data bursts by saying they give extra broadband for each request – but they are really using the technology to throttle data for any prolonged data demands.
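
A simple illustration of why a short speed test doesn’t expose this behavior is shown below; the burst and throttled speeds are made-up numbers, not any carrier’s actual settings.

    # How a bursting policy skews measured speed: fast for the first minute,
    # throttled afterward. The speeds and durations are made-up illustrative
    # numbers, not any carrier's actual settings.
    BURST_MBPS = 100
    STEADY_MBPS = 25
    BURST_SECONDS = 60

    def average_speed(session_seconds):
        burst_time = min(session_seconds, BURST_SECONDS)
        steady_time = max(0, session_seconds - BURST_SECONDS)
        megabits = burst_time * BURST_MBPS + steady_time * STEADY_MBPS
        return megabits / session_seconds

    for seconds in (30, 600):   # a short speed test vs. a 10-minute download
        print(f"{seconds:4d} s session averages {average_speed(seconds):.0f} Mbps")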

5G Cellular for Home Broadband?

Sprint and T-Mobile just filed a lengthy document at the FCC that describes the benefits of allowing the two companies to merge. This kind of filing is required for any merger that needs FCC approval. The FCC immediately opened a docket on the merger and anybody that opposes the merger can make counterarguments to any of the claims made by the two companies.

The two companies decided to highlight a claim that the combined Sprint and T-Mobile will be able to roll out a 5G network that can compete with home broadband. They claim that by 2024 they could gain as much as a 7% total market penetration, making them the fourth biggest ISP in the country.

The filing claims that their 5G network will provide a low-latency broadband product with speeds in excess of 100 Mbps within a ‘few years’. They claim that customers will be able to drop their landline broadband connection and tether their home network to their unlimited cellular data plan instead. Their filing claims that this will only be possible with a merger. I see a lot of holes that can be poked in this claim:

Will it Really be that Fast? The 5G cellular standard calls for eventual speeds of 100 Mbps. If 5G follows the development path of 3G and 4G, then those speeds probably won’t be fully met until near the end of the next decade. Even if a 5G network can achieve 100 Mbps in ideal conditions, there is still a huge challenge in meeting those speeds in the wild. The 5G standard achieves 100 Mbps by bonding multiple wireless paths, using different frequencies and different towers to reach a customer. Most places are not receiving true 4G speeds today and there is no reason to think that using a more complicated delivery mechanism is going to make this easier.

Cellphone Coverage is Wonky. What is never discussed when talking about 5G is how wonky all wireless technologies are in the real world. Distance from the cell site is a huge issue, particularly with some of the higher frequencies that might be used with 5G. More important are local interference and propagation. As an example, I live in Asheville, NC. It’s a hilly and wooded town, and at my house I have decent AT&T coverage, but Verizon sometimes has zero bars. I only have to go a few blocks to find the opposite situation, where Verizon is strong and AT&T doesn’t work. 5G is not going to automatically overcome all of the topographical and interference issues that affect cellular coverage.

Would Require Significant Deployment of Small Cell Sites. Achieving 100 Mbps in enough places to be a serious ISP is going to require a huge deployment of small cell sites, and that means the deployment of a lot of fiber. This is going to be a huge hurdle for any wireless company that doesn’t have a huge capital budget for fiber. Many analysts still believe that this might be a big enough hurdle to quash a lot of the grandiose 5G plans.

A Huge Increase in Wireless Data Usage. Using the cellular network to provide the equivalent of landline data means an order-of-magnitude increase in the bandwidth that will be carried by the cellular networks. FierceWireless along with Strategic Analytics recently did a study on how the customers of the major cellular companies use data. They reported that the average T-Mobile customer today uses 18.4 GB of data per month, with 5.3 GB on the cellular network and the rest on WiFi. Sprint customers use 18.2 GB per month with 4.4 GB on the cellular networks. Last year Cisco reported that the average residential landline connection used over 120 GB per month – a number that is doubling every three or four years. Are cellular networks really going to be able to absorb a twenty- or thirty-times increase in bandwidth demand? That will require massive increases in backhaul bandwidth along with huge capital expenditures to avoid bottlenecks in the networks.
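
The multiple implied by those figures is easy to work out; the sketch below uses the usage numbers cited above.

    # Implied increase in cellular network load if home broadband usage moved
    # onto the cellular network, using the usage figures cited above.
    cellular_gb_per_month = {"T-Mobile": 5.3, "Sprint": 4.4}
    landline_gb_per_month = 120

    for carrier, cellular_gb in cellular_gb_per_month.items():
        multiple = landline_gb / cellular_gb
        print(f"{carrier}: shifting home usage to cellular is roughly a "
              f"{multiple:.0f}x increase per customer")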

Data Caps are an Issue.  None of the cellular carriers offers truly unlimited data today. T-Mobile is the closest, but their plan begins throttling data speeds when a customer hits 50 GB in a month. Sprint is stingier and is closer to AT&T and Verizon and starts throttling data speeds when a customer hits 23 GB in a month. These caps are in place to restrict data usage on the network (as opposed to the ISP data caps that are meant to generate revenue). Changing to 5G is not going to eliminate network bottlenecks, particularly if we see millions of customers using cellular networks instead of landline networks. All of the carriers also have a cap on tethering data – making it even harder to use as a landline substitute – T-Mobile caps tethering at 10 GB per month.

Putting it all into Context. To put this into context, John Legere already claims today that people ought to be using T-Mobile as a landline substitute. He says people should buy a multi-cellphone plan and use one of the phones to tether their home devices. 4G networks today have relatively high latency, and 4G speeds today can reach 15 Mbps in ideal conditions but are usually slower. 4G also ‘bursts’ today and offers faster speeds for the first minute or two and then slows down to a crawl (you see this when you download phone apps). I think we have to take any claims made by T-Mobile with a grain of salt.

I’m pretty sure that the concept of using the merger to create a new giant ISP is mostly a red herring. No doubt 5G will eventually offer an alternative to landline broadband for those homes that aren’t giant data users – but it’s also extremely unlikely that a combined T-Mobile / Sprint could somehow use 5G cellular to become the fourth biggest ISP starting ‘a few years from now’. I think this claim is being emphasized by the two companies to provide soundbites to regulators and politicians who want to support the merger.

Sonic – the Transition from UNEs to Fiber

In my continuing series of writing about interesting competitors, today’s blog is about Sonic, a CLEC and fiber overbuilder working in the San Francisco Bay area and other communities in California. It’s an interesting company because they are the poster child for building a competitive telecom company based upon the rules established by the Telecommunications Act of 1996. That Act required that the large telephone companies unbundle their networks to allow competitors to use their copper lines.

Sonic got started in 1994 as an ISP, then became a CLEC in 2006 and followed the path envisioned by the 1996 Act. This meant collocating electronics in AT&T central offices to provide DSL to customers over unbundled copper loops (UNEs). The company found a receptive customer base since they offered faster broadband than AT&T’s at an affordable price. They grew to be collocated in 200 AT&T central offices around the Bay Area, Sacramento and greater Los Angeles. These offices are tied together by the use of unbundled interoffice transport – also created by the 1996 Act. They originally deployed DSL that used one copper pair but have migrated to VDSL2 and other faster versions of DSL that use two copper pairs and deliver significant bandwidth. They still have almost 50,000 customers in the region using this technology.

What’s interesting is that Sonic did this starting in 2006 – a time by which much of the rest of the industry had written off the use of telco copper. The UNE business plan got a sour reputation with many in the industry when the CLEC industry using UNEs spectacularly imploded in 2001-2002. This collapse of the CLEC industry was due to a perfect storm of economic events and had little to do with the benefits of using telco copper.

If anything, it’s easier to use telco copper today because today’s DSL technology is far better than the DSL of 2000. Sonic and other CLECs are able to provide fast and reliable broadband using ADSL2+ and VDSL2, bonded over multiple copper pairs. Most people in the industry would probably be surprised to hear that Sonic can use bonded copper UNEs to provide speeds as fast as 400 Mbps to serve businesses. The usefulness of UNEs is far from dead.

Sonic also reaches roughly 25,000 customers using resale. This allows them to sell the same DSL products sold by AT&T in locations where they don’t have collocations. All of the Sonic products offer a bundle with a voice product that includes all of the expected features plus unlimited calling to the US and to landlines in 66 other countries. They are still finding strong demand for the voice product – something that also might surprise many in the industry.

Five years ago the company decided to use the cash flow from the UNE business to build fiber. Their fiber network now covers roughly 1/3 of the City of San Francisco, plus Brentwood, Sebastopol, Albany, Kensington and Berkeley in the East Bay. They are eying other markets around the region, the state, and beyond. They are an aggressive competitor and their fiber product line starts with a symmetrical gigabit for $40 per month, bundled with the unlimited voice product. They won’t publicly disclose the number of fiber customers, but their goal is to soon have more customers on fiber than on DSL. In my opinion, this is the essence of the vision of the 1996 Act – a transition from UNEs to facility-based networks.

The company’s biggest worry right now is that the FCC recently received a petition from the large telcos asking to end the use of unbundled network elements (UNEs). The big telcos argue that the UNE business plan is obsolete and that there is sufficient competition in the marketplace without unbundling their copper – while also claiming that “In the residential marketplace, competition will not be materially affected by forbearance from Section 251(c)(3) because there is effectively no remaining UNE-based competition in that marketplace” and that “To the extent CLECs serve residential customers using ILEC facilities, they do so on commercial platforms.”

But Sonic and a number of other CLECs using UNEs show this to be untrue. Given that Sonic alone serves nearly 50,000 California households with UNEs, these claims are incorrect and misleading. Sonic is using the unbundled copper in exactly the manner envisioned by Congress when it wrote the 1996 Act – to allow competitors to place the best technology possible on the telco copper networks. Congress at the time reasoned that telephone ratepayers had paid for the copper networks and that the public ought to derive any benefits possible from the networks they had paid for.

The big telcos have always hated the idea of unbundling their networks. They have slowly chipped away at some of the products envisioned by the 1996 Act such as access to telco dark fiber. They would love to kick CLECs like Sonic off their networks – and in Sonic’s case that would deprive 50,000 customers of fast DSL and telephone service at prices they can afford.

Almost every major market in the country, and many smaller ones, have CLECs that use unbundled network elements to provide DSL – usually the newer and faster DSL that the telcos won’t invest in. The telcos are slowly walking away from DSL, which can be seen in the huge numbers of customers switching to the cable companies.

But CLECs like Sonic have used the copper to bring products that people want – and, unlike the telcos, they are pouring those profits back into building fiber to these same communities. That’s exactly what Congress had in mind in 1996 and it would be a shame to see the FCC choke off some of the companies that are offering a competitive alternative to the big cable companies.

Buying a Home with No Broadband

A few weeks ago I attended a public meeting at one of my clients and met a guy there who recently purchased a house in the area that has no broadband. He was told by customer service at both the cable company and the local telco that broadband was available – but when he showed up they would not serve him.

It seems like everywhere I travel today I hear this or similar stories, and it makes me realize the gigantic value difference between homes with and without broadband. This particular guy works from home and is now scratching his head looking for a solution. He’s not unique; most families with school kids, and even most families without, look at broadband today as a necessity. Buying a house without broadband is starting to feel a lot like buying a house without electricity or running water – it’s not a home that most people would willingly buy.

Unfortunately, people like this guy who are not familiar with rural broadband are often told there is broadband when there isn’t. People who move from urban areas often have no clue about the atrocious state of broadband in rural America. They can’t imagine a world where there isn’t even DSL and where folks have to somehow get by on cellular or satellite data to have a connection to the outside world.

I’ve purchased several homes over the last few decades and I’ve always made proof of broadband a contingency in my purchase offer. I then contacted the ISPs and placed an order to be sure that the broadband was real. Sadly, like the guy in this story, one often gets the wrong answer from a call to customer service, which is why I’ve always gone a step further and placed an order. Even that is not always a great solution – when I moved to Florida I was in the house for over a month before Comcast finally connected my home, even though there was a Comcast pedestal at the end of my driveway!

I’ve spoken to a number of rural real estate agents over the last few years and they say almost universally that home broadband is now at or near the top of homebuyers’ wish lists. They are often surprised by homebuyers who don’t understand the lack of rural broadband. They all have stories about buyers who quickly abandon searches in the parts of a county that don’t have broadband.

There have been numerous studies showing that a home with broadband is worth more than one without. But I don’t buy the results of those studies anymore. We are now at an overall 84% national penetration for broadband and a huge majority of people don’t want a home without it. Those studies show an increase of a few thousand dollars in value for a home with broadband – but what is the value of broadband if you are unable to find a buyer for a home that doesn’t have it? That’s the story that real estate agents tell me today – the inability to sell rural homes without broadband.

One of the interesting things about rural broadband is that the people in rural areas know exactly where the broadband line stops. They know the home closest to them with cable service, they know where DSL becomes too slow to be relevant, and they know where cell phones lose their bars for broadband connectivity. Many rural customers are irate because many of them live just past the broadband dividing line. I hear it all of the time, “The home two houses away has cable TV”, “I’m within a quarter of a mile of good DSL”, “The people on the other side of that hill have a good WISP”, “I can walk to the fiber”.

I remember when I was house-hunting here in Asheville. I live a mile from center city and I can look out my window and see homes with no broadband. My wife had assembled a list of homes to check out and I recall saying a lot, “This area has no broadband, turn the car around.” It is often surprising how close you can be to a town and have no broadband. I think this area is typical of a rural county seat, where broadband extends only sporadically past the city limits. Folks who don’t know how to look at the wires on poles often don’t realize that broadband frequently ends at, or just past, the city boundary.

This issue is going to get more severe over the next decade and I predict that we’ll start seeing people walk away from rural homes due to lack of willing buyers. I keep expecting to see a lawsuit from a homebuyer who sues a realtor for not telling them the truth about lack of broadband. Such a suit will inevitably bring another piece of paper into home disclosures – a broadband disclosure – which most people care more about than termites and the dozen other things we check off before buying a home.

Working From Home

Governments are starting to catch on to the idea that one of the most dynamic parts of the new economy is people working from home. Governor Phil Scott of Vermont just signed legislation that provides an incentive for people who want to move to Vermont and work from their homes.

The program consists of grants of up to $5,000 per year, not to exceed $10,000 in total, to help cover the cost of relocating to the state. To qualify for a grant a worker must already be employed by an out-of-state company, work primarily from home and move to the state after January 1, 2019.

The overall program isn’t large, set at $125,000 for 2019, $250,000 for 2020 and back to $125,000 in 2022. If awards are made at the $5,000 level this would cover moving 100 new workers to the state.
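
The math behind that estimate is simple; the sketch below uses the program figures above.

    # Vermont remote-worker grant math, using the program figures above.
    budget_by_year = {2019: 125_000, 2020: 250_000, 2022: 125_000}
    grant_per_worker = 5_000

    total_budget = sum(budget_by_year.values())
    print(f"Total program budget: ${total_budget:,}")
    print(f"Workers covered at $5,000 each: {total_budget // grant_per_worker}")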

In economic development terms, landing 100 new full-time workers using a $500,000 tax subsidy is a bargain. Governments regularly provide tax incentives of this size to attract factories or other large employers. The impact on the economy from 100 new high-income families is gigantic, and over time the taxes and other local benefits from these new workers will greatly exceed the cost of the program.

Vermont is like many states in that it finds itself with an aging population while also seeing an outflow of young people seeking work in New York, Boston and other nearby cities. These grants create an opportunity for young families to move back to the state.

One key aspect of the work-at-home economy is good broadband. Many companies are now insisting that employees have an adequate broadband connection at home before agreeing to allow a worker to work remotely. I’ve talked to a few people who recently made the transition to home work and they had to certify the speed and latency of their broadband connection.

One reason that this program can work in Vermont is that there are areas of the state with fiber broadband. The City of Burlington built a citywide fiber network, and local telcos and other cities have built fiber in more rural parts of the state. But like most of America, Vermont still has many rural areas where broadband is poor or non-existent.

What surprises me is that many communities with fiber networks don’t take advantage of this same opportunity. It’s easy for a community with good broadband to not recognize that much of America today has lousy broadband. Communities with fiber networks should consider following Vermont’s example.

I know of one community that is doing something similar to the Vermont initiative. The City of Independence, Oregon has benefitted from a municipal fiber network since 2007, operating under the name of MINET and built jointly with the neighboring city of Monmouth. The city has a new economic development initiative that is touting their fiber network. Nearby Portland is now a hotbed for technology companies including a lot of agricultural technology research.

Independence has one major benefit over Portland and the other cities in the state – gigabit broadband. The new economic development initiative involves getting the word out directly to workers in the agricultural research sector and letting them know that those who can work at home can find a simpler and less expensive lifestyle by moving to a small town. They hope that lower housing prices and gigabit fiber will be an attractive package that lures work-at-home families. Independence is still close enough to Portland to allow for convenient visits to the main office while offering faster broadband than can be purchased in the bigger city.