The Crowded MVPD Market

The virtual MVPD (Multichannel Video Programming Distributor) market is already full of providers and is going to become even more crowded this year. Already there is a marketing war developing between DirecTV Now, PlayStation Vue, Sling TV, Hulu Live, YouTube TV, CBS All Access, fuboTV and Layer3 TV. There are also now a lot of ad-supported networks offering free movies and programming, such as Crackle and TubiTV. All of these services tout themselves as alternatives to traditional cable TV.

This year will see some new competitors in the market. ESPN is getting ready to launch its sports-oriented MVPD offering. The network has been steadily losing subscribers to cord cutting and cord shaving. While the company is gaining some customers from other MVPD platforms, it believes it has a strong enough brand name to go it alone.

The ESPN offering is likely to eventually be augmented by Disney's announced purchase of 21st Century Fox programming assets, including 22 regional sports networks. But that purchase won't close in time to influence the initial ESPN launch.

Another big player entering the game this year is Verizon, which is going to launch a service to compete with offerings like DirecTV Now and Sling TV. This product launch has been rumored since 2015, but the company now seems poised to finally launch. Speculation is that the company will use the platform much like AT&T uses DirecTV Now – as an alternative for customers who want to cut the cord and as a way to add new customers outside its traditional footprint.

There was also an announcement last quarter by T-Mobile CEO John Legere that the company will be launching an MVPD product in early 2018. While aimed at video customers, the product will also be marketed to cord cutters. The T-Mobile announcement has puzzled many industry analysts, who wonder if there is any room for a new provider in the now-crowded MVPD market. The MVPD market as a whole added almost a million customers in the third quarter of 2017. But the majority of those new customers went to a few of the largest providers, and the big question now is whether this market is already oversaturated.

On top of the proliferation of MVPD providers there are the other big players in the online industry to consider. Netflix has announced it is spending an astronomical $8 billion on new programming during the next year. While Amazon doesn't announce its specific plans, it is also spending a few billion dollars per year. Netflix alone now has more customers than the entire traditional US cable industry.

I would imagine that we haven't seen the end of new entrants. Now that the programmers have accepted the idea of streaming their content online, anybody with deep enough pockets to work through a launch can become an MVPD. There have already been a few early failures in the field, and we've seen Seeso and Fullscreen bow out of the market. The big question now is whether all of the players in the crowded field can survive the competition. Everything I've read suggests that margins are tight in this sector as the providers hold down prices to build market share.

I have already tried a number of the services, including Sling TV, fuboTV, DirecTV Now and PlayStation Vue. There honestly is not that much noticeable difference between the platforms. None of them has yet developed an easy-to-use channel guide, and they feel like cable felt a decade ago. But each keeps adding features that make it easier to use over time. While each has a slightly different channel line-up, there are many common networks carried on most of the platforms. I'm likely to try the other platforms during the coming year, and it will be interesting to see if one of them finds a way to distinguish itself from the pack.

This proliferation of online options spells increased pressure for traditional cable providers. With the normal January price increases now hitting, there will be millions of homes considering the shift to online.


Verizon Announces Residential 5G Roll-out

Verizon recently announced that it will be rolling out residential 5G wireless in as many as five cities in 2018, with Sacramento being the first market. Matt Ellis, Verizon's CFO, says that the company is planning to target 30 million homes with the new technology. The company launched fixed wireless trials in eleven cities this year, delivering broadband wirelessly to antennas mounted in windows. Ellis says that the trials using millimeter wave spectrum went better than expected. He says the technology can achieve gigabit speeds over distances as great as 2,000 feet, and that the company has had some success in delivering broadband without a true line-of-sight.

The most visible analyst covering this market is Craig Moffett of MoffettNathanson. He calls Verizon's announcement 'rather squishy' and notes that there has been no discussion of broadband speeds, products to be offered or pricing. Verizon has said that it would not deliver traditional video over these connections, but would instead rely on over-the-top video. There have been no additional product descriptions beyond that.

This announcement raises a lot of other questions. First is the technology used. As I look around at the various wireless vendors I don't see any equipment on the market that comes close to doing what Verizon claims. Most of the vendors are talking about having beta gear in perhaps 2019, and even then, vendors are not promising affordable delivery to single family homes. For Verizon to deliver what it's announced obviously means that it has developed equipment itself, or quietly partnered on a proprietary basis with one of the major vendors. But no other ISP is talking about this kind of deployment next year, and so the question is whether Verizon really has that big a lead over the rest of the industry.

The other big question is delivery distance. The quoted 2,000-foot distance is hard to buy with this spectrum, and that is likely a distance achieved in a test under perfect conditions. What everybody wants to understand is the realistic distance in deployments in normal residential neighborhoods with trees and many other impediments.
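To put some numbers behind that skepticism, here is a minimal sketch of the free-space path loss arithmetic. The 28 GHz and 39 GHz bands are my assumption of what Verizon might be using – the company hasn't said – and free space is the best case, before a single tree or wall gets in the way:

```python
# Free-space path loss at the 2,000-foot distance Verizon quotes, for
# familiar WiFi bands versus common millimeter wave candidates. This is
# best-case arithmetic with no foliage, rain or building loss counted.
from math import log10

def free_space_path_loss_db(distance_km: float, freq_ghz: float) -> float:
    # FSPL(dB) = 20*log10(d in km) + 20*log10(f in GHz) + 92.45
    return 20 * log10(distance_km) + 20 * log10(freq_ghz) + 92.45

FEET = 2000
d_km = FEET * 0.0003048   # 2,000 feet is about 0.61 km

for f_ghz in (2.4, 5.0, 28.0, 39.0):
    loss = free_space_path_loss_db(d_km, f_ghz)
    print(f"{f_ghz:>5.1f} GHz over {FEET} ft: {loss:.1f} dB")
```

At 2,000 feet the millimeter wave bands give up more than 20 dB compared to the 2.4 GHz WiFi band before counting any obstructions, which is why the claimed distance is believable only with high-gain antennas and a clean line-of-sight.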

Perhaps the most perplexing question is how much this is going to cost and how Verizon is going to pay for it. The company recently told investors that it does not see capital expenditures increasing in the next few years and may even see a slight decline. That does not jibe with what sounds like a major and costly customer expansion.

Verizon said it chose Sacramento because the city has shown a willingness to make light and utility poles available for the technology. But how many other cities are going to be this willing (assuming that Sacramento really will allow this)? It's going to require a lot of pole attachments to cover 30 million homes.

But even in Sacramento one has to wonder where Verizon is going to get the fiber needed to support this kind of network. It seems unlikely that the three incumbent providers – Comcast, Frontier and Consolidated Communications – are going to supply fiber to help Verizon compete with them. Since Sacramento is not in the Verizon service footprint, the company would have to go through the time-consuming process of building fiber on its own – a process that the whole industry says is causing major delays in fiber deployment. One only has to look at the issues encountered recently by Google Fiber to see how badly incumbent providers can muck up the pole attachment process.

One possibility comes to mind: perhaps Verizon is only going to deploy the technology in neighborhoods where it already has fiber-fed cellular towers. That would be a cherry-picking strategy similar to the way AT&T is deploying fiber-to-the-premises – AT&T seems to only build where it already has a fiber network nearby that can make a build affordable. While Verizon has a lot of cell sites, it's hard to envision a cherry-picking strategy gaining access to 30 million homes. Cherry-picking like this would also make for difficult marketing, since the network would be deployed in small non-contiguous pockets.

So perhaps what we will see in 2018 is a modest expansion of this year’s trials rather than a rapid expansion of Verizon’s wireless technology. But I’m only guessing, as is everybody else other than Verizon.

Title II Regulation and Investment

As the FCC continues its effort to reverse Title II regulation, I've seen the carriers renewing their argument that Title II regulation has reduced their willingness to invest in infrastructure. However, their numbers and other actions tell a different story.

The FCC put broadband under Title II regulation in February of 2015 and revised the net neutrality rules a few months later in April. So we’ve now had nearly three years to see the impact on the industry – and that impact is not what the carriers are saying it is.

First, we can look at annual infrastructure spending by the big ISPs. Comcast spent $7.6 billion upgrading its cable plant in 2016, its highest expenditure ever. Charter spent 15% more in 2016 than it and the cable companies it purchased had spent the year before. Even Verizon's spending was up 3% in 2016 over 2015, even though the company had spun off large fiber properties in Florida, Texas, California and other states. AT&T spent virtually the same amount of capital in 2015 and 2016 as it had in 2013 and 2014.

I’ve seen a number of articles that focus on the overall drop in investment from the cellular industry in 2015. But that drop is nearly 100% attributable to Sprint, which pulled back on new capital spending due to lack of cash. All of the big cellular companies are now crowing about how much they are going to spend in the next few years to roll-out 5G.

It’s important to remember that what the big ISPs tell their investors is often quite different than what they say when lobbying. As publicly traded companies the ISPs are required by law to provide accurate financial data including a requirement to warn stockholders about known risk factors that might impact stock prices. I’m one of those guys that actually reads financial statements and I’ve not seen a single warning about the impact of Title II regulation in the financial reporting or investor press releases of any of the big ISPs.

But the lobbying side of these businesses is a different story. The big ISPs started complaining about the risks of Title II regulations as far back as 2013 when it was first suggested. The big companies and their trade associations have written blogs warning about Title II regulation and predicted that it would stifle innovation and force them to invest less. And they’ve paid to have ‘scholarly’ articles written that come to the same conclusion. But these lobbying efforts are aimed mostly at the FCC and at legislators, not at stockholders.

The fact that big corporations can get away with having different public stories has always amazed me. One would think that something published on the AT&T or Comcast blog would be under the same rules as documents formally given to investors – but it’s obviously not. AT&T in particular tells multiple stories because the company wears so many different hats. In the last year the company has taken one position as an owner of poles that is diametrically opposed to the position it takes as a cellular company that wants to get onto somebody else’s poles. Working in policy for the big ISPs has to be a somewhat schizophrenic situation.

It seems almost certain that this FCC is going to reverse Title II regulation. The latest rumor floating around is that it will be on the agenda the day before Thanksgiving. That may lead you to ask why the ISPs are still bothering to crank out lobbying arguments against Title II if they have already won. I think they are still working hard to get a legislative solution through Congress to kill Title II regulation and net neutrality, even if the FCC kills it for now. They well understand that a future FCC under a different administration could easily reinstate Title II regulation – particularly now that it has passed muster through several court challenges. The ISPs understand that it would be a lot harder to get a future Congress to reverse course than it would be for a Democratic FCC to do so.

Until recently I always wondered why the ISPs are fighting so hard against Title II regulation. All of the big companies like Comcast, AT&T and Verizon have told stockholders that their initial concerns about Title II regulation did not materialize. And it’s obvious that Title II hasn’t changed the way they invest in their own companies.

But recently I saw an article, and wrote a blog about it, by an analyst who thinks that the ISPs are going to drastically increase broadband prices once Title II regulation is gone. Title II is the only tool the government can use to investigate and possibly act against the ISPs for rate increases and other practices like data caps. If he's right – and his arguments are good ones – then there is a huge motivation for the big ISPs to shed the only existing regulation of broadband.

Cellular WiFi Handoffs

If you use any carrier except Verizon you may have noticed that your cellphone has become adept at handing your cellular connection off to a local WiFi network. Like most people I keep my smartphone connected to WiFi when I'm at home to avoid exhausting my cellular data cap. I have AT&T cellular service, and I've noticed over the last year that when I'm out of the house my phone often logs onto other WiFi networks. I can understand AT&T sending me to its own AT&T hotspots, but often I'm logged onto networks I can't identify.

When I lived in Florida I was a Comcast customer, so when I was out of the house my phone logged onto Comcast hotspots. Even today my phone still does this, even though I'm no longer a Comcast customer – I assume there is a cookie on the phone that identifies me as one. I understand these logins, because after the first time I logged onto a Comcast hotspot my phone assumed that any other Comcast hotspot was an acceptable network. This is something I voluntarily signed up for.

But today I find my phone automatically logged onto a number of hotspots in airports and hotels that I definitely have not authorized. I contrast this with using my laptop in an airport or hotel. With the laptop I always have to go through some sort of greeting screen, and even if it's a free connection I usually have to agree to some terms of service. But my phone just automatically grabs WiFi in many airports, even those I haven't visited in years. I have to assume that AT&T has some sort of arrangement with these WiFi networks.

I usually notice that I'm on WiFi when my phone gets so sluggish it barely works. WiFi is still notoriously slow in crowded public places. Once I realize I'm on a WiFi network I didn't authorize, I turn WiFi off on my phone and revert to cellular data. Every security article I've ever read says to be cautious when using public WiFi, so I'd prefer not to use these connections unless I have no other option.

There was a major effort made a few years back to create a seamless WiFi network for just this purpose. The WiFi Alliance created a protocol called Hotspot 2.0 that is being marketed under the name Passpoint. The purpose of this effort was to allow cellular users to automatically connect and roam between a wide variety of hotspots without ever having to log in. The ultimate goal was to enable WiFi calling that could hand off between hotspots in the same way that cellphones hand off between cell sites.

It’s obvious that AT&T and other cellular carriers have implemented at least some aspects of Hotspot 2.0. In the original vision of Hotspot 2.0 customers were to be given the option of authorizing their participation in the Passpoint network. But AT&T has never asked my permission to log me onto WiFi hotspots (unless it was buried in my terms of service). AT&T has clearly decided that they want to use these WiFi handoffs in a busy environment like an airport to protect their cellular networks from being swamped.

It’s interesting that Verizon is not doing this. I think one reason for this is that they don’t want to give up control of their customers. Verizon foresees a huge future revenue stream from mining customer data and I’m guessing they don’t want their customer to be shuttled to a WiFi network controlled by somebody else, where they can’t track customer behavior. Verizon is instead pushing forward with the implementation of LTE-U where they can direct some data traffic into the WiFi bands, but all under their own control. While LTE-U uses WiFi frequency, it is not a hotspot technology and is as hard to intercept or hack as any other cellular traffic.

Most new cellphones now come with the Passpoint technology baked into the chipset. I think we can expect that more and more of our cellular data connections will be shuttled to hotspots without notifying us. Most people are not going to be bothered by this because it will reduce usage on their cellular data plans. I’m just not nuts about being handed off to networks without some sort of notification so that I can change my settings if I don’t want to use the selected network. I guess this is just another example of how cellular companies do what they want and don’t generally ask for customer permission.

Decommissioning Rural Copper, Part 2

In the last blog I wrote about my belief that AT&T and Verizon want out of the rural wireline business. They both have plans to largely walk away from their rural copper networks and replace landline copper services with cellular service. Today I want to talk about what regulators ought to do with those networks.

When these two giant telcos walk away from rural copper they will inevitably harm rural America. While many homes will get the 'privilege' of buying high-priced cellular-based broadband, other homes are going to find themselves without telephone service if they happen to live in one of the many cellular dead zones. Such homes will not only be unable to benefit from cellular broadband, but if they have poor cell service they will find themselves cut off from voice communications as well.

As somebody who has traveled extensively in rural America I can tell you that there are a lot more cellular dead zones than people realize. And it's not only farms – there are county seats in rural America where it's difficult to get a working cellphone signal inside of buildings.

As part of this transition both companies are going to walk away from a huge amount of existing copper cable. I think this copper cable is an incredibly valuable asset and that regulators ought not to allow them to tear it down.

The copper wire network today goes almost everywhere in rural America. Congressional laws and FCC policies led to most homes in the country getting access to the copper network. These copper wires occupy a valuable space on existing telephone poles – on the majority of rural poles the only two wires are the power lines at the top and the telephone wires at the bottom.

If these copper wires are kept in place they could greatly reduce the cost of building rural fiber. It is far cheaper when building fiber to ‘lash’ the fiber onto an existing set of cables than to hang fiber from scratch. It was this construction technique that allowed Verizon to build a lot of its FiOS fiber network – they lashed fiber onto existing telephone wires. And my guess is that when Verizon decommissions urban copper they are still going to leave a lot of the copper wires in place as a guidewire for their fiber.

If these telcos are going to walk away from these copper wires, then they ought to be required to keep them in place for use by somebody else to hang fiber. Many states might force the big telcos to tear down the copper wires since they will eventually create safety hazards as they break away from poles if they aren’t maintained. But if somebody else is willing to take over that maintenance then it shouldn’t be an issue.

I can picture a regulatory process whereby some other carrier is allowed to come in and 'claim' the abandoned wires once they are empty of customers. That would allow fiber overbuilders or rural communities to claim this copper as an asset.

There is some salvage value to copper wires, and it's possible, though not probable, that the value of the copper could exceed the cost to tear it down. So I can see the telcos fighting such an idea as a confiscation of their assets. But these rural wires have been fully depreciated for decades and the telcos have earned back the cost of these copper lines many times over. I believe that by abandoning the wires and depriving some homes of wireline service the big telcos will have forfeited any rights they might have to the remaining assets.

Anybody claiming the abandoned copper could use it in two ways. First, in many cases there is still life left in the copper, as witnessed by Frontier and CenturyLink rehabbing old rural copper with upgraded DSL. Local communities or small carriers could use the copper to bring the better services that the big telcos have refused to provide over the last few decades.

But more importantly these wires represent the cheapest path forward for building rural fiber. Anybody taking over the old copper can save a lot of fiber construction costs by lashing fiber onto the existing copper. If our nationwide goal is really to get better broadband to rural America, then offering abandoned copper to fiber builders might be one of the easiest tools available to help the process along.

The big telcos abandoned rural America decades ago. They stopped doing routine maintenance on rural copper and slashed the number of rural technicians. They now want to walk away from that copper and instead force rural America to buy cellular services at inflated prices. We owe it to the folks who paid for this copper many times over to get some benefit from it and to offer an alternative to the new rural cellular monopolies.

Decommissioning Rural Copper

I’ve been watching AT&T and Verizon since I’ve been in the industry (including a short stint at Southwestern Bell in the early 80s). We are about to see both of these companies unravel their rural telco properties.

Verizon got ahead of the curve and has been selling off rural properties for a few decades, many of which ended up with Frontier. Verizon still serves some rural areas but has probably shed half of its rural customers. There are still big swaths of rural Verizon customers in Pennsylvania, New York, Maryland and other northeastern states. Verizon benefitted from these sell-offs by selling completely depreciated and poorly maintained networks at high prices – as evidenced by how much Frontier is struggling to cover its massive debts. AT&T has sold almost no rural properties and still serves gigantic rural areas in dozens of states.

Both companies are clearly on a path to tear down the remaining rural copper networks and replace them with cellular wireless networks. There are both pros and cons for these transitions for rural customers.

On the plus side, many of these rural areas have never had broadband since the big telcos never extended their DSL to their rural service areas. We know that they could have extended DSL, because we have hundreds of examples of independent telephone companies that brought DSL to all of their customers, no matter how remote. But the big companies stopped spending money on rural properties decades ago. The remaining copper is now in terrible shape, and one has to imagine that cellular voice is often as good as or better than voice over these old copper lines.

There will now be many customers who can buy fixed cellular broadband. This uses the same frequencies as the broadband for smartphones, but the cellular companies are pricing it to be a little less expensive. For many households the fixed cellular broadband will be the first real broadband alternative they have ever had.

But there are also big downsides to this shift from old copper to cellular networks. First, cellular networks are effective for only a few miles from any given cell site. Anybody who has driven in rural America knows that there are cellular dead spaces everywhere. Any customers living in the cellular dead spaces are going to be left with no communications to the outside world. They’ll lose their copper and they won’t have cellular voice or data. This will be a huge step backwards for many homes.

The big telcos will be taking advantage of the fact that, as cellular providers, they have no obligation to try to serve everybody. One of the reasons we had nearly ubiquitous telephone coverage in this country is that the telcos were carriers of last resort in their service areas. They were required by law to extend telephone service to all but extremely remote customers. But that obligation doesn't apply to a cellular carrier. We already have tons of evidence that the cellular carriers make no apologies to homes that happen to be out of range of their cellular towers. With no copper landlines left we will now have rural communications dead zones. It will be hard for anybody living in these dead zones to stay there, and certainly nobody is going to build a new home in a place that doesn't have cellular service.

There is a downside even for those households that get fixed-cellular broadband. The speeds on this service are going to be slow by today’s standards, in the range of 10 – 15 Mbps for those that live relatively close to a cellular tower, but considerably slower for customers at greater distances. The real downside to getting cellular data is that the speeds are not likely to get better in rural America for many years, even decades. The whole industry is abuzz with talk about 5G cellular making a big difference, but it’s hard to see that technology making much impact in rural areas.

I think this transition away from copper is going to catch a lot of rural people by surprise. These two big telcos have already started the process of decommissioning copper, and once that gets full FCC approval the pace is likely to accelerate. I think a lot of homes are going to be surprised when they find out that the telcos no longer have an obligation to serve them.

What’s the Next FTTP Technology?

There is a lot of debate within the industry about the direction of the next generation of last mile fiber technology. There are three possible technologies that might be adopted as the preferred next generation of electronics – NG-PON2, XGS-PON or active Ethernet. All of these technologies are capable of delivering 10 Gbps streams to customers.

Everybody agrees that the current widely deployed GPON is starting to get a little frayed around the edges. That technology delivers 2.4 Gbps downstream and 1.2 Gbps upstream for up to 32 customers, although most networks I work with are configured to serve at most 16 customers per PON. All the engineers I talk to think this is still adequate technology for residential customers, and I've never heard of a neighborhood PON being maxed out for bandwidth. But many ISPs already use something different for larger business customers that demand more bandwidth than a PON can deliver.
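The arithmetic behind that comfort is easy to check. Here is a quick sketch of the worst-case shared bandwidth per home at various split ratios – the ratios are illustrative, and since homes almost never all peak at once the real experience is far better than these numbers:

```python
# Worst-case downstream share per home on a GPON at different split
# ratios, assuming every home pulls data simultaneously (they don't).
GPON_DOWNSTREAM_MBPS = 2400

for split in (64, 32, 16, 8):
    share = GPON_DOWNSTREAM_MBPS / split
    print(f"1:{split} split -> {share:.0f} Mbps per home at full simultaneous load")
```

Even the worst-case 75 Mbps per home on a 1:32 split is more than a typical household actually draws at peak today, which is why nobody is seeing neighborhood PONs max out.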

The GPON technology is over a decade old, which generally is a signal to the industry to look for the next generation replacement. This pressure usually starts with vendors who want to make money pushing the latest and greatest new technology – and this time it's no different. But after taking all of the vendor hype out of the equation, it's always been the case that a new technology is only accepted once it achieves an industry-wide economy of scale. And that almost always means being accepted by at least one large ISP. There are a few exceptions to this, like the first generation of telephone smart switches that found success with small telcos and CLECs first – but most technologies go nowhere until a vendor is able to mass manufacture units to get the costs down.

The most talked about technology is NG-PON2 (next generation passive optical network). This technology works by using tunable lasers that can function at several different light frequencies, which allows more than one PON to be transmitted simultaneously over the same fiber at different wavelengths. But that makes this a complex technology, and the key issue is whether it can ever be manufactured at price points that can match the alternatives.

The only major proponent of NG-PON2 today is Verizon, which recently did a field trial to test the interoperability of several different vendors including Adtran, Calix, Broadcom, Cortina Access and Ericsson. Verizon seems to be touting the technology, but there is some doubt whether it alone can drag the rest of the industry along. Verizon seems enamored with the idea of using the technology to provide bandwidth for the small cell sites needed for a 5G network. But the company is not building much new residential fiber. It announced it would be building a broadband network in Boston, its first new construction in years, but there is speculation that a lot of that deployment will use wireless 60 GHz radios instead of fiber for the last mile.

The big question is whether Verizon can create an economy of scale to get prices down for NG-PON2. The whole industry agrees that NG-PON2 is the best technical solution because it can deliver 40 Gbps to a PON while also allowing great flexibility in assigning different customers to different wavelengths. But the best technological solution is not always the winning solution, and the concern for most of the industry is cost. Today the early NG-PON2 electronics are priced at 3 – 4 times the cost of GPON, due in part to the complexity of the technology, but also due to the lack of economy of scale without any major purchaser of the technology.

Some of the other big fiber ISPs like AT&T and Vodafone have been evaluating XGS-PON. This technology can deliver 10 Gbps downstream and 2.5 Gbps upstream – a big step up in bandwidth over GPON. The major advantage of the technology is that it uses fixed lasers, which are far less complex and costly. And unlike Verizon, these two companies are building a lot of new FTTH network.

And while all of this technology is being discussed, ISPs today are already delivering 10 Gbps data pipes to customers using active Ethernet (AON) technology. For example, US Internet in Minneapolis has been offering 10 Gbps residential service for several years. Active Ethernet uses lower cost electronics than most PON technologies, but can still have higher costs than GPON because there is a dedicated pair of lasers – one at the core and one at the customer site – for each customer. A PON network instead uses one core laser to serve multiple customers.

It may be a number of years until this is resolved because most ISPs building FTTH networks are still happily buying and installing GPON. One ISP client told me that they are not worried about GPON becoming obsolete because they could double the capacity of their network at any time by simply cutting the number of customers on a neighborhood PON in half. That would mean installing more cards in the core without having to upgrade customer electronics.

From what everybody tells me GPON networks are not experiencing any serious problems. But it's obvious, as household demand for broadband keeps doubling every three years, that the day will come when these networks will experience blockages. Yet creative solutions like splitting the PON could keep GPON working great for a decade or two. And that might make GPON the preferred technology for a long time, regardless of the vendors' strong desire to get everybody to pay to upgrade existing networks.
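As a back-of-the-envelope check on that 'decade or two' claim, assume demand really does double every three years and that each halving of the split doubles the capacity available per home:

```python
# How long PON splitting can track demand that doubles every three years.
# The growth rate comes from the paragraph above; the split ladder is a
# simplifying assumption (each halving doubles per-home capacity).
YEARS_PER_DOUBLING = 3
split_ladder = [32, 16, 8, 4, 2, 1]   # ends at a dedicated fiber per home

halvings = len(split_ladder) - 1
print(f"{halvings} halvings buy roughly {halvings * YEARS_PER_DOUBLING} years of headroom")
```

Walking a 1:32 PON all the way down to a dedicated fiber per home buys about fifteen years at that growth rate, which is squarely in the 'decade or two' range.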

More Pressure on WiFi

As if we really needed more pressure on our public WiFi spectrum, both Verizon and AT&T are now launching Licensed Assisted Access (LAA) broadband for smartphones. This is the technology that allows cellular carriers to mix licensed LTE spectrum with unlicensed 5 GHz spectrum when providing cellular broadband. The LAA technology allows for the creation of 'fatter' data pipes by combining multiple frequencies, and the wider the data pipe, the more data that makes it to the end-user customer.

When carriers combine frequencies using LAA they can theoretically create a data pipe as large as a gigabit while using only 20 MHz of licensed frequency. The extra bandwidth comes mostly from the unlicensed 5 GHz band and is similar to the fastest speeds we can experience at home using this same frequency with 802.11ac. However, such high-speed bandwidth is only useful for a short distance of perhaps 150 feet, and the most practical use of LAA is to boost cellphone data signals for customers closest to a cell site. That's going to make LAA technology most beneficial in dense customer environments like busy downtown areas, stadiums, etc. LAA isn't going to provide much benefit at rural cellphone towers or those along interstate highways.

Verizon recently did a demonstration of the LAA technology that achieved a data speed of 953 Mbps. It did this using three 5 GHz channels combined with one 20 MHz channel of AWS spectrum. Verizon used a 4X4 MIMO (multiple input / multiple output) antenna array and 256 QAM modulation to achieve this speed. The industry has coined the term four-carrier aggregation for the technique, since it combines four separate bands of spectrum into one data pipe. A customer would need a specialized MIMO antenna to receive the signal and would also need to be close to the transmitter to get this kind of speed.
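For context, here is a rough idealized ceiling for a four-carrier configuration like the one in the demo. The 25% overhead figure is an assumption for signaling and coding, and real handsets rarely sustain 4x4 MIMO and 256 QAM on every carrier at once, which is why the demo landed at 953 Mbps rather than at the theoretical maximum:

```python
# Idealized peak rate for four aggregated 20 MHz LTE carriers with
# 256 QAM and 4x4 MIMO. Overhead is an assumed round number; actual
# LTE overhead varies with configuration.
SUBCARRIERS = 100 * 12        # 100 resource blocks x 12 subcarriers per 20 MHz
SYMBOLS_PER_SECOND = 14_000   # 14 OFDM symbols per 1 ms subframe
BITS_PER_SYMBOL = 8           # 256 QAM carries 8 bits per symbol
MIMO_LAYERS = 4               # 4x4 MIMO spatial multiplexing
OVERHEAD = 0.25               # assumed control + coding overhead
CARRIERS = 4                  # one licensed AWS channel + three 5 GHz channels

per_carrier_bps = (SUBCARRIERS * SYMBOLS_PER_SECOND * BITS_PER_SYMBOL
                   * MIMO_LAYERS * (1 - OVERHEAD))
print(f"Idealized ceiling: {CARRIERS * per_carrier_bps / 1e6:.0f} Mbps")
```

The gap between that ceiling and the 953 Mbps demo result is a useful reminder of how much the headline numbers depend on ideal lab conditions.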

Verizon is starting to update selected cell sites with the technology this month. AT&T has announced that it will start introducing LAA technology along with four-way carrier aggregation by the end of this year. It's important to note that there is a big difference between the 953 Mbps Verizon achieved in its test and what customers will achieve in the real world. There are numerous factors that will limit the benefits of the technology. First, there aren't yet any handsets with the right antenna arrays, and it's going to take a while to introduce them. These antennas look like they will be big power eaters, meaning that handsets trying to use this bandwidth all of the time will have short battery lives. There are also more practical limitations. One is the distance limitation, and many customers will be out of range of the strongest LAA signals. A cellular company is also not going to try to make this full data connection using all four channels to one customer, for several reasons, the primary one being the availability of the 5 GHz frequency.

And that’s where the real rub comes in with this technology. The FCC approved the use of this new technology last year. They essentially gave the carriers access to the WiFi spectrum for free. The whole point of licensed spectrum is to provide data pipes for all of the many uses not made by licensed wireless carriers. WiFi is clearly the most successful achievement of the FCC over the last few decades and providing big data pipes for public use has spawned gigantic industries and it’s hard to find a house these days without a WiFi router.

The cellular carriers have paid billions of dollars for spectrum that only they can use. The rest of the public uses a few bands of 'free' spectrum, and uses them very effectively. To allow the cellular carriers to dip into the WiFi spectrum runs the risk of killing that spectrum for all of the other uses. The FCC supposedly is requiring that the cellular carriers not grab the 5 GHz spectrum when it's already in use. But to anybody who understands how WiFi works that seems like an inadequate protection, because any use of this spectrum causes interference by definition.

In practical use, if a user can see three or more WiFi networks they experience interference, meaning that more than one network is trying to use the same channel at the same time. It is the nature of this interference that causes the most problems with WiFi performance. When two signals are both trying to use the same channel, the WiFi standard causes all competing devices to go quiet for a short period of time, after which both restart and try to grab an open channel. If the two signals continue to interfere with each other, the delay time between restarts increases exponentially in a phenomenon called backoff. As there are more and more collisions between competing networks, the backoff increases and the performance of all devices trying to use the spectrum decays. In a network experiencing backoff, data is transmitted in short bursts between the times that the connection starts and stops due to the interference.
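To make that decay concrete, here is a toy simulation of binary exponential backoff under a simplified slotted model – real 802.11 adds carrier sensing, acknowledgements and more, so treat this only as an illustration of the doubling:

```python
import random

# Average random wait before a retry, where the contention window
# doubles after each successive collision (binary exponential backoff).
def backoff_slots(collisions: int, cw_min: int = 16, cw_max: int = 1024) -> int:
    window = min(cw_min * (2 ** collisions), cw_max)
    return random.randrange(window)

for collisions in range(6):
    waits = [backoff_slots(collisions) for _ in range(10_000)]
    avg = sum(waits) / len(waits)
    print(f"after {collisions} collisions: average wait {avg:.1f} slots")
```

The average wait roughly doubles with every collision, so two networks that keep colliding spend a growing share of their airtime silent – exactly the bursty stop-and-start behavior described above.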

And this means that when the cellular companies use the 5 GHz spectrum they will be interfering with the other users of that frequency. That's how WiFi is designed to share a channel, so the interference is unavoidable. Other WiFi users in the immediate area around an LAA transmitter will experience more interference, and it also means a degraded WiFi signal for the cellular users of the technology – the reason they won't get speeds even remotely close to Verizon's demo speeds. But the spectrum is free for the cellular companies and they are going to use it, to the detriment of all of the other users of the 5 GHz spectrum. With this decision the FCC might well have nullified the tremendous benefits that we've seen from the 5 GHz WiFi band.

Merger Madness

The last year was a busy one for mergers in the industry. We saw Charter gobble up Time Warner Cable and Bright House Networks. We saw CenturyLink buy Level 3 Communications. But those mergers were nothing like what we see on the horizon right now. I can barely read industry news these days without seeing some rumored gigantic acquisition.

There have always been mergers in the industry, but I can't remember a time when there was this level of merger talk happening. This might be due in part to an administration that says it won't oppose megamergers. It's also being driven by Wall Street, which makes a lot of money financing big mergers. Here are just a few of the mergers being discussed seriously in the financial press:

Crown Castle and Lightower. This merger is already underway with Crown Castle paying $7.1 billion for Lightower. It matches up two huge fiber networks along with tower assets to make the new company the major player in the small cell deployment space, particularly in the northeast.

Discovery and Scripps. Discovery Communications announced a deal to buy Scripps Networks for about $11.9 billion. This reduces the already-small number of major programmers and Discovery will be picking up networks like the Food Network, HGTV, Travel Channel, the Cooking Channel and Great American Country.

Comcast, Altice and Charter. Citigroup issued a report that speculates that Comcast and Altice would together buy Charter and split the assets. Comcast would gain the former Time Warner Cable systems with the rest going to Altice. There is also talk of Altice trying to finance the purchase of Charter on its own. But with Charter valued at about $120 billion while also carrying around $63 billion in debt, that seems like a huge number to finance. This would be an amazing merger with the ink not yet dry on Charter's merger with Time Warner Cable.

Amazon and Dish Network. This makes sense because Amazon could finally help Dish capitalize on its 700 MHz E-block and AWS-4 spectrum licenses. This network could be leveraged by Amazon to track trucks and packages, monitor the IoT and control drones.

T-Mobile and Sprint. Deutsche Telekom currently owns 63% of T-Mobile and Softbank owns 82% of Sprint. A straight cashless merger would create an instantly larger company and gain major operational advantages. The FCC and the Justice Department nixed a merger between T-Mobile and AT&T a few years back, but in an environment where the cable companies are getting into the wireless business, this might sail through a lot more easily today. Sprint has also been having negotiations for either a merger or some sort of partnership with Comcast and Charter.

Comcast and Verizon. There is also Wall Street speculation about Comcast buying Verizon. The big advantage would be merging the Comcast networks with the Verizon Wireless assets. Comcast has a history of buying companies in distress, and Verizon's stock price has dipped 17% this year. But this would still be a gigantic merger, worth as much as $215 billion. There are also major regulatory hurdles to overcome given the big overlap in the northeast between Comcast and the Verizon FiOS networks.

FirstNet – A Boon or Boondoggle?

The federal program FirstNet was born out of the tragedy of the 9/11 terrorist attacks. At the time there was a lot of negative press when it was realized that first responders from New Jersey were unable to communicate with those from New York. And the idea was born to create a nationwide platform so that all first responders could easily communicate with each other.

The FirstNet effort first tackled the issue of interoperability. There were a number of jurisdictions where interoperability was a problem at the time. But since 9/11 most metropolitan areas have solved the interoperability issue on their own. Fire and police departments in a region got together in the years after 9/11 and made sure they could communicate with each other. One of the easiest fixes was for first responders to add cellphones to complement the radios that were their major method of communications in 2001. So the concept morphed into a discussion of finding cellular bandwidth for first responders. We've seen repeatedly that local cellular networks instantly get clogged during any kind of major emergency, which means first responders have trouble making cellphone connections just like everybody else.

Congress stepped into the discussion in 2012 and created FirstNet (First Responder Network Authority). As part of that action Congress set aside Band 14 of the 700 MHz spectrum for the exclusive use of first responders nationwide. After several reboots of the RFP process the new agency finally chose AT&T to provide a nationwide LTE network for first responders. The company was given $7 billion as the first payment towards creating the nationwide cellular network. The GAO had estimated that the final network could cost as much as $47 billion.

States were given the right to opt in to FirstNet at zero cost to the states. In the last month or so thirteen states have chosen to be part of the effort. That means that AT&T will provide the network in those states using federal dollars.

But there is a huge question, mostly technical, of whether this network makes any sense. A lot of engineers say that FirstNet is overkill and that there are now other ways to solve the same problem. A hint of how easily this can be done came from a press release from Kansas, which just bought into FirstNet. In that release AT&T said that until FirstNet is built in the state, first responders will immediately get priority access to cell towers, and by the end of this year they will have preemptive access – meaning that a call attempt made by a first responder would shove somebody else off the cellular network. Providing preemptive access is a far less costly way of solving the problem. If first responders can be given preemptive access that easily, then there really is no longer a need for FirstNet.

To add fuel to the fire, Verizon has just announced that it will offer these same services to first responders everywhere – and with zero federal dollars. Verizon will immediately offer preemptive access to cell towers to all first responders.

Any topic having to do with first responders is always an emotional one and much of the first responder community has bought into the concept of having interference-free spectrum. But the Verizon announcement shows that the FirstNet solution is obsolete before the first piece of network is constructed.

And the FirstNet implementation comes with a big national cost. It’s clear that we need a huge amount of bandwidth to satisfy customer demands for cellular data. It seems wasteful to use a slice of prime spectrum in Band 14 of 700 MHz when it’s not needed. That spectrum is worth more to the country for providing cellular data than for handling calls from first responders. This would not be true if first responders really needed this spectrum to communicate – but the cellular companies can give them preemptive access using existing cellular spectrum. For the vast majority of time the FirstNet spectrum will sit virtually unused – at any given time in a city it might be handling hundreds of transmissions from first responders when it could instead be handling hundreds of thousands of transmissions for everybody.

There is also the rural issue to deal with. FirstNet is supposed to provide nationwide first responder access. But as somebody who travels widely in rural America, I can tell you that a lot of the AT&T LTE coverage map is bosh. There is a whole lot of rural America where cell coverage is either spotty or non-existent. When you get to a rural place you quickly come to understand the short distance that a cell signal travels from any given cellular tower. There are gaps everywhere in rural America between widely-spaced cell towers.

First responders in rural America are not going to rely on the FirstNet spectrum even if it's freely available to them. They are more likely to keep their current radio networks, which work today using spectrum that travels farther than the 700 MHz spectrum. I can't help but picture a rural tragedy, such as a downed plane, where first responders from outside the area will have no way to communicate if the FirstNet signal in the area is weak or nonexistent.

I see this as another giant government handout to the huge carriers. You can be assured that a lot of the money going to AT&T will go to its bottom line. I hope that some of the money it is getting for FirstNet will at least improve normal cellular coverage in rural America – but I'm not holding my breath. To me this seems like another big federal program being spent to fix a problem that no longer exists. Local jurisdictions solved the interoperability problem in the first few years after 9/11. And the ability of cellular companies to give preemptive access to first responders means there is no reason to set aside a huge, valuable slice of spectrum.
