Wireless is Not a Substitute for Wireline

Any time there is talk about government funding for broadband, arguments arise that wireless broadband is just as good as wireline broadband. But it is not the same and it is not a substitute. I love wireless broadband, and it is a great complement to a home or business broadband connection, but there are numerous reasons why wireless broadband ought not to be funded by government broadband programs.

The most recent argument for wireless broadband comes from the Minnesota House, which is currently in session. In last year’s legislative session, Minnesota approved a $20 million grant program to help expand broadband in rural areas of the state. That grant was distributed to a number of broadband projects, all wireline, each of which required significant matching funds from the entity building the wireline facilities. The 2014 funding, which mostly went to independent telephone companies, is being used to bring broadband to thousands of rural residents as well as 150 rural businesses and 83 rural schools and libraries.

But the chairman of the House Job Growth and Energy Affordability Committee in Minnesota killed an additional state grant; it’s been left out of this year’s House budget. Rep. Pat Garofalo, R-Farmington, said that wired broadband is too costly in sparsely populated areas and believes that wireless and satellite technologies are more financially effective.

In another case, Verizon recently got the New Jersey State Board of Public Utilities to agree that it could use LTE data plans as substitutes for homes that are losing their copper or DSL services.

Another place where this same argument is being made concerns the upcoming funding from the Connect America Fund, part of the federal Universal Service Fund, which is being directed towards expanding rural broadband. As written several years ago, the Fund is allowed to consider investing in wireless as well as wireline broadband networks.

There have been numerous parties lobbying to get these billions directed towards landline networks rather than wireless networks. The NTCA, now known as the Rural Broadband Association, sponsored a report from Vantage Point Solutions that compares wireless and wireline technologies and argues that government funding should only be used to fund wireline networks. The whitepaper makes many of the same arguments I have been making for years about the topic, and includes a few I had not considered. Here are some of the major arguments made by the whitepaper:

  • Even without considering the cost of spectrum, it costs far more to build a wireless network when comparing construction cost per megabit that can be delivered to end users. Modern fiber networks rarely cost more than $10 per Mbps of capacity created, and often far less, while it costs several hundred dollars per effective megabit to construct a wireless network using any of the common technologies such as LTE.
  • From a physics perspective, the amount of frequency available through US allocated spectrum is not large enough to deliver large symmetrical bandwidth, which is the goal of the National Broadband Plan. This limitation is a matter of physics and not of technology. That limitation is still going to be there with 5G or later wireless technology unless the FCC massively reworks the way it allows frequency to be used.
  • At least in today’s world, the prices charged to customers are drastically different for wireless and wireline data. Already today, 25% of residences are downloading more than 100 gigabytes per month in total data. That can be affordable on wireline, but almost every current wireless provider has monthly data caps that range upward from just a few gigabytes per month. A customer on a capped data plan who uses 100 gigabytes in a month would face an astronomical monthly bill.
  • The report also made the economic argument that the shelf-life for wireless equipment and networks is relatively short, in the range of seven years, while fiber networks can have an incredibly long economic life. The report argues that the Connect America Fund should not be investing in technology that will be obsolete and potentially unusable just a few years after it’s built. There certainly is no guarantee that the large wireless carriers will make needed future investments once they stop getting a federal subsidy.
  • The report also made all of the normal comparisons between the two technologies in terms of operating characteristics such as available bandwidth, latency times, and high reliability, all of which tilt in favor of landline.
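The cost-per-megabit gap in the first bullet is easy to make concrete with a back-of-the-envelope calculation. The build costs and capacity figures below are illustrative assumptions of my own, not numbers from the Vantage Point report:

```python
# Rough cost-per-Mbps comparison between a fiber build and an LTE build.
# All inputs are illustrative assumptions, not figures from the report.

def cost_per_mbps(build_cost_dollars: float, deliverable_mbps: float) -> float:
    """Construction cost divided by the megabits of capacity actually
    deliverable to end users."""
    return build_cost_dollars / deliverable_mbps

# Hypothetical fiber build: $3,000 per home passed, 1 Gbps deliverable.
fiber = cost_per_mbps(3_000, 1_000)

# Hypothetical LTE site: $150,000 to construct, roughly 300 Mbps of
# usable capacity shared across everyone the site serves.
lte = cost_per_mbps(150_000, 300)

print(f"Fiber: ${fiber:,.2f} per Mbps")
print(f"LTE:   ${lte:,.2f} per Mbps")
```

With those assumed inputs the fiber build lands at a few dollars per megabit and the wireless build at several hundred, which is the shape of the gap the whitepaper describes.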

I agree with this report wholeheartedly. I know that when I first read the language in the Connect America Fund my initial reaction was that the money would all go to cellular companies who would use the money to build rural cell towers. But fiber technology has gotten far more efficient in just the few years since that order. Also, the wireless businesses of Verizon and AT&T are the two most profitable entities in telecom, by far, and it makes no sense to flow billions of federal dollars to them to build what they will probably build anyway with their own money.

Certainly, expanding rural LTE would get some broadband to more people, but in the long run we would be better off directing that same money to bring a permanent solution to some rural areas rather than a poor solution for all of it.

Verizon’s Strategy

The news coming out of Verizon lately is really interesting and set me to musing about their long-term strategy.

First, they are selling off $10 billion in landlines in Texas, California, and Florida to Frontier. These properties include 3.7 million voice lines, 2.2 million high-speed data customers, and around 1.6 million FiOS customers. This divests their FiOS service everywhere except the east coast. Those 1.6 million customers represent a very significant 24% of the 6.6 million FiOS customers reported at the end of 2014. A few weeks ago Verizon also announced an end to any further expansion of FiOS.

It’s been clear for years that Verizon has wanted out of the copper business. They first sold off large portions of New England to FairPoint. Then in 2010 they sold a huge swath of lines in fourteen states to Frontier, including the whole state of West Virginia. And now comes this sale. It’s starting to look like Verizon doesn’t want to be in the landline business at all, perhaps not even in the fiber business.

After all, this latest selloff was done to finance another big chunk of wireless spectrum. When Verizon CEO Lowell McAdam announced the landline sale he said that the company would be focusing on its 108 million wireless customers. One can see the emphasis on wireless just by looking at the company’s annual reports. One has to go many pages deep to find a discussion of the landline business, and most of the report talks about the wireless business.

McAdam said that the company was going to put its emphasis on selling data and video to LTE customers. McAdam repeated a past announcement that Verizon would be rolling out an online video package later this summer and he hinted that the service would include a significant number of networks when launched. They plan to sell the new video packages to both their wireless customers and to anybody online.

I find several things about Verizon’s decisions to be very interesting:

  • One has to wonder how Verizon will deliver a lot of video programming through the cellular network. Certainly LTE has enough speed to deliver video, and most urban LTE network tests come in between 10 Mbps and 20 Mbps. But the issue in the cellular network is not speed, but overall capacity from a given cell site. Perhaps some of the new spectrum they are buying will be used strictly for this purpose to beef up capacity. But I find it a bit ironic that Verizon would now be pushing such a data-heavy network application when just a few short years ago they claimed that network congestion was the reason they needed to impose skimpy monthly data caps.
  • You also have to wonder how they are going to reconcile this product with their existing data caps. An hour of video streaming can use a gigabyte of data, so it won’t take very much video viewing to hit the existing caps. Verizon stopped selling unlimited data plans in 2012. They throttle the top 5% of unlimited 3G users and threatened last fall to do the same thing to LTE customers, but backed down after a lot of pushback. The majority of their customers have low caps that are not going to match up well with cellular video products.
  • Perhaps they were hoping to exempt their own video product from data caps, but that would violate the impending new net neutrality rules which won’t allow favoring your own product over those from other video providers. So perhaps Verizon is going to go back to unlimited data plans or at least raise the caps significantly. But doing that will allow Netflix and others to compete on cellphones.
  • One also has to wonder how they will keep up with the inevitable trend for bigger bandwidth video. Will wireless networks really be able to deliver 4k video and the even bigger 8k bandwidth products that will inevitably follow?
  • In general, one has to be curious about their obvious desire to be only a wireless company. The general trend in the cellular industry is towards lower prices. I know I was able to cut my own cellphone plan price almost in half this past year, and the trend is for prices to keep going lower. Certainly having companies like Google enter the market is going to push prices lower. Also, Cablevision announced a cellphone plan that mostly uses WiFi and that will only dip into the cellular network as a fallback. Comcast and others are considering this and it could produce significant competition for Verizon.
  • This announcement also tells me that they see profits in selling over-the-top video. It’s well known that nobody makes much money selling the huge traditional cable lineups, but Verizon obviously sees better margins in selling smaller packages of programming. But will margins remain good for online video if a lot of companies jump into that business?
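The data-cap mismatch in the bullets above can be put into rough numbers. Using the rule of thumb of about one gigabyte per hour of streamed video, and some assumed plan sizes (the caps below are illustrative, not Verizon’s actual tiers), a capped plan buys surprisingly little viewing:

```python
# How quickly video viewing exhausts a cellular data cap, assuming
# roughly 1 GB per hour of streamed video (a rule-of-thumb figure).

GB_PER_HOUR_VIDEO = 1.0

def hours_until_cap(cap_gb: float) -> float:
    """Hours of streaming video that fit inside a monthly data cap."""
    return cap_gb / GB_PER_HOUR_VIDEO

for cap in (2, 5, 10):  # assumed monthly caps in GB
    hours = hours_until_cap(cap)
    print(f"{cap} GB cap: about {hours:.0f} hours of video per month")
```

Even the largest of those assumed caps covers only a few evenings of viewing, which is why a cellular video product and small caps sit so awkwardly together.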

I scratch my head over the selloff of FiOS. Verizon reports an overall 41% market penetration for its data product on FiOS networks. Data has such a high profit margin that it’s hard to think FiOS is not extremely profitable for them. The trend has been for the amount of data used by households to double every three years, and one doesn’t have to project that trend forward very far to see that future bandwidth needs will only be met by fiber or by significantly upgraded cable networks. Landline networks today deliver virtually all of the bandwidth that people use. There are now more cellular data sessions than landline ones, but people rely on their landline connection for any application that uses significant bandwidth.
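The doubling-every-three-years trend is easy to project forward. The 50 GB/month starting point below is an assumption for illustration, not a measured figure:

```python
# Projecting monthly household data usage under the
# doubling-every-three-years trend. The starting point is assumed.

def projected_usage(start_gb: float, years: int,
                    doubling_period: float = 3.0) -> float:
    """Monthly usage after `years`, doubling every `doubling_period` years."""
    return start_gb * 2 ** (years / doubling_period)

start = 50  # assumed GB per month for a household today
for years in (3, 6, 9, 12):
    print(f"In {years} years: {projected_usage(start, years):.0f} GB per month")
```

Twelve years of that trend turns 50 GB into 800 GB a month, which is the kind of demand curve that only fiber or heavily upgraded cable plant can track.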

Verizon was a market leader getting into the fiber business. FiOS was a bold move at the time. It’s another bold move to essentially walk away from the fiber business and concentrate on wireless. They obviously think that wireless has a better future than wireline. But since they are already at the top of the pile in cellular, one has to wonder where they see future growth. One has to admit that they have been right a lot in the past, and I guess we’ll have to wait a while to see if this is the right move.

The History of Cellphones

This is another blog that looks at the history of the industry; today I look at the history of the cellphone. Cellphones are arguably the most successful product in the history of our industry, but young people are often surprised to find out that the industry and technology are still relatively new.

Prior to 1973, and stretching back into the 1920s, there were various versions of radio phones that were mostly used by businesses with vehicle fleets. These services were generally of poor quality and were limited either by the number of simultaneous users (only three at a time per city in the early 1950s) or by geography (you couldn’t leave the range of the tower you were connected to).

But several breakthroughs enabled the cellphone technology we know today. First, in the late 1960s Philip T. Porter and a team of engineers at Bell Labs proposed the system of directional cell phone towers that we still have in place today. In 1970 Amos E. Joel of Bell Labs invented the ‘three-sided trunk circuit’ that is the basis for cellular roaming, allowing a call to be handed from one cell tower to another.

The big breakthrough came in 1973 when Martin Cooper of Motorola and researchers at Bell Labs came up with the first hand-held cellphone. The first phone weighed two and a half pounds and was nine inches long. It could hold enough charge for 30 minutes of talking and took ten hours to recharge. But the idea of a handheld portable phone took hold, and several companies began developing a wireless product. Interestingly, none of the prognosticators at the time thought that the technology had much of a future. They predicted future customers in the tens of thousands, not the billions we see today.

The first commercial use of the new cellular technologies came in Tokyo in 1979, in Scandinavia in 1981, and in the US in 1983. The technology was analog and referred to as Advanced Mobile Phone System (AMPS). It had a number of flaws by modern standards: it was susceptible to eavesdropping with a scanner, and it was easy to introduce unauthorized phones onto the network. I can recall occasionally seeing somebody talking on one of these mobile phones in the 80s, but they were relatively rare. The phones got smaller, batteries improved, and the first flip phone was introduced in 1989.

The first system that was more like what we have today was also introduced in the US, using Motorola’s DynaTAC handset on 1G technology. Early 1G was an analog service, with digital service following in 1990. In the early 1990s the second generation of networks was introduced using 2G. There were two competing technologies at the time (and still are today) that differed by the underlying standards – the GSM standard from Europe and the US-developed CDMA standard. The first GSM network was introduced in Finland in 1991 and hit the US in 1993.

Also introduced in 1993 was the IBM Simon, which could be called the first smartphone. It had features like a pager, a fax machine, and a PDA merged with a cellphone, and it included advanced features for the time such as a stylus touch screen, address book, calendar, calculator, notepad, and email. About this same time came the introduction of texting. The first text message was sent in England in December 1992, followed by Finland in 1993. Texting was everywhere by the mid-1990s.

The demand for accessing the web from a cellphone drove the creation of 3G. This changed the phone from circuit switching to packet switching, allowing the introduction of a data connection. The first 3G network was introduced in Japan in 2001, in Korea in 2002, and in the rest of the world starting in 2003. By the end of 2007 there were 295 million customers using a 3G network, which represented 9% of worldwide cellphone subscribers. Apple released its first iPhone in 2007 (a 3G-capable version followed in 2008). That phone was the first ‘modern’ smartphone, and today smartphone sales dominate the worldwide market. Finally, around 2009 came the introduction of the first 4G networks, which increased theoretical data speeds by a factor of ten. There were two competing commercial standards for 4G data – WiMAX and LTE. Many of these networks in the US have just been completed for most urban and suburban customers.

So it’s easy for a kid to think we have always had cellphones. But the first iPhone came out only seven years ago, and the flip phone was the predominant phone for more than a decade before that. Before the flip phone there were very few cellphone users compared to today. This is an industry that has grown entirely during my career, and it’s still hard sometimes to believe how well it has done. Now, if I had just bought that Apple stock . . .

The Skinny on U.S. 4G Data Speeds

I am a statistics freak and I read any and all statistics I can find about the telecom industry. A lot of statistics are interesting but require a lot of heavy lifting to see what is going on beneath the numbers. But I ran across one set of statistics that sums up the problems of wireless 4G data in this country in a few simple numbers.

A company called OpenSignal has an app that people can use to measure the actual download speeds they see on LTE 4G networks. This app is used worldwide and so we can also compare the US to other parts of the world. In 2014 the comparisons were made from readings from 6 million users of the app.

The first interesting statistic is that the US came in 15th in the world in LTE speeds. In 2014 the US average download speed was a paltry 6.5 Mbps across all US 4G downloads. At the top of the chart were Australia at 24.5 Mbps, Hong Kong at 21 Mbps, Denmark at 20.1 Mbps, Canada at 19.3 Mbps, Sweden at 19.2 Mbps, and South Korea at 18.6 Mbps. Speeds drop pretty significantly after that; Japan, for example, was at 11.8 Mbps. So beyond all of the hype from AT&T and Verizon touting their network speeds, they have not done a very good job in the US.

But the second statistic is even more telling. The speeds in the US dropped from 9.6 Mbps in 2013 to 6.5 Mbps in 2014. The US was the only country among the top fifteen that saw a significant percentage drop from one year to the next. Sweden also had a drop, but only from 22.1 Mbps to 19.2 Mbps.

So what does this all mean? First, the drop in speed can probably best be explained by the fact that so many people in this country are using wireless data. Large numbers of users are obviously overwhelming the networks, and as more people use the wireless data networks the speeds drop. Our wireless networks are all based upon the total bandwidth capacity at a given cell site, and to the extent that more people want data than a cell site is designed for, the speeds drop as the site tries to accommodate everybody.
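The shared-capacity effect is simple arithmetic: a cell site has a fixed pool of bandwidth, and the average per-user speed falls as the number of simultaneously active users rises. The 150 Mbps capacity figure below is an assumption for illustration:

```python
# Average per-user LTE speed on one cell site, assuming a fixed pool of
# usable capacity (an assumed figure) split evenly among active users.

SITE_CAPACITY_MBPS = 150.0

def per_user_mbps(active_users: int) -> float:
    """Even split of site capacity across simultaneously active users."""
    return SITE_CAPACITY_MBPS / active_users

for users in (5, 15, 25):
    print(f"{users} active users: {per_user_mbps(users):.1f} Mbps each")
```

Under these assumptions, 25 simultaneous users drag the average down to 6 Mbps, which is in the neighborhood of the 6.5 Mbps US average OpenSignal measured.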

But for the average 4G speed for the whole year to only be 6.5 Mbps there has to be a whole lot more to the story. One might expect Canada to be faster than the US simply because we have a lot more large cities that can put strains on wireless networks. But you wouldn’t expect that to make the Canadian 4G experience three times faster than the US experience. And there are very few places on earth as densely populated as Hong Kong and they have the second fastest 4G networks in the world.

It’s obvious from these numbers that the US wireless carriers are not making the same kinds of investments per customer as other countries are doing. It’s one thing to beef up urban cell sites to 4G, but if those cell sites are too far apart then too many people are trying to use the same site. I would have to guess that our main problem is the number and spacing of cell sites.

But we also have a technology issue, and regardless of what the carriers say, there are a lot of places that don’t even have 4G yet. I don’t have to drive more than two miles outside my own town to drop to 3G coverage, and only a few more miles past that to be down to 2G. A few weeks ago I was in Carlsbad, California, a nice town halfway between LA and San Diego, right on I-5. I couldn’t even find a 2G network there at 5:00 in the evening, probably due to all of the traffic on the interstate.

I hope the FCC looks at these kinds of statistics because they debunk all of the oligopoly hype we get from the wireless carriers. I laugh when people tell me they are getting blazing fast speeds on 4G, because it’s something I look at all of the time when I travel and I have never seen it. When I hear of somebody who claims that they are getting 30 Mbps speeds I know that they must be standing directly under a cell tower at 3:00 in the morning. I like speed, but not quite that much.

Will the Real 4G Please Stand Up?

4G LTE single mode modem by Samsung, operating in the first commercial 4G network by Telia (Photo credit: Wikipedia)

We are all aware of grade inflation, where teachers give out more high grades than are deserved. But US cellular marketers have been doing the same thing to customers and have inflated the performance of their data products by calling every new development the next generation. Earlier this year the International Telecommunications Union (ITU) approved the final standards for 4G cellular data. One of the features of the final standard is that a 4G network must be able to deliver at least 100 Mbps of data to a phone in a moving vehicle and up to 1 Gbps to a stationary phone.

Meanwhile in the US we have had cellular networks marketed as 4G for several years. In the US the earliest deployments of 3G networks happened just after 2001. That technology was built to a standard that had to deliver at least 200 kbps of data, which was more than enough when we were using our flip phones to check sports scores.

But since then there have been a number of incremental improvements in the 3G technology. Improvements like switching to 64-QAM modulation and multi-carrier technologies improved 3G speeds. By 2008, 3G networks were pretty reliably delivering download speeds up to 3 Mbps using these kinds of improvements. Around the rest of the world this generation of 3G improvements was generally referred to as 3.5G. But in the US the marketers started calling it 4G. It certainly was a lot faster than the original 3G, but it is still based on the 3G standard and is not close to the 4G standard.

And since then there have been other big improvements using LTE and HSPA. For example, LTE is an all-packet technology, which allows it to send voice traffic over the data network, gaining efficiency by not having to switch between voice and data. One of the biggest improvements was the introduction of MIMO (multiple input, multiple output), which uses multiple antennas to send and receive several data streams at once, multiplying the throughput available to a handset.

For a while WiMAX looked like a third competitor to LTE, but it’s pretty obvious now that LTE has won the platform battle in the US. All of the major carriers have deployed significant amounts of LTE, and most of them say these deployments will be done by the end of this year in metropolitan markets. Speeds on LTE are certainly much faster than earlier speeds using 3.5G technology. But this is still not 4G, and around the rest of the world this technology is referred to as 3.9G or Pre-4G.

But to date there are very few phones that use the LTE network to its fullest. There have been a few handsets, like the HTC Thunderbolt, that were designed to use the available LTE speeds. And Verizon says it will roll out smartphones in 2014 that will only work on the LTE network.

There is a big trade-off in handsets between power consumption and the ability to switch between multiple cellular technologies. A typical cell phone today needs to be able to work on 3G networks, 3.5G networks and several variations of the latest networks including the different flavors of LTE as well as the HSPA+ used by T-Mobile. So, interestingly, the most popular phones like the iPhone and the Galaxy S4 will work on LTE, but don’t come close to achieving the full speeds available with LTE. And of course, nobody tells this to customers.

Starting in September, South Korea will see a new deployment of another incremental improvement in cellular data speeds, a technology called LTE-A (LTE Advanced). It achieves data speeds of about twice those of current US LTE deployments by layering in a technology called carrier aggregation (CA), which links together two different blocks of spectrum into one data path.
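The speed gain from carrier aggregation is essentially additive: the aggregated link’s peak rate is roughly the sum of the component carriers’ peaks. A sketch, using an assumed 75 Mbps peak for a single LTE carrier (the figure is illustrative, not from any carrier’s spec sheet):

```python
# Peak-rate effect of LTE-A carrier aggregation: the aggregated peak is
# roughly the sum of the component carriers' peaks. Figures are assumed.

def aggregated_peak_mbps(carrier_peaks_mbps: list) -> float:
    """Sum the peak rates of the aggregated component carriers."""
    return sum(carrier_peaks_mbps)

single_carrier = [75.0]        # one assumed LTE carrier
two_aggregated = [75.0, 75.0]  # two such carriers bonded into one data path

print(aggregated_peak_mbps(single_carrier))
print(aggregated_peak_mbps(two_aggregated))
```

Bonding two equal carriers roughly doubles the peak, which matches the reported LTE-A speeds of about twice current LTE.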

And the US carriers have talked about deploying the LTE-A technology starting sometime in 2014. No doubt when this is deployed in the US some marketer is going to call it 5G. And yet, it is still not up to the 4G standard. Maybe this is now 3.95G. Probably by the time somebody actually deploys a real 4G phone in the US it is going to have to be called 8G.