Categories: Technology, The Industry

Shutting Down Obsolete Technologies

There was an interesting statement during the recent Verizon first quarter earnings report call. The company admitted that shutting down the 3G cellular networks cost it about 1.1 million retail cellular customers along with the corresponding revenues.

This was long expected because there are still a lot of remote, rural places where 3G was the only cellular signal available. There were also a lot of people still happy with 3G flip phones even where 4G was available. Some of these customers will likely come back with 4G phones, but many might be angry with Verizon for cutting them off and go elsewhere.

Verizon has been trying to shut down the 3G network for at least five years. Its original plans got delayed due to discussions with the FCC and then got further delayed because of the pandemic – it didn’t seem like a good idea to cut folks dead when cellular stores were shuttered.

This change was inevitable – technologies become obsolete and have to be upgraded or replaced. The bandwidth that can be delivered on a 3G network is tiny. Most of you remember when you used 3G and a flip phone to check the weather and sports scores. Cellular carriers want to repurpose the spectrum used for 3G to support 4G and 5G. The 3G transition is particularly abrupt because the only way to make it is to cut the 3G signal dead, at which point 3G phones become bricks.

All of the technologies used for broadband and telecom eventually become obsolete. I remember when we used ISDN to deliver 128 Kbps broadband to businesses. I remember working with N-carrier and other technologies for creating data connections between central offices. Telephone switches took up a big room instead of being housed inside a small computer. The earlier versions of DOCSIS technology were largely abandoned and upgraded to newer technology. BPON became GPON and is now becoming XGS-PON.

Most transitions to new technologies are phased in over time. You might be surprised that there are still working ISDN lines chugging along that are being used to monitor remote sensors. There are still tiny rural cable companies operating the early versions of DOCSIS. But the industry inevitably replaces ancient technology in the same way that none of you are reading this blog on an IBM 5150 or a Macintosh 128k.

But some upgrades are painful. There were folks who lost cellular coverage when 3G was cut dead because they live in places that can’t receive the 4G replacement. A 3G phone needed only a tiny amount of bandwidth to operate – at signal levels that newer phones would perceive as far under one bar of service.

The other painful technology replacement that keeps getting press is the big telcos killing off the copper networks. When copper is cut off in an area, the traditional copper landlines and DSL go dead. In some cases, customers are offered a move to a fiber network. The price might be higher, but those customers at least get a good permanent technology replacement. But not every DSL customer in a city that loses copper service is offered a fiber alternative. Those customers likely find themselves paying $30 or $40 more to move to the cable company.

In rural areas, the telcos often offer to move customers to wireless. For a customer that lives within a decent distance from a cell tower, this should be an upgrade. Fixed wireless delivered for only a few miles should be faster than rural DSL. But like all wireless technologies, there is a distance limitation around any given tower, and the FWA signal isn’t going to work for everybody. Some customers that lose rural copper are left with whatever alternatives are available – because the telephone company is basically abandoning them. In many rural areas, the broadband alternatives are dreadful – which is why many were sticking with slow rural DSL.

I hear a lot of complaints from folks who lose traditional copper who are upset that they lose the ability to use services that work on copper technology, such as fax machines and medical monitors. It may sound uncaring, but these folks need to buy something newer that works with today’s broadband. Those are the kind of changes that are inevitable with technology upgrades. Just like you can’t take your old Macintosh to get fixed at Best Buy, you can’t keep using a technology that nobody supports. That’s an inevitable result of technology getting better over time. This is not a comfort to the farmer who just lost his 3G cell coverage – but there is no way to keep older technology operating forever.

Categories: Technology, The Industry

Killing 3G

I have bad news for anybody still clinging to their flip phones. All of the big cellular carriers have announced plans to end 3G cellular service, and each has a different timeline in mind:

  • Verizon previously said it would stop supporting 3G at the end of 2019 but now says it will end service at the end of 2020.
  • AT&T has announced the end of 3G to be coming in early 2022.
  • Sprint and T-Mobile have not expressed a specific date but are both expected to stop 3G service sometime in 2020 or 2021.

The amount of usage on 3G networks is still significant. GSMA reported that at the end of 2018 as many as 17% of US cellular customers still made 3G connections, accounting for as much as 19% of all cellular connections.

The primary reason cited for ending 3G is that the technology is far less efficient than 4G. A 3G connection to a cell site chews up the same amount of frequency resources as a 4G connection yet delivers far less data to customers. The carriers are also anxious to free up mid-range spectrum for upcoming 5G deployment.

Opensignal measures actual speed performance for millions of cellular connections and recently reported the following statistics for the average 3G and 4G download speeds as of July 2019:

Carrier      4G 2019      3G 2019
AT&T         22.5 Mbps    3.3 Mbps
Sprint       19.2 Mbps    1.3 Mbps
T-Mobile     23.6 Mbps    4.2 Mbps
Verizon      22.9 Mbps    0.9 Mbps
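
Just to put the gap in perspective, below is a quick back-of-the-envelope comparison of those averages in Python. It is not a true spectral-efficiency calculation – it ignores channel widths and cell loading – but it shows roughly how much more data a 4G connection wrings out of the network than a 3G connection:

    # Opensignal July 2019 averages from the table above (download speeds in Mbps)
    speeds_mbps = {
        # carrier: (4G average, 3G average)
        "AT&T":     (22.5, 3.3),
        "Sprint":   (19.2, 1.3),
        "T-Mobile": (23.6, 4.2),
        "Verizon":  (22.9, 0.9),
    }

    for carrier, (avg_4g, avg_3g) in speeds_mbps.items():
        print(f"{carrier}: 4G averaged roughly {avg_4g / avg_3g:.0f}x the 3G speed")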

The carriers have hesitated to end 3G because there are significant numbers of rural cell sites that still don’t offer 4G. The cellular carriers were counting on funding from the FCC’s Mobility Fund Phase II to upgrade rural cell sites. However, that funding program got derailed and delayed when the FCC found there were massive errors in the data provided for distributing the fund. The big carriers were accused by many of rigging the data in a way that would give more funding to themselves instead of to smaller rural cellular providers.

The FCC staff conducted significant testing of the reported speed and coverage data and released a report of their findings in December 2019. The testing showed that the carriers have significantly overreported 4G coverage and speeds across the country. The report is worth reading for anybody who needs to be convinced of the garbage data that has been used to create the FCC broadband maps. I wish the FCC staff would put the same effort into investigating the landline broadband data provided to the FCC. The FCC staff recommended that the agency release a formal Enforcement Advisory including ‘a detailing of the penalties associated with carrier filings that violate federal law’.

The carriers are also hesitant to end 3G since a lot of customers still use the technology. Opensignal says there are several reasons for the continued use of 3G. First, 12.7% of 3G users live in rural areas where 3G is the only cellular technology available. Opensignal says that 4.1% of 3G users still own old flip phones that are not capable of receiving 4G. The biggest category of 3G users is customers who own a 4G-capable phone but still subscribe to a 3G data plan. AT&T is the largest provider of such plans and has not forced customers to upgrade to 4G plans.

The carriers need to upgrade rural cell sites to 4G before they can be allowed to cut 3G dead. In doing so they need to migrate customers to 4G data plans and also notify customers who still use 3G-only flip phones that it’s finally time to upgrade.

One aspect of the 3G issue that nobody is talking about is that AT&T says it is using fixed wireless connections to meet its CAF II buildout requirements. Since the CAF II areas include some of the most remote landline customers, it stands to reason that these are the same areas that are likely to still be served with 3G cell towers. AT&T can’t deliver 10/1 Mbps or faster speeds using 3G technology. This makes me wonder what AT&T has been telling the FCC in terms of meeting their CAF II build-out requirements.

Categories: Regulation - What is it Good For?

The Puzzling Lifeline Order

The FCC has released more details of the revised Lifeline program order. It’s a long order, and I won’t even try to summarize it in this blog since the Internet will be full of such summaries in a few days.

Instead, I am going to highlight a few parts of the order that truly have me puzzled. The intent of the Lifeline order was to help to promote broadband adoption for low-income households. Unfortunately there are parts of the order that I think might accomplish the opposite of what is intended.

My primary beef with the plan (and it’s a huge one) is that the fund can be used to subsidize 3G cellular service. Not only that, but it will support cellular data plans with a monthly data cap of only 500 MB (half of a gigabyte). This is mind-boggling to consider.

One of the stated purposes of the Lifeline plan is to help close the “homework gap” by providing data connections for school-age children. What sort of homework gap does the FCC think it is closing with a 3G connection and a minuscule monthly data cap? The FCC is basically supporting a flip-phone data plan.

There has been a lot of recent press about how some broadband customers are now opting for mobile data over landline data, and I figure this has to be mostly to save money. The people who are choosing mobile data as their only option either aren’t big data users or else they have daytime access to somebody else’s WiFi on a landline data connection.

A few weeks ago I was in eastern Washington State at a hotel that had data speeds so slow that I couldn’t even open email. And so for two evenings I used my mobile data to connect my laptop. I didn’t watch any video and just conducted business, followed some election news and looked at Facebook a bit, and in two short evenings I used over 2.5 GB of data. It is simply not practical to do normal functions over the Internet within a tiny mobile data allowance.

And yet, somehow a family with school kids is supposed to be able to use a 3G mobile connection that has a data cap for the whole month of half a gigabyte? Have you ever tried opening a big web page on 3G? The FCC’s plan is beyond ludicrous. I’m picturing that AT&T and Verizon are either going to cut people off the Lifeline connection when they reach the tiny monthly cap or else they are going to nail the poorest households with data overage charges – and those households will end up spending more for mobile data than they do today.

The FCC apparently recognizes that the ½ gigabyte cap is too small, because the cap is slated to grow to 2 GB by the end of 2018. But even that will provide almost no real functionality for kids doing homework. I’m picturing kids watching assigned videos on their phone and using up their monthly data cap on the first school day of the month. The FCC has caved in to special interests and has handed a huge revenue stream to the wireless carriers, which is downright sickening. This one provision basically ruins the functionality of the Lifeline plan in my eyes because the wireless carriers are going to siphon off huge amounts of the Lifeline fund for worthless data plans.

The other part of the plan that I dislike is the cap on wireline data. The order requires that low-income households be given connection speeds of at least 10 Mbps downstream and 1 Mbps upstream. This is not great (and not broadband according to the FCC) but it is good enough for the homework gap. Yet anybody getting this assistance can still be subjected to a monthly data cap of 150 GB.

And so, a household today that might already have a data plan with no cap is going to get a data cap slapped on their household for taking advantage of a $10 per month subsidy from the FCC. Comcast just raised its data caps to 1 TB (terabyte), something that I was very happy to see. But now the FCC comes along and imposes a much smaller data cap on Lifeline landline connections. Should a customer who is paying $40 today for a data connection be penalized that heavily because they accept a $10 subsidy on their broadband? This feels vindictive to me, as if the sentiment is “No on-line video for you poor people!”
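
To put some rough numbers on these caps, here’s a simple back-of-the-envelope sketch. The per-activity data sizes are my own assumptions (roughly 2 MB per web page and about 0.7 GB per hour of standard-definition video), so treat the results as illustrative only:

    # How long each Lifeline cap lasts under a modest daily usage assumption
    daily_usage_gb = (
        40 * 2 / 1024      # ~40 web pages a day at roughly 2 MB each
        + 1 * 0.7          # one hour of standard-definition video at ~0.7 GB/hour
    )

    caps_gb = {
        "Mobile cap today (0.5 GB)": 0.5,
        "Mobile cap by end of 2018 (2 GB)": 2.0,
        "Wireline cap (150 GB)": 150.0,
    }

    for name, cap in caps_gb.items():
        print(f"{name}: lasts about {cap / daily_usage_gb:.1f} days at this usage")

Even at that modest level of use, the mobile cap is gone in less than a day and the future 2 GB cap in a couple of days, while the 150 GB wireline cap would cover a light user for months – but not a household streaming much video.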

I honestly don’t understand why the FCC would impose data caps on Lifeline plans, and particularly don’t understand why they would impose data caps that are more stringent than what the carriers already have in place today. Hopefully the carriers will ignore these caps and let customers have the same cap as anybody else with the same plan. But I fear otherwise, and that makes the practical application of the Lifeline order pretty rotten in my mind.

Categories: The Industry

Issues Facing Cellular Networks

Most networks today are under stress due to growing broadband traffic. The networks that are easily the most stressed are cellular networks, and I think there are lessons to be learned from looking at how mobile providers are struggling to keep up with demand. Consider the following current issues faced by cellular network owners:

Traffic Volume Growth. Around the world, cellular networks are seeing between 60% and 120% annual growth in data volumes. The problem with that kind of growth is that as soon as any upgrade is made to a part of the network, it is consumed by the growth. This kind of growth means constant choke points in the network and problems encountered by customers.

The large cellular companies like Verizon and AT&T are handling this with big annual capital budgets for network improvements. But they will be the first to tell you that even with those expenditures they are only putting band-aids on the problem and are not able to get ahead of the demand curve.
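
A quick compounding exercise shows why. The sketch below uses the 60% to 120% growth range cited above, with today’s traffic normalized to 1.0 (my own simplification):

    # Compound traffic growth over five years at the low and high ends of the range
    for annual_growth in (0.60, 1.20):
        volume = 1.0
        print(f"\nAt {annual_growth:.0%} annual growth:")
        for year in range(1, 6):
            volume *= 1 + annual_growth
            print(f"  Year {year}: {volume:.1f}x today's traffic")

At those rates, even a network that doubles its capacity has bought itself only about a year of headroom.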

WiFi Offload Not Effective. For years cellular networks have talked about offloading data to WiFi. But the industry estimates are that only between 5% and 15% of cellphone data is being handled by WiFi. This figure does not include usage in homes and offices where the phone user elects to use their own local network, but rather is the traffic that is offloaded when users are outside of their base environment. Finding ways to increase WiFi offload would lower the pressure on mobile networks.

Traffic has Moved Indoors. An astounding 75% of mobile network traffic originates from inside buildings. Historically mobile traffic came predominantly from automobiles and people outside, but the move indoors looks like a permanent new phenomenon driven by video and data usage.

The biggest impact of this shift is that most cellular networks were designed, and the towers spaced, for outdoor customers, so the towers and radios are in the wrong places to best serve where the volume is greatest today. This trend is the number one driver of micro cell sites that are aimed at relieving congestion for specific locations.

Network Problems Can be Extremely Local. The vagaries of wireless delivery mean that there can be network congestion at a location but no network issues as close as 50 yards away. This makes it very hard to diagnose and fix network issues. Problems can pop up and disappear quickly. A few more large data users than normal can temporarily cripple a given cell site.

Network owners are investigating technologies that will allow customers to pick up a more distant cell site when their closest one is full. Wireless networks have always allowed for this, but it’s never worked very well in practice. The carriers are looking for a more dynamic process that will find the best way to serve each customer quickly in real time.
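
To illustrate the idea, here is a hypothetical sketch of what load-aware cell selection might look like. The scoring formula and the numbers are invented purely for illustration – real radio resource management is far more sophisticated – but it captures the trade-off between signal strength and site loading:

    def pick_cell(candidates):
        # candidates is a list of (name, signal_dbm, load_fraction) tuples
        def score(cell):
            _, signal_dbm, load = cell
            # Stronger signal is better; heavily loaded sites get penalized
            return signal_dbm - 40 * load
        return max(candidates, key=score)

    nearby = [
        ("Site A (closest)", -75, 0.95),   # strong signal, but nearly full
        ("Site B (farther)", -90, 0.30),   # weaker signal, lightly loaded
    ]
    print("Chosen:", pick_cell(nearby)[0])

In this made-up example the phone would be steered to the farther, lightly loaded site even though the closest tower has the stronger signal.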

Networks are Operating too Many Technologies. It’s not unusual to find a given cell site operating several versions of 3G and 4G and sometimes still even 2G. The average cell site carries 2.2 different technologies, provided by 1.3 different vendors.

Cellular operators are working quickly towards software defined networks that will allow them to upgrade huge numbers of cell sites to a new version of software at the same time. They are also working to separate voice and data to different frequencies making it easier to handle each separately. Finally, the large cellular carriers are looking to develop and manufacture their own custom equipment to cut down on the number of vendors.

Still Too Many Failures. There are still a lot of dropped voice calls, and 80% of them are caused by mobility failures, meaning a failure of the network to handle a customer on the move. 50% of dropped data sessions are due to capacity issues.

Cellular providers are looking for the capacity to more dynamically assign radio resources on the fly at different times of the day. It’s been shown that there are software techniques that can optimize the local network and can reduce failures by as much as 25%.

Categories: The Industry

The History of Cellphones

This is another blog that looks at the history of the industry; today I look at the history of the cellphone. Cellphones are arguably the most successful product in the history of our industry, but young people are often surprised to find out that the industry and technology are still relatively new.

Prior to 1973, and stretching back into the 1920s, there were various versions of radio phones that were mostly used by businesses with vehicle fleets. These services were generally of poor quality and were limited either by the number of simultaneous users (only 3 at a time per city in the early 50’s) or by geography (you couldn’t leave the range of the tower you were connected to).

But several breakthroughs enabled the cellphone technology we know today. First, in the late 1960’s Philip T. Porter and a team of engineers at Bell Labs proposed the system of modern directional cell phone towers that we still have in place today. In 1970 Amos E. Joel of Bell Labs invented the ‘three-sided trunk circuit’ that is the basis for cellular roaming, allowing a call to be handed from one cell tower to another.

The big breakthrough came in 1973 when Martin Cooper of Motorola and researchers at Bell Labs came up with the first hand-held cellphone. The first phone weighed two and a half pounds and was nine inches long. It could hold enough charge for 30 minutes of talking and took ten hours to recharge. But the idea of having a handheld portable phone took hold and several companies began developing a wireless product. Interestingly, none of the prognosticators at the time thought that the technology had much of a future. They predicted future customers in the tens of thousands, not the billions that we see today.

The first commercial use of the new cellular technologies came in Tokyo in 1979, Scandinavia in 1981 and the US in 1983. The technology was analog and referred to as Advanced Mobile Phone System (AMPS). It had a number of flaws by modern standards: it was susceptible to eavesdropping with a scanner, and it was easy to introduce unauthorized phones onto the network. I can recall occasionally seeing somebody talking on one of these mobile phones in the 80s, but they were relatively rare. But the phones got smaller, batteries improved, and the first flip phone was introduced in 1989.

The first system that was more like what we have today was also introduced in the US by DynaTAC using 1G technology. Early 1G was an analog service and was made into a digital offering in 1990. In the early 1990s the second generation network was introduced using 2G. There were two competing technologies at the time (and still are today) that differed by the underlying standards – the GSM standard from Europe and the US-developed CDMA standard. The first GSM network was introduced in Finland in 1991 and hit the US in 1993.

Also introduced in 1993 was the IBM Simon phone, which could be called the first smartphone. It had features like a pager, fax machine and PDA merged with a cellphone. It included advanced features for the time, including a stylus touch screen, address book, calendar, calculator, notepad and email. About this same time came the introduction of texting. The first text message was sent in England in December 1992, followed by Finland in 1993. Texting was everywhere by the mid-1990s.

The demand for accessing the web from a cellphone drove the creation of 3G. This changed the phone from circuit switching to packet switching, allowing the introduction of a data connection. The first 3G network was introduced in Japan in 2001, Korea in 2002 and the rest of the world starting in 2003. By the end of 2007 there were 295 million customers using a 3G network, which represented 9% of worldwide cell phone subscribers. Apple released its first iPhone in 2007 that used the 3G technology. That phone was the first ‘modern’ smartphone, and today smartphone sales dominate the worldwide market. Finally, around 2009 came the introduction of the first 4G networks, which increased theoretical data speeds by a factor of 10. There were two different commercial standards for 4G data – WiMAX and LTE. Many of these networks in the US have just been completed for most urban and suburban customers.

So it’s easy for a kid to think we have always had cellphones. But the first iPhone was only seven years ago, and the flip phone was the predominant phone for more than a decade before that. Before the flip phone there were very few cellphone users compared to today. This is an industry that has grown entirely during my career, and it’s still hard sometimes to believe how well it has done. Now, if I had just bought that Apple stock . . .

Categories: Current News, The Industry

Living Within Our Data Caps

An interesting thing happened to the wireless carriers on their trip to bring us 4G. They warned us repeatedly that we could expect issues as they upgraded their networks, and they forced us onto skinny data plans of a few gigabytes, so most of us have learned to use WiFi with our cellphones rather than spend a fortune with the cellphone provider.

But maybe the wireless carriers have gone too far. Adobe Systems reported last week that more than half of all data from cellphones now travels over WiFi instead of 3G or 4G. Total WiFi traffic from mobile devices passed the data carried directly on the wireless networks more than a year ago. This has to be troubling to AT&T and Verizon because their business plans rely on consumers using the faster 4G LTE networks. They have made huge investments over the last few years in increasing data speeds, and that is the basis of all of their advertising.

So perhaps the tactic of imposing small data caps has backfired on them. They are not seeing their new expensive networks used nearly as much as they counted on, and this is limiting their ability to monetize the expensive upgrades. I know that I personally am very happy buying a 2 gigabyte monthly cap, and I only use cellphone data for directions while driving or when I have no other choice while traveling. I would never consider watching a video on my phone when I’m not at home. Apparently there are a lot of people like me in the world.

When AT&T and Verizon realized that people weren’t using as much data as they had hoped for, they both got into the tablet business, hoping that it would boost the use of their 4G LTE data. They have been bundling tablets into plans and even selling them below cost as a way to drive more data usage on their networks. But that move has also backfired, and I saw a report estimating that 93% of tablet data usage is over WiFi instead of the LTE network.

The WiFi trend is only going to get worse for the carriers as Hotspot 2.0 rolls out. That is the new WiFi standard that is going to let cellphones and other devices easily and automatically log into public hotspots without going through today’s annoying process of having to log onto each new network. With Hotspot 2.0 you can be pre-certified to join any WiFi router that is part of that network. So as you walk down the street in a business district you might log onto numerous different WiFi routers as you walk by them – while staying off the LTE network.

The precursors for Hotspot 2.0 are already in the market today. Once I have logged into any AT&T or Comcast hotspot with my cellphone, my phone doesn’t ask my permission whenever I come into range of another of their hotspots – it just automatically connects me.
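
Conceptually, the auto-join decision is that simple. Here is a hypothetical sketch of the idea – the data structures are invented for illustration and are not the actual Passpoint protocol messages – showing a device that silently joins any hotspot advertising a provider it already holds credentials for:

    # Providers I've signed up with once (hypothetical identifiers)
    my_credentials = {"att.wifi", "comcast.xfinity"}

    def should_auto_join(advertised_providers):
        # Join silently if the hotspot advertises a provider I already trust
        return bool(my_credentials & set(advertised_providers))

    hotspots_along_the_street = [
        ("Coffee shop", {"comcast.xfinity"}),
        ("Hotel lobby", {"hilton.guest"}),
    ]
    for name, providers in hotspots_along_the_street:
        print(name, "->", "auto-join" if should_auto_join(providers) else "stay on LTE")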

It’s been reported that the wireless carriers have had pretty good success getting families to upgrade to monthly 10 GB deluxe plans. But what they didn’t count on is that so many people are being careful to stay within their plan to avoid getting hit with charges for extra data.

It’s been reported that both AT&T and Verizon have invested heavily in the Internet of Things, and they are touting 4G connectivity as the best way to connect a wide range of devices, from wireless utility meters to animal-tracking collars. But a lot of the IoT devices in the world are going to be inside homes and businesses, where an LTE connection is often not as good as a signal from an inside-the-home WiFi router. The fact is that any outdoor radio signal is going to vary with factors like weather, temperature and the amount of the spectrum being used by others. This often makes LTE less reliable locally than a solid WiFi signal.

It will be interesting to see how the wireless carriers react to this. They have spent many billions upgrading their wireless networks and are not seeing the kind of revenue they expected from that effort. This might make them more cautious about leaping in to make the next big network upgrade, which seems to be needed every few years. It’s possible that they will expand their networks with more mini-cell sites to make their signal stronger where people live as a way to make it more usable. The one thing they are unlikely to do, at least for a while, is to give customers more data in the base wireless plans. They are likely to stick with the incremental data usage plans in place today.

One market the wireless carriers are counting on is connected cars, since that is one place where WiFi is not a real alternative. It is expected that every new car will come with data connectivity and that the amount of data used by each car will climb over time as more and more apps are included with cars. Expect the carriers to sell tens of millions of small monthly data plans to car owners as a way to make up for us all avoiding their expensive data on our cellphones. But even in that market they are competing against the smartphone, which can handle all of the functions promised by the 4G connectivity built into the smart car. I know I would rather get driving directions as part of my existing cellphone plan than buy a second data plan for my car.

Categories: The Industry

Cellular is Not the Rural Broadband Solution

I’m often asked why we can’t let cellular 4G bandwidth take care of the bandwidth needs for rural America. When you look at the ads on TV by Verizon and AT&T you would assume that the cellular data network is robust and is being built everywhere. But there are a lot of practical reasons why cellular data is not the answer for rural broadband:

Rural areas are not being upgraded. The carriers don’t make the same kinds of investments in rural markets that they do in urban markets. To see a similar situation in a related industry, consider how the large cable companies are upgrading cable modems in the metropolitan areas years before they upgrade rural areas. It seems that urban cellular technology is being upgraded every few years while rural cell sites might get upgraded once a decade.

Rural networks are not built where people live. Even where the cellular networks have been upgraded, rural cellular towers have historically been built to take care of car traffic, referred to in the industry as roaming traffic. Think about where you always see cellular towers – they are either on top of tall hills or else along a highway, not close to many homes and businesses. This matters because, like all wireless traffic, the data speeds drop drastically with distance from the tower. Where a 3G customer in a city might get 30 Mbps download speeds because they are likely less than a mile from a transmitter, a customer who is 4 miles from a tower might only get 5 Mbps. And in a rural area 4 miles is not very far.
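
To see why distance matters so much, here is a simplified sketch using a generic log-distance path-loss model. The constants are made up for illustration and vary widely with terrain, frequency and equipment, but the shape of the falloff is the point:

    import math

    def received_power_dbm(distance_miles, tx_power_dbm=46,
                           loss_at_1_mile_db=130, path_loss_exponent=3.5):
        # Received power falls off with the log of distance; an exponent around
        # 3.5 is a common rough figure for non-urban terrain.
        return (tx_power_dbm - loss_at_1_mile_db
                - 10 * path_loss_exponent * math.log10(max(distance_miles, 0.1)))

    for miles in (0.5, 1, 2, 4):
        print(f"{miles} miles from the tower: roughly "
              f"{received_power_dbm(miles):.0f} dBm at the handset")

The weaker the received signal, the less aggressive the modulation the network can use, so the customer several miles out sees a fraction of the speed available near the tower.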

The carriers have severe data plans and caps. Even when customers happen to live close to a rural transmitter and can get good data speeds, the data plans for the large carriers are capped at very skimpy levels. One HD movie uses around 1.5 gigabytes, meaning that a cap of 2 to 4 gigabytes is a poor substitute for landline broadband. There are still a few unlimited data plans around, but they are hard to get and dwindling in availability. And it’s been widely reported that once a customer reaches a certain level of usage on an unlimited plan, the speeds are choked to go very slow for the rest of the month.

Voice gets a big priority on the network. Cellular networks were built to deliver voice calls to cell phones, and voice calls still get priority on the network. A cell phone tower is limited to a finite amount of bandwidth. And so, once a few customers are downloading something big at the same time, the performance for the rest of the cell site gets noticeably worse. 3G networks are intended to deliver short bursts of fast data, such as when a cell phone user downloads an app. But there is not enough bandwidth at a cell phone tower to support hundreds of ‘normal’ data customers who are watching streaming video and using bandwidth like we use in our homes and businesses.

The plans are really expensive. Cellular data plans are not cheap. For example, Verizon will sell you a data plan for an iPad at $30 per month with a 4 gigabyte total usage cap. Additional gigabytes cost $10 to $15 each. To get the same plan for an iPhone is $70 per month since the plan requires voice and text messaging. Cellular data is the most expensive bandwidth in a country that already has some of the most expensive bandwidth in the world.
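
Here is a rough cost illustration using those numbers – a $30 plan with a 4 gigabyte cap, about 1.5 gigabytes per HD movie and roughly $12.50 per extra gigabyte (the midpoint of the overage range). Carriers typically bill overages in whole-gigabyte increments, so treat this as an approximation:

    plan_price = 30.0          # $30/month iPad data plan
    cap_gb = 4.0               # 4 GB included
    gb_per_movie = 1.5         # rough size of one HD movie
    overage_per_gb = 12.5      # midpoint of the $10 to $15 range

    for movies_per_month in (2, 5, 10):
        usage_gb = movies_per_month * gb_per_movie
        overage_gb = max(0.0, usage_gb - cap_gb)
        total = plan_price + overage_gb * overage_per_gb
        print(f"{movies_per_month} HD movies ({usage_gb:.1f} GB): "
              f"about ${total:.0f} for the month")

A household that tried to use a plan like this the way it uses landline broadband would be looking at well over $100 a month just for a handful of movies.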

There are no real 4G deployments yet. While the carriers are all touting 4G wireless, what they are delivering is 3G wireless. By definition, the 4G wireless specification allows for gigabit data download speeds. What we have now, in engineering terms, can best be described as 3.5G, and real 4G is still sometime in the future. There are reports of current cellular networks in cities getting bursts of speed up to 50 Mbps, which is very good but is not close to being 4G. And most realized speeds are considerably slower than that.
