The Next Big Thing

I’ve always been somewhat amused to read about the colossally important technology trends that are supposedly right around the corner. These trends are mostly driven by the wishful thinking of vendors, and they rarely come true as advertised. Even when the next big thing does come to pass, it’s almost never at the predicted magnitude. At least one of these big trends has been announced every year, and here are a few of the more interesting ones.

I can remember when it was announced that we would be living in an Internet of Things world. Not only would our houses be stuffed full of labor-saving IoT devices, but our fields, forests, and even the air around us would be full of small sensors giving us feedback on the world around us. The reality was not the revolution predicted by the industry press, but over the past decade most of us have ended up with smart devices in our homes. The fields, forests, and surrounding environment – not so much.

The IoT trend was followed by big pronouncements that we’d all be adopting wearables. This meant not only devices like Google Glass, but also wearables built into our everyday clothes so that we could effortlessly carry a computer and sensors with us everywhere. This prediction was about as big a flop as imaginable. Google Glass crashed and burned when the public made it clear that nobody wanted everyday events to be live-streamed. Other than gimmicks at CES, there was no real attempt at smart clothes.

But wearables weren’t the biggest flop of all – that honor is reserved in my mind for 5G. The hype for 5G swamped the hype for all of the other big trends combined. 5G was going to transform the world. We’d have near-gigabit speeds everywhere, and wireless was going to negate the need to invest in fiber broadband networks. 5G was going to enable fleets of driverless cars. 5G would drive latency so low that it was going to be the preferred connection for gamers and stock traders. There were going to be 5G small cell sites on every corner, and fast wireless broadband would be everywhere. Instead of 5G, we got a watered-down version of 4G LTE labeled as 5G. Admittedly, cellular broadband speeds are way faster, but none of the predicted revolution came to pass.

A few predictions came to pass largely as touted – although at a much slower pace. Five years ago, we were told that everything was going to migrate to the cloud. Big corporations were going to quickly ditch internal computing, and within a short time the cloud would transform computing. It didn’t happen as quickly as predicted, but we have moved a huge amount of our computing lives into the cloud. Tasks like gaming, banking, and most of the apps we’ve come to rely on are in the cloud today. The average person doesn’t realize the extent to which they rely on the cloud until they lose broadband and discover how little of what they do is stored on the computers in their homes and offices.

This blog was prompted by the latest big trend. The press is full of stories about how computing is moving back to the edge. In case the irony of that escapes you, this largely means undoing a lot of the big benefits of going to the cloud. There are some good reasons for this shift. For example, the daily news about hacking has corporations wondering if data will be safer locally than in the cloud. But the most important reason cited for the movement to edge computing is that the world is looking for extremely low latency – and that can only come when computer processing is done locally. The trouble with this prediction is that it’s hard to find applications that absolutely must have a latency of less than 10 milliseconds. I’m sure there are some, but not enough to make this into the next big trend. I could be wrong, but history suggests this will happen to a much smaller degree than vendors are touting.
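
To put the 10-millisecond bar in perspective, here’s a back-of-envelope sketch in Python (all numbers are assumed, illustrative values) that counts only fiber propagation delay for a round trip to a data center at various distances. Light in fiber covers roughly 200 kilometers per millisecond, so distance alone eats into the budget before any switching, queuing, or server time is added.

```python
# Back-of-envelope latency budget: fiber propagation delay only.
# All numbers are assumed, illustrative values; real paths add switching,
# queuing, and server processing time on top of this.

FIBER_KM_PER_MS = 200.0   # light in fiber covers roughly 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over a fiber path."""
    return 2 * distance_km / FIBER_KM_PER_MS

LATENCY_BUDGET_MS = 10.0  # the hypothetical "must be under 10 ms" requirement

for distance_km in (1, 10, 100, 500, 1000, 2000):
    rtt = round_trip_ms(distance_km)
    verdict = "fits" if rtt < LATENCY_BUDGET_MS else "blows"
    print(f"{distance_km:>5} km away: {rtt:6.2f} ms round trip "
          f"({verdict} the {LATENCY_BUDGET_MS:.0f} ms budget)")
```

In this rough accounting, a data center a thousand kilometers away has consumed the entire 10 ms budget before it does any work at all, which is the basic argument for keeping that class of processing local.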

All big technology trends have one big weakness in common – the world naturally resists change. Even when the next big thing has clear advantages, there must be an overwhelming reason for companies and people to drop everything and immediately adopt something new that is usually untested in the market. Most businesses have learned that being an early adopter is risky – a new technology can bring a market edge, but it can also result in having egg on one’s face.

AT&T’s 5G Strategy

AT&T recently described their long-term 5G strategy using what they call the three pillars of 5G – the three areas where the company is putting their 5G focus. The first pillar is a concentration on 5G cellular, and the company’s goal is to launch a 5G-based cellular service, with some cities coming on board in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. This admission that there won’t be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to hopefully launch two phones later this year that have some 5G capability. As always with a new generation of wireless technology, the bottleneck will be in handsets. The cellphone makers can’t just make generic 5G phones – they have to work with the carriers to be ready to support the specific subset of 5G features being released. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will become obsolete quickly. A 5G phone sold in late 2019 probably won’t include all of the 5G features that will be on the market by late 2020 – and this is likely to be true for the next 3 or 4 years as the carriers roll out incremental 5G improvements. It’s also a gamble for customers, because anybody who buys an early 5G cellphone will have early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and will wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher because they are talking about the fixed cellular product they’ve already been using for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. It is not the same as Verizon’s product that delivers hundreds of megabits per second, but is instead a product that delivers speeds up to 50 Mbps depending upon how far a customer lives from a cell tower – with reports that most households are getting 15 Mbps at best. This is the product that AT&T is mostly using to satisfy its CAF II requirements in rural America. None of the engineers I’ve talked to think that 5G is going to materially improve this product.

The final pillar of AT&T’s strategy is edge computing. What AT&T means by this is putting fast processors at customer sites where there is a need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G, and AT&T has already been deploying edge routers. However, 5G will enhance this capability at customer sites that need to connect enormous numbers of devices simultaneously. 5G can make it easier to connect a huge number of IoT devices in a hospital or 50,000 cell phones in a stadium. The bottom line is that the migration to more edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement, and AT&T has been talking about all three applications for some time – but the announcement does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.

The Return of Edge Computing

We just went through a decade during which most industry experts told us that the majority of our computing needs were going to move to the cloud. But that trend seems to be starting to reverse somewhat, and there are many applications where we are seeing the return of edge computing. This trend will have big implications for broadband networks.

Traditionally, everything we did involved edge computing – the use of local computers and servers. But a number of big companies like Amazon, Microsoft, and IBM convinced corporate America that there were huge benefits to cloud computing. Cloud computing then spread to small businesses and homes, and almost every one of us now works in the cloud to some extent. These benefits are real and include such things as:

  • Reduced labor costs from not having to maintain an in-house IT staff
  • Disaster recovery of data due to storing data at multiple sites
  • Reduced capital expenditures on computer hardware and software
  • Increased collaboration due to having a widely dispersed employee base on the same platform
  • The ability to work from anywhere there is a broadband connection

But we’ve also seen some downsides to cloud computing:

  • No computer system is immune from outages, and an outage in a cloud network can take an entire company out of service, not just a local branch.
  • A security breach into a cloud network exposes the whole company’s data.
  • Cloud networks are subject to denial-of-service attacks.
  • Loss of local control over software and systems – a conversion to the cloud often means giving up valuable legacy systems, and functionality from those systems is often lost.
  • Costs are not always as low as hoped.

The recent move away from cloud computing is driven by applications that need huge amounts of processing power applied in real time. The most obvious example of this is the smart car. Some of the smart cars under development run as many as 20 servers onboard, making them rolling data centers. There is no hope of ever moving the brains of smart cars or drones to the cloud due to the huge amounts of data that must pass quickly between a car’s sensors and its computers. Any external connection is bound to have too much latency to make true real-time decisions.
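
To make the point concrete, here’s a small Python sketch (speeds and latencies are assumed, illustrative values, not measurements from any real vehicle) that converts network delay into the distance a car covers while waiting for an answer.

```python
# How far does a car travel while waiting on a network round trip?
# Speeds and latencies are assumed, illustrative values.

MPH_TO_METERS_PER_SECOND = 0.44704  # exact conversion factor

def distance_traveled_m(speed_mph: float, latency_ms: float) -> float:
    """Meters covered during a given delay at a given speed."""
    return speed_mph * MPH_TO_METERS_PER_SECOND * (latency_ms / 1000.0)

SPEED_MPH = 70.0  # ordinary highway speed

for latency_ms in (1, 10, 50, 100, 250):
    meters = distance_traveled_m(SPEED_MPH, latency_ms)
    print(f"{latency_ms:>4} ms of delay at {SPEED_MPH:.0f} mph = {meters:5.1f} m traveled")
```

At a 100-millisecond round trip the car has already moved more than three meters before a response could arrive, which is why the sensor-to-computer loop has to stay inside the vehicle.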

But smart cars are not the only edge devices that don’t make sense on a cloud network. Some other such applications include:

  • Drones have the same concerns as cars. It’s hard to imagine a broadband network that can be designed to always stay in contact with a flying drone or even a sidewalk delivery drone.
  • Industrial robots. Many new industrial robots need to make decisions in real time during the manufacturing process. Robots are no longer just being used to assemble things, but are also being used to handle complex tasks like synthesizing chemicals, which requires real-time feedback.
  • Virtual reality. Today’s virtual reality devices need extremely low latencies in order to deliver a coherent image and it’s expected that future generations of VR will use significantly more bandwidth and be even more reliant on real-time communications.
  • Medical devices like MRIs also require low latencies in order to pass huge data files rapidly. As we build artificial intelligence into hospital monitors, the speed requirement for real-time decision-making will become even more critical.
  • Electric grids. It turns out that it doesn’t take much of a delay to knock down an electric grid, so local feedback is needed to make split-second decisions when problems pop up on grids (see the sketch just after this list).
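
On the electric grid point above, the short Python sketch below (again with assumed, illustrative numbers) counts how many cycles of 60 Hz power slip past during a given network round trip; protective gear generally has to act within a handful of cycles.

```python
# How many cycles of 60 Hz power pass during a given network delay?
# Numbers are assumed, illustrative values for a North American grid.

GRID_FREQUENCY_HZ = 60.0
CYCLE_MS = 1000.0 / GRID_FREQUENCY_HZ  # about 16.7 ms per cycle

for latency_ms in (5, 17, 50, 100, 250):
    cycles = latency_ms / CYCLE_MS
    print(f"{latency_ms:>4} ms of delay = {cycles:4.1f} cycles of 60 Hz power gone by")
```

A 100-millisecond trip to a distant data center already costs six full cycles, which is why grid protection decisions are made by local controllers rather than in the cloud.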

We are all familiar with a good analogy for the impact of performing electronic tasks from a distance. Anybody my age remembers when you could pick up a telephone, get an instant dial tone, and then hear a quick ring from the phone at the other end. But as we’ve moved telephone switches farther from customers, it’s no longer unusual to wait seconds for a dial tone, and to wait even more agonizing seconds to hear the ringing start at the other end. Such delays are annoying for a telephone call but deadly for many computing applications.

Finally, one of the drivers of the move to more edge computing is the desire to cut down on the amount of bandwidth that must be transmitted. Consider a factory where thousands of devices are monitoring specific operations during the manufacturing process. The idea of sending these mountains of data to a distant location for processing seems almost absurd when local servers can handle the data at faster speeds and lower latency. But cloud computing is certainly not going to go away and is still the best network for many applications. In this factory example, it would still make sense to send alarms and other non-standard data to a remote monitoring location even if the processing needed to keep a machine running is done locally.
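
Here’s a minimal sketch of that split in Python. The names (process_locally, send_alarm) and the threshold are hypothetical placeholders, not any real factory system’s API: every reading is handled by the local server, and only readings that cross an alarm threshold ever leave the building.

```python
# Minimal edge-filtering sketch: handle every reading locally, forward only
# alarms to a remote monitoring site. The names and the threshold are
# hypothetical placeholders, not any real factory system's API.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

ALARM_THRESHOLD = 95.0  # assumed limit for this illustration

def process_locally(reading: Reading) -> None:
    """Time-critical control work stays on the local server."""
    # e.g., adjust the machine, log to local storage, update a dashboard
    pass

def send_alarm(reading: Reading) -> None:
    """Only non-standard data leaves the factory floor."""
    print(f"ALARM forwarded to remote monitoring: {reading}")

def handle(reading: Reading) -> None:
    process_locally(reading)            # every reading is processed at the edge
    if reading.value > ALARM_THRESHOLD:
        send_alarm(reading)             # only the exceptions cross the WAN

# A tiny example stream of readings
for r in (Reading("press-7", 88.2), Reading("press-7", 96.4), Reading("oven-2", 91.0)):
    handle(r)
```

The design choice is simply to keep the high-volume, time-critical loop at the edge and treat the wide-area link as a narrow channel for exceptions and summaries.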