I can remember when it was announced that we would be living in an Internet of Things world. Not only would our houses be stuffed full of labor-saving IoT devices, but our fields, forests, and even the air around us would be full of small sensors giving us feedback on the world around us. The reality was not the revolution predicted by the industry press, but over the past decade, most of us have acquired smart devices in our homes. The fields, forests, and surrounding environment – not so much.
The IoT trend was followed by big pronouncements that we’d all be adopting wearables. This meant not only devices like Google Glass, but wearables built into our everyday clothes so that we could effortlessly carry a computer and sensors with us everywhere. This prediction was about as big a flop as imaginable. Google Glass crashed and burned when the public made it clear that nobody wanted everyday events to be live-streamed. Other than gimmicks at CES, there has been no real attempt at smart clothes.
But wearables weren’t the biggest flop of all – that distinction is reserved in my mind for 5G. The hype for 5G swamped the hype for all of the other big trends combined. 5G was going to transform the world. We’d have near-gigabit speeds everywhere, and wireless was going to negate the need for investing in fiber broadband networks. 5G was going to enable fleets of driverless cars. 5G would drive latency so low that it was going to be the preferred connection for gamers and stock traders. There were going to be 5G small cell sites on every corner, and fast wireless broadband would be everywhere. Instead of 5G, we got a watered-down version of 4G LTE labeled as 5G. Admittedly, cellular broadband speeds are much faster, but none of the predicted revolution came to pass.
A few predictions came to pass largely as touted – although at a much slower pace. Five years ago, we were told that everything was going to migrate to the cloud. Big corporations were going to quickly ditch internal computing, and within a short time, the cloud would transform computing. It didn’t happen as quickly as predicted, but we have moved a huge amount of our computing lives into the cloud. Tasks like gaming, banking, and most of the apps we’ve come to rely on live in the cloud today. The average person doesn’t realize the extent to which they rely on the cloud until they lose broadband and discover how little of what they do is stored on the computers in their homes and offices.
This blog was prompted by the latest big trend. The press is full of stories about how computing is moving back to the edge. In case the irony escapes you, this largely means undoing a lot of the big benefits of moving to the cloud. There are some good reasons for the shift. For example, the daily news about hacking has corporations wondering if data will be safer locally than in the cloud. But the most important reason cited for the movement to edge computing is that the world is looking for extremely low latency – and that can only come when computer processing is done locally. The trouble with this prediction is that it’s hard to find applications that absolutely must have a latency of less than 10 milliseconds. I’m sure there are some, but not enough to make this the next big trend. I could be wrong, but history would predict that this will happen to a much smaller degree than is being touted by vendors.
All big technology trends share one big weakness – the world naturally resists change. Even when the next big thing has clear advantages, there must be an overwhelming reason for companies and people to drop everything and immediately adopt something new, and the new thing is usually untested in the market. Most businesses have learned that being an early adopter is risky – a new technology can bring a market edge, but it can also leave egg on one’s face.