Categories: Technology

Immersive Virtual Reality

In case you haven’t noticed, virtual reality has moved from headsets to the mall. At least two companies now offer an immersive virtual reality experience that goes far beyond what can be experienced with only a VR headset at home.

The newest company is Dreamscape Immersive, which has launched virtual reality studios in Los Angeles and Dallas, with more outlets planned. The virtual reality experience is enhanced by the use of a headset, hand and foot trackers, and a backpack holding the computers. The action occurs within a 16x16 room with a vibrating haptic floor that responds to the actions of the participant. This all adds up to an experience where a user can reach out and touch objects or walk around all sides of a virtual object in the environment.

The company has launched with three separate adventures, each lasting roughly 15 minutes. In Alien Zoo the user visits a zoo populated by exotic and endangered animals from around the galaxy. In The Blu: Deep Rescue users try to help reunite a lost whale with its family. The Curse of the Lost Pearl feels like an Indiana Jones adventure where the user tries to find a lost pearl.

More established is The Void, which has launched virtual reality adventure sites in sixteen cities, with more planned. The company creates virtual reality settings based upon familiar content. Its first VR experience was based on Ghostbusters, and the current theme is Star Wars: Secrets of the Empire.

The Void lets users wander through a virtual reality world. The company constructs elaborate sets where the walls and locations of objects in the real-life set correspond to what is being seen in the virtual reality world. This provides users with real tactile feedback that enhances the virtual reality experience.

You might be wondering what these two companies and their virtual reality worlds have to do with broadband. I think they provide a peek at what virtual reality in the home might become in a decade. Anybody who’s followed the growth of video games can remember how games started in arcades before they were shrunk to a format that would work in homes. The experiences these two companies offer are likewise a precursor to the virtual reality we’ll have at home in the not-too-distant future.

There is already a robust virtual reality gaming industry, but it relies entirely on providing a virtual reality experience through the use of goggles. There are now many brands of headsets on the market, ranging from the simple cardboard headset from Google to more expensive headsets from companies like Oculus, Nintendo, Sony, HTC, and Lenovo. If you want to spend an interesting half an hour, you can see the current most popular virtual reality games at this review from PCGamer. To a large degree, virtual reality gaming has been modeled on existing traditional video games, although there are some interesting VR games that now offer content that only makes sense in 3D.

The whole video game market is in the process of moving content online, with the core processing of the gaming experience done in data centers. While most games are still available in more traditional formats, gamers are increasingly connecting to a gaming cloud and need a broadband connection similar in size to a 4K video stream. Historically, many games have been downloaded, causing headaches for gamers with data caps. Playing games in the cloud can still chew up a lot of bandwidth for active gamers but avoids the giant gigabyte downloads.
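To put that tradeoff in perspective, here’s a quick back-of-the-envelope calculation. The stream rate, play time, download size and data cap are my own assumptions for illustration, not industry figures.

```python
# Back-of-the-envelope comparison of cloud gaming traffic vs. game downloads.
# The stream rate, play time, download size, and data cap are illustrative
# assumptions, not industry figures.

STREAM_MBPS = 15          # assumed cloud gaming stream, similar to a 4K video stream
HOURS_PER_MONTH = 60      # assumed monthly play time
DOWNLOAD_GB = 80          # assumed size of one big modern game download
DATA_CAP_GB = 1024        # assumed 1 TB monthly data cap

def stream_usage_gb(mbps: float, hours: float) -> float:
    """Convert a sustained stream rate and play time into gigabytes."""
    bits = mbps * 1_000_000 * hours * 3600
    return bits / 8 / 1_000_000_000

cloud_gb = stream_usage_gb(STREAM_MBPS, HOURS_PER_MONTH)
print(f"Cloud gaming: {cloud_gb:.0f} GB per month ({cloud_gb / DATA_CAP_GB:.0%} of the cap)")
print(f"One download: {DOWNLOAD_GB} GB ({DOWNLOAD_GB / DATA_CAP_GB:.0%} of the cap)")
```

Under those assumptions an active cloud gamer uses several hundred gigabytes a month, which is why data caps matter so much in this discussion.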

If history is any guide, the technologies used by these two companies will eventually migrate to homes. We saw this migration occur with first-generation video games – there were video arcades in nearly every town, but within a decade those arcades were displaced by home gaming consoles that delivered the same content.

When the kind of games offered by The Void and Dreamscape Immersive reach the home, they will ramp up the need for home broadband. It’s not hard to imagine immersive virtual reality needing speeds of 100 Mbps or more for a single data stream. These games are the first step towards eventually having something resembling a home holodeck – each new generation of gaming grows in sophistication and in its need for bandwidth.

Categories: Technology, The Industry

Virtual Reality and Broadband

For the second year in a row Turner Sports, in partnership with CBS and the NCAA, will be streaming March Madness basketball games in virtual reality. Watching the games has a few catches. The content can only be viewed on two VR headsets – the Samsung Gear VR and the Google Daydream View. Viewers can buy individual games for $2.99 or buy them all for $19.99. And a viewer must be subscribed to the networks associated with the broadcasts – CBS, TNT, TBS and truTV.

Virtual reality viewers get a lot of options. They can choose which camera to watch from or else opt for the Turner feed that switches between cameras. When the tournament reaches the Sweet 16 viewers will receive play-by-play from a Turner team broadcasting only for VR viewers. The service also comes with a lot of cool features like the ability to see stats overlays on the game or on a particular player during the action. Games are not available for watching later, but there will be a big library of game highlights.

Last year Turner offered the same service, but only for 6 games. This year the line-up has been expanded to 21 games, including selected regionals in the first and second rounds plus the Sweet Sixteen and Elite Eight games. The reviews from last year’s viewers were mostly great and Turner is expecting a lot more viewers this year.

Interestingly, none of the promotional materials mention the bandwidth needed. The cameras being used for VR broadcasts are capable of capturing virtual reality in 4K, but Turner won’t be broadcasting in 4K because of the required bandwidth. Charles Cheevers, the CTO of Arris, said last year that streaming VR at just 720p requires at least a 50 Mbps connection. That’s over 30 times more bandwidth than a Netflix stream.

Instead, these games will be broadcast in HD video at 60 frames per second. According to Oculus, that requires a data stream of 14.4 Mbps for ideal viewing. Viewing at slower speeds results in missing some of the frames. Many VR viewers complain about getting headaches while watching VR, and the primary reason for the headaches is missing frames. While the eye might not be able to notice the missing frames, the brain apparently can.
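The arithmetic behind those numbers is simple to sketch out. The per-frame budget and the dropped-frame estimate below are simplifications (real players buffer and adapt), offered only to show the scale involved.

```python
# Rough arithmetic behind the numbers above: the per-frame budget of a
# 14.4 Mbps, 60 frame-per-second stream, and a simplified estimate of how
# many frames arrive on a slower connection (real players buffer and adapt,
# so treat this as scale, not prediction).

REQUIRED_MBPS = 14.4   # ideal stream rate cited above
FPS = 60               # frames per second

bits_per_frame = REQUIRED_MBPS * 1_000_000 / FPS
print(f"Per-frame budget: about {bits_per_frame / 8 / 1000:.0f} KB")

for connection_mbps in (14.4, 10, 6):
    delivered = min(1.0, connection_mbps / REQUIRED_MBPS)
    print(f"{connection_mbps:>5} Mbps connection -> roughly {delivered * FPS:.0f} of {FPS} frames per second")
```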

One has to ask if this is the future of sports. The NFL says it’s not ready to go to virtual reality yet until there is more standardization between different VR headsets – they fear for now that VR games will have a limited audience due to the small number of viewers with the right headsets. But the technology has been tried for football, and Fox broadcast the Michigan – Notre Dame game last fall in virtual reality.

All the sports networks have to be looking at the Turner pricing of $2.99 per game and calculating the potential new revenue stream from broadcasting more games in VR in addition to traditional cable broadcasts. Some of the reviews I read of last year’s NCAA broadcasts said that after watching a game in VR, normal TV broadcasts seemed boring. Many of us are familiar with this feeling. I can’t watch linear TV any more. It’s not just sitting through the commercials, it’s being captive to the stream rather than watching the way I want. We can quickly learn to love a better experience.

Sports fans are some of the most intense viewers of any content. It’s not hard to imagine a lot of sports fans wanting to watch basketball, football, hockey or soccer in VR. And since the format favors fast-moving action, it’s easy to imagine VR also drawing viewers to rugby, lacrosse and other action sports.

It’s possible that 4K virtual reality might finally be the app that justifies fast fiber connections. There is nothing else on the Internet today that requires that much speed plus low latency. Having several simultaneous viewers in a home watching 4K VR would require speeds of at least a few hundred Mbps. You also don’t need to look too far ahead to imagine virtual reality in 8K, requiring a data stream of at least 150 Mbps – which might be the first home application that can justify a gigabit connection.
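A quick tally shows how fast those per-stream numbers add up in a busy household. The per-stream rates below are my own assumptions loosely based on the figures mentioned above, not published requirements.

```python
# Household bandwidth tally for simultaneous VR viewers. The per-stream rates
# are assumptions loosely based on the figures above (roughly 50 Mbps for a
# 4K VR stream and 150 Mbps for 8K VR), not published requirements.

STREAM_RATES_MBPS = {"4K VR": 50, "8K VR": 150}

def household_demand(viewers, headroom=1.25):
    """Sum the per-stream rates and add headroom for other household traffic."""
    total = sum(STREAM_RATES_MBPS[kind] * count for kind, count in viewers.items())
    return total * headroom

print(f"Three 4K VR viewers: ~{household_demand({'4K VR': 3}):.0f} Mbps")
print(f"Two 8K VR viewers:   ~{household_demand({'8K VR': 2}):.0f} Mbps")
```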

Categories: Technology, What Customers Want

The Next Big Broadband Application

Ever since Google Fiber and a few municipalities began building gigabit fiber networks people have been asking how we are going to use all of that extra broadband capability. I remember a few years ago there were several industry contests and challenges to try to find the gigabit killer app.

But nobody has found one yet and probably won’t for a while. After all, a gigabit connection is 40 times faster than the FCC’s current definition of broadband. I don’t think Google Fiber or anybody else thought that our broadband needs would grow fast enough to quickly fill such a big data pipe. But year after year we all keep using more data, and since the household need for broadband keeps doubling every three years, it won’t take too many doublings for some homes to start filling up larger data connections.
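That doubling claim is easy to work out. The sketch below starts from the 25 Mbps implied by the FCC definition; the starting point and the targets are assumptions for illustration.

```python
# The "doubling every three years" claim, worked out. The 25 Mbps starting
# point (the FCC broadband definition) and the targets are assumptions for
# illustration.

import math

START_MBPS = 25
DOUBLING_YEARS = 3

def years_to_reach(target_mbps):
    """Years until household demand reaches the target, doubling every three years."""
    return math.log2(target_mbps / START_MBPS) * DOUBLING_YEARS

for target in (100, 500, 1000):
    print(f"{target:>5} Mbps needed in about {years_to_reach(target):.0f} years")
```

Under those assumptions a household that needs 25 Mbps today needs 100 Mbps in about six years and closes in on a gigabit in fifteen or so.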

But there is one interesting broadband application that might be the next big bandwidth hog. Tim Cook, the CEO of Apple, was recently on Good Morning America and said that he thinks augmented reality is going to be a far more significant application in the future than virtual reality, and that once perfected it’s going to be something everybody is going to want.

By now many of you have tried virtual reality. You don a helmet of some kind and are then transported into some imaginary world. The images are in surround-3D and the phenomenon is amazing. And this is largely a gaming application and a solitary one at that.

But augmented reality brings virtual images out into the real world. Movie directors have grasped the idea and one can hardly watch a futuristic show or movie without seeing a board room full of virtual people who are attending a meeting from other locations.

And that is the big promise of augmented reality. It will allow telepresence – the ability for people to sit in their home or office and meet and talk with others as if they are in the same room. This application is of great interest to me because I often travel to hold meetings that last only a few hours, and the idea of doing that from my house would add huge efficiency to my business life. Augmented reality could spell the end of the harried business traveler.

But the technology has far more promise than that. With augmented reality people can share other kinds of images as well. You can share a sales presentation or share videos from your latest vacation with grandma. This ability to share images between people could drastically change education, and some predict that over a few decades augmented reality could make classrooms full of in-person students obsolete. This technology would fully enable telemedicine. Augmented reality will also enhance aging in the home, since shut-ins could still have a full social life.

And of course, the application that intrigues everybody is using augmented reality for entertainment. Taken to the extreme, augmented reality is the Star Trek holodeck. There are already first-generation units that can create a virtual landscape in your living room. It might take a while until the technology gets as crystal clear and convincing as the TV holodeck, but even having some percentage of that capability opens up huge possibilities for gaming and entertainment.

As the quality of augmented reality improves, the technology is going to require big bandwidth connections with a low latency. Rather than just transmitting a 2D video file, augmented reality will be transmitting 3D images in real time. Homes and offices that want to use the technology are going to want broadband connections far faster than the current 25/3 Mbps definition of broadband. Augmented reality might also be the first technology that really pushes the demand for faster upload speeds since they are as necessary as download speeds in enabling a 2-way augmented reality connection.
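As a simple illustration of why upload matters, here is a check of a few common connection profiles against a hypothetical two-way augmented reality session. The 50 Mbps-per-direction requirement is purely a guess on my part for illustration, not a published spec.

```python
# A quick check of common connection profiles against a hypothetical two-way
# augmented reality session. The 50 Mbps-per-direction requirement is a guess
# for illustration, not a published spec.

AR_MBPS_EACH_WAY = 50  # assumed symmetric requirement for one 2-way AR stream

CONNECTIONS = {
    "FCC broadband (25/3)": (25, 3),
    "Cable modem (60/5)": (60, 5),
    "Symmetric fiber (1000/1000)": (1000, 1000),
}

for name, (down, up) in CONNECTIONS.items():
    ok = down >= AR_MBPS_EACH_WAY and up >= AR_MBPS_EACH_WAY
    verdict = "supports" if ok else "cannot support"
    print(f"{name}: {verdict} one 2-way AR stream")
```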

This is not a distant-future technology, and a number of companies are working on devices that will bring the first generation of the technology into homes in the next few years. And if we’ve learned anything about technology, once a popular technology is shown to work and there is demand in the marketplace, there will be numerous companies vying to improve it.

If augmented reality were here today, the biggest hurdle to using it would be the broadband connections most of us have. I am certainly luckier than people in rural areas, and I have a 60/5 Mbps connection with a cable modem from Charter. But the connection has a lot of jitter and the latency swings wildly, and my upload stream is not going to be fast enough to support 2-way augmented reality.
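For anyone curious about their own connection, here is a minimal sketch of how to get a rough read on that latency swing by timing repeated TCP connections to a well-known host. It measures TCP connect time rather than a true ping, so treat the numbers as approximate; the host and sample count are arbitrary.

```python
# A minimal sketch for getting a rough read on latency swings by timing
# repeated TCP connections to a well-known host. This measures TCP connect
# time rather than a true ICMP ping; the host and sample count are arbitrary.

import socket
import statistics
import time

HOST, PORT, SAMPLES = "www.google.com", 443, 20

rtts = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        pass                                   # connect, then close immediately
    rtts.append((time.perf_counter() - start) * 1000)   # milliseconds
    time.sleep(0.2)

print(f"min/avg/max RTT: {min(rtts):.1f}/{statistics.mean(rtts):.1f}/{max(rtts):.1f} ms")
print(f"jitter (std deviation): {statistics.stdev(rtts):.1f} ms")
```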

The economic benefits from augmented reality are gigantic. The ability for business people to easily meet virtually would add significant efficiency to the economy. The technology will spawn a huge demand for content. And the demand to use the technology might be the spur that will push ISPs to build faster networks.

Categories: Technology, The Industry

The Return of Edge Computing

We just went through a decade where the majority of industry experts told us that most of our computing needs were going to move to the cloud. But it seems that trend is starting to reverse somewhat, and there are many applications where we are seeing the return of edge computing. This trend will have big implications for broadband networks.

Traditionally everything we did involved edge computing – or the use of local computers and servers. But a number of big companies like Amazon, Microsoft and IBM convinced corporate America that there were huge benefits of cloud computing. And cloud computing spread to small businesses and homes and almost every one of us works in the cloud to some extent. These benefits are real and include such things as:

  • Reduced labor costs from not having to maintain an in-house IT staff.
  • Disaster recovery of data due to storing data at multiple sites.
  • Reduced capital expenditures on computer hardware and software.
  • Increased collaboration due to having a widely dispersed employee base on the same platform.
  • The ability to work from anywhere there is a broadband connection.

But we’ve also seen some downsides to cloud computing:

  • No computer system is immune from outages and an outage in a cloud network can take an entire company out of service, not just a local branch.
  • A security breach into a cloud network exposes the whole company’s data.
  • Cloud networks are subject to denial of service attacks.
  • Loss of local control over software and systems – a conversion to cloud often means losing valuable legacy systems, and functionality from these systems is often lost.
  • Not always as cheap as hoped for.

The recent move away from cloud computing comes from computing applications that need huge amounts of computing power in real time. The most obvious example of this is the smart car. Some of the smart cars under development run as many as 20 servers onboard, making each car a driving data center. There is no hope of ever moving the brains of smart cars or drones to the cloud due to the huge amounts of data that must be passed quickly between the car’s sensors and its computers. Any external connection is bound to have too much latency to make true real-time decisions.

But smart cars are not the only edge devices that don’t make sense on a cloud network. Some other such applications include:

  • Drones have the same concerns as cars. It’s hard to imagine a broadband network that can be designed to always stay in contact with a flying drone or even a sidewalk delivery drone.
  • Industrial robots. Many new industrial robots need to make decisions in real-time during the manufacturing process. Robots are no longer just being used to assemble things, but are also being used to handle complex tasks like synthesizing chemicals, which requires real-time feedback.
  • Virtual reality. Today’s virtual reality devices need extremely low latencies in order to deliver a coherent image and it’s expected that future generations of VR will use significantly more bandwidth and be even more reliant on real-time communications.
  • Medical devices like MRIs also require low latencies in order to pass huge data files rapidly. As we build artificial intelligence into hospital monitors, the speed requirement for real-time decision making will become even more critical.
  • Electric grids. It turns out that it doesn’t take much of a delay to knock down an electric grid, and so local feedback is needed to make split-second decisions when problems pop up on grids.

We are all familiar with a good analogy for the impact of performing electronic tasks from a distance. Anybody my age remembers when you could pick up a telephone, get instant dialtone, and then hear a quick ring from the phone at the other end. But as we’ve moved telephone switches farther from customers it’s no longer unusual to wait seconds to get a dialtone, and to wait even more agonizing seconds to hear the ringing start at the other end. Such delays are annoying for a telephone call but deadly for many computing applications.

Finally, one of the drivers of the move to more edge computing is the desire to cut down on the amount of bandwidth that must be transmitted. Consider a factory where thousands of devices are monitoring specific operations during the manufacturing process. The idea of sending these mountains of data to a distant location for processing seems almost absurd when local servers can handle the data at faster speeds with lower latency. But cloud computing is certainly not going to go away and is still the best network for many applications. In this factory example it would still make sense to send alarms and other non-standard data to a remote monitoring location even if the processing needed to keep a machine running is done locally.
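The split in that factory example looks something like the sketch below: every reading is handled by a local control loop, and only the exceptions get forwarded upstream. The threshold, the simulated sensor feed and the send_to_cloud stub are all hypothetical placeholders.

```python
# A sketch of the edge pattern described above: a local loop handles every
# sensor reading in real time, and only out-of-range alarms get forwarded to
# a remote (cloud) monitoring system. The threshold, the simulated sensor
# feed, and the send_to_cloud stub are all hypothetical.

import random

TEMP_LIMIT_C = 85.0  # assumed alarm threshold for a piece of factory equipment

def adjust_machine(reading):
    """Stand-in for the local, low-latency control loop."""
    pass

def send_to_cloud(alarm):
    """Stand-in for the uplink to a remote monitoring service."""
    print(f"ALARM forwarded to cloud: {alarm}")

for _ in range(1000):                        # pretend stream of sensor readings
    reading = random.gauss(70, 8)            # hypothetical temperature sensor
    adjust_machine(reading)                  # every reading is handled locally
    if reading > TEMP_LIMIT_C:               # only the exceptions leave the building
        send_to_cloud({"sensor": "kiln-3", "temp_c": round(reading, 1)})
```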

 

Categories: Current News, Technology

Google Looking at Wireless Drops

In an interview with Re/code, Craig Barratt, the CEO of Access for Alphabet, said that Google is looking at wireless last mile technologies. Google is not the only one looking at this. The founder of Aereo has announced a new wireless initiative to launch this summer in Boston under the brand name Starry. And Facebook says it is also investigating the technology.

The concept is not new. I remember visiting an engineer in Leesburg, Virginia back in the 90s who had developed a wireless local loop technology. He had working prototypes that could beam a big data pipe for the time (I’m fuzzily remembering a hundred Mbps back when DSL was still delivering 1 Mbps). His technology was premature in that there wasn’t any good technology at the time for bringing fast broadband to the curb.

As usual there will be those who jump all over this news and declare that we no longer need to build fiber. But even if one of these companies develops and perfects the best imaginable wireless technology, there is still going to have to be a lot of fiber built. All of these new attempts to develop wireless last mile technologies share a few common traits that are dictated by the nature of wireless spectrum.

First, to get the kind of big bandwidth that Google wants to deliver, the transmitter and the customer have to be fairly close together. Starry is talking about a quarter-mile delivery distance. One characteristic of any wireless signal is that the signal weakens with distance. And the higher the frequency of the spectrum used, the faster the signal deteriorates.
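Both of those effects fall out of the standard free-space path loss formula. This is textbook radio math rather than anything specific to these companies, and it ignores rain fade, foliage and walls, which make real-world losses worse.

```python
# Free-space path loss: the textbook reason a wireless signal weakens with
# distance and weakens faster at higher frequencies. This ignores rain,
# foliage, and walls, which make real-world losses worse.

import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB for a distance in km and a frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

for freq in (2.4, 5.8, 28, 60):              # GHz; 28 and 60 GHz are millimeter-wave bands
    quarter_mile = fspl_db(0.4, freq)        # roughly the Starry delivery distance
    one_mile = fspl_db(1.6, freq)
    print(f"{freq:>4} GHz: {quarter_mile:.0f} dB at a quarter mile, {one_mile:.0f} dB at a mile")
```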

Second, unless there is some amazing breakthrough, a given transmitter will have a fixed and limited number of possible paths that can be established to customers. This characteristic makes it very difficult to connect to a lot of customers in a densely populated area and is one of the reasons that wireless today is more normally used for less densely populated places.

Third, the connection for this kind of point-to-multipoint network must be line of sight. In an urban environment every building creates a radio ‘shadow’ that blocks access to customers sitting behind that building. This can be overcome to a small degree with technologies that bounce the signal from one customer to another – but such retransmission of a signal cuts both the strength of the signal and the associated bandwidth.

However, Google has already recognized that there are a lot of people unwilling or unable to buy a gigabit of bandwidth from them on fiber. In Atlanta the company is not just selling a gigabit connection but is also hitting the street with a 100 Mbps connection for $50. A good wireless system that had access to the right kind of spectrum could deliver that kind of bandwidth to a fairly reasonable number of customers around a given transmitter. But it would be technically challenging to do the same with gigabit bandwidth unless each transmitter served fewer customers (and had to be even closer to the customer). A gigabit wireless network would start looking a lot like the one I saw years ago in Virginia where there was a transmitter for just a few nearby customers – essentially fiber to the curb with gigabit wireless local loops.
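Some rough capacity arithmetic shows why. The usable capacity per transmitter and the oversubscription ratio below are assumptions of mine, not figures from Google or Starry.

```python
# Rough capacity arithmetic for a shared wireless transmitter. The usable
# capacity per radio and the oversubscription ratio are assumptions, not
# figures from Google or Starry.

TRANSMITTER_CAPACITY_MBPS = 2000   # assumed usable capacity of one radio sector
OVERSUBSCRIPTION = 4               # assumed: not every customer peaks at once

for tier_mbps in (100, 300, 1000):
    customers = TRANSMITTER_CAPACITY_MBPS * OVERSUBSCRIPTION // tier_mbps
    print(f"{tier_mbps:>4} Mbps tier: roughly {customers} customers per transmitter")
```

Under those assumptions a transmitter that can comfortably serve dozens of 100 Mbps customers can only serve a handful of gigabit customers.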

But if Starry can do what they are shooting for – the delivery of a few hundred Mbps of bandwidth at an affordable price – it will be very welcome today and would provide real competition to the cable companies that have monopolies in most urban neighborhoods. But, and here is where many might disagree with me, the time is going to come in a decade or two when 200 Mbps of bandwidth is going to be just as obsolete as first-generation DSL has become in the twenty years since it was developed.

Over the next twenty years we can expect the full development of virtual and augmented reality so that real telepresence is available – holographic images of people and places brought to the home. This kind of technology will require the kind of bandwidth that only fiber can deliver, and I think we’ll start seeing it just a few years from now. I can already imagine a group of teenagers gathering at one home, each with their own headset, to play virtual reality games with people somewhere else. That application alone could easily require a gigabit pipe.

I welcome the idea of the wireless last mile if it serves to break the cable monopoly and bring some real price competition into broadband. It’s a lot less appealing if the wireless companies decide instead to charge the same high prices as the incumbents. It sounds like the connections that Starry is shooting for are going to be fast by today’s standards, but I’m betting that within a few decades the technology will fall by the wayside – like every technology that doesn’t bring a fast wire to the home.

Categories: Technology, The Industry

Coming Technology Trends

I love to look into the future and think about where the various technology trends are taking us. Recently, Upfront Ventures held a conference for the top technology venture capital investors and before that conference they asked those investors what they foresaw as the biggest technology trends over the next five years. Five years is not the distant future and it’s interesting to see where the people that invest in new businesses see us heading in that short time. Here were the most common responses:

Talent Goes Global. Innovation and tech talent have most recently been centered in the US, Europe, Japan and China. But now there are tech start-ups everywhere and very talented young technologists to be found in all sorts of unlikely places.

For many years we have warned US kids that they are competing in a worldwide economy, and this is finally starting to come true. In a fully networked world it’s getting easier to collaborate with the brightest people from around the world, and that’s a much larger talent pool. The days of Silicon Valley being the only place developing the next big thing are probably behind us.

Sensors Everywhere. There will be a big increase in sensors that supply us feedback about the world around us in ways that were previously unimaginable. Assuming that we can find a way to tackle the huge influx of big data in real time, we are going to have a whole new way to look at much of our world.

There will be sensors on farms, in factories, in public places and in our homes and businesses that will begin providing a wide array of feedback on the environment around us. There are currently hundreds of companies working on medical monitors that are going to be able to tell us a lot more about ourselves, which will allow us to track and treat diseases and allow older folks to stay in their homes longer.

The First Real Commercial A.I. It’s hard to go a week these days without hearing about an A.I. platform that is solving the same kinds of issues we face every day. A.I. systems are now able to learn things from scratch, on their own, and self-monitor to improve their performance in specific applications.

This opens up the possibility of automating huge numbers of repetitive processes. I have a friend who is a CPA and has already automated the tax preparation process – he can go from bank accounts to a finished set of books and tax returns in an hour or two, a process that used to take a week or longer. And soon it will be totally automated and not require much human assistance at all until the finished product is ready for review. People think that robots are going to take over physical tasks – and they will – but before then expect to see a huge wave of automation of paperwork processes like accounting, insurance claim processing, mortgage and credit card approval, and a long list of other clerical and white-collar tasks.

Better Virtual Reality. The first generation of virtual reality is now hitting the market, but with five more years of development the technology will find its way into many facets of our lives. In case you haven’t tried it yet, first-generation VR is already pretty spectacular, and its potential is almost mind-blowing when you plot it out along a normal path of technical improvements and innovations.

New Ways to Communicate. The VC investors think that we are on the verge of finding new ways to communicate. Already today a lot of our communication has moved to messaging platforms and away from phone calls and email. With the incorporation of A.I., the experts predict a fully integrated communications system that easily and automatically incorporates all kinds of communications mediums. And with the further introduction of bots, companies will be able to join conversations automatically without needing piles of people to do it.

Categories: Technology

A Network Without Wires

There is an emerging trend in the industry to try to create home networks without wires. ISPs and cable companies are putting a lot of faith in WiFi as an alternative to wires running to computers and set-top boxes.

It’s an interesting trend, but one that is not without peril. The problem is that WiFi, at least as the big ISPs deliver it, is not always the best solution. The big cable companies like Comcast tend to provide customers with a cable modem with a decent-quality WiFi router built in. This router is placed wherever the cable enters the home, which might not be the ideal location.

A single strong WiFi router can be a great device in a home with a simple network and uncomplicated demands. A family with two TVs, one computer, and a few smartphones is probably going to do fine with a strong WiFi router as long as the house isn’t too large for the signal to get where it’s needed.

But we are quickly changing to a society where many homes have complex data needs scattered throughout the house. People are likely to be demanding video streams from all over the home, and often many at the same time. There are bound to be a few computers and it’s not unlikely that somebody in the house works at home at least part of the time. Demands for big bandwidth for things like gaming and the new virtual reality sets that are just now hitting the market are increasing. And we are on the verge of seeing 4K video streams at 15 Mbps. On top of all this will be a variety of smart IoT devices that are going to want occasional attention from the network.
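A rough tally of a household like that shows how quickly the concurrent streams approach what a single router can realistically deliver. The device mix and the 150 Mbps of usable WiFi throughput below are assumptions for illustration only.

```python
# A rough tally of the household described above, compared to what a single
# WiFi router can realistically deliver. The device mix and the 150 Mbps of
# usable WiFi throughput are assumptions for illustration only.

USABLE_WIFI_MBPS = 150   # assumed real-world throughput of a single ISP-supplied router

DEMANDS_MBPS = {
    "4K video stream": 15,
    "HD video stream": 5,
    "work-at-home video call": 4,
    "online gaming session": 5,
    "VR headset": 50,
    "IoT devices (aggregate)": 2,
}

household = ["4K video stream", "4K video stream", "HD video stream",
             "work-at-home video call", "online gaming session",
             "VR headset", "IoT devices (aggregate)"]

total = sum(DEMANDS_MBPS[d] for d in household)
print(f"Concurrent demand: {total} Mbps of {USABLE_WIFI_MBPS} Mbps usable ({total / USABLE_WIFI_MBPS:.0%})")
```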

When a home gets crowded with devices it’s very easy to overwhelm a WiFi router. The new routers are pretty adept at setting up multiple data paths, but with too many streams the router loses efficiency as it constantly tries to monitor and change the bandwidth for each stream it is managing. When this happens, a home network can bog down and the router’s throughput drops precipitously.

There are a few solutions to this problem. First, you can run wires directly to a few of the bigger data eaters in a house and remove them from the WiFi network. Just make sure in doing so that you also disable their WiFi so they stop searching for a signal. But people don’t really want more wires in their home, and ISPs definitely do not like this idea.

The other solution is to add additional WiFi hotspots in the home. The simplest examples of this are WiFi repeaters that simply amplify the signal from the base WiFi hotspot. However, repeaters don’t improve the contention issue; they simply bring a stronger signal closer to some of the devices that need it.

A more complex solution is to set up a network of interconnected WiFi hotspots. This consists of separate WiFi routers that all feed through one base router, a configuration that is familiar to any network engineer but alien to most home owners. The main problem with this solution is obvious to anybody who has ever operated a network with multiple routers – getting them to work together efficiently. Setting up a multiple-router network can be challenging to those unfamiliar with networks. And if configured poorly this kind of network can operate worse than one big hotspot.

But these kinds of interconnected WiFi networks are the cutting edge of home networking. I was recently talking to an engineer from a mid-size cable company and he admitted that as many as 20% of their customers already need this kind of solution. It’s a bit ironic that the demand for WiFi is mushrooming so soon after the ISPs went to the one-router solution. The percentage of homes that need a better solution is growing rapidly as homes jam more devices onto WiFi.

So there is an opportunity here for any ISP. Customers need better networks in their homes and there is a revenue opportunity in helping them to set these up. The downside, at least for now, is that this is labor intensive and there may be a lot of maintenance to keep these networks running right. But there are a number of vendors looking into solutions and one would hope that home WiFi networks will soon become plug and play.

Categories: Technology

Augmented vs. Virtual Reality

We are about to see the introduction of the new generation of virtual reality machines on the market. Not far behind them will probably be a number of augmented reality devices. These devices are something that network operators should keep an eye on, because they are the next generation of devices that are going to be asking for significant bandwidth.

The term ‘augmented reality’ has been around since the early 1990s and is used to describe any technology that overlays a digital interface over the physical world. Until now, augmented reality has involved projecting opaque holograms to blend into what people see in the real world. Virtual reality takes a very different approach and immerses a person in a fully digital world by projecting stereoscopic 3D images onto a screen in front of your eyes.

A number of virtual reality headsets are going to hit the market late this year into next year:

  • HTC Vive is hoping to hit the market by Christmas of this year. This is being developed in conjunction with Valve. This device will be a VR headset that will incorporate some augmented reality, which will allow a user to move and interact with virtual objects.
  • Oculus Rift, owned by Facebook, is perhaps the most anticipated release and is expected to hit the market sometime in 2016.
  • Sony is planning on releasing Project Morpheus in 1Q 2016. This device will be the first VR device integrated into an existing game console.
  • Samsung will be releasing its Gear VR sometime in 2016. This device is unique in that it’s powered by the Samsung Galaxy smartphone.
  • Razer will be releasing a VR headset based upon open source software that they hope will allow for more content delivery. Dates for market delivery are still not known.

All of these first-generation virtual reality devices are aimed at gaming and, at least for the first few generations, that will be their primary use. As with any new technology, price is going to be an issue at first, but one has to imagine that within a few years these devices might be as common as, or even displace, traditional game consoles. The idea of being totally immersed in a game is going to be very attractive.

There are two big players in the augmented reality market – Microsoft’s HoloLens and the Google-backed Magic Leap. These devices don’t have a defined target release date yet. But the promise for augmented reality is huge. These devices are being touted as perhaps the successor to the smartphone and as such have a huge market potential. The list of potential applications for an augmented reality device is mind-bogglingly large, which must be what attracted Google to buy into Magic Leap.

The Magic Leap works by beaming images directly into a user’s retinas, and the strength and intensity of the beam can create the illusion of 3D. But as with Google Glass, a user is also going to be able to see the real world behind the image. This opens up a huge array of possibilities that range from gaming, where the device takes over a large share of the visual space, to the same sorts of communicative and informative functions performed by Google Glass.

The big hurdles for augmented reality are how to power the device and how to overcome the social stigma of wearing a computer in public – who can forget the scorn that instantly accrued to ‘glassholes’ who wore Google Glass into bars and other public places? As a device it must be small, low power, inconspicuous to use, and still deliver an amazing visual experience to users. It’s probably going to take a while to work out those issues.

The two kinds of devices will compete with each other to some extent on the fringes of the gaming community, and perhaps in areas like providing virtual tours of other places. But for the most part the functions they perform and the markets they chase will be very different.

Categories: Current News, Technology

What Does a Gigabit Get Us?

Pew Research did a survey of 1,464 industry experts and asked them what killer apps we can expect if the US is able to significantly increase customer bandwidth between now and 2025. About 86% of the experts thought that bandwidth would improve enough by then to provide a platform for supporting widespread new applications.

The question does not suppose that everybody will have a gigabit of download speed, although by then there will be many homes and businesses with that much speed available. But one can also suppose that by then there will be many people with download speeds of hundreds of megabits. The cable companies are on a path with DOCSIS 3.1 to be able to increase speeds significantly on their networks if they so choose. So the biggest chance for fast speeds for the masses is not having fiber built everywhere by 2025, but rather having the cable companies step up over the next decade. Most experts think that they will to some extent (and I agree).

There were a few applications that a number of the experts agreed would become prevalent if download speeds increase:

Telepresence. There was a feeling that telepresence will come a long way over the next decade. We already see the beginning of this today. For example, Julian Assange of WikiLeaks recently appeared at a summit in Nantucket via hologram. That is the precursor to having routine meetings with people by hologram. This would not just be speakers at conferences (though it would make it easier to get more impressive speakers when they don’t have to travel). It means having salesmen make calls by telepresence. It means having staff meetings and other business meetings by telepresence. This is going to have a huge impact on business and could produce big savings by reducing travel and the wasted costs and hours that go with it.

But there is also going to be a huge market for residential telepresence. One of the most popular features of an iPhone today is FaceTime, which lets people easily see each other while they talk. And Skyping has become wildly popular. One can imagine that people will grab onto telepresence as soon as the associated hardware is affordable, as a way to spend time with family and friends.

The experts also think that telepresence will have a big impact on medicine and education. Telemedicine will have come a long way when a patient can spend time in the ‘presence’ of a doctor. Telepresence also will be huge for shopping, since you will be able to get 3D demos of products online. In fact, this might become the most prominent early use of the technology.

Virtual Reality. Somewhat related to telepresence will be greatly improved virtual reality. We have the start of this today with Oculus Rift, but over a decade, with more bandwidth and faster processors, we can have improved virtual reality experiences that can be used for gaming or for blending the fantasy world with the real one. There was also news last week that Microsoft demonstrated a 3D hologram gaming platform they are calling GameAlive that brings something akin to a holodeck experience into your living room. Over the decade virtual reality is likely to move beyond the need for a special helmet and will instead move into our homes and businesses.

Imagine being in a gym room and playing a game of tennis or some other sport with a friend who is elsewhere, or against an imaginary opponent. Imagine taking virtual tours of foreign tourist destinations or even visiting imaginary places like other planets or fantasy worlds. It is likely that gaming and virtual reality will get so good that they become nearly irresistible. So I guess if computers take all of our jobs at least we’ll have something fun to do.

Internet of Things. Within a decade the IoT will become a major factor in our daily lives and the interaction between people and machines will become more routine. We are already starting to see the beginning of this in that we spend a lot of our time connected to the web. But as we become more entwined with technology, our day-to-day routines will change in big ways. For example, experts all expect personal assistants like Siri to improve to the point where they become a constant part of our lives.

Just last week we saw IBM roll out their Watson supercomputer platform for use in daily apps. That processing speed along with better conversational skills is quickly going to move the web and computer apps deeper into our lives. Many of the experts refer to this as a future of being ‘always-on’, where computers become such a routine part of life that we are always connected. Certainly wearables and other devices will make it easier to always have the web and your personal assistant with you.

Aside from the many benefits of the IoT which I won’t discuss here, the fact that computers will become omnipresent is perhaps the most important prediction about our future.

Not everything predicted by the experts was positive and tomorrow I am going to look at a few of those issues.
