The Fourth Industrial Revolution

Academics and futurists around the world are increasingly saying that we have now entered the beginnings of the fourth industrial revolution. An industrial revolution is defined as a rapid change in the economy driven by technology.

The first industrial revolution came from steam power, which drove the creation of the first large factories to produce textiles and other goods. The second industrial revolution, called the age of science and mass production, was powered by the simultaneous development of electricity and the oil-powered combustion engine. The third industrial revolution was fairly recent: the rise of digital technology and computers.

There are differing ideas of what the fourth industrial revolution means, but every prediction involves using big data and emerging technologies to transform manufacturing and the workplace. The fourth industrial revolution means mastering and integrating an array of new technologies including artificial intelligence, machine learning, robotics, IoT, nanotechnology, biotechnology, and quantum computing. Some technologists are already predicting that the shorthand description for this will be the age of robotics.

Each of these new technologies is in its infancy, but all are progressing rapidly. Take the most esoteric technology on the list – quantum computing. As recently as three or four years ago this was mostly an academic concept, and we now have first-generation quantum computers. I can’t recall where I read it, but I remember a quote saying that if we think of the fourth industrial revolution as a 1,000-day process, we are now only on day three.

The real power of the fourth industrial revolution will come from integrating the technologies. The technology that is the most advanced today is robotics, but robotics will change drastically when robots can process huge amounts of data quickly and can use AI and machine learning to learn and cope with the environment in real time. Robotics will be further enhanced in a factory or farm setting by integrating a wide array of sensors to provide feedback from the surrounding environment.

I’m writing about this because all of these technologies will require the real-time transfer of huge amounts of data. Futurists and academics who talk about the fourth industrial revolution seem to assume that the needed telecom technologies already exist – but they don’t exist today, and they need to be developed in conjunction with the other new technologies.

The first missing element needed to enable the other technologies is computer chips that can process huge amounts of data in real time. Current chip technology has a built-in choke point where data is queued to be fed into and out of a chip for processing. Scientists are exploring a number of ways to move data faster. For example, light-based computing promises to move data at speeds up to 50 Gbps. But even that’s not fast enough, and research is underway on using lasers to beam data directly into the chip processor – a process that might increase processing speeds 1,000 times over current chips.
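As a rough back-of-the-envelope illustration (the one-terabyte dataset size is my own arbitrary example, not from any specific application), here is what those speeds mean for moving a large chunk of data on and off a chip:

```python
# Time to move a dataset across a chip's I/O link at different speeds.
# The 50 Gbps figure is the light-based computing estimate above; the
# 1,000x multiplier is the laser-to-processor research estimate.
DATASET_BITS = 1e12 * 8  # one terabyte, expressed in bits

for label, bits_per_second in [
    ("50 Gbps optical link", 50e9),
    ("1,000x faster laser link", 50e9 * 1000),
]:
    seconds = DATASET_BITS / bits_per_second
    print(f"{label}: {seconds:.2f} seconds to move 1 TB")
```

At 50 Gbps a terabyte takes more than two and a half minutes to move; at 1,000 times that speed it takes a fraction of a second, which is the difference between batch processing and something approaching real time.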

The next missing communications element is a broadband technology that can move data fast enough to keep up with the faster chips. While fiber can be blazingly fast, a fiber is far too large to use at the chip level, and so data has to be converted at some point from fiber to some other transmission path.

The amount of data that will have to be passed in some future applications is immense. I’ve already seen academics bemoaning that millimeter-wave radios are not fast enough, so 5G will not provide the solution. Earlier this year the first worldwide meeting was held to officially start collaborating on 6G technology using terahertz spectrum. Transmissions at those super-high frequencies only stay coherent for a few feet, but these frequencies can carry huge amounts of data. It’s likely that 6G will play a big role in providing the bandwidth to the robots and other big-data needs of the fourth industrial revolution. From the standpoint of the telecom industry, we’re no longer talking about the last mile – we’re starting to address the last foot!
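One way to see why those super-high frequencies only work over short distances is free-space path loss, which grows with the square of the frequency. The sketch below is my own illustration, not anything from a 6G specification, and it ignores molecular absorption, which makes terahertz propagation even worse in practice:

```python
import math

def free_space_path_loss_db(distance_m: float, frequency_hz: float) -> float:
    """Standard free-space path loss: 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)

# Compare a familiar WiFi frequency with a terahertz-band frequency
# over a one-meter path (both frequencies are illustrative choices).
for label, f in [("2.4 GHz (WiFi)", 2.4e9), ("1 THz", 1e12)]:
    print(f"{label}: {free_space_path_loss_db(1.0, f):.1f} dB loss at 1 meter")
```

The terahertz signal loses roughly 52 dB more than the WiFi signal over the same one-meter path, which is why 6G is shaping up as a last-foot technology rather than a last-mile one.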

New Technology – May 2015

This blog will look at some of the coolest new technology that has come across my screen lately.

Ultrathin Transistor. Researchers at Cornell have developed a transistor that is only three atoms thick. The transistor is made from an experimental class of materials called transition metal dichalcogenides (TMDs).

The findings were published in Nature and noted as a potentially major breakthrough. We are reaching the limit on how small circuits made from silicon can get, and this possibly portends a generation of ultrathin circuits and sensors. The Cornell team has made a circuit on a 4-inch wafer, and they believe this can easily be made commercially viable. TMDs are being discussed along with graphene as the potential breakthroughs that will let us march past the Moore’s Law limits on circuit sizes.

Acoustruments. Disney Research has developed a technology it calls acoustruments as a way to interface with physical devices using sound waves. For example, this could let you set an alarm clock at a Disney resort from an app on any cellphone that has a speaker. As you tell the app what to do, it emits sounds from the cellphone speaker that ‘push’ the appropriate buttons on the alarm clock to set the alarm. Disney sees applications allowing visitors from around the world to interface more easily with devices on Disney properties.

This has potential uses far outside this simple example because it could allow a no-power standard interface between people and electronics. This could become a handy way to interface with IoT devices, for example.
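Disney hasn’t published the details of the encoding, but the general idea of using a speaker as a low-rate data channel can be sketched with simple frequency-shift keying, where each bit is sent as a short tone at one of two frequencies (all of the frequencies and timings below are hypothetical):

```python
import numpy as np

def fsk_encode(bits: str, f0=4000.0, f1=6000.0, rate=44100, bit_seconds=0.05):
    """Encode a bit string as audio: a tone at f0 for '0', f1 for '1'.
    Illustrative parameters only; not Disney's actual scheme."""
    t = np.arange(int(rate * bit_seconds)) / rate
    tones = {"0": np.sin(2 * np.pi * f0 * t), "1": np.sin(2 * np.pi * f1 * t)}
    return np.concatenate([tones[b] for b in bits])

# A receiving device detects which tone is present in each time slot
# and recovers the bits, for example a command to set an alarm.
waveform = fsk_encode("10110010")
print(f"{waveform.size} audio samples for an 8-bit command")
```

The appeal is that the receiving device needs nothing more exotic than a way to sense sound, so the interface adds almost no cost or power draw.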

Better Electric Conductors. Scientists at Rice University, along with Teijin Aramid, a firm from the Netherlands, have demonstrated the ability to use carbon nanotubes to carry up to four times as much electricity for the same mass of wire. The team has found techniques that allow them to spin strong, durable wire from carbon nanotubes that performs as well as copper.

This could lead to specialized wiring for applications where weight is an issue. For example, it could be used to produce higher-efficiency long-haul lines from rural solar power stations, or in applications like spacecraft, airplanes, and cars where weight is always a concern.

Wireless Energy Transmission. The Japanese Aerospace Exploration Agency (JAXA) has been able to transmit 1.8 kilowatts of power accurately through the air to a receiver 170 feet away. While this is not very far, nor a lot of power, it is the first practical demonstration of the ability to transmit power in much the same way that we transmit wireless data streams.

Japan’s goal with this project is eventually to beam electricity back to earth from space. They envision large solar farms set up 22,300 miles from earth, in geostationary orbit, where they would be exposed to the sun continuously, making them more efficient and not dependent upon weather.

Breakthroughs in Quantum Computing. Researchers at IBM have made a few breakthroughs that could help to make quantum computers commercially viable. For the first time they have been able to measure the two types of quantum errors (bit-flip and phase-flip) simultaneously, allowing them to begin work on an error-correction algorithm for quantum computers. Until now, they could only measure one of the two error types at a time. The scientists have also developed a square quantum bit circuit that might make it feasible to mass-produce quantum chips.

These breakthroughs are important because quantum computing is one of the possible paths that could help us smash past the Moore’s Law limits on current technology. A quantum computer with only 50 quantum bits (qubits) can theoretically outperform a slew of our best supercomputers acting together. Such computers would also allow us to solve problems that are unsolvable today.
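To make the two error types concrete, here is a small numpy sketch (a textbook illustration, not IBM’s measurement technique) showing how a bit-flip and a phase-flip each corrupt a qubit’s amplitudes:

```python
import numpy as np

# A qubit state is a 2-vector of amplitudes over |0> and |1>.
state = np.array([0.8, 0.6])  # 0.8^2 + 0.6^2 = 1, so it is normalized

X = np.array([[0, 1], [1, 0]])   # Pauli-X: bit-flip error (|0> <-> |1>)
Z = np.array([[1, 0], [0, -1]])  # Pauli-Z: phase-flip error (|1> -> -|1>)

print("original:  ", state)       # [ 0.8  0.6]
print("bit-flip:  ", X @ state)   # [ 0.6  0.8], amplitudes swapped
print("phase-flip:", Z @ state)   # [ 0.8 -0.6], sign of |1> flipped
```

Detecting the two errors requires measuring in two different bases, which is why measuring both simultaneously was the hard part.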

Better Atomic Clock. Scientists at the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder have developed an atomic clock that is accurate to within one second in 15 billion years. This is a vast improvement over current atomic clock technology, which is based on the microwave transition of cesium-133 atoms and is accurate to within a second over 100 million years.

The new clock would be sensitive enough to measure the difference in the passage of time at different altitudes on earth, a consequence of general relativity that Einstein predicted but that has been extremely difficult to measure directly at such small scales.
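To put those accuracy figures in perspective, converting ‘one second in N years’ into a fractional error shows roughly a 150-fold improvement:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.16e7 seconds

for label, years in [
    ("cesium clock (1 s per 100 million years)", 100e6),
    ("new NIST clock (1 s per 15 billion years)", 15e9),
]:
    fractional_error = 1 / (years * SECONDS_PER_YEAR)
    print(f"{label}: fractional error ~{fractional_error:.1e}")
```

That works out to a fractional error of about 3e-16 for the cesium clock versus about 2e-18 for the new one.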

What is Quantum Computing?

A week ago one of my blogs mentioned a new way to transmit the results of quantum computing. I’ve been following quantum computing for a few years and that announcement led me to take a fresh look at the latest in the quantum computing field.

We all have a basic understanding of how our regular computers work. From the smallest chip in a fitness wearable up to the fastest supercomputer, our computers are Turing machines that convert data into bits represented by either a 1 or a 0 and then process data linearly through algorithms. An algorithm can be something simple like adding a column of numbers on a spreadsheet or something complex like building a model to predict tomorrow’s weather.

Quantum computing takes advantage of a property found in subatomic particles. Physicists have found that some particles exhibit a property called superposition, meaning that they exist simultaneously in more than one state, such as an electron occupying two different energy levels at once. Quantum computing mimics this subatomic world by creating what are called qubits, which can exist as both a 1 and a 0 at the same time. This is significant because a single qubit can, in effect, work on two calculations at once. More importantly, qubits working together scale exponentially: two qubits can perform four calculations at once, three can perform eight, and a thousand qubits can perform . . . a lot of calculations at the same time.
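The scaling being joked about is easy to make precise: n qubits can hold a superposition over 2^n states, so the number of simultaneous values explodes quickly. A quick illustration:

```python
# Number of states an n-qubit register can hold in superposition: 2**n.
for n in [1, 2, 3, 10, 50, 1000]:
    print(f"{n:>4} qubits -> {2**n:.4g} simultaneous states")
```

Even 50 qubits corresponds to roughly 10^15 simultaneous states, and 1,000 qubits to a number with more than 300 digits.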

This is intriguing to computer scientists because there are a number of challenges that need more computing power than can be supplied by even the fastest Turing computers. This would include such things as building a model that will more accurately predict the weather and long-term climate change. Or it might involve building a model that accurately mimics the actions of the human brain in real time.

Quantum computers should also be useful when looking at natural processes that have some quantum mechanical characteristics. This would involve trying to predict complex chemical reactions when designing and testing new drugs, or designing nanoparticle processes that operate at an atomic level.

Quantum computers also should be good at processes that require trying huge numbers of guesses to find a solution when each guess has an equal chance of being correct. An example is cracking a password. A quantum computer can search the possibilities far faster, while a normal computer would toil away for hours plugging in one choice after another in a linear fashion.
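The standard quantum algorithm behind this claim is Grover’s search, which doesn’t literally try every combination at once but cuts the work from roughly N guesses to roughly the square root of N. A quick comparison for an 8-character lowercase password (my own illustrative example):

```python
import math

N = 26 ** 8  # possible 8-character lowercase passwords, about 2.1e11

classical_guesses = N / 2                          # average tries, linear search
grover_iterations = (math.pi / 4) * math.sqrt(N)   # Grover's algorithm

print(f"search space:        {N:.2e}")
print(f"classical (average): {classical_guesses:.2e} guesses")
print(f"Grover (quantum):    {grover_iterations:.2e} iterations")
```

That is about 10^11 classical guesses versus a few hundred thousand Grover iterations, a dramatic but not magical speedup.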

Quantum computing is in its infancy, with the major breakthroughs coming only a few years ago. Scientists at Yale created the first qubit-based quantum computing processor in 2009. Since then a few very basic quantum computers have been built that demonstrate the potential of the technology. For instance, in 2013 Google launched the Quantum Artificial Intelligence Lab, hosted by NASA’s Ames Research Center, using a 512-qubit computer built by D-Wave.

For the most part, the field is still exploring the basic building blocks needed to build larger quantum computers. There is a lot of research looking at the best materials to use to produce reliable quantum chips and the best techniques for both programming and deciphering the results of quantum computing.

There are numerous universities and companies around the world engaged in this basic research. Recently, Google hired John Martinis and his team from the University of California at Santa Barbara; he is considered one of the foremost experts in the field of quantum computing. Martinis is still associated with UCSB but decided that joining Google gave him the best resources for his research.

The NSA is also working on quantum computers that would be able to crack any codes or encryption. Documents released by Edward Snowden show that the agency has two different initiatives underway to produce the ultimate code-breaking machine.

And there are others in the field. IBM, Microsoft, and Facebook all are doing computer research that includes quantum computing techniques. It’s possible that quantum computing is a dead end that won’t produce results that can’t be obtained by very fast Turing computers. But the theory and early prototypes show that there is a huge amount of potential for the new technology.

Quantum computers are unlikely to ever make it into common use and will probably be limited to industry, universities, or the government. A quantum computer must be isolated from external influences and will have to operate in a shielded environment. This is due to what is called quantum decoherence: just ‘looking’ at a quantum component, through any external influence, can change its state, in the same way that opening the box determines the state of Schrödinger’s cat. Quantum computing brings quantum physics into the macro world, which is both mystifying and wonderful.