Future Technology – May 2018

I’ve seen a lot of articles recently that promise big improvements in computer speeds, power consumption, data storage, etc.

Smaller Transistors. There has been an assumption that we are nearing the end of Moore’s Law because we are reaching the limit on how small transistors can be made. The smallest commercially available transistors today are built at the 10 nanometer scale. The smallest theoretical size for silicon transistors is around 7 nm, since below that size the transistor can no longer contain the electron flow due to a phenomenon called quantum tunneling.

However, scientists at the Department of Energy’s Lawrence Berkeley National Laboratory have developed a 1 nanometer transistor gate, roughly an order of magnitude smaller than today’s smallest silicon transistors. The scientists used molybdenum disulfide, a lubricant commonly found in auto shops. Combining this material with carbon nanotubes allows the electron flow to be controlled across just 1 nm. Much work is still needed to go from lab to production, but this is the biggest breakthrough in transistor size in many years, and if it pans out it will provide a few more turns of Moore’s Law.

Better Data Storage. A team of scientists at the National University of Singapore has developed what could be a leap forward in data storage technology. The breakthrough uses skyrmions, tiny magnetic whirls first identified in 2009. The scientists combined cobalt and palladium into a film that is capable of housing the otherwise unstable skyrmions at room temperature.

Once stabilized, the skyrmions, at only a few nanometers in size, can be used to store data. If these films can be stacked, they could provide data storage with 100 times the density of current storage media. We need better storage since the amount of data we want to store is immense and is expected to increase 10-fold over the next decade.

Energy Efficient Computers. Ralph Merkle, Robert Freitas and others have created a theoretical design for a molecular computer that would be 100 billion times more energy efficient than today’s most energy efficient computers. The design is a mechanical computer whose tiny physical gates, built at the molecular level, mechanically open and close to form circuits. This structure would allow the basic building blocks of computing, such as AND, NAND, NOR, NOT, OR, XNOR and XOR gates, to be created without electronic components.
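
To make the idea of building computation from simple gates concrete, here is a minimal software sketch, not the Merkle/Freitas mechanical design itself, showing how every gate listed above can be composed from a single two-input primitive (NAND). Any physical element that can implement that one open/close behavior is, in principle, enough to build a full computer.

```python
# Minimal sketch: all of the listed logic gates composed from one primitive (NAND).
# This illustrates why a simple mechanical open/close element is sufficient for
# general computation; it is not a model of the actual molecular hardware.

def NAND(a: bool, b: bool) -> bool:
    return not (a and b)

def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def NOR(a, b):  return NOT(OR(a, b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))
def XNOR(a, b): return NOT(XOR(a, b))

if __name__ == "__main__":
    # Quick truth-table check that the derived gates behave as expected.
    for a in (False, True):
        for b in (False, True):
            assert AND(a, b) == (a and b)
            assert OR(a, b) == (a or b)
            assert NOR(a, b) == (not (a or b))
            assert XOR(a, b) == (a != b)
            assert XNOR(a, b) == (a == b)
    print("All derived gates match their truth tables.")
```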

Today’s computers create heat due to the electrical resistance in components like transistors, and it’s this resistance that drives the huge electricity bills needed to run, and then cool, big data centers. A mechanical computer instead creates heat only from the friction of opening and closing its logic gates, and that friction can be nearly eliminated by building the gates at the molecular level.

More Powerful Supercomputers. Scientists at Rice University and the University of Illinois at Urbana-Champaign have developed a process that significantly lowers the power requirements of supercomputers while making them more efficient. The process uses a mathematical technique developed in the 1600s by Isaac Newton and Joseph Raphson to cut down on the number of calculations a computer performs. Computers normally carry every calculation out to the seventh or eighth decimal place, but using the Newton-Raphson method they can work to only the third or fourth decimal place while also increasing the accuracy of the results by three orders of magnitude (1,000 times).
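
For readers unfamiliar with it, Newton-Raphson is an iterative method: each step refines a guess x for a root of a function f by computing x - f(x)/f'(x). The sketch below is a generic illustration of that iteration (the example function and tolerance values are mine, not taken from the Rice/Illinois work); it shows how accepting a looser tolerance, i.e. fewer decimal places, takes fewer iterations, which is the intuition behind the speed and energy savings.

```python
# Generic Newton-Raphson iteration: x_next = x - f(x) / f'(x).
# A looser tolerance (fewer decimal places of precision) converges in fewer steps.

def newton_raphson(f, df, x0, tol, max_iter=50):
    x = x0
    for i in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x, i + 1   # root estimate and iterations used
    return x, max_iter

if __name__ == "__main__":
    # Illustrative example: solve x**2 - 2 = 0, i.e. compute sqrt(2).
    f  = lambda x: x * x - 2.0
    df = lambda x: 2.0 * x
    for tol in (1e-4, 1e-8):
        root, iters = newton_raphson(f, df, x0=1.0, tol=tol)
        print(f"tolerance {tol:.0e}: root ~ {root:.10f} in {iters} iterations")
```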

This method drastically reduces the amount of time needed to process data, which makes the supercomputer faster while also cutting the energy needed to perform a given calculation. This has huge implications for complex simulations such as weather forecasting programs, which require crunching enormous amounts of data. Such programs can be run much more quickly while producing significantly more accurate results.