I remember many articles in 2016 lamenting that Moore’s Law was dead, spelling the end of an era in which U.S. technology constantly improved on the back of ever-faster generations of computers. Moore’s Law is named after Gordon Moore, an engineer who later became one of the founders of Intel. In 1965, Moore observed that the number of transistors that could be squeezed into a given area of an integrated circuit was doubling roughly every two years. He predicted the trend would last for perhaps another decade, but the microchip industry fulfilled his prediction for over 50 years.
In 1965, a single transistor cost about $9 in today’s dollars; we can now put billions of transistors onto a chip at a tiny fraction of a cent each. The belief that chips could always be improved helped launch Silicon Valley and enabled the huge array of technological changes brought about by cheap computer chips. The companies that make chips thrived by creating a new generation of chips every few years, each a significant leap forward in computing power.
It’s now clear that Moore’s Law is dead, or nearly so, and computing is not going to improve much from denser transistors. But that doesn’t mean computing can’t get faster; there are new strategies for developing better computers and chips.
One path is to use new materials and devices that can improve the computing process.
- Researchers have been exploring new materials for chips such as graphene and carbon nanotubes.
- Scientists are pursuing optical chips that use light instead of electricity for processing inside the chip. This eases the bottleneck of moving data into and out of a chip.
- Spintronics research is looking at using the characteristics of the spin direction of electrons as a way to create much higher density data storage.
- Tunnel field-effect transistors switch between 1s and 0s using quantum tunneling instead of the thermionic emission that today’s transistors rely on. This has the potential to create chips that use far less power.
Another path for future improvements is to develop new models of computing.
- Quantum computing uses quantum bits that can hold multiple states at once, opening the possibility of processing many calculations simultaneously.
- Neuromorphic computing models the computing process after systems in the human brain and nervous system.
- Adiabatic computing uses reversible circuits that have as many outputs as inputs. Since each input can be reconstructed from the outputs, no information is lost, and in principle such circuits can dissipate almost no heat.
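The reversibility idea above can be illustrated with the Toffoli (controlled-controlled-NOT) gate, a standard building block of reversible logic. This short Python sketch (an illustration, not taken from any particular chip design) shows that the gate maps every 3-bit input to a unique output and is its own inverse, so the inputs can always be reconstructed and no bits are lost:

```python
from itertools import product

def toffoli(a, b, c):
    """Reversible gate: flip c only when both a and b are 1."""
    return a, b, c ^ (a & b)

# Every 3-bit input maps to a distinct output: the mapping is a bijection.
outputs = {toffoli(a, b, c) for a, b, c in product((0, 1), repeat=3)}
assert len(outputs) == 8

# The gate is its own inverse, so the original inputs are recoverable.
for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits
```

Because an irreversible gate like AND collapses several inputs into one output, erasing the lost bits necessarily dissipates heat; a reversible gate avoids that erasure entirely.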
Another approach is to develop a more efficient architecture and packaging of chips.
- There have been some significant improvements in building three-dimensional chips that consist of stacked layers of chips.
- Reconfigurable computing is an architecture that can speed up complex processing by using components that can change function or spatial configuration during the computing process.
- Dark silicon computing powers down any portion of a chip that is not being used to conserve power.
- Superconducting computers operate at extremely low temperatures that dramatically cut power usage and speed up calculations.
- Near-threshold voltage computing saves power by running chips at a supply voltage just above the transistor threshold, near the point of peak energy efficiency.
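The trade-off behind near-threshold operation can be sketched with a toy model (all constants here are invented for illustration): dynamic energy falls with the square of the supply voltage, while leakage energy rises as the circuit slows down near the threshold, so total energy per operation bottoms out just above the threshold voltage rather than at the nominal supply.

```python
VT = 0.3  # hypothetical threshold voltage, in volts

def energy_per_op(v, c=1.0, leak=0.05):
    """Toy model: total energy = dynamic (C*V^2) + leakage (power * delay)."""
    dynamic = c * v * v
    delay = v / (v - VT) ** 2       # delay grows sharply as v approaches VT
    leakage = leak * v * delay      # slow circuits leak for longer per op
    return dynamic + leakage

# Sweep the supply voltage: the minimum-energy point sits near, but
# safely above, the threshold -- well below the nominal 1.0 V supply.
voltages = [VT + 0.05 * i for i in range(1, 15)]
best = min(voltages, key=energy_per_op)
```

With these made-up constants the sweep finds its minimum around the mid-0.5 V range, using roughly half the energy per operation of a 1.0 V supply, which is the qualitative behavior near-threshold designs exploit.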
The biggest bottleneck today in creating the next generations of better computing is the typical 10-year window required to go from an idea in the lab to chips in production. If we want to stay on the path of predictably better computers, we’ll have to find a way to speed up this process.



