More New Technologies

I periodically report on new technologies that I find interesting. This past week I ran across several that seem genuinely revolutionary and that could all bring significant improvements to our lives.


First is a new green technology. Scientists at the University of Toronto have developed something they are calling colloidal quantum dots. These new materials have the potential to revolutionize solar cell technology. Today's solar cells all work by juxtaposing two types of materials: an n-type material that is rich in electrons and a p-type material that is poor in electrons. Sunlight creates a current by giving electrons the energy to move from the electron-rich material to the electron-poor one. However, until now, n-type materials have lost potency when exposed to air and thus have had to be sealed inside the solar panels we are familiar with. With colloidal quantum dots we might be able to have cheap solar cells everywhere. Picture them embedded in outdoor paint, so that every roof, home, bridge, or cell tower could be generating electricity.


Next is Spansion, which has developed and is manufacturing energy-harvesting chips that generate enough electricity to power themselves. They produce small amounts of power through techniques such as capturing vibrations, sunlight, or differences in heat. This is one of the breakthroughs that has been needed to unleash the Internet of Things. Without it, every IoT sensor would need its own battery, and replacing those batteries has been a cost barrier to realistic deployment of sensor networks. Self-powering chips make it possible to deploy sensor networks that can monitor crops, herds, pollution, or just about anything else.


Another big breakthrough comes from HP, which is calling it "The Machine". It brings together a number of technologies that are going to revolutionize the computers we use to process large amounts of data. HP has designed a new computer from scratch. It uses specialized core processors rather than a series of generic ones. It will use photonics rather than electronics, eliminating copper wiring. It will use memristors for a unified memory that is as fast as RAM but that can also store data persistently like a flash drive. And it has a 3D architecture that packs components closer together than traditional flat chipsets allow.


All of these changes should result in servers that are about six times faster than today's best server while using only one-eightieth of the power and significantly less space. The reduced need for power is probably the most significant aspect. Today's data centers have often been built where power is cheapest, but cut power consumption by a factor of 80 and almost any closet can become a small data center. It's been reported that both Google and Amazon are working on their own new servers, and they may well be doing something similar, but HP is the first to announce specifics. HP hopes to be able to ship these machines by 2018.
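Taking HP's claimed figures at face value, the two improvements compound: the gain in performance per watt is the speedup divided by the fraction of power consumed. A quick back-of-envelope check:

```python
# Back-of-envelope math on HP's claimed figures for The Machine:
# roughly 6x the speed at 1/80th the power of today's best server.
speedup = 6
power_fraction = 1 / 80

# Performance per watt improves by both factors combined.
perf_per_watt_gain = speedup / power_fraction
print(perf_per_watt_gain)  # -> 480.0
```

In other words, if the claims hold, each watt of data-center power would do roughly 480 times the work it does today.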


Finally, math gets a headline because a company called Code On Technologies promises to use it to speed up existing data transmissions. Most people probably don't realize how much time and energy is spent today reassembling a data stream at the receiving end. Packets are essentially numbered, and the receiver must wait until it has found all of the needed packets before passing the data on. The process is quite inefficient because of the searches for missing packets, and this reassembly happens over and over as a piece of data hops from device to device across the Internet.


Code On has developed a technique that instead codes the data into a mathematical equation. Rather than numbering the bits, it assigns each packet an identity in terms of the solution to an equation. On the receiving end, there no longer has to be a constant search for missing packets: the receiver can reconstruct what the missing packets contained by solving the equation. This sounds esoteric, but by vastly improving the process of reconstructing data at each receiving end, it could improve transmission speeds on current networks by as much as twenty times. That could mean much faster satellite or WiFi networks without having to change those networks. It makes a math nerd smile!
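The idea above can be made concrete with a toy sketch of linear network coding. Code On's actual scheme is proprietary, so everything here is illustrative: the sender transmits linear combinations of the source packets (the "equations"), and the receiver recovers the originals from any sufficiently independent subset of what arrives, with no hunting for one specific missing packet. Real systems typically work in GF(2^8) and pick coefficients at random; this sketch uses the prime field GF(257) and fixed coefficients to keep the math readable.

```python
# Toy sketch of linear network coding over the prime field GF(257).
# (Illustrative only -- field choice and coefficients are assumptions,
# not Code On's actual algorithm.)

def encode(packets, coeffs_list, p=257):
    """Each coded packet is a linear combination of the source packets."""
    coded = []
    for coeffs in coeffs_list:
        payload = [sum(c * pkt[i] for c, pkt in zip(coeffs, packets)) % p
                   for i in range(len(packets[0]))]
        coded.append((coeffs, payload))
    return coded

def decode(coded, k, p=257):
    """Gaussian elimination mod p: any k independent coded packets
    recover the k originals -- the receiver solves the equations
    instead of waiting for specific numbered packets."""
    rows = [list(c) + list(v) for c, v in coded]
    for col in range(k):
        pivot = next(r for r in range(col, len(rows)) if rows[r][col] % p)
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = pow(rows[col][col], p - 2, p)        # modular inverse (Fermat)
        rows[col] = [x * inv % p for x in rows[col]]
        for r in range(len(rows)):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % p for a, b in zip(rows[r], rows[col])]
    return [rows[i][k:] for i in range(k)]

source = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]   # k = 3 packets
# Send 5 coded packets; real network coding draws coefficients at random.
coded = encode(source, [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1], [1, 2, 3]])
survivors = coded[2:]                   # any 3 independent ones that arrive
print(decode(survivors, 3) == source)   # -> True
```

Note that it doesn't matter *which* three coded packets survive, only that three independent ones do; that is what removes the per-packet retransmission dance the article describes.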
