Moore’s law: the famous rule of computing has reached the end of the road, so what comes next?
For half a century, computing advanced in a reassuringly predictable way. Transistors – the tiny switches that control electrical signals on a computer chip – kept getting smaller. As they shrank, chips became faster, and society absorbed the gains almost without noticing.
Faster chips meant more computing power. Scientific simulations improved, weather forecasts became more accurate, graphics grew more realistic and, later, machine learning systems emerged and flourished. It looked as if computing power itself obeyed a natural law.
This phenomenon became known as Moore’s Law, after Gordon Moore, the scientist and businessman who co-founded Intel. Moore’s Law summarised the empirical observation that the number of transistors on a chip approximately doubled every couple of years. Because each generation of transistors is also smaller, the same trend drives miniaturisation: more capability fits into ever smaller devices.
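To put the observation in rough arithmetic terms (an illustrative formalisation of the doubling described above, not Moore’s own wording): if a chip holds $N_0$ transistors today, then after $t$ years it holds about

$$N(t) \approx N_0 \cdot 2^{\,t/2},$$

so a decade of doubling every two years multiplies the transistor count by roughly $2^{5} = 32$.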
That sense of certainty and predictability has now gone – not because innovation has stopped, but because the physical assumptions that once underpinned it no longer hold.
So what replaces the old model of automatic speed increases? The answer is not a single breakthrough, but several overlapping strategies.
One involves new materials…