Moore’s law describes a phenomenon that has arguably influenced our modern world more than any other. The exponential growth of computing performance over the past 40 years has reshaped our lives in myriad ways, empowering all manner of wonders from the world wide web to smartphones to the Internet of Things.
But Moore’s law has been fading fast, if it has not already expired. The big, as-yet-unanswered question is to what extent this matters.
In a 1965 article, Gordon Moore, who would go on to co-found Intel, observed that the number of components that could be crammed onto an integrated circuit doubled every year. In 1975, he revised the doubling period to two years, and his forecast, subsequently adopted as a “law”, has pretty much held true ever since.
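The compounding power of that two-year doubling is easy to underestimate. A minimal sketch of the arithmetic, using the roughly 2,300-transistor Intel 4004 of 1971 as an illustrative baseline (the baseline figures, not the doubling rule, are the assumption here):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count per chip under Moore's 1975
    formulation: the count doubles every two years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Two years on, the count has doubled once.
print(transistors(1973))                        # 4600.0

# Over 40 years, 20 doublings compound to a roughly
# million-fold increase.
print(transistors(2011) / transistors(1971))    # 1048576.0
```

The point of the example is simply that a steady doubling turns a modest starting count into billions of transistors within a few decades, which is why the law's slowdown matters so much.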