The most important concept for understanding AGI trends is that the doubling time of AI compute is shortening.
Moore's law chugs along at a sedate exponential pace, doubling every two years or so: a self-fulfilling prophecy we have grown accustomed to for over 50 years, one that has brought us ever more powerful PCs and smartphones.
Competing teams worldwide bet on different approaches to advance the same mission. Given how noisy the world is, it is very hard for outsiders to realize early on that these compounding efforts produce a nonlinear result.
Similarly, in its 2019 AI report, Stanford jumped to a naive conclusion about the "two eras of AI compute" based on a coarse data set. They treated it as a static picture with a single shift. What we are actually seeing is an ongoing, dynamic phenomenon.
I call this the paradigm of Jolting Technologies.
Today, AI compute is growing at close to 1 OOM/y (one order of magnitude per year). In the two years it takes Moore's law to double, AI compute increases a hundredfold. And tomorrow it will be faster still!
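As a rough illustration (assuming a steady 1 OOM/y for AI compute and a clean two-year doubling time for Moore's law, both simplifications), the gap over the same two-year window looks like this:

```python
# Sketch: growth factors over the same two-year window.
years = 2

# Moore's law: doubles every 2 years.
moore_factor = 2 ** (years / 2)

# AI compute at 1 OOM/y: grows 10x per year.
ai_factor = 10 ** years

print(moore_factor)  # 2.0  (a doubling)
print(ai_factor)     # 100  (a hundredfold increase)
```

The same two years that give Moore's law a factor of 2 give the 1 OOM/y trend a factor of 100, which is the hundredfold figure above.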
Going back to the first charts from Jensen Huang's keynote presentations: had NVIDIA been on a merely exponential trend, in 10 years it would have improved performance less than a thousandfold. Instead, performance improved ten-millionfold.
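To see why a fixed doubling time cannot explain that gap, compare the two growth modes over a decade (a sketch: the once-a-year doubling used for the "merely exponential" case is an illustrative assumption, and the ten-millionfold figure is taken from the claim above):

```python
import math

decade_years = 10

# "Merely exponential": doubling once per year (illustrative assumption).
exponential_gain = 2 ** decade_years      # 1024, i.e. about a thousandfold

# Gain claimed for NVIDIA over the same decade.
observed_gain = 10 ** 7                   # ten-millionfold

# Average doubling time implied if that gain came from a single exponential.
doublings = math.log2(observed_gain)      # ~23.3 doublings in 10 years
doubling_time_months = decade_years * 12 / doublings

print(exponential_gain)                   # 1024
print(round(doubling_time_months, 1))     # ~5.2 (months per doubling)
```

A ten-millionfold gain in a decade would require roughly 23 doublings, an average doubling time of about five months; and if the doubling time itself is shortening, no single exponential fits the data at all.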