Between 2012 and 2018, the computing power used to train the largest artificial-intelligence systems increased more than three hundred thousand-fold. Had it merely followed Moore’s law, the increase would have been far smaller, roughly 7-fold.
Stanford University’s 2019 report on AI and a recent OpenAI blog post both analyzed and illustrated this trend.
A simple piecewise fit reveals two segments: an earlier one in which computing power doubled every two years, and a more recent one in which it doubles every three to four months.
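The gap between the two regimes is easy to check with back-of-the-envelope arithmetic. The sketch below (the function name and parameters are illustrative, not from any cited report) computes the total growth factor over a six-year span for each doubling period; a four-month doubling, the slower end of the range above, already yields a factor in the hundreds of thousands.

```python
def compute_growth(years: float, doubling_period_months: float) -> float:
    """Total growth factor after `years`, given a doubling period in months."""
    return 2 ** (years * 12 / doubling_period_months)

# Moore's-law regime: doubling every 2 years over 2012-2018
print(compute_growth(6, 24))  # 8.0, i.e. roughly the 7x figure above

# Recent regime: doubling every 4 months over the same span
print(compute_growth(6, 4))   # 262144.0, on the order of 300,000x
```

Shortening the doubling period to three months would push the six-year factor into the tens of millions, which is why the doubling period, not the raw growth number, is the quantity worth watching.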
But it is much more useful to think in terms of an accelerating growth of computing power.
Framing it this way better prepares us for future changes: further jumps in acceleration, for example when quantum computers are used by AI systems to design more powerful versions of themselves.
Will the doubling period shrink to a few weeks, or a few hours?