28th February 2023

Nvidia CEO: "We're going to accelerate AI by another million times"

In a recent earnings call, the boss of Nvidia Corporation, Jensen Huang, outlined his company's achievements over the last 10 years and predicted what might be possible in the next decade.

 

AI predictions 2030 (Credit: Sergey Nivens)

 

For many years, Nvidia was known mainly for its gaming and video industry hardware. In 1999, the company released its GeForce line-up, marketed as the world's first graphics processing units (GPUs). These allowed gamers and video editors to enjoy major improvements in visual quality and performance by offloading graphics work to a dedicated chip, rather than relying solely on the general-purpose central processing unit (CPU). Over the years, GPUs followed an exponential improvement trend, eventually handling many billions of calculations per second.

Subsequently, researchers began to realise the potential of GPUs in the field of artificial intelligence. Although slower and weaker than their CPU cousins for general-purpose applications, their architecture excels at certain tasks – in particular, running many independent computations in parallel. Beyond rendering graphics, this made them better suited to machine learning algorithms, which need enormous throughput across many small calculations at once, whereas a CPU must crunch through them largely one at a time.
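To make this concrete, the snippet below is a minimal sketch – it assumes PyTorch and, optionally, a CUDA-capable graphics card, neither of which is mentioned in the article – that times the same large matrix multiplication on a CPU and on a GPU. Workloads like this, built from huge numbers of independent multiply-and-add operations, are exactly where a GPU's parallel architecture pulls ahead.

# Minimal sketch: time the same matrix multiplication on a CPU and, if one is
# available, on a CUDA GPU. Assumes PyTorch is installed; neither the library
# nor the hardware is specified in the article.
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

start = time.perf_counter()
torch.matmul(a, b)                       # CPU: a handful of powerful cores share the work
cpu_time = time.perf_counter() - start

if torch.cuda.is_available():
    a_gpu, b_gpu = a.to("cuda"), b.to("cuda")
    torch.matmul(a_gpu, b_gpu)           # warm-up call so one-off start-up costs aren't timed
    torch.cuda.synchronize()
    start = time.perf_counter()
    torch.matmul(a_gpu, b_gpu)           # GPU: thousands of simpler cores work in parallel
    torch.cuda.synchronize()             # kernels run asynchronously, so wait before stopping the clock
    gpu_time = time.perf_counter() - start
    print(f"CPU: {cpu_time:.3f} s   GPU: {gpu_time:.3f} s")
else:
    print(f"CPU: {cpu_time:.3f} s   (no CUDA device found)")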

The early 2010s marked the beginning of the deep learning era – a rapid acceleration of AI that continues to this day. Nvidia played a central role in what some called the "big bang" of deep learning, as researchers combined GPUs with neural networks. In 2016, the company released its DGX-1 server, described as a "supercomputer in a box," designed specifically for machine learning work. These developments (and more) paved the way for advancements in AI that included the likes of GPT-3, DALL·E, and more recently ChatGPT.

Fast forward to today, and CEO Jensen Huang is optimistic that the recent momentum in AI can be sustained for at least the next decade. During the company's latest earnings call, he explained that Nvidia's GPUs had boosted AI processing by a factor of one million in the last 10 years.

"Moore's Law, in its best days, would have delivered 100x in a decade. By coming up with new processors, new systems, new interconnects, new frameworks and algorithms and working with data scientists, AI researchers on new models – across that entire span – we've made large language model processing a million times faster," Huang said.

"ChatGPT is a wonderful piece of work, and the team did a great job, OpenAI did a great job with it," he added. "People were surprised by [...] the capability of a single AI model that can perform tasks and skills that it was never trained to do. And for this language model to not just speak human language, but output Python, output Cobalt, a language that very few people even remember, output Python for Blender, a 3D program. So it's a program that writes a program for another program.

 


 

"The world now realises that maybe human language is a perfectly good computer programming language, and that we've democratised computer programming for everyone, almost anyone who could explain in human language a particular task to be performed. This new computing platform [can] take whatever your prompt is – whatever your human-explained request is – and translate it to a sequence of instructions, or it waits for you to decide whether you want to process it or not.

"And so this type of computer is utterly revolutionary in its application, because it's democratised programming to so many people, and really has excited enterprises all over the world. The activity around the AI infrastructure that we build [...] to inference large language models, has just gone through the roof in the last 60 days. And so there's no question that whatever our views are of this year as we enter the year has been fairly, dramatically changed as a result of the last 60, 90 days."

Looking to the future, he speculated: "Over the course of the next 10 years, I hope through new chips, new interconnects, new systems, new operating systems, new distributed computing algorithms and new AI algorithms and working with developers coming up with new models, I believe we're going to accelerate AI by another million times."

"I believe the number of AI infrastructures are going to grow all over the world," he continued. "And the reason for that is AI; the production of intelligence is going to be manufacturing. There was a time when people manufactured just physical goods. In the future, almost every company will manufacture soft goods in the form of intelligence."

"I expect to see AI factories all over the world. There will be some that are large, some that are mega-large, and some that are smaller. My expectation is that you're going to see really gigantic breakthroughs in AI models in the next company, the AI platforms in the coming decade."

 
