#205 On Leveraging AI To Grapple with Humanity’s Biggest Questions with NVIDIA’s VP of Solutions Architecture and Engineering, Marc Hamilton

NVIDIA is best known for producing the GPUs and APIs that enable high-performance computing and supercomputing, and that power some of the most demanding applications in the world. Unsurprisingly, the company is deeply involved in AI, and on this week’s podcast, the company’s VP of Solutions Architecture and Engineering, Marc Hamilton, joins us to share its unique insights into the field.

Hamilton explains how NVIDIA’s innovative AI Factory concept allows it to introduce efficiencies into the data-gathering process. He uses the example of self-driving cars to show just how effective the NVIDIA approach is. To manually collect data on all the roads in the world, researchers would need to travel 11 billion miles. Instead, NVIDIA can leverage simulations of roads to “teach” the AI powering these cars synthetically.

Hamilton then describes the fascinating advancements in digital twins, an idea that has been around for decades but is only now supported by technology powerful enough to handle the AI and other processing it requires. This, Hamilton says, can be used for everything from workplace layout simulations that “test” an environment to make sure it’s safe before real humans work in it, through to creating a digital “twin” of the Earth as a way of testing the impact of climate change “700 or 7,000” days down the track.

“It's going to be many years before we're done, but we're already making some interesting progress and seeing some interesting early signs of future success,” he said.

With AI certain to be critical to how humanity grapples with the increasingly complex challenges it faces in the future, companies like NVIDIA will be at the forefront of our response. Tune in to learn more about the very cutting edge of AI.

Enjoy the show!

About NVIDIA

Since its founding in 1993, NVIDIA (NASDAQ: NVDA) has been a pioneer in accelerated computing. The company’s invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined computer graphics and ignited the era of modern AI. NVIDIA is now a full-stack computing company with data-center-scale offerings that are reshaping industry. More information at https://nvidianews.nvidia.com/

 

About GTC (GPU Technology Conference)

The Technology Conference for the Era of AI and Metaverse

Explore the latest technologies and business breakthroughs.

Learn from experts how AI and the evolution of the 3D Internet are profoundly impacting industries—and society as a whole.

Don’t miss the GTC 2022 keynote.

Jensen Huang | Founder and CEO | NVIDIA

Take a closer look at the game-changing technologies that are helping us take on the world’s greatest challenges.

Registration is free, and it's an event you do not want to miss!

Join us in Melbourne for Scaling AI with MLOps: https://www.datafuturology.com/mlops

Thank you to our sponsor, Talent Insights Group!

Join our Slack Community: https://join.slack.com/t/datafuturologycircle/shared_invite/zt-z19cq4eq-ET6O49o2uySgvQWjM6a5ng

NVIDIA didn’t invent AI. The basic algorithms of modern-day AI go back 30 or 40 years; you can find research papers on deep neural networks going back 30 years. What happened about 10 years ago is that, around the same time, several universities and several students sort of rediscovered those papers and those algorithms. They were aware of CUDA and GPUs and the computing power of those GPUs, and they sort of put two and two together.
— Marc Hamilton, NVIDIA’s VP of Solutions Architecture and Engineering

WHAT WE DISCUSSED

0:00 Introduction
05:51 I was wondering if we could dive into the concept of the AI factory?
10:49 What would you say are some of the gotchas that people should be aware of in that journey of developing an AI factory?
15:33 Could you tell us a little bit about what Nvidia has been doing in that area?
22:01 When they have a 3D representation of a plant or factory and then they overlay the information coming in from sensors, what type of doors does that open? What can they do from that point on?
27:46 I know that you guys have been thinking about the sustainability side of all this. Could you tell us a little bit about that side?
36:42 How can people get involved in getting some help to look at areas like the ones we've just spoken about today?

EPISODE HIGHLIGHTS

  • 2012 is when AI started to take off. But the one thing that hasn't changed is our GPU platform; it has been consistent ever since. In fact, if you were writing CUDA programs 10 years ago on a 10-year-old GPU, they'll still run today. And I wouldn't be surprised if some of the audience out there even have one or two of those old GPUs still running.

  • It was that availability of large amounts of data, which of course, by 10 years ago, we had thanks to the internet; the compute power of the GPU; and this 30-year-old algorithm sort of reinvented and rebuilt to run on top of the GPU. And then we just saw AI explode.

  • An AI factory is just a modern way of developing AI code. The inputs, the raw materials, are the data. You still apply domain expertise; the AI doesn't write itself. You have to generate what's called an AI model, and the AI model is what informs the deep neural network how to actually take that input data and transform it into a program.

  • At NVIDIA, in our data factory, we not only process the data collected by our self-driving test cars, but we're actually generating data synthetically. And if you think about that, it sounds really complex.

  • The concept of a digital twin isn't new; it has been around a long time. But it was really impossible in the past to put together a true digital twin at large scale.

  • If we think of the metaverse in social networking, it just doesn't involve moving atoms; you're just sort of sitting on your couch, on your laptop or your phone. But in the enterprise world, the metaverse is going to be all about moving atoms, and then overlaying the bits, the AI, the software, the simulation, on top of that, and having the bits match the atoms.

  • We can't solve it today. We know that we need to be able to compute about a million times faster. So building a global climate model inside Omniverse, in effect building a digital twin of the Earth, is a project that we kicked off about a year ago. It's going to be many years before we're done, but we're already making some interesting progress and seeing some interesting early signs of future success with Omniverse. We call that Project Earth-2.


At Data Futurology, we are always working to bring you use cases, new approaches and everything related to the most relevant topics in data science to help you get the most value out of these technologies! Check out our upcoming events for more amazing content.