Season 2 Ep 22 Geoff Hinton on revolutionizing artificial intelligence... again
The Robot Brains Podcast

Published on Jun 1, 2022

Over the past ten years, AI has experienced breakthrough after breakthrough in everything from computer vision to speech recognition, protein folding prediction, and so much more.

Many of these advancements hinge on the deep learning work conducted by our guest, Geoff Hinton, who has fundamentally changed the focus and direction of the field. A recipient of the Turing Award, the equivalent of the Nobel Prize for computer science, his work has been cited over half a million times.

Hinton has spent about half a century on deep learning, most of that time researching in relative obscurity. But that all changed in 2012, when Hinton and his students showed that deep learning outperformed every other approach to computer vision at image recognition, and by a very large margin. That result, known as the ImageNet moment, changed the whole AI field: pretty much everyone dropped what they had been doing and switched to deep learning.

Geoff joins Pieter for our two-part season finale, a wide-ranging discussion inspired by insights gleaned from Hinton's journey from academia to Google Brain. The episode covers how existing neural networks and backpropagation differ from how the brain actually works; the purpose of sleep; and why it's better to grow our computers than to manufacture them.

What's in this episode:

00:00:00 - Introduction
00:02:48 - Understanding how the brain works
00:06:59 - Why we need unsupervised local objective functions
00:09:39 - Masked auto-encoders
00:10:55 - Current methods in end-to-end learning
00:18:36 - Spiking neural networks
00:23:00 - Leveraging spike times
00:29:55 - The story behind AlexNet
00:36:15 - Transition from pure academia to Google
00:40:23 - The secret auction of Hinton's company at NeurIPS
00:44:18 - Hinton’s start in psychology and carpentry
00:54:34 - Why computers should be grown rather than manufactured
01:06:57 - The function of sleep and Boltzmann Machines
01:11:49 - Need for negative data
01:19:35 - Visualizing data using t-SNE

Links:
Geoff's Bio: https://en.wikipedia.org/wiki/Geoffre...
Geoff's Twitter: https://twitter.com/geoffreyhinton?la...
Research and Publications: https://bit.ly/3z3M54e
Google Scholar Citations: https://bit.ly/3N892HJ
Story Behind the 2012 NIPS Auction: https://bit.ly/3t9xsIN
GLOM: https://bit.ly/3lYgWr6
Vector Institute: https://vectorinstitute.ai/

SUBSCRIBE TODAY:

Apple: https://apple.co/3NLtQED
Spotify: https://spoti.fi/3GBDpDM
Amazon: https://amzn.to/3NHlQoa
Google: https://bit.ly/3aD7ZkN
Acast: https://bit.ly/3x6ZYfw

Host: Pieter Abbeel
Executive Producers: Alice Patel & Henry Tobias Jones
Production: Fresh Air Production
