
21 Oct 2022

Terabrain systems are coming - and could mean that many paid jobs will disappear.

I was mentioned in an article on the Forbes news site yesterday entitled "Superintelligence May Be Closer Than Most People Think, Says Neuroscientist". It was posted by Calum Chace as a way to draw attention to some ideas I have been proposing recently.

Following on from my recent YouTube video on "AI, Technological Unemployment and Universal Basic Income", I was invited by Calum Chace and David Wood to do a podcast in their series called "London Futurists".

The podcast can be found on Apple Podcasts, Spotify, and on the London Futurists site.

In it, I talk about the question of why human brains consume much less power than artificial neural networks. And I explain my view that the key to artificial general intelligence is a "terabrain" that copies from human brains their sparse-firing networks of spiking neurons.

The timeline of the podcast, which runs just over 30 minutes, gives an overview of some of the points I make:

  • 00.11 Recapping "the AI paradox"
  • 00.28 The nervousness of CTOs regarding AI
  • 00.43 Introducing Simon
  • 01.43 45 years since Oxford, working out how the brain does amazing things
  • 02.45 Brain visual perception as feed-forward vs. feedback
  • 03.40 The ideas behind the system that performed so well in the 2012 ImageNet challenge
  • 04.20 The role of prompts to alter perception
  • 05.30 Drawbacks of human perceptual expectations
  • 06.05 The video of a gorilla on the basketball court
  • 06.50 Conjuring tricks and distractions
  • 07.10 Energy consumption: human neurons vs. artificial neurons
  • 07.26 The standard model would need 500 petaflops
  • 08.40 Exaflop computing has just arrived
  • 08.50 30 MW vs. 20 W (less than a lightbulb)
  • 09.34 Companies working on low-power computing systems
  • 09.48 Power requirements for edge computing
  • 10.10 The need for 86,000 neuromorphic chips?
  • 10.25 Dense activation of neurons vs. sparse activation
  • 10.58 Real brains are event driven
  • 11.16 Real neurons send spikes not floating point numbers
  • 11.55 SpikeNET by Arnaud Delorme
  • 12.50 Why are sparse networks studied so little?
  • 14.40 A recent debate with Yann LeCun of Facebook and Bill Dally of Nvidia
  • 15.40 One spike can contain many bits of information
  • 16.24 Revisiting an experiment with eels from 1927 (Lord Edgar Adrian)
  • 17.06 Biology just needs one spike
  • 17.50 Chips moved from floating point to fixed point
  • 19.25 Other mentions of sparse systems - MoE (Mixture of Experts)
  • 19.50 Sparse systems are easier to interpret
  • 20.30 Advocacy for "grandmother cells"
  • 21.23 Chicks that imprinted on yellow boots
  • 22.35 A semantic web in the 1960s
  • 22.50 The Mozart cell
  • 23.02 An expert system implemented in a neural network with spiking neurons
  • 23.14 Power consumption reduced by a factor of one million
  • 23.40 Experimental progress
  • 23.53 Dedicated silicon: Spikenet Technology, acquired by BrainChip
  • 24.18 The Terabrain Project, using standard off-the-shelf hardware
  • 24.40 Impressive recent simulations on GPUs and on a MacBook Pro
  • 26.26 A homegrown learning rule
  • 26.44 Experiments with "frozen noise"
  • 27.28 Anticipating emulating an entire human brain on a Mac Studio M1 Ultra
  • 28.25 The likely impact of these ideas
  • 29.00 This software will be given away
  • 29.17 Anticipating "local learning" without the results being sent to Big Tech
  • 30.40 GPT-3 could run on your phone next year
  • 31.12 Our interview next year might be, not with Simon, but with his Terabrain
  • 31.22 Our phones know us better than our spouses do
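The power figures in the timeline (30 MW for an exaflop machine versus 20 W for a brain, and the million-fold reduction claimed at 23.14) can be checked with some back-of-envelope arithmetic. The sketch below is purely illustrative, using only the numbers quoted above:

```python
# Back-of-envelope comparison using the podcast's figures (illustrative only):
# an exascale supercomputer draws ~30 MW, a human brain ~20 W.
supercomputer_watts = 30e6   # ~30 MW for an exaflop machine
brain_watts = 20.0           # ~20 W, less than a lightbulb

ratio = supercomputer_watts / brain_watts
print(f"Energy gap: ~{ratio:,.0f}x")  # ~1,500,000x

# The claim at 23.14 is that sparse, event-driven spiking cuts power
# consumption by roughly a factor of a million -- the same order of
# magnitude as this gap.
```

In other words, a million-fold reduction is almost exactly what is needed to close the gap between today's exascale hardware and the brain's 20 W budget.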

The main point overlaps with the first part of my YouTube presentation: although the current generation of deep-learning-based AI systems is too power hungry to compete directly with a human brain, which uses a mere 20 W, there is a strong chance that within a few years it will be possible to simulate neural networks with more neurons than the human neocortex using off-the-shelf hardware. The trick is to use networks of spiking neurons with extremely sparse firing patterns.
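To give a flavour of why sparse, event-driven spiking is so cheap, here is a minimal sketch of an event-driven simulation. Everything about it (the neuron class, the threshold, the toy topology) is a hypothetical illustration, not the Terabrain Project's actual design; the point it demonstrates is that neurons which receive no spikes consume no compute at all:

```python
import heapq

# Illustrative sketch of an event-driven spiking network: work is done
# only when a spike event arrives, so a sparsely firing network does
# almost nothing most of the time. All names and parameters here are
# assumptions made for the example.

THRESHOLD = 1.0  # membrane potential at which a neuron fires

class Neuron:
    def __init__(self):
        self.potential = 0.0
        self.targets = []   # list of (target_index, synaptic_weight)

def run(neurons, initial_spikes, max_events=1000):
    """Process spike events in time order; silent neurons cost nothing."""
    events = list(initial_spikes)   # (time, neuron_index) pairs
    heapq.heapify(events)
    fired = []
    while events and len(fired) < max_events:
        t, idx = heapq.heappop(events)
        fired.append((t, idx))
        for tgt, w in neurons[idx].targets:
            neurons[tgt].potential += w
            if neurons[tgt].potential >= THRESHOLD:
                neurons[tgt].potential = 0.0            # reset after firing
                heapq.heappush(events, (t + 1.0, tgt))  # spike after 1 ms delay
    return fired

# Tiny 3-neuron chain: 0 -> 1 -> 2, each weight strong enough to fire.
net = [Neuron() for _ in range(3)]
net[0].targets = [(1, 1.0)]
net[1].targets = [(2, 1.0)]
print(run(net, [(0.0, 0)]))   # [(0.0, 0), (1.0, 1), (2.0, 2)]
```

Contrast this with a conventional dense artificial network, where every neuron performs a multiply-accumulate on every forward pass whether or not its activity carries any information.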

Although I don't talk about it in the podcast, I think that such systems could mean that many paid jobs will disappear. Fortunately, there's a solution. It's called Universal Basic Income!


