In this meetup, we discuss alternatives to backpropagation in neural networks.
From the neuroscience side, Prof. Rafal Bogacz of the University of Oxford discusses the viability of backpropagation in the brain and the relationship between predictive coding networks and backpropagation. Prof. Bogacz has published extensively in the field and co-authored a comprehensive review paper, Theories of Error Back-Propagation in the Brain.
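As a concrete (and heavily simplified) illustration of the predictive coding idea, the NumPy sketch below relaxes a small network's activities to minimize a layered prediction-error energy and then applies purely local weight updates; under suitable conditions this scheme approximates the backpropagation update. The architecture, activation, and hyperparameters are arbitrary choices for this sketch, not taken from the talk.

```python
import numpy as np

# Minimal predictive-coding sketch (illustrative, not Prof. Bogacz's code).
# Energy: F = sum_l ||x[l+1] - W[l] @ f(x[l])||^2 / 2
# Inference relaxes the hidden activities; learning then updates weights
# from local prediction errors, approximating backprop's error signals.

def f(x):   # activation
    return np.tanh(x)

def df(x):  # its derivative
    return 1.0 - np.tanh(x) ** 2

rng = np.random.default_rng(0)
sizes = [4, 8, 2]  # input, hidden, output widths (arbitrary)
W = [rng.normal(0, 0.1, (sizes[i + 1], sizes[i])) for i in range(2)]

def train_step(inp, target, lr=0.01, n_inference=50, dt=0.1):
    # Initialize activities with a feedforward pass, then clamp the output.
    x = [inp]
    for Wl in W:
        x.append(Wl @ f(x[-1]))
    x[-1] = target

    # Relax the hidden activities by gradient descent on the energy F.
    for _ in range(n_inference):
        e = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]  # prediction errors
        # The hidden layer combines bottom-up error and top-down feedback.
        x[1] += dt * (-e[0] + df(x[1]) * (W[1].T @ e[1]))

    # Hebbian-like weight updates from the settled, purely local errors.
    e = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
    for l in range(2):
        W[l] += lr * np.outer(e[l], f(x[l]))

inp = rng.normal(size=4)
target = np.array([1.0, -1.0])
for _ in range(100):
    train_step(inp, target)
```

The key point is that no global error signal is transported across the network: each weight update depends only on the error and activity available at its own layer.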
Sindy Löwe from the University of Amsterdam then discusses her latest research on self-supervised representation learning. She is the first author of the paper Putting An End to End-to-End: Gradient-Isolated Learning of Representations, presented at last year's NeurIPS, which shows that networks can learn by maximizing the mutual information between representations at each layer of a model, with every layer trained in isolation.
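A rough sketch of a gradient-isolated training loop in this spirit might look as follows. This is an illustrative PyTorch reconstruction, not the authors' code: the module sizes, the two-view setup, and all hyperparameters are invented for the example. Each module gets its own contrastive (InfoNCE-style) loss and optimizer, and `.detach()` between modules blocks gradient flow, so there is no end-to-end backpropagation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Two gradient-isolated modules, each with its own local optimizer.
modules = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
])
opts = [torch.optim.Adam(m.parameters(), lr=1e-3) for m in modules]

def infonce(z_anchor, z_pos):
    # Score every anchor against every candidate in the batch; the
    # matching (diagonal) pair is the positive, the rest are negatives.
    logits = z_anchor @ z_pos.t()             # (B, B) similarities
    labels = torch.arange(z_anchor.size(0))   # positives on the diagonal
    return F.cross_entropy(logits, labels)

def train_step(x_t, x_tk):
    # x_t, x_tk: two "views" (e.g., nearby patches or time steps), (B, 32).
    for module, opt in zip(modules, opts):
        z_t, z_tk = module(x_t), module(x_tk)
        loss = infonce(z_t, z_tk)             # local, per-module objective
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Detach: the next module sees activations but no gradients.
        x_t, x_tk = z_t.detach(), z_tk.detach()

x_t = torch.randn(16, 32)
x_tk = x_t + 0.1 * torch.randn(16, 32)        # toy positive pair
train_step(x_t, x_tk)
```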
Finally, Jack Kendall, co-founder of Rain Neuromorphics, shows how equilibrium propagation can be used to train end-to-end analog neural networks, which can guide the development of a new generation of ultra-fast, compact, and low-power neural networks supporting on-chip learning.
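To make the equilibrium propagation idea concrete, here is a toy software sketch (in Kendall's work the same principle is realized by the physics of an analog circuit rather than simulated in code): a free relaxation phase, a weakly output-clamped "nudged" phase, and a contrastive, purely local weight update. All sizes and constants below are illustrative assumptions.

```python
import numpy as np

rho = lambda s: np.clip(s, 0.0, 1.0)                  # hard-sigmoid activation
drho = lambda s: ((s >= 0) & (s <= 1)).astype(float)  # its derivative

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 16, 2
W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))

def relax(x, target=None, beta=0.0, steps=200, dt=0.1):
    # Gradient descent on the Hopfield-style energy
    # E = (|h|^2 + |y|^2)/2 - rho(h).(W1 rho(x)) - rho(y).(W2 rho(h)).
    h, y = np.zeros(n_hid), np.zeros(n_out)
    for _ in range(steps):
        dh = -h + drho(h) * (W1 @ rho(x) + W2.T @ rho(y))
        dy = -y + drho(y) * (W2 @ rho(h))
        if target is not None:
            dy += beta * (target - y)                 # weak output nudge
        h, y = h + dt * dh, y + dt * dy
    return h, y

def train_step(x, target, beta=0.5, lr=0.05):
    global W1, W2
    h0, y0 = relax(x)                                 # free phase
    hb, yb = relax(x, target, beta)                   # nudged phase
    # Contrastive, purely local updates: (nudged - free) correlations / beta.
    W1 += (lr / beta) * (np.outer(rho(hb), rho(x)) - np.outer(rho(h0), rho(x)))
    W2 += (lr / beta) * (np.outer(rho(yb), rho(hb)) - np.outer(rho(y0), rho(h0)))

x = rng.random(n_in)
for _ in range(50):
    train_step(x, np.array([1.0, 0.0]))
```

Because each update needs only the difference in local correlations between the two phases, the learning rule maps naturally onto analog hardware, which is what makes the approach attractive for on-chip learning.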
Meetup link: https://www.meetup.com/Brains-Bay/events/262647238/
Brains@Bay Meetups focus on how neuroscience can inspire us to create improved artificial intelligence and machine learning algorithms. Join the discussion here.