Six years ago, we wrote a blog post comparing Classic AI, Simple Neural Networks, and Biological Neural Networks. Fast forward to today, and it’s no surprise that the terms have continued to evolve. In this blog post, we’ll revisit these approaches, look at how they hold up today, and compare them to one another. We’ll also explore how each approach might address the same real-world problem.
How can we take a step towards the brain’s efficiency without sacrificing accuracy? One strategy is to introduce sparsity. Today I’m excited to share a step in that direction – a 10x parameter reduction in BERT with no loss of accuracy on the GLUE benchmark.
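The post doesn’t spell out how the parameter reduction is achieved, but one common way to introduce weight sparsity is magnitude-based pruning: zero out the smallest-magnitude weights and keep only the largest ones. The sketch below is an illustrative, hypothetical example of that general idea in NumPy – not the actual method used for the BERT result.

```python
import numpy as np

def sparsify_by_magnitude(weights, sparsity=0.9):
    """Zero out the smallest-magnitude weights, keeping roughly the
    top (1 - sparsity) fraction by absolute value."""
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)  # number of weights to drop
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64))
w_sparse = sparsify_by_magnitude(w, sparsity=0.9)
density = np.count_nonzero(w_sparse) / w_sparse.size
print(f"nonzero fraction: {density:.2f}")
```

A 90% sparse layer like this stores (and multiplies by) only a tenth of the original weights, which is where the “10x parameter reduction” framing comes from – assuming the remaining weights can be retrained to recover accuracy.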
In our new pre-print titled “Going Beyond the Point Neuron: Active Dendrites and Sparse Representations for Continual Learning”, we investigated how to augment neural networks with properties of real neurons, specifically active dendrites and sparse representations.
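To make “sparse representations” concrete: a common way to enforce them in a network layer is a k-winners-take-all step, where only the k most active units in a layer keep their activations and the rest are zeroed. The snippet below is a minimal NumPy sketch of that idea, not the exact mechanism from the pre-print.

```python
import numpy as np

def k_winners(x, k):
    """Keep only the k largest activations in each row; zero the rest
    (a k-winners-take-all sparse activation)."""
    out = np.zeros_like(x)
    # column indices of the top-k entries in each row
    top_idx = np.argpartition(x, -k, axis=1)[:, -k:]
    rows = np.arange(x.shape[0])[:, None]
    out[rows, top_idx] = x[rows, top_idx]
    return out

rng = np.random.default_rng(1)
activations = rng.standard_normal((4, 100))
sparse = k_winners(activations, k=5)
print(np.count_nonzero(sparse, axis=1))  # 5 active units per row
```

With k small relative to the layer size, each input activates only a handful of units, so different tasks tend to use mostly non-overlapping subsets of the network – one intuition for why sparse representations help with continual learning.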
Our research meetings are the cornerstone of everything we do. They’re where we share hypotheses, review papers, and often invite other researchers to present their work. Here are our most popular research meetings from the past 12 months – just in case you missed them!
Are you a machine learning researcher looking for better learning algorithms? Interested in how neuroscience research can help inform the development of artificial intelligence systems? Brains@Bay may be the Meetup group for you! Brains@Bay is a meetup hosted by Numenta with the goal of bringing together experts and practitioners at the intersection of neuroscience and AI.