Designed to promote collaboration, our Visiting Scholar Program lets researchers and professors spend time at our offices and learn about Numenta’s work in depth while continuing their normal research. I asked Niels Leadholm, one of Numenta’s first “virtual” interns, to share his work and his experience interning at Numenta.
In a recent technology demonstration, we showed that brain-derived sparse network algorithms were 50 times faster than dense networks while using significantly less power. In this blog post, we walk through visualizations of our results that show how sparsity can enable massive energy savings and lower costs.
Numenta Research Staff Member Lucas Souza continues his series on sparse neural networks. He reviews current techniques for training sparse networks from scratch and gives updates on dynamic sparsity, in which networks learn their sparse structure during training. Finally, he walks through the implications for hardware.
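To make the dynamic-sparsity idea concrete, here is a minimal NumPy sketch of one prune-and-regrow step in the style of SET (Sparse Evolutionary Training): drop the smallest-magnitude active weights, then regrow the same number of connections at random inactive positions. The function name and parameters are illustrative assumptions, not Numenta’s implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_and_regrow(weights, mask, prune_frac=0.3):
    """One SET-style dynamic-sparsity step (illustrative sketch).

    Prunes the smallest-magnitude active weights, then regrows the
    same number of connections at random inactive positions, so the
    layer's connection count (its sparsity level) is conserved."""
    active = np.flatnonzero(mask)
    k = int(len(active) * prune_frac)
    # Prune: zero out the k active connections with smallest |weight|
    drop = active[np.argsort(np.abs(weights.ravel()[active]))[:k]]
    mask.ravel()[drop] = False
    weights.ravel()[drop] = 0.0
    # Regrow: enable k randomly chosen inactive connections
    inactive = np.flatnonzero(~mask.ravel())
    grow = rng.choice(inactive, size=k, replace=False)
    mask.ravel()[grow] = True
    weights.ravel()[grow] = rng.normal(0.0, 0.01, size=k)
    return weights, mask

# Demo: an 8x8 layer at ~10% density keeps its connection count
w = rng.normal(size=(8, 8))
m = rng.random((8, 8)) < 0.1
w *= m                      # zero the inactive weights
n_before = m.sum()
w, m = prune_and_regrow(w, m)
assert m.sum() == n_before  # sparsity level is unchanged
```

The key property, and the reason this matters for hardware, is that the update only moves connections around: total memory and compute per step stay fixed at the sparse budget while the network searches for a better structure.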
Numenta Research Staff Members Lucas Souza and Michaelangelo Carporale attended ICLR 2020 and share their takeaways from the best sessions on neuroscience, deep learning theory, and pruning and sparsity.
Led by Jeff Hawkins, the Numenta Research team has been livestreaming their research meetings for a year. This post highlights how opening research meetings to the public has led to important collaborations with other researchers.