Our Director of ML Architecture, Lawrence Spracklen, is speaking at the Stanford SystemX seminar on October 11th at Stanford University. He will be talking about how Numenta's recent innovations in AI research enable order-of-magnitude performance speedups and energy savings.
The Stanford SystemX Alliance is a collaboration between Stanford University and member industrial firms to produce world-class research and Ph.D. graduates with a view to enabling truly ubiquitous sensing, computing and communication with embedded intelligence. Learn more here.
Abstract:
Although today's deep learning networks achieve state-of-the-art accuracy on numerous tasks, they incur rapidly increasing computational complexity and cost. We present novel techniques that achieve over 100X improvements in deep learning performance while preserving the accuracy of the resulting model. We discuss our recent innovations in AI research, inspired by neocortical structure and function, that enable deep learning networks to outperform standard networks by two orders of magnitude on existing CPUs, GPUs, and FPGAs. We discuss how these techniques can improve the throughput, latency, and energy requirements of both convolutional and transformer networks. Our techniques can be combined synergistically with other standard optimization methods to achieve further multiplicative improvements in performance.