New Numenta Paper Compares HTM to Machine Learning Techniques

Media Alert

REDWOOD CITY, CA — November 14, 2016 — Numerous proposals have been offered for how intelligent machines might learn sequences of patterns, a capability believed to be an essential component of any intelligent system. Researchers at Numenta Inc. have published a new study, “Continuous Online Sequence Learning with an Unsupervised Neural Network Model,” which compares their biologically derived HTM sequence memory to traditional machine learning algorithms.

The paper has been published in the MIT Press journal Neural Computation, 28(11), 2474–2504 (2016). You can read and download the paper here.

Authored by Numenta researchers Yuwei Cui, Subutai Ahmad, and Jeff Hawkins, the new paper serves as a companion piece to Numenta’s breakthrough research offered in “Why Neurons Have Thousands of Synapses, a Theory of Sequence Memory in Neocortex,” which appeared in Frontiers in Neural Circuits in March 2016.

The earlier paper described a biological theory of how networks of neurons in the neocortex learn sequences. In this paper, the authors demonstrate how this theory, HTM sequence memory, can be applied to sequence learning and prediction of streaming data.

“Our primary goal at Numenta is to understand, in detail, how the neocortex works. We believe the principles we learn from the brain will be essential for creating intelligent machines, so a second part of our mission is to bridge the two worlds of neuroscience and AI. This new work demonstrates progress towards that goal,” Hawkins commented.

In the new paper, HTM sequence memory is compared with four popular statistical and machine learning techniques: ARIMA, a statistical method for time-series forecasting (Durbin & Koopman, 2012); extreme learning machine (ELM), a feedforward network with sequential online learning (Huang, Zhu, & Siew, 2006); and two recurrent networks, long short-term memory (LSTM) (Hochreiter & Schmidhuber, 1997) and echo state networks (ESN) (Jaeger & Haas, 2004).

The results in this paper show that HTM sequence memory achieves prediction accuracy comparable to these other techniques. However, the HTM model also exhibits several properties that are critical for streaming data applications, including:

  • Continuous online learning
  • Ability to make multiple simultaneous predictions
  • Robustness to sensor noise and fault tolerance
  • Good performance without task-specific tuning

“Many existing machine learning techniques demonstrate some of these properties,” Cui noted, “but a truly powerful system for streaming analytics should have all of them.”

The HTM sequence memory algorithm is something that machine learning experts can test and incorporate into a broad range of applications. In keeping with Numenta’s open research philosophy, the source code for replicating the graphs in the paper can be found here. Numenta also welcomes questions and discussion about the paper on the HTM Forum, or readers can contact the authors directly.
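For readers who want to experiment, the sketch below shows one way an HTM model might be driven on a data stream through NuPIC's OPF interface, illustrating the continuous online learning loop described above. This is a minimal sketch, not Numenta's benchmark code: the import path varies across NuPIC releases, and MODEL_PARAMS stands in for the full HTM parameter dictionary distributed with NuPIC's example applications.

    # Minimal sketch of online sequence prediction with NuPIC's OPF interface.
    # Assumptions: NuPIC ~1.x (Python 2); MODEL_PARAMS is the full HTM parameter
    # dictionary from NuPIC's examples, configured with an encoder for "value".
    from nupic.frameworks.opf.model_factory import ModelFactory

    from model_params import MODEL_PARAMS  # hypothetical module holding the parameter dict

    # Stand-in for a live data stream; in practice this would be an unbounded source.
    stream_of_values = [10.0, 12.5, 11.0, 13.2, 12.8, 14.1]

    model = ModelFactory.create(MODEL_PARAMS)
    model.enableInference({"predictedField": "value"})

    # Continuous online learning: the model predicts the next value, then learns
    # from the current record, one record at a time, with no separate training phase.
    for t, value in enumerate(stream_of_values):
        result = model.run({"value": value})
        predictions = result.inferences["multiStepBestPredictions"]  # keyed by step count
        print("t=%d observed=%.2f predicted(t+1)=%s" % (t, value, predictions.get(1)))

Because the model updates itself on every record, the same loop can run indefinitely on live data without retraining, which is the setting the paper's streaming comparisons address.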


*Cui, Y., Ahmad, S., & Hawkins, J. (2016). Continuous Online Sequence Learning with an Unsupervised Neural Network Model. Neural Computation, 28(11), 2474–2504. doi:10.1162/NECO_a_00893

*Hawkins, J., & Ahmad, S. (2016). Why Neurons Have Thousands of Synapses, a Theory of Sequence Memory in Neocortex. Frontiers in Neural Circuits, 10. doi:10.3389/fncir.2016.00023

About Neural Computation

Neural Computation disseminates important, multidisciplinary research results in a field that attracts psychologists, physicists, computer scientists, neuroscientists, and artificial intelligence investigators, among others. For researchers looking at the scientific and engineering challenges of understanding the brain and building computers, Neural Computation highlights common problems and techniques in modeling the brain and in the design and construction of neurally inspired information processing systems.

About Numenta

Founded in 2005, Numenta develops theory, software technology, and applications based on reverse engineering the neocortex. Laying the groundwork for the new era of machine intelligence, this technology is ideal for the analysis of continuously streaming data and excels at modeling and predicting patterns in data. Numenta has also developed a suite of products and demonstration applications that use its flexible and generalizable Hierarchical Temporal Memory (HTM) learning algorithms to provide solutions in machine-generated data, human behavioral modeling, geo-location processing, semantic understanding, and sensory-motor control. In addition, Numenta has created NuPIC (Numenta Platform for Intelligent Computing) as an open source project. Numenta is based in Redwood City, California.

Connect with Numenta: Twitter, Facebook, YouTube, and LinkedIn

Numenta Media Contact:
Krause Taylor Associates,
Betty Taylor:
bettyt@krause-taylor.com
408-981-7551
