CNS 2017: A Neural Mechanism for Sequence Learning – HTM Sequence Memory


It is a mystery how pyramidal neurons integrate the input from thousands of synapses, what role the different dendrites play in this integration, and what kind of network behavior this enables in cortical tissue. It has been previously proposed that non-linear properties of dendrites enable cortical neurons to recognize multiple independent patterns.

In this poster, we extend this idea and propose a neural mechanism for sequence learning, HTM Sequence Memory, in which:

  1. Neurons learn to recognize hundreds of patterns using active dendrites.
  2. Recognition of a pattern acts as a prediction by depolarizing the cell without generating an immediate action potential (see the sketch after this list).
  3. A network of neurons with active dendrites forms a powerful sequence memory.
  4. Sparse representations lead to highly robust recognition.
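
The first two points can be illustrated with a minimal sketch, not the poster's actual model and using illustrative parameters: each neuron carries several dendritic segments, each segment stores synapses onto a subset of cells, and any segment whose overlap with the currently active cells reaches a threshold depolarizes the neuron, putting it into a predictive state.

```python
# Minimal sketch of an HTM-style neuron with active dendrites
# (illustrative parameters; not the implementation from the poster).
import random

class Neuron:
    def __init__(self, num_segments, synapses_per_segment, input_size, threshold):
        self.threshold = threshold
        # Each dendritic segment samples a random subset of presynaptic cells.
        self.segments = [
            set(random.sample(range(input_size), synapses_per_segment))
            for _ in range(num_segments)
        ]

    def is_predicted(self, active_cells):
        """True if any segment recognizes the current sparse activity,
        i.e. its overlap with the active cells reaches the threshold.
        Recognition depolarizes the cell (predictive state) without firing it."""
        active = set(active_cells)
        return any(len(seg & active) >= self.threshold for seg in self.segments)

random.seed(0)
neuron = Neuron(num_segments=20, synapses_per_segment=20, input_size=2048, threshold=13)
learned_pattern = list(neuron.segments[0])        # cells one segment has learned
random_pattern = random.sample(range(2048), 40)   # an unrelated sparse pattern
print(neuron.is_predicted(learned_pattern))       # True  -> depolarized (predictive)
print(neuron.is_predicted(random_pattern))        # False -> no prediction
```

Because each segment only needs a partial match to reach threshold, a single neuron with many such segments can store predictions for many independent patterns.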

Through simulation, we show that the network scales well and operates robustly over a wide range of parameters, as long as it uses a sparse distributed code of cellular activations. Given the prevalence and similarity of excitatory neurons throughout the neocortex and the importance of sequence memory in inference and behavior, we propose that this form of sequence memory may be a universal property of neocortical tissue.
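
A rough back-of-the-envelope simulation, again with assumed parameters rather than the poster's, suggests why sparse codes make recognition robust: the chance that an unrelated sparse pattern reaches a dendritic segment's threshold is vanishingly small, so spurious matches and predictions are rare.

```python
# Estimate how often a random sparse pattern falsely triggers a segment
# (assumed parameters: 2048 cells, 40 active, 20 synapses, threshold 13).
import random

n, w, synapses, threshold, trials = 2048, 40, 20, 13, 100_000
segment = set(random.sample(range(n), synapses))
false_matches = sum(
    len(segment & set(random.sample(range(n), w))) >= threshold
    for _ in range(trials)
)
print(f"false matches: {false_matches} / {trials}")  # expected: 0
```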