This Master’s thesis presents Flash-HTM (FHTM), a flash-based storage processor unit that implements Hierarchical Temporal Memory learning algorithms and is designed to scale with increasing model complexity. As validation, the author evaluates a mathematical model of the hardware on the MNIST dataset, achieving 91.98% classification accuracy.
This paper explores a hardware implementation of the HTM sequence memory (referred to in the paper by the older term “Cortical Learning Algorithm”). It proposes a structure that is not application-dependent and performs fully unsupervised continuous learning.