This week Jeff Hawkins will present two talks in UC Berkeley's prestigious Hitchcock Lecture series. We're proud of the recognition Jeff's work has received in the academic community.
In a recent Forbes post, Edd Dumbill describes how computers have traditionally been used to digitize real-world business operations for use in siloed applications. He calls this a "digital exoskeleton": a support system for processes like payroll or inventory management.
Jeff explained how he came to understand and address this problem in a recent keynote address at the International Symposium on Computer Architecture. We posted a video of the keynote on YouTube.
A question we get all the time from machine learning enthusiasts is: "How does Numenta's Hierarchical Temporal Memory (HTM) compare to traditional machine learning algorithms?" There are many ways to answer this question. In this blog entry, I will focus on one specific difference, perhaps the most fundamental one.
Today we hear a great deal about "big data" and the database tools you can use to sort through large amounts of it. At Numenta, however, we see a different future.