Sunday, December 5, 2010

Another new Hawkins talk

On December 2, Jeff Hawkins gave a talk at Berkeley. It is similar to the talk from three weeks ago, but with some added tidbits sprinkled throughout. One nugget was Hawkins' statement that no existing machine learning model comes even close to HTM in how deeply it maps onto real cortical anatomy. This is exactly the point I made in my debate with Michael Anissimov on his blog. Here's the link:


  1. Jeff seems to be doing well physically, considering he has been working more than full time for many years, and he is 53 years old :-)

    I'm wondering how Jeff is financing Numenta. Is it from his own pocket? How long can he keep it running? That's probably too personal, but he seems very passionate about the project, so much that maybe he would spend all his money if he had to.

    It's admirable to have a grand goal in life and to stick to it. You have something to wake up to every morning.

    disclaimer: English is not my first language :-)

  2. Yes, it's very admirable what Hawkins is doing.
    But I think I would do the same :)

    Maybe he, and some others (e.g., those reading this blog), believe deep down that HTM is something different from the rest.
    When I think about HTM, I feel this is a new era in 'intelligence research'...

    So it's no surprise he's working full time nowadays

  3. More good evidence for the memory-prediction model:

  4. Thanks, Dave. You beat me to the punch. This article definitely supports some of the key principles of HTM. The current version of HTM does bottom-up rather than top-down prediction, so it will be interesting to see how future versions can make these two types of prediction work together in the hierarchy.

  5. You probably also get the Kurzweil newsletter, but if you missed it, here is an article about Numenta in Technology Review:

  6. Martin, thanks for the link. I get kurzweilai on the Google feed, but not their actual newsletter, so I hadn't yet seen it. I think it is worth a post of its own.