Thursday, August 26, 2010

Numenta's new website

Numenta redesigned its website. Here are a few nuggets from the new site:

1. Some new videos were added, including Hawkins' 2008 keynote from the HTM workshop and a talk by Subutai Ahmad from the 2009 workshop. Ahmad's talk was particularly interesting because he discussed a number of corporate partnerships and some early results from them. For instance, Numenta is (or was) working with a major automaker on a pedestrian detection system that looks for pedestrians in front of the vehicle. Early testing showed 96-97% accuracy, or closer to 99% if false positives (cases where the system detects a pedestrian that isn't there) are counted as acceptable outcomes. The talk also mentioned some interesting work that Numenta did with Tyzx, a provider of computer vision systems: they used an HTM network to look for objects and people in security camera footage. Subutai specifically mentioned robotics as a potential application. Interestingly, only three days ago Tyzx announced a deal with iRobot to provide vision systems for its military robots, including person detection capabilities. The press release did not mention whether HTMs are part of that technology. It would be interesting to see which companies Numenta has been working with in the 14 months since Subutai's talk.

2. The website also contains a basic description of Numenta's new learning algorithms. It is difficult not to notice how large a leap forward Numenta considers these algorithms to be. In one place, the site calls the new algorithms a "radical" improvement; in another, it says they are "far superior" to the old ones. One thing I wish the website contained was some experimental results demonstrating these huge improvements. One thing I found confusing was its description of prediction in the new algorithms: it described prediction as something flowing up the hierarchy. That seems different from prediction as described in the original HTM theory, which envisioned incoming data flowing up the hierarchy and predictions flowing down the hierarchy. In any event, it was an interesting read.
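For readers unfamiliar with the original HTM picture, here is a toy sketch of the two-way flow described above: data moves up the hierarchy while predictions come back down. This is purely illustrative pseudocode of the general idea, not Numenta's actual algorithms; the level names and string-wrapping "features"/"prediction" logic are my own placeholders.

```python
class Level:
    """One level of a toy hierarchy (illustrative only)."""

    def __init__(self, name):
        self.name = name

    def feed_forward(self, data):
        # Pass a stand-in "abstraction" of the input up to the next level.
        return f"features({data})"

    def feed_back(self, expectation):
        # Pass a stand-in "prediction" down to the level below.
        return f"prediction({expectation})"


def run(levels, sensory_input):
    # In the original HTM theory, incoming data flows up the hierarchy...
    signal = sensory_input
    for level in levels:
        signal = level.feed_forward(signal)
    # ...and predictions flow back down through the same levels.
    expectation = signal
    for level in reversed(levels):
        expectation = level.feed_back(expectation)
    return expectation


hierarchy = [Level("low"), Level("middle"), Level("high")]
print(run(hierarchy, "pixels"))
# prints: prediction(prediction(prediction(features(features(features(pixels))))))
```

The confusion noted above is that the new description seems to have prediction moving in the upward direction instead of the downward one sketched here.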

Thursday, August 19, 2010

Tidbits on Numenta

Wow, has it been a slow summer for HTM news. I have never seen a period in which Numenta's employees made so few public appearances. Since Hawkins' talk in March, I haven't seen any mention of Numenta speeches, interviews, or papers of any kind. A few things of note:

-In its June newsletter, Numenta mentioned that it decided not to hold an HTM workshop this year, meaning that the next-generation algorithms will not be out this year. In its words, it decided not to push for an "interim" release this year, but to delay the workshop and release to 2011. "Interim" was an interesting choice of words, suggesting that a more fully featured product will result when NuPIC 2.0 does come out.

-Dileep George has a new blog on his website. It is called Mind Matter, and is located at the following link.

-I saw an interesting blog post regarding a robot called Nao that can apparently show and understand emotion. The author claims that the creators of the new robot software were using HTM software in the robot. I have not been able to verify that claim. The only mention of HTM in the context of a Nao robot that I found was an article by Ben Goertzel in which he describes using an HTM for low-level perception in a Nao robot. That article specifically states that he hasn't implemented the idea yet, however. Find the link here.

-Finally, Tomaso Poggio, a professor in the Department of Brain and Cognitive Sciences at MIT and one of the creators of a biology-based hierarchical learning model known as HMAX, has created software that uses GPUs to greatly accelerate models designed to emulate the cortex, such as HTM or HMAX. Poggio claims that the software accelerates these biology-based models by an amazing 80-100 times. Poggio is listed as a technical advisor on Numenta's website, so hopefully the company is aware of this, especially given the increased computational demands of NuPIC 2.0.