Computing power requirement differences between HTM and neural nets

I’m wondering about the differences in computing power (and monetary cost) required for HTM as opposed to the various neural networks. At this stage the question is about the power needed for emulation, but I’m also interested in guesses about the requirements of an eventual hardware HTM implementation vs. [X]NN. I’m looking for this in relation to specific jobs if possible, like the navigation problem discussed here: AI program gets really good at navigation by developing a brain-like GPS system

For scale: AlphaGo Zero reportedly employed around 15 people and millions of dollars in computing resources.


The only level playing field for comparing HTM with other ANNs is temporal streaming data, for which we have set up a benchmark in a project called the Numenta Anomaly Benchmark. The benchmark does not take compute resource requirements or speed into account, so I can’t really answer your question.
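Since the benchmark itself does not measure compute, one could record it separately by timing a detector as it consumes the stream. This is a minimal sketch, not part of NAB; `detect` is a hypothetical placeholder standing in for any streaming detector (HTM, LSTM, etc.):

```python
import time

def detect(value, state):
    # Placeholder detector: scores each value by its distance
    # from a running mean. Real detectors go here.
    count, mean = state
    count += 1
    mean += (value - mean) / count
    score = abs(value - mean)
    return score, (count, mean)

def timed_run(stream):
    """Run the detector over the whole stream and report
    total and per-record wall-clock time."""
    state = (0, 0.0)
    start = time.perf_counter()
    for value in stream:
        _, state = detect(value, state)
    elapsed = time.perf_counter() - start
    return elapsed, elapsed / max(len(stream), 1)

total_s, per_record_s = timed_run([1.0, 2.0, 100.0, 2.0] * 1000)
```

Per-record time matters more than total time for streaming comparisons, since the data sets differ in length.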

In the “comparison” in computing power between methods does DNN have to include the CPU-hours spent in training to find a workable solution?

Do the multiple failed attempts to find any workable net also count against the successful DNN models?

How do you weight the CPU-hours spent in attempts that never find a usable solution?
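One simple accounting convention for the questions above is to amortize all exploration compute, including failed attempts, onto the model that eventually ships. A toy sketch, with made-up numbers purely for illustration:

```python
def amortized_cost(failed_runs_hours, successful_run_hours):
    """Total CPU-hours charged to the one model that shipped:
    every failed search run counts against the final result."""
    return sum(failed_runs_hours) + successful_run_hours

exploration = [120.0, 300.0, 80.0]  # hypothetical failed attempts
final_training = 200.0              # the run that found a workable net
total_hours = amortized_cost(exploration, final_training)  # 700.0
```

Whether failed attempts on projects that never produced any workable net should also be charged somewhere is exactly the open question raised here.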


I think for deep networks, the time spent exploring, testing, and iterating is simply assumed to be par for the course, as long as the end results work out. I tend to think this is accepted in the field because the efficiency gains paid back once a working solution is found tend to make up for the initial time investment.

I’d argue this is parallel to any industrial development, where initial outlay is required before the benefits can be reaped. With Deep Learning, that outlay is generally electricity and development time (far cheaper than giant mechanical machines of yesteryear).

I have yet to apply HTM concepts to any practical problems, so I cannot comment on its iteration and exploration costs. Maybe someone else here could?