Releasing BrainBlocks 0.6: Building ML Applications with HTM-Like Algorithms

I know of an alternative method that is far easier to use than Numenta's. For reference, this is Numenta's boosting function:
boost_factor = e ^ (boost_strength * (target_sparsity - activation_frequency))

This is a better function:

boost_factor = log(activation_frequency) / log(target_sparsity)

This function has a zero-crossing at a cell activation frequency of 100%, equals exactly 1 when the activation frequency matches the target sparsity, and asymptotes to infinity as the activation frequency approaches 0%. These properties give it stronger theoretical guarantees than the exponential boosting function. It also has fewer parameters, which makes it easier to use: the only parameter is the period of the exponential moving average that tracks the activation_frequency.
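To make the comparison concrete, here is a minimal NumPy sketch of both boosting functions plus a simple exponential-moving-average tracker for activation_frequency. This is not the BrainBlocks or Numenta API; the class name, the default period, and the clipping epsilon (to avoid log(0)) are my own illustrative choices.

```python
import numpy as np

def boost_exponential(activation_frequency, target_sparsity, boost_strength):
    """Numenta-style exponential boosting, shown for reference."""
    return np.exp(boost_strength * (target_sparsity - activation_frequency))

def boost_log_ratio(activation_frequency, target_sparsity):
    """Log-ratio boosting: 0 at 100% activity, exactly 1 at the target
    sparsity, and unbounded as activity approaches 0%."""
    # Clip to avoid log(0) for cells that have never fired (epsilon is arbitrary).
    freq = np.clip(activation_frequency, 1e-9, 1.0)
    return np.log(freq) / np.log(target_sparsity)

class ActivationFrequencyEMA:
    """Hypothetical per-cell activity tracker; its period is the only
    parameter the log-ratio boost needs."""
    def __init__(self, num_cells, period=1000, init=0.02):
        self.alpha = 1.0 / period
        self.freq = np.full(num_cells, init)

    def update(self, active_mask):
        # active_mask: boolean array marking which cells fired this step.
        self.freq += self.alpha * (active_mask.astype(float) - self.freq)
        return self.freq
```

A quick check of the fixed points: boost_log_ratio(0.02, 0.02) returns 1.0 (a cell firing at the target sparsity is neither boosted nor suppressed), while boost_log_ratio(1.0, 0.02) returns 0.0 (a cell that fires every step gets no boost at all).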
