To get the highest-magnitude output from a dot product, the angle between the input vector and the weight vector should be as small as possible. In the Numenta net, top-k selection must therefore be highly correlated with least-angle selection (exactly equivalent when the weight vectors all have the same norm). At each layer, the neurons whose weights make the smallest angle with the input are selected. That is clear (perhaps even error-correcting) pattern detection, repeated layer after layer.
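A minimal sketch of that equivalence, assuming unit-norm weight vectors (all names and sizes here are illustrative, not from any Numenta code): ranking neurons by dot product then picks exactly the same top-k as ranking by smallest angle.

```python
import math
import random

random.seed(0)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def angle(a, b):
    # angle in radians between vectors a and b
    c = dot(a, b) / (norm(a) * norm(b))
    return math.acos(max(-1.0, min(1.0, c)))

d, n, k = 16, 32, 4  # dimensions, neurons, top-k

x = [random.gauss(0, 1) for _ in range(d)]
# unit-norm weight vectors, one per neuron
weights = []
for _ in range(n):
    w = [random.gauss(0, 1) for _ in range(d)]
    s = norm(w)
    weights.append([v / s for v in w])

# with equal weight norms, largest dot product == smallest angle
top_k_by_dot = sorted(range(n), key=lambda i: -dot(x, weights[i]))[:k]
top_k_by_angle = sorted(range(n), key=lambda i: angle(x, weights[i]))[:k]
print(top_k_by_dot == top_k_by_angle)  # True
```

With unequal weight norms the two rankings can diverge, which is why the correlation claim needs the comparable-norm caveat.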
I remember a previous Numenta paper discussing dot products between sparse input vectors and (more) dense weight vectors, and the combinatorial math involved.
I would say the Numenta net is very good at classification. However, top-k is an information bottleneck, so I doubt such a net can produce high-dimensional image responses. It might be very good for text, given the precision of its pattern detection.
A composition of several (connected) dot products can be simplified back to a single dot product. Once all the top-k switch states in the Numenta net are known, it collapses to a simple matrix (a bunch of dot products with the input vector). Very likely the angles with the input vector are small, inherited through the feedforward pass. That should mean the noise sensitivity is low; you could work it out by applying the variance equation for linear combinations of random variables to the dot products.
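A toy sketch of the collapse, assuming a two-layer linear net with a top-k mask between the layers (the sizes and helper names are made up for illustration): freezing the winners turns the mask into a diagonal matrix D, so the whole net becomes the single matrix W2 @ D @ W1.

```python
import random

random.seed(1)

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

d, k = 6, 3
W1 = [[random.gauss(0, 1) for _ in range(d)] for _ in range(d)]
W2 = [[random.gauss(0, 1) for _ in range(d)] for _ in range(d)]
x = [random.gauss(0, 1) for _ in range(d)]

# forward pass with top-k masking after layer 1
h = matvec(W1, x)
winners = set(sorted(range(d), key=lambda i: -h[i])[:k])
h_masked = [h[i] if i in winners else 0.0 for i in range(d)]
y = matvec(W2, h_masked)

# with the winners fixed, the mask is a diagonal 0/1 matrix D,
# and the net collapses to the single matrix W2 @ D @ W1
D = [[1.0 if (i == j and i in winners) else 0.0 for j in range(d)]
     for i in range(d)]
M = matmul(W2, matmul(D, W1))
y_collapsed = matvec(M, x)

print(all(abs(a - b) < 1e-9 for a, b in zip(y, y_collapsed)))  # True
```

Each row of M is then one plain dot product with the input vector, which is what the collapsed view above describes.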
You could try k-least-angle selection, or even some more exact least-noise selection.
If you want the value 1 out of a dot product whose inputs all carry equal noise, you can set one input to 1 and the matching weight to 1, which cuts out most of the noise. Or you can set all the inputs to 1 and all the weights to 1/d (d = dimensions), which averages the noise. Averaging the noise is better: by the variance rule for linear combinations, the output noise variance is sigma^2 in the first case and sigma^2/d in the second.
In both cases the angle between the input vector and the weight vector is zero. As the angle increases toward 90 degrees, the length of the weight vector must grow (as 1/cos theta) to keep the output at 1, and as a result the output gets very noisy.
Thus there are two factors that control the noise response of a dot product: how the weight mass is distributed (cut, average, or anything in between) and the angle.
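A small sketch of both factors, using the fact that for y = w . (x + noise) with i.i.d. noise of variance sigma^2 on each input, the output noise variance is sigma^2 * sum(w_i^2) (all numbers here are illustrative):

```python
import math

sigma2 = 1.0  # per-input noise variance (equal on all inputs)
d = 100

def output_noise_var(w):
    # variance of w . noise when each noise component has variance sigma2
    return sigma2 * sum(v * v for v in w)

# factor 1: weight distribution
w_cut = [1.0] + [0.0] * (d - 1)  # "cut": one input 1, one weight 1
w_avg = [1.0 / d] * d            # "average": all inputs 1, weights 1/d
print(output_noise_var(w_cut))   # 1.0
print(output_noise_var(w_avg))   # about 0.01, i.e. sigma2 / d

# factor 2: angle. To keep the output at 1 when the angle between
# x and w is theta, the weight norm must be 1/(|x| cos(theta)),
# so the noise variance scales as 1/cos(theta)^2 (assuming |x| = 1).
for theta_deg in (0, 45, 80):
    theta = math.radians(theta_deg)
    norm_w = 1.0 / math.cos(theta)
    print(theta_deg, sigma2 * norm_w ** 2)
```

At 0 degrees the variance is sigma^2 times the squared weight norm; by 80 degrees it has grown by a factor of roughly 33, which is the "very noisy" regime described above.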
I actually learned quite a number of things thinking about that net. Thanks.