I disagree. The SP is a potential information bottleneck: if information doesn't survive the transformation into the SP's output, then it certainly isn't available downstream to form sequences or anything else.
Wow, that paper (Sanger 1989) actually describes the same method as another paper I’ve been interested in recently, Spratling & Johnson 2002.
No, it is not lateral inhibition. Rather, the method progressively "uses up" each input bit's strength as target neurons become active, so those same inputs can't contribute to activating further target units. Privately I've been calling it suck decorrelation, because it sucks the energy out of the input bits.
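To make the "using up" idea concrete, here's a minimal sketch, not taken from either paper: nonnegative inputs `x`, a nonnegative weight matrix `W`, and `k` winners chosen one at a time, where each winner drains the residual input energy it drew on. The function name and the exact deflation rule (each winner removes a weight-proportional fraction of the remaining energy) are my own assumptions.

```python
import numpy as np

def suck_decorrelation(x, W, k):
    """Pick k winning units by progressively 'using up' input strength.

    x : (n_inputs,) nonnegative input activities
    W : (n_units, n_inputs) nonnegative feedforward weights
    k : number of winners to select

    Each time a unit wins, the input energy it drew on is removed
    from the residual, so later winners must rely on different bits.
    """
    residual = x.astype(float).copy()
    winners = []
    for _ in range(k):
        scores = W @ residual        # drive from the remaining input energy
        scores[winners] = -np.inf    # each unit can win at most once
        w = int(np.argmax(scores))
        winners.append(w)
        # "Suck out" the energy of the bits this winner used:
        # each bit loses a fraction of its residual proportional to W[w].
        residual -= W[w] * residual
        residual = np.clip(residual, 0.0, None)
    return winners

rng = np.random.default_rng(0)
W = rng.random((8, 16))                    # 8 target units over 16 input bits
x = (rng.random(16) < 0.4).astype(float)   # sparse binary input
print(suck_decorrelation(x, W, k=3))
```

The contrast with lateral inhibition shows up clearly here: the units never inhibit each other directly; the competition happens entirely through the shared, shrinking pool of input energy.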
On the other hand, Földiák 1990 achieves decorrelation (sparse coding) using learned lateral inhibition. That whole paper is blowing my mind. I wish I'd read it two years ago when I first got into HTM.
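For reference, a compressed sketch of Földiák's scheme as I understand it: Hebbian feedforward weights, anti-Hebbian (nonpositive) lateral weights that grow more inhibitory when two units co-fire above chance, and thresholds that adapt to hold each unit near a target firing rate `p`. The paper settles a hard-threshold recurrent network; I'm approximating that with a steep sigmoid, and the learning-rate values here are placeholders, not the paper's.

```python
import numpy as np

def sigmoid(u, gain=10.0):
    # Steep sigmoid standing in for the paper's hard threshold
    return 1.0 / (1.0 + np.exp(-gain * u))

class FoldiakLayer:
    """Sketch of Földiák 1990: Hebbian feedforward weights plus
    learned anti-Hebbian lateral inhibition and adaptive thresholds."""

    def __init__(self, n_in, n_units, p=0.1,
                 alpha=0.1, beta=0.02, gamma=0.02, seed=0):
        rng = np.random.default_rng(seed)
        self.Q = rng.random((n_units, n_in))          # feedforward (Hebbian)
        self.Q /= self.Q.sum(axis=1, keepdims=True)   # keep rows normalized
        self.W = np.zeros((n_units, n_units))         # lateral, constrained <= 0
        self.t = np.full(n_units, 0.5)                # adaptive thresholds
        self.p = p                                    # target firing probability
        self.alpha, self.beta, self.gamma = alpha, beta, gamma

    def activate(self, x, n_steps=30):
        """Settle the recurrent dynamics y = f(Qx + Wy - t), then binarize."""
        drive = self.Q @ x
        y = np.zeros(len(self.t))
        for _ in range(n_steps):
            y = sigmoid(drive + self.W @ y - self.t)
        return (y > 0.5).astype(float)

    def learn(self, x):
        y = self.activate(x)
        # Anti-Hebbian: pairs co-firing above chance (p^2) grow more
        # mutually inhibitory; weights are clipped to stay nonpositive.
        self.W = np.minimum(self.W - self.alpha * (np.outer(y, y) - self.p**2), 0.0)
        np.fill_diagonal(self.W, 0.0)
        # Hebbian feedforward: move a winner's weights toward its input.
        self.Q += self.beta * y[:, None] * (x[None, :] - self.Q)
        # Threshold adaptation holds each unit near the target rate p.
        self.t += self.gamma * (y - self.p)
        return y
```

The decorrelation lives in the `W` update: any two units that fire together more often than two independent units with rate `p` would, accumulate inhibition between them until they stop doing so.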