How do new patterns affect column representation of old patterns?

Suppose the spatial pooler has learned many patterns, and now you present another that is maybe half-way between two existing patterns. Could the representation of the existing patterns change in any way?
Likewise, suppose two cortical columns disappear for some reason. I would think the algorithm would then recruit two new columns to represent the lost patterns. Would this affect other existing SDRs?

No, because learning a new pattern doesn’t flush out old patterns. Depending on how the network is configured, connections between cells will degrade over time as old patterns are seen less often. But just because a similar new pattern is learned doesn’t mean the learning of existing patterns will be affected.
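To make that concrete, here is a minimal sketch (not the actual Numenta implementation; the constants and function names are invented) of Hebbian-style permanence updates: synapses matching the active input are reinforced, the rest decay slightly, so an old pattern fades gradually rather than being flushed when an overlapping pattern is learned.

```python
# Minimal sketch of spatial-pooler-style permanence learning.
# PERM_INC / PERM_DEC are hypothetical configuration values.
PERM_INC = 0.05   # reinforcement for synapses on active inputs
PERM_DEC = 0.01   # decay for synapses on inactive inputs

def learn(permanences, active_inputs):
    """Update one column's synapse permanences for one input pattern."""
    for i in range(len(permanences)):
        if i in active_inputs:
            permanences[i] = min(1.0, permanences[i] + PERM_INC)
        else:
            permanences[i] = max(0.0, permanences[i] - PERM_DEC)
    return permanences

# A column that already learned pattern {0, 1} keeps strong synapses
# to input 0 even after training on the overlapping pattern {1, 2}:
perms = learn([0.5, 0.5, 0.0], {1, 2})
# perms[0] only decays slightly (0.50 -> 0.49); nothing is flushed.
```

The point of the sketch is the asymmetry: one exposure to a similar pattern decrements unrelated synapses by a small amount, so an existing representation only disappears if its pattern goes unseen for a long time.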

(I think you mean “mini-columns”, cortical columns are a larger structure consisting of multiple layers.)

There are probably other columns that already represent similar data. If not (because those columns kept losing the column competitions), there soon will be, as they start winning the competitions the removed columns used to win.
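A toy winner-take-all competition (a simplification of the real inhibition mechanism; the function name and scores are made up) illustrates the recruitment: when a winning column disappears, the runner-up for that input simply starts winning.

```python
# Hypothetical sketch of column competition with removed (damaged) columns.
def winner(overlaps, removed=frozenset()):
    """Return the index of the column with the highest overlap score,
    skipping any removed columns."""
    candidates = [(score, idx) for idx, score in enumerate(overlaps)
                  if idx not in removed]
    return max(candidates)[1]

overlaps = [7, 9, 4]          # each column's overlap with some input
print(winner(overlaps))       # column 1 normally wins
print(winner(overlaps, removed={1}))  # column 0 takes over if 1 is lost
```

The recruited column then refines its synapses toward the orphaned pattern through the usual learning updates, without its other learned patterns being flushed.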

I’m not sure what this question means.


@gidmeister, I believe this question was basically the same as your first question, right? Basically I think you are asking: in the case where two columns are lost, and two new columns now start winning the competitions, if those two new columns had already learned other patterns, would those other patterns be affected? In that case, @rhyolight’s answer to your first question applies – the old patterns do not get flushed, so the two new columns would still win out for those earlier patterns (taking into account the degrading process over time).


Thanks for the answer. The degradation over time is interesting too. I once looked at the work of Walter Freeman, who found that the olfactory system operates with chaotic attractors, and he said that when you learn a new pattern, the representation of old patterns changes. But of course your model is based on a different part of the brain.