When an input doesn’t activate some predictive cells, do those cells have their permanences decreased?
Yes. From BAMI:
```
for column in columns
  if column in activeColumns(t) then
    if count(segmentsForColumn(column, activeSegments(t-1))) > 0 then
      activatePredictedColumn(column)
    else
      burstColumn(column)
  else
    if count(segmentsForColumn(column, matchingSegments(t-1))) > 0 then
      punishPredictedColumn(column)
```
This loops through all minicolumns, and for any minicolumn that isn’t activated by the current input but had matching dendrite segments from the previous time step (i.e. it was predicted), those segments are punished: the permanences of their active synapses are decremented.
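A minimal sketch of what that punishment step might look like. The class names, the `punish_predicted_column` signature, and the `PREDICTED_SEGMENT_DECREMENT` value are illustrative assumptions, not BAMI's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical value; typically set low so bursting doesn't erase prior learning.
PREDICTED_SEGMENT_DECREMENT = 0.001

@dataclass
class Synapse:
    presynaptic_cell: int
    permanence: float

@dataclass
class Segment:
    synapses: list

def punish_predicted_column(matching_segments, prev_active_cells):
    """For each matching segment of a mispredicted minicolumn, weaken only
    the synapses to cells that were active at t-1, since those synapses
    contributed to the false prediction."""
    for segment in matching_segments:
        for syn in segment.synapses:
            if syn.presynaptic_cell in prev_active_cells:
                syn.permanence = max(
                    0.0, syn.permanence - PREDICTED_SEGMENT_DECREMENT)
```

Note that only synapses to previously active cells are touched; synapses to cells that were inactive at t-1 keep their permanence unchanged.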
Typically you will set this decrement quite low, so that when there is a lot of bursting while the layer is learning something new, it doesn’t rapidly forget what it learned before. An alternative implementation is to skip the punishment step entirely after any time step in which most or all of the minicolumns were bursting.