This isn’t a bug in your implementation; the BAMI pseudocode has the same behavior. In step 5, it won’t grow new synapses to the previous winner cells, because the segment already has SYNAPSE_SAMPLE_SIZE (a.k.a. “maxNewSynapseCount”) active synapses.
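A minimal sketch of that step-5 growth rule, assuming the segment exposes a count of its synapses to previously active cells (the names here are illustrative, not the actual BAMI/NuPIC identifiers):

```python
SYNAPSE_SAMPLE_SIZE = 20  # a.k.a. maxNewSynapseCount (illustrative value)

def num_new_synapses(num_active_potential_synapses: int) -> int:
    """How many new synapses to grow on a learning segment.

    The count is based on synapses to *previous active cells*, so a
    segment that already has SYNAPSE_SAMPLE_SIZE active synapses grows
    none, even if few of those synapses target previous *winner* cells.
    """
    return max(0, SYNAPSE_SAMPLE_SIZE - num_active_potential_synapses)

# Segment already saturated with active synapses: nothing is grown,
# not even toward the previous winner cells.
print(num_new_synapses(20))  # → 0
# Segment below the sample size: grow the difference.
print(num_new_synapses(15))  # → 5
```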
If this logic counted “synapses to previous winner cells” rather than “synapses to previous active cells”, it would have the alternative behavior you’re expecting. But that would have other bad effects: if the TM learned the sequences “ABCD” and “XBCY”, it would assign the same SDR to both occurrences of C, and it would then always predict a union of D and Y, regardless of whether it had just seen “ABC” or “XBC”.