Graded input

I’ve noticed in HTM that the predicted cells in an input column appear to be all or none, i.e. if multiple patterns are predicted from an input, they are all predicted equally. In the “Why Neurons Have Thousands of Synapses” paper, two sequences are given, ABCD and XBCY, such that given the input BC, both D and Y are equally predicted.

Is there any mechanism to create graded predictions? For example, suppose the input ABCD is seen 80% of the time and XBCY only 20%; presumably the learning algorithm should then predict the next letter with a corresponding difference in confidence, i.e. D much more strongly than Y.
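
To make the contrast concrete, here’s a minimal Python sketch of what I mean. It’s just a toy counting model keyed on the previous two letters, not the actual Temporal Memory algorithm, and the 80/20 exposure is the assumption from above:

```python
from collections import defaultdict, Counter

# Toy sketch (not the actual Temporal Memory algorithm): predictions are keyed
# on the previous two symbols so that "BC" disambiguates ABCD from XBCY.
transition_counts = defaultdict(Counter)

def train(sequence):
    """Count which symbol follows each two-symbol context."""
    for i in range(len(sequence) - 2):
        context = sequence[i:i + 2]
        transition_counts[context][sequence[i + 2]] += 1

# Assumed 80/20 exposure: ABCD is seen four times as often as XBCY.
for _ in range(80):
    train("ABCD")
for _ in range(20):
    train("XBCY")

counts = transition_counts["BC"]

# All-or-none view: every symbol with any support is predicted equally.
binary_prediction = set(counts)

# Graded view: normalize the counts into relative confidences.
total = sum(counts.values())
graded_prediction = {sym: n / total for sym, n in counts.items()}

print(binary_prediction)   # {'D', 'Y'}
print(graded_prediction)   # {'D': 0.8, 'Y': 0.2}
```

The binary version is roughly what I observe HTM doing with BC; the graded version is the kind of confidence-weighted prediction I’m asking about.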

One thought I’ve had on this concerns spike activity. Essentially, could rate coding carry the relative probability distribution while spatial coding holds the actual predictions? If so, then neurons would likely need a mechanism to quickly transform a graded analog input into a binary decision. A paper (referenced below) proposes a powerful potential mechanism for achieving such a transformation. It also (see its Figure 7) makes a strong case that converting an analog signal into a binary one actually enhances noise discrimination, which is a useful parallel to HTM ideas.
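
Here’s a quick toy illustration of the analog-to-binary thresholding idea (my own sketch, not the circuit mechanism from the paper; the threshold, noise level, and input strengths are all made-up numbers):

```python
import random

# Hypothetical illustration: a graded "prediction strength" (think firing rate)
# is collapsed through a hard threshold, so small noisy fluctuations below the
# threshold never become predictions.
THRESHOLD = 0.5  # assumed decision point, not a value from HTM or the paper

def to_binary(analog_strength, threshold=THRESHOLD):
    """Collapse a graded input in [0, 1] onto a binary prediction bit."""
    return 1 if analog_strength >= threshold else 0

random.seed(0)

# A weak, noise-dominated input vs. a strong, reliably driven input.
weak_inputs = [0.2 + random.gauss(0, 0.1) for _ in range(1000)]
strong_inputs = [0.8 + random.gauss(0, 0.1) for _ in range(1000)]

false_alarms = sum(to_binary(x) for x in weak_inputs) / len(weak_inputs)
hits = sum(to_binary(x) for x in strong_inputs) / len(strong_inputs)

print(f"weak inputs crossing threshold:   {false_alarms:.3f}")
print(f"strong inputs crossing threshold: {hits:.3f}")
```

The weak, noisy input almost never crosses the threshold while the strong input almost always does, which is the flavor of noise discrimination their Figure 7 argues the analog-to-binary conversion buys you.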

Paper reference: Yujia Hu, Congchao Wang, Limin Yang, Geng Pan, Hao Liu, Guoqiang Yu, and Bing Ye, “A Neural Basis for Categorizing Sensory Stimuli to Enhance Decision Accuracy,” Cell, 2020.