In a system where the size in bytes of an SDR is of concern (such as synchronizing between multiple threads, or transmitting over a network), the thought occurred to me that the indices of the 1 bits could be encoded relative to each other. If a gap between two 1 bits is too large to fit into the data type used, a noise bit could be inserted.
For example an SDR encoded as indices to the one bits, such as:
[2,16,254,267,509,513,752,763,1020,1023 …]
Could be encoded as relative indices:
[2,14,238,13,242,4,239,11,257,3 …]
In this case, let's say we want to encode the indices as uint8, which has a range of 0-255. One of the deltas (257) is too large. However, if we insert a single bit of noise, then we can stay within uint8 (thus halving the footprint compared to the next size up, uint16):
[2,14,238,13,242,4,239,11,255,2,3 …]
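A minimal sketch of the round trip, assuming sorted indices and deltas measured from index 0 (the function names `encode_deltas`/`decode_deltas` are just my own):

```python
def encode_deltas(indices, max_delta=255):
    """Delta-encode sorted bit indices. When a gap exceeds max_delta,
    emit a full-range delta, which inserts a spurious 'noise' 1 bit."""
    out = []
    prev = 0
    for idx in indices:
        gap = idx - prev
        while gap > max_delta:
            out.append(max_delta)  # noise bit at prev + max_delta
            gap -= max_delta
        out.append(gap)
        prev = idx
    return out

def decode_deltas(deltas):
    """Reconstruct absolute indices; any inserted noise bits remain."""
    indices = []
    pos = 0
    for d in deltas:
        pos += d
        indices.append(pos)
    return indices
```

Running this on the example above reproduces the delta list shown, and decoding it yields the original SDR plus one extra 1 bit at index 1018 (763 + 255), which the noise tolerance should absorb.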
My understanding of the principles tells me that this should be fine (HTM being inherently noise tolerant). Wondering if anyone has explored this before, and has any thoughts.
What occurs to me is that in a stream that has large blocks of zero bits (more likely with topology), there would be something of a pattern in the noise (every 255th bit on, in the case of uint8). Has anyone explored the effects of noise involving certain bits that are always on (versus randomized noise that changes every time step), or even patterns of bits that are on in a system that relies on topology?
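To make the pattern concrete, here's a quick sketch of where the spurious bits land inside a long zero run (the active-bit positions 100 and 1500 are just made-up values for illustration):

```python
# Spurious bit positions forced by a zero run between two active bits,
# when gaps are stored as uint8 deltas (max representable gap = 255).
prev, nxt, max_delta = 100, 1500, 255

noise = []
gap = nxt - prev
pos = prev
while gap > max_delta:
    pos += max_delta
    noise.append(pos)  # a noise bit must be emitted here
    gap -= max_delta

print(noise)  # → [355, 610, 865, 1120, 1375], i.e. every 255 positions
```

So in any zero block longer than 255 bits, the "noise" is perfectly periodic rather than random, which is what makes me wonder whether it interacts badly with topology.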