# SDR Turing machine idea

I’m not sure how to follow up on this, so I’ll just expose it here.
The main idea is to combine the Turing machine and the SDR sequence learner into a hybrid of the two, which could lead to interesting possibilities.

• Universal Turing machine: starting with a limited set of symbols and instructions it can be programmed incrementally to address arbitrarily complex problems.
• SDR predictor: able to learn and remember arbitrary sequences of SDR “symbols”.

What would a “hybrid” of these two be able to do?

First, it should be able to define arbitrary operations from a list of examples.

E.g. to learn two-digit addition, it is trained to remember continuations of sequences like:

``````
add_two_digit first 9 second 7 > r0 1 r1 6
``````

Note there is no intrinsic notion of numbers; the things above are all SDRs, or symbolic embeddings if you like. Numbers eventually emerge as a set of SDR symbols together with a set of SDR functions which, when combined, produce new symbols.
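To make the “everything is an SDR” point concrete, here is a minimal Python sketch of the training step. This is not an HTM implementation: a plain dict stands in for the sequence memory, every token (including digits and register names) gets a random sparse code, and the sizes `N`/`W` are just typical-looking values I picked.

```python
import random

random.seed(0)
N, W = 2048, 40  # SDR width and number of active bits (illustrative values)

vocab = {}  # symbol -> its random sparse code

def sdr(symbol):
    """Every token is just a symbol; assign it a random sparse bit set."""
    if symbol not in vocab:
        vocab[symbol] = frozenset(random.sample(range(N), W))
    return vocab[symbol]

# Toy "sequence memory": context (tuple of SDRs) -> next SDR.
# A real SDR learner would use distal dendrites; a dict stands in here.
memory = {}

def train(sequence):
    codes = [sdr(tok) for tok in sequence]
    for i in range(1, len(codes)):
        memory[tuple(codes[:i])] = codes[i]

def complete(prefix, steps):
    """Replay the learned continuation, decoding each SDR by best overlap."""
    codes = [sdr(tok) for tok in prefix]
    out = []
    for _ in range(steps):
        nxt = memory.get(tuple(codes))
        if nxt is None:
            break
        codes.append(nxt)
        out.append(max(vocab, key=lambda s: len(vocab[s] & nxt)))
    return out

train("add_two_digit first 9 second 7 > r0 1 r1 6".split())
result = complete("add_two_digit first 9 second 7 >".split(), 4)
# result recovers the trained continuation: ['r0', '1', 'r1', '6']
```

The point of the sketch is only that the machine never sees “numbers”, just sequences of codes whose continuations it memorizes.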

Once a function is learned, freeze the corresponding synapses/cells/parameters and teach new algorithms for new problems that can somehow reuse the previously learned operations.

In the previous example, “add_two_digit” can be seen as the function name, “first” and “second” as input registers, and “r0” and “r1” as output registers.
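That reading of the training line can be made concrete with a throwaway parser (ordinary Python, nothing SDR-specific; it assumes the fixed layout of the example above: name, register/value pairs, `>`, register/value pairs):

```python
def parse_example(line):
    """Split a training line into (name, input registers, output registers)."""
    lhs, rhs = line.split(">")
    name, *in_toks = lhs.split()
    out_toks = rhs.split()
    # Pair up alternating register/value tokens on each side.
    inputs = dict(zip(in_toks[::2], in_toks[1::2]))
    outputs = dict(zip(out_toks[::2], out_toks[1::2]))
    return name, inputs, outputs

parsed = parse_example("add_two_digit first 9 second 7 > r0 1 r1 6")
# ('add_two_digit', {'first': '9', 'second': '7'}, {'r0': '1', 'r1': '6'})
```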

Hopefully the machine itself would be able to learn new functions from examples of correct (and incorrect?) sequences alone, plus (eventually) hints about which known functions are likely to be needed to solve the new problem.

For instance, from many examples like:

``````
add2digit 7 9 > 1 6
``````

It would be able to figure out that applying the known sequence above:

``````
add_two_digit first 9 second 7 > r0 1 r1 6
``````

would match the new example.
Yes, I know add2digit isn’t a new function but a shorthand for the previous one, but… it illustrates the idea.

I know there are many ancient attempts at symbolic AI in this vein, with limited results. The difference is that this would use SDRs (large-ish embeddings); I’m not yet sure whether that would be an advantage. Associative lookup for “resemblance” with known examples could be one, since it avoids exhaustive searches.
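One way that associative lookup could avoid an exhaustive scan is an inverted index over active bits: a query only touches the buckets of its W active bits, so cost scales with W and bucket sizes rather than with the number of stored examples. A minimal sketch (again, a dict-based stand-in, not an HTM structure; the function names are made up):

```python
import random
from collections import Counter

random.seed(1)
N, W = 2048, 40  # SDR width and active-bit count (illustrative values)

def make_sdr():
    return frozenset(random.sample(range(N), W))

class AssociativeStore:
    """Inverted index: active bit -> labels of stored codes containing it."""
    def __init__(self):
        self.buckets = {}  # bit -> set of labels
        self.items = {}    # label -> SDR

    def add(self, label, code):
        self.items[label] = code
        for bit in code:
            self.buckets.setdefault(bit, set()).add(label)

    def best_match(self, code):
        """Vote once per shared bit; only the query's own buckets are visited."""
        votes = Counter()
        for bit in code:
            for label in self.buckets.get(bit, ()):
                votes[label] += 1
        return votes.most_common(1)[0] if votes else None

store = AssociativeStore()
for name in ["add_two_digit", "sub_two_digit", "mul_two_digit"]:
    store.add(name, make_sdr())

# A degraded version of a stored code (only 30 of its 40 bits) still
# resolves to the right label, because random codes barely overlap.
noisy = frozenset(list(store.items["add_two_digit"])[:30])
match = store.best_match(noisy)
```

This noise tolerance is exactly the “resemblance with known examples” property: a new shorthand like `add2digit 7 9 > 1 6` could be routed to its nearest stored pattern without comparing against every example.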

I don’t know whether the above is worth anything, but here you have it.

3 Likes

For number (and other) symbols, my thought would be to treat the digits separately and make the process recursive, so that you don’t run into a temporal limit on complexity.

Treat all symbols the way they are split (time-gapped/delayed) when they are spoken: numbers, words, etc. This can create some interesting symbols, and it allows otherwise singular symbols to be broken down closer to a simple base if you want to avoid creating a new symbolic input language.

Would the approach then need to follow a recursive pattern to deal with branching or a certain complexity? The recursive pass would then have a secondary (or further) parallel temporal stream (a carry equivalent), which would represent short-term memory and allow that short-term memory to interact with the attention stream.

The output is then two or more streams, resolved and unresolved. Continue until the unresolved stream is null.

This could allow the addition of very large numbers with some very simple rules, which should be capable of being self-taught.
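The carry-as-parallel-stream idea can be sketched numerically (plain Python, not an SDR system; digit lists are least-significant-first, and the only rules are a local digit sum and a one-position carry shift, repeated until the unresolved stream is null):

```python
def add_streams(a, b):
    """Add two digit lists using a 'resolved' sum stream and an
    'unresolved' carry stream; iterate until no carries remain."""
    width = max(len(a), len(b)) + 1
    a = a + [0] * (width - len(a))
    b = b + [0] * (width - len(b))
    resolved = [x + y for x, y in zip(a, b)]  # local digit sums
    while True:
        # Unresolved stream: each carry shifts one position left.
        carries = [0] + [d // 10 for d in resolved[:-1]]
        resolved = [d % 10 for d in resolved]
        if not any(carries):
            break  # unresolved stream is null: done
        resolved = [d + c for d, c in zip(resolved, carries)]
    return resolved

# 375465 + 1, least-significant digit first:
add_streams([5, 6, 4, 5, 7, 3], [1])   # [6, 6, 4, 5, 7, 3, 0] -> 375466
# 99 + 1 needs two carry passes before the unresolved stream empties:
add_streams([9, 9], [1])               # [0, 0, 1] -> 100
```

The number of passes grows only with the longest carry chain, not with the size of the numbers, which is why arbitrarily large operands need no new rules.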

Three hundred and seventy-five thousand four hundred and sixty-five plus one. (In this instance “and” can be thought of as a temporal shift, aligning “three hundred” in parallel with “seventy five”.) Maybe this creates a separate stream (the splitter-neurons paper), and the process is then inherently more parallel, allowing indifference to some sequence orders: “seventy five and four hundred”. The shifting is also learnt…

As for how this fits with a pure Turing machine: your “functions” should all just be symbols as far as SDR processing goes.

Just some thoughts and ideas…

2 Likes

Sir,
I am doing my project on this, so I need to implement it, but unfortunately Python 2.7 has reached end of life and it couldn’t be implemented. Could you please help me sort out the problem?
Thank you,
Honeymol

1 Like

@honey_karthik I think you missed the topic; there is no code for what I’m talking about here.

@BrainVx thanks, these are interesting points.

1 Like