The ANN equivalent of HTM

RNNs are ANNs with a feedback loop. LSTMs are RNNs with flip-flop-like gates that can hold a bit high, noting that the network has toggled to a new state.
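The flip-flop analogy can be sketched in a few lines. This is a hedged toy, not a real LSTM: the weights and gating below are hand-set for illustration, not learned.

```python
# Toy sketch of the analogy above: a plain RNN step mixes input with
# fed-back state, while an LSTM-like cell can latch a bit high via
# set/reset gates. Hand-set weights, purely illustrative.

def rnn_step(x, h, w_in=0.5, w_back=0.5):
    return w_in * x + w_back * h   # feedback loop: output re-enters as h

def latch_step(x_set, x_reset, bit):
    # flip-flop-like memory: set raises the bit, reset clears it
    if x_reset:
        return 0
    if x_set:
        return 1
    return bit                     # otherwise hold the stored bit
```

Once set, the latched bit survives any number of steps with no input, which is the "holding a bit high" behavior a plain RNN step lacks.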

Connectionists working with ANNs try to model them more as transmission lines. They will use a hard memory device only as a last resort to make their model work.

Machine learning does not stack two or more deep ANNs together and then add a feedback loop to make a super-stacked RNN. I think the brain does.

There are a couple of ways of making a memory device with an ANN. One way is to daisy-chain neurons in a loop that feeds back into itself, also known as a delay line, like an RNN. But it is a pure transmission line and the data is not changed.
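The delay line can be sketched with a circular buffer: each slot stands in for one neuron, and the stored value circulates unchanged. This is a minimal sketch of the idea, not a neuron model.

```python
# Minimal sketch of the delay-line memory described above: neurons
# daisy-chained in a loop pass values along unchanged, so the data
# circulates like a pure transmission line.
from collections import deque

def make_delay_line(values):
    return deque(values)           # each slot stands for one neuron

def tick(line):
    line.rotate(1)                 # the pulse hops to the next neuron
    return line[0]                 # value currently at the read point
```

After one full loop the line returns to its original state, which is the sense in which the data is stored but never changed.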

The second way is to store things in "weight space". An ANN converts input data to output data, and it does this by adjusting the weights within the ANN. So an input value that looks like an address value could be fed into an ANN trained to give an output data value. This is an ANN simulating RAM. Training a value into an ANN takes longer than storing it directly.
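The address-to-value idea can be shown with the smallest possible case: a one-hot address selecting a single weight, trained by gradient descent. The function names and training loop here are my own illustration of the concept, and the loop also shows why "storing" a value this way is slower than a direct write.

```python
# Sketch of "weight-space" memory: train a tiny linear net so that a
# one-hot address input produces the stored data value. Illustrative
# names; many epochs of weight updates stand in for one RAM write.

def train_weight_ram(data, epochs=200, lr=0.5):
    """Learn one weight per address so the net outputs data[addr]."""
    w = [0.0] * len(data)
    for _ in range(epochs):
        for addr, target in enumerate(data):
            pred = w[addr]                   # one-hot input selects one weight
            w[addr] += lr * (target - pred)  # gradient step on squared error
    return w

def read_weight_ram(w, addr):
    return w[addr]                           # recall is a single forward pass
```

Reading back is a cheap forward pass; it is only the write (training) that is slow, which matches the point above.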

For one of my AGI models, I use a very long "straight line" of many spiking neurons that drives video.

When a pulse signal jumps to the next neuron, it generates the next frame of video.

One pulse goes to the next neuron and another goes to an ANN that generates the video image, like addressing through video memory in sequence.
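The two-pulse scheme can be sketched as a loop: at every step, one branch fires a frame generator and the other branch advances the pulse down the chain. The frame generator here is a stub standing in for the ANN, not the actual model.

```python
# Sketch of the "straight line" of spiking neurons above: a pulse
# advances one neuron per step, and each step also fires a frame
# generator (stubbed), like sequential video-memory addressing.

def run_pulse_chain(length, generate_frame):
    frames = []
    pulse_at = 0                                 # pulse starts at neuron 0
    while pulse_at < length:
        frames.append(generate_frame(pulse_at))  # one branch drives video
        pulse_at += 1                            # other branch advances pulse
    return frames

# Usage: a stub generator that just labels frames by chain position.
frames = run_pulse_chain(5, lambda i: f"frame-{i}")
```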

When a bot wakes up and looks around, it detects features of the world that activate SDR bits. These bits are routed to the "straight line" of spiking neurons that drives the video, like hitting a strand of a spider web. Many feature detections will cause a spike to develop at a given location. I also call this straight line of neurons the consciousness track.
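One way to picture the routing is as voting: each active SDR bit pulls on the positions it is wired to, and a spike develops where enough features agree. The wiring table and threshold below are invented for illustration.

```python
# Hedged sketch of SDR bits "hitting" the consciousness track: each
# active bit votes for track positions via a wiring table, and a spike
# forms where the vote count crosses a threshold. Wiring is invented.

def route_sdr_to_track(active_bits, wiring, track_len, threshold=2):
    votes = [0] * track_len
    for bit in active_bits:
        for pos in wiring.get(bit, []):
            votes[pos] += 1
    # spike only where many feature detections coincide
    return [pos for pos, v in enumerate(votes) if v >= threshold]
```

A single stray feature is not enough to move the track; the spike appears where many detections overlap, like several strands of the web being hit at once.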

For a connectionist, the center of the universe is in between an encoder network and a generator network.
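The encoder/generator split can be shown as a pipeline: input is squeezed into a small latent code, and the generator expands the code back out. Both maps here are toy placeholders, not trained networks.

```python
# Toy sketch of the encoder/generator pair above: the encoder maps raw
# input down to a small latent code, and the generator maps the code
# back out. Placeholder maps, not trained nets.

def encode(pixels):
    # toy encoder: summarize the input as (mean, length)
    return (sum(pixels) / len(pixels), len(pixels))

def generate(code):
    # toy generator: reconstruct a flat image from the latent code
    mean, length = code
    return [mean] * length
```

Everything interesting, in this view, happens at the narrow point in the middle: the latent code is where detection ends and generation begins.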

Encoder Decoder Network - Computerphile:

Some of the very first life forms on Earth that used neurons had a detector or encoder network that detected food, and then a decoder or generating network to catch the food.

Later on, decoders were used as a memory device.
