Intelligence is embarrassingly simple

No robots - I cannot afford them. Text and images are nice proving grounds. Here are working examples:
https://youtu.be/_NjjKeGltBw (Sep 2020, breakthrough with finding a distance measure working in feature spaces ~10^6 dimensionality)
IMDB Reviews Dataset - YouTube (same approach, applied to IMDB)
https://youtu.be/CJY0zgMBwb0 (Amazon sentiment)

Fresh repo (I used it as a backup; kind of messy, but working: classification, generation):

Last repo (a backup again; the simplest implementation: local continual Hebbian learning, no BP, single epoch, structurally and synaptically plastic, emulating spiking messaging). It does online clustering of token streams (characters, words, tokens, integers). Want to play? Start the jar and tune the config file: number of streams, and more.
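I don't have the repo's source in front of me, so here is only a minimal sketch of the idea as I describe it above - single-pass, winner-take-all Hebbian clustering of a token stream, with local weight updates only (no backprop, no epochs). Class name, cluster count, and the learn/decay rates are my own illustrative assumptions, not the actual implementation:

```java
import java.util.*;

// Hypothetical sketch: online Hebbian clustering of a token stream.
// Each context window updates exactly one cluster's synapses - a purely
// local rule, no gradients, single pass over the stream.
public class HebbianStreamClusterer {
    static final int CLUSTERS = 4;    // assumed fixed cluster count
    static final double LEARN = 0.1;  // Hebbian strengthening rate (assumed)
    static final double DECAY = 0.01; // synaptic decay / forgetting (assumed)

    // one synaptic weight map (token -> weight) per cluster
    final List<Map<String, Double>> weights = new ArrayList<>();
    final Random rng = new Random(42);

    HebbianStreamClusterer() {
        for (int i = 0; i < CLUSTERS; i++) weights.add(new HashMap<>());
    }

    // score a cluster by summing its weights over the tokens in the window;
    // untrained clusters tie at zero and are broken at random
    int bestCluster(List<String> context) {
        int best = rng.nextInt(CLUSTERS);
        double bestScore = 0;
        for (int c = 0; c < CLUSTERS; c++) {
            double s = 0;
            for (String t : context) s += weights.get(c).getOrDefault(t, 0.0);
            if (s > bestScore) { bestScore = s; best = c; }
        }
        return best;
    }

    // Hebbian step: the winner's synapses to the active tokens are
    // strengthened; all its other synapses decay slightly
    int observe(List<String> context) {
        int winner = bestCluster(context);
        Map<String, Double> w = weights.get(winner);
        w.replaceAll((t, v) -> v * (1 - DECAY));
        for (String t : context) w.merge(t, LEARN, Double::sum);
        return winner;
    }

    public static void main(String[] args) {
        HebbianStreamClusterer hc = new HebbianStreamClusterer();
        String[] tokens = "the cat sat on the mat the dog sat on the log".split(" ");
        int window = 3;
        for (int i = 0; i + window <= tokens.length; i++)
            hc.observe(Arrays.asList(tokens).subList(i, i + window));
        // after one pass, similar windows tend to land in the same cluster
        System.out.println(hc.bestCluster(Arrays.asList("cat", "sat", "on")));
        System.out.println(hc.bestCluster(Arrays.asList("dog", "sat", "on")));
    }
}
```

The point of the sketch is the locality: nothing here ever propagates an error signal backwards - each update touches only the winning cluster's own synapses.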

Here is a video of how it works (on LinkedIn):

Here is a video of how the structural (LED) matrix is built:

Here are examples of generations (nano generator, ~200 lines of Java, on top of the trained “language model” - a stochastic parrot):

“Never again she be able to get a better look at the man who had been in the same way as the malfunctioning one.”(c)

“I suspect real gangbangers do not wear T-shirts outside on new years eve in northern Europe.”(c)
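The actual ~200-line generator isn't shown here, so the following is only a hedged guess at the "stochastic parrot" idea - sampling each next word from observed continuations of the current one. The class name, bigram scheme, and RNG seed are my assumptions for illustration:

```java
import java.util.*;

// Hypothetical nano generator: a stochastic parrot that samples the next
// word uniformly from the continuations seen after the current word.
public class NanoGenerator {
    final Map<String, List<String>> next = new HashMap<>();
    final Random rng = new Random(7);

    // record every observed bigram continuation
    void train(String text) {
        String[] w = text.toLowerCase().split("\\s+");
        for (int i = 0; i + 1 < w.length; i++)
            next.computeIfAbsent(w[i], k -> new ArrayList<>()).add(w[i + 1]);
    }

    // walk the chain from a seed word; stop if the word was never seen
    String generate(String seed, int length) {
        StringBuilder out = new StringBuilder(seed);
        String cur = seed;
        for (int i = 0; i < length; i++) {
            List<String> opts = next.get(cur);
            if (opts == null || opts.isEmpty()) break;
            cur = opts.get(rng.nextInt(opts.size()));
            out.append(' ').append(cur);
        }
        return out.toString();
    }

    public static void main(String[] args) {
        NanoGenerator g = new NanoGenerator();
        g.train("the man who had been in the same way as the one who had left");
        System.out.println(g.generate("the", 8));
    }
}
```

Sampling from raw continuation counts (duplicates kept in the list) is what makes it a parrot: frequent continuations are sampled proportionally more often, and every emitted bigram was seen in training.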

Some musings on how lateral messaging between dedicated neurons could work:

(side propagation :slight_smile: )

And if you are not tired yet, here is an attempt (#17) to describe what’s going on and how it works:

Enjoy.

Images are not that obvious - I have no publishable posts about them. But I have no reason to mislead: it works (trains, classifies, generalizes) on the same platform.

I’m leaving tomorrow morning for a week - so don’t miss me too much; I’ll be back and will respond :slight_smile:
