h&sELM (Hash & Select Extreme Learning Machines)

This is where I am at with Extreme Learning Machines:

https://sciencelimelight.blogspot.com/2025/08/hash-select-extreme-learning-machine.html

It seems to be the best overall mix of things to do at the moment.

I kind of want to limit, time- and effort-wise, the pointless (for my life) ML code I do.

I think I'll narrow things down to h&sELM and free up time for other interesting (non-ML) projects.


Here is an old “blast from the past” weight block selection ELM I showed on this forum before.

https://editor.p5js.org/seanhaddps/full/AVWDmn5z8t

You select 32 samples from the image (of Lübeck - very nice place). Press 1 to train, press 1 again to stop training. Then in a broad sense the generalization is quite good, though it has more of an interpolation feel to it, not so much extrapolation.

You could imagine an animal being able to reorient itself from such a memory.

That example does not include more modern refinements I have.


I did the h&sELM in JavaScript:

https://editor.p5js.org/seanhaddps/full/ypBBRgt4t

I have to admit I kind of don't like it. 32 training examples is too few to test a large associative memory. Still, I think the crude first version did far better in terms of generalization. I'll modernize it at some stage and see how that goes.
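For anyone curious what the "hash & select" part might look like, here is a minimal sketch of one plausible reading of the name - an assumption on my part, not the actual code from the sketch or the blog post. The idea sketched here: each hidden unit uses a cheap integer hash of (unit, input index) both to select a sparse subset of the input elements and to assign each a pseudo-random sign, avoiding any stored weight matrix. The names `hash32` and `hiddenLayer` are mine.

```javascript
// hash32: a simple 32-bit integer mixer (illustrative, not the blog's hash).
function hash32(x) {
  x = Math.imul(x ^ (x >>> 16), 0x45d9f3b);
  x = Math.imul(x ^ (x >>> 16), 0x45d9f3b);
  return (x ^ (x >>> 16)) >>> 0;
}

// One possible "hash & select" hidden layer: for hidden unit h and input
// element i, a hash of (h, i) decides whether i is selected (~1/4 kept here)
// and which sign it gets. No weights are stored; the hash *is* the wiring.
function hiddenLayer(input, nHidden) {
  const out = new Float32Array(nHidden);
  for (let h = 0; h < nHidden; h++) {
    let sum = 0;
    for (let i = 0; i < input.length; i++) {
      const r = hash32(Math.imul(h, 0x9e3779b9) ^ i);
      if ((r & 3) === 0) {                      // "select": keep ~1/4 of inputs
        sum += (r & 4) ? input[i] : -input[i];  // "hash": pseudo-random sign
      }
    }
    out[h] = Math.tanh(sum);                    // squashing nonlinearity
  }
  return out;
}

// The layer is fully deterministic, so it costs no memory to "store".
const v = Array.from({ length: 64 }, (_, i) => Math.sin(i));
const hid = hiddenLayer(v, 10);
```

Because the hash is deterministic, the hidden layer is reproducible for free, which is one attraction of this family of tricks.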


After ChatGPT-5 debugged my code a little, I ended up with this nice hash & select associative memory (more or less an ELM). It works well with a lower number of training examples compared to the earlier ELM that selects blocks of 8 weights from a pool of 256 blocks.
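To make the "blocks of 8 weights from a pool of 256 blocks" idea concrete, here is a minimal sketch of how such a hidden unit could be assembled - an illustration of the general block-selection scheme described above, not the sketch's actual implementation. The names `pool`, `assembleWeights`, and `hiddenPreAct` are mine; in a real trainer the per-unit block indices would be searched/adjusted rather than left random.

```javascript
const BLOCKS = 256;    // size of the shared pool
const BLOCK_LEN = 8;   // weights per block

// Shared pool of 256 random weight blocks, 8 weights each.
const pool = Array.from({ length: BLOCKS }, () =>
  Array.from({ length: BLOCK_LEN }, () => Math.random() * 2 - 1)
);

// Build one hidden unit's weight vector by concatenating selected blocks.
// For an input of dimension d (a multiple of 8), you need d / 8 indices.
function assembleWeights(blockIndices) {
  return blockIndices.flatMap(i => pool[i]);
}

// Pre-activation of one hidden unit: dot product with the assembled vector.
function hiddenPreAct(x, blockIndices) {
  const w = assembleWeights(blockIndices);
  return x.reduce((s, xi, i) => s + xi * w[i], 0);
}

// Example: a 32-dimensional input needs 4 block indices per hidden unit.
const x = Array.from({ length: 32 }, () => Math.random());
const indices = Array.from({ length: 4 },
  () => Math.floor(Math.random() * BLOCKS));
const a = hiddenPreAct(x, indices);
```

The appeal is that a unit's "weights" compress to a handful of 8-bit indices into a shared pool, and training can proceed by swapping indices instead of adjusting individual weights.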

Example:

https://editor.p5js.org/seanhaddps/full/ustNtD-Ek

I included modern refinements such as a project-then-sum ensemble method to combine the output layers.
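Here is a small sketch of what I take "project-then-sum" to mean - my reading of the phrase, not necessarily the sketch's exact scheme. Each ensemble member applies its own fixed random projection to the shared hidden vector, runs the result through its own linear readout, and the member outputs are summed (averaged here) into the final prediction. All function names (`randomMatrix`, `matVec`, `projectThenSum`) are illustrative.

```javascript
// Random matrix with entries in [-1, 1), used for per-member projections.
function randomMatrix(rows, cols) {
  return Array.from({ length: rows }, () =>
    Array.from({ length: cols }, () => Math.random() * 2 - 1)
  );
}

// Matrix-vector product.
function matVec(M, v) {
  return M.map(row => row.reduce((s, m, i) => s + m * v[i], 0));
}

// Project-then-sum ensemble: each member k projects the hidden vector with
// its own fixed matrix, applies its own readout, and the results are
// averaged to combine the output layers.
function projectThenSum(hidden, projections, readouts) {
  const out = new Float64Array(readouts[0].length);
  for (let k = 0; k < projections.length; k++) {
    const proj = matVec(projections[k], hidden); // member-specific projection
    const y = matVec(readouts[k], proj);         // member's linear readout
    for (let j = 0; j < out.length; j++) out[j] += y[j];
  }
  for (let j = 0; j < out.length; j++) out[j] /= projections.length;
  return out;
}

// Example: 3 members, 16-d hidden vector projected to 8-d, 2 outputs.
const hidden = Array.from({ length: 16 }, () => Math.random());
const projections = Array.from({ length: 3 }, () => randomMatrix(8, 16));
const readouts = Array.from({ length: 3 }, () => randomMatrix(2, 8));
const y = projectThenSum(hidden, projections, readouts);
```

The per-member projections decorrelate the readouts, which is the usual motivation for this kind of ensemble combination.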

Code:

https://editor.p5js.org/seanhaddps/sketches/ustNtD-Ek
