Neural networks are moving out to the edge!

Perhaps the advantages of HTM have a place in this market.

Machine learning is a rapidly growing field, especially as deep learning moves to the edge and more and more engineers build applications that include some form of vision- or voice-based machine learning technology.
http://app.contact.nxp.com/e/es.aspx?s=1764&e=837986&elqTrackId=da411e3d6d3b438ca942f6ba8ff6e877&elq=1d4f960fd4544e3c8e8bb294720df4c3&elqaid=8139&elqat=1

Something to think about.


An example of neuromorphic computing at the edge:

Robotic arm being made with neuromorphic engineering


Their “neuromorphic computing” term must mean something similar to HTMs. That’s done with spiking neural nets, yeah? AFAIK they’re more ‘biologically accurate’, actually simulating membrane dynamics, but they need specialized hardware to do it efficiently.
Reminds me of that … 2014, I think, video where someone from Numenta presented an HTM that classified human gestures into drone controls (a left, right, or neutral classifier, I believe).
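
For context, the “membrane simulation” in spiking nets usually boils down to something like a leaky integrate-and-fire neuron: the membrane voltage leaks toward a resting value, integrates input current, and emits a spike when it crosses a threshold. A minimal toy sketch of that update loop (all constants here are illustrative choices, not from any real neuromorphic chip):

```python
import numpy as np

def lif_simulate(input_current, dt=1e-3, tau=20e-3,
                 v_rest=-65.0, v_reset=-70.0, v_thresh=-50.0):
    """Integrate a current trace; return membrane voltages and spike times."""
    v = v_rest
    voltages, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: voltage decays toward rest, driven by input.
        dv = (-(v - v_rest) + i_in) / tau
        v += dv * dt
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset after spiking
        voltages.append(v)
    return np.array(voltages), spikes

# Constant drive for 200 ms produces a regular spike train.
v_trace, spike_times = lif_simulate(np.full(200, 20.0))
print(f"{len(spike_times)} spikes in 200 ms")
```

Neuromorphic hardware basically runs thousands of these update loops in parallel circuits instead of a Python loop, which is where the specialized silicon comes in.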

I feel like encoding gestures as input for augmented body structures/tools is starting down the wrong path, though. Surely there’s more potential in BCI input, although I know that’s a much tougher task to develop.

I found a company that makes ‘dry electrode’ wearable EEG headgear, and they have software that predicts one of four mental states (agitated, thinking hard, relaxed, etc.?) from live EEG input. Probably a (convolutional?) neural net. EEG input is less ‘sensitive’ than, for example, Musk’s Neuralink, but I don’t think we have a functioning example of that yet.
I don’t foresee any trouble encoding EEGs, really. The tough part would be ‘training’ a human to ‘feel’ a certain mind-state that triggers the machinery’s commands and to ‘associate’ it mentally with the machine’s movement.
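
On the encoding point: a per-band EEG power vector would drop straight into something like HTM’s scalar encoders. A minimal sketch of the idea (the band names, power ranges, and SDR sizes are made-up illustrative values, and this hand-rolled encoder only mimics the spirit of NuPIC’s ScalarEncoder):

```python
import numpy as np

def encode_scalar(value, lo, hi, n_bits=64, n_active=8):
    """Map a value in [lo, hi] to a contiguous block of active bits."""
    sdr = np.zeros(n_bits, dtype=np.uint8)
    frac = np.clip((value - lo) / (hi - lo), 0.0, 1.0)
    start = int(round(frac * (n_bits - n_active)))
    sdr[start:start + n_active] = 1
    return sdr

def encode_eeg(band_powers):
    """Concatenate per-band encodings into one SDR for an HTM-style model."""
    # Assumed power ranges (arbitrary units) per frequency band.
    ranges = {"delta": (0, 50), "theta": (0, 30),
              "alpha": (0, 20), "beta": (0, 15)}
    return np.concatenate([encode_scalar(band_powers[b], lo, hi)
                           for b, (lo, hi) in ranges.items()])

sdr = encode_eeg({"delta": 12.0, "theta": 8.5, "alpha": 15.2, "beta": 3.1})
print(sdr.shape, int(sdr.sum()))   # (256,) 32 -- fixed sparsity by design
```

The fixed sparsity is the point: downstream HTM layers expect a roughly constant number of active bits no matter what the input values are.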
