Their “neuromorphic computing” term must mean something similar to HTMs. That’s usually done with spiking neural nets, yeah? AFAIK those are more ‘biologically accurate’ in that they simulate membrane potentials and spike timing, but they need specialized hardware to run efficiently?
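For anyone curious, the basic spiking unit is simple to sketch. Here’s a toy leaky integrate-and-fire neuron in Python (all parameter values are made up for illustration, not taken from any real neuromorphic chip):

```python
import numpy as np

def lif(currents, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Toy leaky integrate-and-fire neuron: the membrane voltage leaks
    toward rest, integrates input current, and emits a spike (then resets)
    whenever it crosses the threshold."""
    v = v_rest
    spikes = []
    for i_in in currents:
        v += (-(v - v_rest) + i_in) * dt / tau  # leak + integrate
        if v >= v_thresh:
            spikes.append(1)  # fire
            v = v_reset       # and reset
        else:
            spikes.append(0)
    return np.array(spikes)

# Constant drive above threshold produces regular spiking.
out = lif(np.full(200, 1.5))
print(out.sum())
```

The appeal for hardware is that the “state” is just one voltage per neuron and communication is sparse binary spikes, which is what neuromorphic chips exploit.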
Reminds me of a video from around 2014, I think, where a woman from Numenta presented an HTM that mapped human gestures to drone controls, a left/right/neutral classifier, I believe.
I feel like encoding gestures as input for augmented body structures/tools is starting down the wrong path, though. Surely there’s more potential in BCI input, although I know that’s a much tougher task to develop.
I found a company that makes ‘dry electrode’ wearable EEG headgear, and they have software that predicts one of four mental states (agitated, thinking hard, relaxed, etc.?) from live EEG input. Probably a (convolutional?) neural net. EEG is a much noisier, lower-resolution signal than implanted electrodes like Musk’s Neuralink, but I don’t think we have a functioning example of that yet.
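A classifier like that could be as simple as a small convnet over fixed-length EEG windows. Here’s a minimal numpy sketch of just the forward pass; the channel count, window length, filter sizes, and the four class labels are all my guesses, and the random weights stand in for whatever trained model they actually use:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend input: one 2-second window of 8-channel EEG at 128 Hz.
window = rng.standard_normal((8, 256))

# Random weights standing in for a trained model.
conv_w = rng.standard_normal((16, 8, 7)) * 0.1  # 16 filters, width 7, over 8 channels
fc_w = rng.standard_normal((4, 16)) * 0.1       # 4 hypothetical mental-state classes

def forward(x):
    # 1-D convolution over the time axis, valid padding.
    n_f, n_ch, k = conv_w.shape
    t_out = x.shape[1] - k + 1
    feat = np.zeros((n_f, t_out))
    for f in range(n_f):
        for t in range(t_out):
            feat[f, t] = np.sum(conv_w[f] * x[:, t:t + k])
    feat = np.maximum(feat, 0)   # ReLU
    pooled = feat.mean(axis=1)   # global average pool -> (16,)
    logits = fc_w @ pooled       # -> (4,)
    e = np.exp(logits - logits.max())
    return e / e.sum()           # softmax over the 4 states

probs = forward(window)
print(probs.argmax())  # predicted mental-state index
```

In practice you’d use a framework like PyTorch and train on labeled recordings, but the shape of the problem (short multichannel window in, one of a few states out) is about this simple.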
I don’t foresee much trouble encoding EEG signals, really. The tough part would be training a human to reliably produce a mental state that triggers the machine’s commands, and to mentally associate that state with the machine’s movement.
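To back up the claim that the encoding side is the easy bit: the standard move is band-power features per channel. A quick sketch (the band edges are the usual textbook delta/theta/alpha/beta ranges, nothing specific to any product):

```python
import numpy as np

def bandpower_features(eeg, fs=128.0):
    """Mean spectral power per channel in the classic EEG frequency bands."""
    bands = {"delta": (1, 4), "theta": (4, 8),
             "alpha": (8, 13), "beta": (13, 30)}
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2  # crude power spectrum
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.stack(feats, axis=1)  # shape: (channels, 4 bands)

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 256))  # 8 channels, 2 s at 128 Hz
f = bandpower_features(x)
print(f.shape)  # (8, 4)
```

A feature vector like that per window is all a downstream classifier needs; the human learning to *produce* a separable pattern on demand is the part no amount of signal processing fixes.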