Could ANNs do meta-approximations?

I was watching a video from Hinton, and he says that ANNs can basically do anything. I believe what he means is that they can approximate any function. A deep feedforward net maps inputs to outputs; recurrent nets essentially do the same thing but over time. I wonder: could some architecture be used to approximate gradient descent (or some other optimization)? This would mean optimizing an optimizer.
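This "learning to learn" idea has been explored, e.g. training an LSTM to output the parameter updates of another network (Andrychowicz et al., 2016). As a toy sketch of the general principle, assuming a made-up quadratic inner problem: the update rule below is itself a tiny one-unit network, and its meta-parameters are tuned by random search instead of by hand. Every name and hyperparameter here is illustrative, not from any particular paper.

```python
# A toy sketch of "optimizing an optimizer": the update rule is itself a tiny
# network (one tanh unit), and its meta-parameters are found by random search.
# Task, architecture, and hyperparameters are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def inner_loss(x):
    return (x - 3.0) ** 2            # toy inner problem: minimum at x = 3

def inner_grad(x):
    return 2.0 * (x - 3.0)

def learned_update(g, meta):
    # A one-unit "optimizer network" mapping the gradient to an update step.
    w1, b1, w2 = meta
    return w2 * np.tanh(w1 * g + b1)

starts = rng.normal(0.0, 5.0, size=10)   # fixed set of starting points

def meta_loss(meta, steps=40):
    """Run the learned optimizer on the inner problem; return the mean
    final loss over the starting points (lower is better)."""
    total = 0.0
    for x0 in starts:
        x = x0
        for _ in range(steps):
            x = x - learned_update(inner_grad(x), meta)
        total += inner_loss(x)
    return total / len(starts)

# Meta-optimization: random search over the update rule's three parameters.
best_meta, best = None, np.inf
for _ in range(300):
    meta = rng.uniform(-1.0, 1.0, size=3)
    score = meta_loss(meta)
    if score < best:
        best_meta, best = meta, score

print("learned optimizer params:", best_meta, "mean final loss:", best)
```

The same structure scales up: replace the one-unit update rule with a recurrent net and the random search with a proper outer optimizer, and you have a network approximating an optimization algorithm.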

Sorry Matt, I couldn’t resist when thinking about meta
[image]


Thanks for picking a decent picture; I know what you had to choose from.


Whatever happened to your man bun? You were rockin that style!

Neuroevolution can be used instead of backpropagation: a reward (fitness) function is optimized directly, without computing any gradients, and it also works with unsupervised learning.
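To make that concrete, here is a minimal sketch of neuroevolution, assuming a simple (1+λ) evolution strategy: weights are perturbed with Gaussian noise and offspring are selected by a scalar fitness, so no backpropagation is ever run. The XOR task and all hyperparameters are illustrative choices, not taken from the videos linked below.

```python
# Minimal neuroevolution sketch: a (1+lambda) evolution strategy evolves the
# weights of a tiny 2-4-1 network to solve XOR, using only a fitness signal.
import numpy as np

rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def forward(w, x):
    """2-4-1 network; w is a flat vector of 17 weights and biases."""
    W1 = w[:8].reshape(2, 4); b1 = w[8:12]
    W2 = w[12:16];            b2 = w[16]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def fitness(w):
    # Reward is negative squared error; any scalar signal would work.
    return -np.sum((forward(w, X) - y) ** 2)

parent = rng.normal(0.0, 0.5, size=17)
for gen in range(600):
    children = parent + rng.normal(0.0, 0.1, size=(20, 17))  # mutate
    scores = np.array([fitness(c) for c in children])
    best = children[np.argmax(scores)]
    if fitness(best) > fitness(parent):                      # select
        parent = best

print("predictions:", forward(parent, X).round(2), "fitness:", fitness(parent))
```

Because selection only needs a fitness score, the same loop works when no labeled targets exist, which is why this family of methods pairs naturally with reinforcement and unsupervised settings.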

Neuroevolution: A different kind of deep learning

Modeling Evolution with Tensorflow.js

Code vs Data (Metaprogramming) - Computerphile