Manifold Neural Nets

The idea I had is that it should be possible to evolve deep neural nets while restricting the weight parameters to a low-dimensional manifold.
That would obviously make the evolution process far faster, since the search happens in many fewer dimensions. It is not obvious, though, that the restricted system would be rich enough to produce effective results.
My personal feeling is that with deep enough nets good results might be obtained. However, I only just finished the code a few minutes ago, so I can't positively assert that it is so!
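To make the idea concrete, here is a minimal sketch of the sort of thing I have in mind. It assumes the simplest possible choice of manifold, a fixed random linear subspace mapped into the full weight space, and the simplest possible evolutionary step, a (1+1) hill climber; the XOR toy task and all the names are just for illustration, not the actual code I wrote.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn XOR with a tiny 2-4-1 MLP.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Full weight space: W1 (2x4), b1 (4), W2 (4x1), b2 (1) -> 17 parameters.
D = 2 * 4 + 4 + 4 * 1 + 1

# The "manifold": a random d-dimensional linear subspace of the full
# D-dimensional weight space, embedded by a fixed projection matrix P.
d = 6
P = rng.standard_normal((D, d)) / np.sqrt(d)

def unpack(w):
    # Split a flat D-vector into the network's weight tensors.
    W1 = w[:8].reshape(2, 4)
    b1 = w[8:12]
    W2 = w[12:16].reshape(4, 1)
    b2 = w[16]
    return W1, b1, W2, b2

def loss(theta):
    # Map low-dimensional coordinates theta onto the manifold,
    # then evaluate the resulting network on the task.
    W1, b1, W2, b2 = unpack(P @ theta)
    h = np.tanh(X @ W1 + b1)
    out = (h @ W2).ravel() + b2
    return np.mean((out - y) ** 2)

# (1+1) evolution: mutate in the low-dimensional space, keep improvements.
theta = np.zeros(d)
best = loss(theta)
for step in range(5000):
    cand = theta + 0.1 * rng.standard_normal(d)
    l = loss(cand)
    if l < best:
        theta, best = cand, l
```

The point is that mutation and selection only ever touch the 6 coordinates of `theta`, while the network still runs with all 17 weights; whether such a restricted search stays rich enough as the nets get deep is exactly the open question.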

You can say all this is very speculative, but there has actually been some work on weight sharing in neural networks, which is another such restriction to a lower-dimensional manifold in the weight space of the network.
Anyway, who knows? I'll run the code for a while and see.