We propose a novel Graph Continuous Thought Machine (Graph CTM) architecture that integrates a simulated prefrontal cortex to enable adaptive problem-solving and decision-making. The Graph CTM leverages graph neural networks to process complex data streams, while the simulated prefrontal cortex modulates node activity to selectively focus on relevant information. Through reinforcement learning, the model navigates graph space to converge on optimal solutions, guided by the information contained in learnt node property vectors. The simulated prefrontal cortex regulates the flow of information by adjusting the disposition of nodes, which determines the next instantiation of the graph network. The Graph CTM incorporates an attention mechanism that takes the internal state of the graph as input and is modulated by outputs from the model’s neural synchronization matrix. This modulation enables the algorithm to selectively focus on specific subgraphs or node subsets, correlating them with the input, effectively emulating short-term and long-term memory mechanisms when attending to both the input and the internal representation. By dynamically weighting the importance of different graph components, the model adaptively processes and retains relevant information, facilitating more accurate and context-dependent decision-making.
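For concreteness, here is a minimal PyTorch sketch of one internal "thought" tick under stated assumptions: `GraphCTMStep`, the tensor shapes, and the correlation-based synchronization matrix are illustrative stand-ins of mine, not a published implementation. It shows the three moving parts described above: graph message passing updates node states, pairwise correlation of recent node activity forms a synchronization matrix, and that matrix gates the attention each node pays to the input.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphCTMStep(nn.Module):
    """One internal 'thought' tick: message passing plus
    synchronization-gated attention over the input tokens."""
    def __init__(self, d_node, d_in):
        super().__init__()
        self.msg = nn.Linear(d_node, d_node)           # graph message function
        self.upd = nn.GRUCell(d_node + d_in, d_node)   # node-state update
        self.q = nn.Linear(d_node, d_in)               # query from internal state
        self.gate = nn.Linear(1, 1)                    # sync statistic -> focus gate

    def forward(self, h, adj, x_tokens, history):
        # h: (N, d_node) node states; adj: (N, N) adjacency;
        # x_tokens: (T, d_in) input; history: (N, K) recent node activations.
        m = torch.relu(adj @ self.msg(h))                        # message passing
        # Synchronization matrix: pairwise correlation of node activity over time.
        z = F.normalize(history - history.mean(1, keepdim=True), dim=1)
        sync = z @ z.T                                           # (N, N)
        focus = torch.sigmoid(self.gate(sync.mean(1, keepdim=True)))  # (N, 1)
        # Attention over the input, scaled per node by its synchronization,
        # playing the 'prefrontal' role of selecting what each node reads.
        att = torch.softmax(self.q(h) @ x_tokens.T, dim=-1)      # (N, T)
        read = focus * (att @ x_tokens)                          # (N, d_in)
        return self.upd(torch.cat([m, read], dim=-1), h), sync
```

The design point this sketch tries to capture is that attention over the input is not free-running: it is scaled per node by how synchronized that node's recent activity is with the rest of the graph, which is one plausible reading of the short-term/long-term memory claim above.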
Very nice approach! Have you published the code?
Thank you! I’m working on it.
Text-Conditioned Self-Architecture Search for Building Brain-Like Connectivity by Describing It
Author: Tofara Moyo
Abstract
We propose a method by which the dispositional node connections of a neural Graph Continuous Thought Machine (GCTM) may be designed to be faithful to a human brain. A GCTM replaces the synapse- and neuron-level models with a graph CNN. In some sense, the nodes of the graph at any one time represent an instantiation of the nodes of the dispositional neural model it is part of, instantiating only those nodes that are currently firing. The GCNN then outputs the next graph as the system searches graph space for solutions, guided by learnt property vectors. The outputs from its neural synchronization matrix then modulate the attention given to the inputs as well as to the nodes of the dispositional network. In this way it designs the dispositional neural model's connections (the disposition for particular graphs to follow others). We then employ neural training modules: spiking neural networks whose nodes are mapped to keys on a musical keyboard. In particular, when exposed to the state of a teacher system the nodes are trained to harmonize musically, while when exposed to the state of the untrained agent they are dissonant. The agent then tries to maximise consonance in the spiking network by using it as a reward signal; by this method the agent is trained to perform like the teacher system. We introduce text-conditioned neural training modules, which condition the input on text. We thereby show a method to modulate not just the behavior of the system, but the connectivity of the dispositional network of a GCTM.
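To make the reward mechanism concrete, here is a minimal Python sketch of a consonance-based training signal, assuming an 88-key equal-tempered mapping and a simple-ratio consonance score. The names `key_freq`, `consonance`, and the `text_gate` scaling are hypothetical choices of mine; the abstract does not fix these details.

```python
# Map each spiking node to a piano key; equal temperament, key 49 = A4 = 440 Hz.
def key_freq(key_index, a4_index=49):
    return 440.0 * 2 ** ((key_index - a4_index) / 12)

def consonance(active_keys):
    """Score a set of simultaneously firing nodes by how 'harmonic' their
    frequency ratios are: small-integer ratios (octave, fifth, ...) score high."""
    if len(active_keys) < 2:
        return 0.0
    freqs = sorted(key_freq(k) for k in active_keys)
    simple = [1.0, 2.0, 3 / 2, 4 / 3, 5 / 4, 6 / 5]  # consonant interval ratios
    score, pairs = 0.0, 0
    for i in range(len(freqs)):
        for j in range(i + 1, len(freqs)):
            ratio = freqs[j] / freqs[i]
            # Penalize distance to the nearest simple consonant ratio.
            score += max(0.0, 1.0 - min(abs(ratio / s - 1.0) for s in simple) * 10)
            pairs += 1
    return score / pairs

def reward(spiking_state, text_gate):
    """Reward = consonance of the currently firing keys, scaled by a
    text-conditioned gate in [0, 1] (e.g. from an instruction embedding)."""
    active = [k for k, firing in enumerate(spiking_state) if firing]
    return text_gate * consonance(active)

# Example: a C-major-like triad (keys 40, 44, 47) scores higher than a
# dissonant chromatic cluster (keys 40, 41, 42).
print(reward([int(k in {40, 44, 47}) for k in range(1, 89)], text_gate=1.0))
print(reward([int(k in {40, 41, 42}) for k in range(1, 89)], text_gate=1.0))
```

In this toy version the triad scores roughly 0.9 per pair while the chromatic cluster scores under 0.5, so an agent maximising this reward is pushed toward the harmonized spiking patterns that, per the abstract, the teacher-trained module produces.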