ThinkBe is developing a new gated, auto-generated multilayer perceptron that uses three distinct new layer types, called Logic, Creative, and Subconscious: the Gated Subconscious Generating multilayer perceptron.
The Old & the New
MLPs are a very old concept, but they remain one of the most popular and effective ways to build and train a basic neural network.
Gated MLPs (gMLP) are a newer architecture from Google, designed to compete with transformers on large language models while using fewer resources.
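For readers unfamiliar with gMLP, its core idea is a "spatial gating unit": the channels of an expanded hidden representation are split in half, and one half gates the other after a learned projection across token positions. The sketch below is a simplified illustration of that mechanism in NumPy, not ThinkBe's implementation; all weight shapes and names are hypothetical, and normalization inside the gating unit is omitted for brevity.

```python
import numpy as np

def gelu(x):
    # Tanh approximation of the GELU activation.
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def spatial_gating_unit(z, W_spatial, b_spatial):
    # Split channels in half: one half gates the other after a
    # projection across the token (spatial) dimension.
    z1, z2 = np.split(z, 2, axis=-1)
    return z1 * (W_spatial @ z2 + b_spatial)

def gmlp_block(x, W_in, W_spatial, b_spatial, W_out):
    # x: (n_tokens, d_model)
    z = gelu(x @ W_in)                                  # expand channels
    gated = spatial_gating_unit(z, W_spatial, b_spatial)
    return x + gated @ W_out                            # residual connection

# Tiny example with random weights (hypothetical sizes).
rng = np.random.default_rng(0)
n, d, d_ff = 4, 8, 16
x = rng.normal(size=(n, d))
W_in = rng.normal(size=(d, d_ff)) * 0.1
W_spatial = np.eye(n) * 0.1        # spatial weights start near zero
b_spatial = np.ones((n, 1))        # gate bias starts at 1, so the gate
                                   # initially passes z1 almost unchanged
W_out = rng.normal(size=(d_ff // 2, d)) * 0.1
y = gmlp_block(x, W_in, W_spatial, b_spatial, W_out)
print(y.shape)  # (4, 8): same shape as the input, as with a transformer block
```

Because the spatial projection replaces self-attention, the block's cost per token is a fixed matrix multiply rather than attention over all pairs of tokens, which is where the resource savings come from.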
We are combining gMLP with an entirely new way to understand, build, and train neural networks, aimed at higher-level feature building with self-correction.