ThinkBe is developing a new gated, auto-generated multi-layer perceptron that uses unique new processes
called Logic, Creative & Subconscious
disk layers: the Subconscious Generating Neural Network (SCGNN).
The Old & the New
MLPs are a very old concept, but they remain one of the most popular and effective ways to train a basic neural network.
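To make the baseline concrete, here is a minimal sketch of a basic MLP forward pass (one hidden layer, ReLU activation). The function and variable names are illustrative, not ThinkBe's actual code.

```python
import numpy as np

def mlp_forward(x, w1, b1, w2, b2):
    """Basic MLP: linear -> ReLU -> linear."""
    h = np.maximum(0.0, x @ w1 + b1)  # hidden activations
    return h @ w2 + b2                # output logits

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))               # batch of 4 inputs, 8 features each
w1 = rng.normal(size=(8, 16)); b1 = np.zeros(16)
w2 = rng.normal(size=(16, 3)); b2 = np.zeros(3)
out = mlp_forward(x, w1, b1, w2, b2)
print(out.shape)  # (4, 3)
```

Everything else in modern deep learning is, in some sense, layered on top of this simple pattern.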
Gated MLPs (gMLP) are a newer architecture from Google Research, designed to compete with Transformers while using fewer resources.
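The core idea in gMLP is a spatial gating unit: split the channels in half and gate one half with a linear projection across the token dimension. The sketch below follows the published gMLP design in broad strokes; it omits layer normalization and other details, and all names and shapes here are illustrative assumptions, not ThinkBe's implementation.

```python
import numpy as np

def gelu(x):
    """Tanh approximation of the GELU activation."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def spatial_gating_unit(z, w_spatial, b_spatial):
    """Split channels in half; gate one half with a token-mixing projection."""
    u, v = np.split(z, 2, axis=-1)           # each half: (n_tokens, d_ffn // 2)
    v = w_spatial @ v + b_spatial[:, None]   # linear mix across the token axis
    return u * v                             # elementwise gate

def gmlp_block(x, w1, w2, w_spatial, b_spatial):
    z = gelu(x @ w1)                         # channel expansion
    z = spatial_gating_unit(z, w_spatial, b_spatial)
    return x + z @ w2                        # project back, residual connection

rng = np.random.default_rng(1)
n, d, d_ffn = 6, 8, 16                       # 6 tokens, 8 channels, 16 hidden
x = rng.normal(size=(n, d))
w1 = rng.normal(size=(d, d_ffn)) * 0.1
w2 = rng.normal(size=(d_ffn // 2, d)) * 0.1
w_spatial = rng.normal(size=(n, n)) * 0.1
b_spatial = np.ones(n)                       # gate bias initialized near 1
out = gmlp_block(x, w1, w2, w_spatial, b_spatial)
print(out.shape)  # (6, 8)
```

Because the token mixing is a fixed-size linear projection rather than attention, a gMLP block avoids the quadratic attention maps that make Transformers resource-hungry.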
We are combining gMLP with an entirely new way to understand, build, and train neural networks, aimed at higher-level feature building with self-correction.