Logic, Creative & Subconscious Disk Layers: A Gated Subconscious-Generating Multilayer Perceptron
ThinkBe is developing a new gated, auto-generated multilayer perceptron built on gMLP, an architecture introduced by Google Research, to compete with transformers for large language models while using fewer resources.
We are combining gMLP with an entirely new way to understand, build, and train neural networks, aimed at higher-level feature building with self-correction.
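For reference, the gMLP building block mentioned above can be sketched in plain NumPy. This is a minimal illustration of the published gMLP design (channel projection, GELU, and a Spatial Gating Unit that mixes information across token positions), not ThinkBe's implementation; all layer sizes, initializations, and function names here are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def layer_norm(x, eps=1e-5):
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def spatial_gating_unit(z, W_s, b_s):
    # Split channels into a gate path u and a spatial path v,
    # then mix v ACROSS token positions with a learned matrix W_s.
    u, v = np.split(z, 2, axis=-1)
    v = layer_norm(v)
    v = W_s @ v + b_s
    return u * v  # elementwise gating

def gmlp_block(x, W_in, W_s, b_s, W_out):
    # norm -> expand channels -> GELU -> spatial gating -> project back,
    # with a residual connection around the whole block
    shortcut = x
    z = gelu(layer_norm(x) @ W_in)
    z = spatial_gating_unit(z, W_s, b_s)
    return shortcut + z @ W_out

# Illustrative sizes: 8 tokens, model width 16, hidden width 32
n, d, d_ffn = 8, 16, 32
x = rng.standard_normal((n, d))
W_in = rng.standard_normal((d, d_ffn)) * 0.1
W_s = np.zeros((n, n))          # spatial weights start near zero
b_s = np.ones((n, 1))           # bias starts at 1, so the SGU is near-identity
W_out = rng.standard_normal((d_ffn // 2, d)) * 0.1

y = gmlp_block(x, W_in, W_s, b_s, W_out)
print(y.shape)  # (8, 16)
```

Because the spatial projection acts on a fixed number of token positions rather than on pairwise attention scores, a stack of such blocks scales linearly in sequence length per layer, which is the source of the resource advantage over transformers claimed above.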