Synthetic Intelligence Researcher
CEO @ Iseer & Co.
C++ · Rust · Assembly
PyTorch · CUDA · JAX
We introduce a unified framework combining classical cybernetic feedback with neural learning for adaptive control. The architecture integrates real-time feedback processing with parameterized neural control laws, enabling online refinement of performance in nonstationary environments.
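As a concrete illustration of this idea, the sketch below pairs a classical proportional feedback term with a small neural correction that is refined online by gradient descent on the tracking error. The class name, gain, network size, and toy plant dynamics are assumptions made for illustration, not details from the paper.

```python
# A minimal PyTorch sketch of the idea above: a classical proportional feedback
# term combined with a parameterized neural correction that is refined online.
# Class names, the gain, the network size, and the toy plant are assumptions.
import torch
import torch.nn as nn

class NeuralFeedbackController(nn.Module):
    """Hypothetical control law: u = k_p * error + f_theta(state, error)."""
    def __init__(self, dim: int, kp: float = 1.0):
        super().__init__()
        self.kp = kp  # classical proportional gain (assumed fixed)
        self.correction = nn.Sequential(      # parameterized neural control law
            nn.Linear(2 * dim, 64),
            nn.Tanh(),
            nn.Linear(64, dim),
        )

    def forward(self, state: torch.Tensor, error: torch.Tensor) -> torch.Tensor:
        # Classical feedback plus a learned, state-dependent correction.
        return self.kp * error + self.correction(torch.cat([state, error], dim=-1))

# Online refinement loop (sketch): adapt the neural term to shrink tracking error.
controller = NeuralFeedbackController(dim=4)
opt = torch.optim.SGD(controller.correction.parameters(), lr=1e-3)
state, target = torch.randn(4), torch.zeros(4)
for _ in range(100):                 # each step: observe, act, adapt
    error = target - state
    action = controller(state, error)
    state = state + 0.1 * action     # stand-in plant dynamics for illustration
    loss = (target - state).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    state = state.detach()           # truncate the graph between control steps
```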
We introduce Iseer, a language model architecture that integrates selective state space models with sparse mixture-of-experts layers, achieving sub-quadratic complexity without sacrificing representational capacity. Unlike attention-based transformers, Iseer processes sequences in O(n) time through learned, input-dependent gating.
We train Iseer at scales from 20M to 300M parameters on balanced English-Bengali corpora. Empirical evaluation demonstrates that Iseer achieves perplexity competitive with dense transformer baselines while requiring approximately 40% fewer FLOPs during training.
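The sketch below illustrates the two components named in the abstract: a selective (input-gated) state-space recurrence that processes the sequence in a single O(n) pass, followed by a sparse top-1 mixture-of-experts feed-forward layer. It is a minimal illustration under assumed dimensions, gating form, and routing scheme, not the actual Iseer implementation.

```python
# Illustrative PyTorch sketch: selective state-space recurrence + sparse MoE.
# Dimensions, gating form, and top-1 routing are assumptions, not Iseer's design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveSSM(nn.Module):
    """Diagonal state recurrence whose decay and write gates depend on the input."""
    def __init__(self, dim: int):
        super().__init__()
        self.decay_proj = nn.Linear(dim, dim)  # input-dependent forget gate
        self.write_proj = nn.Linear(dim, dim)  # input-dependent write gate
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, seq, dim)
        b, n, d = x.shape
        h = x.new_zeros(b, d)
        outs = []
        for t in range(n):  # single pass over the sequence -> O(n) time
            a = torch.sigmoid(self.decay_proj(x[:, t]))   # learned, input-dependent gating
            g = torch.sigmoid(self.write_proj(x[:, t]))
            h = a * h + g * x[:, t]                       # selective state update
            outs.append(self.out_proj(h))
        return torch.stack(outs, dim=1)

class SparseMoE(nn.Module):
    """Top-1 routed feed-forward experts; only one expert runs per token."""
    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, seq, dim)
        flat = x.reshape(-1, x.shape[-1])
        scores = F.softmax(self.router(flat), dim=-1)
        top_score, top_idx = scores.max(dim=-1)           # top-1 routing
        out = torch.zeros_like(flat)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e
            if mask.any():                                # run each expert on its tokens only
                out[mask] = top_score[mask].unsqueeze(-1) * expert(flat[mask])
        return out.reshape_as(x)

# One hypothetical block: selective SSM mixing followed by a sparse MoE layer.
x = torch.randn(2, 16, 64)
y = SparseMoE(64)(SelectiveSSM(64)(x))
print(y.shape)  # torch.Size([2, 16, 64])
```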
The future belongs to systems that can reason, adapt, and evolve. We move beyond static algorithms to create cybernetic organisms in code, bridging the gap between biological cognition and machine precision.