Zyra Releases Zia 1 8B: First Frontier-Tier Model Trained Entirely on AMD
Zyra has released Zia 1 8B under the Apache 2.0 license, billed as the first frontier-capable model trained entirely on AMD Instinct GPUs rather than NVIDIA hardware, demonstrating competitive results at the highest performance tier. The model combines a Mixture of Experts architecture with "Markovian RSA" multi-attempt reasoning (sampling candidate reasoning fragments each round and propagating only the best ones forward), compressed convolutional attention, and learned residual scaling. At ~17.7 GB of total weights it fits on consumer hardware, and the published benchmarks claim it competes with Qwen 3 Thinking (235B), DeepSeek 3.2, and GPT-5 on math, code, and reasoning tasks.
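The multi-attempt reasoning loop described above can be sketched at a high level. This is a minimal illustration, not Zyra's implementation: the function and scorer names are hypothetical, and it assumes only what the article states, that each round samples several candidate reasoning fragments and carries forward just the best one (the Markov property: later rounds depend only on the accumulated best fragments, not on discarded candidates).

```python
import random

def markovian_multi_attempt(prompt, generate, score, rounds=3, samples=4):
    """Hypothetical sketch of Markovian multi-attempt reasoning:
    each round samples `samples` candidate fragments, keeps only the
    best-scoring one, and conditions the next round on that fragment."""
    state = prompt
    for _ in range(rounds):
        candidates = [generate(state) for _ in range(samples)]
        best = max(candidates, key=score)
        # Markov step: discard the other candidates; the next round
        # sees only the prompt plus the best fragments so far.
        state = state + "\n" + best
    return state

# Toy stand-ins for a model sampler and a verifier-style scorer
# (illustrative only; a real system would use the model itself).
random.seed(0)

def toy_generate(state):
    return "step-" + str(random.randint(0, 9))

def toy_score(fragment):
    return int(fragment.rsplit("-", 1)[1])  # prefer higher-scored fragments

result = markovian_multi_attempt("Q: 2+2?", toy_generate, toy_score,
                                 rounds=2, samples=3)
```

In this sketch the chain grows by exactly one fragment per round, so compute scales with rounds times samples while context length scales only with rounds.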
Why It Matters
Frontier-tier results from an 8B model trained entirely on AMD hardware show that NVIDIA is not a prerequisite for frontier AI training. That is a supply-chain diversification story with significant geopolitical and commercial implications for AI infrastructure independence.