Ai2 Releases BAR: Modular MoE Post-Training for LLM Domain Updates
The Allen Institute for AI (Ai2) has released BAR (Branch-Adapt-Route) — a modular post-training recipe that trains domain experts separately and merges them into a unified mixture-of-experts (MoE) model via a learned router, so updating one domain leaves the others untouched. Applied to OLMo 2 7B, BAR-5x7B achieves +16.5 coding points and +13 math points with only linear per-domain update cost, versus quadratic for monolithic post-training. The full checkpoint suite is released under Apache 2.0, including the base model, all domain experts, and the final 5-expert MoE with learned router.
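The core idea — frozen per-domain experts combined by a single learned router — can be sketched in a few lines. The class, shapes, and linear experts below are illustrative assumptions for exposition, not Ai2's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class BranchAdaptRouteSketch:
    """Toy forward pass: per-domain expert weights stay frozen after
    their separate 'branch-adapt' training; only the router is learned
    at merge time (hypothetical sketch, not Ai2's implementation)."""

    def __init__(self, d_model: int, n_experts: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # Each expert: a frozen linear map standing in for a
        # domain-specialized feed-forward block.
        self.experts = [rng.standard_normal((d_model, d_model)) * 0.02
                        for _ in range(n_experts)]
        # The only trainable part after merging: token -> expert scores.
        self.router = rng.standard_normal((d_model, n_experts)) * 0.02

    def forward(self, h: np.ndarray) -> np.ndarray:
        # h: (tokens, d_model). Softmax gate mixes expert outputs per token.
        gate = softmax(h @ self.router)                         # (tokens, E)
        outs = np.stack([h @ W for W in self.experts], axis=1)  # (tokens, E, d)
        return (gate[..., None] * outs).sum(axis=1)             # (tokens, d)

moe = BranchAdaptRouteSketch(d_model=16, n_experts=5)
y = moe.forward(np.random.default_rng(1).standard_normal((4, 16)))
print(y.shape)  # (4, 16)
```

Because the experts are frozen, swapping in a retrained coding expert only requires re-fitting (or fine-tuning) the router — the linear-cost update the article describes.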
Why It Matters
Linear-cost domain updates make targeted LLM specialization economically viable for teams that need to update one capability (e.g., code, legal) without retraining the full model — a direct path to continuously adapted production models.