09:13 CET · Wednesday, May 13, 2026

shipfeed

§ feed · cluster

Zyphra Releases ZAYA1-8B: A Reasoning MoE Trained on AMD Hardware That Punches Far Above Its Weight Class

May 7 · primary fetch · 1 source · cluster 5039b3dd · updated May 7

Zyphra released ZAYA1-8B, a small mixture-of-experts (MoE) reasoning model with 760M active parameters, trained on AMD hardware. It outperforms larger models on math and coding benchmarks and is available under the Apache 2.0 license.

read full article on marktechpost.com
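The headline hinges on the gap between total and active parameters: in an MoE model, a router sends each token to only a few experts, so the compute per token tracks the much smaller active count. The sketch below is a toy illustration of that arithmetic, not Zyphra's architecture; all expert counts and sizes are hypothetical.

```python
# Toy illustration (hypothetical numbers, not Zyphra's config): in a
# mixture-of-experts model, only the router-selected top_k experts run
# per token, so active parameters are far fewer than total parameters.
def moe_param_counts(n_experts, expert_params, shared_params, top_k):
    """Return (total, active) parameter counts for a simplified MoE."""
    total = shared_params + n_experts * expert_params
    active = shared_params + top_k * expert_params  # only top_k experts fire
    return total, active

# Illustrative values chosen to land near the "~8B total, ~760M active" regime.
total, active = moe_param_counts(n_experts=64, expert_params=115_000_000,
                                 shared_params=400_000_000, top_k=3)
print(f"total ~ {total / 1e9:.1f}B, active ~ {active / 1e6:.0f}M")
```

Under these made-up numbers, total comes to about 7.8B parameters while only about 745M are active per token, which is how a "small" model can sit on a large parameter pool.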
§ sources · 1 publication · timeline below
  1. marktechpost.com — Zyphra Releases ZAYA1-8B: A Reasoning MoE Trained on AMD Hardware That Punches Far Above Its Weight Class (primary)