DeepSeek Releases MoE EP Communication Library DeepEP – Truly Open!
DeepSeek has released DeepEP, a communication library for expert parallelism (EP) in MoE model training and inference. The open-source project supports NVLink and RDMA interconnects as well as low-precision FP8 operation.
"AI Disruption" publication New Year 30% discount link.
Last Friday, DeepSeek tweeted that this week would be OpenSourceWeek and that they would release five software libraries in succession.
Yesterday, they released the first codebase – FlashMLA, an efficient MLA decoding kernel for Hopper GPUs that garnered nearly 8k stars within 24 hours.
Today, DeepSeek continues to open up its underlying infrastructure. The second release is DeepEP, the first open-source EP communication library for MoE model training and inference.
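For context on what an EP communication library does: in expert parallelism, each GPU hosts only a subset of the MoE experts, so every MoE layer needs an all-to-all exchange to dispatch tokens to the GPUs holding their assigned experts and then combine the expert outputs back. The sketch below illustrates that dispatch/combine pattern with plain `torch.distributed`; it does not use DeepEP's own API, and the function name, the even token split, and the "multiply by two" expert stand-in are simplifying assumptions for illustration only.

```python
# Generic sketch of the EP dispatch/combine pattern that libraries like DeepEP
# accelerate. Uses plain torch.distributed all_to_all, NOT DeepEP's kernels.
import torch
import torch.distributed as dist


def ep_dispatch_combine(local_tokens: torch.Tensor) -> torch.Tensor:
    """Dispatch tokens to all ranks, run a placeholder 'expert', combine back.

    Assumes the number of local tokens divides evenly by the world size so
    that every exchanged chunk has the same shape.
    """
    world_size = dist.get_world_size()

    # Dispatch: split this rank's tokens so each rank receives one chunk.
    send_chunks = list(local_tokens.chunk(world_size, dim=0))
    recv_chunks = [torch.empty_like(c) for c in send_chunks]
    dist.all_to_all(recv_chunks, send_chunks)

    # Placeholder for expert compute: a real MoE layer would apply the
    # locally hosted experts to the received tokens here.
    expert_out = [c * 2.0 for c in recv_chunks]

    # Combine: route expert outputs back to the ranks that own the tokens.
    combined = [torch.empty_like(c) for c in expert_out]
    dist.all_to_all(combined, expert_out)
    return torch.cat(combined, dim=0)


if __name__ == "__main__":
    # Launch with e.g.: torchrun --nproc_per_node=2 ep_sketch.py (requires GPUs)
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())
    tokens = torch.randn(8, 16, device="cuda")  # 8 tokens, hidden size 16
    out = ep_dispatch_combine(tokens)
    print(dist.get_rank(), out.shape)
    dist.destroy_process_group()
```

DeepEP's contribution is making exactly this exchange fast: fused dispatch/combine kernels tuned for NVLink within a node and RDMA across nodes, with FP8 support to cut communication volume.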