AI Disruption

DeepSeek Releases MoE EP Communication Library DeepEP – Truly Open!

DeepSeek has released DeepEP, an open-source expert-parallelism (EP) communication library for MoE model training and inference, with support for NVLink, RDMA, and FP8.

Meng Li
Feb 25, 2025

"AI Disruption" publication New Year 30% discount link.


Last Friday, DeepSeek announced on X that this week would be OpenSourceWeek, with five software libraries to be released in succession.

Yesterday, they released the first codebase, FlashMLA: an efficient MLA decoding kernel for Hopper GPUs that garnered nearly 8k GitHub stars in just 24 hours.

Related: DeepSeek Releases FlashMLA, Boosting H800 GPU Performance (Meng Li, February 24, 2025)

Today, DeepSeek continues to open up its underlying architecture: the new open-source project is DeepEP, the first EP (expert parallelism) communication library for MoE model training and inference.
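To make "EP communication" concrete: in expert-parallel MoE, each token is routed by a gating network to an expert that may live on a different device, so every layer needs an all-to-all dispatch of tokens to experts and a reverse combine to restore the original order. The sketch below is not DeepEP's API; it simulates that dispatch/combine pattern on a single process with plain Python lists so the data flow is visible (a doubling function stands in for an expert's computation).

```python
# Conceptual sketch of the MoE expert-parallel dispatch/combine pattern.
# NOT DeepEP's API -- in a real system each buffer lives on a different
# GPU and the exchanges below are all-to-all collectives over NVLink/RDMA.

def dispatch(tokens, expert_ids, num_experts):
    """Route each token to the buffer of its assigned expert,
    remembering its original position so combine can restore order."""
    buffers = [[] for _ in range(num_experts)]
    for idx, (tok, eid) in enumerate(zip(tokens, expert_ids)):
        buffers[eid].append((idx, tok))
    return buffers

def combine(buffers, num_tokens):
    """Run each expert on its buffer (here: a stand-in that doubles the
    value), then scatter results back into the original token order --
    the reverse all-to-all."""
    out = [None] * num_tokens
    for buf in buffers:
        for idx, tok in buf:
            out[idx] = tok * 2  # stand-in for the expert's computation
    return out

tokens = [1.0, 2.0, 3.0, 4.0]
expert_ids = [1, 0, 1, 2]  # gating decision per token (hypothetical)
buffers = dispatch(tokens, expert_ids, num_experts=3)
result = combine(buffers, num_tokens=len(tokens))
print(result)  # -> [2.0, 4.0, 6.0, 8.0]
```

Libraries like DeepEP exist because, at scale, these two exchanges dominate MoE step time; the performance work lies in overlapping them with compute and exploiting NVLink within a node and RDMA across nodes.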
