DeepSeek Releases MoE EP Communication Library DeepEP – Truly Open!

DeepSeek has released DeepEP, an expert parallelism (EP) communication library for MoE model training and inference. The open-source project improves communication performance with NVLink, RDMA, and FP8 support.

Meng Li
Feb 25, 2025
"AI Disruption" publication New Year 30% discount link.


Last Friday, DeepSeek tweeted that this week would be OpenSourceWeek and that they would release five software libraries in succession.

Yesterday, they released the first codebase, FlashMLA: an efficient MLA decoding kernel for Hopper GPUs that garnered nearly 8k GitHub stars in just 24 hours.

Related: DeepSeek Releases FlashMLA, Boosting H800 GPU Performance (Meng Li, Feb 24)

Today, DeepSeek continues to innovate at the infrastructure level. Its latest open-source project, DeepEP, is the first EP communication library for MoE model training and inference.
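
To make "EP communication" concrete: in expert parallelism, each GPU hosts a subset of an MoE layer's experts, so every forward pass must shuffle tokens to the GPUs that own their routed experts (dispatch) and shuffle the expert outputs back (combine). The sketch below illustrates that pattern with plain torch.distributed all_to_all calls. It is a conceptual illustration only, not DeepEP's API; the function name, shapes, and uniform-token-count assumption are mine.

# Conceptual sketch only: the token dispatch/combine pattern that an EP
# communication library handles for MoE layers. Plain torch.distributed
# all_to_all is used here; this is NOT DeepEP's API, and the uniform
# token-distribution assumption below is a simplification.
import torch
import torch.distributed as dist

def ep_dispatch_combine(tokens, dest_rank, world_size):
    # tokens: [num_tokens, hidden]; dest_rank: [num_tokens], the rank that
    # hosts each token's selected expert.
    order = torch.argsort(dest_rank)
    counts = torch.bincount(dest_rank, minlength=world_size).tolist()
    send_chunks = list(torch.split(tokens[order], counts))

    # Dispatch: send each token to the rank owning its expert. A real
    # implementation exchanges `counts` first to size the receive buffers;
    # here we assume every rank sends and receives the same counts.
    recv_chunks = [torch.empty_like(c) for c in send_chunks]
    dist.all_to_all(recv_chunks, send_chunks)

    expert_out = [c * 2.0 for c in recv_chunks]  # stand-in for local expert FFNs

    # Combine: send expert outputs back to the ranks the tokens came from.
    back_chunks = [torch.empty_like(c) for c in expert_out]
    dist.all_to_all(back_chunks, expert_out)

    # Restore the original token order.
    return torch.cat(back_chunks)[torch.argsort(order)]

According to the description above, DeepEP targets exactly this exchange, replacing the generic all-to-all with kernels optimized for NVLink within a node and RDMA across nodes, plus FP8 support to reduce bandwidth.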
