AI Disruption
Densing Law Discovers LLM's Maximum Capability Density Doubles Every 100 Days!

Discover how the Densing Law redefines AI development with exponential growth in capability density, cost reductions, and the rise of edge intelligence.

Meng Li
Dec 09, 2024

Is the Scaling Law That Has Driven the Rise of Large Models Coming to an End?

Recently, the AI community has been divided on whether the Scaling Law has reached its limit. On one side, some argue that the Scaling Law has hit a "wall." On the other, proponents like OpenAI CEO Sam Altman maintain that the potential of the Scaling Law is far from exhausted.

Related: OpenAI Shifts Next-Gen Model Strategy: Has the Scaling Law Hit a Wall? (Meng Li, November 11, 2024)

At the heart of the debate lies a key question: can large models keep achieving breakthroughs merely by scaling up data and parameter counts?

However, the Scaling Law is not the only perspective. Recently, a research team led by Professor Liu Zhiyuan at Tsinghua University proposed the Densing Law, a new principle describing exponential growth in model capability density over time. Their findings indicate that since 2023, the capability density of models has been doubling approximately every 3.3 months (about 100 days). In other words, every 100 days a model with half the parameters can match the performance of today's state-of-the-art models.
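The arithmetic behind that claim can be sketched in a few lines. This is an illustrative sketch only: the 100-day doubling period is the team's reported estimate, and the 70B-parameter model is a hypothetical example, not one from the paper.

```python
DOUBLING_DAYS = 100  # reported Densing Law estimate: density doubles every ~3.3 months

def equivalent_params(base_params: float, days_elapsed: float) -> float:
    """Parameter count needed after `days_elapsed` days to match a
    base model's performance, assuming exponential density growth."""
    return base_params / 2 ** (days_elapsed / DOUBLING_DAYS)

# A hypothetical 70B-parameter model could, under this trend, be matched
# by a ~35B model after 100 days and a ~17.5B model after 200 days.
print(equivalent_params(70e9, 100) / 1e9)  # -> 35.0
print(equivalent_params(70e9, 200) / 1e9)  # -> 17.5
```

Each additional 100 days halves the parameter budget required for the same capability, which is what "capability density doubles" means in practice.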

Key Insights from the Densing Law

Based on the Densing Law, the research team derived several important conclusions, showing that the three core drivers of the modern AI era (electricity, computing, and intelligence) all follow similar trends of rapid density growth:

  • Conclusion 1: Model inference costs are decreasing exponentially over time.

  • Conclusion 2: The capability density of large models is accelerating.

  • Conclusion 3: Model miniaturization unveils enormous potential for edge intelligence.

  • Conclusion 4: Model compression algorithms alone cannot significantly enhance capability density.

  • Conclusion 5: The cost-effectiveness lifespan of high-performing models is rapidly shortening.

The law also underscores the significant potential for edge intelligence and calls for continued exploration of scientific methodologies for building large models. Improving model design processes will be key to achieving high-quality and sustainable development of AI models.
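Conclusions 1 and 5 follow directly from the doubling rate. A minimal sketch of both, under two stated assumptions: the ~100-day doubling period holds, and inference cost scales roughly linearly with the parameter count needed for a fixed capability level (a simplification, not a claim from the paper):

```python
import math

DOUBLING_DAYS = 100  # assumed Densing Law doubling period

def days_until_matched(size_ratio: float) -> float:
    """Days until a model `size_ratio` times smaller reaches the same
    capability, under the exponential density-growth assumption."""
    return DOUBLING_DAYS * math.log2(size_ratio)

def relative_inference_cost(days_elapsed: float) -> float:
    """Relative cost of a fixed capability level after `days_elapsed` days,
    assuming cost tracks the shrinking parameter count required."""
    return 2 ** (-days_elapsed / DOUBLING_DAYS)

print(days_until_matched(8))         # a model 8x smaller catches up in ~300 days
print(relative_inference_cost(300))  # the same capability then costs ~1/8 as much
```

On this toy model, a flagship model's cost-effectiveness edge over smaller successors erodes on a timescale of months, which is the intuition behind the "rapidly shortening lifespan" in Conclusion 5.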

© 2025 Meng Li