# Dense LLM, MoE, CoE
[Chain-of-experts (CoE): A lower-cost LLM framework that increases efficiency and accuracy | VentureBeat](https://venturebeat.com/ai/chain-of-experts-coe-a-lower-cost-llm-framework-that-increases-efficiency-and-accuracy/)
- Dense LLM
	- the classic architecture: every parameter is activated for every token
- MoE: Mixture-of-Experts
	- DeepSeek-V3, (presumably) GPT-4o
	- the model's feed-forward layers are split into a set of experts; a router activates only a few of them per token, in parallel
- CoE: Chain-of-Experts
	- experts are activated sequentially, each step building on the previous step's output, instead of independently in parallel (see the sketch below)
[Chain-of-Experts: Unlocking the Communication Power of MoEs](https://sandy-server-87f.notion.site/Chain-of-Experts-Unlocking-the-Communication-Power-of-MoEs-1ab9bb750b7980048d43e6aab3537cea)
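A minimal PyTorch sketch of the routing difference. The class names, dimensions, and the residual chaining here are illustrative assumptions, not the actual DeepSeek-V3 or Chain-of-Experts code: the MoE layer routes each token to its top-k experts and sums their outputs in parallel, while the CoE layer runs the same expert pool over several sequential steps, re-routing on each intermediate result.

```python
# Toy sketch contrasting MoE and CoE routing. All names (ToyExpert, MoELayer,
# CoELayer) and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyExpert(nn.Module):
    """A small feed-forward expert network."""
    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.ff(x)

class MoELayer(nn.Module):
    """MoE: a router scores all experts, keeps the top-k per token, and
    combines their outputs in parallel as a weighted sum."""
    def __init__(self, dim: int, hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList([ToyExpert(dim, hidden) for _ in range(num_experts)])
        self.router = nn.Linear(dim, num_experts)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:   # x: (tokens, dim)
        scores = F.softmax(self.router(x), dim=-1)         # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)     # (tokens, top_k)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                      # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

class CoELayer(nn.Module):
    """CoE (simplified): the same expert pool is applied over several
    sequential steps; each step re-routes on the previous step's output,
    so later experts can build on earlier experts' work."""
    def __init__(self, dim: int, hidden: int, num_experts: int = 8,
                 top_k: int = 2, chain_steps: int = 2):
        super().__init__()
        self.moe = MoELayer(dim, hidden, num_experts, top_k)
        self.chain_steps = chain_steps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = x
        for _ in range(self.chain_steps):
            h = h + self.moe(h)    # residual connection between chained steps
        return h

tokens = torch.randn(4, 64)                    # 4 tokens, model dim 64
print(MoELayer(64, 128)(tokens).shape)         # torch.Size([4, 64])
print(CoELayer(64, 128)(tokens).shape)         # torch.Size([4, 64])
```

A faithful CoE implementation would differ in details (e.g., how routing is done at each chained step), but the sketch captures the parallel-vs-sequential contrast the notes above refer to.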
