ERNIE 4.5 300B A47B

ERNIE-4.5-300B-A47B is a 300B-parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool calling, and extended context lengths up to 131k tokens. It is suited to general-purpose LLM applications with high reasoning and throughput demands.
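To illustrate how an MoE model can hold 300B parameters while activating only about 47B per token, here is a minimal top-k routing sketch. This is an illustrative example only, not Baidu's implementation; the dimensions, the router weights, and the `route_topk` function are all assumptions made up for the sketch.

```python
import numpy as np

def route_topk(hidden: np.ndarray, router_w: np.ndarray, k: int = 2):
    """Score every expert for one token, keep only the top-k.

    Only the selected experts' FFN weights would run for this token,
    so the active parameter count stays a small fraction of the total.
    """
    logits = hidden @ router_w                   # one score per expert
    topk = np.argsort(logits)[-k:]               # indices of the k best experts
    gates = np.exp(logits[topk] - logits[topk].max())
    return topk, gates / gates.sum()             # softmax over selected experts

# Toy dimensions: 64-dim hidden state, 8 experts, pick 2.
rng = np.random.default_rng(0)
experts, gates = route_topk(rng.normal(size=64), rng.normal(size=(64, 8)), k=2)
print(experts, gates)  # two expert ids; gate weights sum to 1
```

Production MoE routers add load-balancing losses and capacity limits on top of this, but the select-then-gate pattern is the core idea.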

Input / 1M tokens: $0.280
Output / 1M tokens: $1.10
Context window: 123K tokens
Provider: Baidu
Knowledge cutoff: 2025-03-31
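At the listed rates ($0.280 per 1M input tokens, $1.10 per 1M output tokens), the cost of a single request is straightforward to estimate. The token counts below are hypothetical example values:

```python
INPUT_PER_M = 0.280   # USD per 1M input tokens (listed rate)
OUTPUT_PER_M = 1.10   # USD per 1M output tokens (listed rate)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost for one request at the listed rates."""
    return (input_tokens * INPUT_PER_M + output_tokens * OUTPUT_PER_M) / 1_000_000

# e.g. a 10k-token prompt with a 2k-token completion:
cost = request_cost(10_000, 2_000)
print(f"${cost:.4f}")  # 0.0028 + 0.0022 = $0.0050
```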

Performance

Median streaming throughput and first-token latency measured by Artificial Analysis.

Output tokens / sec: 29 t/s
Time to first token: 1.75s
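From the two medians above, a rough end-to-end latency estimate for a streamed completion is time-to-first-token plus output tokens divided by throughput. This is a back-of-envelope sketch; real latency varies with load, prompt length, and batching:

```python
TTFT_S = 1.75        # median time to first token, seconds
THROUGHPUT_TPS = 29  # median output tokens per second

def estimated_latency_s(output_tokens: int) -> float:
    """Approximate wall-clock seconds to stream a completion."""
    return TTFT_S + output_tokens / THROUGHPUT_TPS

# e.g. a 500-token completion:
print(f"{estimated_latency_s(500):.1f}s")  # 1.75 + 500/29 ≈ 19.0s
```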

Benchmarks

Intelligence, coding, and math indexes plus the underlying evaluation scores.

Intelligence Index: 15
Coding Index: 15
Math Index: 41
MMLU-Pro: 77.6%
GPQA: 81.1%
HLE: 3.5%
LiveCodeBench: 46.7%
SciCode: 31.5%
MATH-500: 93.1%
AIME: 49.3%

Benchmarks via Artificial Analysis