ERNIE-4.5-300B-A47B is a 300B-parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool calling, and extended context lengths of up to 131k tokens. It suits general-purpose LLM applications with high reasoning and throughput demands.
Quality Index: 15.0 (264th of 442, top 60%)
Coding Index: 14.5 (196th of 352, top 57%)
Math Index: 41.3 (151st of 268, top 58%)
Price: $0.48/1M tokens (385th cheapest, 56% above median, top 57%)
Speed: 36 tok/s (top 56%)
TTFT: 1.89 s
Context Window: 123K (265th largest, top 76%)
Input: $0.28 per 1M tokens
Output: $1.10 per 1M tokens
Blended: $0.48 per 1M tokens (cheaper than 43% of models; median price is $0.31/1M tokens)
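The blended figure can be reproduced from the input and output prices. A minimal sketch, assuming the common 3:1 input-to-output token weighting (the page does not state its weighting, so the ratio here is an assumption):

```python
def blended_price(input_price, output_price, input_weight=3, output_weight=1):
    """Weighted average of per-1M-token prices.

    The 3:1 input:output weighting is an assumed convention,
    not something documented on this page.
    """
    total_weight = input_weight + output_weight
    return (input_weight * input_price + output_weight * output_price) / total_weight

# ERNIE-4.5-300B-A47B: $0.28 input, $1.10 output per 1M tokens
price = blended_price(0.28, 1.10)  # ~0.485, displayed on the page as $0.48
```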
Daily: $0.48
Monthly: $14.55
Speed: 36 tokens/sec (faster than 44% of models)
TTFT: 1.89 seconds (faster than 14% of models)
Speed Comparison
Market median: 46 tok/s (this model is 22% slower)
Median TTFT: 0.42 s (this model is 352% slower)
Throughput/Dollar: 73 tok/s per $/1M
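The relative figures in the speed comparison are percentage deltas against the market medians. A sketch of the arithmetic; small discrepancies against the displayed values (e.g. 350% vs 352%) presumably come from the page computing on unrounded underlying measurements:

```python
def percent_slower_throughput(model_tps, median_tps):
    """How much lower the model's tok/s is than the median, as a percent."""
    return (median_tps - model_tps) / median_tps * 100

def percent_slower_ttft(model_ttft, median_ttft):
    """How much higher the model's TTFT is than the median, as a percent."""
    return (model_ttft - median_ttft) / median_ttft * 100

throughput_gap = percent_slower_throughput(36, 46)  # ~21.7, shown as "22% slower"
ttft_gap = percent_slower_ttft(1.89, 0.42)          # ~350, shown as "352% slower"
tok_per_dollar = 36 / 0.48                          # 75; the page shows 73,
                                                    # likely from unrounded inputs
```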
Context Window: 123K tokens (larger than 24% of models)
Max Output: 12K tokens (10% of context)
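The "10% of context" note is just the ratio of the two token limits. A quick sketch, treating "K" as thousands of tokens (an assumption; some listings use 1K = 1024):

```python
def output_share_percent(max_output_k, context_k):
    """Max output as a rounded percentage of the context window."""
    return round(max_output_k / context_k * 100)

share = output_share_percent(12, 123)  # 12K of 123K rounds to 10 (percent)
```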