Alibaba models:
- Qwen3.5 397B A17B (Reasoning)
- Qwen3.5 397B A17B (Non-reasoning)
- Qwen3.5 122B A10B (Reasoning)
- Qwen3.5 27B (Reasoning)
- Qwen3.5 27B (Non-reasoning)
- Qwen3 Max Thinking