TNG Tech
DeepSeek R1T2 Chimera
DeepSeek-TNG-R1T2-Chimera is the second-generation Chimera model from TNG Tech. It is a 671B-parameter mixture-of-experts text-generation model assembled from DeepSeek-AI’s R1-0528, R1, and V3-0324 checkpoints with an Assembly-of-Experts merge. The tri-parent design yields strong reasoning performance while running roughly 20% faster than the original R1 and more than 2× faster than R1-0528 under vLLM, giving a favorable cost-to-intelligence trade-off. The checkpoint supports contexts up to 60k tokens in standard use (tested to ~130k) and maintains consistent <think> token behaviour, making it suitable for long-context analysis, dialogue, and other open-ended generation tasks.
- Input / 1M tokens: $0.300
- Output / 1M tokens: $1.10
- Context window: 164K tokens
- Provider: TNG Tech
- Cached input / 1M tokens: $0.150
- Knowledge cutoff: 2024-07-31
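As an illustration of how the listed per-1M-token rates combine, here is a minimal sketch of a per-request cost estimate. The helper name and the example token counts are hypothetical; the rates are the listed $0.300 (input), $0.150 (cached input), and $1.10 (output) per 1M tokens.

```python
def request_cost_usd(uncached_input_tokens: int,
                     cached_input_tokens: int,
                     output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed per-1M-token rates."""
    INPUT_RATE = 0.300   # $ per 1M uncached input tokens
    CACHED_RATE = 0.150  # $ per 1M cached (cache-hit) input tokens
    OUTPUT_RATE = 1.10   # $ per 1M output tokens
    return (uncached_input_tokens * INPUT_RATE
            + cached_input_tokens * CACHED_RATE
            + output_tokens * OUTPUT_RATE) / 1_000_000

# Example: 40k fresh input + 10k cache-hit input, 2k generated tokens
print(f"${request_cost_usd(40_000, 10_000, 2_000):.4f}")  # → $0.0157
```

Note that cached input is billed at half the uncached rate, so prompts with a large shared prefix (e.g. a fixed system prompt) cost noticeably less per call.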
Performance
Median streaming throughput and first-token latency measured by Artificial Analysis.
- Output tokens / sec: —
- Time to first token: —