In January 2025, DeepSeek released its R1 model, triggering a race among AI model providers. ByteDance responded with Seed-Thinking-v1.5, a model aimed at strong STEM and general-purpose reasoning with high efficiency. It uses a Mixture-of-Experts (MoE) architecture that activates 20 billion of its 200 billion total parameters. Seed-Thinking-v1.5 surpasses DeepSeek R1 on many metrics and approaches Google's Gemini 2.5 Pro and OpenAI's o3-mini-high reasoning model. The model posts strong scores on reasoning benchmarks, including AIME 2024 (86.7%), Codeforces (55.0% pass@8), and GPQA (77.3%), and showed an 8% higher win rate than DeepSeek R1 in human preference evaluations. ByteDance also introduced BeyondAIME, a new mathematics benchmark designed to discriminate better between models and to resist the memorization that affects standard benchmarks.