Section 01
OpenSeeker-v2: Introduction to the Cutting-Edge Search Agent Trained Only with SFT
Through a high-quality data-synthesis strategy, OpenSeeker-v2 achieves state-of-the-art (SOTA) performance on multiple search benchmarks using only 10.6k samples and supervised fine-tuning (SFT), challenging the complex CPT+SFT+RL training paradigm commonly used in the industry. This article analyzes the work in terms of its background, method, and experimental results.