
RE-TRAC: REcursive TRAjectory Compression for Deep Search Agents

Jialiang Zhu

Abstract

LLM-based deep research agents are largely built on the ReAct framework, whose linear design makes it difficult to revisit earlier states, branch into alternative search directions, or maintain global awareness over long contexts, often leading to local optima, redundant exploration, and inefficient search. We propose Re-TRAC, an agentic framework that performs cross-trajectory exploration: after each trajectory it generates a structured state representation summarizing evidence, uncertainties, failures, and future plans, and it conditions subsequent trajectories on that representation. This enables iterative reflection and globally informed planning, reframing research as a progressive process. Empirical results show that Re-TRAC consistently outperforms ReAct by 15-20% on BrowseComp with frontier LLMs. For smaller models, we introduce Re-TRAC-aware supervised fine-tuning, achieving state-of-the-art performance at comparable scales. Notably, Re-TRAC shows a monotonic reduction in tool calls and token usage across rounds, indicating progressively targeted exploration driven by cross-trajectory reflection rather than redundant search.

Submitted: February 2, 2026
Subjects: NLP; Computational Linguistics
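The loop the abstract describes — run a trajectory, compress it into a structured state (evidence, uncertainties, failures, plans), then condition the next trajectory on that state — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `run_trajectory`, `compress_state`, and the `State` fields' exact handling are assumptions standing in for LLM calls.

```python
from dataclasses import dataclass, field

@dataclass
class State:
    """Structured cross-trajectory state, mirroring the abstract's four fields."""
    evidence: list[str] = field(default_factory=list)
    uncertainties: list[str] = field(default_factory=list)
    failures: list[str] = field(default_factory=list)
    plans: list[str] = field(default_factory=list)

def run_trajectory(question: str, state: State) -> list[str]:
    # Placeholder for a ReAct-style rollout conditioned on the prior state;
    # in Re-TRAC this would be an LLM agent issuing tool calls.
    return [f"finding for {question!r} given {len(state.plans)} prior plans"]

def compress_state(steps: list[str], state: State) -> State:
    # Placeholder for the summarization step that distills a finished
    # trajectory into evidence and an updated plan for the next round.
    state.evidence.extend(steps)
    state.plans.append(f"refine search after {len(state.evidence)} findings")
    return state

def re_trac(question: str, rounds: int = 3) -> State:
    """Outer loop: each round is one full trajectory conditioned on the
    compressed state of all previous rounds."""
    state = State()
    for _ in range(rounds):
        steps = run_trajectory(question, state)
        state = compress_state(steps, state)
    return state
```

Because each round sees only the compressed state rather than the full raw history, context stays bounded while planning remains globally informed, which is the mechanism the abstract credits for the declining tool-call and token counts across rounds.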


Source: arXiv:2602.02486v1 - http://arxiv.org/abs/2602.02486v1
PDF: https://arxiv.org/pdf/2602.02486v1


Submission Info
Date: Feb 2, 2026
Topic: Computational Linguistics
Area: NLP