NeoChainDaily
29.12.2025 • 15:29 Research & Innovation

Curriculum-Guided Adaptive Recursion Improves Training Efficiency of Recursive Reasoning Models

Researchers led by Kaleemullah Qasim released a new approach called Curriculum‑Guided Adaptive Recursion (CGAR) in a paper posted to arXiv in November 2025, aiming to reduce the computational cost of training recursive reasoning models while preserving accuracy.

Training Efficiency Challenges

Recursive reasoning models have demonstrated strong problem‑solving abilities by iteratively refining their predictions, yet their training often demands extensive GPU resources; for example, solving the Sudoku‑Extreme benchmark can require up to 36 GPU‑hours.

Curriculum‑Guided Adaptive Recursion

CGAR combines two techniques: Progressive Depth Curriculum (PDC), which dynamically adjusts the recursion depth during training, and Hierarchical Supervision Weighting (HSW), which applies an exponentially decaying weight to supervision signals across recursion steps.
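The two components can be illustrated with a minimal sketch. The linear depth schedule, the depth range, and the decay rate below are illustrative assumptions; the paper describes only the general mechanisms (a depth curriculum that grows during training, and exponentially decaying supervision weights across recursion steps), not these exact values.

```python
def pdc_depth(step, total_steps, min_depth=2, max_depth=8):
    """Progressive Depth Curriculum (sketch): grow recursion depth
    over training. A linear schedule is assumed for illustration."""
    frac = step / max(1, total_steps)
    return min_depth + round(frac * (max_depth - min_depth))

def hsw_weights(depth, decay=0.5):
    """Hierarchical Supervision Weighting (sketch): exponentially
    decaying weights over recursion steps, normalized to sum to 1.
    The decay rate 0.5 is an illustrative assumption."""
    raw = [decay ** t for t in range(depth)]
    total = sum(raw)
    return [w / total for w in raw]

def weighted_loss(per_step_losses, decay=0.5):
    """Combine per-recursion-step losses under the HSW weights."""
    weights = hsw_weights(len(per_step_losses), decay)
    return sum(w * l for w, l in zip(weights, per_step_losses))
```

Under this sketch, early training iterations supervise only shallow recursion (cheap forward passes), while later iterations use the full depth, and within any depth the earliest recursion steps receive the largest share of the loss.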

Performance Gains

According to the authors, the integrated CGAR framework achieved a 1.71‑fold reduction in training time—from 10.93 hours to 6.38 hours—while the model’s accuracy fell by only 0.63 percentage points (from 86.65 % to 86.02 %).

When evaluated in isolation, PDC alone delivered a 2.26‑fold speedup with an accuracy of 85.47 %, representing a Pareto improvement in efficiency and quality. HSW contributed an additional 1.61‑fold acceleration.
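The headline figures are internally consistent and can be reproduced directly from the numbers reported in the abstract:

```python
# Speedup: baseline training time divided by CGAR training time
baseline_hours = 10.93
cgar_hours = 6.38
speedup = baseline_hours / cgar_hours  # ≈ 1.71

# Accuracy cost: baseline accuracy minus CGAR accuracy, in points
accuracy_drop = 86.65 - 86.02  # ≈ 0.63 percentage points

print(round(speedup, 2), round(accuracy_drop, 2))
```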

Inference Improvements

Models trained with CGAR also exhibited superior inference efficiency, achieving 100 % halting accuracy and requiring 11 % fewer reasoning steps on average.

Broader Impact and Availability

The authors suggest that treating recursion depth as a scheduled parameter can make neurosymbolic AI and program‑synthesis applications more practical on modest hardware, potentially lowering barriers for research and deployment.

Code for CGAR and pretrained models are publicly available on GitHub and the Hugging Face model hub, enabling independent verification and further development.

This report is based on the abstract of the research paper, an open-access preprint posted to arXiv; the full text is available via arXiv.
