NeoChainDaily
02.02.2026 • 05:06 • Research & Innovation

Classification Framework Extends Tabular Foundation Models to Survival Analysis

A new study released on January 29, 2026, demonstrates that a classification‑based framework can adapt tabular foundation models for survival analysis, a type of time‑to‑event modeling. The research was authored by Da In Kim, Wei Siang Lai, and Kelly W. Zhang and posted to the arXiv preprint server. By reformulating survival tasks as a series of binary classification problems, the approach addresses right‑censoring without requiring dedicated model training.

Methodology Overview

The authors discretize event times into intervals and treat each interval as a separate binary classification task. Censored observations are represented as instances with missing labels for future intervals, allowing existing tabular foundation models to be applied through in‑context learning. This design leverages the models’ strengths in classification while sidestepping the need for specialized survival‑specific architectures.
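As a rough sketch of the label construction described above (the interval edges, variable names, and label encoding are illustrative assumptions, not details taken from the paper), each observation yields one binary label per interval, with labels left missing once censoring makes the outcome unknown:

```python
# Sketch of the interval discretization described in the article.
# Assumptions (not from the paper): fixed interval right endpoints;
# label 1 = event occurred by the interval's end, 0 = survived past it,
# None = unknown because the observation was right-censored earlier.

def make_interval_labels(time, event, edges):
    """Return one binary label per interval for a single observation.

    time  : observed time (event time if event=1, censoring time if event=0)
    event : 1 if the event was observed, 0 if right-censored
    edges : right endpoints of the discretization intervals, in order
    """
    labels = []
    for edge in edges:
        if time > edge:
            labels.append(0)      # known to have survived past this interval
        elif event == 1:
            labels.append(1)      # event observed by this interval's end
        else:
            labels.append(None)   # censored before this edge: label missing
    return labels

edges = [1.0, 2.0, 3.0, 4.0]
print(make_interval_labels(2.5, 1, edges))  # event observed at t = 2.5
print(make_interval_labels(2.5, 0, edges))  # censored at t = 2.5
```

A censored observation thus contributes fully observed labels only for the intervals it is known to have survived, which is how the reformulation accommodates right-censoring without discarding the record.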

Theoretical Guarantees

Under standard censoring assumptions, the paper proves that the minimizer of the proposed binary classification loss converges to the true survival probabilities as the training set size grows. The proof relies on consistency arguments common in statistical learning theory, ensuring that the method is theoretically sound.
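One way to see why such a consistency result is plausible (this is a standard discrete-time survival identity, not a formula quoted from the paper): if the binary target for interval $k$ indicates whether the event has occurred by the interval's right endpoint $t_k$, then for covariates $x$,

```latex
\Pr(Y_k = 1 \mid x) \;=\; \Pr(T \le t_k \mid x) \;=\; 1 - S(t_k \mid x),
```

so a classifier whose predicted probabilities converge to the true conditional label probabilities directly recovers the survival function $S(\cdot \mid x)$ at the grid points.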

Empirical Evaluation

Performance was assessed on 53 real‑world datasets spanning various domains. Across multiple survival metrics, the off‑the‑shelf tabular foundation models equipped with the classification formulation outperformed both classical statistical methods and contemporary deep‑learning baselines on average.

Comparison with Existing Approaches

Traditional survival analysis techniques, such as Cox proportional hazards models, require explicit handling of censoring and often assume linear relationships. Deep learning alternatives typically involve custom architectures and extensive training. In contrast, the presented framework offers a plug‑and‑play solution that can be deployed with minimal adaptation.
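To illustrate the plug-and-play character of the framework (the numbers and the probability encoding below are illustrative assumptions, not outputs from the paper), once any off-the-shelf classifier supplies per-interval event probabilities, assembling a survival curve takes only a few lines:

```python
# Sketch: convert per-interval predicted event probabilities into a
# survival curve. Assumption (not from the paper): probs[k] estimates
# P(event by the end of interval k); a running maximum enforces
# monotonicity of the cumulative event probability before inversion.

def survival_curve(probs):
    """Map cumulative event probabilities to survival probabilities."""
    curve, running_max = [], 0.0
    for p in probs:
        running_max = max(running_max, p)  # event probability cannot decrease
        curve.append(1.0 - running_max)    # S(t_k) = 1 - P(T <= t_k)
    return curve

# Illustrative classifier outputs for four intervals:
print(survival_curve([0.10, 0.25, 0.20, 0.60]))
```

No survival-specific training loop or custom architecture is needed at this step; the classifier's probability outputs are consumed as-is, which is the sense in which the approach is plug-and-play.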

Implications and Future Work

The findings suggest that large‑scale tabular models can be repurposed for a broader range of predictive tasks, including those with complex temporal dynamics. Future research may explore extending the discretization strategy, integrating uncertainty quantification, and applying the method to high‑dimensional biomedical datasets.

This report is based on the abstract of the research paper, an open-access preprint; the full text is available via arXiv.
