NeoChainDaily
14.01.2026 • 05:25 Research & Innovation

Multiplicative Orthogonal Sequential Editing Improves LLM Knowledge Updates While Preserving Stability


Researchers introduced a new method called Multiplicative Orthogonal Sequential Editing (MOSE) to modify the internal knowledge of large language models without compromising numerical stability, according to a preprint posted on arXiv.

Background

Existing knowledge‑editing techniques typically add an update matrix to a model's original parameters. Prior analyses have shown that this additive approach can inflate condition numbers and matrix norms, degrading editing performance and eroding general capabilities, especially when edits are applied sequentially.
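The stability issue can be illustrated with a small NumPy sketch (ours, not from the paper): an additive edit that nearly cancels one direction of a weight matrix sharply increases its condition number.

```python
import numpy as np

# Well-conditioned stand-in for an LLM parameter matrix (illustrative).
W = np.eye(3)

# An additive edit that nearly cancels one direction of W.
delta = np.zeros((3, 3))
delta[0, 0] = -0.999

cond_before = np.linalg.cond(W)          # 1.0
cond_after = np.linalg.cond(W + delta)   # ~1000: stability degrades

print(f"cond(W)         = {cond_before:.1f}")
print(f"cond(W + delta) = {cond_after:.1f}")
```

A poorly conditioned matrix amplifies small perturbations, which is why repeated additive edits can compound into degraded model behavior.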

Method Overview

The authors propose a multiplicative paradigm in which the new information is encoded in an orthogonal matrix that multiplies the original parameter matrix. Mathematical analysis demonstrates that orthogonal multiplication leaves key stability metrics unchanged, addressing the core limitation of additive methods.
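The stability argument follows from a standard fact: an orthogonal matrix Q satisfies QᵀQ = I, so multiplying by Q leaves singular values unchanged, and with them the spectral norm and condition number. A minimal NumPy check (variable names are ours, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary stand-in for a parameter matrix.
W = rng.standard_normal((4, 4))

# Random orthogonal matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Orthogonal multiplication preserves singular values, hence the
# spectral norm and condition number of the edited matrix match W's.
norm_preserved = np.allclose(np.linalg.norm(Q @ W, 2), np.linalg.norm(W, 2))
cond_preserved = np.allclose(np.linalg.cond(Q @ W), np.linalg.cond(W))
print(norm_preserved, cond_preserved)
```

This is the property the authors exploit: however the edit is encoded in Q, the key stability metrics of the original parameters are untouched.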

Experimental Evaluation

MOSE was benchmarked against several contemporary editing methods across three distinct LLM architectures. The evaluation measured both the success of targeted edits and the preservation of performance on unrelated downstream tasks.

Key Findings

Results indicate that MOSE limits deviations in the edited parameter matrix and maintains numerical stability. Compared with current approaches, MOSE delivers a 12.08% improvement in sequential editing performance while retaining 95.73% of general abilities on downstream benchmarks.

Implications and Future Work

By preserving stability, the multiplicative approach may enable more reliable and scalable knowledge updates in large models, potentially reducing the need for extensive retraining after each edit.

Availability

The implementation code is publicly available on GitHub at https://github.com/famoustourist/MOSE.

This report is based on the abstract of the research paper, an open-access preprint; the full text is available via arXiv.

