NPMCL: A Theoretical Framework for Non-Parametric Continual Learning through Meta-Ability Cultivation
DOI:
https://doi.org/10.31224/6634

Keywords:
Non-Parametric Meta Continual Learning (NPMCL), Knowledge Compression-Decompression, Prior Suppression

Abstract
Parametric update methods for continual learning in Large Language Models (LLMs) often suffer from catastrophic forgetting and the stability-plasticity dilemma. In this work, we characterize Non-Parametric Meta Continual Learning (NPMCL) as a structured approach that enables knowledge updates without additional training. The framework models adaptation as a Knowledge Compression-Decompression process, formalized through four core meta-abilities: (1) Query Generation for identifying information gaps; (2) Structural Matching for precise referential and temporal alignment; (3) Distillative Compression for extracting logical invariants from high-entropy data; and (4) Constrained Inference for memory-guided reasoning and prior suppression.
We propose that these meta-abilities constitute a domain-agnostic cognitive pipeline, potentially allowing LLMs to adapt to counterfactual environments by leveraging dynamic external memory. This work aims to formalize the theoretical underpinnings of such meta-cognitive protocols. The proposed framework is informed by preliminary empirical observations from logic-aligned memory architectures (e.g., CoG-MeM). In this paper, we systematize the NPMCL paradigm and discuss its implications for the future development of training-free, autonomous cognitive agents.
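To make the pipeline concrete, the minimal Python sketch below shows one way the four meta-abilities could compose around a dynamic external memory. It is an illustration under stated assumptions, not the paper's implementation: the names (ExternalMemory, NPMCLAgent), the prompt wording, and the keyword-overlap retrieval are all hypothetical stand-ins, with retrieval deliberately naive in place of full Structural Matching.

```python
# Illustrative sketch of the four-stage NPMCL loop described in the abstract.
# All class, method, and prompt names are assumptions for exposition; the
# paper defines the meta-abilities conceptually, not this API.
from dataclasses import dataclass, field


@dataclass
class ExternalMemory:
    """Dynamic external memory: stores distilled facts, never model weights."""
    facts: list[str] = field(default_factory=list)

    def retrieve(self, query: str) -> list[str]:
        # Naive keyword-overlap stand-in for (2) Structural Matching.
        terms = set(query.lower().split())
        return [f for f in self.facts if terms & set(f.lower().split())]

    def store(self, fact: str) -> None:
        self.facts.append(fact)


class NPMCLAgent:
    """Training-free continual learner: adapts by reading/writing memory."""

    def __init__(self, llm, memory: ExternalMemory):
        self.llm = llm        # any callable mapping a prompt string to text
        self.memory = memory

    def update(self, observation: str) -> None:
        # (3) Distillative Compression: reduce high-entropy input to a
        # logical invariant before storage; no parameters are updated.
        invariant = self.llm(f"State the single key fact in: {observation}")
        self.memory.store(invariant)

    def answer(self, task: str) -> str:
        # (1) Query Generation: identify the information gap in the task.
        query = self.llm(f"What fact is needed to solve: {task}?")
        # (2) Structural Matching: align the query against stored facts.
        evidence = self.memory.retrieve(query)
        # (4) Constrained Inference: reason only from retrieved memory,
        # suppressing conflicting parametric priors.
        return self.llm(
            f"Using ONLY these facts: {evidence}\n"
            f"Ignore conflicting prior knowledge. Task: {task}"
        )
```

In this sketch, all adaptation happens through memory writes in update() and constrained reads in answer(), which is the sense in which the pipeline is non-parametric: the underlying llm callable is never fine-tuned.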
Downloads
Versions
- 2026-04-27 (3)
- 2026-04-17 (2)
- 2026-03-13 (1)
License
Copyright (c) 2026 Zhiqiang Gan

This work is licensed under a Creative Commons Attribution 4.0 International License.