Abstract
Large Language Models (LLMs) have demonstrated remarkable capabilities across numerous tasks, yet principled explanations for their underlying mechanisms and for several phenomena, such as scaling laws, hallucinations, and related behaviors, remain elusive. In this work, we revisit the classical relationship between compression and prediction, grounded in Kolmogorov complexity and Shannon information theory, to provide deeper insights into LLM behaviors. By leveraging the Kolmogorov Structure Function and interpreting LLM compression as a two-part coding process, we offer a detailed view of how LLMs acquire and store information across increasing model and data scales—from pervasive syntactic patterns to progressively rarer knowledge elements. Motivated by this theoretical perspective and by natural assumptions inspired by Heaps' and Zipf's laws, we introduce a simplified yet representative hierarchical data-generation framework called the Syntax-Knowledge model. Under the Bayesian setting, we show that prediction and compression within this model naturally lead to the diverse learning and scaling behaviors of LLMs. In particular, our theoretical analysis offers intuitive and principled explanations for data and model scaling laws, the dynamics of knowledge acquisition during training and fine-tuning, and factual knowledge hallucinations in LLMs. Experimental results validate our theoretical predictions.
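The two empirical laws the abstract builds on are easy to see on synthetic data. The sketch below (a toy illustration, not part of the paper's framework; the vocabulary size and corpus length are arbitrary choices) samples tokens from a Zipfian distribution and checks the two signatures: token frequency decays roughly as 1/rank (Zipf's law), and the number of distinct tokens grows sublinearly with corpus length (Heaps' law).

```python
import random
from collections import Counter

random.seed(0)

# Toy corpus: draw tokens from a Zipfian distribution over a fixed vocabulary.
V = 10_000                                   # hypothetical vocabulary size
weights = [1 / r for r in range(1, V + 1)]   # Zipf's law: frequency ∝ 1/rank
corpus = random.choices(range(V), weights=weights, k=200_000)

# Zipf's law: the r-th most frequent token appears ~ (top frequency) / r times,
# so the rank-1 vs rank-10 frequency ratio should be close to 10.
counts = [c for _, c in Counter(corpus).most_common()]
print("rank-1 / rank-10 frequency ratio:", counts[0] / counts[9])

# Heaps' law: distinct vocabulary grows sublinearly in corpus length
# (roughly n^beta with beta < 1), so 10x more tokens yields far fewer
# than 10x more distinct types.
seen, growth = set(), []
for i, tok in enumerate(corpus, 1):
    seen.add(tok)
    if i in (20_000, 200_000):
        growth.append(len(seen))
print("vocabulary growth factor over a 10x larger corpus:",
      growth[1] / growth[0])
```

The sublinear vocabulary growth is what motivates the hierarchical picture in the talk: frequent (syntactic) patterns are learned early at small scale, while ever-rarer knowledge elements keep arriving as data grows.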
About the speaker
Jian Li is a professor at the Institute for Interdisciplinary Information Sciences, Tsinghua University. His research focuses on theoretical computer science, artificial intelligence, FinTech, and databases. He has published over 100 papers in major international conferences and journals. His work has received the Best Paper Award at the VLDB conference and the European Symposium on Algorithms (ESA), as well as the Best Newcomer Award at the International Conference on Database Theory (ICDT). Several of his papers have been selected for oral presentations or highlighted as spotlight papers. He has led several research projects, including projects funded by NSFC and industry collaborations with companies such as Baidu, Ant Group, ByteDance, E-Fund Management, and Huatai Securities.
