Global Lyapunov functions: a long-standing open problem in mathematics, with symbolic transformers

Part of Advances in Neural Information Processing Systems 37 (NeurIPS 2024) Main Conference Track


Authors

Alberto Alfarano, Francois Charton, Amaury Hayat

Abstract

Despite their spectacular progress, language models still struggle on complex reasoning tasks, such as advanced mathematics. We consider a long-standing open problem in mathematics: discovering a Lyapunov function that ensures the global stability of a dynamical system. This problem has no known general solution, and algorithmic solvers exist only for some small polynomial systems. We propose a new method for generating synthetic training samples from random solutions, and show that sequence-to-sequence transformers trained on such datasets perform better than algorithmic solvers and humans on polynomial systems, and can discover new Lyapunov functions for non-polynomial systems.
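
For context (not from the abstract itself): a smooth function V with V(0) = 0, V(x) > 0 for x != 0, V radially unbounded, and grad V(x) · f(x) < 0 for x != 0 certifies global asymptotic stability of the system x' = f(x). Below is a minimal Python sketch, using sympy, of the "backward" idea suggested by generating samples from random solutions: sample a random candidate V first, then construct a system that it provably stabilizes. The gradient-flow construction f = -grad V is a deliberately simple stand-in, not the paper's actual generation procedure.

```python
import random

import sympy as sp

# Hypothetical, simplified sketch of backward sample generation:
# draw a random candidate Lyapunov function V first, then build a
# system x' = f(x) that V provably stabilizes.

random.seed(0)
x1, x2 = sp.symbols("x1 x2", real=True)
X = [x1, x2]

# Random positive-definite, radially unbounded V: positive
# coefficients on even powers of each coordinate.
V = sum(random.randint(1, 3) * v ** (2 * random.randint(1, 2)) for v in X)

# Gradient flow f = -grad(V) guarantees V_dot = -|grad V|^2 <= 0,
# vanishing only at the origin, so V certifies global stability.
grad_V = [sp.diff(V, v) for v in X]
f = [-g for g in grad_V]

# Symbolic check of the Lyapunov decrease condition.
V_dot = sp.simplify(sum(g * fi for g, fi in zip(grad_V, f)))
print("V     =", V)
print("f(x)  =", f)
print("V_dot =", V_dot)  # a negative sum of squares
```

Each (f, V) pair produced this way is a valid (system, Lyapunov function) training example by construction; the challenge the paper addresses is making such pairs diverse enough that a model trained on them generalizes to systems whose Lyapunov functions are unknown.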