
Neural Symbolic Regression that scales

Luca Biggio, Tommaso Bendinelli, Alexander Neitz, Aurélien Lucchi, and Giambattista Parascandolo. Proceedings of the 38th International Conference on Machine Learning, volume 139 of Proceedings of Machine Learning Research, pages 936--945. PMLR, (18--24 Jul 2021)

Abstract

Symbolic equations are at the core of scientific discovery. The task of discovering the underlying equation from a set of input-output pairs is called symbolic regression. Traditionally, symbolic regression methods use hand-designed strategies that do not improve with experience. In this paper, we introduce the first symbolic regression method that leverages large-scale pre-training. We procedurally generate an unbounded set of equations, and simultaneously pre-train a Transformer to predict the symbolic equation from a corresponding set of input-output pairs. At test time, we query the model on a new set of points and use its output to guide the search for the equation. We show empirically that this approach can re-discover a set of well-known physical equations, and that it improves over time with more data and compute.
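
The abstract describes a pipeline of procedurally generating equations, pre-training a Transformer to map sets of input-output pairs to symbolic expressions, and using the model's outputs at test time to guide the search for the equation. The snippet below is a minimal, hypothetical sketch of the data-generation stage only, written with sympy and numpy; the expression grammar, sampling ranges, and tokenization (here simply sympy's srepr) are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of the pre-training data pipeline described in the abstract:
# procedurally sample random symbolic equations, evaluate them on random inputs,
# and emit (points, token sequence) pairs a seq2seq Transformer could train on.
import random
import numpy as np
import sympy as sp

X = sp.Symbol("x")
UNARY_OPS = [sp.sin, sp.cos, sp.exp, sp.log]
BINARY_OPS = [sp.Add, sp.Mul]

def sample_expression(depth: int = 3):
    """Recursively sample a random expression tree over the variable x."""
    if depth == 0 or random.random() < 0.3:
        return random.choice([X, sp.Integer(random.randint(1, 5))])
    if random.random() < 0.5:
        return random.choice(UNARY_OPS)(sample_expression(depth - 1))
    op = random.choice(BINARY_OPS)
    return op(sample_expression(depth - 1), sample_expression(depth - 1))

def make_training_example(n_points: int = 64):
    """Return (input-output points, token string) for one sampled equation."""
    while True:
        expr = sample_expression()
        f = sp.lambdify(X, expr, modules="numpy")
        xs = np.random.uniform(-2.0, 2.0, size=n_points)
        with np.errstate(all="ignore"):
            ys = np.asarray(f(xs), dtype=float) * np.ones_like(xs)
        if np.all(np.isfinite(ys)):  # resample equations that are invalid on the sampled inputs
            tokens = sp.srepr(expr)  # stand-in for the paper's tokenization of the expression tree
            return np.stack([xs, ys], axis=1), tokens

if __name__ == "__main__":
    points, tokens = make_training_example()
    print(points.shape)  # (64, 2): the input-output pairs fed to the encoder
    print(tokens)        # the target symbolic sequence for the decoder

In the paper's setting, pairs like these would be consumed by the pre-trained Transformer, and at test time the model's decoded candidates guide the search for the equation, as stated in the abstract; that modeling and search machinery is not reproduced in this sketch.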

