Machine learning of the well-known things
V. V. Dolotin^{a,b,c}, A. Yu. Morozov^{a,b,c}, and A. V. Popolitov^{a,b,c,*}
^a Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
^b Alikhanov Institute for Theoretical and Experimental Physics, National Research Center “Kurchatov Institute”, Moscow, Russia
^c Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia
*e-mail: popolit@itep.ru
Received December 2, 2022
Abstract—Machine learning (ML) in its current form implies that the answer to any problem can be well approximated by a function of a very peculiar form: a specially adjusted iteration of Heaviside theta-functions. It is natural to ask whether answers that we already know can be naturally represented in this form. We provide elementary yet non-obvious examples showing that this is indeed possible, and suggest looking for a systematic reformulation of existing knowledge in an ML-consistent way. The success or failure of these attempts can shed light on a variety of problems, both scientific and epistemological.
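As a minimal illustration of the functional form the abstract refers to (a sketch of our own, not taken from the paper), the simplest "one-layer" case is a finite sum of shifted Heaviside theta-functions, which realizes a piecewise-constant approximation of a known function. The names `theta` and `step_approx` below are illustrative choices; the grid size `n` controls the approximation accuracy.

```python
import math

def theta(x):
    """Heaviside theta-function: 1 for x > 0, else 0."""
    return 1.0 if x > 0 else 0.0

def step_approx(f, a, b, n):
    """Approximate f on [a, b] by f(a) plus n theta-function jumps.

    f_hat(x) = f(x_0) + sum_k [f(x_k) - f(x_{k-1})] * theta(x - x_k),
    a piecewise-constant ("one-layer") representation of f.
    """
    xs = [a + (b - a) * k / n for k in range(n + 1)]
    vals = [f(x) for x in xs]

    def f_hat(x):
        out = vals[0]
        for k in range(1, n + 1):
            out += (vals[k] - vals[k - 1]) * theta(x - xs[k])
        return out

    return f_hat

# Example: sin(x) on [0, pi] is reproduced to within the grid step.
f_hat = step_approx(math.sin, 0.0, math.pi, 1000)
```

Deeper "iterations" of such sums (compositions of layers) are what current ML architectures adjust; this toy version only shows that a familiar answer admits the advertised form at all.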
Keywords: exact approaches to QFT, nonlinear algebra, machine learning, steepest descent method
DOI: 10.1134/S0040577923030091