Machine learning is playing an increasing role in the physical sciences and significant progress has been made towards embedding domain knowledge into models. Less explored is its use to discover interpretable physical laws from data. We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony. The power and versatility of the approach is demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties. In the first example, the resulting PNNs are easily interpretable as Newton's second law, expressed as a non-trivial time integrator that exhibits time-reversibility and conserves energy, where the parsimony is critical to extract underlying symmetries from the data. In the second case, the PNNs not only find the celebrated Lindemann melting law, but also new relationships that outperform it in the Pareto sense of parsimony vs. accuracy.

Machine learning (ML) can provide predictive models in applications where data is plentiful and the underlying governing laws are unknown 1, 2, 3. These approaches are also playing an increasing role in the physical sciences, where data is generally limited but underlying laws (sometimes approximate) exist 4, 5, 6, 7, 8, 9. For example, ML-based constitutive models are being used in electronic structure calculations 10 and molecular dynamics (MD) simulations 11, 12, 13. One of the major drawbacks of the use of ML in the physical sciences is that models often do not learn the underlying physics of the system at hand, such as constraints or symmetries, limiting their ability to generalize. In addition, most ML models lack interpretability. That is, ML approaches generally neither learn physics nor can they explain their predictions. In many fields, these limitations are compensated by copious amounts of data, but this is often not possible in areas such as materials science, where acquiring data is expensive and time consuming. To tackle this challenge, progress has been made towards using knowledge (even partial) of the underlying physics to improve the accuracy of models and/or reduce the amount of data required during training 14, 15, 16.

Less explored is the use of ML for scientific discovery, i.e. extracting physical laws from observational data, see Refs. In this letter, we combine neural networks (NNs) with stochastic optimization to find models that balance accuracy and parsimony and apply them to learn, solely from observational data, the dynamics of a particle under a highly non-linear potential, and expressions to predict the melting temperature of materials in terms of fundamental properties. Our hypothesis is that the requirement of parsimony will lead to the discovery of the physical laws underlying the problem, yielding interpretability and improved generalizability. We find that the resulting descriptions are indeed interpretable and provide insight into the system of interest. In the case of particle dynamics, the learned models satisfy non-trivial underlying symmetries embedded in the data, which increases the applicability of the parsimonious neural networks (PNNs) over generic NN models.

Stochastic optimization has been previously used in conjunction with backpropagation to improve robustness or minimize overfitting in models 20, 21, 22, 23, 24, 25, 26; this work extends these ideas to finding parsimonious models from data to learn physics. The power of physics-based ML is well documented and remains an active area of research. Neural networks have been used to both parametrize and solve differential equations such as Navier–Stokes 14, 15 and Hamilton's equations of motion 27. Recurrent architectures have also shown promise in predicting the time evolution of systems 28, 29. These examples focus on using prior knowledge of the underlying physics to guide the model, often as numerical constraints, or by using the underlying physics to numerically solve equations with variables predicted by the ML algorithms.
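The central idea above, an evolutionary search that scores candidate models by accuracy plus a parsimony penalty, can be sketched in toy form. The snippet below is an illustration, not the paper's method: the polynomial model family, the coefficient alphabet, and the penalty weight `lam` are all hypothetical choices. It exhaustively scores degree-3 polynomials against noiseless `y = 2x` data and rewards fits that use fewer nonzero coefficients, so the one-term model `y = 2x` wins.

```python
import itertools

# Toy illustration of the accuracy-vs-parsimony trade-off behind PNNs.
# Noiseless observations of y = 2x:
data = [(x, 2.0 * x) for x in range(-5, 6)]

def predict(coeffs, x):
    """Evaluate a polynomial with the given coefficients at x."""
    return sum(c * x**i for i, c in enumerate(coeffs))

def fitness(coeffs, lam=0.1):
    """Accuracy term (MSE) plus a parsimony penalty on nonzero terms."""
    mse = sum((predict(coeffs, x) - y) ** 2 for x, y in data) / len(data)
    complexity = sum(1 for c in coeffs if c != 0.0)
    return mse + lam * complexity

# Candidate degree-3 polynomials with coefficients drawn from a small
# "simple values" alphabet (a hypothetical stand-in for restricting
# network weights to simple values); lower fitness is better.
alphabet = [-2.0, -1.0, 0.0, 1.0, 2.0]
best = min(itertools.product(alphabet, repeat=4), key=fitness)
print(best)  # → (0.0, 2.0, 0.0, 0.0), i.e. the parsimonious law y = 2x
```

In a real PNN the inner accuracy term would come from training a network, and a genetic algorithm rather than exhaustive enumeration would explore the search space; the trade-off being optimized is the same.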
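The text describes the learned dynamics model as "a non-trivial time integrator that exhibits time-reversibility and conserves energy." A canonical textbook integrator with exactly these symmetries is velocity Verlet; the sketch below uses it as a hedged illustration (the quartic potential is an arbitrary stand-in for a highly non-linear potential, not taken from the work above) and demonstrates time-reversibility numerically: integrating forward, flipping the velocity, and integrating back returns the initial conditions up to roundoff.

```python
# Velocity Verlet: a time-reversible, symplectic integrator of Newton's
# second law. V(x) = x^4 is a hypothetical highly non-linear potential.

def force(x):
    return -4.0 * x**3  # F = -dV/dx for V(x) = x^4

def verlet_step(x, v, dt, m=1.0):
    """One velocity-Verlet update of position and velocity."""
    a = force(x) / m
    x_new = x + v * dt + 0.5 * a * dt * dt
    a_new = force(x_new) / m
    v_new = v + 0.5 * (a + a_new) * dt
    return x_new, v_new

# Integrate forward 1000 steps, reverse the velocity, integrate back:
# time-reversibility carries the trajectory back to its starting point.
x, v, dt = 1.0, 0.0, 1e-3
for _ in range(1000):
    x, v = verlet_step(x, v, dt)
v = -v
for _ in range(1000):
    x, v = verlet_step(x, v, dt)
print(x, v)  # ≈ the initial (1.0, 0.0), up to floating-point roundoff
```

The same forward-backward check is a useful diagnostic for any learned integrator: a generic network fit to trajectory data will generally fail it, while a model that has absorbed the underlying symmetry will not.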