

Prediction rigidities for data-driven chemistry

Sanggyu Chong1*, Filippo Bigi1*, Federico Grasselli1*, Philip Loche1*, Matthias Kellner1*, Michele Ceriotti1*

1 Laboratory of Computational Science and Modeling (COSMO), IMX, École Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne, Switzerland

* Corresponding authors emails: sanggyu.chong@epfl.ch, filippo.bigi@epfl.ch, federico.grasselli@epfl.ch, philip.loche@epfl.ch, matthias.kellner@epfl.ch, michele.ceriotti@epfl.ch
DOI: 10.24435/materialscloud:6x-gs [version v1]

Publication date: Aug 28, 2024

How to cite this record

Sanggyu Chong, Filippo Bigi, Federico Grasselli, Philip Loche, Matthias Kellner, Michele Ceriotti, Prediction rigidities for data-driven chemistry, Materials Cloud Archive 2024.130 (2024), https://doi.org/10.24435/materialscloud:6x-gs

Description

The widespread application of machine learning (ML) to the chemical sciences makes it increasingly important to understand how ML models learn to correlate chemical structures with their properties, and what can be done to improve training efficiency while guaranteeing interpretability and transferability. In this work, we demonstrate the wide utility of prediction rigidities, a family of metrics derived from the loss function, in understanding the robustness of ML model predictions. We show that the prediction rigidities allow the assessment of the model not only at the global level, but also at the local or component-wise level at which the intermediate (e.g. atomic, body-ordered, or range-separated) predictions are made. We leverage these metrics to understand the learning behavior of different ML models and to guide efficient dataset construction for model training. We finally implement the formalism for an ML model targeting a coarse-grained system, demonstrating the applicability of the prediction rigidities to an even broader class of atomistic modeling problems. This record contains all the data used for the analyses conducted in the associated work published in Faraday Discussions.
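As a rough illustration of the kind of metric described above (a hypothetical sketch, not the authors' code): for a model f(x; w) trained with a quadratic loss, a prediction rigidity at a test point x* can be written as PR(x*) = 1 / (gᵀ H⁻¹ g), where g is the gradient of the prediction with respect to the weights and H is the Hessian of the loss at the optimum. For a ridge-regularized linear model this is cheap to evaluate, since g = x* and H = XᵀX + λI. All variable names below (`prediction_rigidity`, `x_star`, `lam`) are illustrative choices, not taken from the record.

```python
import numpy as np

def prediction_rigidity(X_train, x_star, lam=1e-3):
    """Prediction rigidity of a ridge linear model f(x) = w @ x at x_star.

    PR = 1 / (g^T H^{-1} g), with g = df/dw = x_star and
    H = X^T X + lam * I the Hessian of the regularized squared loss.
    (Illustrative sketch; see the associated paper for the full formalism.)
    """
    n_feat = X_train.shape[1]
    H = X_train.T @ X_train + lam * np.eye(n_feat)  # loss Hessian
    g = x_star                                      # prediction gradient wrt weights
    return 1.0 / (g @ np.linalg.solve(H, g))

# A point resembling the training data should be predicted more "rigidly"
# (higher PR) than a point far outside the training distribution.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))         # 50 training inputs, 4 features
x_in = X[0]                          # in-distribution point
x_out = 10.0 * rng.normal(size=4)    # far-away point
print(prediction_rigidity(X, x_in) > prediction_rigidity(X, x_out))
```

The intuition this sketch captures is the one stated in the description: low rigidity flags predictions that the training set constrains only weakly, which is what makes the metric useful for guiding dataset construction.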

Materials Cloud sections using this data

No Explore or Discover sections associated with this archive record.

Files

File name: PR_FD_materials_cloud.tar.gz
MD5: 4e5ab9528477908af3c68637ef0d1154
Size: 4.5 GiB
Description: compressed file containing the reference structures and their associated data, as well as the trained NN models

File name: README.txt
MD5: 5952c12112a7efa01dc6b6bf2bde36c6
Size: 2.0 KiB
Description: README file with details about the data, how it is organized, and where additional information about how to access the trained models can be found

License

Files and data are licensed under the terms of the following license: Creative Commons Attribution 4.0 International.
Metadata, except for email addresses, are licensed under the Creative Commons Attribution Share-Alike 4.0 International license.

External references

Journal reference (Paper in which the data and models were used for analysis)
S. Chong, F. Bigi, F. Grasselli, P. Loche, M. Kellner, M. Ceriotti, Faraday Discussions (2024) doi:10.1039/D4FD00101J

Keywords

machine learning, prediction rigidity, uncertainty quantification, ML model robustness

Version history:

2024.130 (version v1) [This version] Aug 28, 2024 DOI: 10.24435/materialscloud:6x-gs