A Python library for developing machine learning interatomic potentials, based on Google JAX.

hghcomphys/pantea

Pantea

Description

Pantea is an optimized Python library based on Google JAX that enables development of machine learning interatomic potentials for use in computational physics. These potentials are particularly necessary for conducting large-scale molecular dynamics simulations of complex materials with ab initio accuracy.

See documentation for more information.

Main Features

  • The design of Pantea is simple and flexible, which makes it easy to incorporate atomic descriptors and potentials.
  • It uses automatic differentiation to make defining new descriptors straightforward.
  • Pantea is written purely in Python and optimized with just-in-time (JIT) compilation.
  • It also supports GPU computing, which can significantly speed up preprocessing and model training.
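To illustrate what automatic differentiation buys here, the following is a generic JAX sketch (not Pantea's actual API): once a descriptor term is written as an ordinary JAX function, its derivative with respect to atomic positions comes for free via `jax.grad`, with no hand-coded gradient.

```python
import jax
import jax.numpy as jnp

# A generic JAX illustration, not Pantea's code: define a simple radial
# descriptor term and obtain its derivative with jax.grad.
def radial_term(r, eta=0.5):
    return jnp.exp(-eta * r**2)

# Derivative with respect to the pair distance, via automatic differentiation.
d_radial = jax.grad(radial_term)

print(radial_term(1.0))  # exp(-0.5) ~ 0.6065
print(d_radial(1.0))     # -2 * eta * r * exp(-eta * r**2) = -exp(-0.5)
```

Combined with `jax.jit` and GPU backends, the same mechanism lets the library evaluate descriptor gradients (and hence forces) efficiently.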

Warning

This package is under development; the current focus is on implementing the high-dimensional neural network potential (HDNNP) proposed by Behler and Parrinello (2007).

Installation

To install Pantea, run this command in your terminal:

$ pip install pantea

For machines with an NVIDIA GPU, please follow the installation instructions in the documentation.

Examples

I. Defining an ACSF descriptor

The atom-centered symmetry function (ACSF) descriptor captures information about the distribution of neighboring atoms around a central atom through radial (two-body) and angular (three-body) symmetry functions. The resulting values form a fingerprint of the local atomic environment and can be used as inputs to various machine learning potentials.
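To make the radial part concrete, here is a minimal pure-Python sketch of the G2 symmetry function from Behler (2007), using the common cosine cutoff function. This is only an illustration of the formula; Pantea's own implementation is JAX-based and vectorized.

```python
import math

# Sketch of the Behler G2 radial symmetry function (illustrative, not Pantea's code):
#   G2_i = sum_j exp(-eta * (r_ij - r_shift)^2) * fc(r_ij)
# with the standard cosine cutoff fc(r) = 0.5 * (cos(pi * r / r_cutoff) + 1).
def cosine_cutoff(r, r_cutoff):
    return 0.5 * (math.cos(math.pi * r / r_cutoff) + 1.0) if r < r_cutoff else 0.0

def g2(neighbor_distances, eta, r_shift, r_cutoff):
    return sum(
        math.exp(-eta * (r - r_shift) ** 2) * cosine_cutoff(r, r_cutoff)
        for r in neighbor_distances
    )

print(g2([1.0, 2.0], eta=0.5, r_shift=0.0, r_cutoff=12.0))  # ~ 0.7225
```

Each choice of `eta` and `r_shift` probes a different shell of the radial distribution, which is why a descriptor typically stacks several such functions per element pair.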

The script below demonstrates how to define an array of atom-centered symmetry functions for a specific element. The resulting descriptor can be applied to any structure, and its output values can then be used to construct a machine learning potential.

from pantea.datasets import Dataset
from pantea.descriptors import ACSF
from pantea.descriptors.acsf import CutoffFunction, G2, G3

# Read atomic structure dataset (e.g. water molecules)
structures = Dataset.from_runner('input.data')
structure = structures[0]
print(structure)
# >> Structure(natoms=12, elements=('H', 'O'), dtype=float64)

# Define an ACSF descriptor for hydrogen
# It includes two radial (G2) and angular (G3) symmetry functions
descriptor = ACSF('H')
cfn = CutoffFunction.from_cutoff_type(r_cutoff=12.0, cutoff_type='tanh')
descriptor.add(G2(cfn, eta=0.5, r_shift=0.0), 'H')
descriptor.add(G3(cfn, eta=0.001, zeta=2.0, lambda0=1.0, r_shift=12.0), 'H', 'O')
print(descriptor)
# >> ACSF(central_element='H', symmetry_functions=2)

values = descriptor(structure)
print("Descriptor values:\n", values)
# >> Descriptor values:
# [[0.01952943 1.13103234]
#  [0.01952756 1.04312263]
# ...
#  [0.00228752 0.41445455]]

gradient = descriptor.grad(structure, atom_index=0)
print("Descriptor gradient:\n", gradient)
# >> Descriptor gradient:
# [[ 0.04645236 -0.05037861 -0.06146214]
# [-0.10481855 -0.01841708  0.04760214]]
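The `'tanh'` cutoff type selected above can be sketched in a few lines. This assumes the convention used by RuNNer/n2p2, where the cutoff is a cubed hyperbolic tangent; whether Pantea follows this exact form is an assumption here.

```python
import math

# Sketch of a tanh-style cutoff function (RuNNer/n2p2 convention; assumed,
# not taken from Pantea's source):
#   fc(r) = tanh^3(1 - r / r_cutoff)  for r < r_cutoff, else 0
def tanh_cutoff(r, r_cutoff=12.0):
    return math.tanh(1.0 - r / r_cutoff) ** 3 if r < r_cutoff else 0.0

print(tanh_cutoff(0.0))   # tanh(1)^3 ~ 0.4417
print(tanh_cutoff(12.0))  # 0.0, the function vanishes at the cutoff radius
```

The cutoff guarantees that every symmetry function, and therefore the predicted energy, varies smoothly as atoms enter or leave the neighborhood sphere.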

II. Training an NNP

This example illustrates how to quickly create a high-dimensional neural network potential (HDNNP) instance from input setting files and train it on the input structures. The trained potential can then be used to evaluate the energy and force components for new structures.

from pantea.datasets import Dataset
from pantea.potentials import NeuralNetworkPotential

structures = Dataset.from_runner("input.data")
structure = structures[0]

nnp = NeuralNetworkPotential.from_file("input.nn")

nnp.fit_scaler(structures)
nnp.fit_model(structures)

total_energy = nnp(structure)
print(total_energy)

forces = nnp.compute_forces(structure)
print(forces)

Example input files: input.data and input.nn
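Conceptually, an HDNNP decomposes the total energy into atomic contributions, E_total = Σ_i E_i, where each E_i is predicted from atom i's descriptor vector by a model for that atom's element. A toy pure-Python sketch of this decomposition (the per-element "models" and the numbers below are stand-ins for illustration, not Pantea's neural networks):

```python
# Toy sketch of the HDNNP energy decomposition (Behler-Parrinello):
# the total energy is the sum over atoms of E_i, where E_i comes from an
# element-specific model applied to that atom's descriptor vector.
def total_energy(descriptors, elements, models):
    return sum(models[e](g) for g, e in zip(descriptors, elements))

# Stand-in per-element "models" and descriptor vectors (illustrative only).
models = {
    "H": lambda g: -0.5 * sum(g),
    "O": lambda g: -2.0 * sum(g),
}
descriptors = [[0.02, 1.13], [0.02, 1.04], [0.01, 0.83]]  # one vector per atom
elements = ["H", "H", "O"]

print(total_energy(descriptors, elements, models))  # ~ -2.785
```

Because the energy is a sum of per-atom terms, the same trained potential transfers to systems of any size, which is what makes large-scale simulations with ab initio accuracy feasible.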

License

This project is licensed under the GNU General Public License (GPL) version 3 - see the LICENSE file for details.
