Commit

initial commit
bowen-bd committed Feb 24, 2023
0 parents commit fb28984
Showing 26 changed files with 4,938 additions and 0 deletions.
61 changes: 61 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,61 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

# C extensions
*.so

# Distribution / packaging
bin/
build/
develop-eggs/
dist/
eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
.tox/
.coverage
.cache
nosetests.xml
coverage.xml

# Translations
*.mo

# Mr Developer
.mr.developer.cfg
.project
.pydevproject

# Rope
.ropeproject

# Django stuff:
*.log
*.pot

# Sphinx documentation
docs/_build/

# Extras
*.xml
*.csv
*.tar
*.pt
.DS_Store
.idea
.ipynb_checkpoints
test/
examples/*
20 changes: 20 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,20 @@
exclude: '.git|.tox'
default_stages: [commit]
fail_fast: true

repos:
  - repo: https://github.com/psf/black
    rev: 22.3.0
    hooks:
      - id: black

  - repo: https://github.com/PyCQA/flake8
    rev: 4.0.1
    hooks:
      - id: flake8

  - repo: https://github.com/asottile/pyupgrade
    rev: v2.31.0
    hooks:
      - id: pyupgrade
        args: [--py38-plus]
44 changes: 44 additions & 0 deletions LICENSE
@@ -0,0 +1,44 @@
Crystal Hamiltonian Graph neural Network (CHGNet) Copyright (c) 2023, The Regents
of the University of California, through Lawrence Berkeley National
Laboratory (subject to receipt of any required approvals from the U.S.
Dept. of Energy) and the University of California, Berkeley. All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

(1) Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.

(2) Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.

(3) Neither the name of the University of California, Lawrence Berkeley
National Laboratory, U.S. Dept. of Energy, University of California,
Berkeley nor the names of its contributors may be used to endorse or
promote products derived from this software without specific prior written
permission.


THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

You are under no obligation whatsoever to provide any bug fixes, patches,
or upgrades to the features, functionality or performance of the source
code ("Enhancements") to anyone; however, if you choose to make your
Enhancements available either publicly, or directly to Lawrence Berkeley
National Laboratory, without imposing a separate written license agreement
for such Enhancements, then you hereby grant the following license: a
non-exclusive, royalty-free perpetual license to install, use, modify,
prepare derivative works, incorporate into other computer software,
distribute, and sublicense such enhancements or derivative works thereof,
in binary and source code form.
3 changes: 3 additions & 0 deletions MANIFEST.in
@@ -0,0 +1,3 @@
include *.py requirements*.txt
recursive-include chgnet *.py *.json
recursive-include chgnet/pretrained *
103 changes: 103 additions & 0 deletions README.md
@@ -0,0 +1,103 @@
# CHGNet

Crystal Hamiltonian Graph neural Network (CHGNet): a pretrained universal neural network potential for charge-informed molecular dynamics and beyond.
![chgnet](./chgnet.png)
# Installation:
CHGNet requires
- numpy~=1.21.6
- torch~=1.11.0
- pymatgen~=2022.4.19
- ase==3.22.0

To install:
```bash
pip install .
```

# Usage:
## Direct Inference (Static Calculation):
```python
from chgnet.model.model import CHGNet
from pymatgen.core import Structure

chgnet = CHGNet.load()
structure = Structure.from_file('examples/o-LiMnO2_unit.cif')
prediction = chgnet.predict_structure(structure)
```

## Molecular Dynamics:
Charge-informed molecular dynamics can be simulated with the pretrained `CHGNet` through the ASE environment:
```python
from chgnet.model.model import CHGNet
from chgnet.model.dynamics import MolecularDynamics
from pymatgen.core import Structure

structure = Structure.from_file('examples/o-LiMnO2_unit.cif')
chgnet = CHGNet.load()

md = MolecularDynamics(
    atoms=structure,
    model=chgnet,
    ensemble='nvt',
    compressibility_au=1.6,
    temperature=1000,  # in K
    timestep=2,  # in fs
    trajectory='md_out.traj',
    logfile='md_out.log',
    loginterval=100,
    use_device='cuda',  # use 'cpu' if no GPU is available
)
md.run(500000) # run a 1 ns MD simulation
```
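As a sanity check on the step count above, the relation between the timestep and the total simulated time can be sketched in plain Python (no CHGNet needed):

```python
# With a 2 fs timestep, how many MD steps make up 1 ns?
timestep_fs = 2
target_ns = 1
steps = target_ns * 1_000_000 // timestep_fs  # 1 ns = 1,000,000 fs
print(steps)  # 500000
```

This is why `md.run(500000)` corresponds to a 1 ns trajectory.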
Visualize the magnetic moments after the MD run:
```python
from ase.io.trajectory import Trajectory
from pymatgen.io.ase import AseAtomsAdaptor

traj = Trajectory("md_out.traj")
mag = traj[100].get_magnetic_moments()

# convert to pymatgen structure
structure = AseAtomsAdaptor.get_structure(traj[100])
```
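Note that trajectory frames are written only every `loginterval` steps, so frame indices and simulation time differ. A quick conversion, assuming one frame per `loginterval` steps as configured above:

```python
# Which simulation time does trajectory frame 100 correspond to?
loginterval = 100  # frames are written every 100 steps
timestep_fs = 2    # fs per step
frame = 100
time_ps = frame * loginterval * timestep_fs / 1000  # fs -> ps
print(time_ps)  # 20.0
```

So `traj[100]` holds the snapshot at 20 ps into the run.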
## Structure Optimization:
```python
from chgnet.model import StructOptimizer

relaxer = StructOptimizer()
result = relaxer.relax(structure, steps=1000)
```

## Model Training / Fine-tune:
Fine-tuning helps achieve better accuracy when a high-precision study is desired. To train or fine-tune a `CHGNet`, you need to define your data in a PyTorch `Dataset` object. Example datasets are provided in `chgnet/data/dataset.py`:
```python
from chgnet.data.dataset import StructureData, get_train_val_test_loader
from chgnet.model.model import CHGNet
from chgnet.trainer import Trainer

dataset = StructureData(
    structures=list_of_structures,
    energies=list_of_energies,
    forces=list_of_forces,
    stresses=list_of_stresses,
    magmoms=list_of_magmoms,
)
train_loader, val_loader, test_loader = get_train_val_test_loader(
    dataset,
    batch_size=32,
    train_ratio=0.9,
    val_ratio=0.05,
)
chgnet = CHGNet.load()  # start fine-tuning from the pretrained model
trainer = Trainer(
    model=chgnet,
    targets='efsm',  # energy, forces, stress, magmoms
    optimizer='Adam',
    criterion='MSE',
    learning_rate=1e-2,
    epochs=50,
    use_device='cuda',
)

trainer.train(train_loader, val_loader, test_loader)
```
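With `train_ratio=0.9` and `val_ratio=0.05`, the remaining samples presumably form the test set. A hedged sketch of that partition arithmetic (an assumption about the loader's behavior, not its actual implementation):

```python
# Split sizes implied by the ratios above, for a hypothetical 1000-sample dataset.
n_samples = 1000
train_ratio, val_ratio = 0.9, 0.05
n_train = int(n_samples * train_ratio)
n_val = int(n_samples * val_ratio)
n_test = n_samples - n_train - n_val  # whatever is left over
print(n_train, n_val, n_test)  # 900 50 50
```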
Empty file added __init__.py
Empty file.
Binary file added chgnet.png
4 changes: 4 additions & 0 deletions chgnet/__init__.py
@@ -0,0 +1,4 @@
"""
The pytorch implementation for CHGNet neural network potential
"""
__version__ = "0.0.1"
Empty file added chgnet/data/__init__.py
Empty file.
