Differentiable Programming Tensor Networks

Requirements

  • PyTorch 1.0+
  • A good GPU card if you are impatient or ambitious

Higher order gradient of free energy

Run this to compute the energy and specific heat of the 2D classical Ising model using automatic differentiation through the Tensor Renormalization Group (TRG) contraction.

$ cd 1_ising_TRG
$ python ising.py 

You can supply the command-line argument -use_checkpoint to reduce memory usage.


Variational optimization of iPEPS

Run this to optimize an iPEPS wavefunction for the 2D quantum Heisenberg model. Here, we use the Corner Transfer Matrix Renormalization Group (CTMRG) for contraction and L-BFGS for optimization.

$ cd 2_variational_iPEPS
$ python variational.py -D 3 -chi 30 

For a list of options, run python variational.py -h. To make use of the GPU, add -cuda <GPUID>. You will reach the state-of-the-art variational energy and staggered magnetization using this code. You can also supply your own Hamiltonian of interest.


What is under the hood?

Reverse-mode AD computes gradients accurately and efficiently for you! Check the code in adlib for the backward functions that propagate gradients through tensor network contractions.

To Cite

@article{Liao2019,
    title={Differentiable Programming Tensor Networks},
    author={Liao, Hai-Jun and Liu, Jin-Guo and Wang, Lei and Xiang, Tao},
    eprint={arXiv:1903.09650},
    url={https://arxiv.org/abs/1903.09650}
}

Explore more

https://github.com/under-Peter/TensorNetworkAD.jl
