
Commit

Update in Jun 2024
ChangwenXu98 committed Jun 23, 2024
1 parent bb185a5 commit 1eb2f9c
Showing 5 changed files with 33 additions and 4 deletions.
2 changes: 1 addition & 1 deletion _pages/about.md
@@ -10,7 +10,7 @@ redirect_from:

About Me
------
-I am a Ph.D. student in Mechanical Engineering at University of Michigan, working on Molecular Machine Learning in [EEG](https://www.cmu.edu/me/venkatgroup/) advised by [Prof. Viswanathan](https://aero.engin.umich.edu/people/viswanathan-venkat/). Earlier, I received M.S. in Computational Materials Science and Engineering at Carnegie Mellon University and did research in [Mechanical and AI Lab](https://sites.google.com/view/barati) advised by [Prof. Barati Farimani](https://www.meche.engineering.cmu.edu/directory/bios/barati-farimani-amir.html). Besides, I received B.Eng in Materials Science and Engineering at South China University of Technology.
+I am a Ph.D. student in Mechanical Engineering at the University of Michigan, working on molecular machine learning in [EEG](https://eeg.engin.umich.edu/), advised by [Prof. Viswanathan](https://aero.engin.umich.edu/people/viswanathan-venkat/). Earlier, I received an M.S. in Computational Materials Science and Engineering from Carnegie Mellon University, where I did research in the [Mechanical and AI Lab](https://sites.google.com/view/barati) advised by [Prof. Barati Farimani](https://www.meche.engineering.cmu.edu/directory/bios/barati-farimani-amir.html). I also hold a B.Eng. in Materials Science and Engineering from the South China University of Technology.

My research interests lie in combining artificial intelligence with interdisciplinary science and engineering problems. My current research focuses on implementing and improving foundation models for materials discovery and on leveraging scientific machine learning for electrolyte optimization. I believe deep learning models can learn representations from data, allowing us to approach scientific problems from a data-science perspective, a significant shift from traditional research strategies.

6 changes: 3 additions & 3 deletions _pages/cv.md
@@ -7,15 +7,15 @@ redirect_from:
- /resume
---

-You can download my [CV](http://ChangwenXu98.github.io/files/CV.pdf) (last updated September 2023)
+You can download my [CV](http://ChangwenXu98.github.io/files/CV.pdf) (last updated June 2024)

{% include base_path %}

Education
======
* Ph.D. in Mechanical Engineering, University of Michigan, 2027 (expected)
-* B.E. in Materials Science and Engineering, South China University of Technology, 2021
* M.S. in Computational Materials Science and Engineering, Carnegie Mellon University, 2022
+* B.E. in Materials Science and Engineering, South China University of Technology, 2021

Work experience
======
@@ -53,7 +53,7 @@ Blog Posts

Professional Services
======
-* Reviewer: NeurIPS'23, ICLR ML4Materials Workshop'23, ICML SPIGM Workshop'23
+* Reviewer: NeurIPS'23-24, ICLR'24, ICML'24, ICLR ML4Materials Workshop'23, ICML SPIGM Workshop'23-24, ICML AI4Science Workshop'24

Skills
======
14 changes: 14 additions & 0 deletions _talks/2024-04-03-Cloud.md
@@ -0,0 +1,14 @@
---
title: "CLOUD: A Scientific Foundation Model for Crystal Property Prediction"
collection: talks
type: "Poster"
permalink: /talks/2024-04-03-Cloud
venue: "MICDE Scientific Foundation Model Conference"
date: 2024-04-03
location: "Ann Arbor, MI"
---


**Abstract**

Property prediction of crystals is crucial for material design. However, developing machine learning models for these tasks is hampered by the need for labeled data from costly experiments or Density Functional Theory (DFT), resulting in limited data sizes and poor generalization to new crystals. Foundation models (FMs) present a potential solution: their self-supervised pretraining on unlabeled datasets yields better representation learning and transferability. Yet applying FMs to crystals is challenging due to the sparse number of valid structures for pretraining and the inadequacy of existing representations to capture critical structural information such as symmetry. Herein, we propose the CrystaL fOUnDation model (CLOUD), a Transformer-based foundation model for crystal property prediction. CLOUD uses a novel symmetry-aware string representation that efficiently encodes symmetry, equivalent sites, and constituent atoms, eliminating the need for coordinate information or equivariant models. Pretrained on million-scale crystal data from various databases via Masked Language Modeling (MLM), CLOUD is then fine-tuned and assessed on eight MatBench datasets. The model not only significantly outperforms structure-agnostic models and achieves near state-of-the-art results on two datasets, but also demonstrates robust scaling with data and model size, suggesting CLOUD's potential as a scalable crystal foundation model capable of learning from billions of unlabeled crystal structures.
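The MLM pretraining objective mentioned in the abstract can be illustrated with a minimal sketch: a fraction of tokens in the string representation is hidden behind a mask symbol, and the model is trained to recover the originals. The tokenization below (a space-group symbol, Wyckoff letters, element symbols) and the `mask_tokens` helper are illustrative assumptions, not CLOUD's actual vocabulary or implementation.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Corrupt a token sequence for masked-language-model pretraining.

    Returns the corrupted sequence and per-position labels: the original
    token where it was masked, None where the loss is ignored.
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            corrupted.append(mask_token)  # model must reconstruct this position
            labels.append(tok)
        else:
            corrupted.append(tok)         # left intact; excluded from the loss
            labels.append(None)
    return corrupted, labels

# Hypothetical symmetry-aware string for rock-salt NaCl, tokenized into a
# space-group symbol, Wyckoff letters, and element symbols (illustrative only).
tokens = ["Fm-3m", "a", "Na", "b", "Cl"]
corrupted, labels = mask_tokens(tokens, mask_rate=0.3)
```

Because the labels come from the data itself, no experimental or DFT annotation is needed, which is what lets this objective scale to unlabeled crystal databases.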
15 changes: 15 additions & 0 deletions _talks/2024-06-19-Cloud.md
@@ -0,0 +1,15 @@
---
title: "CLOUD: A Scientific Foundation Model for Crystal Property Prediction"
collection: talks
type: "Poster"
permalink: /talks/2024-06-19-Cloud
venue: "Molecular Machine Learning Conference"
date: 2024-06-19
location: "Montreal, Quebec"
---

Spotlight Paper at MoML 2024

**Abstract**

Property prediction of crystals is crucial for material design. However, developing machine learning models for these tasks is hampered by the need for labeled data from costly experiments or Density Functional Theory (DFT), resulting in limited data sizes and poor generalization to new crystals. Foundation models (FMs) present a potential solution: their self-supervised pretraining on unlabeled datasets yields better representation learning and transferability. Yet applying FMs to crystals is challenging due to the sparse number of valid structures for pretraining and the inadequacy of existing representations to capture critical structural information such as symmetry. Herein, we propose the CrystaL fOUnDation model (CLOUD), a Transformer-based foundation model for crystal property prediction. CLOUD uses a novel symmetry-aware string representation that efficiently encodes symmetry, equivalent sites, and constituent atoms, eliminating the need for coordinate information or equivariant models. Pretrained on million-scale crystal data from various databases via Masked Language Modeling (MLM), CLOUD is then fine-tuned and assessed on eight MatBench datasets. The model not only significantly outperforms structure-agnostic models and achieves near state-of-the-art results on two datasets, but also demonstrates robust scaling with data and model size, suggesting CLOUD's potential as a scalable crystal foundation model capable of learning from billions of unlabeled crystal structures.
Binary file modified files/CV.pdf
