Commit 2f0f602

Create Publications “2025-05-31-protein-language-model-zero-shot-fitness-predictions-are-improved-by-inference-only-dropout”
1 parent 6622438 commit 2f0f602

1 file changed: 23 additions, 0 deletions
---
layout: publication-single
title: Protein Language Model Zero-Shot Fitness Predictions are Improved by Inference-only Dropout
abstract: Protein Language Models (PLMs) such as ESM2 (Lin et al., 2023) have been shown to be capable of zero-shot prediction of critical scalar properties of proteins (“fitness”; Meier et al., 2021). In this work, we show that injecting a dropout layer at inference time between a PLM’s featurizer/embedding layer and its transformer, and averaging its output akin to Monte-Carlo dropout (Gal & Ghahramani, 2016), increases zero-shot performance on a subset of the ProteinGym dataset (Notin et al., 2023). This is the case even when the model was not trained with dropout to begin with, and does not require retraining or finetuning of the PLM. A dropout rate of 0.1 seems performant across all models.
published: 2025-05-31
authors:
  internal_authors:
    - Aditya Ravuri
    - Neil D. Lawrence
details:
  pdf: https://arxiv.org/pdf/2506.14793
container-title: "MLCB (workshop track) 2025."
---
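The abstract describes the technique concretely enough to sketch: insert a dropout layer between the embedding output and the transformer at inference time, run several stochastic forward passes, and average the outputs in the style of Monte-Carlo dropout. The following is a minimal illustrative sketch with a toy PLM in PyTorch, not the authors' code; the model architecture, the `mc_dropout_logprobs` helper, and the choice to average log-probabilities are all assumptions for illustration.

```python
# Illustrative sketch of inference-only Monte-Carlo dropout (NOT the paper's code).
# A toy "PLM": token embedding -> transformer encoder -> LM head.
import torch
import torch.nn as nn

torch.manual_seed(0)

class ToyPLM(nn.Module):
    def __init__(self, vocab=20, d_model=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, nhead=4, dropout=0.0, batch_first=True
        )  # dropout=0.0: mimics a model "not trained with dropout"
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, vocab)

    def forward(self, tokens, embed_dropout=None):
        x = self.embed(tokens)
        if embed_dropout is not None:
            # dropout injected between the embedding layer and the transformer
            x = embed_dropout(x)
        return self.head(self.encoder(x))

def mc_dropout_logprobs(model, tokens, p=0.1, n_samples=16):
    """Average log-softmax outputs over stochastic passes (hypothetical helper)."""
    drop = nn.Dropout(p)
    drop.train()   # keep the injected dropout stochastic at inference
    model.eval()   # the PLM itself stays deterministic
    with torch.no_grad():
        samples = [
            model(tokens, embed_dropout=drop).log_softmax(-1)
            for _ in range(n_samples)
        ]
    return torch.stack(samples).mean(0)

tokens = torch.randint(0, 20, (1, 12))   # one toy sequence of length 12
model = ToyPLM()
avg = mc_dropout_logprobs(model, tokens, p=0.1)
print(avg.shape)  # per-position averaged log-probabilities, shape (1, 12, 20)
```

In a real zero-shot fitness setting, the averaged per-position log-probabilities would then be scored at mutated positions (as in Meier et al., 2021); that scoring step is omitted here.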
