From 852f21795ad35f33233c4d8f6a0a03d251412c66 Mon Sep 17 00:00:00 2001
From: Guspan Tanadi <36249910+guspan-tanadi@users.noreply.github.com>
Date: Mon, 23 Dec 2024 04:39:47 +0700
Subject: [PATCH] docs(2024/09/05/perf-update): links Installation page

---
 2024/09/05/perf-update.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/2024/09/05/perf-update.html b/2024/09/05/perf-update.html
index 9b74a58..bce15ba 100644
--- a/2024/09/05/perf-update.html
+++ b/2024/09/05/perf-update.html
@@ -210,7 +210,7 @@
-If you haven’t, we highly recommend you to update the vLLM version (see instructions here) and try it out for yourself! We always love to learn more about your use cases and how we can make vLLM better for you. The vLLM team can be reached out via vllm-questions@lists.berkeley.edu. vLLM is also a community project, if you are interested in participating and contributing, we welcome you to check out our roadmap and see good first issues to tackle. Stay tuned for more updates by following us on X.
+If you haven’t, we highly recommend you to update the vLLM version (see instructions here) and try it out for yourself! We always love to learn more about your use cases and how we can make vLLM better for you. The vLLM team can be reached out via vllm-questions@lists.berkeley.edu. vLLM is also a community project, if you are interested in participating and contributing, we welcome you to check out our roadmap and see good first issues to tackle. Stay tuned for more updates by following us on X.
 If you are in the Bay Area, you can meet the vLLM team at the following events: vLLM’s sixth meetup with NVIDIA (09/09), PyTorch Conference (09/19), CUDA MODE IRL meetup (09/21), and the first ever vLLM track at Ray Summit (10/01-02).