And what is the maximum value of `max_model_len` for DeepSeek-V2-Chat?

```python
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

max_model_len, tp_size = 8192, 8
```
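In general, vLLM's `max_model_len` cannot exceed the model's context window, which is recorded as `max_position_embeddings` in the model's `config.json` (readable via `transformers.AutoConfig.from_pretrained` with `trust_remote_code=True` for this model). A minimal sketch of the check, using a stand-in value instead of downloading the config:

```python
# Sketch: max_model_len is bounded above by the model's context window,
# i.e. the max_position_embeddings field in its config.json.
# The value below is a stand-in for illustration; read the real one with
# AutoConfig.from_pretrained("deepseek-ai/DeepSeek-V2-Chat", trust_remote_code=True).
max_position_embeddings = 163840  # stand-in; verify against the model's actual config.json

requested_max_model_len = 8192
assert requested_max_model_len <= max_position_embeddings, (
    "max_model_len exceeds the model's context window"
)
print(f"max_model_len can be at most {max_position_embeddings}")
```

Note that the practical ceiling is often lower than the config value: with `max_model_len` this large, the KV cache must fit in GPU memory across the tensor-parallel ranks, so vLLM may refuse to start or you may need to lower `gpu_memory_utilization` headroom.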