
No chat template for processor #845

Open

dtdo90 opened this issue Jan 15, 2025 · 2 comments
dtdo90 commented Jan 15, 2025

Dear author,

Thank you for the amazing work!!!

I am finetuning the Llama 3.2 11B Vision model, but I encountered the following error:

ValueError: No chat template is set for this processor. Please either set the chat_template attribute, or provide a chat template as an argument. See https://huggingface.co/docs/transformers/main/en/chat_templating for more information.

Based on ocrvqa_dataset.py, apply_chat_template is a step in processing the data (see the code snippet below). I very much look forward to your reply!

def tokenize_dialogs(dialogs, images, processor):
    # Render each dialog into a text prompt via the processor's chat template
    text_prompt = processor.apply_chat_template(dialogs)
    # Strip the BOS token so it is not duplicated when the processor tokenizes
    text_prompt = [prompt.replace('<|begin_of_text|>', '') for prompt in text_prompt]
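
For context, a minimal sketch of the call path that hits this error, written against the Hugging Face transformers API. The checkpoint id and message contents here are illustrative; a processor loaded from a checkpoint that ships without a chat_template (e.g. a base, non-Instruct variant, or an older processor version) raises the ValueError above:

from transformers import AutoProcessor

# Illustrative checkpoint; the Instruct variant ships with a chat template,
# while a processor without one raises the ValueError above
processor = AutoProcessor.from_pretrained("meta-llama/Llama-3.2-11B-Vision-Instruct")

# A batch with one single-turn dialog; {"type": "image"} marks where the
# <|image|> token is placed in the rendered prompt
dialogs = [
    [{"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": "What is shown in this image?"},
    ]}],
]
text_prompt = processor.apply_chat_template(dialogs)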

wukaixingxp (Contributor) commented

@dtdo90 Can you show me the code that reproduces this error? Thanks!

wukaixingxp self-assigned this Jan 17, 2025

dtdo90 commented Jan 19, 2025

Thank you for responding!

I found a workaround with a manually formatted prompt of the form:

formatted_prompt = (
    "<|begin_of_text|>"
    f"<|start_header_id|>system<|end_header_id|>{system_message}<|eot_id|>"
    f"<|start_header_id|>user<|end_header_id|>{user_message}<|image|><|eot_id|>"
    f"<|start_header_id|>assistant<|end_header_id|>{description}<|eot_id|><|end_of_text|>"
)
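
For comparison, a sketch of the equivalent dialog structure for .apply_chat_template, assuming the same system_message, user_message, and description variables (the content format follows the chat templating docs linked in the error message):

dialog = [
    {"role": "system", "content": [{"type": "text", "text": system_message}]},
    {"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": user_message},
    ]},
    {"role": "assistant", "content": [{"type": "text", "text": description}]},
]
# The chat template inserts the <|start_header_id|>/<|eot_id|> special
# tokens shown in the manual version above
text_prompt = processor.apply_chat_template(dialog)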

I checked again today and found that .apply_chat_template works now. I'm not sure what caused the error the other day.

Regarding the prompt formatting done by .apply_chat_template, is there documentation I can look up to understand the roles, message formats, and image inputs that the Llama processor supports?

Thank you again for providing the amazing Llama model and making it open source!
