add_generation_prompt=False in Tokenizer.apply_chat_template has no effect #30893
Comments
Hi @AndreiMuresanu, I just tried to reproduce this and …

I have also encountered this issue. Model: …

Can you try installing …

Same problem.

Solved. Because I used the initial chat template, which was later updated. See "Fix chat template to add generation prompt only if the option is selected" (#9)

@iseesaw thanks for pointing that one out! It seems like this was an issue with the model's chat template, which has since been resolved, and the issue will be fixed if people redownload the tokenizer. Going to close this issue now, since it's not a bug in …
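The fix referenced in the comment above follows the usual pattern for chat templates: the assistant header is emitted only when the caller requests a generation prompt. A minimal illustration of that conditional (ChatML-style header assumed; the actual template of the affected model was not preserved in this thread):

```jinja
{# Emit the assistant header only when add_generation_prompt is set. #}
{# The buggy template appended it unconditionally, so passing False had no effect. #}
{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}
```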
System Info
Hi, I am trying to apply a chat template to an input without the generation prompt. However, setting add_generation_prompt=False appears to have no effect.
Who can help?
@ArthurZucker @Rocketknight1
Information

Tasks

examples folder (such as GLUE/SQuAD, ...)

Reproduction
Minimal reproducible example:
transformers==4.41.0
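The original reproduction snippet was not preserved in this thread. The following is a hypothetical sketch of the behavior being reported, using a plain-Python stand-in for `Tokenizer.apply_chat_template` (the real method renders a Jinja chat template; the model and template from the report are unknown):

```python
# Simplified stand-in for Tokenizer.apply_chat_template with a ChatML-style
# template, for illustration only. The reported bug was that the model's
# template appended the assistant header unconditionally, so
# add_generation_prompt=False appeared to have no effect.

def apply_chat_template(messages, add_generation_prompt):
    out = ""
    for m in messages:
        out += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    if add_generation_prompt:  # the corrected template makes this conditional
        out += "<|im_start|>assistant\n"
    return out

msgs = [{"role": "user", "content": "Hi"}]
print(apply_chat_template(msgs, add_generation_prompt=False))
print(apply_chat_template(msgs, add_generation_prompt=True))
```

With the corrected template, the two calls differ only by the trailing `<|im_start|>assistant\n` header; with the buggy template, both outputs included it.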
Expected behavior
Observed Output:
Expected Output: