Chat Api With Tokenizer Chat Template
formats.chat_api_with_tokenizer_chat_template

HFSystemFormat(
    model_name="ibm-granite/granite-3.1-2b-instruct",
)
Explanation about HFSystemFormat
Formats the complete input for the model using the HuggingFace chat template of a given model.
HFSystemFormat formats the instance fields into a single string to be fed to the model. This string overwrites the "source" field of the instance.
- Example:
HFSystemFormat(model_name="HuggingFaceH4/zephyr-7b-beta")
Uses the template defined in the tokenizer_config.json of the model:
"chat_template": "{% for message in messages %}\n{% if message['role'] == 'user' %}\n{{ '<|user|>\n' + message['content'] + eos_token }}\n{% elif message['role'] == 'system' %}\n{{ '<|system|>\n' + message['content'] + eos_token }}\n{% elif message['role'] == 'assistant' %}\n{{ '<|assistant|>\n' + message['content'] + eos_token }}\n{% endif %}\n{% if loop.last and add_generation_prompt %}\n{{ '<|assistant|>' }}\n{% endif %}\n{% endfor %}"
See more details at https://huggingface.co/docs/transformers/main/en/chat_templating
Read more about catalog usage here.
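For example, a hedged sketch of pulling this catalog entry into a Unitxt recipe (the card, template, and loader_limit below are placeholder choices, not part of this entry):

from unitxt import load_dataset

# Placeholder card and template; any compatible catalog pair works here.
dataset = load_dataset(
    card="cards.wnli",
    template="templates.classification.multi_class.relation.default",
    format="formats.chat_api_with_tokenizer_chat_template",
    loader_limit=20,
)

# The format has already rewritten the "source" field of every instance
# using the granite-3.1-2b-instruct chat template.
print(dataset["test"][0]["source"])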