author    Xuan Son Nguyen <thichthat@gmail.com>  2024-02-22 00:31:00 +0100
committer GitHub <noreply@github.com>  2024-02-22 00:31:00 +0100
commit    7c8bcc11dc61cf5930b70cd0168b84afcebe12a9 (patch)
tree      f5b04881466f01302d9626433f650763785f8818 /llama.h
parent    7fe4678b0244ba7b03eae66ebeaa947e2770bb1a (diff)
Add docs for llama_chat_apply_template (#5645)
* add docs for llama_chat_apply_template
* fix typo
Diffstat (limited to 'llama.h')
 llama.h | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
@@ -708,7 +708,7 @@ extern "C" {
     /// Apply chat template. Inspired by hf apply_chat_template() on python.
     /// Both "model" and "custom_template" are optional, but at least one is required. "custom_template" has higher precedence than "model"
-    /// NOTE: This function only support some known jinja templates. It is not a jinja parser.
+    /// NOTE: This function does not use a jinja parser. It only support a pre-defined list of template. See more: https://github.com/ggerganov/llama.cpp/wiki/Templates-supported-by-llama_chat_apply_template
     /// @param tmpl A Jinja template to use for this chat. If this is nullptr, the model’s default chat template will be used instead.
     /// @param chat Pointer to a list of multiple llama_chat_message
     /// @param n_msg Number of llama_chat_message in this chat