ik_llama.cpp.git: commit log for path prompts/ (branch: main)
Date        Author           Commit message
2023-12-01  Shijie           llama : add Qwen support (#4281)
2023-10-18  Georgi Gerganov  speculative : add tree-based sampling example (#3624)
2023-10-12  Georgi Gerganov  prompts : add mnemonics.txt
2023-10-06  Georgi Gerganov  prompts : fix editorconfig checks after #3416
2023-10-06  pudepiedj        parallel : add option to load external prompt file (#3416)
2023-09-14  jameswu2014      feature : support Baichuan serial models (#3009)
2023-05-11  CRD716           prompts : model agnostic DAN (#1304)
2023-05-03  khimaros         examples : read chat prompts from a template file (#1196)
2023-05-03  CRD716           examples : various prompt and example fixes (#1298)
2023-04-14  Pavol Rusnak     Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982)
2023-04-14  Tomáš Pazdiora   main : alternative instruct mode (Vicuna support, etc.) (#863)
2023-04-13  Pavol Rusnak     do not force the prompt file to end with a new line (#908)
2023-03-29  Tobias Lütke     add example of re-act pattern (#583)
2023-03-25  Georgi Gerganov  Add longer DAN prompt for testing big batch numbers
2023-03-19  Georgi Gerganov  Add "--instruct" argument for usage with Alpaca (#240)