author    khimaros <me@khimaros.com>  2023-05-03 10:58:11 -0700
committer GitHub <noreply@github.com>  2023-05-03 20:58:11 +0300
commit    6daa09d87926fe654385c2887e39ec3eeaa58120 (patch)
tree      a97352647984747f2fb380f272c790f5fc3357c3 /examples
parent    bca9ad938a2a43621cf406d993b755cc91728dd5 (diff)
examples : read chat prompts from a template file (#1196)
Diffstat (limited to 'examples')
-rwxr-xr-x examples/chat-13B.sh | 48
1 file changed, 18 insertions(+), 30 deletions(-)
diff --git a/examples/chat-13B.sh b/examples/chat-13B.sh
index d7148d18..35c089d5 100755
--- a/examples/chat-13B.sh
+++ b/examples/chat-13B.sh
@@ -1,9 +1,12 @@
#!/bin/bash
+set -e
+
cd "$(dirname "$0")/.." || exit
MODEL="${MODEL:-./models/13B/ggml-model-q4_0.bin}"
-USER_NAME="${USER_NAME:-User}"
+PROMPT_TEMPLATE=${PROMPT_TEMPLATE:-./prompts/chat.txt}
+USER_NAME="${USER_NAME:-USER}"
AI_NAME="${AI_NAME:-ChatLLaMa}"
# Adjust to the number of CPU cores you want to use.
@@ -15,39 +18,24 @@ N_PREDICTS="${N_PREDICTS:-2048}"
# For example, override the context size by doing: ./chatLLaMa --ctx_size 1024
GEN_OPTIONS="${GEN_OPTIONS:---ctx_size 2048 --temp 0.7 --top_k 40 --top_p 0.5 --repeat_last_n 256 --batch_size 1024 --repeat_penalty 1.17647}"
+DATE_TIME=$(date +%H:%M)
+DATE_YEAR=$(date +%Y)
+
+PROMPT_FILE=$(mktemp -t llamacpp_prompt.XXXXXXX.txt)
+
+sed -e "s/\[\[USER_NAME\]\]/$USER_NAME/g" \
+ -e "s/\[\[AI_NAME\]\]/$AI_NAME/g" \
+ -e "s/\[\[DATE_TIME\]\]/$DATE_TIME/g" \
+ -e "s/\[\[DATE_YEAR\]\]/$DATE_YEAR/g" \
+ $PROMPT_TEMPLATE > $PROMPT_FILE
+
# shellcheck disable=SC2086 # Intended splitting of GEN_OPTIONS
./main $GEN_OPTIONS \
--model "$MODEL" \
--threads "$N_THREAD" \
--n_predict "$N_PREDICTS" \
--color --interactive \
+ --file ${PROMPT_FILE} \
--reverse-prompt "${USER_NAME}:" \
- --prompt "
-Text transcript of a never ending dialog, where ${USER_NAME} interacts with an AI assistant named ${AI_NAME}.
-${AI_NAME} is helpful, kind, honest, friendly, good at writing and never fails to answer ${USER_NAME}'s requests immediately and with details and precision.
-There are no annotations like (30 seconds passed...) or (to himself), just what ${USER_NAME} and ${AI_NAME} say aloud to each other.
-The dialog lasts for years, the entirety of it is shared below. It's 10000 pages long.
-The transcript only includes text, it does not include markup like HTML and Markdown.
-
-$USER_NAME: Hello, $AI_NAME!
-$AI_NAME: Hello $USER_NAME! How may I help you today?
-$USER_NAME: What year is it?
-$AI_NAME: We are in $(date +%Y).
-$USER_NAME: Please tell me the largest city in Europe.
-$AI_NAME: The largest city in Europe is Moscow, the capital of Russia.
-$USER_NAME: What can you tell me about Moscow?
-$AI_NAME: Moscow, on the Moskva River in western Russia, is the nation's cosmopolitan capital. In its historic core is the Kremlin, a complex that's home to the president and tsarist treasures in the Armoury. Outside its walls is Red Square, Russia’s symbolic center.
-$USER_NAME: What is a cat?
-$AI_NAME: A cat is a domestic species of small carnivorous mammal. It is the only domesticated species in the family Felidae.
-$USER_NAME: How do I pass command line arguments to a Node.js program?
-$AI_NAME: The arguments are stored in process.argv.
-
- argv[0] is the path to the Node. js executable.
- argv[1] is the path to the script file.
- argv[2] is the first argument passed to the script.
- argv[3] is the second argument passed to the script and so on.
-$USER_NAME: Name a color.
-$AI_NAME: Blue.
-$USER_NAME: What time is it?
-$AI_NAME: It is $(date +%H:%M).
-$USER_NAME:" "$@"
+ --in-prefix ' ' \
+ "$@"
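The templating mechanism introduced by this patch can be sketched standalone: placeholders of the form `[[NAME]]` in a template file are substituted with shell variables via `sed`, and the result is written to a temporary file that is passed to `./main --file`. This is a minimal sketch using an inline demo template, not the actual `prompts/chat.txt` shipped with the repository:

```shell
#!/bin/bash
set -e

# Hypothetical demo template mirroring the [[PLACEHOLDER]] convention
# used by prompts/chat.txt in the patch above.
TEMPLATE=$(mktemp -t template_demo.XXXXXXX.txt)
cat > "$TEMPLATE" <<'EOF'
[[USER_NAME]]: What year is it?
[[AI_NAME]]: We are in [[DATE_YEAR]].
EOF

USER_NAME="USER"
AI_NAME="ChatLLaMa"
DATE_YEAR=$(date +%Y)

# Fill in the placeholders, writing the expanded prompt to a temp file,
# just as chat-13B.sh does before invoking ./main --file "$PROMPT_FILE".
PROMPT_FILE=$(mktemp -t llamacpp_prompt.XXXXXXX.txt)
sed -e "s/\[\[USER_NAME\]\]/$USER_NAME/g" \
    -e "s/\[\[AI_NAME\]\]/$AI_NAME/g" \
    -e "s/\[\[DATE_YEAR\]\]/$DATE_YEAR/g" \
    "$TEMPLATE" > "$PROMPT_FILE"

cat "$PROMPT_FILE"   # shows the template with all placeholders expanded
rm -f "$TEMPLATE" "$PROMPT_FILE"
```

Note that the patch leaves `$PROMPT_TEMPLATE` and `$PROMPT_FILE` unquoted in the `sed` invocation; quoting them (as in the sketch) guards against paths containing spaces.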