From fc56a02efb3aed9be53e08f8246d575db3ad33a6 Mon Sep 17 00:00:00 2001
From: guinmoon
Date: Mon, 27 May 2024 18:19:10 +0300
Subject: [PATCH] Prompt format info

---
 docs/prompt_format.md | 20 ++++++++++++++++++++
 llmfarm_core.swift    |  2 +-
 2 files changed, 21 insertions(+), 1 deletion(-)
 create mode 100644 docs/prompt_format.md

diff --git a/docs/prompt_format.md b/docs/prompt_format.md
new file mode 100644
index 0000000..dd44209
--- /dev/null
+++ b/docs/prompt_format.md
@@ -0,0 +1,20 @@
+# Prompt format
+Use `{prompt}` or `{{prompt}}` to mark where the user's prompt is inserted.
+A system prompt can be specified at the beginning of the template in the format `[system]()`, with the system prompt placed inside the parentheses.
+
+## BOS option
+Adds the beginning-of-sequence (BOS) token to the start of the prompt.
+
+## EOS option
+Adds the end-of-sequence (EOS) token to the end of the prompt.
+
+## Special option
+If enabled, the tokenizer accepts special tokens in the template, such as `<|user|>`.
+
+## Reverse prompts option
+Specifies sequences at which prediction stops. Sequences are separated by commas.
+Example: `<|end|>,user:`.
+
+## Skip tokens option
+Specifies tokens (as strings) that are omitted from the prediction output. Tokens are separated by commas.
+Example: `<|end|>,<|assistant|>`.
diff --git a/llmfarm_core.swift b/llmfarm_core.swift
index 07d53c7..9c7aac5 160000
--- a/llmfarm_core.swift
+++ b/llmfarm_core.swift
@@ -1 +1 @@
-Subproject commit 07d53c734605cc112dd528993fa349bd96364ee6
+Subproject commit 9c7aac54aa4a300667c481b7cd3b95c011165cba
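
As an illustration of the template format documented above, here is a minimal sketch of how `[system]()` extraction, `{prompt}`/`{{prompt}}` substitution, and skip-token filtering could work. This is Python pseudo-reference only, not LLMFarm's actual Swift implementation; the function names are hypothetical.

```python
import re

def build_prompt(template: str, user_prompt: str) -> str:
    """Fill a prompt template: pull out an optional [system](...) prefix
    and substitute {prompt} / {{prompt}} with the user's input.
    (Illustrative sketch; LLMFarm's real parsing may differ.)"""
    m = re.match(r'\[system\]\((.*?)\)', template)
    system = m.group(1) if m else ""
    body = template[m.end():] if m else template
    body = body.replace("{{prompt}}", user_prompt).replace("{prompt}", user_prompt)
    return system + body

def strip_skip_tokens(text: str, skip_tokens: str) -> str:
    """Remove comma-separated skip tokens (string form) from generated text,
    mirroring the 'skip tokens' option described in the patch."""
    for tok in skip_tokens.split(","):
        text = text.replace(tok, "")
    return text
```

For example, `build_prompt("[system](You are helpful.)<|user|>{{prompt}}<|end|>", "Hi")` yields `"You are helpful.<|user|>Hi<|end|>"`, and `strip_skip_tokens("<|end|>Hello<|assistant|>", "<|end|>,<|assistant|>")` yields `"Hello"`.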