author     goerch <jhr.walter@t-online.de>           2023-10-03 09:16:26 +0200
committer  GitHub <noreply@github.com>                2023-10-03 09:16:26 +0200
commit     ff5a3f0c09dfa0a8e0bf76d1748df5c6dee0e8ff (patch)
tree       356ce471234d1f82db452e6274a951ac0b72cb9f /common/common.cpp
parent     1c84003c08027f5d3a4cb876f51d6b6224a34d0e (diff)
Work on the BPE tokenizer (#3252)
* Work on the BPE tokenizer

  Tokenizer tests work for Falcon-7B

* Try to fix build problem
* Fix debug assertion failure
* Fix MSVC Unicode BOM problem
* Cleanup and an improvement
* Fix compiler warning
* Cleanup
* Test doesn't work over the full range of Unicode code points
* Update .gitignore and Makefile
* Another Makefile rule
* Testing Aquila
* Moving byte decoding back to `token_to_piece` ...

  ... because everyone is using it.

* Guarding some unusable code paths
* Streamlining code and adding some more assertions

  Important change: added tokens are now classified as control tokens for BPE (see the sketch after this list).

* Adding a comment
* Adding another assertion
* Fixed vocabulary guarding assertions
* Fix PR for recent change
* Fix PR for recent change
* Fix for compiler warning
* Fix PR for recent change
* Fix PR for recent change
* Fix PR for recent change
* Fix for compiler warning
* Fixes for more compiler warnings
* Remove unused code
* Fix initialization of static maps
* Add scores and token types back, adapt gptneox
* Update llama.cpp

  Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

* Update unicode.h

  Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

* Update unicode.h

  Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

* Ported Starcoder and added some assertions
* Fix coding style
* Apply @jploski's fix for missing tokens

---------

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
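The "added tokens as control tokens" change can be illustrated with a small, self-contained sketch. This is not the llama.cpp implementation; the enum, the vocab_entry struct, and classify_added_tokens are hypothetical stand-ins for the real vocabulary structures.

#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical token classification: added/special tokens (e.g. "<|endoftext|>")
// are tagged as control tokens so detokenization can skip them instead of
// emitting their literal text.
enum token_type { TOKEN_TYPE_NORMAL, TOKEN_TYPE_CONTROL, TOKEN_TYPE_BYTE };

struct vocab_entry {
    std::string text;
    token_type  type = TOKEN_TYPE_NORMAL;
};

static void classify_added_tokens(std::vector<vocab_entry> & vocab,
                                  const std::vector<std::string> & added_tokens) {
    // index the vocabulary by piece text
    std::unordered_map<std::string, size_t> index;
    for (size_t i = 0; i < vocab.size(); ++i) {
        index[vocab[i].text] = i;
    }
    // every added token that exists in the vocabulary becomes a control token
    for (const auto & tok : added_tokens) {
        const auto it = index.find(tok);
        if (it != index.end()) {
            vocab[it->second].type = TOKEN_TYPE_CONTROL;
        }
    }
}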
Diffstat (limited to 'common/common.cpp')
-rw-r--r--  common/common.cpp  1
1 file changed, 1 insertion, 0 deletions
diff --git a/common/common.cpp b/common/common.cpp
index 4b233786..7370017f 100644
--- a/common/common.cpp
+++ b/common/common.cpp
@@ -923,6 +923,7 @@ std::string llama_detokenize_bpe(llama_context * ctx, const std::vector<llama_to
         result += piece;
     }
+    // NOTE: the original tokenizer decodes bytes after collecting the pieces.
     return result;
 }
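For context on the NOTE added above: in a GPT-2-style byte-level BPE, every raw byte is represented by a printable Unicode character, so the reference tokenizer concatenates the pieces first and only then maps the characters back to bytes. Below is a minimal, self-contained sketch of that final decoding step. It is not the llama.cpp code; unicode_to_byte and byte_level_decode are hypothetical helpers, and the byte-to-character table follows the GPT-2 convention.

#include <cstdint>
#include <map>
#include <string>

// Build the GPT-2 style mapping from Unicode code point back to raw byte:
// printable bytes map to themselves, the remaining bytes are shifted to 256 + n.
static std::map<uint32_t, uint8_t> unicode_to_byte() {
    std::map<uint32_t, uint8_t> m;
    int n = 0;
    for (int b = 0; b < 256; ++b) {
        const bool printable =
            (b >= 33 && b <= 126) || (b >= 161 && b <= 172) || (b >= 174 && b <= 255);
        m[printable ? (uint32_t) b : (uint32_t) (256 + n++)] = (uint8_t) b;
    }
    return m;
}

// Decode a UTF-8 string of byte-level characters back into the raw byte string.
static std::string byte_level_decode(const std::string & text) {
    static const std::map<uint32_t, uint8_t> u2b = unicode_to_byte();
    std::string out;
    for (size_t i = 0; i < text.size(); ) {
        // decode one UTF-8 code point (the table only needs 1- and 2-byte sequences)
        const uint8_t c = (uint8_t) text[i];
        uint32_t cp;
        size_t   len;
        if      ((c & 0x80) == 0x00) { cp = c;        len = 1; }
        else if ((c & 0xE0) == 0xC0) { cp = c & 0x1F; len = 2; }
        else if ((c & 0xF0) == 0xE0) { cp = c & 0x0F; len = 3; }
        else                         { cp = c & 0x07; len = 4; }
        for (size_t k = 1; k < len && i + k < text.size(); ++k) {
            cp = (cp << 6) | ((uint8_t) text[i + k] & 0x3F);
        }
        const auto it = u2b.find(cp);
        if (it != u2b.end()) {
            out += (char) it->second;
        } else {
            out.append(text, i, len); // leave unmapped characters as-is
        }
        i += len;
    }
    return out;
}

With a scheme like this, a detokenizer that follows the reference behaviour would call byte_level_decode(result) once after the concatenation loop, rather than decoding bytes token by token.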