path: root/examples/perplexity
author: David Sommers <12738+databyte@users.noreply.github.com> 2024-01-18 12:20:59 -0500
committer: GitHub <noreply@github.com> 2024-01-18 19:20:59 +0200
commit: b46757735d30f5c6ed4f20ebeccc684e02d4f3bf (patch)
tree: 7327dd01fecf2f45007ae84e9d09345f317083bb /examples/perplexity
parent: 3e945cc1e9c06d2001031360e4e303e9548fb02c (diff)
convert.py : fix llama/llama2 conversion due to vocab_size=-1 (#5019)
PR #4818 (merged last week) reintroduced a vocab_size config check that had already been addressed in PR #4258 (merged 2023-11-30). Without this fix, llama2 models cannot be converted; conversion fails with: `ValueError: The model's vocab size is set to -1 in params.json. Please update it manually. Maybe 32000?`
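The failure mode above can be sketched as follows. This is a minimal, hypothetical reconstruction of the kind of fallback the fix restores, not the actual convert.py code; the function name `resolve_vocab_size` and its parameters are illustrative only.

```python
import json

def resolve_vocab_size(params: dict, tokenizer_vocab_size: int) -> int:
    """Illustrative sketch: llama2 checkpoints ship a params.json with the
    placeholder value vocab_size = -1, so a converter must fall back to the
    tokenizer's vocabulary size instead of rejecting the config outright."""
    vocab_size = params.get("vocab_size", -1)
    if vocab_size == -1:
        # A strict check here would abort with:
        # ValueError: The model's vocab size is set to -1 in params.json. ...
        return tokenizer_vocab_size
    return vocab_size

# Example: a llama2-style params.json with the -1 placeholder.
params = json.loads('{"dim": 4096, "vocab_size": -1}')
print(resolve_vocab_size(params, 32000))
```

With the placeholder input above, the fallback yields the tokenizer's size (32000) rather than raising, which matches the behavior the commit message says the fix restores.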
Diffstat (limited to 'examples/perplexity')
0 files changed, 0 insertions, 0 deletions