Repository: ik_llama.cpp.git (branch: main)
Path: /examples/server
Date        Commit message (Author)
2024-08-12  Merge mainline - Aug 12 2024 (#17)  (Kawrakow)
2024-07-27  Merge mainline llama.cpp (#3)  (Kawrakow)
2024-06-20  server : fix smart slot selection (#8020)  (sasha0552)
2024-06-18  Only use FIM middle token if it exists (#7648)  (Sigbjørn Skjæret)
2024-06-13  `build`: rename main → llama-cli, server → llama-server, llava-cli → ll...  (Olivier Chafik)
2024-06-12  server : restore numeric prompts (#7883)  (Georgi Gerganov)
2024-06-11  json: refine constraint for whitespace to avoid runaways yet allow pretty pri...  (Olivier Chafik)
2024-06-11  `json`: document schema conversion in GBNF readme, align manual grammar examp...  (Olivier Chafik)
2024-06-10  server : improve "prompt" handling (#7847)  (Georgi Gerganov)
2024-06-09  server: do not remove whitespace at the start of a completion chunk (#7830)  (mgroeber9110)
2024-06-08  server : smart slot selection using Longest Common Prefix (#7728)  (sasha0552)
2024-06-07  server: update cache_prompt documentation [no ci] (#7745)  (Johannes Gäßler)
2024-06-07  server : do not get prompt in infill mode (#7286)  (woodx)
2024-06-06  imatrix : migrate to gpt_params (#7771)  (Georgi Gerganov)
2024-06-06  grammars: x{min,max} repetition operator (#6640)  (Olivier Chafik)
2024-06-04  common : refactor cli arg parsing (#7675)  (Georgi Gerganov)
2024-06-01  server : new UI (#7633)  (Yazan Agha-Schrader)
2024-06-02  SimpleChat: Simple histogram/repeatMatching driven garbageTrimming, Settings ...  (HanishKVC)
2024-05-31  server : update js (#7670)  (Georgi Gerganov)
2024-05-28  server: do not remove whitespace at the start of a completion chunk (#7524)  (mgroeber9110)
2024-05-28  Markdownish code block fix (#7571)  (Nathan Epstein)
2024-05-26  SimpleChat Completion Mode flexibility and cleanup, Settings gMe, Optional sl...  (HanishKVC)
2024-05-23  SimpleChat: a simple and dumb web front end for testing /chat/completions and...  (HanishKVC)
2024-05-22  common : normalize naming style (#7462)  (Georgi Gerganov)
2024-05-21  Tokenizer SPM fixes for phi-3 and llama-spm (bugfix) (#7425)  (jaime-m-p)
2024-05-20  Tokenizer SPM fixes for phi-3 and llama-spm (#7375)  (jaime-m-p)
2024-05-20  server : fix temperature + disable some tests (#7409)  (Georgi Gerganov)
2024-05-20  server : tuning tests (#7388)  (Georgi Gerganov)
2024-05-20  server : return error on too large embedding input (#7389)  (Georgi Gerganov)
2024-05-19  server: add test for token probs (#7347)  (Johannes Gäßler)
2024-05-19  server: fix seed being reported back (#7382)  (Johannes Gäßler)
2024-05-18  server: correct --threads documentation [no ci] (#7362)  (Johannes Gäßler)
2024-05-17  server : add support for the RPC backend (#7305)  (Radoslav Gerganov)
2024-05-17  [Server] Added --verbose option to README [no ci] (#7335)  (Leon Knauer)
2024-05-16  Revert "server bench: fix bench not waiting for model load (#7284)" (#7334)  (Pierrick Hymbert)
2024-05-15  server bench: fix bench not waiting for model load (#7284)  (Johannes Gäßler)
2024-05-14  server: free sampling contexts on exit (#7264)  (Steve Grubb)
2024-05-14  docs: Fix typo and update description for --embeddings flag (#7026)  (Ryuei)
2024-05-13  change default temperature of OAI compat API from 0 to 1 (#7226)  (Benjamin Findley)
2024-05-11  fix system prompt handling (#7153)  (Xuan Son Nguyen)
2024-05-11  server : free llama_batch on exit (#7212)  (Steve Grubb)
2024-05-11  server: fix reported top tokens for temperature 0 (#7203)  (Johannes Gäßler)
2024-05-08  convert-hf : save memory with lazy evaluation (#7075)  (compilade)
2024-05-08  JSON: [key] -> .at(key), assert() -> GGML_ASSERT (#7143)  (Johannes Gäßler)
2024-05-08  server : add themes + favicon (#6848)  (JohnnyB)
2024-05-08  server : add_special option for tokenize endpoint (#7059)  (Johan)
2024-05-08  clean up json_value & server_log (#7142)  (Xuan Son Nguyen)
2024-05-07  server: fix incorrectly reported token probabilities (#7125)  (Johannes Gäßler)
2024-05-07  server : update readme with undocumented options (#7013)  (Kyle Mistele)
2024-05-04  If first token generated from the server is the stop word the server will cra...  (maor-ps)