ik_llama.cpp.git (branch: main)
Commit log for path: root/examples/llava
Age         Commit message (Author)

2024-08-12  Merge mainline - Aug 12 2024 (#17)  (Kawrakow)
2024-07-27  Merge mainline llama.cpp (#3)  (Kawrakow)
2024-06-13  `build`: rename main → llama-cli, server → llama-server, llava-cli → ll...  (Olivier Chafik)
2024-06-04  common : refactor cli arg parsing (#7675)  (Georgi Gerganov)
2024-05-30  Move convert.py to examples/convert-legacy-llama.py (#7430)  (Galunid)
2024-05-28  llava : update clip.h (#7580)  (Ikko Eltociear Ashimine)
2024-05-22  common : normalize naming style (#7462)  (Georgi Gerganov)
2024-05-15  ggml : tag ggml_tensor::backend as deprecated (#7290)  (slaren)
2024-05-14  llava-cli: fix base64 prompt (#7248)  (k.h.lai)
2024-05-10  Fix memory bug in grammar parser (#7194)  (Justine Tunney)
2024-05-10  llava : fix moondream support (#7163)  (Andrei)
2024-05-08  Revert "llava : add support for moondream vision language model (#6899)"  (Georgi Gerganov)
2024-05-07  docs: fix typos (#7124)  (omahs)
2024-04-29  llava-cli : multiple images (#6969)  (cpumaxx)
2024-04-25  llava : add support for moondream vision language model (#6899)  (vik)
2024-04-25  clip : rename lerp function to avoid conflict (#6894)  (Daniel Bevenius)
2024-04-21  llava : use logger in llava-cli (#6797)  (Justine Tunney)
2024-04-21  llama : support Llama 3 HF conversion (#6745)  (Pedro Cuenca)
2024-04-12  chore: Fix markdown warnings (#6625)  (Rene Leonhardt)
2024-04-09  BERT tokenizer fixes (#6498)  (Jared Van Bortel)
2024-03-28  llava : fix MobileVLM (#6364)  (Ziang Wu)
2024-03-28  doc: fix typo in MobileVLM-README.md (#6181)  (Ziang Wu)
2024-03-26  cuda : rename build flag to LLAMA_CUDA (#6299)  (slaren)
2024-03-20  llava : update MobileVLM-README.md (#6180)  (Ziang Wu)
2024-03-20  llava : add MobileVLM_V2 backup (#6175)  (Ziang Wu)
2024-03-20  Revert "llava : add a MobileVLM_V2-1.7B backup (#6152)"  (Georgi Gerganov)
2024-03-20  llava : add a MobileVLM_V2-1.7B backup (#6152)  (Ziang Wu)
2024-03-18  clip : fix memory leak (#6138)  (Felix)
2024-03-15  llava : change API to pure C style for Rust FFI bindgen (#6079)  (Ting Lou)
2024-03-14  gguf : fix resource leaks (#6061)  (Steve Grubb)
2024-03-14  readme : improve readme for Llava-1.6 example (#6044)  (Jian Liao)
2024-03-09  ggml : remove old quantization functions (#5942)  (Georgi Gerganov)
2024-02-25  code : normalize enum names (#5697)  (Georgi Gerganov)
2024-02-21  llava : add --skip-unknown to 1.6 convert.py (#5632)  (Daniel Bevenius)
2024-02-20  server : support llava 1.6 (#5553)  (CJ Pais)
2024-02-20  llava : add explicit instructions for llava-1.6 (#5611)  (Daniel Bevenius)
2024-02-19  llava : remove extra cont (#5587)  (Georgi Gerganov)
2024-02-19  llava : replace ggml_cpy with ggml_cont  (slaren)
2024-02-19  llava : avoid changing the original BakLLaVA model (#5577)  (Daniel Bevenius)
2024-02-18  llava : update surgery script to not remove tensors (#5536)  (Daniel Bevenius)
2024-02-16  llava : removed excess free(NULL) operation (#5531)  (Herman Semenov)
2024-02-16  ggml : add numa options (#5377)  (bmwl)
2024-02-16  llava : fix clip-model-is-vision flag in README.md (#5509)  (Daniel Bevenius)
2024-02-15  clip : fix wrong loop condition  (Georgi Gerganov)
2024-02-15  llava : fix memory management bug (#5491)  (Elbios)
2024-02-15  llaba : hotfix for llava-1.6 image number (#5495)  (John)
2024-02-14  llava : update README.md (#5489)  (John)
2024-02-14  llava : support v1.6 (#5267)  (John)
2024-02-12  llava : remove prog parameter from ArgumentParser (#5457)  (Daniel Bevenius)
2024-02-12  sync : ggml (#5452)  (Georgi Gerganov)