ik_llama.cpp.git, branch main: commit log for common/common.cpp

Date       | Commit message | Author
2023-11-05 | ggml-cuda : fix f16 mul mat (#3961) | slaren
2023-11-05 | Allow common process_escapes to handle \x sequences (#3928) | Kerfuffle
2023-11-03 | speculative : change default p_accept to 0.5 + CLI args (#3919) | Georgi Gerganov
2023-11-02 | build : link against build info instead of compiling against it (#3879) | cebtenzzre
2023-11-01 | llama : implement YaRN RoPE scaling (#2268) | cebtenzzre
2023-11-01 | common : minor (#3715) | Georgi Gerganov
2023-11-01 | common : allow caller to handle help/argument exceptions (#3715) | bandoti
2023-10-31 | samplers : Min-P sampler implementation [alternative to Top P/Top K] (#3841) | kalomaze
2023-10-29 | Extend llama_kv_cache_seq_rm to allow matching any sequence (#3843) | Kerfuffle
2023-10-28 | llama : add option for greedy sampling with probs (#3813) | Georgi Gerganov
2023-10-28 | common : print that one line of the syntax help *also* to standard output (#3... | Henk Poley
2023-10-23 | llama : remove token functions with `context` args in favor of `model` (#3720) | Marcus Dunn
2023-10-22 | main : escape prompt for cfg_negative_prompt and consecutive inputs in main w... | vvhg1
2023-10-20 | sampling : refactor init to use llama_sampling_params (#3696) | Georgi Gerganov
2023-10-18 | speculative : add tree-based sampling example (#3624) | Georgi Gerganov
2023-10-17 | tokenizer : special token handling (#3538) | staviq
2023-10-12 | examples: support LLaVA v1.5 (multimodal model) (#3436) | M. Yusuf Sarıgöz
2023-10-11 | common : fix mirostat state when using multiple sequences (#3543) | Kerfuffle
2023-10-07 | Fix trying to strip newline from empty prompt and cfg prompt file content (#3... | Kerfuffle
2023-10-06 | parallel : add option to load external prompt file (#3416) | pudepiedj
2023-10-06 | server : reuse llama_sample_token common util (#3494) | Jhen-Jie Hong
2023-10-05 | build : use std::make_tuple() for compatibility with older GCC versions (#3488) | Kenvix ⭐
2023-10-05 | common : process escape sequences in reverse prompts (#3461) | staviq
2023-10-03 | Work on the BPE tokenizer (#3252) | goerch
2023-10-02 | infill : add new example + extend server API (#3296) | vvhg1
2023-09-28 | build : enable more non-default compiler warnings (#3200) | Cebtenzzre
2023-09-28 | llama.cpp : split llama_context_params into model and context params (#3301) | slaren
2023-09-28 | train : finetune LORA (#2632) | xaedes
2023-09-28 | llama : custom attention mask + parallel decoding + no context swaps (#3228) | Georgi Gerganov
2023-09-20 | llama : allow gguf RoPE keys to be overridden with defaults (#3240) | Cebtenzzre
2023-09-16 | Fixing the last deviations from sentencepiece indicated by test-tokenizer-1 (... | goerch
2023-09-15 | check C++ code with -Wmissing-declarations (#3184) | Cebtenzzre
2023-09-15 | llama : remove mtest (#3177) | Roland
2023-09-13 | speculative: add --n-gpu-layers-draft option (#3063) | FK
2023-09-07 | fix some warnings from gcc and clang-tidy (#3038) | Cebtenzzre
2023-09-07 | metal : fix kernel_norm (fixes Falcon on Metal) (#3057) | Georgi Gerganov
2023-09-05 | examples : replace fprintf to stdout with printf (#3017) | Cebtenzzre
2023-09-05 | speculative : add grammar support (#2991) | Georgi Gerganov
2023-09-04 | build : on Mac OS enable Metal by default (#2901) | Georgi Gerganov
2023-09-03 | speculative : PoC for speeding-up inference via speculative sampling (#2926) | Georgi Gerganov
2023-09-03 | perplexity : fix ETA by warming up the model with an empty run | Georgi Gerganov
2023-09-01 | build : fix most gcc and clang warnings (#2861) | Cebtenzzre
2023-08-30 | main : log file (#2748) | staviq
2023-08-28 | train : mem usage and other improvements (#2439) | xaedes
2023-08-28 | YAML result logging + preset script (#2657) | Johannes Gäßler
2023-08-27 | llama : more tokenizer fixes (#2810) | Georgi Gerganov
2023-08-25 | ROCm Port (#1087) | Henri Vasserman
2023-08-23 | llm : add Falcon support (#2717) | Georgi Gerganov
2023-08-23 | Strided perplexity (#2714) | Kawrakow
2023-08-22 | CUDA: use mul_mat_q kernels by default (#2683) | Johannes Gäßler