
context : add cache-less llama_context
ggml-ci
ggerganov committed Feb 20, 2025
1 parent 072280e commit b1554be
Showing 8 changed files with 1,122 additions and 404 deletions.
common/common.cpp — 2 changes: 1 addition & 1 deletion
@@ -952,7 +952,7 @@ struct common_init_result common_init_from_params(common_params & params) {
     }

     if (params.ctx_shift && !llama_kv_self_can_shift(lctx)) {
-        LOG_WRN("%s: KV cache shifting is not supported for this model, disabling KV cache shifting\n", __func__);
+        LOG_WRN("%s: KV cache shifting is not supported for this context, disabling KV cache shifting\n", __func__);
         params.ctx_shift = false;
     }

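For illustration, a minimal standalone sketch of the guard shown in the hunk above. The helper name and signature are illustrative only and not part of this commit; it assumes only llama.h and the llama_kv_self_can_shift call that appears in the diff, and the "cache-less" interpretation in the comment is an assumption based on the commit title.

#include "llama.h"

// Illustrative helper (not part of this commit): mirrors the guard in
// common_init_from_params. If the active llama_context cannot shift its
// KV cache (presumably the case for a cache-less context), the
// context-shift feature is disabled for this run.
static void disable_ctx_shift_if_unsupported(llama_context * lctx, bool & ctx_shift) {
    if (ctx_shift && !llama_kv_self_can_shift(lctx)) {
        ctx_shift = false;
    }
}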
