feat: add ollama as supported provider (#1543)

* feat: add ollama as supported provider

*This implementation is only working with `stream = true`*
- Uses the native Ollama API and allows passing additional options
- Properly passes the system prompt to the API

Use ollama as provider in opts like this:
opts = {
  debug = true,
  provider = "ollama",
  ollama = {
    api_key_name = "",
    endpoint = "http://127.0.0.1:11434",
    model = "qwen2.5-coder:latest",
    options = {
      num_ctx = 32768,
      temperature = 0,
    },
    stream = true,
  },
}

* fix: ollama types

---------

Co-authored-by: jtabke <25010496+jtabke@users.noreply.github.com>
Commit 750ee80971 (parent 4976807a33) by yetone, committed via GitHub, 2025-03-10 02:23:56 +08:00.
5 changed files with 100 additions and 5 deletions.


@@ -683,6 +683,17 @@ return {
See [highlights.lua](./lua/avante/highlights.lua) for more information
## Ollama
Ollama is a first-class provider for avante.nvim. Use it by setting `provider = "ollama"` in the configuration and setting the `model` field under `ollama` to the model you want to use. For example:
```lua
provider = "ollama",
ollama = {
  model = "qwq:32b",
},
```
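Beyond `model`, the `ollama` table also accepts an `endpoint`, a `stream` flag, and an `options` table passed through to the Ollama API. A sketch of a fuller configuration (the endpoint, model, and option values here are illustrative, not required defaults):

```lua
provider = "ollama",
ollama = {
  endpoint = "http://127.0.0.1:11434", -- local Ollama server address
  model = "qwen2.5-coder:latest",
  stream = true, -- this provider currently only works with streaming enabled
  options = {
    num_ctx = 32768,  -- context window size
    temperature = 0,
  },
},
```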
## Custom providers
Avante provides a set of default providers, but users can also create their own providers.