Commit Graph

10 Commits

Each entry: Author / SHA1 / Message / Date
Karl Bowden
ce9f6a8ec1 feat: fetch ollama models to display in the model selector (#2287) 2025-06-22 16:36:28 +08:00
Avinash Thakur
8396cc77e4 feat: allow overriding provider headers (#2161) 2025-06-08 02:04:00 +08:00
yetone
bc403ddcbf feat: ReAct tool calling (#2104) 2025-05-31 08:53:34 +08:00
e8c5f4f13e feat(provider/ollama): allow optional API key without blocking the request (#1898) 2025-04-30 22:17:31 +08:00
yetone
f10b8383e3 refactor: history messages (#1934) 2025-04-30 03:07:18 +08:00
Ricardo Maraschini
cff8cbf9c5 feat: add config for ollama keep alive (#1858) 2025-04-17 10:47:51 +08:00
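(Context for the commit above: Ollama unloads an idle model after a timeout, and its API exposes a keep_alive request field to control this. The following is a hypothetical sketch only; the exact key name and placement inside avante.nvim's config are assumptions, not confirmed by this log.)

-- Hypothetical sketch: key name and placement are assumptions.
-- Ollama's own API accepts keep_alive values such as "5m" (a
-- duration) or -1 (keep the model loaded indefinitely).
opts = {
  provider = "ollama",
  ollama = {
    endpoint = "http://127.0.0.1:11434",
    model = "qwen2.5-coder:latest",
    keep_alive = "10m",  -- assumed field, mirroring Ollama's request parameter
  },
}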
kyrisu
20fea1e717 refactor: rename is_o_series_model to is_reasoning_model (#1632) 2025-03-19 01:45:28 +08:00
yetone
f9f92dc9d4 Revert "fix: Always handle the extra response outside of stream, such as the exceptions from bedrock (#1526)" (#1569)
This reverts commit f9ab6934d2.
2025-03-12 19:10:05 +08:00
brook hong
f9ab6934d2 fix: Always handle the extra response outside of stream, such as the exceptions from bedrock (#1526) 2025-03-12 16:29:32 +08:00
yetone
750ee80971 feat: add ollama as supported provider (#1543)
* feat: add ollama as supported provider

*This implementation only works with `stream = true`.*
- Uses the actual ollama API and allows passing additional options
- Properly passes the system prompt to the API

Use ollama as the provider in opts like this (see also the setup sketch after this commit entry):
opts = {
  debug = true,
  provider = "ollama",
  ollama = {
    api_key_name = "",                    -- empty: a local server needs no key
    endpoint = "http://127.0.0.1:11434",  -- default local ollama endpoint
    model = "qwen2.5-coder:latest",
    options = {                           -- passed through to the ollama API
      num_ctx = 32768,
      temperature = 0,
    },
    stream = true,
  },
}

* fix: ollama types

---------

Co-authored-by: jtabke <25010496+jtabke@users.noreply.github.com>
2025-03-10 02:23:56 +08:00
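For reference, an opts table like the one in #1543 above is handed to the plugin's setup entry point. A minimal sketch, assuming the standard require("avante").setup call; how you wrap it in a plugin manager (lazy.nvim, packer, ...) is up to you:

-- Minimal sketch: wiring the #1543 ollama opts into the plugin.
-- require("avante").setup is the assumed entry point; the values
-- below are taken from the commit message above.
require("avante").setup({
  debug = true,
  provider = "ollama",
  ollama = {
    api_key_name = "",
    endpoint = "http://127.0.0.1:11434",
    model = "qwen2.5-coder:latest",
    options = {
      num_ctx = 32768,
      temperature = 0,
    },
    stream = true,  -- required by the initial implementation
  },
})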