Commit Graph

232 Commits

Author SHA1 Message Date
msvechla
2f806ca342 Resolve AWS credentials using default credentials provider chain for Bedrock (#1752) 2025-05-16 11:13:40 +08:00
doodleEsc
0b78b58760 feat(file_selector): add integration with nvim-tree.lua (#1987)
Co-authored-by: doodleEsc <cokie@foxmail.com>
Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com>
2025-05-06 00:06:41 +08:00
Eric Chen
b028f32fbf feat(docs): Warp Support 2025-05-03 03:15:21 +08:00
yetone
f10b8383e3 refactor: history messages (#1934) 2025-04-30 03:07:18 +08:00
yetone
756d1f1e24 feat: universal selector (#1877) 2025-04-15 16:40:47 +08:00
William Kang
0d26590389 docs: Avante Chat History Selector Documentation and Commands (#1862) 2025-04-13 21:11:58 +08:00
guanghechen
c74ef1b9bc improve(prompt_input): support to customize the border and body highlight (#1846) 2025-04-11 10:20:40 +08:00
yetone
04336913b3 Revert "fix max_tokens for reasoning models (#1819)" (#1839)
* Revert "fix max_tokens for reasoning models (#1819)"

This reverts commit 1e2e233ff5.

* Revert "fix: revert max_completion_tokens to max_tokens (#1741)"

This reverts commit cd13eeb7d9.

* fix: nvim_version
2025-04-09 16:58:54 +08:00
Jorge Valdez
1fc57ab1ae feat: add support for searxng (#1814)
* Add support for searxng

* body.results not body.web

* type annotation

* update docs
2025-04-09 14:35:42 +08:00
doodleEsc
1c36cfc812 fix: enhance web search functionality with proxy support (#1823)
* fix: enhance web search functionality with proxy support

- Remove unnecessary blank line in README.md
- Add missing closing details tag in both README.md and README_zh.md
- Add proxy support to the web search engine configuration in config.lua
- Ensure the web search function uses the proxy setting when available in init.lua
- Fix a potential nil access in the response body format check in config.lua

Signed-off-by: 范立洲 <fanlizhou@yunqilaohe.com>

* [pre-commit.ci lite] apply automatic fixes

---------

Signed-off-by: 范立洲 <fanlizhou@yunqilaohe.com>
Co-authored-by: 范立洲 <fanlizhou@yunqilaohe.com>
Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com>
2025-04-07 15:05:22 +08:00
yetone
f83378a67e feat: add pre-commit ci (#1824) 2025-04-07 14:55:21 +08:00
yetone
7dc5560909 chores: update readme (#1786) 2025-03-31 23:28:29 +08:00
yetone
46073c0efd feat: aihubmix (#1780)
* feat: aihubmix

* feat: add chinese readme
2025-03-31 18:58:41 +08:00
yetone
c272bcd2ae chores: update business sponsors links (#1770) 2025-03-30 17:52:20 +08:00
yetone
45b7c5ddc6 chores: update business sponsors links (#1763) 2025-03-29 13:36:40 +08:00
yetone
2cd6d93640 chores(docs): update sponsor logo (#1756) 2025-03-28 12:44:13 +08:00
yetone
1dca83e578 feat(docs): add sponsors (#1755) 2025-03-28 12:33:02 +08:00
yetone
cd13eeb7d9 fix: revert max_completion_tokens to max_tokens (#1741) 2025-03-27 16:53:55 +08:00
guanghechen
0b4a493d60 improve: support to customize the keymaps for cancelling the editing (#1730)
* improve: support to customize the keymaps for cancelling the editing

* docs: update README
2025-03-26 21:58:03 +08:00
yetone
0e69891bdc chores: change business sponsor logo (#1727) 2025-03-26 17:10:10 +08:00
yetone
fc3e90ce3b feat: business sponsors (#1724) 2025-03-26 15:03:04 +08:00
Bogdan Manole
19cc52ee6b doc: adding config line for home-manager (#1680)
- related to #1679
2025-03-24 15:44:53 +08:00
Omar Crespo
bae5275705 feat: add stop sequence (#1652) 2025-03-21 19:34:33 +08:00
yetone
191d7b8783 feat: claude text editor tool (#1631) 2025-03-19 00:09:49 +08:00
kernitus
10ce065d9e feat: update openai/azure params (#1604)
* feat(openai): use max_completion_tokens & reasoning_effort params

* feat(openai): use developer prompt for reasoning models

* docs: update openai config in readme

* refactor: follow lua style quotes

* fix(azure): rename max_tokens to max_completion_tokens

* refactor(azure): remove duplicate field

* refactor: update types

* refactor(azure): update type
2025-03-18 19:40:20 +08:00
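The parameter changes in the commit above can be illustrated with a short provider-config sketch. This is a hypothetical example, not taken from the repo: the endpoint, model name, and values are assumptions; only the `max_completion_tokens` and `reasoning_effort` parameter names come from the commit messages.

```lua
-- hypothetical sketch of an openai provider block (values are assumptions)
openai = {
  endpoint = "https://api.openai.com/v1",
  model = "o3-mini",            -- a reasoning model
  max_completion_tokens = 4096, -- replaces the deprecated max_tokens param
  reasoning_effort = "medium",  -- e.g. "low" | "medium" | "high"
},
```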
yetone
62a8c07e91 fix(docs): lazy version (#1628) 2025-03-18 19:37:17 +08:00
yetone
cd50140cb4 docs: add better codebase indexing to todo list (#1585) 2025-03-14 16:05:14 +08:00
yetone
931a364dad feat: add AvanteModels command (#1575) 2025-03-13 01:45:20 +08:00
yetone
d422cfabcd fix(docs): details format (#1573) 2025-03-13 00:04:56 +08:00
yetone
f3583a04ca docs: add mcp doc (#1572) 2025-03-13 00:01:19 +08:00
yetone
409ee9cfda docs: update Discord invite link (#1568) 2025-03-12 18:09:01 +08:00
RiN
de6e3657f1 docs: add example avante with nixvim options (#1561) 2025-03-12 02:57:35 +08:00
Jae-Won Chung
9fa2d9e51d fix: remove sidebar.close_from_input default keybinding (#1536)
* Remove <c-d> from default insert mode keybinding

* Check `close_from_input ~= nil` before keymap

* Default `close_from_input` to `nil`

* Update README.md keybinding defaults

* Try to pass type check
2025-03-10 15:31:13 +08:00
yetone
cdbfe79097 docs: for ollama (#1545) 2025-03-10 02:43:12 +08:00
yetone
750ee80971 feat: add ollama as supported provider (#1543)
* feat: add ollama as supported provider

*This implementation only works with `stream = true`*
- Uses the actual ollama API and allows passing additional options
- Properly passes the system prompt to the API

Use ollama as provider in opts like this:
opts = {
  debug = true,
  provider = "ollama",
  ollama = {
    api_key_name = "",
    endpoint = "http://127.0.0.1:11434",
    model = "qwen2.5-coder:latest",
    options = {
      num_ctx = 32768,
      temperature = 0,
    },
    stream = true,
  },
}
* fix: ollama types

---------

Co-authored-by: jtabke <25010496+jtabke@users.noreply.github.com>
2025-03-10 02:23:56 +08:00
moecasts
32665974ee feat: add neotree shortcut tip (#1537) 2025-03-09 15:43:39 +08:00
Lukas Nakamura
510bf2ff35 docs: update README with the AvanteClear command (#1528) 2025-03-08 21:16:30 +08:00
teleivo
dec794ac85 fix: contradictory lazy.nvim installation spec (#1506)
setting `lazy = false` means the plugin is loaded eagerly, while `event =
"VeryLazy"` suggests it should be lazy-loaded

https://lazy.folke.io/spec/lazy_loading

> Plugins will be lazy-loaded when one of the following is true:
>
> * The plugin only exists as a dependency in your spec
> * It has an event, cmd, ft or keys key
> * config.defaults.lazy == true

see example

6c3bda4aca/lua/lazy/example.lua (L58-L60)
2025-03-06 18:33:52 +08:00
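The fix described above can be sketched as a lazy.nvim spec that no longer contradicts itself: with an `event` key present, the conflicting `lazy = false` is dropped so the plugin is actually lazy-loaded. The spec below is a minimal illustration; the `opts` table is a placeholder, not the plugin's real configuration.

```lua
-- sketch: a self-consistent lazy.nvim spec (opts is a placeholder)
{
  "yetone/avante.nvim",
  event = "VeryLazy", -- implies lazy loading; no conflicting `lazy = false`
  opts = {},
},
```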
Vinicius Zenha
cad42ac00f Update README.md (#1497) 2025-03-05 22:25:35 +08:00
nzlov
7919fe010b docs: add custom tool (#1487)
* docs: add custom tool

* fix: add func params
2025-03-05 19:24:57 +08:00
yetone
ce3c47b6ec fix: allow reset rag service chromadb (#1491) 2025-03-05 17:01:00 +08:00
Francesco Tassi
232c9a635c feat: improve avante-rag-service container execution (#1448)
* Refactor Docker mount to mount only user home

Mounting the whole filesystem exposes the user to security risks,
considering the container is running as root.

This mounts only the user home directory in the container, to mitigate
the security risks. The user home directory is mounted in read-only mode
to further reduce the risk of accidental or malicious modifications.

Mounting the whole home directory should still allow the user to have multiple
neovim instances running at the same time and sharing the same rag_service.

Also, the container is started with the --rm flag to remove it after it stops.

* RAG mount point is not configurable

* Remove useless filter.lua file

* Use Path to join paths

This is safer than just concatenating strings.
2025-03-05 16:18:52 +08:00
Limbo Peng
ab63b52ffb feat: add Brave Search as web search engine provider (#1481)
* feat: add Brave Search as web search engine provider

* docs: update README
2025-03-04 23:47:04 +08:00
nzlov
de7cccd089 feat: add support for ollama RAG providers (#1427)
* fix: openai env

* feat: add support for multiple RAG providers

- Added provider, model and endpoint configuration options for RAG service

- Updated RAG service to support both OpenAI and Ollama providers

- Added Ollama embedding support and dependencies

- Improved environment variable handling for RAG service configuration

Signed-off-by: wfhtqp@gmail.com <wfhtqp@gmail.com>

* fix: update docker env

* feat: rag server add ollama llm

* fix: pre-commit

* feat: check embed model and clean

* docs: add rag server config docs

* fix: pyright ignore

---------

Signed-off-by: wfhtqp@gmail.com <wfhtqp@gmail.com>
2025-03-04 11:07:40 +08:00
Omar Crespo
6bbf9b3c42 doc: document disabled_tools option (#1471) 2025-03-03 15:01:58 +08:00
Omar Crespo
7d28e9b233 docs: add select model binding to readme (#1443) 2025-03-01 17:44:43 +08:00
yetone
7c9ee0760a docs: add mcp in todos (#1444) 2025-03-01 17:43:07 +08:00
Ben Burgess
86feaf3e38 docs: add sidebar toggle keymap (#1439) 2025-03-01 12:48:06 +08:00
yetone
2b3a41e811 feat: implement a more flexible custom prompts solution (#1390) 2025-02-25 16:08:16 +08:00
yetone
c2188e1afd chores: add python ci badge (#1367) 2025-02-23 23:17:47 +08:00