
🚀 Codetyper.nvim

AI-powered coding partner for Neovim: write code faster with LLM assistance while staying in control of your logic.

License: MIT

Features

  • 📐 Split View: Work with your code and prompts side by side
  • 💬 Ask Panel: Chat interface for questions and explanations
  • 🤖 Agent Mode: Autonomous coding agent with tool use (read, edit, write, bash)
  • 🏷️ Tag-based Prompts: Use /@ and @/ tags to write natural language prompts
  • ✨ Transform Commands: Transform prompts inline without leaving your file
  • 🔌 Multiple LLM Providers: Claude, OpenAI, Gemini, Copilot, and Ollama (local)
  • 📋 Event-Driven Scheduler: Queue-based processing with optimistic execution
  • 🎯 Tree-sitter Scope Resolution: Smart context extraction for functions/methods
  • 🧠 Intent Detection: Understands complete, refactor, fix, add, document intents
  • 📊 Confidence Scoring: Automatic escalation from local to remote LLMs
  • 🛡️ Completion-Aware: Safe injection that doesn't fight with autocomplete
  • 📁 Auto-Index: Automatically create coder companion files on file open
  • 📜 Logs Panel: Real-time visibility into LLM requests and token usage
  • 🔒 Git Integration: Automatically adds .coder.* files to .gitignore
  • 🌳 Project Tree Logging: Maintains a tree.log tracking your project structure

📋 Requirements

  • Neovim >= 0.8.0
  • curl (for API calls)
  • One of: Claude API key, OpenAI API key, Gemini API key, GitHub Copilot, or Ollama running locally

📦 Installation

Using lazy.nvim

```lua
{
  "cargdev/codetyper.nvim",
  cmd = { "Coder", "CoderOpen", "CoderToggle", "CoderAgent" },
  keys = {
    { "<leader>co", "<cmd>Coder open<cr>", desc = "Coder: Open" },
    { "<leader>ct", "<cmd>Coder toggle<cr>", desc = "Coder: Toggle" },
    { "<leader>ca", "<cmd>CoderAgentToggle<cr>", desc = "Coder: Agent" },
  },
  config = function()
    require("codetyper").setup({
      llm = {
        provider = "claude", -- or "openai", "gemini", "copilot", "ollama"
      },
    })
  end,
}
```

Using packer.nvim

```lua
use {
  "cargdev/codetyper.nvim",
  config = function()
    require("codetyper").setup()
  end,
}
```

🚀 Quick Start

1. Open a file and start Coder:

```vim
:e src/utils.ts
:Coder open
```

2. Write a prompt in the coder file (left panel):

```
/@ Create a function to validate email addresses
using regex, return boolean @/
```

3. The LLM generates code and injects it into utils.ts (right panel)


⚙️ Configuration

```lua
require("codetyper").setup({
  -- LLM Provider Configuration
  llm = {
    provider = "claude", -- "claude", "openai", "gemini", "copilot", or "ollama"

    -- Claude (Anthropic) settings
    claude = {
      api_key = nil, -- Uses ANTHROPIC_API_KEY env var if nil
      model = "claude-sonnet-4-20250514",
    },

    -- OpenAI settings
    openai = {
      api_key = nil, -- Uses OPENAI_API_KEY env var if nil
      model = "gpt-4o",
      endpoint = nil, -- Custom endpoint (Azure, OpenRouter, etc.)
    },

    -- Google Gemini settings
    gemini = {
      api_key = nil, -- Uses GEMINI_API_KEY env var if nil
      model = "gemini-2.0-flash",
    },

    -- GitHub Copilot settings (uses copilot.lua/copilot.vim auth)
    copilot = {
      model = "gpt-4o",
    },

    -- Ollama (local) settings
    ollama = {
      host = "http://localhost:11434",
      model = "deepseek-coder:6.7b",
    },
  },

  -- Window Configuration
  window = {
    width = 25, -- Percentage of screen width (25 = 25%)
    position = "left",
    border = "rounded",
  },

  -- Prompt Tag Patterns
  patterns = {
    open_tag = "/@",
    close_tag = "@/",
    file_pattern = "*.coder.*",
  },

  -- Auto Features
  auto_gitignore = true, -- Automatically add coder files to .gitignore
  auto_open_ask = true, -- Auto-open Ask panel on startup
  auto_index = false, -- Auto-create coder companion files on file open

  -- Event-Driven Scheduler
  scheduler = {
    enabled = true, -- Enable event-driven prompt processing
    ollama_scout = true, -- Use Ollama for first attempt (fast local)
    escalation_threshold = 0.7, -- Below this confidence, escalate to remote
    max_concurrent = 2, -- Max parallel workers
    completion_delay_ms = 100, -- Delay injection after completion popup
  },
})
```
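The interaction between `ollama_scout` and `escalation_threshold` can be sketched roughly as follows. This is an illustrative assumption about the decision logic, not the plugin's actual code; `pick_provider` and its signature are hypothetical names invented for this sketch:

```lua
-- Hypothetical sketch of scout-then-escalate provider selection.
-- Not codetyper.nvim's real API; shows the intent of the config options only.
local function pick_provider(confidence, opts)
  if opts.ollama_scout and confidence >= opts.escalation_threshold then
    return "ollama" -- the local scout's answer scored well enough; keep it
  end
  return opts.remote -- low confidence: escalate to the configured remote LLM
end

-- With the defaults above (escalation_threshold = 0.7):
--   a 0.8-confidence local result is kept,
--   a 0.5-confidence result is re-run on the remote provider.
```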

Environment Variables

| Variable | Description |
|---|---|
| `ANTHROPIC_API_KEY` | Claude API key |
| `OPENAI_API_KEY` | OpenAI API key |
| `GEMINI_API_KEY` | Google Gemini API key |
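For example, in your shell profile (the values below are placeholders; substitute your real keys):

```shell
# Placeholder values -- replace with your actual API keys.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export OPENAI_API_KEY="sk-placeholder"
export GEMINI_API_KEY="placeholder"
```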

🔌 LLM Providers

Claude (Anthropic)

Best for complex reasoning and code generation.

```lua
llm = {
  provider = "claude",
  claude = { model = "claude-sonnet-4-20250514" },
}
```

OpenAI

Supports custom endpoints for Azure, OpenRouter, etc.

```lua
llm = {
  provider = "openai",
  openai = {
    model = "gpt-4o",
    endpoint = "https://api.openai.com/v1/chat/completions", -- optional
  },
}
```

Google Gemini

Fast and capable.

```lua
llm = {
  provider = "gemini",
  gemini = { model = "gemini-2.0-flash" },
}
```

GitHub Copilot

Uses your existing Copilot subscription (requires copilot.lua or copilot.vim).

```lua
llm = {
  provider = "copilot",
  copilot = { model = "gpt-4o" },
}
```

Ollama (Local)

Run models locally with no API costs.

```lua
llm = {
  provider = "ollama",
  ollama = {
    host = "http://localhost:11434",
    model = "deepseek-coder:6.7b",
  },
}
```

📝 Commands Reference

Main Commands

| Command | Description |
|---|---|
| `:Coder {subcommand}` | Main command with subcommands |
| `:CoderOpen` | Open the coder split view |
| `:CoderClose` | Close the coder split view |
| `:CoderToggle` | Toggle the coder split view |
| `:CoderProcess` | Process the last prompt |

Ask Panel

| Command | Description |
|---|---|
| `:CoderAsk` | Open the Ask panel |
| `:CoderAskToggle` | Toggle the Ask panel |
| `:CoderAskClear` | Clear chat history |

Agent Mode

| Command | Description |
|---|---|
| `:CoderAgent` | Open the Agent panel |
| `:CoderAgentToggle` | Toggle the Agent panel |
| `:CoderAgentStop` | Stop the running agent |

Transform Commands

| Command | Description |
|---|---|
| `:CoderTransform` | Transform all `/@ @/` tags in file |
| `:CoderTransformCursor` | Transform tag at cursor position |
| `:CoderTransformVisual` | Transform selected tags (visual mode) |

Utility Commands

| Command | Description |
|---|---|
| `:CoderIndex` | Open coder companion for current file |
| `:CoderLogs` | Toggle logs panel |
| `:CoderType` | Switch between Ask/Agent modes |
| `:CoderTree` | Refresh tree.log |
| `:CoderTreeView` | View tree.log |

📖 Usage Guide

Tag-Based Prompts

Write prompts in your coder file using /@ and @/ tags:

```
/@ Create a Button component with the following props:
- variant: 'primary' | 'secondary' | 'danger'
- size: 'sm' | 'md' | 'lg'
- disabled: boolean
Use Tailwind CSS for styling @/
```

When you close the tag with @/, the prompt is automatically processed.

Transform Commands

Transform prompts inline without the split view:

```
// In your source file:
/@ Add input validation for email and password @/

// Run :CoderTransformCursor to transform the prompt at cursor
```

Prompt Types

The plugin auto-detects prompt type:

| Keywords | Type | Behavior |
|---|---|---|
| `refactor`, `rewrite` | Refactor | Replaces code |
| `add`, `create`, `implement` | Add | Inserts new code |
| `document`, `comment` | Document | Adds documentation |
| `explain`, `what`, `how` | Explain | Shows explanation only |
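For instance, based on the keywords above, each of these prompts would be routed to a different behavior:

```
/@ refactor this loop to use map @/        -> Refactor: replaces the loop
/@ add a null check before the lookup @/   -> Add: inserts new code
/@ explain what this regex matches @/      -> Explain: shows an explanation only
```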

🤖 Agent Mode

The Agent mode provides an autonomous coding assistant with tool access:

Available Tools

  • read_file: Read file contents
  • edit_file: Edit files with find/replace
  • write_file: Create or overwrite files
  • bash: Execute shell commands
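To make the `edit_file` tool's find/replace semantics concrete, here is a minimal sketch in plain Lua. This is a hypothetical illustration of what such an edit could do; `apply_edit` is an invented name and the plugin's actual implementation may differ:

```lua
-- Hypothetical sketch of a find/replace edit, not codetyper.nvim's real code.
-- Replaces the first plain-text occurrence of `find` in `content`.
local function apply_edit(content, find, replace)
  local start_idx = content:find(find, 1, true) -- plain-text search, no patterns
  if not start_idx then
    return nil, "find text not present in file"
  end
  return content:sub(1, start_idx - 1) .. replace .. content:sub(start_idx + #find)
end
```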

Using Agent Mode

  1. Open the agent panel: :CoderAgent or <leader>ca
  2. Describe what you want to accomplish
  3. The agent will use tools to complete the task
  4. Review changes before they're applied

Agent Keymaps

| Key | Description |
|---|---|
| `<CR>` | Submit message |
| `Ctrl+c` | Stop agent execution |
| `q` | Close agent panel |

⌨️ Keymaps

Default Keymaps (auto-configured)

| Key | Mode | Description |
|---|---|---|
| `<leader>ctt` | Normal | Transform tag at cursor |
| `<leader>ctt` | Visual | Transform selected tags |
| `<leader>ctT` | Normal | Transform all tags in file |
| `<leader>ca` | Normal | Toggle Agent panel |
| `<leader>ci` | Normal | Open coder companion (index) |

Ask Panel Keymaps

| Key | Description |
|---|---|
| `@` | Attach/reference a file |
| `Ctrl+Enter` | Submit question |
| `Ctrl+n` | Start new chat |
| `Ctrl+f` | Add current file as context |
| `q` | Close panel |
| `Y` | Copy last response |

Suggested Additional Keymaps

```lua
local map = vim.keymap.set

map("n", "<leader>co", "<cmd>Coder open<cr>", { desc = "Coder: Open" })
map("n", "<leader>cc", "<cmd>Coder close<cr>", { desc = "Coder: Close" })
map("n", "<leader>ct", "<cmd>Coder toggle<cr>", { desc = "Coder: Toggle" })
map("n", "<leader>cp", "<cmd>Coder process<cr>", { desc = "Coder: Process" })
map("n", "<leader>cs", "<cmd>Coder status<cr>", { desc = "Coder: Status" })
```

🏥 Health Check

Verify your setup:

```vim
:checkhealth codetyper
```

This checks:

  • Neovim version
  • curl availability
  • LLM configuration
  • API key status
  • Telescope availability (optional)

📁 File Structure

```
your-project/
├── .coder/                    # Auto-created, gitignored
│   └── tree.log               # Project structure log
├── src/
│   ├── index.ts               # Your source file
│   └── index.coder.ts         # Coder file (gitignored)
└── .gitignore                 # Auto-updated with coder patterns
```

🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.


📄 License

MIT License - see LICENSE for details.


👨‍💻 Author

cargdev


Made with ❤️ for the Neovim community
