async vim with lua

26 Oct 2024 · vim · #ai

In The Pragmatic Programmer, Andrew Hunt and Dave Thomas set a modest challenge: learn a new language every year. This year I learned enough Lua to get by. It was not a dogmatic TODO-checklist ritual; I had a need, and Lua was the tool.

Ever since I ditched the IDE and started using vi for everything, there has been an obvious delay in the UI. The pile of plugins needed for Ruby and TypeScript support made for a bloated terminal, the same bloat that drove me away from VSCode, JetBrains and Visual Studio. I want to balance a powerful editor with responsiveness.

The motivation for learning Lua was to understand Neovim's asynchronous configuration and to control it through callbacks. Lua commands are executed through keymaps and can gather the visual selection or the buffer contents. Neovim has good docs on integrating the language into the editor.
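As a taste of what that looks like, here is a minimal sketch of my own (the <leader>cb binding and the print are illustrative, not from any plugin): a normal-mode keymap whose Lua callback gathers the buffer contents.

-- Minimal sketch: a keymap whose callback reads the current buffer.
-- <leader>cb is a hypothetical binding chosen for this example.
vim.keymap.set("n", "<leader>cb", function()
  -- 0 = current buffer; lines 0 to -1 = the whole buffer
  local lines = vim.api.nvim_buf_get_lines(0, 0, -1, false)
  print(("gathered %d lines from the buffer"):format(#lines))
end, { desc = "Gather buffer contents" })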

The CodeCompanion plugin setup with lazy.nvim is where learning Lua became important, since the suggested setup stops short of defining a custom config.

{
  "olimorris/codecompanion.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
    "hrsh7th/nvim-cmp", -- Optional: For using slash commands and variables in the chat buffer
    "nvim-telescope/telescope.nvim", -- Optional: For using slash commands
    { "stevearc/dressing.nvim", opts = {} }, -- Optional: Improves `vim.ui.select`
  },
  config = true
}

In Lua, the config line becomes a function, which oddly ends with end because Lua closes blocks with keywords rather than braces. The anonymous function is called by lazy.nvim to define config, which is just another key in the Lua table the spec returns, so the plugin can move to a separate file.

Note: config = true tells lazy.nvim to call the plugin's own setup() with the spec's default options, rather than running a custom config function.
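In other words, config = true is roughly shorthand for this:

-- What config = true expands to, roughly (per the lazy.nvim docs):
config = function(_, opts)
  require("codecompanion").setup(opts)
end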

plugins/codecompanion.lua

return {
  "olimorris/codecompanion.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
    "hrsh7th/nvim-cmp", -- Optional: For using slash commands and variables in the chat buffer
    "nvim-telescope/telescope.nvim", -- Optional: For using slash commands
    { "stevearc/dressing.nvim", opts = {} }, -- Optional: Improves `vim.ui.select`
  },
  config = function()
    require("codecompanion").setup({
      prompt_library = {
        ["Write Spec"] = {
          strategy = "inline",
          description = "Generate a spec from the buffer",
          opts = {
            -- expose the prompt as a slash command and allow keymapping for automation
            is_slash_cmd = true,
            short_name = "rspec",
          },
          prompts = {
            {
              role = "system",
              content = "Your role is writing an RSpec unit test for the Ruby on Rails application that does X.",
            },
            {
              role = "user",
              content = "I want a unit test written in RSpec for the code.",
            },
          },
        },
      },
    })
  end,
}
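With is_slash_cmd and short_name set, the prompt is available as /rspec in the chat buffer, and, if I read the docs right, it can also be driven from the command line. The visual-mode binding below is my own invention for automating that:

-- Hypothetical binding: feed the visual selection to the /rspec prompt.
vim.keymap.set("v", "<leader>ts", ":CodeCompanion /rspec<CR>",
  { desc = "Write an RSpec test for the selection" })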

The RECIPES.md goes into more detail about injecting the visual selection into a prompt. It is also easy to point the LLM at a local endpoint so that no public APIs are used; the adapters table below sits inside the same setup() call.

adapters = {
    ollama = function()
      return require("codecompanion.adapters").extend("openai_compatible", {
        env = {
          url = "http[s]://open_compatible_ai_url", -- optional: default value is ollama url http://127.0.0.1:11434
          api_key = "OpenAI_API_KEY", -- optional: if your endpoint is authenticated
          chat_url = "/v1/chat/completions", -- optional: default value, override if different
        },
      })
    end,
},
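To actually route requests through the local adapter, the strategies have to name it. A sketch of the shape, alongside adapters in the same setup() table:

-- Point the chat and inline strategies at the local adapter:
strategies = {
  chat = { adapter = "ollama" },
  inline = { adapter = "ollama" },
},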

I am most impressed by the OpenAI-compatible API; it lets other tools plug into the ecosystem. I can only imagine how integrated a setup could become if we let the AI build the prompts.

As for my primary concern, bloat: I have ended up with a lightweight, asynchronously configured editor that engages features on demand rather than eagerly. Enabling extensible AI with no public API dependencies turned out to be easy, but only through learning Lua.