PromptPipe.vim is your Vim-powered AI assistant. It automatically combines prompt templates with your code and documentation using the familiar copy-paste workflow developers rely on daily. It works with any AI platform and requires no API, keeping your workflow simple and your costs at zero.
PromptPipe works by following these steps:
- Scans ~/.promptpipe/prompts and ./.promptpipe/prompts directories for prompt templates (with tab completion for quick selection)
- Loads system-level context from ~/.promptpipe/PROMPTPIPE.md
- Merges with project-specific context from .promptpipe/PROMPTPIPE.md if available
- Gathers text from all your open Vim buffers
- Combines the context, template, and buffer content into an LLM prompt
- Copies the prompt to your clipboard
- (optional) Runs the prompt using, e.g., OpenAI's CLI tool and shows the output in a Vim buffer
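The assembly steps above can be sketched in shell. This is a rough illustration only, not the plugin's actual code; the function name `build_prompt` and the clipboard command are made up for the example, while the context file paths follow the docs above.

```shell
#!/bin/sh
# Rough sketch of PromptPipe's prompt assembly; names are illustrative.
build_prompt() {
  template_file="$1"; shift
  # System-level context, then project context, then the template,
  # then file contents standing in for open Vim buffers.
  for ctx in "$HOME/.promptpipe/PROMPTPIPE.md" ".promptpipe/PROMPTPIPE.md"; do
    [ -f "$ctx" ] && cat "$ctx"
  done
  cat "$template_file" "$@"
}

# Copy the assembled prompt to the clipboard (X11 example):
# build_prompt ~/.promptpipe/prompts/chat.yml src/main.c | xclip -selection clipboard
```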
- Clone the repository:
git clone https://gitlab.com/promptpipe/promptpipe.vim.git
cd promptpipe.vim
- Run the installation:
make install
Alternatively, install manually:
- Create the plugin directory if it doesn't exist:
mkdir -p ~/.vim/plugin
- Copy the plugin file:
cp promptpipe.vim ~/.vim/plugin/
- Create the templates directory:
mkdir -p ~/.promptpipe/prompts
The installation includes these sample prompts:
- Chat (~/.promptpipe/prompts/chat.yml):
description: "Chat"
template: |
  Chat with the user to help them solve their problem.
  Context:
- Implement Feature (~/.promptpipe/prompts/developer/implement-feature.yaml):
description: "Implement feature"
version: "2.0"
persona: "software-engineering/software-engineer"
variables:
  requirements:
    description: "Requirements"
    type: string
    required: true
    example: "Implement authentication using OAuth2"
template: |
  Implement the feature described in the requirements.
  Ask the user to specify the value for variables that are in the format {{variable name}}.
  Instructions:
  1. Review the requirements
  2. If requirements are not 100% clear, ask for clarification
  3. Review the code that is affected by the requirements
  4. Think through the solution step-by-step
  5. Code the solution
  6. Verify the code matches the requirements
  Requirements:
  {{requirements}}
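Placeholders like {{requirements}} can be filled with plain text substitution. The sed-based helper below is only an illustration of the idea, not the plugin's actual mechanism, and it assumes the value contains no characters special to sed (such as / or &).

```shell
#!/bin/sh
# Illustrative substitution for {{name}}-style placeholders.
fill_var() {
  name="$1"; value="$2"
  # Replace every occurrence of {{name}} on stdin with the given value.
  sed "s/{{$name}}/$value/g"
}

# Example:
printf 'Requirements: {{requirements}}\n' | fill_var requirements 'OAuth2 login'
# Requirements: OAuth2 login
```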
Add the following to your .vimrc to customize:
" PromptPipe Configuration
" Change the command name (default is 'Prompt')
let g:promptpipe_cmd = 'Prompt'
" Direct execution with OpenAI LLM (optional)
let g:promptpipe_pipecmd = 'llm'
" Direct execution with local LLM (optional)
let g:promptpipe_pipecmd = 'llm'
" Example mappings
" Use leader+p to run the Prompt command (configured automatically by the plugin)
" nnoremap <leader>p :<C-u>Prompt<space>
" Or define custom mappings for specific prompts:
" nnoremap <leader>pc :<C-u>Prompt chat<CR>
" nnoremap <leader>pd :<C-u>Prompt developer/implement-feature<CR>PromptPipe supports direct execution with local LLMs or any command-line tool that accepts input from standard input:
- Setup: Add the following to your .vimrc:
" Set the LLM command to use, e.g., https://llm.datasette.io/
let g:promptpipe_pipecmd = 'llm'
- How it works: When you run a prompt and promptpipe_pipecmd is set, the plugin will:
  - Save the prompt to .promptpipe/logs/timestamp.prompt
  - Execute the command with the prompt file as input
  - Open the result in a new buffer automatically
- Examples:
" Using llama.cpp
let g:promptpipe_pipecmd = 'llama -m ~/models/llama-7b.gguf -p'
" Using Ollama
let g:promptpipe_pipecmd = 'ollama run mistral'
" Using the OpenAI command-line tool
let g:promptpipe_pipecmd = 'openai api chat.completions.create -m gpt-3.5-turbo'
- View output: The model's response will automatically open in a new buffer with a unique name including the timestamp.
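Conceptually, the direct-execution step behaves like the sketch below. This is illustrative shell, not the plugin's code; the function name is made up, while the log path follows the docs above.

```shell
#!/bin/sh
# Save the prompt under .promptpipe/logs/ and feed it to the configured command.
pipe_prompt() {
  pipecmd="$1"
  mkdir -p .promptpipe/logs
  logfile=".promptpipe/logs/$(date +%Y%m%d%H%M%S).prompt"
  cat > "$logfile"        # prompt arrives on stdin and is logged first
  $pipecmd < "$logfile"   # the command reads the prompt from standard input
}

# Example: printf 'Say hi\n' | pipe_prompt 'llm'
```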
:Prompt [template_name]
Examples:
:Prompt chat
:Prompt developer/implement-feature
Type /Prompt template_name and leave insert mode to execute.
Press <leader>p to start the prompt command.
PromptPipe scans for YAML prompt templates in:
- ~/.promptpipe/prompts/ (global)
- .promptpipe/prompts/ (local project directory)
When executing a prompt:
- The plugin searches for the specified template in the above directories
- It extracts the template content using yq
- It combines the template with the content of all open buffers
- The result is copied to your clipboard, ready to be pasted into an AI assistant
- If promptpipe_pipecmd is set, it also pipes the prompt to the specified command and displays the output
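For instance, with mikefarah's yq (v4 syntax) the template body can be pulled out of a prompt file like this; the file path is illustrative:

```shell
# Extract just the template text from a prompt file (yq v4 syntax).
yq '.template' ~/.promptpipe/prompts/chat.yml
```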
- Vim 8.0+
- (optional) yq command-line tool for template support (https://github.com/mikefarah/yq)
- (optional) Command-line LLM tools for direct execution
We welcome contributions to PromptPipe.vim! Here's how you can help:
- Fork the repository:
git clone https://gitlab.com/promptpipe/promptpipe.vim.git
cd promptpipe.vim
- Create a new branch:
git checkout -b my-feature-branch
- Make your changes and test them thoroughly
- Commit your changes:
git commit -am "Add new feature: description"
- Push to your fork:
git push origin my-feature-branch
Create a merge request from your fork to the main repository
- Follow the existing code style
- Add tests for new features, if possible
- Update documentation as needed
- Keep changes focused on a single issue or feature
Please use the GitLab issue tracker to submit bug reports or feature requests. When reporting bugs, please include:
- Description of the issue
- Steps to reproduce
- Expected behavior
- Vim version and operating system
For professional support, custom development, or services related to PromptPipe, contact Aktagon Ltd.
This project is licensed under the MIT License - see the LICENSE file for details.
Active development - accepting contributions and feature requests.