promptpipe.vim

PromptPipe.vim is your Vim-powered AI assistant. It combines prompt templates with your code and documentation using the familiar copy-paste workflow developers rely on every day. It works with any AI platform and requires no API key, keeping your workflow simple and your costs at zero.

Description

PromptPipe works by following these steps:

  1. Scans ~/.promptpipe/prompts and ./.promptpipe/prompts directories for prompt templates (with tab completion for quick selection)
  2. Loads system-level context from ~/.promptpipe/PROMPTPIPE.md
  3. Merges with project-specific context from .promptpipe/PROMPTPIPE.md if available
  4. Gathers text from all your open Vim buffers
  5. Combines the context, template, and buffer content into an LLM prompt
  6. Copies the prompt to your clipboard
  7. (optional) Runs the prompt using, e.g., OpenAI's CLI tool and shows the output in a Vim buffer
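The assembly steps above can be sketched as a small shell pipeline. This is only a sketch: the real plugin is Vimscript, and every file name below is a stand-in for the paths the plugin actually reads.

```shell
#!/bin/sh
# Sketch of steps 2-6: merge global context, project context, the template,
# and buffer text into one prompt. All paths here are stand-ins.
set -eu
dir=$(mktemp -d)
printf 'You are a helpful assistant.\n' > "$dir/global.md"   # ~/.promptpipe/PROMPTPIPE.md
printf 'This project uses Vimscript.\n' > "$dir/project.md"  # .promptpipe/PROMPTPIPE.md
printf 'Chat with the user.\n'          > "$dir/template.txt"
printf 'let g:example = 1\n'            > "$dir/buffer1.txt" # an open Vim buffer

# Steps 3-5: concatenate context, template, and buffer content.
cat "$dir/global.md" "$dir/project.md" "$dir/template.txt" "$dir/buffer1.txt" \
  > "$dir/prompt.txt"

# Step 6 would copy the result to the clipboard, e.g.:
#   pbcopy < "$dir/prompt.txt"        (macOS)
#   xclip -selection clipboard < "$dir/prompt.txt"   (Linux)
cat "$dir/prompt.txt"
```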

Installation

Using the Makefile

  1. Clone the repository:
git clone https://gitlab.com/promptpipe/promptpipe.vim.git
cd promptpipe.vim
  2. Run the installation:
make install

Manual Installation

  1. Create the plugin directory if it doesn't exist:
mkdir -p ~/.vim/plugin
  2. Copy the plugin file:
cp promptpipe.vim ~/.vim/plugin/
  3. Create the templates directory:
mkdir -p ~/.promptpipe/prompts

Sample Prompts

The installation includes these sample prompts:

  1. Chat (~/.promptpipe/prompts/chat.yml)

    description: "Chat"
    template: |
      Chat with the user to help them solve their problem.
    
      Context:
  2. Implement Feature (~/.promptpipe/prompts/developer/implement-feature.yaml)

    description: "Implement feature"
    version: "2.0"
    persona: "software-engineering/software-engineer"
    variables:
      requirements:
        description: "Requirements"
        type: string
        required: true
        example: "Implement authentication using OAuth2"
    template: |
      Implement the feature described in the requirements. Ask the user to specify
      the value for variables that are in the format {{variable name}}.
    
      Instructions:
      1. Review the requirements
      2. If requirements are not 100% clear, ask for clarification
      3. Review the code that is affected by the requirements
      4. Think through the solution step-by-step
      5. Code the solution
      6. Verify the code matches the requirements
      Requirements:
      {{requirements}}
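Per the template above, `{{variable name}}` placeholders are left for the model to ask the user about. Substituting one mechanically is a one-liner; this is a hypothetical sketch, not something the plugin does itself:

```shell
# Fill a {{requirements}} placeholder with sed (hypothetical; the template
# above instead instructs the model to request the value from the user).
req='Implement authentication using OAuth2'
printf 'Requirements:\n{{requirements}}\n' | sed "s/{{requirements}}/$req/"
# → Requirements:
# → Implement authentication using OAuth2
```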

Configuration

Add the following to your .vimrc to customize:

" PromptPipe Configuration
" Change the command name (default is 'Prompt')
let g:promptpipe_cmd = 'Prompt'

" Pipe prompts directly to an LLM command-line tool (optional),
" e.g., https://llm.datasette.io/
let g:promptpipe_pipecmd = 'llm'

" Example mappings
" Use leader+p to run the Prompt command (configured automatically by the plugin)
" nnoremap <leader>p :<C-u>Prompt<space>
" Or define custom mappings for specific prompts:
" nnoremap <leader>pc :<C-u>Prompt chat<CR>
" nnoremap <leader>pd :<C-u>Prompt developer/implement-feature<CR>

Executing Prompts with Local Models

PromptPipe supports direct execution with local LLMs or any command-line tool that accepts input from standard input:

  1. Setup: Add the following to your .vimrc:

    " Set the LLM command to use, e.g., https://llm.datasette.io/
    let g:promptpipe_pipecmd = 'llm'
  2. How it works:

    • When you run a prompt and promptpipe_pipecmd is set, the plugin will:
      • Save the prompt to .promptpipe/logs/timestamp.prompt
      • Execute the command with the prompt file as input
      • Open the result in a new buffer automatically
  3. Examples:

    " Using llama.cpp
    let g:promptpipe_pipecmd = 'llama -m ~/models/llama-7b.gguf -p'
    
    " Using Ollama
    let g:promptpipe_pipecmd = 'ollama run mistral'
    
    " Using OpenAI command-line tool
    let g:promptpipe_pipecmd = 'openai api chat.completions.create -m gpt-3.5-turbo'
  4. View output: The model's response will automatically open in a new buffer with a unique name including the timestamp.
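The save-pipe-display flow described above can be approximated in plain shell. Here `cat` stands in for the configured `g:promptpipe_pipecmd`, and a temporary directory stands in for `.promptpipe/logs/`:

```shell
#!/bin/sh
# Sketch of the pipe flow: save the prompt to a timestamped log file, feed it
# to the configured command on stdin, and capture the output for display.
set -eu
logs=$(mktemp -d)            # stands in for .promptpipe/logs/
ts=$(date +%Y%m%d-%H%M%S)
printf 'Explain this function.\n' > "$logs/$ts.prompt"

# 'cat' stands in for g:promptpipe_pipecmd ('llm', 'ollama run mistral', ...).
cat < "$logs/$ts.prompt" > "$logs/$ts.output"

# Vim would open this output file in a new, uniquely named buffer:
cat "$logs/$ts.output"
```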

Usage

Command Mode

:Prompt [template_name]

Examples:

  • :Prompt chat
  • :Prompt developer/implement-feature

Insert Mode

Type /Prompt template_name and leave insert mode to execute.

Keyboard Shortcut

Press <leader>p to start the prompt command.

How It Works

PromptPipe scans for YAML prompt templates in:

  • ~/.promptpipe/prompts/
  • .promptpipe/prompts/ (local project directory)

When executing a prompt:

  1. The plugin searches for the specified template in the above directories
  2. It extracts the template content using yq
  3. It combines the template with the content of all open buffers
  4. The result is copied to your clipboard, ready to be pasted into an AI assistant
  5. If promptpipe_pipecmd is set, it also pipes the prompt to the specified command and displays the output
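Step 2 can be reproduced outside Vim. The plugin itself calls yq; in this dependency-free sketch, awk approximates `yq '.template'` on a block-scalar field:

```shell
#!/bin/sh
# Extract the block-scalar 'template' value from a prompt file (step 2).
# awk approximates `yq '.template'` to keep the sketch dependency-free.
set -eu
dir=$(mktemp -d)
cat > "$dir/chat.yml" <<'EOF'
description: "Chat"
template: |
  Chat with the user to help them solve their problem.
EOF
# After the 'template:' key, print each indented line with its indent removed.
awk '/^template:/ {f=1; next} f && sub(/^  /, "")' "$dir/chat.yml" > "$dir/template.txt"
cat "$dir/template.txt"
```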

Requirements

  • Vim
  • yq, used to extract template content from the YAML prompt files
  • Optional: a command-line LLM tool (e.g., llm or ollama) for direct execution

Contributing

We welcome contributions to PromptPipe.vim! Here's how you can help:

  1. Fork the repository, then clone it:

    git clone https://gitlab.com/promptpipe/promptpipe.vim.git
    cd promptpipe.vim
    
  2. Create a new branch:

    git checkout -b my-feature-branch
    
  3. Make your changes and test them thoroughly

  4. Commit your changes:

    git commit -am "Add new feature: description"
    
  5. Push to your fork:

    git push origin my-feature-branch
    
  6. Create a merge request from your fork to the main repository

Guidelines

  • Follow the existing code style
  • Add tests for new features, if possible
  • Update documentation as needed
  • Keep changes focused on a single issue or feature

Bug Reports and Feature Requests

Please use the GitLab issue tracker to submit bug reports or feature requests. When reporting bugs, please include:

  • Description of the issue
  • Steps to reproduce
  • Expected behavior
  • Vim version and operating system

Commercial Support

For professional support, custom development, or services related to PromptPipe, contact Aktagon Ltd.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Project Status

Active development - accepting contributions and feature requests.
