
Development guide

This document should help you make sense of the codebase and provide guidance on working with it and testing it locally.

Note that since Bors is a GitHub app, it has a relatively non-trivial first-time setup that is required to test it on live repositories. However, you don't need to do that, as we also have a comprehensive integration test suite, which should be good enough for testing most changes.

Directory structure:

  • migrations
    • sqlx migrations that are the source of truth for the database schema.
  • src/bors
    • Bors commands and their handlers.
  • src/database
    • Database access layer built on top of sqlx.
  • src/github
    • Communication with the GitHub API and definitions of GitHub webhook messages.
  • src/server
    • Axum server that hosts the queue page and a few other endpoints.

Architecture diagram

The following diagram shows a simplified view of the important state entities of Bors. bors_process handles events generated by webhooks. It uses shared global state through BorsContext, which holds a shared connection to the database and a command parser. It also has access to a map of repository states. Each repository state contains an API client for that repository, its loaded config, and permissions loaded from the Team API.

```mermaid
---
title: Important entities
---
flowchart
    BorsContext
    repo_client_1["GithubRepositoryClient"]
    repo_client_2["GithubRepositoryClient"]

    repo_state_1["RepositoryState"]
    repo_state_2["RepositoryState"]

    BorsContext --> repo_state_1 --> repo_client_1 --> repo1
    BorsContext --> repo_state_2 --> repo_client_2 --> repo2

    repo_state_1 --> cr1["Config (repo 1)"]
    repo_state_1 --> pr1["Permissions (repo 1)"]

    repo_state_2 --> cr2["Config (repo 2)"]
    repo_state_2 --> pr2["Permissions (repo 2)"]

    BorsContext --> db[(Database)]

    bors_process --> BorsContext
```

Database

Bors requires an actual running Postgres database not only for running its test suite, but even for compiling, because it uses sqlx with compile-time checked queries.

Compiling without a database

If you want to build bors without access to a running Postgres DB, add the SQLX_OFFLINE=1 environment variable to an .env file in the root of the project.
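For example, a minimal .env file in the project root might contain just:

```ini
# .env
# Makes sqlx use the prepared query metadata in the .sqlx directory
# instead of checking queries against a live database at compile time.
SQLX_OFFLINE=1
```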

Setting up the database

The database can be set up with the docker-compose file in the root of the repository:

$ docker-compose up -d

Then, set the DATABASE_URL environment variable to the connection string of the database. The connection string for the database started by the docker compose file in the repository can be found in the .env.example file. If an .env file is present, environment variables listed in it will be picked up automatically by sqlx.

$ export DATABASE_URL=postgres://bors:bors@localhost:5432/bors

Applying migrations

You must have sqlx-cli installed for the following commands to work.

$ cargo install sqlx-cli@0.7.4 --no-default-features --features native-tls,postgres

To apply migrations to your Postgres DB, you can execute the cargo sqlx migrate run command. To delete the whole database and recreate it from scratch (including applying migrations), run cargo sqlx database reset.

Running tests

To run tests, simply run cargo test while the Postgres database is running and the DATABASE_URL environment variable is set correctly.

By default, logs are disabled in tests. To enable them, add the #[traced_test] attribute on top of the test function.

Updating the DB schema

Caution

When adding a new NOT NULL column, always specify the DEFAULT value that will be backfilled during the migration! Otherwise, the migration might break the deployed bors service.

  1. Generate a new migration
    $ cargo sqlx migrate add <new-migration>
  2. Change the migration manually in migrations/<timestamp>-<new-migration>.sql.
  3. Apply migrations to the Postgres DB.
    $ cargo sqlx migrate run
  4. Add a test data file to tests/data/migrations/<timestamp>-<new-migration>.sql.
  • The file should contain SQL that inserts some reasonable data into a test database after the migration is applied. The goal is to maintain a test database with production-like data, so that we can verify that applying migrations will not produce errors on a non-empty database.
  • If it doesn't make sense to add any data to the migration (e.g. if the migration only adds an index), put -- Empty to satisfy migration tests into the file.
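As a sketch of the caution above, a migration adding a NOT NULL column could look like this (the table and column names here are purely illustrative, not taken from the actual schema):

```sql
-- Adding a NOT NULL column to an existing, possibly non-empty table:
-- the DEFAULT ensures existing rows are backfilled during the migration,
-- so the deployed bors service keeps working while it rolls out.
ALTER TABLE pull_request ADD COLUMN priority BIGINT NOT NULL DEFAULT 0;
```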

Regenerate .sqlx directory

Before you make a commit that changes SQL queries in the bors codebase, you should regenerate the stored sqlx metadata files in the .sqlx directory:

```shell
rm -rf .sqlx
cargo sqlx prepare -- --all-targets
git add .sqlx
```

Make sure to remove the .sqlx directory before running the prepare command, to ensure that leftover queries do not remain committed in the repository.

How to test bors on live repositories

Bors has a cargo test suite that you can run locally, but sometimes nothing beats an actual test on live, GitHub repositories. The bot has a staging deployment at the https://github.com/rust-lang/bors-kindergarten repository, where you can try it however you want.

Nevertheless, sometimes it might be easier to test it on your own repository. The process is a bit involved, but it can still be done if needed.

One-time setup

  • Create your own GitHub app.
    • Configure its webhook secret and private key and write them down.
    • Give it permissions for Actions (r/w), Checks (r/w), Contents (r/w), Issues (r/w) and Pull requests (r/w).
    • Subscribe it to webhook events Issue comment, Push, Pull request, Pull request review, Pull request review comment and Workflow run.
  • Install your GitHub app on some test repository where you want to test bors.
  • Add rust-bors.toml in the root of the repository, and also add some example CI workflows.
  • If you want to use custom permissions for PR approvals, create team data files for GitHub users in data/team. You can find example files in that directory; copy them and remove the .example suffix.
    • Get your GitHub user ID from https://api.github.com/users/<your_github_user_name>
    • Edit both bors.review.json and bors.try.json files to include your GitHub ID: { "github_ids": [123] }
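Following the inline example above, a minimal bors.review.json (and likewise bors.try.json) could look like this, where 123 is a placeholder for your actual GitHub user ID:

```json
{
  "github_ids": [123]
}
```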

Every time you run bors

  1. Start the Postgres database
  2. Run bors locally, and configure environment variables and/or command-line parameters for it:
    • Set APP_ID to the ID of the created GitHub app.
    • Set WEBHOOK_SECRET to the webhook secret of the app.
    • Set PRIVATE_KEY to the private key of the app.
    • (optional) Set WEB_URL to the public URL of the website of the app.
    • (optional) Set CMD_PREFIX to the command prefix used to control the bot (e.g. @bors).
    • (optional) Set PERMISSIONS to the path of the "data/permissions" directory that lists users with permissions to perform try/review.
  3. Redirect webhooks from your test repository to bors. You can use gh webhook for that.
    • If you want to do it manually, you need to configure a globally reachable URL/IP address for your computer, e.g. using ngrok, and then configure the webhook URL of your GitHub app to point to <your-pc-address>/github.
  4. Try @bors ping on some PR on your test repository :)
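For convenience, the environment variables from step 2 can be exported up front. All values below are placeholders, not real credentials:

```shell
# Placeholder values; substitute your own GitHub app's data.
export APP_ID=123456
export WEBHOOK_SECRET=change-me
# Placeholder; paste your app's actual private key here.
export PRIVATE_KEY='-----BEGIN RSA PRIVATE KEY----- ...'
export CMD_PREFIX=@bors
```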

Seeding test repositories

For testing the merge queue, there is a scripts/seed.py script that can automatically create multiple PRs on a given repository and approve them with the @bors r+ command.

Prerequisites

$ pip install PyGithub

Usage

The script requires a GitHub personal access token with the following permissions:

  • Contents (read/write)
  • Pull requests (read/write)
  • Issues (write)
$ python scripts/seed.py --repo owner/repo-name --token $GITHUB_TOKEN --count 5

What it does

  1. Creates multiple branches with simple changes (new markdown files)
  2. Opens pull requests for each branch
  3. Posts @bors r+ comments to approve each PR

Updating commands

When modifying commands, make sure to update the @bors help command in src/bors/handlers/help.rs.