This document should help you make sense of the codebase and provide guidance on working with it and testing it locally.
Note that since Bors is a GitHub app, it has a relatively non-trivial first-time setup that is required to test it on live repositories. However, you don't need to do that, as we also have a comprehensive integration test suite, which should be good enough for testing most changes.
Directory structure:
- `migrations` - sqlx migrations that are the source of truth for the database schema.
- `src/bors` - Bors commands and their handlers.
- `src/database` - Database access layer built on top of `sqlx`.
- `src/github` - Communication with the GitHub API and definitions of GitHub webhook messages.
- `src/server` - Axum server that hosts the queue page and a few other endpoints.
The following diagram shows a simplified view of the important state entities of Bors. `bors_process` handles events generated by webhooks. It uses shared global state through `BorsContext`, which holds a shared connection to the database and a command parser. It also has access to a map of repository states. Each repository state contains an API client for that repository, its loaded config, and permissions loaded from the Team API.
```mermaid
---
title: Important entities
---
flowchart
    BorsContext
    repo_client_1["GithubRepositoryClient"]
    repo_client_2["GithubRepositoryClient"]
    repo_state_1["RepositoryState"]
    repo_state_2["RepositoryState"]
    BorsContext --> repo_state_1 --> repo_client_1 --> repo1
    BorsContext --> repo_state_2 --> repo_client_2 --> repo2
    repo_state_1 --> cr1["Config (repo 1)"]
    repo_state_1 --> pr1["Permissions (repo 1)"]
    repo_state_2 --> cr2["Config (repo 2)"]
    repo_state_2 --> pr2["Permissions (repo 2)"]
    BorsContext --> db[(Database)]
    bors_process --> BorsContext
```
Bors requires an actual running Postgres database for running its test suite, and even for compilation, because it uses sqlx with compile-time checked queries.
If you want to build bors without access to a running Postgres DB, add the `SQLX_OFFLINE=1` environment variable to an `.env` file in the root of the project.
The database can be set up with the docker-compose file in the root of the repository:

```
$ docker-compose up -d
```

Then, set the `DATABASE_URL` environment variable to the connection string of the database.
The connection string for the database started by the docker-compose file in the repository can be found in the `.env.example` file.
If an `.env` file is present, environment variables listed in it will be picked up automatically by sqlx.
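For illustration, such an `.env` file could look like this (the connection string matches the docker-compose setup described above; adjust it if your setup differs):

```
DATABASE_URL=postgres://bors:bors@localhost:5432/bors
# Uncomment to build without a running Postgres DB:
# SQLX_OFFLINE=1
```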
```
$ export DATABASE_URL=postgres://bors:bors@localhost:5432/bors
```

You must have `sqlx-cli` installed for the following commands to work:

```
$ cargo install sqlx-cli@0.7.4 --no-default-features --features native-tls,postgres
```

To apply migrations to your Postgres DB, you can execute the `cargo sqlx migrate run` command. To delete the whole database and recreate it from scratch (including applying migrations), run `cargo sqlx database reset`.
To run tests, simply run `cargo test` while the Postgres database is running and the `DATABASE_URL` environment variable is set correctly.
By default, logs are disabled in tests. To enable them, add the `#[traced_test]` attribute on top of the test function.
> [!CAUTION]
> When adding a new `NOT NULL` column, always specify the `DEFAULT` value that will be backfilled
> during the migration! Otherwise, the migration might break the deployed bors service.
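As an illustration, a migration adding a `NOT NULL` column to a hypothetical `pull_request` table might look like this (the table and column names are made up, not part of the real schema):

```sql
-- Hypothetical example: the DEFAULT clause backfills existing rows,
-- so the migration also succeeds on a non-empty production database.
ALTER TABLE pull_request
ADD COLUMN priority BIGINT NOT NULL DEFAULT 0;
```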
1. Generate a new migration:

   ```
   $ cargo sqlx migrate add <new-migration>
   ```

2. Change the migration manually in `migrations/<timestamp>-<new-migration>.sql`.
3. Apply migrations to the Postgres DB:

   ```
   $ cargo sqlx migrate run
   ```

4. Add a test data file to `tests/data/migrations/<timestamp>-<new-migration>.sql`.
   - The file should contain SQL that inserts some reasonable data into a test database after the migration is applied. The goal is to have a test database with production-like data, so that we can check that applying migrations does not produce errors on a non-empty database.
   - If it doesn't make sense to add any data for the migration (e.g. if the migration only adds an index), put `-- Empty to satisfy migration tests` into the file.
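As a sketch, a test data file for a migration that adds a hypothetical `priority` column to a hypothetical `pull_request` table could look like this (the names here are examples only; use the names from the real schema):

```sql
-- Hypothetical test data exercising the newly added column.
INSERT INTO pull_request (id, priority)
VALUES (1, 10);
```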
Before you make a commit that changes SQL queries in the bors codebase, you should regenerate the stored sqlx metadata files in the `.sqlx` directory:

```
$ rm -rf .sqlx
$ cargo sqlx prepare -- --all-targets
$ git add .sqlx
```

Make sure to remove the `.sqlx` directory before running the `prepare` command, to ensure that leftover queries do not remain committed in the repository.
Bors has a cargo test suite that you can run locally, but sometimes nothing beats an actual test on live GitHub repositories. The bot has a staging deployment at the https://github.com/rust-lang/bors-kindergarten repository, where you can try it however you want.
Nevertheless, sometimes it might be easier to test it on your own repository. The process is a bit involved, but it can still be done if needed.
1. Create your own GitHub app.
   - Configure its webhook secret and private key and write them down.
   - Give it permissions for `Actions` (r/w), `Checks` (r/w), `Contents` (r/w), `Issues` (r/w) and `Pull requests` (r/w).
   - Subscribe it to the webhook events `Issue comment`, `Push`, `Pull request`, `Pull request review`, `Pull request review comment` and `Workflow run`.
2. Install your GitHub app on some test repository where you want to test bors.
3. Add `rust-bors.toml` in the root of the repository, and also add some example CI workflows.
4. If you want to use custom permissions for PR approvals, create team data files for GitHub users in `data/team`. You can find examples in that directory, which you should copy and remove the `.example` suffix.
   - Get your GitHub user ID from `https://api.github.com/users/<your_github_user_name>`.
   - Edit both the `bors.review.json` and `bors.try.json` files to include your GitHub ID: `{ "github_ids": [123] }`
5. Start the Postgres database.
6. Run bors locally, and configure environment variables and/or command-line parameters for it:
   - Set `APP_ID` to the ID of the created GitHub app.
   - Set `WEBHOOK_SECRET` to the webhook secret of the app.
   - Set `PRIVATE_KEY` to the private key of the app.
   - (optional) Set `WEB_URL` to the public URL of the website of the app.
   - (optional) Set `CMD_PREFIX` to the command prefix used to control the bot (e.g. `@bors`).
   - (optional) Set `PERMISSIONS` to the path of the `data/permissions` directory to list users with permissions to perform try/review.
7. Redirect webhooks from your test repository to bors. You can use `gh webhook` for that.
   - If you want to do it manually, you need to configure a globally reachable URL/IP address for your computer, e.g. using ngrok, and then configure the webhook URL of your GitHub app to point to `<your-pc-address>/github`.
8. Try `@bors ping` on some PR on your test repository :)
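The environment variables from step 6 can be collected in a small shell snippet before starting the bot. All values below are placeholders, not real credentials:

```shell
# Hypothetical example values -- substitute the real ones from your GitHub app.
export APP_ID=123456                       # ID of the created GitHub app
export WEBHOOK_SECRET=change-me            # webhook secret of the app
export PRIVATE_KEY="paste-the-app-private-key-here"
# Optional settings:
export CMD_PREFIX=@bors                    # command prefix used to control the bot
export PERMISSIONS=data/permissions        # directory with try/review permissions
```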
For testing the merge queue, there's a `scripts/seed.py` script that can automatically create multiple PRs on a given repository and approve them with the `@bors r+` command.

```
$ pip install PyGithub
```

The script requires a GitHub personal access token with the following permissions:

- Contents (read/write)
- Pull requests (read/write)
- Issues (write)

```
$ python scripts/seed.py --repo owner/repo-name --token $GITHUB_TOKEN --count 5
```

The script:

- Creates multiple branches with simple changes (new markdown files)
- Opens pull requests for each branch
- Posts `@bors r+` comments to approve each PR
When modifying commands, make sure to update the `@bors help` command in `src/bors/handlers/help.rs`.