Machine supercharges your GitHub Workflows with seamless GPU acceleration. Say goodbye to the tedious overhead of managing GPU runners and hello to streamlined efficiency. With Machine, developers and organizations can effortlessly scale their AI and machine learning projects, shifting focus from infrastructure headaches to innovation and speed.
This repository provides a complete, automated workflow for GPU-accelerated hyperparameter tuning of a ResNet model using GitHub Actions powered by Machine.dev. It systematically explores combinations of key training parameters—such as learning rates and batch sizes—to identify the optimal configuration for training a ResNet model on the CIFAR-10 dataset.
- ⚡ GPU Acceleration: Efficiently train and evaluate models using GPUs via Machine
- 🧪 Systematic Tuning: Automate the exploration of multiple hyperparameter combinations
- 📊 Performance Comparison: Automatically aggregate and compare model metrics
- 📤 Artifact Storage: Conveniently store and retrieve metrics and comparison results using GitHub Actions artifacts
- 📈 Easy Interpretation: Identify the best-performing configuration quickly with structured CSV outputs
```
├── .github/workflows/
│   └── resnet-hyperparameter-tuning.yaml  # Workflow for hyperparameter tuning
├── train.py                               # Script to train and evaluate ResNet
├── compare_metrics.py                     # Script to aggregate and compare metrics
└── requirements.txt                       # Python dependencies
```
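To illustrate the aggregation step, here is a minimal sketch of what `compare_metrics.py` might do: collect the per-run metrics CSVs and rank configurations by test accuracy. The glob pattern, the `test_accuracy` column name, and the output layout are assumptions for illustration; the actual script in this repository may differ.

```python
# Hypothetical sketch of metrics aggregation: read every per-run metrics CSV,
# rank configurations by accuracy (best first), and write a single comparison
# file. File pattern and column names are assumptions, not the repo's exact API.
import csv
import glob


def load_metric_rows(pattern="metrics-*.csv"):
    """Read every per-run metrics file into a list of dict rows."""
    rows = []
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            rows.extend(csv.DictReader(f))
    return rows


def rank_by_accuracy(rows):
    """Sort rows so the best-performing configuration comes first."""
    return sorted(rows, key=lambda r: float(r["test_accuracy"]), reverse=True)


def write_comparison(rows, out_path="model_comparison.csv"):
    """Write the ranked rows to a single comparison CSV."""
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    ranked = rank_by_accuracy(load_metric_rows())
    if ranked:
        write_comparison(ranked)
```

Because each row carries its hyperparameters alongside its score, the first row of the output CSV directly identifies the best-performing configuration.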
Click the Use this template button at the top of this page to create your own copy.
Ensure your repository uses Machine GPU-powered runners. No additional configuration is required if you're already using Machine.dev.
- Trigger the workflow manually in GitHub Actions (`workflow_dispatch`)
- The workflow automatically runs training jobs for each combination of hyperparameters defined in the matrix:

  ```yaml
  matrix:
    learning_rate: [0.001, 0.0005]
    batch_size: [32, 64]
  ```

- Training progress and metrics are logged during each workflow execution
- Performance metrics for each hyperparameter set are saved as artifacts (`metrics-{learning_rate}-{batch_size}`)
- A final comparison of all training results is saved as `model_comparison.csv` in the artifact `comparison-results`
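Each matrix job hands one hyperparameter combination to the training script, which reports its results in a metrics file named after that combination. The sketch below shows how that hand-off might look; the actual flag names and metric fields used by `train.py` may differ.

```python
# Hypothetical sketch of the hyperparameter hand-off in train.py: the workflow
# passes one matrix combination as CLI flags, and the script writes its metrics
# to a CSV named after that combination. Flag and column names are assumptions.
import argparse
import csv


def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Train ResNet on CIFAR-10")
    parser.add_argument("--learning_rate", type=float, default=0.001)
    parser.add_argument("--batch_size", type=int, default=32)
    return parser.parse_args(argv)


def save_metrics(learning_rate, batch_size, test_accuracy):
    """Write one metrics row; the filename mirrors the artifact naming scheme."""
    path = f"metrics-{learning_rate}-{batch_size}.csv"
    with open(path, "w", newline="") as f:
        fields = ["learning_rate", "batch_size", "test_accuracy"]
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerow({"learning_rate": learning_rate,
                         "batch_size": batch_size,
                         "test_accuracy": test_accuracy})
    return path
```

With the 2×2 matrix above, GitHub Actions launches four such jobs in parallel, one per `(learning_rate, batch_size)` pair, each producing its own metrics artifact.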
- GitHub account
- Access to Machine GPU-powered runners
No local installation necessary—all processes run directly within GitHub Actions.
This repository is available under the MIT License.
- This hyperparameter tuning template specifically targets ResNet models trained on the CIFAR-10 dataset, but it can easily be adapted for other models, datasets, and tasks with minimal modifications.
- This repository is currently open for use as a template. While public forks are encouraged, we are not accepting Pull Requests at this time.
For questions or concerns, please open an issue.
