Description
The current tutorial_image_segmentation.ipynb has a high barrier to entry and fails silently in specific local environments.
1. Massive Data Overhead: Requiring a 10 GB download just to verify that the pipeline works is a major blocker. It often triggers Google Drive download limits and discourages quick testing.
2. Silent Inference Failure: On certain Intel-based hardware (MacBook Pro 2016), the model runs without errors but produces an all-zero prediction mask. I verified that the weights load correctly and that the input values are in the expected 0-255 range, so neither explains the failure:
- Prediction: all zeros
- Weights: OK
- Input values: OK
3. Environment Incompatibility: The notebook fails in Google Colab due to a NumPy version mismatch: Colab ships NumPy 2.2.x or later, while perceptionmetrics requires 1.26.2.
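For problem 2, a minimal diagnostic sketch that flags the most common cause of a silent all-zero mask, an unnormalized uint8 input (the helper name `check_inference_inputs` is an assumption, not existing tutorial code):

```python
import numpy as np

def check_inference_inputs(image: np.ndarray, mask: np.ndarray) -> list[str]:
    """Flag likely causes of a silent all-zero prediction (hypothetical helper)."""
    issues = []
    # A uint8 image in 0-255 fed to a model trained on floats in [0, 1]
    # can saturate activations and yield an empty mask without any error.
    if image.dtype == np.uint8 or float(image.max()) > 1.0:
        issues.append("input not normalized to [0, 1]")
    if not mask.any():
        issues.append("prediction mask is all zeros")
    return issues
```

Running this right before and after `model.predict` would turn the silent failure into an actionable message.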
SOLUTIONS:
I propose updating the tutorial with the following improvements:
1. "Quick Start" Mode (Sample Dataset)
Introduce a v_small dataset option: a curated set of 5-10 images instead of the full 10 GB. Users can then verify their environment and the inference logic in under 60 seconds.
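A sketch of how the v_small subset could be drawn from the full image list (the helper name, `k`, and `seed` are assumptions, not existing tutorial code):

```python
import random

def make_quick_start_split(image_ids: list[str], k: int = 8, seed: int = 0) -> list[str]:
    """Pick a small, reproducible subset so the pipeline can be verified
    without downloading the full 10 GB dataset (hypothetical helper)."""
    rng = random.Random(seed)  # fixed seed -> same subset on every run
    return sorted(rng.sample(image_ids, min(k, len(image_ids))))
```

Because the seed is fixed, every user verifies against the same handful of images, which also makes bug reports comparable.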
2. Robust Environment Support (Colab & NumPy)
I am still working out the right version combination: Colab's move to NumPy 2.x breaks compatibility with the current pin of 1.26.2.
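Until the pin itself can be relaxed, a guard in the notebook's first cell could at least fail loudly instead of silently. This is only a sketch; the actual fix (e.g. reinstalling numpy==1.26.2 and restarting the Colab runtime) still needs verification:

```python
import numpy as np

def numpy_needs_downgrade(installed: str, pin: str = "1.26.2") -> bool:
    """True if the installed NumPy is newer than the pinned release line.
    Colab ships NumPy 2.x; perceptionmetrics currently pins 1.26.2."""
    to_tuple = lambda v: tuple(int(p) for p in v.split(".")[:2])
    return to_tuple(installed) > to_tuple(pin)

if numpy_needs_downgrade(np.__version__):
    # In Colab one would typically reinstall the pinned version and
    # restart the runtime; shown here only as an explicit warning.
    print(f"NumPy {np.__version__} is newer than the 1.26.2 pin")
```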
I would also add instructions for mounting Google Drive so that users don't need to re-download the data in every session.
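For the persistent-mount instructions, a sketch assuming the standard Colab Drive mount point (`/content/drive`); the `MyDrive/segmentation_data` folder name is a placeholder, not a path the tutorial defines:

```python
from pathlib import Path

# In a Colab cell, mount Drive once per session:
#   from google.colab import drive
#   drive.mount('/content/drive')

def dataset_root(mount_point: str = "/content/drive",
                 subdir: str = "MyDrive/segmentation_data") -> Path:
    """Persistent dataset location on the mounted Drive (folder name is
    an assumption; adjust to wherever the data actually lives)."""
    return Path(mount_point) / subdir

def needs_download(mount_point: str = "/content/drive") -> bool:
    """Only fetch the dataset if it is not already present on Drive."""
    return not dataset_root(mount_point).exists()
```

With this, the expensive download runs once, and later sessions reuse the copy on Drive.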