[Enhancement/Bug] Streamline Tutorial Onboarding and Resolve Silent Inference Failures #383

@iocalangiu

Description

The current tutorial_image_segmentation.ipynb has a high barrier to entry and fails silently in some local environments.

1. Massive Data Overhead: Requiring a 10GB download just to verify that the pipeline works is a major blocker. It often triggers Google Drive download limits and discourages quick testing.

2. Silent Inference Failure: On certain Intel-based hardware (a 2016 MacBook Pro), the model runs without errors but produces an all-zero prediction mask. I checked that the weights load normally (they seem fine) and that the input values are in the expected 0–255 range.

[Screenshot: prediction mask is all zeros]
[Screenshot: weights are OK]
[Screenshot: input values are OK]
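The three checks above can be sketched as a small diagnostic helper. This is a minimal sketch assuming a NumPy-based pipeline; `check_inference`, its argument shapes, and the thresholds are hypothetical, not part of the tutorial:

```python
import numpy as np

def check_inference(image, pred_mask, weights):
    """Sanity-check the three failure points described above.

    Hypothetical helper: `image` is the raw input array, `pred_mask`
    the model output, `weights` a name -> array mapping of loaded
    parameters.
    """
    problems = []
    # 1. Weights should be finite and not all zeros (here they "seem fine").
    for name, w in weights.items():
        if not np.isfinite(w).all() or np.allclose(w, 0):
            problems.append(f"suspicious weight tensor: {name}")
    # 2. Input range: raw 0-255 values fed to a model that expects
    #    normalized [0, 1] floats can silently degrade predictions.
    if image.max() > 1.5:
        problems.append("input looks unnormalized (values above 1.0)")
    # 3. The reported failure mode: an all-zero prediction mask.
    if np.allclose(pred_mask, 0):
        problems.append("prediction mask is all zeros")
    return problems
```

Running this right after inference would turn the silent failure into an explicit message instead of an empty mask.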

3. Environment Incompatibility: The notebook fails in Google Colab due to a NumPy version mismatch: Colab ships with numpy >= 2.2.x, while perceptionmetrics requires 1.26.2.

SOLUTIONS:
I propose updating the tutorial with the following improvements:

1. "Quick Start" Mode (Sample Dataset)
Introduce a v_small dataset option: a curated set of 5–10 images instead of the full 10GB download. Users could verify their environment and the inference logic in under 60 seconds.
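A sketch of how such a v_small subset could be generated from the full dataset. The folder layout, the `*.png` glob, and the function name are assumptions, not the tutorial's actual structure:

```python
import random
import shutil
from pathlib import Path

def make_quick_start_subset(full_dir, out_dir, n=8, seed=0):
    """Copy a small, reproducible sample of images into a v_small folder.

    Hypothetical script: the real tutorial data layout may differ.
    """
    random.seed(seed)  # fixed seed so everyone gets the same sample
    images = sorted(Path(full_dir).glob("*.png"))
    sample = random.sample(images, min(n, len(images)))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for src in sample:
        shutil.copy2(src, out / src.name)
    return sample
```

The fixed seed keeps the subset reproducible, so everyone testing the tutorial sees the same images and masks.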
2. Robust Environment Support (Colab & NumPy)
I am still working out the right version combination here: Colab's update to NumPy 2.x breaks backward compatibility with the current 1.26.2 requirement.
I would also add instructions for mounting Google Drive so users don't need to re-download the data every session.
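A sketch of a Colab setup cell combining both ideas: pin NumPy to 1.26.2 before perceptionmetrics is imported, and mount Drive so the dataset persists between sessions. The `setup_colab` name and the `data_dir` path are assumptions, and a runtime restart is usually still needed after the downgrade:

```python
import subprocess
import sys

def setup_colab(data_dir="/content/drive/MyDrive/perceptionmetrics_data"):
    """Pin NumPy and mount Drive when running in Colab; no-op elsewhere.

    Hypothetical cell: `data_dir` is an assumed location, not the
    tutorial's actual layout.
    """
    try:
        import google.colab  # noqa: F401
    except ImportError:
        return False  # not running in Colab
    # Downgrade to the version perceptionmetrics requires; Colab usually
    # needs a runtime restart afterwards for the change to take effect.
    subprocess.run(
        [sys.executable, "-m", "pip", "install", "-q", "numpy==1.26.2"],
        check=True,
    )
    # Mount Google Drive so the 10GB dataset survives across sessions.
    from google.colab import drive
    drive.mount("/content/drive")
    return True
```

Guarding on the `google.colab` import keeps the same cell safe to run locally.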
