fix unet bottleneck dim off by 1 error#29

Open
PatrickRMiles wants to merge 2 commits into LBANN:main from PatrickRMiles:miles30/bottleneck_dim_offby1

Conversation


@PatrickRMiles PatrickRMiles commented Mar 17, 2026

This PR adjusts how we calculate the number of "layers" in the unet from problem_scale and unet_bottleneck_dim. With this change, the bottleneck spatial dimensions are now pow(2, unet_bottleneck_dim) as intended.

Context: We noticed recently that unet_bottleneck_dim did not map to the spatial dimensions of the bottleneck layers as we expected: for a dim of 3, we expected bottleneck spatial dimensions of size 8, but we observed 4. This comes down to how we build the unet (or equivalently, how we define "layers"). We pass a layers arg into the Unet construction. I think we have been interpreting this to mean the number of "levels" in the unet, but with how we construct the unet, it's actually "levels" - 1 (or equivalently, the number of 2x2 downsampling steps we apply).
We construct the unet as follows:

  1. Start with one DoubleConv (input level)
  2. Add a MaxPool3d+DoubleConv block layers - 1 times
  3. Add a final MaxPool3d+DoubleConv (bottleneck level)
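The three construction steps above can be traced with a small helper that just tracks the spatial dimension (a sketch for illustration; bottleneck_spatial_dim is not a function in the actual code):

```python
def bottleneck_spatial_dim(input_dim: int, layers: int) -> int:
    """Trace the spatial size through the construction described above."""
    dim = input_dim        # step 1: input-level DoubleConv, no pooling
    for _ in range(layers - 1):
        dim //= 2          # step 2: each MaxPool3d+DoubleConv block halves the dims
    dim //= 2              # step 3: final MaxPool3d+DoubleConv (bottleneck)
    return dim
```

So a `layers` argument of L yields L + 1 levels but L halvings: the input level contributes no pooling, while the other L levels each contribute one MaxPool3d.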

In ScaFFold, we calculate layers as layers = problem_scale - bottleneck_dim + 1. At scale 6 and bottleneck dim 3, we have 6 - 3 + 1 == 4. So in the unet, we have 1 + (layers - 1) + 1 == 1 + (4 - 1) + 1 == 5 "levels", four of which include 2x2 downsampling from the MaxPool3d. This means we take input of size 64 and apply 2x2 downsampling four times: 64 -> 32 -> 16 -> 8 -> 4. That's why we see bottleneck spatial dims of size 4 instead of 8. This holds regardless of problem_scale: at scale 7, we have input size 128 but five downsampling steps, so we still arrive at bottleneck spatial dim size 4.
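The arithmetic can be checked directly. Since the input size is 2**problem_scale and each of the `layers` downsampling steps halves it, the bottleneck size is 2**(problem_scale - layers); hitting 2**bottleneck_dim therefore requires dropping the + 1 from the formula (the function names below are illustrative, and "fixed" reflects my reading of the change, not the literal diff):

```python
def unet_layers_old(problem_scale: int, bottleneck_dim: int) -> int:
    # Previous ScaFFold formula, one downsampling step too many.
    return problem_scale - bottleneck_dim + 1

def unet_layers_fixed(problem_scale: int, bottleneck_dim: int) -> int:
    # Without the + 1, the number of 2x2 downsampling steps leaves
    # exactly 2**bottleneck_dim spatial dims at the bottleneck.
    return problem_scale - bottleneck_dim

def bottleneck_size(problem_scale: int, layers: int) -> int:
    # Input spatial size is 2**problem_scale; each layer halves it.
    return 2 ** (problem_scale - layers)
```

With the old formula, both (scale 6, dim 3) and (scale 7, dim 3) land on a bottleneck of size 4; with the corrected count, both give 8 == 2**3 as intended.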

@michaelmckinsey1 michaelmckinsey1 left a comment

