Feature Maps ft. Patch Size #475
-
Alright, let's get this discussion started! There are a few concepts I've gone along with without fully understanding the reasoning behind them, and I'm looking to get them clarified. Hopefully this can help other people in the same boat as me, and we can help each other.

Fig. 1 from "An attempt at beating the 3D U-Net" (my notes in green and red):

My question, which I've answered below with what I've discovered, is: Why are there 30 feature maps at the beginning of the architecture? (I'm seeking corrections if I'm mistaken.)
Is my understanding correct?

Fabian: let me know if this is not the right place for this kind of post; you're the boss at the end of the day. Thanks for everything again.
-
Hi,
This is where it comes from:

nnUNet/nnunet/training/network_training/nnUNetTrainerV2.py, line 154 at commit 251ab3d

self.base_num_features is what you are looking for. This number is saved in the plans file, see here:

nnUNet/nnunet/training/network_training/nnUNetTrainer.py, line 363 at commit 251ab3d

It was set to 30 in the initial nnU-Net (which did not support mixed precision yet) and was later changed to 32 when mixed precision support was added. 30 and 32 are very similar; there was no segmentation performance difference between them. 30/32 was chosen as a trade-off between segmentation performance and memory consumption, with smaller numbers resulting in faster training but potentially (not necessarily!) reduced segmentation accuracy.

Best,
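For anyone who wants to check this on their own data, here is a minimal sketch of reading `base_num_features` out of the plans file. The path below is hypothetical; substitute your own preprocessed task folder and plans identifier.

```python
import pickle

# Hypothetical path: adjust to your own preprocessed task folder / plans identifier.
plans_path = "nnUNet_preprocessed/Task001_BrainTumour/nnUNetPlansv2.1_plans_3D.pkl"

with open(plans_path, "rb") as f:
    plans = pickle.load(f)

# Number of feature maps in the first stage of the U-Net
# (30 in the initial nnU-Net, 32 after mixed precision support was added).
print(plans["base_num_features"])
```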
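To connect this back to the figure: below is a small sketch (not nnU-Net's actual code, just the scheme it follows) of how the first-stage count propagates through the encoder, assuming the number of feature maps doubles with each downsampling and is capped for 3D networks (320 in Generic_UNet).

```python
base_num_features = 32   # 30 in the initial nnU-Net, 32 after mixed precision support
max_num_features = 320   # cap used for 3D networks (Generic_UNet.MAX_NUM_FILTERS_3D)
num_pool_ops = 5         # number of downsampling steps; depends on the patch size

# Feature maps double with each downsampling, clipped at the cap.
features_per_stage = [
    min(base_num_features * 2 ** i, max_num_features)
    for i in range(num_pool_ops + 1)
]
print(features_per_stage)  # [32, 64, 128, 256, 320, 320]
```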