Avoiding overfitting using unetLayers

7 views (last 30 days)
Hello!
I'm trying to get a unetLayers network to work on a binary segmentation problem. I'm training the network on patches of CT images, where the goal is to segment bone pixels from background pixels. Since bone has much higher intensity values than the background in CT images, this shouldn't be a difficult problem. However, the network starts overfitting after approximately 1-2 epochs every time I train.
I have approximately 23000 training patches, with a slight class imbalance between the two classes. The pixel counts are:
{'background'} : 1.6879e+08
{'bone' } : 2.2309e+07
What I've tried so far to avoid overfitting:
  • I tried adding augmentation, for example aug = imageDataAugmenter('RandRotation', [-20, 20], 'RandScale', [0.7 1.5]), as well as other types of augmentation.
  • I tried different values of 'L2Regularization' in trainingOptions.
  • I tried reducing 'EncoderDepth' to lower the complexity of the model (a sketch of how these settings fit together follows this list).
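For context, here is a simplified sketch of how these options fit together in my setup (the folder names, patch size, and label IDs below are placeholders, not my exact values):

imageSize = [128 128 1];              % assumed patch size
classes   = ["background" "bone"];
labelIDs  = [0 1];                    % assumed label values in the masks

imds    = imageDatastore('trainPatches');                         % CT patches
pxds    = pixelLabelDatastore('trainLabels', classes, labelIDs);  % ground truth
imdsVal = imageDatastore('valPatches');
pxdsVal = pixelLabelDatastore('valLabels', classes, labelIDs);

aug = imageDataAugmenter('RandRotation', [-20 20], 'RandScale', [0.7 1.5]);
trainData = pixelLabelImageDatastore(imds, pxds, 'DataAugmentation', aug);
valData   = pixelLabelImageDatastore(imdsVal, pxdsVal);

% Shallower encoder to reduce model capacity
lgraph = unetLayers(imageSize, numel(classes), 'EncoderDepth', 3);

options = trainingOptions('adam', ...
    'InitialLearnRate', 1e-4, ...
    'L2Regularization', 1e-3, ...      % example value, tuned in different runs
    'MaxEpochs', 10, ...
    'ValidationData', valData, ...     % validation loss is where the overfitting shows up
    'Shuffle', 'every-epoch', ...
    'Plots', 'training-progress');

net = trainNetwork(trainData, lgraph, options);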
What more can I try to remove the overfitting when I'm training my network on this data using the unetLayers-architecture?

Accepted Answer

Srivardhan Gadila 2020-3-14
The following are some suggestions; a code sketch of the first two follows the list:
  1. Since there is a slight class imbalance, try the following: Balance Classes Using Class Weighting
  2. Include dropoutLayer and batchNormalizationLayer in your architecture
  3. Refer to Deep Learning Tips and Tricks
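For example, suggestions 1 and 2 could look roughly like the sketch below. This is only an illustration: the class weights are computed from the pixel counts in your question, and the layer names 'Segmentation-Layer' and 'Final-ConvolutionLayer' are the defaults created by unetLayers, so verify them against lgraph.Layers in your release before using replaceLayer/connectLayers.

classes     = ["background" "bone"];
pixelCounts = [1.6879e+08 2.2309e+07];

% Inverse-frequency class weights, normalized so the mean weight is 1
freq         = pixelCounts / sum(pixelCounts);
classWeights = mean(freq) ./ freq;

lgraph = unetLayers([128 128 1], numel(classes), 'EncoderDepth', 3);

% Suggestion 1: replace the output layer with a class-weighted one
pxLayer = pixelClassificationLayer('Name', 'weighted-seg', ...
    'Classes', classes, 'ClassWeights', classWeights);
lgraph = replaceLayer(lgraph, 'Segmentation-Layer', pxLayer);

% Suggestion 2: splice a dropout layer in front of the final convolution.
% The source layer is looked up from the connection table so that only the
% name 'Final-ConvolutionLayer' is assumed.
conns = lgraph.Connections;
src   = conns.Source{strcmp(conns.Destination, 'Final-ConvolutionLayer')};

lgraph = addLayers(lgraph, dropoutLayer(0.5, 'Name', 'drop-final'));
lgraph = disconnectLayers(lgraph, src, 'Final-ConvolutionLayer');
lgraph = connectLayers(lgraph, src, 'drop-final');
lgraph = connectLayers(lgraph, 'drop-final', 'Final-ConvolutionLayer');

analyzeNetwork(lgraph)   % inspect the modified graph before training

A batchNormalizationLayer can be inserted the same way (addLayers, disconnectLayers, connectLayers), after whichever convolution layers you want to normalize.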

More Answers (0)
