A systematic evaluation of pre-trained encoder architectures for multimodal brain tumor segmentation using U-Net-based architectures
Abstract
Accurate brain tumor segmentation from medical imaging is critical for early diagnosis and effective treatment planning. Deep learning methods, particularly U-Net-based architectures, have demonstrated strong performance in this domain. However, prior studies have primarily focused on a limited set of encoder backbones, overlooking the potential advantages of alternative pretrained models. This study presents a systematic evaluation of twelve pretrained convolutional neural networks (ResNet34, ResNet50, ResNet101, VGG16, VGG19, DenseNet121, InceptionResNetV2, InceptionV3, MobileNetV2, EfficientNetB1, SE-ResNet34, and SE-ResNet18) used as encoder backbones in the U-Net framework for segmenting tumor-affected brain regions in the BraTS 2019 multimodal MRI dataset. Model performance was assessed through cross-validation, incorporating fault detection to enhance reliability. The MobileNetV2-based U-Net configuration outperformed all other architectures, achieving 99% cross-validation accuracy and 99.3% test accuracy. It also achieved a Jaccard coefficient of 83.45%, Dice coefficients of 90.3% (Whole Tumor), 86.07% (Tumor Core), and 81.93% (Enhancing Tumor), and a low test loss of 0.0282. These results demonstrate that MobileNetV2 is a highly effective encoder backbone for U-Net-based segmentation of tumor-affected brain regions in multimodal medical imaging data.
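The abstract does not specify the implementation framework, so the following is only a minimal sketch of the encoder-backbone idea: a U-Net built on an ImageNet-pretrained MobileNetV2 encoder, evaluated with the Dice and Jaccard overlap metrics reported above. The segmentation_models_pytorch library, the 4-channel input (one channel per BraTS MRI modality), the 3-class output (one per tumor sub-region), and the slice size are illustrative assumptions, not details taken from the paper.

```python
import torch
import segmentation_models_pytorch as smp

# U-Net with an ImageNet-pretrained MobileNetV2 encoder (assumed setup).
# in_channels=4 stacks the four BraTS modalities (T1, T1ce, T2, FLAIR) as
# input channels; classes=3 gives one output map per tumor sub-region
# (whole tumor, tumor core, enhancing tumor).
model = smp.Unet(
    encoder_name="mobilenet_v2",
    encoder_weights="imagenet",
    in_channels=4,
    classes=3,
)

def dice_coefficient(pred, target, eps=1e-6):
    """Dice overlap between a binarized prediction and a ground-truth mask."""
    pred = pred.float().flatten()
    target = target.float().flatten()
    intersection = (pred * target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def jaccard_coefficient(pred, target, eps=1e-6):
    """Jaccard index (IoU) between a binarized prediction and a ground-truth mask."""
    pred = pred.float().flatten()
    target = target.float().flatten()
    intersection = (pred * target).sum()
    union = pred.sum() + target.sum() - intersection
    return (intersection + eps) / (union + eps)

# Forward pass on a dummy batch of 2D multimodal slices (batch, 4, 224, 224);
# the slice size is illustrative and only needs to be divisible by 32.
x = torch.randn(2, 4, 224, 224)
with torch.no_grad():
    logits = model(x)                         # shape: (2, 3, 224, 224)
    pred_masks = torch.sigmoid(logits) > 0.5  # per-class binary masks

# Dummy ground-truth masks, just to show how the metrics are called.
target_masks = torch.rand(2, 3, 224, 224) > 0.5
print(float(dice_coefficient(pred_masks, target_masks)))
print(float(jaccard_coefficient(pred_masks, target_masks)))
```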
Keywords
Brain tumor cancer; Encoders; Medical imaging; Pre-trained; U-Net segmentation
Full Text:
Full Text: PDF
DOI: http://doi.org/10.11591/ijeecs.v40.i2.pp850-859
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES).