RAC: a reusable adaptive convolution for CNN layers

Nguyen Viet Hung, Phi Dinh Huynh, Pham Hong Thinh, Phuc Hau Nguyen, Trong-Minh Hoang

Abstract


This paper proposes reusable adaptive convolution (RAC), an efficient alternative to standard 3×3 convolutions in convolutional neural networks (CNNs). The main advantage of RAC lies in its simplicity and parameter efficiency, achieved by sharing horizontal and vertical 1×k/k×1 filter banks across blocks within a stage and recombining them through a lightweight 1×1 mixing layer. Because RAC operates at the operator design level, it avoids post-training compression steps and preserves the conventional Conv–BN–activation structure, enabling seamless integration into existing CNN backbones. To evaluate the effectiveness of the proposed method, extensive experiments are conducted on CIFAR-10 using several architectures, including ResNet-18/50/101, DenseNet, WideResNet, and EfficientNet. Experimental results demonstrate that RAC significantly reduces parameter count and memory usage while maintaining competitive accuracy. These results indicate that RAC offers a reasonable balance between accuracy and compression and is suitable for deploying CNNs on resource-constrained platforms.
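The abstract describes the operator only at a high level, so the following is a minimal PyTorch sketch of how a stage built from shared directional filter banks and per-block 1×1 mixing might be structured. The class names (SharedFilterBank, RACBlock), the concatenate-then-mix arrangement, and all hyperparameters are assumptions made for illustration; they are not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): one bank of horizontal (1xk)
# and vertical (kx1) filters is created per stage and reused by every block;
# each block adds only its own lightweight 1x1 mixing layer plus BN and
# activation, keeping the conventional Conv-BN-activation pattern.
import torch
import torch.nn as nn


class SharedFilterBank(nn.Module):
    """A 1xk and a kx1 convolution, shared across all blocks in a stage."""

    def __init__(self, channels: int, k: int = 3):
        super().__init__()
        self.horizontal = nn.Conv2d(channels, channels, kernel_size=(1, k),
                                    padding=(0, k // 2), bias=False)
        self.vertical = nn.Conv2d(channels, channels, kernel_size=(k, 1),
                                  padding=(k // 2, 0), bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate the two directional responses for the per-block mixer.
        return torch.cat([self.horizontal(x), self.vertical(x)], dim=1)


class RACBlock(nn.Module):
    """Stands in for a 3x3 convolution: shared bank + per-block 1x1 mixer."""

    def __init__(self, bank: SharedFilterBank, channels: int):
        super().__init__()
        self.bank = bank  # shared module, not duplicated per block
        self.mix = nn.Conv2d(2 * channels, channels, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.mix(self.bank(x))))


# Usage: all blocks in a stage reuse one bank, so only the 1x1 mixers,
# BN layers, and the single bank contribute parameters.
bank = SharedFilterBank(channels=64)
stage = nn.Sequential(*[RACBlock(bank, channels=64) for _ in range(3)])
out = stage(torch.randn(1, 64, 32, 32))
print(out.shape, sum(p.numel() for p in stage.parameters()))
```

Because `nn.Module.parameters()` deduplicates shared submodules, the parameter count above reflects the bank once per stage rather than once per block, which is where the compression in this kind of design would come from.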


Keywords


Convolutional neural networks; Filter sharing; Lightweight deployment; Memory efficiency; Model compression; Reusable adaptive convolution


DOI: http://doi.org/10.11591/ijeecs.v41.i2.pp753-763


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES).
