Optimization of learning algorithms in multilayer perceptron for sheet resistance of reduced graphene oxide thin-film
Abstract
Multilayer perceptron (MLP) optimization is carried out to investigate the classifier's performance in discriminating the uniformity of reduced graphene oxide (rGO) thin-film sheet resistance. This study used three learning algorithms: resilient backpropagation (RP), scaled conjugate gradient (SCG) and Levenberg-Marquardt (LM). The dataset used in this study is the sheet resistance of rGO thin films obtained from MIMOS Bhd. This work involved the selection of samples with uniform and non-uniform rGO thin-film sheet resistance. The input and output data underwent pre-processing: data normalization, data randomization and data splitting. The data were divided into three groups, training, validation and testing, with a ratio of 70%:15%:15%, respectively. The learning algorithms in the MLP were optimized by varying the number of hidden neurons from 1 to 10. Their behavior helped establish the best learning algorithm for discriminating rGO sheet resistance uniformity with the MLP. The performance measures were the accuracy on the training, validation and testing datasets, the mean squared error (MSE) and the number of epochs. All the analytical work in this study was carried out automatically in MATLAB version R2018a. It was found that LM is dominant in the optimization of the learning algorithm in the MLP for rGO sheet resistance: the MSE for LM is the lowest, compared with SCG and RP.
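As a concrete illustration of the workflow described above, the MATLAB sketch below (not the authors' script) sweeps the hidden-neuron count from 1 to 10 for each of the three learning algorithms with a 70/15/15 split, and reports accuracy, MSE and epochs. The variables X (features x samples) and T (1 x samples, 1 = uniform, 0 = non-uniform) are hypothetical placeholders for the MIMOS Bhd. rGO dataset, and the feedforwardnet configuration with a single binary output is an assumption, not the paper's exact network setup.

% Minimal sketch, assuming X and T hold the rGO sheet-resistance data.
rng(1);                                    % reproducible data randomization
X = mapminmax(X);                          % normalize inputs to [-1, 1]
idx = randperm(size(X, 2));                % shuffle sample order
X = X(:, idx);
T = T(:, idx);

trainFcns = {'trainrp', 'trainscg', 'trainlm'};   % RP, SCG, LM
for f = 1:numel(trainFcns)
    for h = 1:10                                  % 1 to 10 hidden neurons
        net = feedforwardnet(h, trainFcns{f});    % single-hidden-layer MLP
        net.divideParam.trainRatio = 0.70;        % 70% training
        net.divideParam.valRatio   = 0.15;        % 15% validation
        net.divideParam.testRatio  = 0.15;        % 15% testing
        net.trainParam.showWindow  = false;

        [net, tr] = train(net, X, T);             % train and record epochs

        Y   = net(X);
        err = perform(net, T, Y);                 % MSE (default performance)
        acc = 100 * mean(round(Y) == T);          % overall accuracy (%)
        fprintf('%-9s h=%2d  MSE=%.4f  acc=%.1f%%  epochs=%d\n', ...
                trainFcns{f}, h, err, acc, tr.num_epochs);
    end
end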
Keywords
Image classification; LM; MLP; Reduced graphene oxide; RP; SCG; Sheet resistance
DOI: http://doi.org/10.11591/ijeecs.v23.i2.pp686-693
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760