Hyperspectral image construction in different spectral bands of tea leaves for identifying the tea type using O-ConvNet-RF model
Abstract
Tea, a widely consumed beverage, is susceptible to being sold in adulterated or expired form by third-party vendors. Hyperspectral imaging across different wavelength bands has proven effective for precisely assessing the different types of tea and the financial gains associated with each. This study applies a deep learning methodology in conjunction with hyperspectral imaging to classify tea leaves efficiently. A novel approach is proposed in which a waveband convolutional neural network generates enhanced-resolution hyperspectral images of tea leaf samples. The resulting model, an optimized convolutional neural network combined with a random forest (O-ConvNet-RF), achieved high accuracy, recall, F1 score, and sensitivity, outperforming existing alternative methods. The tea leaf types, namely green, yellow, and black, were accurately identified using the random forest (RF) model in combination with the O-ConvNet-RF model. This tree-based classification method for tea leaf identification outperformed alternative machine learning models. Overall, this study presents an effective methodology for tea leaf classification, with potential implications for consumer processing and distributor profit analysis.
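The following is a minimal sketch of the kind of pipeline the abstract describes: a small convolutional network extracts features from selected waveband patches, and a random forest classifies those features into the three tea classes. The band count, patch size, layer widths, and class names here are illustrative assumptions, not the authors' published architecture.

import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

N_BANDS = 16      # assumed number of selected wavebands per patch
PATCH = 32        # assumed spatial patch size (pixels)
CLASSES = ["green", "yellow", "black"]

class WavebandFeatureNet(nn.Module):
    """Small CNN mapping a (bands, H, W) hyperspectral patch to a
    fixed-length feature vector for the downstream random forest."""
    def __init__(self, n_bands=N_BANDS, n_features=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(n_bands, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pooling
        )
        self.fc = nn.Linear(64, n_features)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

# Synthetic stand-in data: 90 patches, 30 per tea class.
X = torch.randn(90, N_BANDS, PATCH, PATCH)
y = np.repeat(np.arange(3), 30)

net = WavebandFeatureNet().eval()
with torch.no_grad():
    feats = net(X).numpy()             # CNN features, shape (90, 64)

rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(feats, y)                       # tree-based classifier on CNN features
print(CLASSES[rf.predict(feats[:1])[0]])

In practice, the CNN would first be trained (or fine-tuned) on labeled hyperspectral patches before its features are handed to the forest; the split into a learned feature extractor and a tree-based classifier is what the O-ConvNet-RF naming suggests.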
Keywords
Convolutional neural network; Feature extraction; Hyperspectral image; Image spectral bands; Optimal waveband; Random forest; Tea class identification
DOI: http://doi.org/10.11591/ijeecs.v35.i1.pp301-309
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).