A joint learning classification for intent detection and slot filling with domain-adapted embeddings
Abstract
For dialogue systems to function effectively, accurate natural language understanding is vital, relying on precise intent detection and slot filling to ensure smooth and meaningful interactions. Earlier studies largely addressed each subtask in isolation; however, the two subtasks are interdependent, and better results are obtained by solving them jointly. A key drawback of joint learning models is poor generalization to unseen data, which stems from the scarcity of large annotated corpora. Recent approaches have shown that pretrained embeddings provide effective text representations that help mitigate this generalization problem. However, pretrained embeddings are trained only on general-purpose corpora covering commonly discussed topics, which may lack the domain-specific vocabulary needed for the task at hand. To address this issue, this paper presents a joint model for intent detection and slot filling that combines pretrained embeddings with domain-specific embeddings via canonical correlation analysis to enhance model performance. The proposed model couples a convolutional neural network with a bidirectional long short-term memory (BiLSTM) network for efficient joint learning classification. Experimental results show that the proposed model outperforms the baseline models.
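The embedding-fusion step can be illustrated concretely. The sketch below shows one plausible way to align a generic pretrained embedding table with a domain-specific one over a shared vocabulary using canonical correlation analysis; the random stand-in matrices, scikit-learn's CCA, and the averaging of the two projected views are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of CCA-based embedding fusion. Assumes two embedding
# tables whose rows are aligned over the same vocabulary; shapes and
# the averaging step are assumptions for illustration only.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
vocab_size, dim_general, dim_domain, k = 1000, 300, 100, 64

# Stand-ins for real embedding tables (e.g., generic pretrained vectors
# vs. vectors trained on in-domain dialogue transcripts).
E_general = rng.normal(size=(vocab_size, dim_general))
E_domain = rng.normal(size=(vocab_size, dim_domain))

# CCA learns paired projections that maximize the correlation between
# the two views of the same vocabulary.
cca = CCA(n_components=k, max_iter=1000)
Z_general, Z_domain = cca.fit_transform(E_general, E_domain)

# One common fusion choice (an assumption here): average the projected
# views to get a single domain-adapted vector per word.
E_fused = (Z_general + Z_domain) / 2.0
print(E_fused.shape)  # (1000, 64)
```

The fused table can then replace the ordinary embedding lookup in the downstream model, so every token representation carries both general and in-domain signal.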
Keywords
Dialogue system; Intent detection; Joint learning; Natural language understanding; Slot filling
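For a concrete picture of the described architecture, the following PyTorch sketch wires a convolutional layer into a BiLSTM with two output heads: an utterance-level head for intent detection and a token-level head for slot filling. Layer sizes, the max-pooled intent head, and the shared-encoder layout are assumptions rather than the paper's reported configuration.

```python
# Minimal sketch of a joint CNN + BiLSTM classifier; hyperparameters
# and the exact head design are assumptions, not the paper's setup.
import torch
import torch.nn as nn

class JointCnnBiLstm(nn.Module):
    def __init__(self, vocab_size, emb_dim, n_intents, n_slots,
                 n_filters=128, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # CNN extracts local n-gram features over the token sequence.
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        # BiLSTM models longer-range context in both directions.
        self.lstm = nn.LSTM(n_filters, hidden, batch_first=True,
                            bidirectional=True)
        self.intent_head = nn.Linear(2 * hidden, n_intents)  # utterance level
        self.slot_head = nn.Linear(2 * hidden, n_slots)      # token level

    def forward(self, token_ids):
        x = self.emb(token_ids)                       # (B, T, E)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (B, F, T)
        h, _ = self.lstm(x.transpose(1, 2))           # (B, T, 2H)
        intent_logits = self.intent_head(h.max(dim=1).values)  # (B, n_intents)
        slot_logits = self.slot_head(h)               # (B, T, n_slots)
        return intent_logits, slot_logits

model = JointCnnBiLstm(vocab_size=1000, emb_dim=64, n_intents=7, n_slots=20)
intent_logits, slot_logits = model(torch.randint(0, 1000, (2, 12)))
print(intent_logits.shape, slot_logits.shape)
```

In a joint setup like this, the two heads are typically trained with a summed cross-entropy loss so the shared encoder learns both tasks at once.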
DOI: http://doi.org/10.11591/ijeecs.v37.i2.pp1306-1316
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).