Pre-trained BERT Architecture Analysis for Indonesian Question Answer Model
Developing a question-answering system in Natural Language Processing (NLP) has become a major concern in the Indonesian language context. One of the main challenges is the limited availability of datasets, which can cause instability in system performance and makes it difficult for a question-answering model to understand and answer questions well. The proposed solution uses transfer learning with pre-trained models such as BERT. This research analyzes the performance of a BERT model adapted for question-answering tasks in Indonesian, fine-tuned on an Indonesian-language dataset prepared specifically for this purpose. The fine-tuning approach adjusts BERT's parameters according to the given training data, improving the model by minimizing the loss function. Evaluation of the trained model shows a best validation loss of 0.00057 after 150 epochs. In addition, an in-depth evaluation of question-text similarity shows that the BERT model can answer questions measurably, according to the knowledge present in the dataset.
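The abstract describes answering questions by measuring the similarity between an incoming question and the questions already present in the dataset. As a minimal sketch of that retrieval idea, the snippet below matches a query against known question-answer pairs by cosine similarity; it uses a simple bag-of-words representation as a stand-in for the BERT embeddings the paper uses, and the example pairs and the threshold value are illustrative assumptions, not taken from the paper's dataset.

```python
import math
import re
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts (placeholder for a BERT sentence embedding)."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer(query, qa_pairs, threshold=0.3):
    """Return the answer whose stored question is most similar to the query,
    or None if no stored question is similar enough."""
    qv = vectorize(query)
    best_answer, best_score = None, 0.0
    for question, ans in qa_pairs:
        score = cosine(qv, vectorize(question))
        if score > best_score:
            best_answer, best_score = ans, score
    return best_answer if best_score >= threshold else None

# Illustrative Indonesian QA pairs (hypothetical, not from the paper's dataset)
qa = [
    ("Apa ibu kota Indonesia?", "Jakarta"),
    ("Siapa presiden pertama Indonesia?", "Soekarno"),
]
```

With a contextual encoder such as a fine-tuned Indonesian BERT in place of `vectorize`, the same retrieval loop can match paraphrased questions rather than only shared words, which is the behavior the paper's similarity evaluation measures.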
Sudianto Sudianto (author)
2024
Article (Journal)
Electronic Resource
Unknown
Metadata by DOAJ is licensed under CC BY-SA 1.0