BERT BiLSTM-Attention Similarity Model
Funding Sponsor
American University in Cairo
Second Author's Department
Computer Science & Engineering Department
Document Type
Research Article
Publication Title
2021 IEEE International Conference on Artificial Intelligence and Computer Applications, ICAICA 2021
Publication Date
6-28-2021
DOI
10.1109/ICAICA52286.2021.9498209
Abstract
Semantic similarity models are a core component of many natural language processing (NLP) applications encountered daily, which makes them an important research topic. Question answering systems, in particular, rely heavily on semantic similarity models. This paper proposes a new architecture, the BERT BiLSTM-Attention Similarity Model, that improves the accuracy of measuring the similarity between questions. The model uses BERT as an embedding layer to convert each question into its contextual embeddings, and a BiLSTM with attention for feature extraction, giving more weight to the important parts of the embeddings. The semantic similarity score is computed as the reciprocal of the exponential of the Manhattan distance between the two question representations. The model achieves an accuracy of 84.45% in determining whether two questions from the Quora duplicate-questions dataset are similar.
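The scoring function described in the abstract amounts to similarity(q1, q2) = 1 / exp(||h1 - h2||_1) = exp(-||h1 - h2||_1), where h1 and h2 are the attention-weighted BiLSTM encodings of the two questions, so the score falls in (0, 1]. The following is a minimal PyTorch sketch of such a pipeline, not the authors' implementation; the checkpoint name "bert-base-uncased", the LSTM hidden size, and the additive attention pooling are illustrative assumptions.

# Illustrative sketch of the described architecture (assumptions noted above).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertBiLSTMAttentionSimilarity(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=128):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)   # embedding layer
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * lstm_hidden, 1)          # additive attention scores

    def encode(self, input_ids, attention_mask):
        # Contextual token embeddings from BERT.
        emb = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state
        # BiLSTM feature extraction over the token sequence.
        feats, _ = self.bilstm(emb)
        # Attention gives more weight to informative tokens; padded
        # positions are masked out before the softmax.
        scores = self.attn(feats).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        return (weights * feats).sum(dim=1)                # weighted question vector

    def forward(self, ids_a, mask_a, ids_b, mask_b):
        h_a = self.encode(ids_a, mask_a)
        h_b = self.encode(ids_b, mask_b)
        # Similarity = exp(-Manhattan distance), bounded in (0, 1].
        l1 = torch.sum(torch.abs(h_a - h_b), dim=-1)
        return torch.exp(-l1)

A pair of questions tokenized with the matching AutoTokenizer can then be scored with model(a["input_ids"], a["attention_mask"], b["input_ids"], b["attention_mask"]); thresholding the score (e.g., at 0.5) yields a duplicate/non-duplicate decision, which is one plausible way to evaluate accuracy on the Quora dataset.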
First Page
366
Last Page
371
Recommended Citation
APA Citation
Aboutaleb, A., Fayed, A., Ismail, D., Gaballah, N., ... (2021). BERT BiLSTM-Attention Similarity Model. 2021 IEEE International Conference on Artificial Intelligence and Computer Applications, ICAICA 2021, 366–371. https://doi.org/10.1109/ICAICA52286.2021.9498209
https://fount.aucegypt.edu/faculty_journal_articles/2397
MLA Citation
Aboutaleb, Ahmed, et al. "BERT BiLSTM-Attention Similarity Model." 2021 IEEE International Conference on Artificial Intelligence and Computer Applications, ICAICA 2021, 2021, pp. 366–371.
https://fount.aucegypt.edu/faculty_journal_articles/2397