Utilizing the Transformer Architecture for Question Answering
dc.contributor.advisor | Hoque Prince, Enamul | |
dc.contributor.advisor | Huang, Xiangji "Jimmy" | |
dc.contributor.author | Laskar, Md Tahmid Rahman | |
dc.date.accessioned | 2021-03-08T17:24:08Z | |
dc.date.available | 2021-03-08T17:24:08Z | |
dc.date.copyright | 2020-11 | |
dc.date.issued | 2021-03-08 | |
dc.date.updated | 2021-03-08T17:24:08Z | |
dc.degree.discipline | Computer Science | |
dc.degree.level | Master's | |
dc.degree.name | MSc - Master of Science | |
dc.description.abstract | The Question Answering (QA) task aims at building systems that can automatically answer a question or query about the given document(s). In this thesis, we utilize the transformer, a state-of-the-art neural architecture, to study two QA problems: answer sentence selection and answer summary generation. For answer sentence selection, we present two new approaches that rank a list of candidate answers for a given question by utilizing different contextualized embeddings with the transformer encoder. For answer summary generation, we study the query-focused abstractive text summarization task, which generates a natural-language summary from the source document(s) for a given query. For this task, we utilize the transformer to address the lack of large training datasets in single-document scenarios and the absence of labeled training data in multi-document scenarios. Based on extensive experiments, we observe that our proposed approaches obtain impressive results across several benchmark QA datasets. (An illustrative sketch of the candidate-ranking idea appears after this record.) | |
dc.identifier.uri | http://hdl.handle.net/10315/38195 | |
dc.language | en | |
dc.rights | Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests. | |
dc.subject | Information technology | |
dc.subject.keywords | Question Answering | |
dc.subject.keywords | Text Summarization | |
dc.subject.keywords | Query Focused Abstractive Summarization | |
dc.subject.keywords | Deep Learning | |
dc.subject.keywords | Machine Learning | |
dc.subject.keywords | Transformer | |
dc.subject.keywords | BERT | |
dc.title | Utilizing the Transformer Architecture for Question Answering | |
dc.type | Electronic Thesis or Dissertation | |
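The abstract describes ranking candidate answer sentences for a given question by combining contextualized embeddings with the transformer encoder. The sketch below is only a rough illustration of that general idea, not the thesis's actual method: it scores each candidate by cosine similarity between mean-pooled contextualized embeddings of the question and the candidate. The model name ("bert-base-uncased"), the mean-pooling strategy, and the cosine scoring are all assumptions chosen for illustration; it also assumes the Hugging Face transformers and PyTorch packages are installed.

# Illustrative sketch only (not the thesis's exact approach): rank candidate
# answer sentences for a question by cosine similarity of contextualized
# sentence embeddings produced by a pre-trained transformer encoder.
# Assumptions: Hugging Face `transformers` + `torch` installed; model name,
# mean pooling, and cosine scoring are hypothetical choices.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the encoder's last hidden states into a single vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden)
    return hidden.mean(dim=1).squeeze(0)            # (hidden,)

def rank_candidates(question: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Score each candidate answer sentence against the question and sort."""
    q_vec = embed(question)
    scored = [
        (c, torch.cosine_similarity(q_vec, embed(c), dim=0).item())
        for c in candidates
    ]
    return sorted(scored, key=lambda x: x[1], reverse=True)

if __name__ == "__main__":
    question = "When was the transformer architecture introduced?"
    candidates = [
        "The transformer was introduced in 2017 in 'Attention Is All You Need'.",
        "Recurrent networks process tokens sequentially.",
        "BERT is a bidirectional encoder built on the transformer.",
    ]
    for sentence, score in rank_candidates(question, candidates):
        print(f"{score:.3f}  {sentence}")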
Files
Original bundle
- Name: Laskar_MdTahmidRahman_2020_Masters.pdf
- Size: 1.73 MB
- Format: Adobe Portable Document Format