Comparative Analysis of Transformer-Based Language Models for Text Analysis in the Domain of Sustainable Development
dc.contributor.advisor | Marina G Erechtchoukova | |
dc.contributor.author | Nabil Safwat | |
dc.date.accessioned | 2023-08-04T15:16:34Z | |
dc.date.available | 2023-08-04T15:16:34Z | |
dc.date.issued | 2023-08-04 | |
dc.date.updated | 2023-08-04T15:16:33Z | |
dc.degree.discipline | Information Systems and Technology | |
dc.degree.level | Master's | |
dc.degree.name | MA - Master of Arts | |
dc.description.abstract | With advances in Artificial Intelligence, Natural Language Processing (NLP) has attracted considerable attention for its potential to facilitate complex human-machine interactions, enhance language-based applications, and automate the processing of unstructured text. This study investigates transfer learning with Transformer-based language models and abstractive text summarization, and applies them to the domain of Sustainable Development with the goal of determining how the Sustainable Development Goals (SDGs) are represented in scientific publications. To achieve this, the traditional transfer learning framework was extended so that (1) the relevance of textual documents to a specified text can be evaluated, (2) two neural language models, BART and T5, were selected, and (3) eight text similarity measures were investigated to identify the most informative ones. Both the BART and T5 models were fine-tuned on a domain-specific corpus of scientific publications extracted from Elsevier's Scopus database. The relevance of recently published works to an SDG was determined by computing semantic similarity scores between each model-generated summary and the SDG's description. The proposed framework made it possible to identify the goals that dominate the developed corpus and those that require further attention from the research community. | |
dc.identifier.uri | https://hdl.handle.net/10315/41364 | |
dc.language | en | |
dc.rights | Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests. | |
dc.subject | Information science | |
dc.subject | Artificial intelligence | |
dc.subject | Environmental studies | |
dc.subject.keywords | transformer-based language models | |
dc.subject.keywords | transfer learning | |
dc.subject.keywords | semantic similarity | |
dc.subject.keywords | abstractive text summarization | |
dc.subject.keywords | sustainable development | |
dc.subject.keywords | document relevance | |
dc.title | Comparative Analysis of Transformer-Based Language Models for Text Analysis in the Domain of Sustainable Development | |
dc.type | Electronic Thesis or Dissertation |
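The abstract above describes a pipeline that generates an abstractive summary of a publication and then scores its semantic similarity against an SDG description. The following is a minimal sketch of that idea, not the thesis's actual implementation: the model checkpoints (facebook/bart-large-cnn, all-MiniLM-L6-v2), the example texts, and the choice of cosine similarity over sentence embeddings are illustrative assumptions; the thesis fine-tunes BART and T5 on a Scopus-derived corpus and compares eight similarity measures.

```python
# Hypothetical sketch: summarize a publication with a pre-trained BART
# checkpoint, then score the summary against an SDG description using
# cosine similarity over sentence embeddings. Models and texts are
# illustrative, not the thesis's exact configuration.
from transformers import pipeline
from sentence_transformers import SentenceTransformer, util

# Abstractive summarizer (the thesis fine-tunes BART/T5 on a domain corpus;
# a generic pre-trained checkpoint is used here for illustration).
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# Sentence-embedding model for one possible semantic similarity measure.
embedder = SentenceTransformer("all-MiniLM-L6-v2")

# Abbreviated SDG 6 description (illustrative).
sdg_description = (
    "Ensure availability and sustainable management of water and "
    "sanitation for all."
)

# Placeholder publication text standing in for a Scopus abstract.
publication_text = (
    "This paper studies low-cost filtration membranes for rural drinking "
    "water treatment, reports pilot deployments in three communities, and "
    "evaluates long-term maintenance costs and water-quality outcomes."
)

# 1) Generate an abstractive summary of the publication.
summary = summarizer(
    publication_text, max_length=40, min_length=10, do_sample=False
)[0]["summary_text"]

# 2) Embed the summary and the SDG description, then compute cosine similarity.
emb = embedder.encode([summary, sdg_description], convert_to_tensor=True)
score = util.cos_sim(emb[0], emb[1]).item()

print(f"Summary: {summary}")
print(f"Relevance score vs. SDG description: {score:.3f}")
```

Under this sketch, publications would be ranked per SDG by their similarity scores, which is how relative SDG representation in a corpus could be assessed.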
Files
Original bundle
- Name: Safwat_Nabil_2023_Masters.pdf
- Size: 4.83 MB
- Format: Adobe Portable Document Format