ETD SWORD Deposit
ETDs in this collection are being checked for completeness and are in the process of being transferred to their respective collections under the FGS ETD collection umbrella.
Browsing ETD SWORD Deposit by Subject "abstractive text summarization"
Item | Open Access
Comparative Analysis of Transformer-Based Language Models for Text Analysis in the Domain of Sustainable Development (2023-08-04)
Nabil Safwat; Marina G. Erechtchoukova

With advancements in Artificial Intelligence, Natural Language Processing (NLP) has gained considerable attention because of its potential to facilitate complex human-machine interactions, enhance language-based applications, and automate the processing of unstructured texts. The study investigates the transfer learning approach for Transformer-based language models, the abstractive text summarization approach, and their application to the domain of Sustainable Development, with the goal of determining the representation of SDGs in scientific publications using the text summarization technique. To achieve this, the traditional transfer learning framework was expanded so that: (1) the relevance of textual documents to a specified text can be evaluated, (2) neural language models, namely BART and T5, were selected, and (3) eight text similarity measures were investigated to identify the most informative ones. Both the BART and T5 models were fine-tuned on an acquired domain-specific corpus of scientific publications extracted from the Scopus (Elsevier) database. The relevance of recently published works to an SDG was determined by calculating semantic similarity scores between each model-generated summary and the SDG's description. The proposed framework made it possible to identify goals that dominated the developed corpus and those that require further attention from the research community.
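The scoring step described in the abstract (comparing each model-generated summary against an SDG description with a similarity measure) can be illustrated with a minimal sketch. This is not the thesis's actual pipeline: the study fine-tuned BART/T5 and evaluated eight similarity measures, whereas the sketch below uses a simple bag-of-words cosine similarity as a stand-in, and the summary and SDG description strings are hypothetical.

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity over bag-of-words counts (a stand-in for the
    semantic similarity measures evaluated in the study)."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical model-generated summary and SDG description fragments
summary = "expanding renewable energy access reduces emissions in developing regions"
sdg7 = "ensure access to affordable reliable sustainable and modern energy for all"
sdg13 = "take urgent action to combat climate change and its impacts"

# Score the summary against each goal; the highest-scoring SDG is taken
# as the goal the publication is most relevant to.
scores = {"SDG 7": cosine_similarity(summary, sdg7),
          "SDG 13": cosine_similarity(summary, sdg13)}
best_goal = max(scores, key=scores.get)
```

Ranking publications by such scores across a corpus is what lets the framework surface which goals dominate the literature and which are underrepresented.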