Query Focused Abstractive Summarization Using BERTSUM Model
General material designation
[Thesis]
Name of primary author
Abdullah, Deen Mohammad
Names of other authors
Chali, Yllias
Publication, distribution, etc.
Name of publisher, distributor, etc.
University of Lethbridge (Canada)
Date of publication, distribution, etc.
2020
General note
Text of note
78 p.
Dissertation notes
Thesis details and type of degree
M.Sc.
Degree-granting institution
University of Lethbridge (Canada)
Year degree granted
2020
Summary or abstract notes
Text of note
In Natural Language Processing, researchers face many challenges in Query Focused Abstractive Summarization (QFAS), where Bidirectional Encoder Representations from Transformers for Summarization (BERTSUM) can be used for both extractive and abstractive summarization. As few datasets are available for QFAS, we have generated queries for two publicly available datasets, CNN/Daily Mail and Newsroom, according to the context of the documents and summaries. To generate abstractive summaries, we have applied two different approaches: Query Focused Abstractive summarization and Query Focused Extractive then Abstractive summarization. In the first approach, we sorted the sentences of the documents from the most query-related to the least query-related, and in the second approach, we extracted only the query-related sentences to fine-tune the BERTSUM model. Our experimental results show that both approaches achieve good ROUGE scores on the CNN/Daily Mail and Newsroom datasets.
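The query-relatedness ranking described in the abstract can be illustrated with a minimal sketch (not taken from the thesis; the TF-IDF cosine-similarity scoring and all names below are illustrative assumptions): sentences are ordered from most to least query-related before being passed to a summarizer such as BERTSUM, or filtered to keep only the top-scoring ones.

# Minimal illustrative sketch (assumed scoring, not the thesis implementation):
# rank document sentences by TF-IDF cosine similarity to a query, so they can
# be reordered (first approach) or filtered (second approach) before
# fine-tuning/summarization with a model such as BERTSUM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def rank_sentences_by_query(sentences, query):
    """Return sentences sorted from most to least query-related, with scores."""
    vectorizer = TfidfVectorizer()
    # Fit on the query plus all sentences so they share one vocabulary.
    matrix = vectorizer.fit_transform([query] + sentences)
    query_vec, sent_vecs = matrix[0], matrix[1:]
    scores = cosine_similarity(sent_vecs, query_vec).ravel()
    order = scores.argsort()[::-1]
    return [sentences[i] for i in order], scores[order]


if __name__ == "__main__":
    doc = [
        "The model was fine-tuned on CNN/Daily Mail.",
        "The weather was pleasant that day.",
        "BERTSUM supports extractive and abstractive summarization.",
    ]
    ranked, scores = rank_sentences_by_query(doc, "BERTSUM abstractive summarization")
    for sentence, score in zip(ranked, scores):
        print(f"{score:.3f}  {sentence}")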
Uncontrolled subject terms
Subject term
Artificial intelligence
Subject term
Computer science
Subject term
Engineering
Personal name as main entry (primary intellectual responsibility)