Enhancing the explainability of transformer-based abstractive summarization models


Date

2025

Abstract

Abstractive Summarization (AS) is a Natural Language Processing (NLP) task that generates a concise and coherent summary of a given document by rephrasing or paraphrasing its content. Unlike Extractive Summarization (ES), it captures the essential information rather than directly extracting sentences or phrases from the source text. AS is used in multiple mission-critical domains such as healthcare, law, and finance. Nevertheless, the existing state-of-the-art AS models are based on black-box deep learning architectures such as Transformers, and they cannot explain why specific facts were included in the summary while others were omitted. This research proposes a novel framework to explain which facts have been excluded from the summary by a given AS model and the rationale behind those selections. The new framework, Fact Omission Explanation (FOE), utilizes a feature attribution method to analyze the fact-selection process of a given AS model and generates a linguistic explanation of which facts have been excluded and the respective reasons. The proposed framework was assessed using the PubMed and arXiv datasets, which consist of long documents from the medical and scientific domains, and the PEGASUS and T5 models, which are state-of-the-art transformer-based AS models. A user study was conducted with the participation of medical professionals to assess the value the framework adds in practice. The results demonstrate that the generated explanations help ensure the trustworthiness of AS models in mission-critical domains such as healthcare.
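
As a rough illustration of the feature-attribution idea behind fact-omission analysis, the sketch below uses occlusion: each source sentence is removed in turn and the change in a summary score is measured. A near-zero drop suggests the sentence's facts were omitted from the summary. The summarizer score here is a stand-in word-overlap measure, not the PEGASUS/T5 models or the thesis's actual FOE method; all names and the scoring function are hypothetical.

```python
# Illustrative occlusion-based attribution for fact-omission analysis.
# The scoring function is a toy word-overlap stand-in, NOT the thesis's
# actual attribution method or a real transformer summarizer.

def overlap_score(summary_words, source_sentences):
    """Fraction of summary words that appear somewhere in the source."""
    source_words = {w.strip(".,;:").lower()
                    for s in source_sentences for w in s.split()}
    if not summary_words:
        return 0.0
    hits = sum(w.strip(".,;:").lower() in source_words for w in summary_words)
    return hits / len(summary_words)

def attribute_omissions(source_sentences, summary):
    """Occlude each source sentence in turn and measure the score drop.
    A near-zero drop suggests the sentence's facts were omitted from
    the summary; a large drop suggests they were included."""
    summary_words = summary.split()
    base = overlap_score(summary_words, source_sentences)
    results = []
    for i, sentence in enumerate(source_sentences):
        occluded = source_sentences[:i] + source_sentences[i + 1:]
        drop = base - overlap_score(summary_words, occluded)
        results.append((sentence, drop))
    return results

source = [
    "The trial enrolled 120 patients.",
    "Dosage was adjusted weekly.",
    "The drug reduced symptoms significantly.",
]
summary = "The drug reduced symptoms significantly in 120 patients."
for sentence, drop in attribute_omissions(source, summary):
    status = "included" if drop > 1e-9 else "likely omitted"
    print(f"{drop:+.3f}  {status}: {sentence}")
```

With this toy example, the dosage sentence produces no score drop, flagging it as the likely omitted fact, while the enrollment and result sentences produce positive drops.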

Citation

Panawenna, P.H. (2025). Enhancing the explainability of transformer-based abstractive summarization models [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. https://dl.lib.uom.lk/handle/123/25098
