Adapter-based fine-tuning for PRIMERA

dc.contributor.author: Hewapathirana, K
dc.contributor.author: De Silva, N
dc.contributor.author: Athuraliya, CD
dc.contributor.editor: Gunawardena, S
dc.date.accessioned: 2025-11-19T04:30:53Z
dc.date.issued: 2025
dc.description.abstract: Multi-document summarisation (MDS) involves generating concise summaries from clusters of related documents. PRIMERA (Pyramid-based Masked Sentence Pre-training for Multi-document Summarisation) is a pre-trained model designed specifically for MDS, using the LED architecture to handle long input sequences effectively [1–4]. Despite its capabilities, fine-tuning PRIMERA for specific tasks remains resource-intensive. To mitigate this, we explore the integration of adapter modules, small trainable components inserted within transformer layers that allow the model to adapt to new tasks by updating only a fraction of its parameters, thereby reducing computational requirements [5–8].
dc.identifier.conference: Applied Data Science & Artificial Intelligence (ADScAI) Symposium 2025
dc.identifier.department: Department of Computer Science & Engineering
dc.identifier.doi: https://doi.org/10.31705/ADScAI.2025.57
dc.identifier.email: kushan.22@cse.mrt.ac.lk
dc.identifier.email: nisansa@cse.mrt.ac.lk
dc.identifier.email: cd@conscient.ai
dc.identifier.faculty: Engineering
dc.identifier.place: Moratuwa, Sri Lanka
dc.identifier.proceeding: Proceedings of Applied Data Science & Artificial Intelligence Symposium 2025
dc.identifier.uri: https://dl.lib.uom.lk/handle/123/24392
dc.language.iso: en
dc.publisher: Department of Computer Science and Engineering
dc.subject: Multi-document Summarisation
dc.subject: Natural Language Processing
dc.subject: Pre-trained Models
dc.subject: Adapters
dc.title: Adapter-based fine-tuning for PRIMERA
dc.type: Conference-Extended-Abstract
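
The abstract above describes bottleneck adapter modules inserted within transformer layers so that only a small fraction of the parameters is updated during fine-tuning. The PyTorch sketch below is a minimal illustration of that idea, not the authors' implementation: the names BottleneckAdapter and AdaptedEncoderLayer, the bottleneck width of 64, the hidden size of 768, and the single adapter placed after a generic encoder layer are all assumptions made for the example.

import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Houlsby-style bottleneck adapter: down-project, non-linearity,
    up-project, residual connection. Only these weights are trained."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()
        # Zero-initialise the up-projection so the adapted layer starts as an
        # identity mapping and the pre-trained behaviour is preserved.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


class AdaptedEncoderLayer(nn.Module):
    """Wraps a frozen transformer encoder layer and applies an adapter to its
    output (illustrative placement; Houlsby adapters add one per sub-layer)."""

    def __init__(self, base_layer: nn.Module, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.base_layer = base_layer
        self.adapter = BottleneckAdapter(hidden_dim, bottleneck_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.base_layer(x))


if __name__ == "__main__":
    hidden_dim = 768  # illustrative hidden size; LED variants differ
    base = nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=12, batch_first=True)
    model = AdaptedEncoderLayer(base, hidden_dim)

    # Freeze everything except the adapter parameters.
    for name, p in model.named_parameters():
        p.requires_grad = "adapter" in name

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"trainable parameters: {trainable}/{total} ({100 * trainable / total:.1f}%)")

    out = model(torch.randn(2, 16, hidden_dim))  # (batch, sequence, hidden)
    print(out.shape)

In a full setup the same pattern would be repeated inside each encoder and decoder layer of PRIMERA's LED backbone, typically leaving only a few percent of the model's parameters trainable.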

Files

Original bundle

Paper 57 - ADScAI 2025.pdf (161.96 KB, Adobe Portable Document Format)

License bundle

license.txt (1.71 KB): item-specific license agreed upon at submission
