Exploiting adapters for question generation from Tamil text in a zero-resource setting

Date

2022

Abstract

Automatic Question Generation, which focuses on generating questions from a span of text, is a significant problem in Natural Language Processing (NLP). Question generation in low-resource languages is under-explored compared to high-resource languages. In earlier work, all the parameters of a pre-trained multilingual language model were fine-tuned to perform zero-shot question generation and other sequence-to-sequence (S2S) generation tasks. However, such full-model fine-tuning is not computationally efficient. To mitigate this issue, recent research introduced a small neural module called an adapter into each Transformer layer of a pre-trained language model and fine-tuned only the adapter parameters. In this study, we explored single-task adapters and adapter fusion on the pre-trained multilingual model mBART to generate questions from Tamil text. Our best model produced a ROUGE-1 (F1) score of 16.9. Furthermore, we obtained similar results with two variants of adapters, the Houlsby adapter [1] and the Pfeiffer adapter [1], which is consistent with the results reported for adapters on other S2S tasks [2].
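To illustrate the adapter idea described above, the following is a minimal numpy sketch of a Houlsby-style bottleneck adapter: the hidden state is projected down to a small bottleneck, passed through a nonlinearity, projected back up, and added to the input via a residual connection. The dimensions and initialization are illustrative assumptions (mBART itself is not loaded here); the point is that only the two small projection matrices would be trained, while the backbone stays frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_adapter(hidden_dim, bottleneck_dim):
    """Return the trainable parameters of one adapter module
    (hypothetical initialization scale)."""
    return {
        "down": rng.normal(0.0, 0.02, (hidden_dim, bottleneck_dim)),
        "up": rng.normal(0.0, 0.02, (bottleneck_dim, hidden_dim)),
    }

def adapter_forward(h, params):
    """Bottleneck transform with a residual connection."""
    z = np.maximum(h @ params["down"], 0.0)  # down-project + ReLU
    return h + z @ params["up"]              # up-project + residual

hidden_dim, bottleneck_dim = 1024, 64        # mBART-like hidden size, small bottleneck
adapter = make_adapter(hidden_dim, bottleneck_dim)

h = rng.normal(size=(8, hidden_dim))         # hidden states for 8 tokens
out = adapter_forward(h, adapter)

# The adapter preserves the hidden-state shape, so it can be inserted
# into a Transformer layer without changing the surrounding computation.
print(out.shape)

# Only the adapter's parameters are fine-tuned; note how few there are
# compared to a full Transformer layer.
n_trainable = sum(p.size for p in adapter.values())
print(n_trainable)
```

Because the output shape matches the input, one such module can be dropped into every Transformer layer, which is what makes adapter fine-tuning parameter-efficient relative to updating the full model.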

Description

Citation

Purusanth, S. (2022). Exploiting adapters for question generation from Tamil text in a zero-resource setting [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. http://dl.lib.uom.lk/handle/123/21590

DOI
