Institutional-Repository, University of Moratuwa.  

Assessment and error identification of answers to mathematical word problems


dc.contributor.advisor Ranathunga, S
dc.contributor.advisor Dias, G
dc.contributor.author Kadupitiya, JCS
dc.date.accessioned 2017-12-19T00:22:48Z
dc.date.available 2017-12-19T00:22:48Z
dc.identifier.uri http://dl.lib.mrt.ac.lk/handle/123/12946
dc.description.abstract In mathematics, the term “word problem” is often used to refer to any mathematical exercise where significant background information on the problem is presented as text rather than in mathematical notation. This research focuses on word problems that have simple numerical and/or algebraic answers. Such word problems can be further categorized by domain, for example interest calculations, percentages, shares, and mensuration, and they appear in many international examinations. Existing research has produced solutions that focus only on questions from some of the aforementioned categories. Moreover, it has not addressed assessment based on a marking rubric. This thesis presents a system that is capable of assessing answers to both numerical and algebraic word problems using a (teacher-provided) marking rubric. Using this rubric, we automatically identify the exact errors (if any) made by students. The system is modularized and can be extended to support different types of word problems. If an answer contains a short sentence or phrase along with the numerical or algebraic expression, that text is also evaluated in order to check whether the student has actually understood the question. Our main focus is questions from the GCE Ordinary Level (O/L) Mathematics syllabus in Sri Lanka. Many students take this examination in Sinhala (an official language of Sri Lanka), so short-sentence evaluation had to be performed for Sinhala. This requirement led us to conduct the first research on short-sentence similarity measurement for Sinhala; the unsupervised similarity measurement technique we used showed results comparable to those reported for English. The system was thoroughly evaluated with student answers to questions from the GCE O/L examination. It was further tested on answers to word problems from the Cambridge Ordinary Level and Australian Year 10 international examinations, which demonstrated that the system can handle variations in questions across different examinations. en_US
dc.language.iso en en_US
dc.subject COMPUTER SCIENCE AND ENGINEERING
dc.title Assessment and error identification of answers to mathematical word problems en_US
dc.type Thesis-Full-text en_US
dc.identifier.faculty Engineering en_US
dc.identifier.degree Master of Science in Computer Science en_US
dc.identifier.department Department of Computer Science and Engineering en_US
dc.date.accept 2017-04
dc.identifier.accno TH3329 en_US

