**Introduction**

This article analyses how students respond to conceptual word problems in mathematics. The questions were solved by children on Mindspark, an online learning product created by Educational Initiatives Pvt. Ltd. The data recorded includes student responses (by grade), the time taken to answer each question, and the percentage of students answering correctly. Correlations between time and accuracy are discussed with the aim of giving insights into how students might be thinking about and approaching problems.

**Research Methodology**

This research was conducted in 2017, when Mindspark had over 75,000 subscribed students across grades 1 to 10. A question (or item) is created through the following stages.

First, a question maker creates an item in the system by adding the question stem, images (if any), the correct answer and an answer explanation. Second, they share the item with a peer or manager, who critically reviews it and suggests changes and improvements; once the suggestions are discussed and incorporated, the item is approved in the system. Third, the item is activated for specific grades and for a specified time period, which makes it go live in the product for children to attempt.

Each student has a unique user identification number, which ensures a single attempt per question per student. Students are under no time pressure as the software is used as a learning tool, not an assessment tool. Once a question is deactivated, the student response data can be analysed further. In this article, one fill-in-the-blank question and one multiple choice question are studied.

**Question 1**

This question tests reasoning and logical skills – cultivating both is vital to learning and engaging with mathematics. *Table 1* shows the variation of accuracy with grade.

The correct answer is 2 hours, as all the towels dry at the same time and are of similar quality. Based on the preliminary data, this question is tricky for students of all grades, with a gradual increase in accuracy by grade – expected, as reasoning and logical thinking skills develop with age. Let’s go a bit deeper into the student response data.

From *Table 2*, we see that there is a __decrease in accuracy with an increase in time taken__ to solve the question. This indicates that students who quickly grasp the essence of the problem have a higher chance of getting it right than those who are led by the numbers they see in the problem and try to connect them with a mathematical operation. Students who spend more than 10 seconds on this problem appear to reduce it to a “unitary method problem”, multiplying 6 towels by 2 hours to get 12 hours as the answer. This explains the __increase in the percentage of students answering 12 as more time is taken__, before the trend stabilises. The prevalence of 12 as the common wrong answer, particularly among students who spend longer on the question, suggests that they are solving it without examining the context and situation outlined – comparing it to questions like “1 banana costs Rs. 4. How much will 6 such bananas cost?” to obtain their answer.
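The two lines of reasoning can be contrasted in a short sketch. This is a toy illustration of the misconception described above, not part of the original analysis, and the setup (1 towel drying in 2 hours, 6 towels on the line together) is inferred from the discussion:

```python
# Toy illustration of the two readings of question 1.
# Assumed setup: 1 towel takes 2 hours to dry; 6 towels hang on the line together.
hours_per_towel = 2
towels = 6

# Mechanical "unitary method" reading: connect the two numbers with multiplication.
unitary_answer = towels * hours_per_towel  # 12 hours, the common wrong answer

# Contextual reading: all towels dry in parallel, so the time does not scale.
correct_answer = hours_per_towel  # 2 hours

print(unitary_answer, correct_answer)  # 12 2
```

The point of the contrast is that the wrong answer is not a calculation error; it is a correct calculation applied to the wrong model of the situation.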

A key learning for educators and teachers is that children should be encouraged to read and understand problems from a practical viewpoint and not adopt mechanical approaches that aim to connect the problem with keywords.

**Question 2**

This question tests conceptual understanding of fractions with options designed to identify misconceptions and errors in analysing the question. It is an application-oriented question on a daily life scenario. Data indicating the different responses of students of grades 6 to 9 is shown in *Table 3* (the correct answer, option B, is highlighted in green).

There is a marginal increase in accuracy with grade; however, the data reveals that students are weak at applying knowledge of fractions to a real-life problem regardless of their grade. Even at the grade 9 level, only about 35% of students have got this question right. The common wrong answer is option A (2/9): here, students seem to be __counting the number of divisions instead of the number of parts__ of the fuel gauge, giving a denominator of 9. These students then seem to be __counting the division that the pointer points to as the second division from the ‘E’ mark__, obtaining the fraction 2/9. Around 25% of students opt for option C: here, students seem to be __correctly seeing that the gauge is divided into 8 equal parts__ but then __counting the division that the pointer points to as the third division, including the ‘E’ mark__, obtaining the fraction 3/8.
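The fence-post counting behind the three answers can be made explicit in a short sketch. The reading of the gauge here is an assumption reconstructed from the wrong answers described above: a scale from E (empty) to F (full) split into 8 equal parts, hence 9 tick marks including E and F, with the pointer on the second tick after the E mark:

```python
from fractions import Fraction

# Assumed gauge layout (inferred, not from the original question): E to F is
# split into 8 equal parts, so there are 9 tick marks including E and F,
# and the pointer sits 2 parts above the E mark.
parts = 8
ticks = parts + 1            # fence-post count: 9 marks bound 8 parts
parts_from_E = 2             # parts between E and the pointer

correct = Fraction(parts_from_E, parts)        # 2/8 = 1/4 (option B)
option_a = Fraction(parts_from_E, ticks)       # 2/9: denominator taken from tick count
option_c = Fraction(parts_from_E + 1, parts)   # 3/8: pointer tick counted including E

print(correct, option_a, option_c)  # 1/4 2/9 3/8
```

Both wrong answers come from the same off-by-one confusion between marks and intervals, applied at the denominator (option A) or the numerator (option C).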

With regard to the time taken to solve the question, some insightful trends emerge, discussed below. About 20 seconds is estimated to be sufficient for an average child to read and answer, as the question stem is short, the testing idea is conceptual, and no calculations or operations are involved.

There was a __drop in accuracy of ~10% when students spent more time on this question__. This points to its highly conceptual nature: students who were clear that the whole needs to be divided into equal parts to determine a fraction, and who could translate this concept to the context, got it right without spending much time on it. On the other hand, students who were not conceptually strong in fractions probably spent more time on the question and still got it wrong because the basic understanding of the concept was lacking.

**Conclusions**

One significant learning from this article is that students should be encouraged to __read and understand math problems, not simply solve them__. In the education system these children study in, math learning tends to focus on repetitive skill-based problems. At times, students end up with superficial knowledge that is insufficient to answer application-based problems. For instance, in question 1, students appear to ‘solve’ the problem from a purely numerical perspective without trying to ‘read and understand’ what is actually happening.

In question 2, the data on the common wrong answer suggests that students are likely to be counting the divisions instead of the number of parts that the fuel gauge has been divided into. Teachers’ instruction should aim to guide students to read and understand the question carefully before deciding on how their conceptual knowledge can be applied.

The author hopes that articles such as these will guide teachers to emphasise the process and approach to solving problems instead of focusing on whether the final answer is ‘right’ or ‘wrong’. Otherwise, we fail as teachers and educators if students think of math only in terms of numbers, operations and exam scores.

**Acknowledgements**

The idea for question 1 was contributed by Ramyaa N of class 6G of Kamala Niketan Montessori School, Tamil Nadu, India. It was converted into a question in the system using the process outlined in the research methodology section. Question 2 was created by the author of this article. Analysis of the data was done by running queries in HeidiSQL, an open-source client for MySQL.
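The kind of aggregation described throughout the article (accuracy broken down by time bucket) can be sketched as follows. This is a hypothetical illustration: the record fields, sample values and bucket edges are invented for the sketch and are not taken from the Mindspark schema or data:

```python
from collections import defaultdict

# Hypothetical response records: (grade, seconds_taken, answered_correctly).
# Values are illustrative only, not real Mindspark data.
responses = [
    (6, 8, True), (6, 14, False), (7, 25, False),
    (8, 9, True), (9, 31, False), (9, 12, True),
]

def time_bucket(seconds):
    """Assign a response to a coarse time bucket (illustrative edges)."""
    if seconds <= 10:
        return "0-10s"
    if seconds <= 20:
        return "11-20s"
    return ">20s"

totals = defaultdict(lambda: [0, 0])   # bucket -> [correct, attempted]
for grade, seconds, correct in responses:
    bucket = time_bucket(seconds)
    totals[bucket][0] += int(correct)
    totals[bucket][1] += 1

accuracy = {b: c / n for b, (c, n) in totals.items()}
print(accuracy)  # {'0-10s': 1.0, '11-20s': 0.5, '>20s': 0.0}
```

In practice the same grouping would be expressed as a SQL `GROUP BY` query over the response table, which is what the HeidiSQL workflow mentioned above amounts to.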

### Shreyas Vatsa

- Using Data to Gauge Student Thought - December 10, 2020