Institute for Teaching and Learning, University of Sydney
Issue 24, November 2006
Using student-generated assessment items to enhance teamwork, feedback and the learning process
Neal Arthur
Faculty of Economics and Business
  

This paper describes an innovative method used to enhance student learning and collegiality through the use of student-generated multiple choice questions (MCQs). The unit of study in which these developments have been applied is Advanced Financial Reporting (ACCT 6010), a Faculty of Economics and Business postgraduate award unit of study. This development follows research by Brink et al. (2004), who document a positive link between the quality of student-developed model examinations and final examination scores. Traditionally, teacher-written MCQs are used as an assessment device to provide feedback on students’ performance. We have also used student-written MCQs to achieve four additional benefits. First, in designing the scope and content of the set of MCQs for a topic, each team of students is directed to focus on the teacher-specified learning objectives and is better able to identify the links between those objectives and the material in the text and course pack. Second, as a key part of the learning process, the writing task leads students to ask “what are the important concepts and/or methods related to this topic?” Third, the task of writing MCQs may improve students’ test-taking strategies: the guidelines we provide for writing MCQs can in many cases be ‘mapped’ to guidelines for answering them.

Finally, as the unit of study has several streams (classes), we have been able to assemble a student-developed practice test bank, comprising a selection of questions on each topic. This article documents the motivation for this strategy and explains how it can be adopted by other teachers.

Motivation
Given the pace of change in accounting regulation, the ability of teachers to use the test banks that frequently accompany the overseas and more popular local accounting texts is often limited. Questions quickly become out of date: they refer to obsolete regulations, specify accounting techniques that are no longer applied, or present data in a format that is no longer used. In addition, over time answers change and answer keys can become incorrect. In contrast, student-written multiple choice questions will by nature be up to date, provided of course that the teacher and the unit of study materials provided to students are up to date with relevant current regulations and developments in theory and research.

Whilst multiple-choice questions are widely used in examinations, they have been subject to a variety of criticisms. One significant criticism of their use in business units of study is that many business problems rely on managers identifying what the feasible alternatives actually are, as well as choosing the most appropriate alternative from amongst those identified. In contrast to the nature of decision making, multiple-choice examinations require students to make a selection from a set of alternatives provided to them, usually developed by the teacher. One approach designed to address this deficiency is to allocate the question-writing task to students so as to engage them more fully in the process of problem solving. This assists them to develop an understanding of a preferred solution to a problem and, importantly, requires them to identify a series of plausible alternatives. As part of the process of constructing the question, students develop explanations as to why each of the plausible alternatives (distracters) is either incorrect or at least inferior to the preferred alternative. This should assist in developing skills in problem solving and in choosing from amongst a number of alternatives.

Through the process of developing alternative plausible solutions, groups of students may broaden their understanding of a concept beyond the simple “right” answer, to consider variations in the meaning of a concept and the interrelationships between concepts. Further, the process of developing multiple choice questions encourages students to distinguish between views and methods that represent good and poor understanding of a concept or its application.

A further practical advantage of student-written questions is that they are replaced by new sets every semester, unlike the test banks provided by publishers. At no great cost to the teacher, this allows students to keep the test paper and to reflect on their responses to each question. This leads to more useful feedback, particularly (as in accounting units of study) where an incorrect answer to a numerical question could arise for a large number of different reasons.

A further pedagogical advantage is that it ensures a better match between the teacher-developed learning objectives for each topic and the multiple choice test items than does the use of standardised test banks. This provides more relevant feedback to both students and teacher about the extent to which the learning goals are being met.

How it works
The unit of study is divided into streams of approximately 50 students each. The different streams have a common unit of study outline and identical assessment tasks. Different topics are taught by different teachers, based partly on the research interests and experience of the teachers on the course. Thirteen groups of three or four students were formed, with students self-selecting into groups for the task. We allowed students to form their own groups in order to minimise problems arising from differences in timetables, language and cultural factors. In other units of study, groups are formed by the teacher so that students develop a capacity to work with and learn from those from diverse backgrounds.

On the day prior to each class, an email message is sent to a selected group advising them that they will be asked to write questions on the topic to be covered in class the following day. This notice is given in advance of the class so that students can consider questions that might be based on the class discussion and activities as well as the material from the text and readings pack. To reduce the administrative burden, emails to student groups are prepared in batches every three or four weeks and held by the email system (Outlook) until the designated day and time for dispatch. Students have seven days from the date of the email (six days from the date of the class in which the topic is first addressed) to write questions on the allotted topic and submit the assignment electronically via the Blackboard site. The assignment is submitted one day prior to the class in which the questions will be answered by the other students in the stream.
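As a rough illustration of this timing (a weekly class schedule is assumed; the dates and the helper function below are hypothetical, not part of the unit's actual administration), the key dates for one group can be computed as follows:

from datetime import date, timedelta

def question_writing_schedule(class_date, next_class_date):
    """Illustrative timing for one group's question-writing task (hypothetical helper).

    Follows the conventions described above: the notice is emailed one day before
    the class that introduces the topic, and the completed questions are due seven
    days after the email, i.e. one day before the following class in a weekly schedule.
    """
    email_sent = class_date - timedelta(days=1)      # advance notice the day before class
    submission_due = email_sent + timedelta(days=7)  # seven days from the date of the email
    quiz_in_class = next_class_date                  # questions answered in the next class
    return {"email_sent": email_sent,
            "submission_due": submission_due,
            "quiz_in_class": quiz_in_class}

# Example: a topic taught on 14 August 2006, with the quiz held in class on 21 August 2006.
print(question_writing_schedule(date(2006, 8, 14), date(2006, 8, 21)))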

Each group is required to prepare between six and eight questions, to allow for variation in the time required to answer individual questions. Students are advised to prepare a quiz that could be completed in ten minutes. If the majority of students within the stream cannot complete the quiz within this time, the teacher allows extra time so that enough answers are obtained to each question to draw conclusions.

Students are told that the teachers will not edit the questions to correct perceived problems in the questions or the responses (alternatives). This includes possible cases of ambiguity in the question, more than one correct answer, no correct answer or errors in the answer key. The only editing done by the teachers is to insert or remove page breaks prior to printing where necessary. This approach avoids disputes that might arise from teacher changes, such as providing different groups with different levels or types of assistance.

The student-written questions are answered by other students in the stream following the session in which the related material was initially addressed. This approach allows students the time necessary to complete the required reading and personal study questions assigned for the related topic. The weekly review questions also serve to further encourage students to keep up to date with reading and study activities. Students attempting the quiz are allowed a fixed amount of time determined by the teacher. They record their answers on a standard answer sheet and also on the question paper. At the end of the quiz, students hand in their answer sheets and retain a copy of the test paper.

Immediately following each test, students are provided with the answer key as advised by the authors of the questions. This provides students with immediate feedback. We also encourage students to discuss other answers, which provides lecturers with feedback about the areas where students experience most difficulty. Informal feedback is also provided to the authors of the questions. This is always a good opportunity to discuss the material with students who often do not otherwise interact with faculty staff in small groups.

Where we considered it necessary, comment was made on the answers provided by students. In our experience, cases where one or more of the answer keys are incorrect are rare. One reason for this is that assignments are prepared on a group basis and discussed within the group, or “trial sat” by other students in the group. Questions tend to be unambiguous; evidence for this comes from data on the percentage of correct answers, which is discussed further below in the context of assessment.

Assessing the MCQs
Completion of the question-writing task resulted in the award of up to five marks towards each student’s unit of study total. There are a number of ways in which questions could be ranked. Individual questions could be judged on originality, degree of difficulty, and the extent to which they relate to the learning objectives of the topic. For the questions as a set, consideration could be given to the breadth of coverage, depth of coverage (e.g., using Bloom’s taxonomy or the ‘revised taxonomy’ (Anderson et al., 2001)) and the time required to complete the questions.

The student questions are graded on two criteria: the link to the topic and the percentage of correct peer responses. The first criterion was included to discourage questions based on prior topics (unless linked to a later topic) or on material that might be covered in the related textbook chapter but is explicitly excluded from coverage in the unit of study. The second criterion serves a number of functions. A low percentage of correct responses (normally) penalises an incorrect answer key, more than one correct answer and questions which are too difficult. A very high percentage of correct responses is also penalised, on the grounds that such questions may be trivial in nature or that the distracters provided were not designed well enough to allow for common alternative approaches or minor variations in a concept or method.

In grading questions against the percentage of correct responses, we set wide boundaries and consider acceptable any question for which the percentage of correct responses falls in the range 30%-80% inclusive. Table 1 below provides a summary of the scores for this task.
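As a minimal sketch of how this second criterion could be applied (the function below and its handling of thresholds are illustrative assumptions, not the unit's actual marking scheme), the 30%-80% band check amounts to:

def within_acceptable_band(correct_responses, total_responses, lower=30.0, upper=80.0):
    """Return True if the percentage of correct peer responses falls within
    the acceptable band (30%-80% inclusive, as described above)."""
    if total_responses == 0:
        return False
    percent_correct = 100.0 * correct_responses / total_responses
    return lower <= percent_correct <= upper

# Example: 22 of 48 correct (about 46%) is acceptable; 46 of 48 (about 96%)
# is too high and would attract a penalty as potentially trivial.
print(within_acceptable_band(22, 48))  # True
print(within_acceptable_band(46, 48))  # False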

Student-written MCQs can also be used to generate practice question sets for students’ examination preparation. Prior to the dates of the mid-semester and final examinations, the unit of study teachers assemble test banks from the sets of student-designed questions. Questions are selected to provide a breadth of coverage of the topics and a degree of difficulty similar to that of the questions written by teachers for the examination. Completing these revision tests enables students to identify topics or methods that require further attention prior to the examinations, including topics in which they had not performed well based on earlier feedback such as the in-class multiple choice questions. Students access the practice sets via the unit of study web page. This becomes viable where a unit of study is taught in streams and students agree to share questions across streams; it further enhances teamwork and provides a sense of coherence to a unit of study that can otherwise feel divided by the streaming process.
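A simple way to picture this assembly step is sketched below (the data structure, field names and selection rule are illustrative assumptions rather than the procedure actually used): questions are grouped by topic and a few are drawn from each topic within a target difficulty band.

import random

def build_practice_bank(questions, per_topic=4, lower=30.0, upper=80.0, seed=0):
    """Assemble a practice set from student-written questions (hypothetical sketch).

    `questions` is a list of dicts with keys 'topic', 'text' and 'percent_correct'.
    Questions outside the target difficulty band are excluded, and up to
    `per_topic` questions are drawn per topic to give breadth of coverage.
    """
    rng = random.Random(seed)
    by_topic = {}
    for q in questions:
        if lower <= q["percent_correct"] <= upper:
            by_topic.setdefault(q["topic"], []).append(q)
    bank = []
    for topic in sorted(by_topic):
        pool = by_topic[topic]
        rng.shuffle(pool)
        bank.extend(pool[:per_topic])
    return bank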

Conclusions
The innovation used in this unit of study, the use of student-written multiple choice questions, was well received by students as a means of providing a more active approach to learning as well as improving their ability to analyse and respond to multiple choice questions used in examinations. The approach encourages students to focus on the learning objectives of the individual topics covered in the unit of study and the links between these objectives and the material covered in the text. The development of questions also led students to consider a variety of possible solutions to accounting problems and possible subtle variations in the meanings of concepts and their application.

The advantages of using student-written rather than teacher-written questions need to be balanced against the extra time required to administer the processes involved in communicating with students and in assessment. If the quizzes are to be used in class, another approach might be to use current infra-red response devices, which would eliminate paperwork from the administration of the test. However, many of the processes that have been employed could to a large extent be undertaken using online tests completed outside of class time. This could be enhanced by the use of online discussion boards and dedicated forums, which would enable students to discuss other possible answers.


References
Anderson, L. W. & Krathwohl, D. (Eds.) (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.

Brink, J., Capps, E. & Sutko, A. (2004). Student Exam Creation as a Learning Tool. College Student Journal, 38 (2), 262-272.

Ellsworth, R. A., Dunnell, P. & Duell, O. K. (1990). Multiple-Choice Test Items: What are Textbook Authors Telling Teachers? Journal of Educational Research, 83(5), 289-293.


Neal Arthur is a Senior Lecturer in the Faculty of Economics and Business, and a recipient of a School of Business Award for Excellence in Teaching in 2005. n.arthur@econ.usyd.edu.au



