Setting Quizzes/Exams

I. In the course design phase

 
1. Designing your exam

Exams are often regarded as official tests for students to demonstrate their knowledge of, and ability to apply, course material. Before setting the exam, identify the programme requirements that your exam needs to satisfy. Please note that at SMU, formal end-of-term exams are administered by the Office of the Registrar, in either a 2-hour or 3-hour format. A final examination is compulsory for all undergraduate core courses, including discipline cores (i.e. programme and major cores). Permission has to be sought from the Dean’s Office for exemption. Your exam can also contribute to:

  1. Ensuring standards are maintained. Imagine your course being audited or reviewed. Are there any external standards that you need to ensure are upheld in exams regarding the content covered, skills acquired, knowledge created, and/or attitudes acquired? More importantly, students who were certified to have passed the exam this year need to meet a similar standard as those who were certified the year before and so forth.
     
  2. Fulfilling course prerequisites. If your course is a prerequisite, you will be coordinating with instructors teaching other courses to find out about their requirements. This leads to the identification of important topics to be tested in your final exam. If your course is not a prerequisite for other courses, consider checking with your course coordinator and/or programme director to ensure alignment of your examinable topics with the overall programme.
     
  3. Ensuring achievement of learning objectives. Exams provide you (the instructor) with feedback on how well you have taught. Students’ performance on the exams can pinpoint areas where you should spend more time or to change your current instructional strategies. 
     
  4. Creating verifiable individual assessment components. This ensures that the work being assessed was done by the same student, and not by others. Students have the same tasks to do in the same way and within the same timeframe. 
     
  5. Contributing towards assurance of learning. Exams are part of the assessment plan where data about students’ achievement can be collected in a systematic manner. The data can be used to review and improve the course and the programme.
     
  6. Motivating students to learn. Exams can be used to motivate students, as they tend to review the materials more often when an evaluation is coming up.  An exam is also a learning moment in itself. Good exam questions create opportunities for students to apply what they have learnt and will lead to better retention. 

There are a number of ways to review and prioritise the concepts and skills taught in a course for an exam. You could:

  • Refer to your course outline for the list of topics you plan to cover
  • Review your lecture notes to find key concepts 
  • Compare textbooks to identify key topics
  • Review chapter headings and subheadings in the assigned readings
  • Talk to other instructors teaching the same course as you
  • Determine the usefulness and relevance to future courses, and/or careers

It is important that what you decide to assess in an exam is related to the learning outcomes of your course. The course learning outcomes prescribe what students are expected to demonstrate they have learnt, while exams show how they will demonstrate their learning. To ensure that the questions you have created meet the intended purposes of your exam, consider enlisting the help of other instructors teaching your course to review the questions and provide feedback.

2. Communicating to students about exams

As you design your exam, you can also decide how to communicate the purpose and parameters of your exam to your students. They will be able to prepare better if they know not only the structure of the exam, but also what is expected of them for each part. Although your students will not know what the exam questions are in advance, you can help them by being prepared to answer questions such as:

  • What is the format of the exam?
  • Which are the topics that will be covered?
  • Is it an open book/closed book exam?
  • Will I be allowed to bring in any material into the exam room?
  • Will I be given the option to choose which questions to attempt?
  • What happens if I am unable to sit for the exam, for good reason? Do I get to re-take?
  • How do I go about doing a grade appeal? 
  • Will I be able to see my examination answer script? 
  • Will allowances be made if English is not my first language?
  • When will I know my grade?

Click here to find out more about the SMU policy on examinations (Sign-in required).

3. Deciding on exam formats

At SMU, end-of-term exams are typically categorised as closed-book or open-book exams. Consider the following table to choose the format you will use for your exam(s).

Closed-book

Suitable learning objectives to assess:
The emphasis is on recall of knowledge. Hence, most closed-book exams ask students to remember, understand or apply knowledge.

Description:
Written (paper-and-pen): Students are expected to complete the exams on paper, without any sources of reference materials and within a stipulated time. They do not know what the questions are until they are sitting for the exams.

Online (via exam browser in eLearn): Students are expected to complete the exams online, without any sources of reference materials and within a stipulated time. They do not know what the questions are beforehand. Students attempt the exam via a secure exam browser in eLearn using their laptops. The secure exam browser prevents students from accessing the Internet during the exams. To find out more about administering online exams, please visit the eLearn resource page here.

Open-book

Suitable learning objectives to assess:
Because open-book exams do not place the same emphasis on memorisation, questions can move up Bloom’s Revised Taxonomy and ask students to analyse or evaluate knowledge, rather than just remember it.

Description:
Written (paper-and-pen): Similar to closed-book exams in that students do not know the questions beforehand and are expected to complete the exams within a given time, but they are allowed to bring in sources of reference materials. If you are concerned about students accessing the Internet (to obtain additional reference materials or for communication purposes), the Office of the Registrar strongly advises you not to allow the use of students’ laptops during the exams.

Table 1. Linking exam formats and types of learning objectives to assess

4. Writing your exam

A. Deciding on an appropriate mix of question types

To start you thinking about exam questions, consider the following differences between question types (Race, Brown & Smith, 2004).

  • Is a question assessing students on the product and/or process? You can test students’ reasoning skills and evaluate the process of how students achieve the solution. Alternatively, you can evaluate the end product. For example, a multiple-choice question is considered to be evaluating the outcome of students’ reasoning skills as they are graded on their choices (rather than the steps needed to arrive at the answers). Conversely, an essay question requiring students to analyse a scenario and put forth their arguments would be considered to be evaluating the process of their reasoning skills.
     
  • Is it specific subject knowledge that the question is testing, or is it how well students can apply such knowledge? You can consider designing a variety of exam questions that test students’ ability to recall information, apply and/or evaluate course materials. 

B. Writing exam questions

The following guidelines are useful when writing your exam questions:

  • Choose appropriate question types for your learning objectives. There is no single best type of exam question, but it is important that your question types reflect your learning objectives. For instance, if you want your students to articulate or justify an argument, short-answer or essay type of questions would be more appropriate as compared to multiple-choice questions. For an overview of the different question types, check out the CTE resource guide on “Checking Students’ Understanding Through Quizzes/Exams”.
  • Use multiple question types. An effective exam gives all students an equal opportunity to fully demonstrate their learning. Exams should include multiple question types so as to cater to the different abilities of students.

Check for content validity; in other words, check that the questions test your learning objectives, or help you to meet another requirement you identified earlier (e.g. a prerequisite). To ensure that your exam has content validity, create questions based on the materials covered in class. Expert judgment (not statistics) is usually the primary method for determining whether an exam has content validity. Enlist the help of your colleagues to review the questions so that they meet your intended purposes.

When writing exam questions, word the questions clearly to eliminate ambiguity. Avoid convoluted and complex sentence constructions that may confuse students. Read your own questions critically to check whether they could be misunderstood. Also, craft questions that are free of bias. When designing an exam, keep student differences in mind so that the questions do not create obstacles for some students. For instance, avoid setting questions that are easily understood by local students but are inaccessible to international students.

C. Other considerations to check before finalising your exam

  • Be realistic about what students can do in the permitted time. Students should be able to respond to all the questions in the given time. Try taking the exam yourself; revise the exam if you cannot complete it in about one-third of the time permitted.
     
  • Proofread your exam carefully. What seems perfectly clear to you may be confusing to others. Enlist the help of your colleagues to read through your exam to make sure that the questions are clear and unambiguous.
     
  • Accessibility. Exams evaluate learning objectives, not the speed, manual dexterity, vision, hearing, or physical endurance of the learners. Students with disabilities can request support for alternative modes of examination through the Diversity, Inclusion and Integration (DII) unit. The examination accommodations provided by DII include:
    • Separate venue
    • Extra time
    • Break time during the exam
    • Allowing the student to type (instead of write)
    • Assistive devices: text-to-speech software
    • A scribe (to transcribe answers for students who are unable to write)
    • Special seating arrangements
    • Adapted printed materials: changes in font size

DII accommodations are personalised according to the student’s needs. To request exam accommodations for students with special needs, contact DII at dii [at] smu.edu.sg

D. Preparing a grading guideline rubric or marking scheme

It is recommended to do this when you set the exam as you will often identify ways to clarify the exam questions as you create the grading guideline. Having created a grading guideline, you will also be able to communicate more clearly to your students about how to prepare for the exam. The purposes of the rubric or marking scheme are to set out:

  1. The categories or criteria against which the students’ work will be judged; and
  2. The explicit standards of performance for each category.

When designing your rubric or marking scheme: 

  • Look at what others have done in the past. Refer to marking schemes developed by other instructors as a reference on how marking schemes could be developed.
  • Create a marking scheme/rubric usable by non-subject-matter experts. Writing a model answer for each question is useful in allowing other assessors and students to easily understand your marking scheme/rubric. Informing students ahead of the exam of the criteria by which you will score their responses can also enhance the validity of your exam, as they are likely to focus their efforts on the topics you intended to assess them on.

Clearly specifying scoring criteria in advance of administering exams can improve the reliability of grading, as it helps ensure that grading is consistent across students. If there are multiple instructors for the same course and a standard exam is used, high inter-rater reliability can be promoted by having the instructors discuss how they marked, to ensure consistency in their interpretations of the rubric or marking scheme. In cases where exemplars are not highlighted in the marking schemes/rubrics, instructors should agree on acceptable responses and the marks allocated. Another option is to divide the marking load such that one instructor marks the same question across the cohort, based on a rubric/marking scheme agreed upon by all instructors teaching the course.
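One way to quantify how consistently two markers apply a rubric, sketched here with invented marks on a 0-5 scale, is to compute the exact-agreement rate together with Cohen’s kappa, a standard chance-corrected agreement statistic:

```python
# Illustrative inter-rater consistency check: exact agreement and
# Cohen's kappa for two (hypothetical) markers grading the same
# ten scripts on one question, marks on a 0-5 scale.
from collections import Counter

marker_a = [5, 4, 4, 3, 2, 5, 3, 4, 1, 2]
marker_b = [5, 4, 3, 3, 2, 4, 3, 4, 1, 3]

n = len(marker_a)
# Proportion of scripts where the two markers gave the same mark.
observed = sum(a == b for a, b in zip(marker_a, marker_b)) / n

# Chance agreement: how often the markers would agree if each assigned
# marks independently according to their own observed frequencies.
freq_a = Counter(marker_a)
freq_b = Counter(marker_b)
expected = sum(freq_a[m] * freq_b.get(m, 0) for m in freq_a) / n ** 2

# Kappa rescales observed agreement to remove the chance component.
kappa = (observed - expected) / (1 - expected)
print(f"agreement = {observed:.2f}, kappa = {kappa:.2f}")
```

A kappa close to 1 indicates strong agreement beyond chance; a noticeably lower value suggests the markers should revisit the rubric together before marking further scripts.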

Please refer to CTE’s website here for samples of grading rubrics contributed by other SMU instructors for grading students’ work.

E. Planning for a review session

Consider planning an exam review session to guide students in their preparation, e.g. using past exam questions as class exercises and sharing sample answers with your students. Getting your students to exchange their answers and guiding them to mark the responses using a marking scheme/rubric helps them better understand what you look out for during the marking process.

II. Communicating with the Office of the Registrar

 

Be aware of the exam requirements stipulated by the Office of the Registrar. At the beginning of each academic term, the Office of the Registrar will email all instructors about their exam and grading schedules. The email also contains important information such as:

  • University Exam Policy and Procedures (Documents can be retrieved from iNet > Teaching > Policies and Guidelines)
  • Submission of final exam requirements
  • Requirements for printing of exam question papers (written and online)

In addition, the Office of the Registrar stipulates a set of administrative guidelines for the administration and invigilation of exams. Click here to find out more (Documents can be retrieved from iNet > Teaching > Policies and Guidelines).

For any queries related to undergraduate exams, instructors can contact the Office of the Registrar at exam [at] smu.edu.sg

III. After your Exam is administered

 
Guidelines on marking exams
  • Review the marking scheme after the exam. After the exam, consider doing an initial review of some responses to a single question. You may sometimes find that students interpreted your question in a way different from how you intended, which may require you to adjust your grading scheme. This initial review can also allow for the identification of a few exemplars or “anchor” responses that most clearly correspond to your marking scheme/rubric. The comparability and fairness of the scores assigned to student responses can be enhanced by comparing each response to the selected anchor responses (Miller, Linn & Gronlund, 2013).
     
  • Make notes on student responses during marking. These notes can remind you of why you awarded the marks, which is useful should your students want to appeal their grades.
     
  • Consider marking all student responses to one question before moving on to the next. In the context of marking exams, a halo effect can occur when an assessor’s perceptions of an individual student bias the marking of various questions. It is therefore recommended to anonymise the scripts and to mark each question for all students before moving on to the next question, thereby avoiding this bias (e.g., McDonald, 1999, p. 24; Kahneman, 2011).
     
  • Review the quality of the exam for yourself and the module. Make notes on the questions which seemed to be well understood and those that were frequently misunderstood. If you administer online exams, you will be able to access statistical reports on student performance on each question. A typical measure is the percentage of students who answer a question correctly, which helps gauge the quality of the exam question.

    Another thing to look at is the correlation of students’ scores on one particular question with their scores on the exam as a whole (i.e. the item-total correlation). This correlation ranges from a low of -1.0 to a high of +1.0. The closer it is to +1.0, the more reliable the question is considered, because it discriminates well between students who mastered the test material and those who did not. As a general rule, questions with correlations below 0.20 are considered poor; 0.20 - 0.29 fair; 0.30 - 0.39 good; and 0.40 - 0.70 very good (McGahee & Ball, 2009).
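As an illustration of these two measures, the sketch below computes question difficulty and a corrected item-total correlation for a small, invented matrix of per-question scores (rows are students, columns are questions; 1 = correct, 0 = incorrect):

```python
# Illustrative item analysis for a small (hypothetical) score matrix.
scores = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 0],
]

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

totals = [sum(row) for row in scores]  # each student's exam total
for q in range(len(scores[0])):
    item = [row[q] for row in scores]
    difficulty = sum(item) / len(item)  # proportion answering correctly
    # "Corrected" item-total correlation: compare the item against the
    # total of the *remaining* questions, so an item does not inflate
    # its own correlation.
    rest = [t - i for t, i in zip(totals, item)]
    r = pearson(item, rest)
    print(f"Q{q + 1}: difficulty = {difficulty:.2f}, item-total r = {r:+.2f}")
```

The corrected variant (correlating each question with the total of the other questions) is a common refinement; online exam platforms typically report these statistics for you, so hand computation is rarely needed in practice.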

    Such feedback and reflection is useful when designing questions the next time round, and when making changes to part of the course or module content and/or the instructional strategies.

IV. Releasing results

 

After you are done with the marking of your final exams, please check with and adhere to your school’s guidelines on grade release procedures. The school will typically review the grade distribution for overall final grades and marks before you submit the information to the Office of the Registrar. The Office of the Registrar will officially release overall final grades to students, who can view their grades via OASIS.

After the official release of the exam results, your students may submit an official Appeal for Review of Grade to the Office of the Registrar if they believe that there has been a mis-grading. Students have up to 3 working days from the official release of exam results (exclusive of the release date) to submit the appeal.

The Office of the Registrar and the respective schools will take the necessary action to follow up on the appeal. When students approach you about their performance in your course during this period, you can use this as a further opportunity for feedback, but it is important to communicate effectively, since poor communication can negatively impact the grade appeal process. In addition, students are not permitted to view their exam answer scripts during the grade review.

More information regarding grade appeal can be found in OASIS.

Bibliography

 
  1. Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
  2. McDonald, R. P. (1999). Test theory: A unified treatment. Mahwah, NJ: Lawrence Erlbaum.
  3. McGahee, T. W., & Ball, J. (2009). How to read and really use an item analysis. Nurse Educator, 34(4), 166-171.
  4. Miller, M. D., Linn, R. L., & Gronlund, N. E. (2013). Measurement and assessment in teaching. London: Pearson.
  5. Race, P., Brown, S., & Smith, B. (2004). 500 tips on assessment. London: Routledge.
  6. Rowntree, D. (1987). Assessing students: How shall we know them? London: Kogan Page.