
Lesson 4: Design decisions for exams

Exam formats

Exams are a mainstay of the student assessment experience. Because they are so common, it is easy for instructors to use exams without explicitly reflecting on whether the format meets the assessment outcomes. When determining an exam’s format, instructors need to consider how their choices can impact students’ perception of and preparation for the assessment.

Types of exam questions

When selecting an exam format, instructors need to consider how that format influences and informs the approach students take to studying.

Structured response questions are questions that provide a set of responses from which students select their answer. These include multiple-choice, true/false, matching, and ranking questions.

  1. Benefits

    Students have higher success expectations for structured response exam formats. Students perceive these assessments as easier and associate them with assessments of declarative knowledge. As such, many students perceive multiple-choice tests as less threatening because there is greater predictability about the testing situation. Structured response questions can often be computer-graded, allowing for faster feedback in large classes.

  2. Drawbacks

    Because students associate structured response tests with memorization, they are less likely to engage in a deep learning approach. If a multiple-choice test requires higher-order thinking (e.g., application or evaluation), students may not prepare appropriately because they are less familiar with these question types (Tempelaar, 2020).

    Multiple-choice exams can be particularly challenging for students with lower self-efficacy or who are more likely to face stereotype threat (van de Watering et al., 2008). Multiple-choice tests may also measure a student’s risk tolerance and confidence in decision-making rather than knowledge of the topic. Risk-averse students are likely to score lower on structured response questions than they would if tested on the same information with a constructed response question.

Constructed response questions refer to questions in which students are provided with a prompt or problem and must construct their own response rather than select from provided options.

  1. Benefits

    Constructed response questions tend to encourage deep learning. Students perceive this form of testing as more reflective of their overall learning in an area (Birenbaum & Feldman, 1998).

  2. Drawbacks

    Despite perceiving constructed response questions as a tool that better reflects their learning, students also describe them as more challenging, which could heighten the threat response in some students. In large courses, constructed responses can be challenging to grade, both in terms of the volume of material and in ensuring the grading is not influenced by the verbosity (number of words) and fluency (breadth of vocabulary) of the responses.

Collaborative testing involves a timed assessment during which small groups of students can discuss the questions and either independently or collectively indicate their answers. Typically, collaborative testing uses a form of structured response question.

  1. Benefits

    Using collaborative or team testing as part of a test can reduce students’ perception of the test as a threat (Pandey & Kapitanoff, 2011). In particular, team testing reduces anxiety when offered as a supplement to tests taken individually. In general, students perform better on collaborative tests than on individual tests, particularly on questions that require higher-order thinking.

  2. Drawbacks

    Team testing creates a noisier testing environment, which can impair concentration for some students. This exam format can also present the same concerns as other student group work contexts, such as social loafing and interpersonal conflict.

Oral exams involve students responding verbally to questions posed by an examiner.

  1. Benefits

    Oral exams have not been common at the undergraduate level of Canadian post-secondary education; however, with growing concerns regarding academic integrity, this form of testing is becoming more common (Akimov & Malin, 2020). Oral exams provide students with more immediate feedback on their performance, as examiners may ask follow-up questions. Similarly, students can ask clarifying questions of the examiner. This increased feedback and dialogue could reduce anxiety in some students.

  2. Drawbacks

    Because of their lack of familiarity with oral exams, students may feel increased anxiety and uncertainty about how to prepare for them. Consider providing examples of preparation strategies and of successful oral exam performance, as well as clear rubrics for evaluation.

Seen question exams involve providing students with a series of questions before the exam and then selecting a subset of those questions for the examination.

  1. Benefits

    Seen question exams can decrease anxiety by reducing the uncertainty of the exam, and they are most effective when the questions require deep learning and synthesis (Turner & Briggs, 2018).

  2. Drawbacks

    If seen questions are too broad or cover too large a scope of course material, students’ motivation may decrease because the workload is too high (Reimann, 2011). Seen question exams that focus on declarative knowledge can also be less effective, as students may collaborate and share resources.

Take-home exams involve the completion of an assessment for which the time limit is given in days rather than hours. They are distinct from other assignments in having a more compressed time frame and higher stakes (Bengtsson, 2019).

  1. Benefits

    Take-home exams reduce anxiety by relaxing the time constraints on students. Their flexibility allows students to select the time and place of completion. They also allow for the use of more open-ended or higher-order questions.

  2. Drawbacks

    Without a structured time limit, students can struggle to determine what quality of work is expected. Instructors are advised to specify which resources (including classmates) can be accessed during the take-home exam. Because take-home exams are un-proctored, questions should be specific and contextualized to the course rather than tests of declarative knowledge, which students can easily look up or share.

Time provided

Time pressure during exams negatively impacts the measurement quality of the exam and increases students’ perceptions of its difficulty. Time pressure can also disrupt students’ use of effective exam-taking strategies (Stadler et al., 2021). For example, under time pressure students may skip reviewing the whole test or fail to budget an appropriate amount of time per question.

Differentiate between how long the test is expected to take and the additional time provided to support students. For example, a test of 60 multiple-choice questions could be described as typically taking students an hour to complete; however, in this class, you have been provided with 90 minutes to complete the test.

Rather than directly increasing the test’s length, you can provide lead time before the start of the exam. Lead time (approximately 5 minutes) allows students to look over the exam, plan their approach to completing it, and make notes or concept maps. This time aims to have a settling effect, allowing students to engage in their study strategies and focus on the exam (Whalley, 2021).

Number of attempts

Students who are provided more attempts to retake or re-sit an exam report less anxiety before the exam; however, students who are allowed to retake an exam tend to do worse on the first sitting. Students do improve on subsequent attempts, which suggests they may adjust their study approach using feedback from the first attempt. The first attempt may serve to judge the exam’s difficulty and the areas it covers.

Exam resources

[Audio: A student describes their experience with test anxiety and using exam resources. Voice actor: Nolan Peters]


Suggested resources

Exams are often conducted in a restrictive environment where students are not allowed to access the resources they used while studying throughout the course. Consider whether the following resources could be allowed for your test.

Students report that open book tests appear less threatening because they have additional resources to deal with the test, unless they perceive that the test was constructed to be harder (Block, 2012). However, students may not adequately prepare for these exams under the assumption that they can look up the material, and may then spend a disproportionate amount of the exam time searching for information. Research suggests that students do not necessarily perform better on open book than on closed book exams. Open book exams can prompt deep learning, particularly when students are required to use multiple sources to prepare for the exam (Johanns et al., 2017). Open book exams also appear to better reflect the “real world”, which can increase students’ perception of the validity of the assessment.

  1. Provide study strategies

    Students may be unfamiliar with open book exams and struggle to identify appropriate study approaches. Consider distinguishing between core knowledge (what the student knows without a reference) and backup knowledge (what the student understands and locates using a reference). Encourage students to create a schema: a way of organizing ideas to create meaning based on existing associations. Prompt students to connect ideas across the course (horizontal organization) as well as focus on skills or topics (vertical organization).

  2. Specific resources

    Open book exams should have clear instructions about the type and format of the resources allowed in the exam. Determine whether resources must be printed or bound, and whether annotations are permitted. Consider including this information in multiple places for students to reference, including the course outline and the course D2L site.

Student-prepared testing aids include any resources that a student has created that are allowed in the exam. These may be referred to as note sheets, formula sheets, or cheat sheets. These aids reduce student anxiety by relieving concerns about memorizing the material. In comparison to open book exams, students with note cards tend to spend less time seeking information on the cards and therefore score higher on the test (Block, 2012).

  1. Structuring aids

    The more preparation required to create a testing aid, the better the student’s performance on the test. Providing students with guidelines for the aid requires them to engage with the course material rather than copy material directly onto the aid (Nsor-Ambala, 2020). These structures encourage students to engage in deep learning and support their metacognition of the course material. Possible structures include argumentation maps and concept maps.

Standard formula sheets also reduce concerns regarding memorization. Formula sheets help to reduce anxiety when students have access to them and can use them while preparing for the exam. If a formula sheet contains errors, students’ anxiety may increase as they doubt their knowledge and the accuracy of their responses.

Consider providing students with scrap paper that can be detached from the test questions during the exam. Scrap paper supports a variety of test-taking strategies, such as noting key concepts and strategies to use during the test, covering up parts of the exam, and working out answers to multiple-choice questions.

Even though students can complete basic arithmetic, these calculations increase the load on students’ working memory. When allowed access to calculators, students report lower anxiety, reduced concern over arithmetic errors, and less preoccupation with the time left in the exam. For complex multistep problems, students may prefer calculators with larger screens (associated with programmable calculators), which allow them to review their input for typographic errors.

  1. Define the calculator

    Reduce confusion about the permitted calculator by defining it in terms of the functions that are allowed. Consider indicating whether the calculator is limited to arithmetic functions (fewer than 25 keys), includes scientific functions, and/or allows programming or graphing.

Having access to a ruler can help students avoid misreading questions: by covering up parts of the exam, the ruler guides their eye movement so that each question is read correctly. Rulers can also help students complete scantron/optical score sheets by reducing the likelihood of skipped questions or incorrectly entered responses.


Reflection questions:

  • Considering the past tests you designed, what resources did you allow or provide for the test? If you provided resources, did they impact how students responded to the test? Did students appear more or less relaxed than in previous tests? 
  • If you haven’t allowed or provided any resources in your tests, what resources might you consider allowing? How might these resources support students in focussing on higher-order learning over memorization? 

Lesson checklist

After this lesson, you will be able to: 

  • Describe various skills that are assessed through testing structures and confirm they match learning outcomes 

  • Articulate the benefits and challenges of particular test structures on test anxiety and study behaviour 

  • Discuss various exam resources that can reduce common student concerns 

References

Akimov, A., & Malin, M. (2020). When old becomes new: A case study of oral examination as an online assessment tool. Assessment & Evaluation in Higher Education, 45(8), 1205–1221. https://doi.org/10.1080/02602938.2020.1730301 

Bengtsson, L. (2019). Take-home exams in higher education: A systematic review. Education Sciences, 9(4), 267. https://doi.org/10.3390/educsci9040267 

Birenbaum, M., & Feldman, R. A. (1998). Relationships between learning patterns and attitudes towards two assessment formats. Educational Research, 40(1), 90–98. https://doi.org/10.1080/0013188980400109 

Block, R. M. (2012). A discussion of the effect of open-book and closed-book exams on student achievement in an introductory statistics course. PRIMUS, 22(3), 228–238. https://doi.org/10.1080/10511970.2011.565402 

Johanns, B., Dinkens, A., & Moore, J. (2017). A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice, 27, 89–94. https://doi.org/10.1016/j.nepr.2017.08.018 

Nsor-Ambala, R. (2020). Impact of exam type on exam scores, anxiety, and knowledge retention in a cost and management accounting course. Accounting Education, 29(1), 32–56. https://doi.org/10.1080/09639284.2019.1683871 

Pandey, C., & Kapitanoff, S. (2011). The influence of anxiety and quality of interaction on collaborative test performance. Active Learning in Higher Education, 12(3), 163–174. https://doi.org/10.1177/1469787411415077 

Reimann, N. (2011). To risk or not to risk it: Student (non‐)engagement with seen examination questions. Assessment & Evaluation in Higher Education, 36(3), 263–279. https://doi.org/10.1080/02602930903311716 

Stadler, M., Kolb, N., & Sailer, M. (2021). The right amount of pressure: Implementing time pressure in online exams. Distance Education, 42(2), 219–230. https://doi.org/10.1080/01587919.2021.1911629 

Tempelaar, D. (2020). Supporting the less-adaptive student: The role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education, 45(4), 579–593. https://doi.org/10.1080/02602938.2019.1677855 

Turner, J., & Briggs, G. (2018). To see or not to see? Comparing the effectiveness of examinations and end of module assessments in online distance learning. Assessment & Evaluation in Higher Education, 43(7), 1048–1060. https://doi.org/10.1080/02602938.2018.1428730 

van de Watering, G., Gijbels, D., Dochy, F., & van der Rijt, J. (2008). Students’ assessment preferences, perceptions of assessment and their relationships to study results. Higher Education, 56(6), 645–658. https://doi.org/10.1007/s10734-008-9116-6 

Whalley, W. B. (2021). Using attainment curves and lead-times to help improve student examination performance. Journal of Further and Higher Education, 45(1), 1–15. https://doi.org/10.1080/0309877X.2019.1702152 
