The permitted use of exam aids can decrease examination stress and positively impact student learning and academic performance. The purpose of this study was to develop an exam aid assessment rubric to evaluate the organization of, and integration of course content within, single-page student-generated exam aids. The exam aids served as both a learning and examination preparation tool and as a supportive aid during the final exam of a fourth-year nutrition and disease pathophysiology course. Student-generated exam aids (n=167) were assessed according to two rubrics developed by the research team, yielding two exam aid assessment scores: the exam aid general score and the exam aid content integration score. Subsequently, the relationships between students’ exam aid assessment scores and academic performance (e.g., final exam grade, final grade in the course), learning approach (namely deep and surface approaches, motives, and strategies) and Perceived Stress Scale (PSS) scores were determined using online survey data collected at the end of the academic semester, prior to the final exam. Higher exam aid assessment scores were positively correlated with higher academic performance outcomes, including both final exam grade and overall final grade in the course (P<0.05). Students’ surface learning approach scores and surface learning motive scores were both inversely correlated with their exam aid content integration score (P<0.05), whereas PSS scores were negatively correlated with final grade, exam aid general score, and exam aid content integration score (P<0.05). Conversely, PSS scores were positively correlated with surface learning approach and surface learning motive scores (P<0.05). These findings demonstrate the negative impact of stress on academic performance, engagement in examination preparation activities, and learning approach.
Collectively, this study demonstrates the positive impact of student-generated exam aids but highlights that their potential benefits for student learning and academic performance are limited in students experiencing higher stress levels and/or employing surface learning approaches.
Evaluation and assessment, namely through exam testing, are common elements in most undergraduate-level courses, particularly in the basic sciences 1. Common sources of academic stress and/or examination-associated stress include grade competition, the perceived need for content memorization, feeling overwhelmed by the academic workload, and challenges in meeting deadlines 2, 3, 4. These stressors could be attributable to student test aversions, fear of negative evaluation, or could be a consequence of ineffective study skills 5. Furthermore, there is pressure on students to achieve a high grade point average (GPA), given that GPA is utilized in applicant screening and influences overall employment success in some disciplines 6. Research has shown that highly evaluative classrooms can decrease student motivation and consequently decrease performance 7. Not unexpectedly, many studies have shown that undergraduate students experience a high degree of academic anxiety and/or exam-associated stress or anxiety 8, 9, 10, which can reduce students’ motivation to learn and their academic achievement 11, 12, 13, 14, 15, 16, 17, 18. Experiencing high exam stress can hinder students’ ability to recall prior knowledge and impair academic performance, whereas students with lower levels of exam stress have been shown to more readily retrieve the information necessary for academic success 5. Thus, identifying strategies that help alleviate exam-associated stress or anxiety can promote students’ academic performance. In this connection, permitting students to utilize an authorized exam aid may represent a useful strategy to alleviate academic stress or anxiety, particularly in high-stakes academic settings such as formal exams. The use of authorized exam aids has previously been shown to help alleviate the stress or anxiety students may experience while writing exams 19, 20, 21, 22, 23, 24.
By permitting students to create exam aids for use during their exams, instructors can provide a beneficial study method for students with less effective study habits. Notably, exams permitting the use of student-generated aids have been shown to be met with higher levels of student preparation 25. From an instructor perspective, another benefit of permitting students to use an exam aid may be a reduction in incidents of cheating 26, 29. From the student perspective, being permitted to have an exam aid was perceived as useful for overall learning and retention of information, and as decreasing reliance on memorization 7, 20, 26. Consequently, the authorization of an exam aid can improve test performance 20 and provide students with a feeling of security during the exam 21.
The impact of examination-associated stress or anxiety extends beyond student discomfort. Exam stress and/or anxiety can directly hinder student learning and academic performance, including through reduced concentration that results in deficits in working memory and cognition 24. In this connection, students experiencing high exam anxiety have been shown to exhibit lower academic performance on examinations 27 and lower motivation to learn 12, 13, 14, 17. Nsor-Ambala (2020) found that exam stress differed based on the type of exam; specifically, using an exam aid decreased pre-exam anxiety 25. The process of creating an exam aid was shown to benefit students in their exam preparation by encouraging them to review and organize the course content in a logical manner 20, 24, 28. This approach has been shown to help students prioritize course content and reduce reliance on rote memorization, while promoting problem-solving and critical thinking 21, 22, 29, 30. In contrast, open-book exams can provide students with a sense of security, and previous research has shown that without adequate preparation, open-book exams do not improve exam performance 31. Therefore, open-book exams and authorized exam aids have different outcomes on student academic performance, which may reflect the benefit of the academic process students use to create an exam aid, contrasted with simply consulting instructor-provided course notes during an open-book exam. When allowed an exam aid with limited space, students are obligated to organize and prioritize information in ways that benefit them when writing the exam. With this approach to exam aid preparation, students have the flexibility to tailor the information included on the exam aid to their individual needs 31. How useful an exam aid is to a student can also depend on the types of questions on the examination.
Specifically, the use of exam aids can help students with some types of comprehension questions on exams 32; however, they have been shown to be less effective in helping students with actual concept learning 20, 30, 33. Moreover, exam aids can be beneficial for students writing examinations that emphasize a deeper understanding of and engagement with the course content, such as those utilizing application- or analysis-based questions 7. In this connection, the process of creating an exam aid can itself be a beneficial study tool 7. Consequently, students’ test performance has been shown to improve with exam aids 20, 33, 34, 35, 36 with no negative impact on knowledge retention 37, although some studies have reported conflicting findings or no effect 32, 38, 39, 40, 41. Therefore, this study had two objectives. The first was to develop an exam aid assessment rubric to evaluate students’ organizational approach toward creating an exam aid (one 8.5” x 11” piece of paper). The second was to determine the relationships between the exam aid assessment parameters and students’ academic performance, stress levels, and learning approach (namely deep and surface learning approaches).
Participants (n=167) were undergraduate students enrolled in a fourth-year nutrition and chronic disease pathophysiology course at a medium-sized research-intensive university. The final exam for the course was a two-hour examination comprising 90 multiple choice questions worth 35% of students’ final grade. Students were informed for the first time at the start of week 11 of the 12-week semester that they would be permitted to use an exam aid on the final exam. Students were instructed to use the exam aid to support their understanding of the dietary fibre and microbiome content, i.e., types of i) dietary fibres, ii) glycosidic bonds, iii) enzymes responsible for glycosidic bond hydrolysis, and iv) bacteria species responsible for fibre fermentation. Students were free to decide what additional course content to include on the exam aid, which had to comply with the following criteria: i) the exam aid was to be a maximum size of one 8.5” x 11” sheet of paper, ii) both sides of the paper could be used with no margin or layout restrictions, and iii) the content could be prepared on a computer and printed or could be handwritten. Students were instructed that exam aids would be handed in with their final exam papers and that during the examination all exam aids would be checked for compliance with the formatting instructions. Any exam aids that did not comply with the instructions would be confiscated during the exam; there were no compliance issues.
The research team developed two exam aid assessment methods and accompanying rubrics. The exam aid general score was based on the visual organization of the content, the inclusion of visual aids (e.g., diagrams, pathways, charts, summary tables, etc.), and the information density and/or space utilization of the exam aid (assessment rubric shown in Table 1). Each of these three criteria was assigned an average score out of 3, for a combined exam aid general score out of 9. The exam aid content integration score (Table 2) focused on the specific course topics that students were instructed to include on their exam aid, namely the dietary fibre and microbiome content described above. Furthermore, students were informed during lecture that this information would be tested on the final exam in an integrative manner rather than through simple recall. This is relevant given that students are unlikely to engage in meta-cognitive processes, and thus form deeper cognitive connections, unless explicitly instructed to do so 43. The rubric for the exam aid content integration score consisted of several criteria, namely inclusion of the assigned content, visual organization of the content, information accuracy and organization, integration of concepts, and evidence of concept comprehension. Each criterion was scored on a 4-level scale, with 1 being the lowest and 4 the highest level. The maximum exam aid content integration score was 24, which was used to indicate a student’s ability to effectively convey their understanding of key concepts as well as their ability to integrate those concepts. Student-generated exam aids were independently scored by two members of the research team.
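As a concrete illustration, the two rubric scores described above can be computed from per-criterion ratings roughly as follows. The criterion names, dictionary structure, and the step of averaging the two independent raters' scores are hypothetical conveniences for this sketch, not the research team's documented procedure.

```python
# Sketch of combining two raters' rubric ratings into the two exam aid
# scores described above. Criterion keys and the two-rater averaging
# step are illustrative assumptions.

def exam_aid_general_score(rater1, rater2):
    """Three criteria, each rated out of 3; per-criterion scores are
    averaged across the two raters and summed for a score out of 9."""
    criteria = ["organization", "visual_aids", "density"]
    return sum((rater1[c] + rater2[c]) / 2 for c in criteria)

def content_integration_score(rater1, rater2):
    """Six criteria, each rated on a 1-4 scale; averaged across raters
    and summed for a maximum score of 24."""
    criteria = ["assigned_content", "visual_organization", "accuracy",
                "organization", "integration", "comprehension"]
    return sum((rater1[c] + rater2[c]) / 2 for c in criteria)
```

For example, two raters who each award the maximum on every criterion would yield a general score of 9 and a content integration score of 24.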
The aforementioned dietary fibre and microbiome course content was also used to generate the final exam content integration questions, which was a group of nine questions on the final exam that utilized the assigned content to be included on each student’s exam aid. These data are presented as the percentage of these questions that students answered correctly on the final exam. Additional academic performance indices included students’ grade on the final exam and final grade in the course, both of which are presented as percentages.
2.3. Online Survey
During week 10 of the semester, students were invited to participate in an optional online survey (Qualtrics Insight Platform, Provo, UT, USA) using a private link sent to their university email address. The online survey consisted of i) the Revised Two-Factor Study Process Questionnaire (R-SPQ-2F), a validated scale that measures learning approach (i.e., both deep and surface learning approaches) 44, ii) the Perceived Stress Scale (PSS), a validated scale that measures overall perceived stress from both academic and non-academic sources 45, and iii) researcher-generated questions pertaining to the frequency of experiencing stress and/or anxiety that have been used previously 47, 48. Participants gave their informed consent to participate in the study, which was approved by the institutional Research Ethics Board.
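For readers reproducing the survey scoring, the R-SPQ-2F subscales can be tallied as below. The item-to-subscale mapping shown is the commonly published one for the 20-item instrument and should be verified against the original scale before use; responses are assumed to be 1-5 Likert ratings.

```python
# Sketch of R-SPQ-2F subscale scoring. The item-number mapping is the
# commonly published layout of the 20-item instrument (an assumption to
# verify against the original scale); responses are 1-5 ratings.

DEEP_MOTIVE = [1, 5, 9, 13, 17]
DEEP_STRATEGY = [2, 6, 10, 14, 18]
SURFACE_MOTIVE = [3, 7, 11, 15, 19]
SURFACE_STRATEGY = [4, 8, 12, 16, 20]

def score_rspq2f(responses):
    """responses: dict mapping item number (1-20) to a 1-5 rating.
    Returns the four subscale scores and the two approach totals."""
    def subscale(items):
        return sum(responses[i] for i in items)
    scores = {
        "deep_motive": subscale(DEEP_MOTIVE),
        "deep_strategy": subscale(DEEP_STRATEGY),
        "surface_motive": subscale(SURFACE_MOTIVE),
        "surface_strategy": subscale(SURFACE_STRATEGY),
    }
    scores["deep_approach"] = scores["deep_motive"] + scores["deep_strategy"]
    scores["surface_approach"] = scores["surface_motive"] + scores["surface_strategy"]
    return scores
```

Each subscale thus ranges from 5 to 25, and each approach total from 10 to 50.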
2.4. Statistical Analysis
Statistical analysis was conducted using GraphPad Prism (San Diego, CA, USA). For all data, the predefined upper limit of probability for statistical significance was P≤0.05. Pearson correlation analyses were conducted to determine the relationships between parameters, and the corresponding correlation coefficients (r) and P-values are shown.
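The Pearson correlation coefficient reported throughout can be reproduced in a few lines of standard-library Python; the variable names are placeholders, and the P-values reported herein came from GraphPad Prism rather than this sketch.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A perfectly linear positive relationship yields r = 1.0, a perfectly inverse one r = -1.0, with observed correlations (e.g., r=0.395 between the two exam aid scores) falling in between.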
The distribution of students’ exam aid general scores is shown in Figure 1A. The majority of students’ exam aid general scores were in the range of 5-8 out of 9, wherein 54.5% of students had an exam aid general score of 5 or 6 and 40.7% of students had an exam aid general score of 7 or 8. Additionally, the fibre and microbiome content of the course was utilized to generate an exam aid content integration score; specifically, students were instructed to include this course content on their exam aids and then add any additional content that they chose. The distribution of students’ exam aid content integration scores is shown in Figure 1B. A small proportion of students (7.8%) had a very low exam aid content integration score (≤4 out of 24), whereas 53.9% of students had exam aid content integration scores between 15-19 out of 24, and 25.7% of students had a score between 20-24 out of 24. Correlation analyses determined a positive relationship between students’ exam aid general score and content integration score (r=0.395, P<0.001).
The academic performance indices assessed in this study included the number of integrative concept questions students answered correctly on the final exam, the final exam grade, and the overall grade in the course. The distribution of students’ grades within each of the three academic performance indices is shown in Figure 2. The average overall final grade in the course was 80.1% (Figure 2A), whereas the average grade on the final exam was comparatively lower at 73.3% (Figure 2B). The integrative concept questions on the final exam (nine questions total) required students to utilize information from multiple concepts within the course to answer each question correctly. The distribution of students’ grades on the integrative content questions on the final exam is shown in Figure 2C, wherein the average grade on this aspect of the final exam was 76.7%.
The relationships between each of the exam aid assessment parameters, namely the exam aid general score and the exam aid content integration score, and the academic performance indices are shown in Table 3. Independent positive correlations were determined between both the exam aid general score and the exam aid content integration score and all three academic indices, namely students’ final exam integrative concept question grade, final exam grade, and overall final grade in the course (P<0.05). Collectively, these data indicate that students who utilized a more organized and integrative approach in constructing their exam aid exhibited higher academic performance in the course.
Students’ learning approach (deep and surface learning approach) scores were correlated with academic performance indices, as shown in Table 4. There were no relationships observed between any surface learning approach score (i.e., total score, surface motive, or surface strategy score) and any academic performance outcome assessed in the course (P>0.05). Conversely, the deep learning approach score was positively correlated with students’ final exam grade (r=0.131, P=0.046), indicating that students utilizing a deeper learning approach earned higher grades on the final exam. Among the deep learning approach subscales, the deep motive score, which reflects students’ intrinsic motivation or curiosity to satisfy their need for knowledge and understanding, showed a modest positive correlation with final exam grade (r=0.124, P=0.055). Furthermore, the deep strategy score, which reflects higher cognitive engagement with the course content in students employing this learning strategy, was positively correlated with students’ final grade in the course (r=0.125, P=0.050). There were no other significant relationships between the deep learning total score, deep strategy, or deep motive sub-scale score and any other academic performance outcome assessed (P>0.05; Table 4).
Surface learning scores reflect learning approaches dominated by fear of failure, repetitive memorization techniques (i.e., rote learning), and reproducing knowledge within narrow targets 44, 49. The surface learning score exhibited no relationship with any academic performance index; however, surface motive scores were negatively correlated with the number of integrative concept questions answered correctly on the final exam (r = -0.165, P=0.017). This reflects the nature of surface motive learning approaches, which are aimed at content memorization and reinforced by punishment or reward sources of motivation, versus the deeper understanding of concepts that is associated with knowledge integration 44, 49. There were no other significant relationships between surface learning total scores, surface strategy, or surface motive sub-scale scores and any other academic performance indices (Table 4).
The relationships between learning approach and exam aid assessment parameters are also shown in Table 4. A significant negative relationship was observed between students’ surface learning approach score and the exam aid content integration score (r = -0.139, P=0.036), whereas the deep learning approach score exhibited a trend toward a positive correlation with the exam aid content integration score (r = 0.121, P=0.059) but did not reach the cutoff for statistical significance. This reflects how the differing learning approaches adopted by students can impact their approach to organizing and integrating course concepts, which is required for a deep understanding of the content 44, 49. With respect to the learning approach strategy and motive subscales, the surface motive score was negatively correlated with the exam aid content integration score (r = -0.206, P=0.004), which reflects core surface learning strategies aimed at memorization and knowledge reproduction. There were no other significant relationships observed between any deep or surface learning approach scores (total score, motive score, or strategy score) and any other exam aid assessment parameter (P>0.05).
3.4. Relationship Between Student Stress Parameters, Exam Aid Assessment Parameters, Academic Performance Indices and Learning Approach
Student stress parameters assessed near the end of the semester, prior to the start of the two-week final examination period, are shown in Figure 3. Specifically, the distribution of students’ perceived stress levels from all sources (academic and non-academic sources combined) showed that the majority of students (61.7%) reported PSS scores reflective of high stress (PSS score ≥27), whereas 37.7% of students reported PSS scores reflective of moderate stress (PSS score of 14-26) and 0.6% of students reported PSS scores reflective of low stress (PSS score of 0-13) (Figure 3A). Not surprisingly, academic sources were identified as the major source of stress by 67.7% of students (Figure 3B). In terms of frequency, 27.1% of students reported experiencing academic stress 2-3 times per week, 29.5% reported 4-6 times per week, and 33.7% reported experiencing academic stress daily (7 times/week) (Figure 3C). Students also rated the intensity of their daily academic stress, wherein 34.4% reported a manageable level of stress, whereas 60.2% of students reported their daily academic stress to be quite or extremely stressful (Figure 3D).
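The PSS bands used above (0-13 low, 14-26 moderate, ≥27 high) can be expressed as a simple classifier. The function name is illustrative, and the sketch assumes the 10-item version of the scale, whose total ranges 0-40.

```python
# Band a PSS total into the low/moderate/high ranges used above.
# Assumes the 10-item PSS (total score 0-40); name is illustrative.

def pss_category(score):
    if score <= 13:
        return "low"
    if score <= 26:
        return "moderate"
    return "high"
```

Under this banding, the boundary scores 13, 14, and 27 fall into the low, moderate, and high categories, respectively.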
The relationships between students’ stress parameters and academic performance are shown in Table 5. There was a significant negative correlation between students’ PSS score and final grade in the course, indicating that students experiencing higher overall stress levels exhibited lower overall final grades. A similar trend was observed between students’ PSS scores and final exam grade; however, it did not reach statistical significance (r = -0.122, P=0.058). There were no significant relationships between students’ integrative concept question grade on the final exam and any stress parameter assessed (P>0.05). Furthermore, there were no significant relationships between students’ academic stress frequency or academic stress intensity rating and any academic performance index (P>0.05).
The relationships between student stress parameters and exam aid assessment parameters are also shown in Table 5. There were moderate negative correlations between the PSS score and both the exam aid content integration score and the exam aid general score (P<0.05), indicating that students experiencing higher overall stress levels generated exam aids exhibiting lower levels of organization and integration of concepts, characteristics that underpin the ease of utilizing the exam aid. There were no relationships between students’ academic stress frequency or academic stress intensity and any exam aid assessment parameter (P>0.05). Collectively, these findings likely reflect the challenges of cognitive processing experienced during periods of stress 5, 24, 27.
Surface learning approach score was modestly positively correlated with students’ PSS score (r=0.149, P=0.027; Table 5), indicating that students who preferred surface learning were experiencing higher stress levels from both academic and non-academic sources. In this connection, the surface motive score was also positively correlated with the PSS score (r=0.183, P=0.009), indicating that students favoring surface motives in their learning approach were experiencing higher stress levels. There were no significant relationships observed between surface learning approach score or the surface motive or surface strategy scores with any other stress parameter, namely PSS score or students’ self-reported academic stress frequency or academic stress intensity rating (P>0.05, Table 5). Similarly, there were no significant relationships observed between deep learning approach scores and any student stress parameter (P>0.05, Table 5).
Permitting students to use self-generated exam aids during an examination may represent a tool to promote deeper learning of course content, facilitate exam preparation, and reduce exam-associated stress. However, students utilize different approaches to generate their exam aids, which may influence the utility of the exam aid. Therefore, the current study sought to develop exam aid assessment rubrics that captured the overall design approach utilized by students via the exam aid general score (organization, inclusion of visual aids, and utilization of the permitted exam aid space; Table 1) and via the exam aid content integration score, which focused specifically on the dietary fibre and microbiome content on the exam aid (inclusion, visual organization, information accuracy and organization, and evidence of integration and comprehension of concepts; Table 2). Subsequently, our secondary objective was to determine the relationships between the exam aid assessment parameters and students’ academic performance, learning approach, and stress levels. Our findings demonstrated that both the exam aid general score and content integration score were positively correlated with multiple academic parameters, including students’ grade on specific integrative concept questions on the final exam, overall final exam grade, and final grade in the course (Table 3). Moreover, the exam aid content integration score was inversely correlated with surface learning approach scores (Table 4), indicating that students who developed exam aids demonstrating limited evidence of integration and comprehension of course concepts scored higher on both surface learning and surface motive scales, which are reflective of limited concept comprehension, increased memorization, and reduced engagement 44, 49.
Furthermore, perceived stress levels were positively correlated with surface motive scores (Table 5), indicating that the higher the level of stress a student experienced, the more prevalent their surface learning motives. This may reflect a coping mechanism associated with balancing academic requirements with daily stress. Although there was no relationship between deep learning approach scores and the exam aid assessment indices, the overall deep learning approach score and the deep strategy score were positively correlated with final exam grade and final grade in the course, respectively (Table 4). Finally, students’ overall perceived daily stress levels were negatively correlated with final grade in the course and both exam aid assessment scores (general score and content integration score), indicating a potentially lower level of engagement with exam aid development during periods of high stress (Table 5). Collectively, these data demonstrate the utility of exam aids as an instructional approach to promote exam preparation and limit academic stress, thereby supporting student learning and academic success.
Previous research has demonstrated that students spend more time studying for exam aid-assisted examinations than for other exam formats without an exam aid (both closed-book and open-book assessment types) 25. Open-book exams differ from an exam aid-assisted exam format, as students preparing an exam aid must comply with the restrictions of the exam aid parameters and determine what information to include or exclude 5, 31. Furthermore, limited space encourages students to engage in an iterative organizational process in developing their exam aid, which can encourage more engagement with the course material and increased preparation for the final examination 56, 57. Previous research has shown that the exam aid paper size impacts students’ perceptions of the necessity to study for the exam 25, as open-book exams permit multiple pages of resources whereas an exam aid has space limitations. In this connection, restricting the exam aid paper size has been shown to discourage direct copying of course notes and to encourage students to make evaluative judgements about the course content, including only necessary information and using the available exam aid space effectively 31. Collectively, the process of generating an exam aid can help students prepare for the examination and can promote academic performance, as students permitted to use an exam aid have been shown to perform better than on closed-book examinations 25. In this connection, in the current study, academic performance indices, namely correctly answered content integration questions, final exam grade, and final grade in the course, were all positively correlated with both the exam aid general score and the exam aid content integration score (Table 3). Moreover, deep learning approach was also positively associated with higher final exam grades (Table 4), which reflects higher cognitive engagement and comprehension of concepts 44.
Collectively, this highlights the value of utilizing an exam aid to foster deeper learning in addition to improving academic performance.
Multiple criteria were assessed in the evaluation of students’ exam aids: the exam aid general score captured the general design and organizational approach, whereas the exam aid content integration score captured how students incorporated the specific information they were instructed to include, encompassing information organization, accuracy, concept integration and comprehension, and visual organization/mapping of concepts. Collectively, these criteria can reflect students’ engagement and learning approach, but they also directly influence the utility of the exam aid in supporting academic performance. Exam aid organization permits efficient access to information, which is a critical factor underpinning the utility of the exam aid to support academic performance, particularly within the context of a higher-stress examination environment, wherein students who develop more organized exam aids have been shown to score higher on assessments 58. Information accuracy on the exam aid has previously been shown to positively impact academic performance 59. Using knowledge integration activities, such as developing an exam aid, extends student learning and has been found to improve understanding of course concepts 43. Additionally, providing students with creative opportunities, such as developing an exam aid, can encourage higher-level learning and thus improve academic performance 32. Knowledge integration approaches utilized in the development of the exam aid, in particular via concept mapping (as assessed in the exam aid rubrics), provide students with the opportunity to engage further with the course content and can promote knowledge retention and enhance student learning 43.
Furthermore, employing an integrative approach when designing an exam aid can encourage deeper learning by compelling students to reflect upon their understanding and identify concepts with lower comprehension 43, thereby providing another opportunity for evaluative judgement regarding their own understanding of concepts. In this connection, students with lower exam aid content integration scores had higher surface learning and surface motive scores (Table 4), which reflects learning approaches centered on repetitive memorization techniques, limited knowledge application capabilities, and reduced engagement 44, 49. Exam aids have been shown to benefit students on application-based assessments 7, which was also apparent in the current study, where higher exam aid general scores and content integration scores were associated with higher grades on the content integration questions on the final exam and with final exam grades (Table 3); however, this benefit depends on both the students’ approach to constructing their exam aid and their overall learning approach and learning motives. In this connection, higher deep learning scores were associated with higher final exam grades (Table 4), which aligns with previous research 60, wherein deeper learning approaches require integration and comprehension of concepts that promote positive academic outcomes 61. Therefore, instructors should emphasize how to construct an exam aid that promotes deeper learning and organizational and content integrative approaches, to better support students who do not intuitively take this approach.
Previous research has shown that the use of exam aids can reduce students’ test anxiety and/or stress 7, 61. The negative relationship between student stress and academic performance is well documented and can begin prior to undergraduate education, impacting multiple types of assessments including written examinations, practical skill assessments, and oral examinations 63, 64, 65, 66. Not unexpectedly, this relationship was confirmed in the current study, wherein students’ perceived stress levels were inversely related to their final grade (Table 5). Our findings also showed a negative relationship between students’ perceived stress levels and both their exam aid general score and content integration score (Table 5). Moreover, students’ perceived stress levels were positively correlated with surface motive learning approach scores (Table 5), and both surface learning approach scores and surface motive scores were also inversely correlated with exam aid content integration scores (Table 4). Collectively, these results indicate that students experiencing higher stress levels employed a surface learning approach and/or surface motives for learning, while concomitantly developing exam aids that also reflected those surface learning approaches. Consequently, the exam aid, which was intended to help alleviate academic stress and promote deeper learning of the course concepts, was less successful for these students compared to individuals who utilized a deeper learning approach. This highlights the need to not only provide the option of an exam aid, but also to encourage and instruct students in the use of deeper learning approaches when generating an exam aid. Research has shown that instructional approaches emphasizing content understanding, with less emphasis on rote memorization, can alleviate student stress; moreover, these approaches are frequently assessed with knowledge application questions, which could be optimally assisted by the inclusion of an exam aid 7.
To adequately support students in developing exam aids that capture all relevant criteria, which is particularly useful for students who employ surface learning and surface motive approaches, instructors could formally assess the exam aid prior to the final exam using the rubrics provided herein and give students personalized feedback. Alternatively, demonstration exam aids could be used to illustrate how the approaches outlined in the rubrics for optimal exam aid development promote academic success on the exam.
There were some notable limitations in the current study. The evaluation of students’ exam aids was subjective and based on the evaluators’ perspective. For example, criteria that the research team scored lower may nevertheless have been beneficial and optimally designed for the individual student using that exam aid to support their specific needs on the examination. Although this may have occurred for some exam aids, statistically significant relationships between the exam aid assessment criteria and academic performance indices, learning approach, and perceived stress levels were still observed overall. Additionally, the current study identified associations only and did not establish causative relationships. However, given previous findings (and the current study) linking exam aid use to improved academic performance [25, 58, 59] and reduced examination stress [7, 61], there would be ethical concerns in denying some students the use of an exam aid while permitting it for others in a subsequent study designed to determine causation. Finally, the current study did not assess students’ perspectives on the use of exam aids; future studies should collect direct user perspectives both before and after completing an exam-aid-assisted examination.
Collectively, the current study demonstrated critical relationships between the exam aid content assessments (both the general score and the content integration score) and students’ academic performance, learning approach, and perceived stress, highlighting the benefits of permitting an exam aid to promote deeper learning of course concepts during examination preparation. Additionally, the exam aid assessment rubrics can be utilized by other researchers and instructors and adapted to specific course contexts where applicable. Providing students with guidance on how to develop a more effective exam aid, based on the assessment criteria, can facilitate deeper learning and comprehension of course material during exam preparation and may also help alleviate exam-associated stress, which has been shown to hinder student recall during tests [63]. The true benefit of exam aids likely lies in the iterative process of developing one; however, future studies are required to capture the student experience, which was not assessed herein. The current study provides proof-of-principle for the benefits of permitting an exam aid on students’ academic performance outcomes and stress experience, and provides exam aid assessment rubrics that could guide the development of more effective exam aids.
Statement of Competing Interests. The authors have no conflicts of interest to disclose.
Abbreviations. Perceived Stress Scale (PSS); Revised Two-Factor Study Process Questionnaire (R-SPQ-2F).
[1] May RW, Casazza SP. Academic Major as a Perceived Stress Indicator: Extending Stress Management Intervention. College Student Journal. (2012); 46(2): 264–73.
[2] Reisberg L. Student stress is rising, especially among young women. Chronicle of Higher Education. (2000); 46(21): 49–50.
[3] Misra R, McKean M, West S, Russo T. Academic stress of college students: Comparison of student and faculty perceptions. College Student Journal. (2000); 34(2): 236–45.
[4] Abouserie R. Sources and Levels of Stress in Relation to Locus of Control and Self Esteem in University Students. Educational Psychology. (1994); 14(3): 323–30.
[5] Hembree R. Correlates, Causes, Effects, and Treatment of Test Anxiety. Review of Educational Research. (1988); 58(1): 47–77.
[6] Imose R, Barber LK. Using undergraduate grade point average as a selection tool: A synthesis of the literature. Psychologist-Manager Journal. (2015); 18(1): 1–11. 10.1037/mgr0000025.
[7] Vogelweid CM, Kitchel T, Rice AH. Veterinary Students’ Use of Crib Sheets in Preparing for Learning and Reducing Stress. NACTA Journal. (2014); 58(1–4): 137–41.
[8] Krispenz A, Dickhäuser O. Effects of an inquiry-based short intervention on state test anxiety in comparison to alternative coping strategies. Frontiers in Psychology. (2018); 9: 201.
[9] Hyseni Duraku Z, Hoxha L. Self-esteem, study skills, self-concept, social support, psychological distress, and coping mechanism effects on test anxiety and academic performance. Health Psychology Open. (2018); 5(2).
[10] Yıldırım FB, Demir A, Barutçu F. Education and School Psychology Self-Handicapping Among University Students: The Role of Procrastination, Test Anxiety, Self-Esteem, and Self-Compassion. Psychological Reports. (2020); 123(3): 825–43.
[11] Monk JM, Beauchamp DM, von Holt RK, Van K. Effectiveness of Literature Critique Peer Discussions to Build Scientific Literacy Skills, Engagement and Improve Learning-Related Emotions during COVID-19-Associated Online Learning. American Journal of Educational Research. (2023); 11(5): 303–15.
[12] Barrows J, Dunn S, Lloyd CA. Anxiety, Self-Efficacy, and College Exam Grades. Universal Journal of Educational Research. (2013); 1(3): 204–8.
[13] Chapell MS, Benjamin Blanding Z, Takahashi M, Silverstein ME, Newman B, Gubi A, et al. Test anxiety and academic performance in undergraduate and graduate students. Journal of Educational Psychology. (2005); 97(2): 268–74.
[14] England BJ, Brigati JR, Schussler EE. Student anxiety in introductory biology classrooms: Perceptions about active learning and persistence in the major. PLoS ONE. (2017); 12(8): 1–17.
[15] Gloria CT, Steinhardt MA. Relationships among Positive Emotions, Coping, Resilience and Mental Health. Stress and Health. (2016); 32(2): 145–56.
[16] Kahu ER, Nelson K. Student engagement in the educational interface: understanding the mechanisms of student success. Higher Education Research and Development. (2018); 37(1): 58–71. 10.1080/07294360.2017.1344197.
[17] Hancock DR. Effects of Test Anxiety and Evaluative Threat on Students’ Achievement and Motivation. The Journal of Educational Research. (2001); 94(5): 284–90.
[18] Rogowska AM, Kusnierz C, Bokszczanin A. Intensity of physical activity and depressive symptoms in college students: Fitness improvement tactics in youth (fityou) project. Psychology Research and Behavior Management. (2020); 13: 797–811.
[19] Hamouda S, Shaffer CA. Crib sheets and exam performance in a data structures course. Computer Science Education. (2016); 26(1): 1–26. 10.1080/08993408.2016.1140427.
[20] Dickson KL, Bauer JJ. Do Students Learn Course Material during Crib Sheet Construction? Teaching of Psychology. (2008); 35(2): 117–20.
[21] Drake VK, Freed P, Hunter JM. Crib sheets or security blankets? Issues in Mental Health Nursing. (1998); 9(3): 291–300.
[22] Johanns B, Dinkens A, Moore J. A systematic review comparing open-book and closed-book examinations: Evaluating effects on development of critical thinking skills. Nurse Education in Practice. (2017); 27: 89–94. 10.1016/j.nepr.2017.08.018.
[23] Yu B, Tsiknis G, Allen M. Turning exams into a learning experience. SIGCSE’10 - Proceedings of the 41st ACM Technical Symposium on Computer Science Education. (2010); 336–40.
[24] Rice AH, Vogelweid CM, Kitchel T. The Influence of Crib Sheets on Veterinary Students Exam Performance, Perceived Stress, and Retention of Subject Matter Knowledge. NACTA Journal. (2017); 61(1): 66–72.
[25] Nsor-Ambala R. Impact of exam type on exam scores, anxiety, and knowledge retention in a cost and management accounting course. Accounting Education. (2020); 29(1): 32–56. 10.1080/09639284.2019.1683871.
[26] Feldhusen JF. An evaluation of college students’ reactions to open book examinations. Educational and Psychological Measurement. (1961); 21(3): 637–46.
[27] Naveh-Benjamin M, McKeachie WJ, Lin YG, Holinger DP. Test anxiety: Deficits in information processing. Journal of Educational Psychology. (1981); 73(6): 816–24.
[28] Weimer M. Crib sheets help students prioritize and organize course content. The Teaching Professor Blog. (2013).
[29] Tussing L. A consideration of the open book examination. Educational and Psychological Measurement. (1951); 11(4): 597–602.
[30] Hindman CD. Crib notes in the classroom: Cheaters never win. Teaching of Psychology. (1980); 7(3): 166–8.
[31] Erbe B. Reducing Test Anxiety While Increasing Learning: The Cheat Sheet. College Teaching. (2007); 55(3): 96–8.
[32] Oliveira AW, Brown AO, Day K, Saviolli R, Campbell SJ, Potkins H, et al. Authorised cheat sheets in undergraduate biology: Using pictographic organisers to foster student creative cognition. Review of Education. (2022); 10(3): 1–26.
[33] Danielian SA, Buswell NT. Do support sheets actually support students? A content analysis of student support sheets for exams. American Society for Engineering Education. (2019).
[34] Yamarik S. Does cooperative learning improve student learning outcomes? Journal of Economic Education. (2007); 38(3): 259–77.
[35] De Raadt M. Student created cheat-sheets in examinations: Impact on student outcomes. Conferences in Research and Practice in Information Technology Series. (2012); 123: 71–6.
[36] Dorsel TN, Cundiff GW. The cheat-sheet: Efficient coding device or indispensable crutch? Journal of Experimental Education. (1979); 48(1): 39–42.
[37] Gharib A, Philips W, Mathews N. Cheat Sheet or Open-Book? A Comparison of the Effects of Exam Types on Performance, Retention, and Anxiety. Journal of Psychology Research. (2012); 2(8): 469–78.
[38] Cannonier C, Smith K. Do crib sheets improve student performance on tests? Evidence from principles of economics. International Review of Economics Education. (2019); 30: 100147. 10.1016/j.iree.2018.08.003.
[39] Dickson L, Miller M. Effect of Crib Card Construction and Use on Exam Performance. Teaching of Psychology. (2006); 33(1): 39–40.
[40] Whitley BEJ. Does “cheating” help? The effect of using authorized crib notes during examinations. College Student Journal. (1996); 30(4): 489–93.
[41] Burns KC. Security Blanket or Crutch? Crib Card Usage Depends on Students’ Abilities. Teaching of Psychology. (2014); 41(1): 66–8.
[42] Eysink THS, de Jong T. Does Instructional Approach Matter? How Elaboration Plays a Crucial Role in Multimedia Learning. Journal of the Learning Sciences. (2012); 21(4): 583–625. 10.1080/10508406.2011.611776.
[43] Tripto J, Ben-Zvi Assaraf O, Snapir Z, Amit M. The ‘What is a system’ reflection interview as a knowledge integration activity for high school students’ understanding of complex systems in human biology. International Journal of Science Education. (2016); 38(4): 564–95. 10.1080/09500693.2016.1150620.
[44] Biggs J, Kember D, Leung DYP. The revised two-factor Study Process Questionnaire: R-SPQ-2F. The British Journal of Educational Psychology. (2001); 71: 133–49.
[45] Cohen S, Kamarck T, Mermelstein R. A Global Measure of Perceived Stress. Journal of Health and Social Behavior. (1983); 24(4): 385–96.
[46] Bieleke M, Gogol K, Goetz T, Daniels L, Pekrun R. The AEQ-S: A Short Version of the Achievement Emotions Questionnaire. Contemporary Educational Psychology. (2020); 65: 101940.
[47] Van K, Beauchamp DM, Rachid H, Mansour M, Buckley B, Choi D, et al. Impact of the COVID-19-induced shift to online dietetics training on PDEP competency acquisition and mental health. Canadian Journal of Dietetic Practice and Research. (2022); 83(3): 144–6.
[48] Beauchamp DM, Monk JM. Effect of Optional Assessments on Student Engagement, Learning Approach, Stress, and Perceptions of Online Learning during COVID-19. International Journal of Higher Education. (2022); 11(5): 87.
[49] Teoh HC, Abdullah MC, Roslan S, Daud SM. Assessing students approaches to learning using a matrix framework in a Malaysian public university. SpringerPlus. (2014); 3(54): 1–11.
[50] Rowe AD, Fitness J, Wood LN. University student and lecturer perceptions of positive emotions in learning. International Journal of Qualitative Studies in Education. (2015); 28(1): 1–20. 10.1080/09518398.2013.847506.
[51] Brubacher MR, Silinda FT. Enjoyment and Not Competence Predicts Academic Persistence for Distance Education Students. International Review of Research in Open and Distributed Learning. (2019); 20(3).
[52] Villavicencio FT, Bernardo ABI. Beyond Math Anxiety: Positive Emotions Predict Mathematics Achievement, Self-Regulation, and Self-Efficacy. Asia-Pacific Education Researcher. (2016); 25(3): 415–22.
[53] De La Fuente J, López-García M, Mariano-Vera M, Martínez-Vicente JM, Zapata L. Personal self-regulation, learning approaches, resilience and test anxiety in psychology students. Estudios Sobre Educacion. (2017); 32: 9–26.
[54] Rozgonjuk D, Kraav T, Mikkor K, Orav-Puurand K, Täht K. Mathematics anxiety among STEM and social sciences students: the roles of mathematics self-efficacy, and deep and surface approach to learning. International Journal of STEM Education. (2020); 7(46).
[55] Putwain D, Sander P, Larkin D. Academic self-efficacy in study-related skills and behaviours: Relations with learning-related emotions and academic success. British Journal of Educational Psychology. (2013); 83(4): 633–50.
[56] Larwin KH, Gorman J, Larwin DA. Assessing the Impact of Testing Aids on Post-Secondary Student Performance: A Meta-Analytic Investigation. Educational Psychology Review. (2013); 25(3): 429–43.
[57] Wachsman Y. AccessEcon. (2002); 1(1): 1–11.
[58] Song Y, Thuente D. A quantitative case study in engineering of the efficacy of quality cheat-sheets. 2015 IEEE Frontiers in Education Conference (FIE). (2015); 1–7.
[59] Colthorpe K, Gray H, Ainscough L, Ernst H. Drivers for authenticity: student approaches and responses to an authentic assessment task. Assessment and Evaluation in Higher Education. (2021); 46(7): 995–1007.
[60] Salamonson Y, Weaver R, Chang S, Koch J, Bhathal R, Khoo C, Wilson I. Learning approaches as predictors of academic performance in first year health and science students. Nurse Education Today. (2013); 33(7): 729–33.
[61] Jensen JL, McDaniel MA, Woodard SM, Kummer TA. Teaching to the Test...or Testing to Teach: Exams Requiring Higher Order Thinking Skills Encourage Greater Conceptual Understanding. Educational Psychology Review. (2014); 26(2): 307–29.
[62] Ozer S. A convergent parallel mixed-method research into the use of the cheat sheet in teacher education: State test anxiety, exam scores and opinions of prospective teachers. The Turkish Online Journal of Educational Technology. (2021); 20(3): 101–13.
[63] Ringeisen T, Lichtenfeld S, Becker S, Minkley N. Stress experience and performance during an oral exam: the role of self-efficacy, threat appraisals, anxiety, and cortisol. Anxiety, Stress, and Coping. (2019); 32(1): 50–66.
[64] Leblanc VR, Bandiera GW. The effects of exam stress on the performance of emergency medicine residents. Medical Education. (2007); 41(6): 556–64.
[65] Roome T, Soan CA. GCSE exam stress: student perceptions of the effects on wellbeing and performance. Pastoral Care in Education. (2019); 37(4): 297–315.
[66] Ng V, Koh D, Chia SE. Examination Stress, Salivary Cortisol, and Academic Performance. Psychological Reports. (2003); 93(3 suppl): 1133–4.
Published with license by Science and Education Publishing, Copyright © 2024 Israa Ihab, Elaina B.K. Brendel, Camille Law, Kelsey Van, Jamie L.A. Martin and Jennifer M. Monk
This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit
https://creativecommons.org/licenses/by/4.0/