Teaching Medical Students History Taking Content: A Systematic Review

American Journal of Educational Research

H Alyami 1, B Su'a 1, F Sundram 2, M Alyami 3, MP Lyndon 4, TC Yu 5, MA Henning 5, AG Hill 1

1South Auckland Clinical Campus, The University of Auckland, Auckland, New Zealand

2Department of Psychological Medicine, The University of Auckland, Auckland, New Zealand

3School of Psychology, Massey University, Auckland, New Zealand

4Ko Awatea – Counties Manukau District Health Board, Auckland, New Zealand

5Centre of Medical and Health Sciences Education, The University of Auckland, Auckland, New Zealand

Abstract

Context: The medical interview is a cornerstone of clinical practice. Teaching medical students how to take a history can be broadly divided into two components: teaching the process of history taking and teaching the content of history taking. While there is growing awareness of how history taking processes can be taught, effective methods for teaching history taking content remain unclear. Objectives: To identify educational interventions targeting history taking content and to determine how they have improved medical students’ performance. Methods: A literature search of Medline, PsycINFO, Embase and ERIC, encompassing 1980 to 2015, was performed independently by two authors. Only studies focusing on improving undergraduate medical students’ history taking content were included. Results: Six articles were included: four randomized controlled trials and two quasi-experimental studies. All interventions were additional to traditional teaching methods. Two studies investigated the use of online video demonstrations, while two other studies examined the use of computer simulation and mannequin-based human patient simulation. One study investigated the use of a virtual clinic platform, and the last used a written, structured history taking pro forma. Outcome measures included Objective Structured Clinical Examinations (OSCE), standardized patient encounters, written tests and case histories. Overall, five of the six studies showed a positive impact on medical student performance. Conclusions: Most studies in this review showed a positive impact on student performance as measured by objective assessments. While the majority utilized electronic learning methods, very few studies focus on educational interventions targeting history taking content.

Cite this article:

  • H Alyami, B Su'a, F Sundram, M Alyami, MP Lyndon, TC Yu, MA Henning, AG Hill. Teaching Medical Students History Taking Content: A Systematic Review. American Journal of Educational Research. Vol. 4, No. 3, 2016, pp 227-233. https://pubs.sciepub.com/education/4/3/2

1. Introduction

The medical interview, or clinical history taking, is the most common clinical task performed by clinicians [1]. However, there is some evidence that medical students perform poorly in communication skills because they are distracted by trying to remember the medical content of history taking. When history taking is done well, it improves patients’ physiological and psychological health outcomes [2], satisfaction [3, 4, 5, 6], symptom relief and adherence [7]. Moreover, in two thirds of medical outpatient referrals, history taking has been found to be more valuable than physical examination or investigation in reaching a diagnosis [8]. Conversely, incomplete history taking is associated with frequent misdiagnoses [9, 10, 11], unnecessary patient treatment and delays in accessing appropriate interventions [10].

Two of the main elements of history taking are history taking content (HTC) and history taking process (HTP) [12]. HTC, commonly referred to as data or information gathering, is concerned with the “what” of the medical interview: eliciting specific information about the patient’s symptomatology, from the presenting complaint to the wider circle of the patient’s social and occupational history. HTP, on the other hand, is the method by which such information is elicited and is thus more concerned with the “how” of the medical interview [12, 13]. A fine balance is needed to combine HTC and HTP in order to acquire adequate clinical information effectively, without considering either one as an “alternative” to the other [13].

Since George Engel introduced the biopsychosocial model in the 1970s, communication skills, and thus HTP, have been viewed as an important part of the medical interview [14, 15]; however, HTP is frequently taught separately from content [16]. As a result, most research has focused on improving communication skills, arguably to the detriment of teaching HTC [3, 17, 18]. This has created a vicious cycle in which medical students become distracted attempting to remember HTC, which in turn negatively impacts their communication skills [19].

1.1. Objectives of Review

The goal of this review was to investigate various methods of teaching HTC and their effectiveness. The review focuses on content rather than the process of clinical history taking and therefore it does not examine teaching programs designed to improve communication skills or rapport.

1.2. Review Questions

1. What are the different methods used in teaching clinical HTC?

2. What are the effects of such teaching methods on medical students’ performance?

3. What are the characteristics of effective programs?

2. Methods

A literature search was conducted in July 2015 using four databases: MEDLINE, EMBASE, PsycINFO and ERIC. These searches used combinations of three groups of search terms: history taking or clinical history or medical interview$; education or curriculum or instruct$ or course$ or intervention$ or training or teach$ or learn$; and medical student$ (where $ denotes truncation).
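
For illustration only, the three term groups might be combined in an Ovid-style search along the following lines (a hypothetical reconstruction of the combination logic, not the authors’ exact syntax):

1. history taking or clinical history or medical interview$

2. education or curriculum or instruct$ or course$ or intervention$ or training or teach$ or learn$

3. medical student$

4. 1 and 2 and 3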

All original research articles, reviews, editorials, essays and conference proceedings were retrieved. Relevant articles were those focusing on teaching HTC and its impact on medical students’ history taking performance. The study authors met regularly during July 2015 to determine the inclusion and exclusion criteria.

Studies were identified by two authors (HA and BS) working independently to increase the integrity of the search strategy. Both authors met and discussed their search results and any disagreement was resolved by consensus. The search for, and selection of, papers for review was completed in July 2015 and data analysis was conducted by all study authors.

2.1. Inclusion Criteria

1. Study participants were undergraduate medical students.

2. Teaching intervention(s) targeting HTC.

3. The comparison of at least two separate and distinct teaching methods.

4. Performance outcomes for all groups were measured by appropriate objective assessment tools.

5. History taking was taught as a specific skill set integrated with the major systems of the undergraduate medical curriculum, such as psychiatry, surgery, cardiology, respiratory medicine and neurology. (It was noted that a systematic review of sexual history taking has recently been conducted [20].)

6. The study was published between 1980 and 2015.

2.2. Exclusion Criteria

1. The article was not published in English.

2. The study outcome results were duplicated in other publications.

3. Studies focusing exclusively on HTP.

2.3. Study Selection

Retrieved citations were screened by two independent authors (HA and BS) for relevance and eligibility. The full texts of potentially relevant articles were reviewed to determine whether inclusion criteria were met. Studies that did not meet the inclusion criteria were excluded, and any discrepancies were resolved by consensus. The screening process was conducted according to PRISMA guidelines [21].

The target population was undergraduate medical students learning HTC. All interventional study designs were considered for this review provided they allowed for a comparison to be made by including at least two teaching methods. Studies included needed to have at least one outcome measure of HTC as demonstrated by the use of objective assessment tools. Objective outcome assessments include Objective Structured Clinical Examinations (OSCE) and written tests. Outcome assessments not considered objective include open-ended interviews, learner satisfaction levels and learner self-assessment.

2.4. Data Extraction

One author (HA) extracted data from each of the identified studies using a pre-prepared data extraction instrument. The data extracted were: study location, type and setting; medical student seniority; number of participants; major system taught; characteristics of the educational interventions (type, who conducted them, number of sessions, duration, and educational tools); randomization and allocation concealment; the objective assessment tool used and its validity and reliability; the time between intervention and assessment; the study’s main and other outcomes; and whether a power calculation was performed.

The search strategy generated 552 articles in total from the four databases: 344 in MEDLINE, 164 in PsycINFO, 10 in EMBASE, and 34 in ERIC. After duplicates were removed, 459 articles were included for title and abstract review. After inclusion and exclusion criteria were applied, a total of 23 articles were included for full-text review. Of these, only six studies were included for analysis (Figure 1).

3. Results

3.1. Study Setting

Three of the studies included in this review were undertaken in the USA, in different medical faculties [22, 23, 24], two studies were from the same medical school in Australia [25, 26], and one study was conducted in Ireland [27]. All educational interventions were conducted as part of medical school programs.

3.2. Study Participants and Professional Disciplines

Participating students ranged from Year 1 to Year 4 of medical school programs. However, the duration of these programs ranged from four years in the American graduate-entry system to six years in the high-school-entry Irish and Australian systems. Two studies targeted first-year medical students in 4-year programs [22, 24], two studies targeted second-year medical students (6-year program) [26, 27], and one each was conducted with third-year (6-year program) [25] and fourth-year medical students (4-year program) [23]. Demographic information such as age and gender was reported in only one study [23]. Overall, HTC teaching had a medical rather than surgical or psychiatric focus. Four papers focused on cardiology, pediatrics, endocrinology and ophthalmology history taking [22, 23, 25, 26]. The other two studies had a more generic medical history taking focus [24, 27].

3.3. Instructional Method

Two studies investigated the use of online video demonstrations [22, 26], while two others investigated the effectiveness of computer-based [24] and mannequin-based human patient [23] simulations. One study investigated the use of a virtual clinic platform [25], and the final study [27] investigated the use of a written, structured history taking pro forma.

3.4. Duration of Intervention and Time to Assessment

For the two online video module studies [22, 26], the intervention duration depended on the online access period provided to the students, which ranged between one and two weeks. The time between intervention and assessment ranged from one to two weeks for both studies. While the video clip duration was 25 minutes in one study [26], no duration was specified for the other study [22]. The intervention duration for the ophthalmology virtual clinic study [25] ranged between three and ten days, but this was extended for both groups to 12 months after the initial assessment to allow a later follow-up assessment of knowledge. While the human patient simulation intervention was conducted in a single one-hour session with assessment undertaken within two weeks, the computer simulation cases were completed over a six-week period with immediate assessment. Only one study did not report comprehensive details about the intervention duration and time to assessment [27].

3.5. Study Outcome Measures

The most common method of data collection was an OSCE score, used in two studies [23, 26]. The other four studies used different objective assessments, including standardized patient scoring [22], computer marking [24], written case histories [27] and a knowledge-based written test [25]. Qualitatively, satisfaction and evaluation questionnaires were used in two studies [25, 26] to assess students’ perceptions of, satisfaction with and attitudes towards the introduced interventions. Only one study assessed patients’ preference for the introduced teaching method compared with standard practice [27]. The validity of the outcome measures was generally unclear; only one study, which used the OSCE as an outcome measure, commented on its validity in that setting [23]. Moreover, it was generally unclear how the marking criteria for the assessments were devised, with only one study reporting the use of a standardized marking sheet based on a clinical skills textbook [27]. Another study relied on expert opinion consensus [26], while the remaining studies did not describe the process of marking criteria development.

3.6. Overall Characteristics of Teaching Interventions

For the purpose of this review, overall characteristics are reported by teaching intervention type: online video demonstrations, simulations, a virtual platform and a written pro forma (see Table 1).

Table 1. Studies Overall Characteristics


3.6.1. Online Video Demonstrations

Two papers investigated online videos as interventions additional to traditional teaching methods, using quasi-experimental and randomized controlled trial designs [22, 26]. Wagner et al. (2011) used a quasi-experimental design to assess the effect of a 25-minute online video, provided in addition to traditional teaching of pediatric history taking, with 88 first-year medical students (4-year program). The outcome was assessed via standardized patient (SP) interviews conducted within 1–2 weeks of intervention completion. The content checklist was completed by a trained SP interviewer, who acted as both patient and marker. HTC scores were significantly higher in the video group than in historical controls (P = 0.0001).

The randomized controlled trial by Hibbert et al. (2013) examined the effect of online videos for teaching diabetes history taking to 12 Year 2 medical students (6-year program) during their endocrinology rotation. The outcome was assessed two weeks post-intervention using an OSCE conducted by clinicians with consensus-derived marking criteria. The intervention group was rated as significantly more competent than controls (P = 0.007). Overall, both studies showed significant improvement in HTC compared with usual teaching.


3.6.2. Simulation

Two randomized controlled trials using simulation were identified: one used computer simulation, while the other used mannequin-based human patient simulation (HPS). Schwartz et al. (2007) investigated the impact of a one-hour HPS session on chest pain history taking in final-year medical students (4-year program), compared with standard case-based learning (CBL). Within two weeks of the intervention, participants’ history taking was assessed using OSCE scores, and HPS was not found to be superior to CBL. However, it was unclear whether the OSCE scores were HTC focused or also included HTP.

Nardone et al. (1987) evaluated computer-simulated cases with feedback as adjuncts to traditional teaching of HTC. Using a randomized controlled design, 78 Year 1 medical students were divided equally between control and intervention groups. Both groups received traditional teaching and completed three computer-simulated case histories over a six-week period. Only the intervention group received detailed computer-generated feedback on their performance, in addition to suggested questions.

Overall, there was a significant (P < 0.05) short-term improvement in student performance (listing, characterization and analysis of symptoms) in both groups after the first case was completed, with the experimental group showing greater improvement in symptom characterization than the control group. Computer-generated feedback was not associated with any additional benefit.


3.6.3. Virtual Platforms

Succar and colleagues conducted a randomized controlled trial with 188 Year 3 medical students (6-year program), who were divided into intervention and control groups. The study evaluated the effect of a virtual ophthalmology clinic on students’ HTC skills. The outcome measures were an early post-intervention written test and a follow-up test 12 months later to assess overall retention of knowledge. A statistically significant improvement was observed in the intervention group compared with controls (P < 0.001), and this improvement persisted at 12 months (P < 0.007).


3.6.4. Written Pro Forma

Using a quasi-experimental design, Morris et al. (2013) compared a written pro forma history taking structure with a standard interview style in 60 Year 2 medical students (6-year program) performing generic history taking. The outcome measure was an HTC score derived from submitted written case histories, which were marked by a blinded assessor. The findings showed that a pro forma-assisted interview was associated with significantly better history taking scores than a standard interview style (P = 0.0017) in the short term.

3.7. Methodological Quality

3.7.1. Study Goals

One study [24] specified HTC improvement as its main objective; the remaining studies explored this under broader objectives such as improvement of learning and clinical skills. All but one study [25] explicitly stated that they were investigating the effect of a teaching intervention compared with traditional teaching methods, with HTC improvement specifically assessed. Two studies [25, 26] included secondary outcomes such as a qualitative evaluation of students’ attitudes or perceptions towards the introduced interventions. Only one study examined patients’ preference for the introduced intervention compared with standard practice [27].


3.7.2. Study Designs

Four of the included studies were randomized controlled trials [23, 24, 25, 26], while two were quasi-experimental studies [22, 27]. While all studies used post-intervention outcome measures, only one study used pre- and post-tests as well as long-term follow-up assessment [25].


3.7.3. Study Quality, Attrition, and Overall Risk of Bias

Two [25, 26] of the four included RCTs had clear randomization procedures, while the other two did not provide sufficient detail. Ophthalmology students [25] were randomized by the local school student coordinator using a randomization list. Endocrinology students [26] were randomized according to their attendance days; the method of allocating students to either of the two days was not detailed. While both studies showed positive outcomes for history taking, they used different outcome measures. Succar et al. [25] assessed short- and long-term knowledge retention, thus addressing knowledge without assessing OSCE performance, while Hibbert et al. assessed OSCE performance only. Therefore, the knowledge improvement in Succar and colleagues’ study may not reflect improvement in clinical skills as assessed by OSCE performance [25]. Similarly, improvement in OSCE performance within two weeks in the Hibbert et al. study might not reflect long-term knowledge improvement a year later [26].

Outcome assessors were blinded to intervention assignment in four studies [22, 23, 26, 27]. The other two studies used computer-based assessments, for which assessor blinding was not applicable. Due to the nature of the interventions, participant blinding was not possible in any of the studies.

Reasons for participant attrition were given in three studies [23, 26, 27]. The other studies either did not report attrition [22, 24] or had no participant dropouts [25]. Intention-to-treat analysis was mentioned in only one trial, which had no attrition [25]. Overall, there was no evidence of incomplete data collection or selective outcome reporting.

4. Discussion

This review aimed to identify educational interventions for teaching HTC and included six studies that specifically evaluated the impact of such interventions on medical students. Five of the six studies showed that the introduced teaching interventions had a positive impact on student history taking performance. Four of these effective interventions used educational technology (electronic and multimedia): two studies utilized online video demonstrations, while two others used an online virtual platform and a computer-based simulation. Although all studies shared a similar overall objective, they varied with regard to duration, curricular content, seniority of students, study design, intervention and outcome measures, which limited their generalizability. However, a number of key findings can be inferred. Firstly, junior medical students appeared to benefit from educational interventions teaching HTC, as demonstrated by all but one study [23]. This positive effect persisted longitudinally at 12 months in one study [25].

Miller’s framework of competency assessment suggests that, when measuring the effect of educational interventions, assessment should progress from testing basic knowledge (“knows”) to testing higher levels of competency such as “knows how” and “shows how” [28]. This is crucial, as history taking is thereby transformed from mere data gathering into the performance of a clinical task that can be measured objectively. Half of the reviewed studies used outcome measures operating at these higher competency assessment levels, with two studies utilizing OSCEs and another using standardized patients. However, only one of these (Hibbert et al.) devised HTC-specific marking criteria for its history taking OSCE station; in another (Schwartz et al.), it was unclear whether OSCE scores targeted HTC only or also incorporated HTP. This is of particular importance, as OSCE scores often assess both HTC and HTP unless HTC-specific marking criteria are devised. The other three studies used written case histories, computer-marked assessments and a knowledge-based written test.

Secondly, the teaching interventions utilized were consistent with blended learning, which combines face-to-face teaching with electronic learning (e-learning) methods. According to Finn and Bucceri, blended learning is defined as “the effective integration of various learning techniques, technologies and delivery modalities to meet specific communication, knowledge sharing and informational needs” [29]. The overall beneficial impact on learning is in line with the current education literature, which supports a blended approach to teaching delivery in a digital world [30, 31, 32, 33]. However, some authors have cautioned against using e-learning as an alternative to traditional teaching methods, emphasizing the complementary relationship between the two approaches [32, 34].

Reasons for preferring to design or use a blended learning system include pedagogical richness, improved on-demand access, social interaction, personal agency, cost effectiveness and ease of revision [35]. This approach is likely to optimize training as medical student numbers continue to rise in the face of growing financial constraints and limited clinical staff time [36, 37]. A possible explanation for the positive effect observed in the four technology-based studies could lie in the modality effect, which capitalizes on utilizing both auditory and visual pathways to enhance memory and reduce cognitive load [38]. Competing visually perceived text and images can increase cognitive load, thus reducing memory consolidation. Conversely, using spoken text in addition to images reduces cognitive load, enhancing memory retrieval and therefore outcomes, such as those observed in the four included studies [38, 39]. Moreover, junior medical students have been shown to prefer multiple learning styles [40], and the fact that scientific curricula rely substantially on didactic teaching as their main pedagogy means that some visual learners could be disadvantaged [41]. Thus, blended learning, which caters for students with multidimensional learning strategies such as auditory and visual, can improve their overall academic performance. In addition, online modules could facilitate individual student learning and revision in a convenient, on-demand learning environment [26, 33, 43], as students generally study alone except before clinical exams, when they practice with colleagues [42].

This systematic review was restricted to literature published in English, which introduces a linguistic selection bias. Moreover, the review did not include grey literature and comprised a small number of studies. Furthermore, heterogeneity of study designs, medical student seniority, interventions and duration is a limitation of this review. Finally, publication bias could have been introduced, as only studies with positive results may have been published. However, to our knowledge, this review is the first of its kind to assess the limited available literature on teaching HTC. It focuses on effective educational approaches that may benefit medical school programs globally, as enhancing history taking is a fundamental aim of good clinical practice.

5. Conclusions

The findings of this systematic review suggest that educational interventions have a positive impact on teaching medical students HTC. Given that the included studies were heterogeneous, more rigorous research is needed to inform a more consistent approach to the teaching of HTC. Future studies could examine educational interventions focusing specifically on HTC.

Key findings suggest that an effective HTC teaching program should incorporate learner-centered, blended teaching approaches utilizing simulation, multimedia and e-learning, capitalizing on technological advances. Despite the general consensus on the importance of both HTC and HTP, content remains an understudied area. Educational researchers are encouraged to address this imbalance so that HTC and HTP can function in a complementary manner, promoting clinical data gathering through better communication.

6. Implications for Future Research

Better-designed trials are needed to assess educational interventions for HTC. Ideally, these studies should be designed as RCTs with clear randomization and allocation concealment and with contamination minimized. The intervention should be compared with the best available traditional teaching methods, such as didactic lectures, while ensuring comparable participant characteristics between groups. Educational interventions should be validated and comparable in content and duration. To date, most studies on HTC have assessed short-term outcomes, so there is a gap in the literature with regard to longer-term benefits. Using validated pre- and post-intervention outcome measures, studies should attempt to measure a number of key domains, including students’ knowledge, attitudes and clinical performance.

Knowledge could be assessed through validated knowledge tests with marking criteria targeting key learning objectives. In terms of affective outcome measures, authors should consider using validated scales to assess key affective elements such as confidence, anxiety and stress levels as well as satisfaction; these could be explored in depth through both quantitative and qualitative research methods. Finally, to ensure that knowledge translates into behavioral change, validated outcome measures such as OSCEs with validated marking criteria would be ideal.

Contributors

All authors were involved in the planning and execution of this study and in the drafting and revising of this manuscript.

Acknowledgements: none.

Funding: none.

Competing interests: We are not aware of any other conflicts of interest.

Ethical approval: not required.

References

[1]  Lipkin MJ, Frankel RM, Beckman HB, Charon R, Fein O. Performing the interview. In: Lipkin MJ, Putnam SM, Lazare A, Carroll JG, Frankel RM, editors. The medical interview. New York: Springer-Verlag; 1995. p. 65-82.
[2]  Stewart MA. Effective physician-patient communication and health outcomes: A review. CMAJ Can Med Assoc J. 1995 May 1;152(9):1423.
[3]  Robbins JA, Bertakis KD, Helms LJ, Azari R, Callahan EJ, Creten DA. The influence of physician practice behaviors on patient satisfaction. Family Medicine-Kansas City. 1993 Jan;25:17.
[4]  Korsch BM, Gozzi EK, Francis V. Gaps in doctor-patient communication I. Doctor-patient interaction and patient satisfaction. Pediatrics. 1968 Nov 1;42(5):855-71.
[5]  Kravitz RL, Callahan EJ, Paterniti D, Antonius D, Dunham M, Lewis CE. Prevalence and Sources of Patients’ Unmet Expectations for Care. Ann Intern Med. 1996 Nov 1;125(9):730-7.
[6]  Uhlmann RF, Carter WB, Inui TS. Fulfillment of patient requests in a general medicine clinic. Am J Public Health. 1984 Mar 1;74(3):257-8.
[7]  Eisenthal S, Emery R, Lazare A, Udin H. ‘Adherence’ and the negotiated approach to patienthood. Arch Gen Psychiatry. 1979 Apr 1;36(4):393-8.
[8]  Hampton JR, Harrison MJ, Mitchell JR, Prichard JS, Seymour C. Relative contributions of history-taking, physical examination, and laboratory investigation to diagnosis and management of medical outpatients. Br Med J. 1975 May 31;2(5969):486.
[9]  Kowalski RG, Claassen J, Kreiter KT, Bates JE, Ostapkovich ND, Connolly ES, Mayer SA. Initial misdiagnosis and outcome after subarachnoid hemorrhage. JAMA. 2004 Feb 18;291(7):866-9.
[10]  Hansen MS, Nogareda GJ, Hutchison SJ. Frequency of and inappropriate treatment of misdiagnosis of acute aortic dissection. Am J Cardiol. 2007 Mar 15;99(6):852-6.
[11]  Smith D, Defalla BA, Chadwick DW. The misdiagnosis of epilepsy and the management of refractory epilepsy in a specialist clinic. QJM. 1999 Jan 1;92(1):15-23.
[12]  Nardone DA, Reuler JB, Girard DE. Teaching history-taking: Where are we? Yale J Biol Med. 1980 Jun;53(3):233.
[13]  Kurtz S, Silverman J, Benson J, Draper J. Marrying content and process in clinical method teaching: Enhancing the Calgary–Cambridge guides. Acad Med. 2003;78(8):802-9.
[14]  Engel GL. The deficiencies of the case presentation as a method of clinical teaching: Another approach. N Engl J Med. 1971 Jan 7;284(1):20-4.
[15]  Engel GL. The biopsychosocial model and the education of health professionals. General hospital psychiatry. 1979 Jul 1;1(2):156-65.
[16]  Alliance for Clinical Education. Guidebook for clerkship directors. 4th ed. Morgenstern BZ, editor. North Syracuse, NY: Gegensatz Press; 2014. 641 p.
[17]  Aspegren K. BEME Guide No. 2: Teaching and learning communication skills in medicine-a review with quality grading of articles. Med Teach. 1999 Jan 1;21(6):563-70.
[18]  Cegala DJ, McNeilis KS, McGee DS. A study of doctors' and patients' perceptions of information processing and communication competence during the medical interview. Health Communication. 1995 Jul 1;7(3):179-203.
[19]  Harding SR, D’Eon MF. Using a Lego-based communications simulation to introduce medical students to patient-centered interviewing. Teach Learn Med. 2001;13(2):130-5.
[20]  Coverdale JH, Balon R, Roberts LW. Teaching sexual history-taking: A systematic review of educational programs. Acad Med. 2011 Dec;86(12):1590-5.
[21]  Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Annals of internal medicine. 2009 Aug 18;151(4):264-9.
[22]  Wagner JA, Pfeiffer CA, Harrington KL. Evaluation of online instruction to improve medical and dental students’ communication and counseling skills. Eval Health Prof. 2011 Sep;34(3):383-97.
[23]  Schwartz LR, Fernandez R, Kouyoumjian SR, Jones KA, Compton S. A randomized comparison trial of case-based learning versus human patient simulation in medical student education. Acad Emerg Med. 2007 Feb;14(2):130-7.
[24]  Nardone DA, Schriner CL, Guyer-Kelley P, Kositch LP. Use of computer simulations to teach history-taking to first-year medical students. Academic Medicine. 1987 Mar 1;62(3):191-3.
[25]  Succar T, Zebington G, Billson F, Byth K, Barrie S, McCluskey P, et al. The impact of the Virtual Ophthalmology Clinic on medical students’ learning: A randomised controlled trial. Eye. 2013 Oct;27(10):1151-7.
[26]  Hibbert EJ, Lambert T, Carter JN, Learoyd DL, Twigg S, Clarke S. A randomized controlled pilot trial comparing the impact of access to clinical endocrinology video demonstrations with access to usual revision resources on medical student performance of clinical endocrinology skills. BMC Med Educ. 2013;13:135.
[27]  Morris M, Donohoe G, Hennessy M, C OC. Pro forma: Impact on communication skills? Clin Teach. 2013 Oct;10(5):318-22.
[28]  Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9):S63-7.
[29]  Finn, A, Bucceri, M. A case study approach to blended learning. Los Angel Cent Softw. 2008 Mar 23.
[30]  Childs S, Blenkinsopp E, Hall A, Walton G. Effective e-learning for health professionals and students—barriers and their solutions. A systematic review of the literature—findings from the HeXL project. Health Inf Libr J. 2005 Dec 1;22:20-32.
[31]  Garrison DR, Kanuka H. Blended learning: Uncovering its transformative potential in higher education. Internet High Educ. 2004 Jan 2;7(2):95-105.
[32]  Davis MH, Harden RM. E is for everything-e-learning?. Medical Teacher. 2001 Sep;23(5):441.
[33]  Khogali SE, Davies DA, Donnan PT, Gray A, Harden RM, McDonald J, Pippard MJ, Pringle SD, Yu N. Integration of e-learning resources into a medical school curriculum. Medical teacher. 2011 Apr 1;33(4):311-8.
[34]  Njenga JK, Fourie LC. The myths about e‐learning in higher education. British Journal of Educational Technology. 2010 Mar 1;41(2):199-212.
[35]  Osguthorpe RT, Graham CR. Blended learning environments: Definitions and directions. Q Rev Distance Educ. 2003;4(3):227-33.
[36]  Bradley P, Postlethwaite K. Setting up a clinical skills learning facility. Med Educ. 2003 Nov 1;37:6-13.
[37]  Mckimm J, Wilkinson T, Poole P, Bagg W. The current state of undergraduate medical education in New Zealand. Med Teach. 2010 Jan 1;32(6):456-60.
[38]  Van Merriënboer JJG, Sweller J. Cognitive load theory in health professional education: Design principles and strategies. Med Educ. 2010 Jan 1;44(1):85-93.
[39]  Reinwein J. Does the modality effect exist? And if so, which modality effect?. Journal of psycholinguistic research. 2012 Feb 1;41(1):1-32.
[40]  Lujan HL, DiCarlo SE. First-year medical students prefer multiple learning styles. Adv Physiol Educ. 2006 Mar 1;30(1):13-6.
[41]  Solomon R, Jones J, Holmes MM, Hoppe R. Teaching medical students a developmental approach to examining patients. Acad Med. 1991 Feb;66(2):77-8.
[42]  Hilliard RI. How do medical students learn: Medical student learning styles and factors that affect these learning styles. Teach Learn Med. 1995;7(4):201-10.
[43]  Harden RM. E-learning–caged bird or soaring eagle? Med Teach. 2008 Jan 1;30(1):1-4.