
The Learning Effectiveness of Structured Assessment Stations with Immediate Feedback in Evidence-Based Practice Education

Kee-Hsin Chen1, 2, 3, 4, 8, Chiehfeng (Cliff) Chen5, 6, 7, 8, Pei-Chuan Tzeng7, Hsueh-Erh Liu9

1Department of Nursing, Taipei Medical University-Wan Fang Hospital, Taipei, Taiwan

2Graduate Institute of Clinical Medical Sciences, College of Medicine, Chang Gung University, Tao-Yuan, Taiwan

3School of Nursing, College of Nursing, Taipei Medical University, Taipei, Taiwan

4Evidence-Based Knowledge Translation Center, Taipei Medical University-Wan Fang Hospital, Taipei, Taiwan

5Department of Public Health, School of Medicine, College of Medicine, Taipei Medical University, Taipei, Taiwan

6Division of Plastic Surgery, Department of Surgery, Taipei Medical University-Wan Fang Hospital, Taipei, Taiwan

7Evidence-Based Medicine Center, Taipei Medical University-Wan Fang Hospital, Taipei, Taiwan

8Center for Evidence-Based Medicine, Taipei Medical University, Taipei, Taiwan

9School of Nursing, Chang Gung University, Tao-Yuan, Taiwan

Abstract

Background: The development of clinical expertise depends not only on abundant practice but also on guidance through good feedback. Clinical teachers can improve learners’ skills by providing specific, performance-based feedback. In Taiwan, evidence-based practice (EBP) education has been taught through workshops, hands-on practice and small group discussion. However, the impact of feedback in EBP education is still unknown. Aim: The aim of this study was to determine the effectiveness of structured assessment stations with immediate feedback in EBP education among nurses. Methods: This was a quasi-experimental study with repeated measures. Convenience sampling was used in two university hospitals. The intervention involved four immediate-feedback structured assessment stations one week after a four-hour EBP workshop. Outcomes were measured by the Taipei Evidence-Based Practice Questionnaire (TEBPQ), which contains 26 self-report questions and was completed three times to evaluate EBP learning efficiency. At the end of the program, qualitative data on the learners’ feelings, perceptions, and experiences of the training process were also collected for analysis. Results: Sixty-one of the seventy participants (87.1%; 61/70) completed the study. The mean TEBPQ scores for the ‘Ask’, ‘Acquire’, ‘Appraisal’, ‘Apply’, and ‘Attitude’ domains all increased significantly after the participants attended the four assessment stations with immediate feedback (all p < .05). After the EBP workshop, the improvement in standardised TEBPQ scores reached 20%. Furthermore, structured assessment stations with immediate feedback may improve overall learning efficiency by 35%. Conclusions: This study demonstrated that structured assessment stations with immediate feedback may improve overall learning efficiency over an EBP workshop alone.
As a result, small group discussion, hands-on practice and additional immediate feedback structured assessment stations may be considered by educators who are interested in enhancing EBP ability among healthcare professionals.


Cite this article:

  • Chen, Kee-Hsin, et al. "The Learning Effectiveness of Structured Assessment Stations with Immediate Feedback in Evidence-Based Practice Education." American Journal of Educational Research 2.8 (2014): 691-697.
  • Chen, K., Chen, C., Tzeng, P., & Liu, H. (2014). The Learning Effectiveness of Structured Assessment Stations with Immediate Feedback in Evidence-Based Practice Education. American Journal of Educational Research, 2(8), 691-697.
  • Chen, Kee-Hsin, Chiehfeng (Cliff) Chen, Pei-Chuan Tzeng, and Hsueh-Erh Liu. "The Learning Effectiveness of Structured Assessment Stations with Immediate Feedback in Evidence-Based Practice Education." American Journal of Educational Research 2, no. 8 (2014): 691-697.


1. Introduction

Online databases and the Internet provide a tremendous volume of timely information in a world where medical breakthroughs occur daily. Under the pressure of this information explosion, clinical professionals remain responsible for providing clinical care that is timely, effective, and based on reliable scientific evidence [1]. Evidence-based practice (EBP) is the conscientious and judicious use of current best evidence, together with clinical expertise and patient values, to guide health care decisions [2]. Employing EBP has been identified as one of the core competencies central to the education of all health professions (Institute of Medicine [IOM], 2003).

In a review of the literature, Corrigan et al. [3] found that effective educational strategies may improve competence in, and attitudes toward, EBP. Furthermore, Crawford et al. [4] found that discussions in scenario-based workshops helped students learn flexible clinical management. However, Feldstein et al. [5] found that clinicians’ knowledge and skills in searching for literature and applying EBP in clinical settings did not significantly increase after participating in four-hour interactive workshops. These findings indicate that the impact of the traditional workshop approach on EBP is inconsistent and that additional educational interventions may be required.

Feedback is the heart of medical education as well as an integral part of the process for clinical trainees [6, 7]. In clinical education, feedback is defined as: “specific information about the comparison between a trainee's observed performance and a standard, given with the intent to improve the trainee’s performance” [8]. The development of clinical expertise depends not only on considerable practice but also on facilitation and guidance through feedback [9, 10]. Clinical teachers have a responsibility to provide specific, performance-based and effective feedback based on direct observation to improve learners’ skills, whether the time for feedback is 5 minutes or 30 minutes [7, 11]. Good quality feedback is essential in helping learners develop accurate self-assessment of their own performance [12]. Positive and constructive use of feedback contributes significantly to improving learners' performance of skills [13]. In Taiwan, EBP education through scenario-based workshops with hands-on practice and small group discussion has become popular in the past few years. However, little is known about the effectiveness of feedback in EBP education. Therefore, we integrated four structured assessment stations with immediate feedback into an EBP workshop to improve learning outcomes.

2. Aim

The aim of this study was to determine the learning effectiveness of structured assessment stations with immediate feedback in EBP education among nurses.

3. Methods

3.1. Study Design and Participants

This was a quasi-experimental study with repeated measures. Convenience sampling was conducted at two university hospitals in northern Taiwan between July 2010 and March 2011. Individuals who were registered nurses, 20 years of age or older, and either interested in attending the workshop or recommended by their head nurse were included in the study. Nurses certified as evidence-based medicine trainers/tutors by the Taiwan Evidence-based Medicine Association (TEBMA) or another evidence-based medicine organization (such as the Centre for Evidence-Based Medicine, Oxford) were excluded. The study was approved by the Taipei Medical University institutional review board.

3.2. Intervention (Structured Assessment Stations with Immediate Feedback)

The structured assessment stations with immediate feedback in this study were developed by the Evidence-Based Medicine Education and Research Committees in one of the target hospitals (Wan Fang Medical Center). The committee comprised seven members who were experts in evidence-based medicine methodology, including university professors, clinical nurses, physicians, administrators, and researchers. All members had practical experience in teaching and leading discussions at EBP workshops. The overall pedagogical approach for this study comprised two stages: a four-hour workshop featuring scenario-based discussions, followed a week later by four stations with immediate feedback. The process is described in detail as follows:


(I) EBP workshop

A four-hour, scenario-based-discussion ‘PBL-EBM workshop’ (problem-based learning, PBL; evidence-based medicine, EBM) has been conducted to teach EBP among health care professionals since 2005, with significant learning outcomes [14]. The content of the EBM workshop included an introduction to EBM/EBP (lecture, 50 minutes), small group discussion with hands-on practice (90 minutes), presentation (50 minutes), and course evaluation/feedback (5 minutes). Each group had two facilitators to lead participants in the discussion and hands-on practice according to the 4 A’s model of bedside EBM [2, 15]. The 4 A’s model includes: (1) Ask: ask an answerable question based on a scenario or clinical situation, then identify the PICO (Population, Intervention, Comparison and Outcomes) components; (2) Acquire: compile PICO-related keywords and search electronic databases (such as the Cochrane Library, PubMed, and CINAHL); (3) Appraisal: select appropriate articles from the literature and critique them based on appraisal tools; and (4) Apply: answer the patient’s questions in plain language. Before participating in this study, in order to assure consistency of course instruction, facilitators had to complete at least eight hours of training in teaching strategies and group-leading skills.


(II) Structured assessment stations with immediate feedback

Based on a literature review and the researchers’ previous teaching experience, the Evidence-Based Medicine Education and Research Committees integrated the concept of ‘immediate feedback’ into multiple structured assessment stations held one week after the EBP workshop. Each participant had to complete the four structured assessment stations, and the entire process took 40 minutes per person. At each station, structured questions covering the EBP domains were used [16]. The main content included: (1) formulate a focused question (Station 1: Ask); (2) determine an appropriate research design to answer the question (Station 1: Ask); (3) demonstrate the ability to search electronic databases, including secondary sources (Station 2: Acquire); (4) identify issues significant to the relevance and validity of an article (Station 3: Appraisal); and (5) discuss the magnitude and importance of the research findings (Station 4: Apply). A teacher guide containing the structured questions and standard answers was provided to the facilitator of each station. At the end of each station, learners received immediate feedback from facilitators based on their performance at that station. The difference between the performance and the standard answer determined the content of the feedback [8]. The contents of each structured assessment station are listed in Figure 1.

Figure 1. EBP Domain, time required and completing description of each station (Note: TEBPQ: Taipei Evidence-Based Practice Questionnaire)
3.3. Measurement

The Taipei Evidence-Based Practice Questionnaire (TEBPQ) was used to evaluate self-reported EBP learning efficiency [17]. The questionnaire was developed based on the concept of EBM evaluation [2], adopting the 4 A’s model of bedside EBM [15]. In addition, ‘Attitude’ was added as an assessment domain, since it is an important factor in learning motivation and future willingness to practice clinical EBP [18]. In total, the TEBPQ contains 26 self-report questions: ‘Ask (PICO)’ (5 items), ‘Acquire’ (7 items), ‘Appraisal’ (4 items), ‘Apply’ (6 items) and ‘Attitude’ (4 items) (see Appendix). All respondents were asked to rate the questions on a Likert scale from 1 (strongly disagree) to 5 (strongly agree). A standardised TEBPQ score was calculated by the formula: Standardised Score = (Actual Score − Lowest Possible Score) / (Highest Possible Score − Lowest Possible Score) × 100%. The total Content Validity Index (CVI) of the TEBPQ was 0.90, and its Cronbach's α value was 0.87 [17]. Upon completion of the whole training process, the research team asked a ‘Yes/No’ question: “Would you recommend the future provision of a feedback stations-integrated EBP workshop?” After the initial ‘yes’ or ‘no’ response, the reason for the answer was sought and recorded as a narrative description. This question provided qualitative information on the learners’ feelings, perceptions, and experiences of the training program, and their suggestions.
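The standardisation formula can be expressed compactly in code. The sketch below is illustrative only (the function name and default arguments are ours): it assumes the 26 Likert items (scored 1 to 5) are summed, so the lowest possible total is 26 and the highest is 130.

```python
def standardised_score(actual, lowest=26, highest=130):
    """Standardised TEBPQ score as a percentage.

    Defaults assume all 26 Likert items (scored 1-5) are summed,
    giving a lowest possible total of 26 and a highest of 130.
    """
    return (actual - lowest) / (highest - lowest) * 100

# A raw total of 78 lies exactly halfway between 26 and 130:
print(standardised_score(78))  # -> 50.0
```

The same helper works for a single domain by passing that domain's own lowest and highest possible subtotals.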

3.4. Implementation and Data Collection

The overall pedagogical approach for this study comprised two stages: the EBP workshop and the four structured assessment stations with immediate feedback. The self-reported TEBPQ was collected three times: 10 minutes before the workshop (baseline), after the EBP workshop, and after the four stations. Moreover, qualitative data on the learners’ feelings, perceptions, and experiences of the training process were collected at the end of the program. Figure 2 shows the algorithm of the intervention and instruments along the timeline.

Figure 2. The implementation and data collection process. (Note: TEBPQ: Taipei Evidence-Based Practice Questionnaire)
3.5. Evaluation and Data Analysis

SPSS version 17.0 was used for all statistical analyses. P values less than .05 were considered statistically significant. Descriptive statistics were used to present the participants’ characteristics: mean and standard deviation (SD) were used to summarise continuous data, and categorical data were summarised using counts and percentages (%). Repeated measures ANOVA with pairwise comparisons using the least significant difference (LSD) method was used to examine the results of the multiple TEBPQ measurements and compare the learning effectiveness of feedback in EBP education. Qualitative data were analysed by content analysis.
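For readers who want to reproduce this kind of analysis outside SPSS, the F statistic of a one-way repeated-measures ANOVA can be computed directly from the subject-by-timepoint score table. The sketch below is a minimal illustration with hypothetical scores, not the study's data; the function name is ours.

```python
def rm_anova_f(scores):
    """F statistic for a one-way repeated-measures ANOVA.

    `scores` is a list of per-subject score lists, one value per
    time point (e.g. baseline, post-workshop, post-stations).
    """
    n = len(scores)        # number of subjects
    k = len(scores[0])     # number of time points
    grand = sum(sum(row) for row in scores) / (n * k)
    time_means = [sum(row[t] for row in scores) / n for t in range(k)]
    subj_means = [sum(row) / k for row in scores]
    # Partition total variability into time, subject and residual parts.
    ss_time = n * sum((m - grand) ** 2 for m in time_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_time - ss_subj   # subject-by-time residual
    ms_time = ss_time / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return ms_time / ms_error

# Hypothetical scores for three subjects at three time points:
print(rm_anova_f([[1, 2, 4], [2, 3, 4], [2, 4, 5]]))
```

The resulting F value would then be compared against the F distribution with (k − 1) and (n − 1)(k − 1) degrees of freedom to obtain the p value.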

4. Results

4.1. Participant Characteristics

A total of 70 participants enrolled in this study between July 2010 and March 2011. Among them, three participants felt substantial pressure from the training program and could not complete the course, two were temporarily assigned to other activities, one withdrew for personal reasons, and three were excluded due to incomplete data. Therefore, 61 of the seventy participants (87%) completed the study. More than half of the participants (57%; 35/61) were between 26 and 35 years old and had a Bachelor of Science academic background (59%; 36/61). Approximately two-thirds of the subjects were clinical nursing staff (64%; 39/61), who worked in internal medicine/surgery wards (39%; 24/61), OBGYN/pediatric wards (15%; 9/61) and the ER/ICU (10%; 6/61). Average clinical experience was 119 months (SD 79; range 6 to 402 months). More than half (53%; 32/61) of the participants were recommended by a head nurse, 25% (15/61) joined because of work requirements, and 31% (19/61) applied out of personal interest or for professional development. Over half (54%; 33/61) of the participants had never attended any lectures related to EBM/EBP. Basic characteristics of the participants are provided in Table 1.

Table 1. Basic characteristics of participants (n = 61)

4.2. Results of EBP Learning Efficiency Measuring by TEBPQ

EBP learning efficiency trends across the multiple TEBPQ measurements were assessed by repeated measures ANOVA and pairwise comparisons. Overall, the mean scores for the domains ‘Ask’, ‘Acquire’, ‘Appraisal’, ‘Apply’, and ‘Attitude’ all increased significantly (all p < .05) after the participants completed the four structured assessment stations and received immediate feedback. The mean scores for the ‘Attitude’ domain were higher than those of the other domains initially and increased by 0.82 points after the four immediate feedback stations. Interestingly, the ‘Appraisal’ domain had the lowest score at baseline yet showed the greatest change (an increase of 1.65 points) (Figure 3).

Figure 3. EBP learning efficiency trends according to the repeated TEBPQ measurements. The mean score for the ‘Appraisal’ domain (the lowest at baseline) showed the greatest change (an increase of 1.65 points), while the ‘Attitude’ domain (the highest at baseline) increased by 0.82 points after the four immediate feedback stations. After the EBP workshop and the structured stations with immediate feedback, the mean scores for all domains increased significantly (all p < .05). (Note: TEBPQ: Taipei Evidence-Based Practice Questionnaire)
4.3. Standardised TEBPQ Scores

The standardised TEBPQ scores at baseline, after the EBP workshop, and following the four stations with immediate feedback were 40%, 60%, and 75%, respectively. After the EBP workshop, the improvement in standardised TEBPQ scores reached 20%. Furthermore, structured assessment stations with immediate feedback may improve overall learning efficiency by as much as 35% above an EBP workshop alone (Figure 4).

Figure 4. The TEBPQ scores and improvement trends. After the EBP workshop, the standardised TEBPQ score improvement reached 19.7%. Furthermore, structured stations with immediate feedback may improve overall learning efficiency by 35% above an EBP workshop alone.
4.4. Qualitative Information of Participant Perceptions and Experience

The written feedback from the participants was generally positive (89%; 54/61). The participants reflected that the feedback provided by facilitators immediately at the end of each station was a good way of measuring their improvement and gaining insight into areas they needed to work on. Participants reported gaining and learning a lot when the tutors gave them immediate feedback at each station. A learner said: ‘Before the end of each station, the teacher immediately give me feedback, told me what I did great and gave me some suggestions. Therefore, I can understand what I know, clarify the areas of uncertainty… The best thing is I can know the answer immediately! I think it is a better way to learn.’ In addition, participants recommended additional group discussions, practice and a longer time for receiving feedback from facilitators in future teaching courses. Another participant said: ‘I had attended several EBP courses before, but I still feel confused. This time, I know how to solve clinical problem via EBP process through hands-on practice and small group discussion. I learn lots experiences from teachers and group members. Because the time was very limited, so I feel a little nervous during the process. If we can have more time to practice, discuss or receive feedback from teachers in each station, I think I can learn more.’

5. Discussion

Our study is in line with results reported in the literature on the effect of supervisors’ feedback and support on midwifery students’ self-directed learning [10]. Moreover, van der Hem-Stokroos et al. [19] mentioned that surgical clerkship students felt that constructive feedback was one of the key features of effective clinical learning. Furthermore, Olde Bekkink et al. [20] suggested that immediate, detailed oral feedback from a tutor on the content of questions, including the rationale for the correct and incorrect answers, may add value and enhance the effect of an interim assessment for medical students. These findings are consistent with ours, whereby approximately 90% of participants acquired knowledge and skills that could be reflected on and integrated through the immediate feedback from tutors, enabling them to learn effectively.

The TEBPQ scores for each domain increased significantly throughout the learning process. The ‘Attitude’ domain had the highest mean score. Kim et al. [21] mentioned that EBP-focused interactive teaching strategies can effectively increase the EBP-related knowledge of nursing students, but do not significantly improve the ‘Attitude’ aspect. In our study, although participants presented high baseline scores for the ‘Attitude’ domain, they also showed a significant change following the interventions. This is possibly because most of the participants were volunteers or were recommended by a head nurse; they agreed that clinical work requires EBP competence and were more motivated to learn. On the other hand, the ‘Appraisal’ domain scored the lowest. In the content analysis of the qualitative data, learners mentioned that appraisal was the most difficult step among the EBP domains, and some suggested allocating more time for literature appraisal in the future.

In this study, each station took between 5 and 20 minutes, and all learners and facilitators could complete the items and feedback within the allotted time. However, based on our experience, we offer the following suggestions for setting up structured assessment stations: (1) At the ‘Acquire’ station, a stable internet connection was found to be extremely important to ensure performance was not affected by technical factors. (2) At the ‘Appraisal’ station, participants might not have enough time to read and appraise a long article, which affects learners’ performance and leaves tutors no time to give feedback. We recommend that the research articles provided be as brief as possible. (3) Current evidence-based databases and search engines are mostly in English. In Taiwan, the official language is Mandarin Chinese, and we found that participants spent considerable time consulting dictionaries. Therefore, participants who are not native English speakers may need more time to prepare and practice, and language barriers should be considered for non-native English-speaking populations. (4) Although participants provided positive feedback on this teaching method, the program required substantial time, energy and input from researchers in designing content, organizing and training facilitators, and evaluating the entire process. On average, the direct cost of each training course was about US$6,000, not including miscellaneous expenses. These are important considerations for applying this approach.

This study has some limitations. Firstly, it was a quasi-experimental study with a small sample and no control group; thus, generalisability is limited. A design with a larger sample, a control group and randomised assignment could be considered. Secondly, the entire teaching process lasted only one week. Although short-term learning efficiency could be observed, long-term effects, such as patient outcomes from implementing EBP in clinical settings, still require further research using a longitudinal study design. Finally, the qualitative data on participants’ perceptions and experiences were not collected rigorously enough. We recommend that further studies use a structured questionnaire to explore this phenomenon.

6. Conclusions

Nurses are on the front line of patient care, and lifelong learning is necessary to respond rapidly to a complex medical environment. This study demonstrated that an EBP workshop may improve learning efficiency by as much as 20%, and that structured assessment stations with immediate feedback may improve overall learning efficiency by as much as 35% over and above an EBP workshop alone. As a result, we recommend that future nursing education adopt teaching strategies such as small group discussion, hands-on practice and structured assessment stations with immediate feedback to improve educational efficiency. In the future, this competence can be practically applied to clinical patient care, improving the quality of health care.

Acknowledgements

We thank Chrissy Erueti, Assistant Professor and Centre Manager at the Centre for Research in Evidence-Based Practice (CREBP), Bond University, Australia, for helping to revise the English writing.

Funding: We thank Taipei Medical University-Wan Fang Hospital for sponsoring this study under Grant No. 96WF-EVA-07.

Statement of Competing Interests

The authors have no competing interests. The authors alone are responsible for the content and writing of the article.

List of Abbreviations

EBP, Evidence-Based Practice.

TEBPQ, Taipei Evidence-Based Practice Questionnaire.

TEBMA, Taiwan Evidence-based Medicine Association.

PBL, Problem Based Learning.

PICO, Population, Intervention, Comparison and Outcomes.

CVI, Content Validity Index.

SD, Standard Deviation.

LSD, Least Significant Difference.

References

[1] Zhang, Q., Zeng, T., Chen, Y. and Li, X., "Assisting undergraduate nursing students to learn evidence-based practice through self-directed learning and workshop strategies during clinical practicum," Nurse Educ Today, 32(5), 570-575, 2012.

[2] Straus, S.E., Richardson, W.S., Glasziou, P. and Haynes, R.B., Evidence-based medicine: How to practice and teach EBM, Churchill Livingstone, United States of America, 2005.

[3] Corrigan, P.W., Steiner, L., McCracken, S.G., Blaser, B. and Barr, M., "Strategies for disseminating evidence-based practices to staff who treat people with serious mental illness," Psychiatr Serv, 52(12), 1598-1606, 2001.

[4] Crawford, E.J., Geraghty, J. and Cairns, A.J., "Scenario-based student workshops in acute medicine," Med Educ, 43(11), 1103, 2009.

[5] Feldstein, D.A., Maenner, M.J., Srisurichan, R., Roach, M.A. and Vogelman, B.S., "Evidence-based medicine training during residency: a randomized controlled trial of efficacy," BMC Med Educ, 10, 59, 2010.

[6] Branch, W.T. Jr. and Paranjape, A., "Feedback and reflection: teaching methods for clinical settings," Acad Med, 77(12 Pt 1), 1185-1188, 2002.

[7] Schartel, S.A., "Giving feedback - an integral part of education," Best Pract Res Clin Anaesthesiol, 26(1), 77-87, 2012.

[8] van de Ridder, J.M., Stokking, K.M., McGaghie, W.C. and ten Cate, O.T., "What is feedback in clinical education?" Med Educ, 42(2), 189-197, 2008.

[9] Côté, L. and Bordage, G., "Content and conceptual frameworks of preceptor feedback related to residents' educational needs," Acad Med, 87(9), 1274-1281, 2012.

[10] Embo, M.P., Driessen, E.W., Valcke, M. and Van der Vleuten, C.P., "Assessment and feedback to facilitate self-directed learning in clinical practice of Midwifery students," Med Teach, 32(7), e263-269, 2010.

[11] Ramani, S. and Krackov, S.K., "Twelve tips for giving feedback effectively in the clinical environment," Med Teach, 34(10), 787-791, 2012.

[12] Rees, C. and Shepherd, M., "Students' and assessors' attitudes towards students' self-assessment of their personal and professional behaviours," Med Educ, 39(1), 30-39, 2005.

[13] Fjortoft, N., "Self-assessment in pharmacy education," Am J Pharm Educ, 70(3), 64, 2006.

[14] Wu, M.S., Chen, C., Tzeng, P.C., Lien, G.S., Wen, M.L. and Chiu, W.T., "Group Emulation Improves the Teaching Efficacy in Evidence-Based Medicine," Formosan J Med, 11(4), 425-430, 2007.

[15] Glasziou, P. and Haynes, R.B., "The paths from research to improved health outcomes," Evid Based Nurs, 8(2), 36-38, 2005.

[16] Ramos, K.D., Schafer, S. and Tracz, S.M., "Validation of the Fresno test of competence in evidence based medicine," BMJ, 326(7384), 319-321, 2003.

[17] Tzeng, P.C., Chen, K.H., Lo, H.L. and Chen, C., "The TEBPQ development and initial reliability and validity testing," in the 2011 Conference of the Taiwan Evidence-Based Nursing Association, Taipei, National Taiwan University College of Medicine, 2011.

[18] Aarons, G.A., "Measuring provider attitudes toward evidence-based practice: consideration of organizational context and individual differences," Child Adolesc Psychiatr Clin N Am, 14(2), 255-271, 2005.

[19] van der Hem-Stokroos, H.H., Daelmans, H.E., van der Vleuten, C.P., Haarman, H.J. and Scherpbier, A.J., "A qualitative study of constructive clinical learning experiences," Med Teach, 25(2), 120-126, 2003.

[20] Olde Bekkink, M., Donders, R., van Muijen, G.N., de Waal, R.M. and Ruiter, D.J., "Explicit feedback to enhance the effect of an interim assessment: a cross-over study on learning effect and gender difference," Perspect Med Educ, 1(4), 180-191, 2012.

[21] Kim, S.C., Brown, C.E., Fields, W. and Stichler, J.F., "Evidence-based practice-focused interactive teaching strategy: a controlled study," J Adv Nurs, 65(6), 1218-1227, 2009.

Appendix

Taipei Evidence-Based Practice Questionnaire (TEBPQ)
