Original Article
Open Access Peer-reviewed

A Practical Model for Implementing Digital Media Assessments in Tertiary Science Education

Jorge Reyna, Peter Meier
American Journal of Educational Research. 2018, 6(1), 27-31. DOI: 10.12691/education-6-1-4
Published online: January 11, 2018

Abstract

Learner-Generated Digital Media (LGDM) has been incorporated as a learning tool to assess students in Higher Education over the last decade. Models have been developed for video making in the classroom that consider technical know-how, pedagogies, or a combination of both. However, there is an absence of a student-centred, practical framework to inform academics and students on the implementation of digital presentations as an assessment tool in the curriculum. This conceptual paper proposes a new framework to assist with the design, implementation and evaluation of LGDM as an assessment tool. The framework considers the following elements: (1) pedagogy; (2) student training; (3) hosting of videos; (4) marking schemes; (5) group contribution; (6) feedback; (7) reflection; and (8) evaluation. The purpose of this paper is to outline the basic elements of the framework and provide practical implementation strategies that academics from any discipline could apply to their classrooms.

1. Introduction

Learner-Generated Digital Media (LGDM) emerged more than a decade ago in the field of education [1, 2, 3]. In this field, the use of LGDM assessments has focused on reflection on pre-service teaching experiences [4, 5]. In contrast, in science disciplines, the focus has been active learning, inquiry and research approaches [6]. Extensive examples have been documented across science disciplines, including biology [7], computer programming [8, 9], health sciences [10], pharmacology [11, 12, 13], geology [14], mathematics [15, 16], and engineering [17]. Currently, LGDM is gaining momentum in the higher education landscape [18, 19]. The increased use of digital media as an assessment tool has been made possible by the proliferation of digital applications [20] and electronic devices such as smartphones, tablets, video cameras and the like [21, 22].

The pedagogical approach to LGDM is to promote student reflection, engagement in active learning, collaboration and creativity [23], and to generate an environment for deep learning [18, 24]. Learner-generated content has the potential to add value to hands-on experience and peer-driven learning [25]. Other benefits of LGDM include the development of graduate qualities such as interpersonal communication, project planning and time management skills [26], critical thinking, report writing, research skills and digital literacies [27]. Nevertheless, research on LGDM in higher education is considered under-theorised and scarce [28, 29]. Thus, there is a need for rigorous studies to evaluate the effectiveness of LGDM in different disciplines [1, 6, 30].

The literature on frameworks specific to the application of LGDM in the classroom is limited. Most of these frameworks focus on how to design, implement and evaluate LGDM from a technical standpoint (development, pre-production, production, post-production and distribution), with no emphasis on teachers’ and learners’ roles [31, 32, 33]. Professional video-makers and multimedia creators have influenced these models, and they lack pedagogical substance [34].

From the perspective of the student as a consumer of digital media for learning, the DiAL-e framework focuses on what the learner does with an artefact rather than giving priority to its subject or discipline content [35]. This framework is pedagogically well-rounded but fails to engage learners as co-creators of content.

In contrast, in teacher education, a model for good practice in digital video projects was developed that included nine stages, teacher strategies and peer learning structures [1]. Later, a learning design for learner-generated digital stories was proposed based on this model [31]. Although comprehensive, this framework lacks a practical approach for use by those outside the discipline of Education. The CASPA model (Consume, Analyse, Scaffold and Produce, and Assess) [36] is a novel instructional design framework for implementing multimedia creation in the classroom. The drawback of this model is its lack of pedagogical underpinnings: it does not consider student training and support on the task, group work or evaluation. A similar digital literacies model is the AACRA model (Access, Analyse, Create, Reflect and Act) [37]. This model fails to identify the skills students will need to develop to produce digital media assignments.

Consequently, this paper aims to introduce a practical, theoretical framework to guide the implementation of digital presentations as assessment tools in tertiary science learning. It explores the development of the framework and outlines strategies for its implementation.

2. The LGDM Framework

The LGDM framework has eight elements, starting with pedagogy and ending the cycle with an evaluation to inform future improvements (Figure 1). These elements were developed based on a gap assessment of previous models of digital media as an assessment tool [5, 31, 33, 34]. For academics, the framework acts as a conduit between theory and good practice. From the student’s perspective, the framework explains why they need to learn using digital media and how the assessment task has been structured. As a student-centred framework, communicating this information is vital to ensure students buy into the task and have clear expectations of what will be required from them. Consequently, each element of the LGDM Framework, explained below, links to a key question that students need to understand before undertaking a digital media assessment. When designing digital assessment tasks, it is vital that these key questions are addressed.

2.1. Pedagogy

This element addresses the student’s question: Why do I need to learn this way? While the framework begins with pedagogy as a separate element, pedagogy underpins the remaining seven elements; the separation here has been made for instructional purposes.

The student-centred pedagogies that drive LGDM assignments should include active learning approaches, students working in small groups and ‘learning-by-doing’. Relevant theories include Problem-Based Learning [38], Collaborative Learning [39], Cooperative Learning [40], Peer-Assisted Learning [41], and Case Studies [42]. These pedagogies can be used to design LGDM assessment tasks that engage students with technology in developing research skills, collaborative organisational skills and problem-solving [43]. When designing LGDM assessments, it is important to ensure that subject learning objectives are aligned with graduate attributes. For example, at our institution, digital media assignments are aligned with Graduate Attribute 6: Communication skills.

2.2. Student Training

This element addresses the student’s question: How do I create a digital media project? Digital media support for students is essential, and training on how to create effective digital presentations needs to be planned and delivered. Suggested topics include (1) digital presentation types; (2) layout design; (3) colour theory; (4) typography; (5) use of images; (6) audio recording; (7) video quality and resolution; (8) video framing and shots; (9) storyboarding; and (10) tools available to produce digital presentations [44]. At our institution, we have developed hands-on workshops in which students brainstorm their ideas with their peers and instructors. A crucial element at this stage is feedback provided to students from both a content perspective and a digital media perspective. In our faculty, the learning designer takes on the role of digital media educator and supports students with the technical aspects of the task. Additionally, online student resources have been developed that cover (1) a ‘welcome to digital presentations’ video; (2) Frequently Asked Questions on LGDM assignments; (3) an interactive lecture on digital presentations; (4) an example storyboard; (5) past student projects; (6) the marking rubric for the assessment task; (7) an interactive lecture on storyboarding; and (8) additional resources such as tools to create digital presentations.

2.3. Hosting of Video

This element addresses the student’s question: Where do I upload my digital media project? The video hosting service should be determined before designing the assessment. Appropriate attention will need to be paid to privacy, ethics and issues such as intellectual property, in line with each institution’s policies. However, as a guiding principle, Learner-Generated Digital Media artefacts should be accessible to all students, as this will foster discussion and the exchange of ideas. The use of Web 2.0 video hosting tools such as YouTube and Vimeo can be considered [32, 45]. Creating a classroom account on one of these services and sharing the details with the students works well; students should be able to see each group’s work and comment if necessary. Qualitative research has reported that an “awareness of audience” enriches the process of LGDM creation, with students reporting high levels of accomplishment and ownership in digital media assignments [5]. At our institution, there is an emphasis on work-integrated learning. Consequently, during digital media training, the learning designer explains how students could use their digital presentations in their portfolios. A digital artefact can showcase a student’s creativity, ability to work as part of a team, communication in the digital space and digital media skills; all essential skills identified as desirable by employers.

2.4. Marking Scheme

This element addresses the student’s question: How is our digital media project going to be marked? When designing the assessment structure, it is important to determine the weighting of the LGDM activity, since the preparation of digital media projects can be time-consuming [31]. It is recommended that at least 20% of the total subject mark be devoted to this assignment [11]. Additionally, the use of marking rubrics is highly encouraged, as it helps students focus on the important elements of the task and makes the marking process more objective when several tutors/instructors are involved [46]. As students ideally receive training in digital media principles, the assignment should mark the application of these principles in addition to grading content. The example marking rubric used at our institution has, under the communication skills graduate attribute, a criterion for the application of digital media principles such as layout design, colour theory, typography, use of images and basic video techniques such as framing, use of a tripod and type of shots.
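As a concrete illustration of how a rubric-weighted mark might be computed, the minimal Python sketch below combines per-criterion scores and scales the result to a 20% subject weighting. The criterion names and weights are hypothetical examples, not the actual rubric used at our institution.

# Illustrative sketch only: criterion names and weights are hypothetical,
# not the institution's actual rubric.

RUBRIC = {
    "disciplinary_content": 0.40,      # accuracy and depth of subject content
    "structure_and_storyboard": 0.20,  # narrative flow, storyboard quality
    "digital_media_principles": 0.30,  # layout, colour, typography, images, framing
    "referencing_and_ethics": 0.10,    # attribution of sources and media assets
}

SUBJECT_WEIGHTING = 0.20  # LGDM task worth 20% of the total subject mark


def project_mark(scores: dict) -> float:
    """Weighted project mark out of 100 from per-criterion scores (0-100)."""
    return sum(RUBRIC[c] * scores[c] for c in RUBRIC)


def contribution_to_subject(scores: dict) -> float:
    """Marks the project contributes towards the final subject mark."""
    return project_mark(scores) * SUBJECT_WEIGHTING


if __name__ == "__main__":
    group_scores = {
        "disciplinary_content": 85,
        "structure_and_storyboard": 70,
        "digital_media_principles": 80,
        "referencing_and_ethics": 90,
    }
    print(f"Project mark: {project_mark(group_scores):.1f}/100")
    print(f"Subject contribution: {contribution_to_subject(group_scores):.1f}/20")

For the example scores shown, the project mark is 81.0/100, contributing 16.2 of the 20 marks allocated to the task.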

2.5. Group Contribution

This element addresses the student’s question: How do you ensure that everyone contributes to the digital media project? Mechanisms to ensure that all group members contribute to the project need to be implemented. The best approach in this case is self and peer assessment [47, 48]. A contribution-to-group-work rubric should be developed, and a peer review application used to allow students to rate each other’s contribution to the project. Using such a tool helps to identify free riders and non-contributors. At our institution, SPARKPlus is used to moderate group work [14]; other tools such as Google Forms, or even paper-based systems, can be used. Our faculty developed a simple group contribution rubric inside the SPARKPlus application. This rubric includes (1) disciplinary/subject input to the project; (2) punctuality and time commitment; (3) contribution of original ideas; (4) communication skills and working effectively as part of the team; and (5) focus on the task and what needs to be done. Students complete the self and peer review using a sliding bar with a scale ranging from well below average, below average, average and above average to well above average. Additionally, students need to enter comments explaining why they gave that rating to their peers; this qualitative data is useful when conflicts between group members occur. When SPARKPlus is explained to students at the beginning of the semester, group issues arise in fewer than 10% of the digital media projects [14].
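To make the moderation step concrete, the sketch below shows one way self- and peer-ratings on the five-point scale described above could be aggregated into an individual weighting factor applied to the shared group mark. The scale labels mirror our rubric, but the aggregation formula and names are illustrative assumptions, not SPARKPlus’s actual algorithm.

# Illustrative sketch: maps the five-point contribution scale to numbers and
# derives a per-student moderation factor from self- and peer-ratings.
# This is NOT SPARKPlus's actual algorithm; it only shows the general idea.

SCALE = {
    "well below average": 1,
    "below average": 2,
    "average": 3,
    "above average": 4,
    "well above average": 5,
}


def moderation_factors(ratings: dict) -> dict:
    """ratings maps each student to the scale labels received from the group
    (self-rating included). Returns each student's mean rating relative to
    the group mean, usable as a factor to moderate the shared group mark."""
    means = {s: sum(SCALE[r] for r in rs) / len(rs) for s, rs in ratings.items()}
    group_mean = sum(means.values()) / len(means)
    return {s: round(m / group_mean, 2) for s, m in means.items()}


if __name__ == "__main__":
    ratings = {
        "student_a": ["above average", "average", "above average"],
        "student_b": ["average", "average", "below average"],
        "student_c": ["well above average", "above average", "above average"],
    }
    group_mark = 81.0
    for student, factor in moderation_factors(ratings).items():
        print(f"{student}: factor {factor}, moderated mark {group_mark * factor:.1f}")

In practice, such factors are often capped so that moderated marks cannot exceed the maximum; the rounding and the numeric mapping of the scale here are arbitrary design choices.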

2.6. Feedback

This element addresses the student’s question: How are we going with the digital media project? When implementing learning designs that use innovative ways to assess students, it is critical to provide targeted, specific and timely feedback. The purpose of feedback is to reduce discrepancies between understanding and performance in relation to a goal [49]. In the case of digital media projects, students need early feedback on the storyboard at the start of the process, and then on the digital media approach and tools they plan to use. Later, feedback on the draft is critical to reinforce students’ learning of the content and digital media principles. These levels of feedback allow students to produce an effective digital artefact and minimise task-related anxiety [10].

2.7. Student Reflection

This element addresses the student’s question: How was the learning experience of developing a digital media project? Research has shown that students’ perceptions of the benefits of educational technology can be limited; it is often not until the data are analysed and performance compared that the benefits of an intervention become apparent [50]. Adding a reflection task after the assignment helps students reconsider whether they have gained additional knowledge by engaging in the development of a digital media project. This task can be implemented using a reflective journal inside the Learning Management System and by asking students questions such as: What do you feel you learned from this task? How could you use the skills you developed? This reflective task could be built into the marking structure, designed for extra credit or simply noted as a required threshold activity.

2.8. Evaluation

This element addresses the student’s question: What could be improved in the assignment? Evaluation is an important part of any educational intervention. The purpose of the evaluation is to produce data that will help improve the assignment in the next iteration. The process involves (1) identifying the activity/task; (2) developing questions (for students and tutors); (3) determining the sources of data; (4) data collection and analysis; (5) making the required adjustments; and (6) starting a new iteration in the following semester. Sources of data can be teacher reflection, students’ perceptions (via surveys, interviews and focus groups), students’ assessment performance (grades attained) and student actions (group contribution) (Phillips & Gilding, 2002). Most institutions have formalised systems for gathering student feedback, but it is important that feedback is also gained from the instructors who implemented the tasks. When collecting such data, consideration should also be given to whether the data will contribute to any research publications. The final step of this process is perhaps the most important, as data are often gathered but then not used to review pedagogical practices effectively. At our institution, we have implemented specific quality control processes to ensure teaching practices are regularly reviewed. These processes include an online survey that captures students’ learning experience of using digital media.
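For the data collection and analysis step, the short sketch below shows how Likert-style responses from an end-of-semester survey could be tallied per item using only the Python standard library. The question wording and response data are hypothetical and do not represent our actual instrument.

# Illustrative sketch: tallying Likert-style responses from an end-of-semester
# survey on the digital media learning experience. Questions and data are
# hypothetical examples, not the actual instrument.

from collections import Counter
from statistics import mean

LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}


def summarise(responses: dict) -> None:
    """Print response counts and the mean Likert score for each survey item."""
    for question, answers in responses.items():
        counts = Counter(answers)
        avg = mean(LIKERT[a] for a in answers)
        print(f"{question}: mean {avg:.2f} | {dict(counts)}")


if __name__ == "__main__":
    survey = {
        "The digital media training prepared me for the task":
            ["agree", "agree", "strongly agree", "neutral"],
        "Feedback on the storyboard helped improve the final video":
            ["strongly agree", "agree", "agree", "agree"],
    }
    summarise(survey)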

3. Conclusion

This paper has outlined some shortcomings of existing models that address the development of Learner-Generated Digital Media as an assessment tool. These include the complexity of teacher education models and their lack of practical application outside the field of Education. To address these gaps, educational researchers at our institution have proposed a new model: the LGDM Framework. This is a student-centred framework with eight clear elements underpinned by active learning pedagogies. Its central theme is student engagement and how academics can design LGDM assessments that are meaningful to students and help ensure the development of desired graduate outcomes. In subsequent papers, the authors will explore the implementation of the framework and present data that validates the underlying approaches.

For additional information about Learner-Generated Digital Media visit www.digitalmediaforlearning.org.

References

[1] Kearney, M. and S. Schuck. Students in the director’s seat: Teaching and learning with student-generated video. In Proceedings of Ed-Media 2005 World Conference on Educational Multimedia, Hypermedia and Telecommunications. 2005. Citeseer.
[2] Crean, D., QuickTime streaming: a gateway to multi-modal social analyses. e-Xplore, 2001.
[3] Ludewig, A., iMovie. A student project with many side-effects. e-Xplore, 2001.
[4] Rich, P.J. and M. Hannafin, Video annotation tools: technologies to scaffold, structure, and transform teacher reflection. Journal of Teacher Education, 2009. 60(1): p. 52-67.
[5] Kearney, M., Learner-generated digital video: Using Ideas Videos in Teacher Education. Journal of Technology and Teacher Education, 2013. 21(3): p. 321-336.
[6] Hoban, G., W. Nielsen, and A. Shepherd, Student-generated Digital Media in Science Education: Learning, Explaining and Communicating Content. 2015: Routledge.
[7] Pirhonen, J. and P. Rasi, Student-generated instructional videos facilitate learning through positive emotions. Journal of Biological Education, 2016: p. 1-13.
[8] Powell, L. and F. Robson, Learner-generated podcasts: a useful approach to assessment? Innovations in Education and Teaching International, 2014. 51(3): p. 326-337.
[9] Vasilchenko, A., et al. Media Literacy as a By-Product of Collaborative Video Production by CS Students. In Proceedings of the 2017 ACM Conference on Innovation and Technology in Computer Science Education. 2017. ACM.
[10] Pearce, K.L. and J.J. Vanderlelie. Teaching and evaluating graduate attributes in multimedia science based assessment task. In Proceedings of The Australian Conference on Science and Mathematics Education. 2016.
[11] Reyna, J., et al., Implementing Digital Media Presentations as Assessment Tools for Pharmacology Students. American Journal of Educational Research, 2016. 4(14): p. 983-991.
[12] Nielsen, W., G. Hoban, and C. Hyland, Pharmacology Students’ Perceptions of Creating Multimodal Digital Explanations. Chemistry Education Research and Practice, 2017.
[13] Henriksen, B., J. Henriksen, and J.S. Thurston, Building Health Literacy and Cultural Competency Through Video Recording Exercises. INNOVATIONS in pharmacy, 2016. 7(4): p. 17.
[14] Reyna, J., et al., Using Learner-Generated Digital Media (LGDM) as an Assessment Tool in Geological Sciences, in The 11th annual International Technology, Education and Development Conference, INTED2017. 2017: INTED, Valencia (Spain), 6th-8th of March 2017.
[15] McLoughlin, C. and B. Loch, Engaging students in cognitive and metacognitive processes using screencasts, in EdMedia: World Conference on Educational Media and Technology 2012, T. Amiel and B. Wilson, Editors. 2012, Association for the Advancement of Computing in Education (AACE): Denver, Colorado, USA. p. 1107-1110.
[16] Calder, N., The layering of mathematical interpretations through digital media. Educational Studies in Mathematics, 2012. 80(1-2): p. 269-285.
[17] Anuradha, V. and M. Rengaraj, Storytelling: Creating a Positive Attitude Toward Narration Among Engineering Graduates. IUP Journal of English Studies, 2017. 12(1): p. 32.
[18] Cox, A.M., A.C. Vasconcelos, and P. Holdridge, Diversifying assessment through multimedia creation in a non-technical module: reflections on the MAIK project. Assessment & Evaluation in Higher Education, 2010. 35(7): p. 831-846.
[19] Krippel, G., A.J. McKee, and J. Moody, Multimedia Use in Higher Education: Promises and Pitfalls. Journal of Instructional Pedagogies, 2010. 2.
[20] Reynolds, C., D.D. Stevens, and E. West, “I’m in a Professional School! Why Are You Making Me Do This?” A Cross-Disciplinary Study of the Use of Creative Classroom Projects on Student Learning. College Teaching, 2013. 61(2): p. 51-59.
[21] Devine, T., C. Gormley, and P. Doyle, Lights, Camera, Action: Using Wearable Camera and Interactive Video Technologies for the Teaching & Assessment of Lab Experiments. International Journal of Innovation in Science and Mathematics Education (formerly CAL-laborate International), 2015. 23(2).
[22] Nilsen, S., Use of a GoPro® camera as a non-obtrusive research tool. Journal of Playwork Practice, 2017. 4(1): p. 39-47.
[23] Barra, E., et al., Using multimedia and peer assessment to promote collaborative e-learning. New Review of Hypermedia and Multimedia, 2014. 20(2): p. 103-121.
[24] Hamm, S. and I. Robertson, Preferences for deep-surface learning: A vocational education case study using a multimedia assessment activity. Australasian Journal of Educational Technology, 2010. 26(7).
[25] Berardi, V. and G.E. Blundell, A learning theory conceptual foundation for using capture technology in teaching. Information Systems Education Journal, 2014. 12(2): p. 64.
[26] Morel, G. and H. Keahey. Student-generated multimedia projects as a multidimensional assessment method in a health information management graduate program. In Society for Information Technology & Teacher Education International Conference. 2016. Association for the Advancement of Computing in Education (AACE).
[27] Ohler, J., New-media literacies. Academe, 2009. 95(3): p. 30.
[28] Hakkarainen, K., A knowledge-practice perspective on technology-mediated learning. International Journal of Computer-Supported Collaborative Learning, 2009. 4(2): p. 213-231.
[29] Potter, J. and J. McDougall, Digital Media, Culture and Education: Theorising Third Space Literacies. 2017: Springer.
[30] Duffy, T.M. and D.H. Jonassen, Constructivism and the technology of instruction: A conversation. 2013: Routledge.
[31] Kearney, M., Towards a learning design for student-generated digital storytelling. 2009.
[32] Snelson, C., YouTube across the disciplines: A review of the literature. MERLOT Journal of Online Learning and Teaching, 2011.
[33] Theodosakis, N., The director in the classroom: How thinking inspires learning. 2001, San Diego, CA: Tech4learning Publishing.
[34] Hoban, G., W. Nielsen, and C. Carceller, Articulating constructionism: Learning science through designing and making “Slowmations” (student-generated animations). 2010.
[35] Burden, K. and S. Atkinson. Jumping on the YouTube bandwagon? Using digital video clips to develop personalised learning strategies. In ICT: Providing choices for learners and learning. Proceedings ascilite Singapore 2007. 2007.
[36] Blum, M. and A. Barger, The CASPA Model: An Emerging Approach to Integrating Multimodal Assignments, in EdMedia: World Conference on Educational Media and Technology 2017, J.P. Johnston, Editor. 2017, Association for the Advancement of Computing in Education (AACE): Washington, DC. p. 709-717.
[37] Hobbs, R., Create to Learn: Introduction to Digital Literacy. 2017: John Wiley & Sons.
[38] Hmelo-Silver, C.E., Problem-based learning: What and how do students learn? Educational Psychology Review, 2004. 16(3): p. 235-266.
[39] Goodsell, A.S., Collaborative learning: A sourcebook for higher education. 1992.
[40] Millis, B.J. and P.G. Cottell Jr, Cooperative Learning for Higher Education Faculty. Series on Higher Education. 1997: ERIC.
[41] Topping, K. and S. Ehly, Peer-assisted learning. 1998: Routledge.
[42] McDade, S.A., Case study pedagogy to advance critical thinking. Teaching of Psychology, 1995. 22(1): p. 9-10.
[43] Malita, L. and C. Martin, Digital storytelling as web passport to success in the 21st century. Procedia-Social and Behavioral Sciences, 2010. 2(2): p. 3060-3064.
[44] Snelson, C., Teacher Video Production: Techniques for Educational YouTube Movies, in Society for Information Technology & Teacher Education International Conference 2011, M. Koehler and P. Mishra, Editors. 2011, Association for the Advancement of Computing in Education (AACE): Nashville, Tennessee, USA. p. 1218-1223.
[45] Sturges, M. and J. Reyna. Use of Vimeo on-line video sharing services as a reflective tool in higher educational settings: A preliminary report. In ASCILITE-Australian Society for Computers in Learning in Tertiary Education Annual Conference. 2010.
[46] Spires, H. and G. Morris, New Media Literacies, Student Generated Content, and the YouTube Aesthetic, in EdMedia: World Conference on Educational Media and Technology 2008, J. Luca and E.R. Weippl, Editors. 2008, Association for the Advancement of Computing in Education (AACE): Vienna, Austria. p. 4409-4418.
[47] Willey, K. and A. Gardner, Investigating the capacity of self and peer assessment activities to engage students and promote learning. European Journal of Engineering Education, 2010. 35(4): p. 429-443.
[48] Hanrahan, S.J. and G. Isaacs, Assessing self- and peer-assessment: The students’ views. Higher Education Research and Development, 2001. 20(1): p. 53-70.
[49] Hattie, J. and H. Timperley, The power of feedback. Review of Educational Research, 2007. 77(1): p. 81-112.
[50] Phillips, R., C. McNaught, and G. Kennedy, Evaluating e-learning: Guiding research and practice. 2012: Routledge.
 

Published with license by Science and Education Publishing, Copyright © 2018 Jorge Reyna and Peter Meier

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit https://creativecommons.org/licenses/by/4.0/
