This article describes the development of a 21st century skills instrument for high school students. The first version of the instrument was crafted from four rubrics created to assess communication, collaboration, critical thinking, and creativity within project-based learning (PBL) activities. An exploratory factor analysis of the pilot study results revealed multiple survey items loading across multiple factors, requiring a revised instrument. The research team revised the instrument and added items using language from P21 standards documents. The revised 50-item 21st century skills instrument was administered to 276 high school students participating in a STEM program. The final exploratory factor analysis yielded 30 survey items loading across the four subscales, with strong internal consistency within the constructs. This instrument can be used as a baseline and achievement measure for high school students' 21st century skills.
An educational scholar does not have to search hard to locate publications regarding 21st century skills. A typical university library query today (8/2018) yields over 600,000 results of books, scholarly articles, and news releases on the topic. A topic as broad as 21st century skills invites many different interpretations and definitions, ranging from workforce skills to information and media literacy, social media, and gamification. This illustrates the tremendous need to clearly define 21st century skills. One such effort led to the development of 21st century standards to help better define these skills. However, as with many educational reforms, there are a number of standards documents with diverse definitions, learning standards, and assessments: the Partnership for 21st Century Skills 1, the OECD framework 2, the AASL Standards for the 21st Century Learner 3, and the Common Core State Standards 4. Each of these sets of standards identifies different 21st century skills. Some remain focused on new technologies and how to use them effectively, while others focus on key workforce skills. The P21 framework organizes 21st century skills into three basic categories: a) life and career skills; b) learning and innovation skills; and c) information, media, and technology skills. The P21 framework goes on to define these basic categories. Within the learning and innovation skills category one finds the four Cs of 21st century skills: a) creativity; b) critical thinking; c) communication; and d) collaboration 1. This article focuses on the development of a 21st century skills survey instrument to assess students' skills within learning and innovation skills as defined by the P21 framework. The purpose of the student 21st century skills survey is to provide a self-reporting instrument for better understanding how students assess their own abilities in creativity, critical thinking, communication, and collaboration. The instrument is not intended to be an all-inclusive assessment of 21st century skills, and the authors acknowledge that additional 21st century skills, including life and career skills, media and information literacy, and technological literacy, are not assessed with this proposed instrument. A possible use of this self-reporting instrument is to provide a baseline assessment prior to an educational program and a post assessment after the educational intervention to measure growth and achievement.
The NSF ITEST project TRAILS (Teachers and Researchers Advancing Integrated Lessons in STEM) seeks to improve students' learning of STEM content and increase their interest in and pursuit of STEM careers by providing over 70 hours of professional development for high school science (biology or physics) and engineering technology education (ETE) teachers. The TRAILS program seeks to prepare teachers to create lessons that engage students in science inquiry and biomimicry-inspired engineering design activities. TRAILS lessons include engineering design activities that generate 3D printed design solutions, and students learn the science of entomology to inspire biomimicry-based design solutions. TRAILS is grounded in the theory that STEM learning is strengthened by situated learning and by authentic, contextual learning opportunities 5. We theorize that 21st century skills, including critical thinking, collaboration, creativity, and communication, will be improved through TRAILS. We believe that teachers and students learn from engaging in a community of practice that gathers industry and STEM professionals together to help construct deeper STEM learning experiences grounded in authentic practices. The promise of the TRAILS approach, and its potential impact on 21st century skills, is found within the theoretical framework of TRAILS 5. Many STEM education efforts hold great promise for impacting the four Cs of 21st century skills, and we believe that an effective STEM program should be developed around key pedagogical approaches that address these four Cs. Figure 1 below illustrates the conceptual framework for TRAILS 5 as the metaphor of a block and tackle pulley system lifting a load labeled 'situated STEM learning'. Situated STEM learning theory 6, 7 acknowledges that learning is enhanced when taught in an authentic context with both social and physical elements in the learning experience. Situated learning blends well with engineering design pedagogical approaches that provide real-world engineering problems for students to explore and for which to develop design solutions 8, 9. Science inquiry, technological literacy, and mathematical thinking are pedagogical approaches taught to students to help them gather, analyze, and optimize information and explore key science concepts that inform design solutions. Finally, students become members of a community of practice with teachers, other students, and industry and STEM professionals to help them learn current STEM practices and build new STEM knowledge 7. What is not illustrated in Figure 1 is that these learning experiences occur within student design teams. TRAILS lessons are always experienced in student design teams, most often composed of a blend of science and engineering technology students. This approach to integrated STEM is necessary to improve 21st century skills, especially to promote collaboration and communication skills.
The Partnership for 21st Century Learning challenged educators to move beyond standard education practices when it wrote: “We must move from primarily measuring discrete knowledge to measuring students’ ability to think critically, examine problems, gather information, and make informed, reasoned decisions while using technology” 1. Throughout the Partnership for 21st Century Learning document 1, the emphasis is on authentic learning experiences for student success in 21st century skills. The document indicates that students should not only learn how to apply content knowledge but also engage in critical thinking, problem-solving, and analysis. Furthermore, the P21 document sets the goal that students grasp the concept that learning is as much about the process as it is about learning facts. The Partnership for 21st Century Skills document 1 also calls for the development of multiple types of formative and summative assessments and for ensuring that both teachers and students monitor the development of students’ 21st century skills. It was this challenge that motivated the authors to explore alternative approaches to assessing the development of 21st century skills in students.
At the onset of the TRAILS project, the leadership team was challenged by an NSF program officer to identify an assessment for 21st century skills, as one of the TRAILS goals was to increase 21st century skills for high school students. The TRAILS team had identified a set of four rubrics from the Buck Institute 10, 11 to be used as a summative assessment for the TRAILS design activities. These rubrics were designed to assess critical thinking, creativity, collaboration, and presentation skills (communication) within project-based learning activities, and they were among the few instruments at the time designed to assess the four Cs of 21st century skills within project-based learning. The TRAILS principal investigator contacted Suzy Boss (personal communication), the lead author of the rubrics, to discuss how they were created. Boss indicated that panels of experts in education assisted Buck Institute staff in creating the rubrics. One valuable feature of these rubrics is that they are available for download and can be modified by PBL teachers to meet their needs. Boss and Buck Institute personnel did indicate, however, that the rubrics did not go through extensive instrument item analysis. These rubrics have worked well for summative evaluation in the TRAILS program, and they were modified for TRAILS teachers based upon the goals of the project.
However, for a National Science Foundation project such as TRAILS, it is beneficial to have an instrument that provides researchers with an opportunity to measure growth and achievement. Rubrics are well suited to assessing skills and knowledge that cannot be easily assessed by high-stakes testing, multiple-choice items, or other knowledge tests. TRAILS researchers using these rubrics, however, struggled to obtain a measure of growth in students' 21st century skills. The rubrics were good at assessing performance on engineering design projects, but overall student growth and improvement could not be assessed adequately. The TRAILS team realized that a survey-based instrument for students to self-report their 21st century skills was necessary. A 21st century skills survey could provide researchers with baseline data on these skills prior to an educational intervention; administered as a pre-test and post-test, it could assess overall student growth over a period of time. The researchers acknowledge the limits of self-reported survey responses. Although respondents to self-reporting instruments often try to be direct and thoughtful in their responses, results can contain inaccuracies, sometimes caused by bias or lack of objectivity 12. One benefit of self-reporting student assessment is the ability to compare students' views of their capabilities in 21st century skills with teachers' views of students' abilities and achievements. A student survey is a useful instrument for educational researchers and for any educators seeking to monitor and promote the growth of students' 21st century skills.
The researchers obtained human subjects approval from the Purdue University Institutional Review Board to conduct the study with high school students participating in the TRAILS program. A pilot test was administered to a total of 55 students from the TRAILS program during the fall of 2017 through the Qualtrics online survey system. The researchers used exploratory factor analysis to uncover stable factors. Because four distinct rubrics (Boss, 2013) were used to develop the instrument items, we expected four stable factors to emerge from the factor analysis. The overall objectives of exploratory factor analysis are to a) reduce the number of variables in the instrument; b) examine the structure of the relationships between variables; c) detect and assess the unidimensionality of a construct; d) facilitate simple analysis and interpretation; e) address multicollinearity (when two or more variables are highly correlated); and f) develop theoretical constructs 13, 14.
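For readers who wish to follow similar steps, a minimal sketch of this setup in R (the software the team reports using for analysis) is shown below. It only illustrates objectives b) and e); the file name pilot_responses.csv, the data frame pilot_items, and the psych-package calls are assumptions for illustration, not the authors' actual code.

```r
# Illustrative setup in R with the psych package. 'pilot_responses.csv' is a
# hypothetical export: one row per student, one Likert-scored column per item.
library(psych)

pilot_items <- read.csv("pilot_responses.csv")

# Objectives (b) and (e): examine the structure of relationships among items;
# clusters of highly correlated items point to candidate factors and flag
# possible multicollinearity.
lowerCor(pilot_items)
```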
Table 3 presents the 55 student participants in the pilot study, all from one school in the state of Indiana participating in the TRAILS project. This sample provided a relatively even number of science and engineering technology education (ETE) students. There were eleven more male than female students, with a total of 33 males and 22 females.
The data analysis for the pilot survey was conducted using the R software program. Before running the factor analysis, we conducted a Kaiser-Meyer-Olkin (KMO) test to check sampling adequacy for the desired variables 15. The KMO result was 0.76, which is within the acceptable range of adequacy. To determine the number of factors, we used four psychometric criteria: (a) the Kaiser-Guttman rule (i.e., eigenvalues greater than one), (b) the point at which the scree plot drops sharply, (c) the number of items that load substantially on a factor, and (d) the amount of variance explained by the extracted factors 14, 16. The Kaiser-Guttman rule indicated that six factors obtained an eigenvalue greater than one, and the scree plot dropped at dimensions 1, 3, and 6. The researchers then conducted a principal component analysis.
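A hedged illustration of these checks with the psych package in R follows, reusing the hypothetical pilot_items data frame from the previous sketch. The calls show one common way to compute the KMO statistic, the eigenvalues for the Kaiser-Guttman rule, a scree plot, and a principal component analysis; they are not the authors' exact script.

```r
# Sampling adequacy and factor-retention checks (illustrative).
library(psych)

pilot_items <- read.csv("pilot_responses.csv")   # hypothetical file name

# Kaiser-Meyer-Olkin measure of sampling adequacy (reported above as 0.76)
KMO(pilot_items)

# Kaiser-Guttman rule: count eigenvalues of the correlation matrix above one
ev <- eigen(cor(pilot_items, use = "pairwise.complete.obs"))$values
sum(ev > 1)

# Scree plot to locate where the eigenvalues drop sharply
scree(pilot_items)

# Principal component analysis as a follow-up check
principal(pilot_items, nfactors = 4, rotate = "varimax")
```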
Table 4 below shows the results of the factor analysis when the number of factors was set to four and the factor loadings were limited to a 0.30 cutoff. Upon review of these factor loadings, it was clear that a number of the instrument items did not load as expected. Although items loaded across four factors, 12 of the 22 items loaded across two or more constructs. Additionally, several survey items loaded on unintended constructs, such as a communication item loading on the collaboration factor. These results illustrate that, when developing survey items, it is challenging to anticipate fully how a participant will read, comprehend, and respond to each item. Although many of the items appeared clear to the research team, they may not have been clear to the high school respondents. Items that worked well in a rubric assessment were not necessarily effective as self-reporting items for high school students. Moreover, upon reviewing the factor loading results, a second read of an item often revealed how it could be misinterpreted or load on another construct. For example, for survey item 14, 'determine the best design idea from a collection of ideas,' the researchers could see how the item was interpreted as both communication and collaboration. The researchers realized that these four constructs are very closely related and that additional investigation was required to refine an instrument that can discriminate between these four 21st century skills categories.
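The four-factor solution and the 0.30 loading cutoff described above can be approximated with the sketch below. The rotation and factoring method (varimax, principal axis) are assumptions, since the article does not report them, and the data frame remains the hypothetical pilot_items.

```r
# Four-factor EFA with loadings below 0.30 suppressed (illustrative).
library(psych)

pilot_items <- read.csv("pilot_responses.csv")   # hypothetical file name

efa_pilot <- fa(pilot_items, nfactors = 4, rotate = "varimax", fm = "pa")

# Sort items by dominant factor and hide loadings under 0.30; items with two
# or more visible loadings are the cross-loading items discussed above.
print(fa.sort(efa_pilot)$loadings, cutoff = 0.30)
```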
Upon review of the pilot test results, the researchers determined that the survey needed refinement, as many of the items did not load properly within the intended constructs. The researchers also determined that more instrument items needed to be developed and included to discriminate more effectively between the various skills. To add survey items, the researchers reviewed various 21st century skills documents, including P21, the OECD framework, and the AASL Standards for the 21st Century Learner 1, 2, 3, 4. Many of the objectives, standards, and rubric items were written using similar language and descriptions of the skills. These documents helped the researchers craft instrument items addressing the same constructs but worded slightly differently; items measuring the same construct with slightly different wording often help increase the reliability of a scale 17. After extensive deliberation, a total of 50 survey items were compiled into the revised instrument. The items were arranged into blocks of ten questions with a random mix of items from the four constructs so as not to lead respondents to a certain line of thinking 18. A panel of experts reviewed the survey for content validity 19, 20. An introductory statement at the start of the survey encouraged students to respond to the instrument items by indicating their level of agreement. A five-point Likert scale ranging from strongly disagree to strongly agree was used to measure students' confidence in their ability with each skill. Upon completion of the revised survey, the researchers had several high school students, of similar age to the intended respondents, review the instrument for face validity to ensure that the items were written clearly and could be understood 17, 21. Table 5 below presents the revised survey with a total of 50 items.
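One way such randomized blocks of ten items could be generated is sketched below. The item_bank data frame, its file name, and the blocking procedure are hypothetical illustrations of the idea rather than the authors' procedure, and a simple shuffle approximates, rather than guarantees, an even mix of constructs within each block.

```r
# Hypothetical illustration of mixing 50 items into five blocks of ten.
# 'item_bank.csv' is assumed to hold columns 'item' (text) and 'construct'
# (collaboration, communication, creativity, or critical thinking).
set.seed(2019)                                       # reproducible shuffle

item_bank <- read.csv("item_bank.csv", stringsAsFactors = FALSE)
shuffled  <- item_bank[sample(nrow(item_bank)), ]    # random order of all items

# Label consecutive runs of ten shuffled items as blocks 1 through 5.
shuffled$block <- rep(1:5, each = 10)
split(shuffled$item, shuffled$block)
```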
The final data collection numbers are presented in Table 6. Upon collection of a total of 343 survey responses, the researchers carefully inspected the data and reviewed the IRB documentation. A final total of 276 complete and useable responses was analyzed using factor analysis (Table 6). Table 7 presents the demographics of participants, including class and grade level. The participants were 236 (85.5%) White, 6 (2.2%) Black, 19 (6.9%) Hispanic, 13 (4.7%) Asian, and 2 (0.7%) multi-racial students.
The researchers ran an exploratory factor analysis on the 50 survey item results. Bartlett's test of sphericity, χ² = 6363.917, p < .001, and a Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy of .927 showed that the data for the 50 items were adequate for factor analysis. The final survey instrument consisted of a total of 30 items loading across the four factors or subscales (collaboration, communication, creativity, and critical thinking). There were nine final items for collaboration, five each for communication and creativity, and 11 for critical thinking. The items, along with their respective factor loadings, can be found in Table 8. The structure explained 48% of the variance in the pattern of relationships among the items. A total of 20 of the original 50 items were removed for the following reasons: (a) items did not load significantly on their predicted factor, (b) items loaded across two or more factors (cross-loading), or (c) items loaded on the wrong factor 22. A loading greater than .40 was the criterion used as the cutoff for significance. One exception was made for the communication item 'present all information clearly, concisely, and logically,' which loaded on both communication (.347) and critical thinking (.345). This item loaded just below the .40 cutoff but was retained for communication to provide a total of five survey items for that category; the researchers deemed a minimum of five items per category necessary, and retaining this item strengthens the instrument's assessment of this 21st century skill. The Cronbach's alpha reliabilities across the four subscales were: collaboration, α = .826; communication, α = .749; creativity, α = .751; and critical thinking, α = .876.
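A sketch of this final analysis in R follows. The data frame final_items, the file name, the rotation and factoring method, and the per-subscale item lists are all assumptions for illustration; only the reported statistics (Bartlett's χ², the KMO value, the .40 cutoff, and the alpha values) come from the study.

```r
# Illustrative version of the final analysis on the 276 usable responses.
library(psych)

final_items <- read.csv("final_responses.csv")   # hypothetical file name

# Adequacy checks reported above: Bartlett's test of sphericity and KMO
R <- cor(final_items, use = "pairwise.complete.obs")
cortest.bartlett(R, n = nrow(final_items))       # reported chi-square 6363.917
KMO(final_items)                                 # reported overall MSA .927

# Four-factor EFA; loadings under the .40 criterion are suppressed when printed
efa_final <- fa(final_items, nfactors = 4, rotate = "varimax", fm = "pa")
print(fa.sort(efa_final)$loadings, cutoff = 0.40)

# Cronbach's alpha for one subscale; the item list here is hypothetical, while
# the actual instrument retained nine collaboration items (reported alpha .826)
collab_items <- c("q01", "q07", "q12")
alpha(final_items[, collab_items])
```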
Additionally, it is important to note that some items loaded significantly but not on their predicted factor, and some of these items were retained. The constructs of the four Cs of 21st century skills are so closely related that it is challenging to separate one accurately from another. For example, some survey items described activities or ideas that could be considered either creativity or critical thinking; the line blurs between when an idea or approach reflects critical thinking rather than creativity. This ambiguity among the constructs, and the relationships between them, provides further rationale for this type of research. Running a factor analysis on the survey items allows respondents themselves to help define the boundaries of the constructs.
This research study focused on the development of a survey instrument for 21st century skills as defined by the P21 framework. The researchers set out to develop a self-reporting 21st century skills instrument to measure students' assessment of their own abilities in creativity, critical thinking, communication, and collaboration. Using exploratory factor analysis, we sought to identify the underlying relationships between the measured variables (the four Cs of 21st century skills). Exploratory factor analysis was appropriate for the development of a survey instrument because it helps ensure that the instrument measures the four constructs accurately and reveals the structure of the relationships between variables. The final results yielded a 30-item self-reporting 21st century skills survey that provides a baseline assessment of students' 21st century skills. The authors suggest the following additional research for further investigation:
Conduct a study using confirmatory factor analysis to validate the instrument.
Use the 21st century skills survey to compare student responses with teacher formative and summative assessments of students' 21st century skills capabilities.
Apply the 21st century skills survey as a pre/post assessment measure of 21st century skills growth and achievement.
Due to the growing emphasis on student achievement in 21st century skills 1, 2, 3, 4, an instrument designed to assess these constructs will be a valuable measure for any educational program seeking to promote these skills. This research began because of a void of 21st century skills instruments that can measure achievement; it is the hope of the authors that this assessment tool will help fill that void.
Elements of this paper are supported by the National Science Foundation, award #DRL-1513248. Any opinions and findings expressed in this material are those of the authors and do not necessarily reflect the views of NSF.
[1] Partnership for 21st Century Skills [P21]. (2009). P21 framework definitions. Retrieved July 10, 2019 from https://www.battelleforkids.org/networks/p21.
[2] Organization for Economic Cooperation and Development [OECD]. (2005). The definition and selection of key competencies: Executive summary. Paris, France: OECD.
[3] American Association of School Librarians. (2018). AASL standards for the 21st century learner.
[4] National Governors Association. (2010). Common core state standards. Washington, DC.
[5] Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3(1), 11.
[6] Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, England: Cambridge University Press.
[7] National Academy of Engineering and National Research Council [NAE & NRC]. (2009). Engineering in K-12 education: Understanding the status and improving the prospects. Washington, DC: National Academies Press.
[8] National Research Council [NRC]. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
[9] Boss, S. (2013). PBL for 21st century success: Teaching critical thinking, collaboration, communication, and creativity. Novato, CA: Buck Institute for Education.
[10] Buck Institute for Education. (2019). 9-12 presentation rubric. Retrieved from https://my.pblworks.org/search/resource-document.
[11] Paulhus, D. L., & Vazire, S. (2007). The self-report method. Handbook of Research Methods in Personality Psychology, 1, 224-239.
[12] Pett, M. A., Lackey, N. R., & Sullivan, J. J. (2003). Making sense of factor analysis: The use of factor analysis for instrument development in health care research. Thousand Oaks, CA: Sage.
[13] Thompson, B. (2007). Exploratory and confirmatory factor analysis: Understanding concepts and applications. Applied Psychological Measurement, 31(3), 245-248.
[14] Cerny, B. A., & Kaiser, H. F. (1977). A study of a measure of sampling adequacy for factor-analytic correlation matrices. Multivariate Behavioral Research, 12(1), 43-47.
[15] Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7), 1-9.
[16] Litwin, M. (1995). How to measure survey reliability and validity. Thousand Oaks, CA: Sage.
[17] Browne, M. N., & Keeley, S. M. (1998). Asking the right questions: A guide to critical thinking (5th ed.). Upper Saddle River, NJ: Prentice Hall.
[18] Burns, N., & Grove, S. K. (1993). The practice of nursing research: Conduct, critique and utilization.
[19] Mason, E. J., & Bramble, W. J. (1997). Research in education and behavioral sciences. Chicago: Brown & Benchmark.
[20] DeVellis, R. F. (2003). Scale development: Theory and applications. London: Sage.
[21] Levesque-Bristol, C., & Cornelius-White, J. (2012). The public affairs scale: Measuring the public good mission of higher education. Journal of Public Affairs Education, 18(4), 695-716.
[22] Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
[23] Salant, P., & Dillman, D. A. (1994). How to conduct your own survey. New York: John Wiley and Sons.
[24] Stronge, J. H., Grant, L. W., & Xu, X. (2017). Designing effective assessments. Bloomington, IN: Solution Tree Press.
[25] Wah Chu, S. K., Reynolds, R. B., Tavares, N. J., Notari, M., & Yi Lee, C. W. (2017). 21st century skills development through inquiry-based learning: From theory to practice. Singapore: Springer Nature.
[26] Williams, B., Onsman, A., & Brown, T. (2010). Exploratory factor analysis: A five-step guide for novices. Australasian Journal of Paramedicine, 8(3).
Published with license by Science and Education Publishing, Copyright © 2019 Todd R. Kelley, J. Geoff Knowles, Jung Han and Euisuk Sung
This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit
https://creativecommons.org/licenses/by/4.0/