Research Article
Open Access Peer-reviewed

Using Performance Task-GRASPS to Assess Student Performance in Higher Education Courses

Nuha Iter
American Journal of Educational Research. 2017, 5(5), 552-558. DOI: 10.12691/education-5-5-12
Published online: June 01, 2017

Abstract

This qualitative study explores students' ability to integrate knowledge and skills and demonstrates how students utilize skills in real-world situations through performance tasks using the performance task-GRASPS (Goal, Role, Audience, Situation, Product/Performance, Standards) model. This study was carried out in the Introduction to Education course with 44 students in a teacher qualification program. Interviews, performance task-GRASPS reports, student focus group conversations, student reflections, and student products were used. Grounded theory was employed to analyze the qualitative data. Findings show that students articulated many educative features, including their views and beliefs about performance tasks and authentic assessment. Students also came to understand their abilities through their products and reports about their roles in real-life situations. In addition, students demonstrated what they achieved and developed by themselves, and they felt happy and enjoyed their roles in real-life situations. The students reported that the evaluation method improved their self-confidence. Diversity was observed among the products and performances; students addressed the same challenges differently. This method develops the performance of university professors in authentic assessment by establishing performance tasks and using various rubrics to assess various products. These findings indicate that teacher educators should use authentic assessments and performance tasks to make courses interactive for students and should use rubrics in evaluation that give students a real description of their performances.

1. Introduction

Students should be knowledge producers, not merely knowledge consumers or keepers. The question is how this can be achieved. Allowing students to experience challenges in real-life situations and to solve these problems enables them to produce solutions, manage situations, and develop different perspectives. These approaches integrate knowledge and skills in various ways. Thus, methodologies must be developed to assess well-being from the personal perspective of children and young individuals that can contribute useful, relevant, and reliable data; these data can then be a basis for policy formulation, implementation, and evaluation (O'Toole & Kropf, 2012). Authentic assessment recognizes human rights because this type of assessment is suitable for evaluating inner diversity, intelligences, abilities, and learning styles. Motivational theories such as Vroom's expectancy theory (1964) posit that individuals are motivated by three beliefs. First, individuals must feel that their level of effort will lead to a corresponding level of performance. Self-efficacy, or a person's belief in their own ability to achieve a desired goal (Gist & Mitchell, 1992), is the most critical component of this model. Individuals without a high level of self-efficacy will not be motivated. The second component of expectancy theory links motivation and outcome; the most critical implication for learning is ensuring that performance is measured fairly and accurately, minimizing bias. Expectancy theory also suggests that individuals must value the reward. Whether a student actually values a high grade may be out of the educator's control; nevertheless, this insight may help diagnose a particularly low level of motivation [11]. Adams's equity theory (1965) posits that people maintain a fair relationship between performance and rewards in comparison with others. Adams (1965) states that views of justice are related to inputs, outputs, and social comparison.
Inputs are contributions that are used to obtain a certain type of return on a personal investment; contributions can involve time, effort, skills, and determination [13]. Assessment is a rich source of feedback for students. Authentic assessment is effective because it allows an educator to provide positive feedback in a more motivational form than the usual numerical grade on a test (Litchfield, Mata, & Gray, 2007). For most university instructors, authentic assessment is a radical paradigm shift from teacher-centered to student-centered teaching and learning. The majority of instructors continue to teach the way they were taught, namely, via lectures and objective tests. No single form of teaching or assessment is sufficient to adequately teach a subject or measure learning progress and student performance. Most university courses consist primarily of three components: lectures, traditional assessments, and assignments. Assignments lag behind and are often few during a semester given that standard objective testing is often used; a course typically comprises a midterm and a final exam [9]. Wiggins and McTighe (1998) stated that "evaluation, assessment, performance tasks, and other acceptable evidence" are used for evaluative purposes, and their linear model goes directly from desired results to determining acceptable evidence. Determining acceptable evidence involves assembling sets of assessment methods, such as (a) performance tasks or projects; (b) quizzes, tests, and academic prompts; (c) informal observations and discussions; and (d) student self-assessments [4]. Backward design theory largely functions as an assessment model, in which the politics of those in the areas of measurement and psychometrics are prioritized and thus legitimated. Backward assessment criticizes traditional paper-and-pencil tests, including standardized tests.
Backward theory posits that classroom teachers should be aware of the potential for student engagement as part of their design considerations [4]. The hallmark of backward curriculum theory is its great emphasis on assessment and its focus on how student learning has increased. In this backward planning model, assessment is prioritized, and teachers are seen as assessors as opposed to developers (Wiggins & McTighe, 1998). A performance task is aligned with one or more desired results and yields appropriate evidence of the identified understanding. Involving complex, real-world (i.e., "authentic") applications of the identified knowledge, skills, and understanding written in the goal, role, audience, situation, and product (GRASPS) form allows students to demonstrate understanding with some options in the performances and/or products. The performance task is meant to assess, and requires, one or more of the six facets of understanding. The scoring rubric includes distinct traits of understanding and successful performance and highlights what is appropriate given the evidence needs suggested by the desired results [18]. Performance tasks using GRASPS are outlined below. The culminating activities that the students produce are the products based on the goal of a performance task. Each task contains between five and eight products that represent cross-curricular topics.

Performance task-GRASPS is a design tool for developing a performance task with an emphasis on context and role playing. The acronym stands for the steps in the process: goal, role, audience, situation, product/performance/purpose, and standards, the criteria developed for success. The GRASPS design tool includes a stem statement with which a teacher can construct a scenario for a performance task (Mayes & Myers, 2015). McTighe [12] noted that through the Defined STEM performance task editor, a teacher can edit a task, remove or add products, or upload other pertinent information to the task. Rubrics are designed for each task and for each type of product. The GRASPS frame includes real-world goals, meaningful roles for students, an authentic or simulated real-world audience, and a contextualized situation that involves real-world applications. Students generate culminating products and performances, and consensus-driven performance standards (criteria) are used to determine success. Performance tasks with these features provide meaningful learning targets for learners, worthy performance goals for teaching, and the kind of evidence needed to assess true understanding (Tomlinson & McTighe, 2006). Biggs and Tang [1] present a standards model of assessment "designed to assess changes in performance as a result of learning, for the purpose of seeing what, and how well, something has been learned. Such assessment is criterion-referenced (CRA); that is, the results of assessment are reported in terms of how well an individual meets the criteria of learning that have been set." In this study, these tasks show how a student uses math and science in a real-life situation rather than just providing information on the student's theoretical knowledge. A performance task rates a student's learning process, and assessing both product and process provides an accurate profile of a student's ability and makes students value their work processes and products.
This paradigm of assessment gives students an opportunity to apply self-monitoring, self-reflection, and self-evaluation using rubrics and reflection journals. Similarly, a review of significant assessment studies by Frey et al. [7] indicates the following characteristics that promote learning-oriented assessment and employability:

1. Tasks should be challenging, demanding higher-order learning and the integration of learning from both the university and other contexts, such as work-based settings;

2. Learning and assessment should be integrated; assessment should not come at the end of learning but should be part of the learning process;

3. Assessment should encourage metacognition, promoting thinking about the learning process, not just the learning outcome;

4. Tasks should involve the active engagement of students, developing the capacity to find things out for themselves and learn independently;

5. Tasks should be authentic, worthwhile, and relevant, offering students some level of control over their work;

6. Tasks should be fit for purpose and align with important learning outcomes.

Assessment refers to the act of determining the extent to which the desired results have been achieved. Assessment is the umbrella term for the deliberate use of various methods of gathering evidence of meeting desired results, whether these results are state content standards or local curricular objectives. The collected evidence may include observations and dialogues, traditional quizzes and tests, performance tasks and projects, as well as student self-assessments gathered over time [19, 20]. Thus, assessment is a more learning-focused term than evaluation, and the two concepts should not be viewed as synonymous. Assessment is the process of providing and using feedback against standards for improvement and to meet goals. By contrast, evaluation is more summative and credential-related than assessment. We need not attach a grade (an evaluation) to everything we give feedback on. A central premise of our argument is that understanding can be developed and evoked only through multiple methods of ongoing assessment, with far greater attention paid to formative (and performance) assessment than is typical [18, 19, 20]. An extended performance task may develop into a project. A project, as described by Wiggins and McTighe (1999, 2004), "is an extended and complex performance task, usually occurring over a period of time. Projects usually involve extensive student inquiry culminating in pupil products and performances which are assessed using a variety of assessment tools." Tomlinson and McTighe show in their book that educators find that Understanding by Design addresses their need for "a model that acknowledges the centrality of standards but also ensures that students truly understand content and can apply it in meaningful ways"; educators find it increasingly difficult to ignore the diversity of the learners in their classrooms.
For many educators, differentiated instruction offers a framework to address learner variance as a critical component of instructional planning (Tomlinson, & McTighe, 2006).

2. Research Problem

Students in higher education suffer from the way they are assessed, as described by Tomlinson and McTighe (2006), who note that both teaching and learning have been redirected in ways that are potentially impoverishing for those who teach and those who learn. The traditional model, which includes a first exam, a second exam, and a final exam, is usually used to assess student achievement. These exams aim to determine what students know and how much they know. This type of assessment does not consider that students have different ideas about a single point, and exams do not give students insight into their mistakes. Educators struggle to assess different students in varied ways that measure their in-depth understanding of concepts and require students to utilize their skills. The research problem concerns changing the assessment paradigm from the traditional one to one that enables students to understand the content and apply it in meaningful ways, and finding an assessment frame that considers the diversity of students. "Because they allow students to construct or perform an original response rather than just recognizing a potentially right answer out of a list provided, performance assessments can measure students' cognitive thinking and reasoning skills and their ability to apply knowledge to solve realistic, meaningful problems" [5]. A new paradigm of assessment in higher education, performance task-GRASPS, is therefore used in this study to address the main problem of this research.

"Performance assessments are common in high-achieving countries, which have long relied on open-ended items and tasks that require students to analyze, apply knowledge, and write extensively" [5]; students therefore need to explore different ways to enhance their effectiveness in response to any task given to them. Students in this course are future teachers, and thus they must develop good assessment practices and learn the basis of formative assessment. The Introduction to Education course can support this approach by using a new paradigm of formative assessment and by integrating contextually situated evaluation activities that help professors and students develop and improve their assessment practices. Professors will learn alongside their students, whose productivity and engagement in the new assessment paradigm will improve.

3. Purpose of the Study

Assessment methods must mirror what is considered important; students "learn for life" at the same time as they focus on passing the course. Assessment methods govern what students learn and how they study (for summative or formative assessment), and the assessment a teacher plans will also affect when students study [6]. Biggs and Tang [1] note that backwash can be a positive force if the assessment method is constructively aligned with the learning outcomes and the teaching and learning activities. Accordingly, this qualitative study examines how students reflect on and interact with the new assessment paradigm based on performance tasks. A qualitative design was used to examine how performance tasks are used to assess student knowledge and skills in teacher qualification program courses and to determine how 44 students implement their performance tasks in real-life situations while recognizing the diversity in student products. The research specifically examined students' interaction with and reflections on the use of backward assessment. A qualitative research approach was used to generate detailed and in-depth data to answer the research questions. This method allowed the researcher to create an interpretive analysis of the students' interactions and reflections with the performance tasks.

All these aspects are crucial in the authentic assessment of performance-based tasks. Therefore, the research question is: how do students accept and react to the new paradigm of assessment and to their individual performance tasks?

4. Research Approach and Design

A qualitative design was used to examine how students accept and react to the new paradigm of assessment and to their individual performance tasks. The empirical data in this paper were obtained from a case study that develops evaluation processes in a teacher qualification program at Palestine Technical University in the Introduction to Education course. The case study was conducted when the teacher qualification program was introduced for the first time in the 2014–2015 academic year. The teacher qualification program offers teaching certification to individuals who have completed all requirements, obtained a bachelor's degree, and completed a teacher preparation program.

Many courses merely include a midterm and a final exam [9]. However, the prospective teachers in this program want to practice different approaches to assessing learning. Thus, various assessment processes had to be changed or improved to meet the needs of student teachers when they eventually become teachers in their own classrooms. The question here is what types of authentic assessment are respectful and meaningful. The shift from the traditional assessment paradigm to the authentic assessment paradigm (Figure 1) is described below.

In this case, the instructor designed performance tasks using the three following simple steps:

Step 1: Identify the goals (integrating knowledge and skills) that the pupils are expected to reach in each teaching unit.

Step 2: Set the tasks that will demonstrate the knowledge and skills that were developed.

Step 3: Develop explicit performance criteria and expected performance levels measuring pupils' mastery of skills and knowledge (rubrics). A rubric is a scoring tool that outlines the required criteria for a piece of work, or the important aspects to assess. It also indicates the weight of each criterion based on its relative importance to the overall task and describes what the performance would look like at different quality levels [18].
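The weighting of rubric criteria described above amounts to a simple weighted-average computation. The following sketch illustrates the idea; the criterion names, weights, and quality levels are invented for illustration and are not taken from the study's actual rubric:

```python
# Hypothetical weighted-rubric scoring sketch: each criterion carries a weight
# reflecting its relative importance to the overall task, and a rater assigns
# each criterion a quality level from 1 (lowest) to 4 (highest).
rubric_weights = {          # criterion name -> weight (weights sum to 1.0)
    "content accuracy": 0.4,
    "organization": 0.3,
    "language use": 0.2,
    "presentation": 0.1,
}

def overall_score(levels: dict) -> float:
    """Weighted average of assigned quality levels across all rubric criteria."""
    return sum(rubric_weights[criterion] * level
               for criterion, level in levels.items())

# Example: strong content, weaker language use.
score = overall_score({"content accuracy": 4, "organization": 3,
                       "language use": 2, "presentation": 3})
print(round(score, 2))  # 3.2
```

Because the weights encode relative importance, a weak performance on a heavily weighted criterion lowers the overall score more than the same weakness on a lightly weighted one.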

Performance task design:

The researcher is the teacher educator for the course in the teacher qualification program. She established a performance task to assess students' understanding of education, teaching, and learning in schools and to reform their perceptions of global teachers. The sample consisted of 44 students in the Introduction to Education course. All of them were female; 30 were pre-service science teachers and the other 14 were pre-service math teachers. They participated voluntarily in this research, though they had no prior experience with this type of task. The following steps were followed to implement this study:

1. The scale of the course grades was modified so that performance tasks accounted for 40% of the grade.

2. The steps for building these tasks were followed to address one real-life challenge, such as developing high-quality teachers who are able to change schools to meet global demands and 21st-century learning.

3. Table 1 lists the performance task demonstrated to the students.

4. The challenge, a real-life problem, was given to the students and discussed with them for 60 minutes.

5. The performance assessment task steps and template were given to the students and discussed with them, allowing them to share their ideas about each step.

6. The rubric related to the standards was distributed to the students for them to study while building their products.

7. The task was implemented individually and in groups.

8. Each student chose a suitable role based on her personality, knowledge, skills, and attitudes.

9. All students started working on their tasks with a high degree of responsibility.

10. The 44 students who attended the Introduction to Education course in the first semester of 2014/2015 finalized their tasks individually or in dyads.

11. One month was allotted to finalize the tasks.

12. Rubrics were used to evaluate the products, which differed from one another.

13. The students were given feedback according to the rubric, stating their strong and weak points.

5. Data Collection

The products, which were sent by all the students before the deadline, were classified based on the type of product, performance, and purpose of the task using two types of rubrics. The first, which is commonly used, is presented in Table 2. The second rubric suits the content of the products. The students were allowed to present their products, and all of them exhibited highly positive attitudes towards their work. The students knew what they were doing and the purpose of their work. They used passionate statements to express how much they appreciated their work, because they had chosen their role and the type of product for the first time and had finalized actual work of their own. Individual interviews were conducted to explore their experiences with this task, and the students were asked to write their statements. Questions included the following: How does this task differ from other tasks? Are you happy with this work? Why? How do you distinguish your work from others? What do you recommend? Data were collected from three types of sources, namely, the students' products, notes about their presentations, and their statements from the interviews.

6. Data Coding and Analysis

The data were initially coded from the different sources using words, phrases, and descriptive codes. To improve the validity and reliability of the research, a meeting was held between one faculty member in the teacher qualification program and a group of students to discuss the coding tables and the draft of the findings. Through the meeting, the final interpretation of the results was confirmed and agreed on. The students' interviews and reflections were analyzed using grounded theory, following these procedures: organization, familiarization, coding, and categorization of the words and products of the 44 students (pre-service teachers). The following statements are samples from the students' reflections:

• Tasneem said, “I’m very happy with this task because this is the first time I collected information by myself and expressed about my point of view.”

• Israa noted, “This is the first time I explored that I am able to write more paragraphs by myself.”

• Anwar said, “I’m happy because I practiced my role in a real life situation.”

• According to Mays, “This is the first time I wrote everything from my analysis and observations without using Google.”

• Layl said, “This is the first time that I feel that I understand everything, and I can discuss with others.”

• Jamila and Mariam stated, “It is amazing to write everything by ourselves, and this is the first time that we depended on our ability. We’ll repeat this experience.”

• Hadeel said, “I will not forget this challenge because I dealt with all the processes.”

• Duaa stated, “I’m interested and happy with my products even though it is not high quality, because it is the first time I worked by myself.”

• According to Muna, “This task enhanced my confidence.”
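The coding and categorization steps applied to reflections like those above can be sketched as a simple tally of descriptive codes grouped into broader categories. This is a hypothetical illustration of the procedure, not the study's actual coding scheme; the excerpts, codes, and category names are invented:

```python
from collections import Counter, defaultdict

# Hypothetical sketch of the coding/categorization step of grounded theory:
# each reflection excerpt has been assigned a descriptive code by the
# researcher; codes are tallied and excerpts grouped under each code.
coded_excerpts = [
    ("first time I collected information by myself", "self-reliance"),
    ("this task enhanced my confidence",             "self-confidence"),
    ("practiced my role in a real life situation",   "authenticity"),
    ("wrote everything by ourselves",                "self-reliance"),
]

code_counts = Counter()               # how often each code appears
categories = defaultdict(list)        # code -> supporting excerpts
for excerpt, code in coded_excerpts:
    code_counts[code] += 1
    categories[code].append(excerpt)

# The most frequent code suggests a dominant theme in the reflections.
print(code_counts.most_common(1))  # [('self-reliance', 2)]
```

In practice this tallying is only the mechanical part; interpreting the categories and agreeing on them, as done in the meeting described above, remains a human judgment step.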

7. Results

The following statements indicate the findings.

• 18 students practiced their roles in real-life situations, and 26 students practiced their roles through role playing.

• The student products included a poster (1), classroom sessions in different specializations (6), research reports (18), workshop reports (3), a university media lecture report (1), formal meeting minutes (1), a university president decision (a simulation by one student) (1), and action research reports (6).

• Their grades are distributed as shown in the following chart:

The figure above shows that the grades of the students are high, ranging between 28 and 38 points, and the majority of the students obtained 30 points.

• Most of the feedback given to the students was as follows:

- Errors of grammar and usage make the meaning unclear.

- Language style and word choice are ineffective and/or inappropriate.

- Introductions, transitions, and other connecting materials may be lacking or unsuccessful.

- Any abrupt transitions do not interfere with the intended meaning.

- Details are lacking.

- Information may include some inaccuracies.

• Students achieved most skills through the following tasks:

- Conducting interviews with teachers, deans, principals, parents, students, student teachers, and supervisors.

- Conducting classroom observations.

- Simulating characters such as the minister of education, the PTUK president, a dean, and a trainer.

- Analyzing collected qualitative data: points of view, opinions, and observations.

- Drawing conclusions.

- Building future visions and recommendations.

- Organizing reports.

• The attitudes of the students changed; by the time of completion, their opinions about the task and the challenge were positive.

8. Conclusion

1. Various rubrics can be used to assess different products.

2. Students said that they could make products without depending on the internet.

3. Students said that they felt happy and enjoyed their roles in real-life situations.

4. Students said that the evaluation approach improved their self-confidence.

5. Students took different roles to face a challenge and created reports about their experience.

6. Diverse products and performances were achieved.

9. Recommendations

1. GRASPS should be utilized to evaluate performance tasks in all courses.

2. Student recommendations must be considered in reforming the course content.

3. The results can be used as data to reform policies and practices related to the qualification program.

4. Teacher educators should use authentic assessments and performance tasks to make students interactive in their courses.

5. Using rubrics in evaluation provides students with real descriptions of their performances.

References

[1] Biggs, J., & Tang, C. (2011). Teaching for quality learning at university. New York: Society for Research into Higher Education & Open University Press.

[2] Brookhart, S. (2013). How to create and use rubrics for formative assessment and grading. Alexandria, VA: ASCD.

[3] Tomlinson, C. A., & McTighe, J. (2006). Integrating differentiated instruction and understanding by design: Connecting content and kids. Alexandria, VA: Association for Supervision and Curriculum Development.

[4] Cho, J., & Trent, A. (2005). "Backward" curriculum design and assessment: What goes around comes around, or haven't we seen this before? Taboo, 9(2), 105.

[5] Darling-Hammond, L., & Adamson, F. (2010). Beyond basic skills: The role of performance assessment in achieving 21st century standards of learning. Stanford, CA: Stanford University, Stanford Center for Opportunity Policy in Education.

[6] Elmgren, M., & Henriksson, A. (2014). Academic teaching. Estonia: Mediapool Print Syd AB.

[7] Frey, H., Ketteridge, S., & Marshall, S. (2015). A handbook for teaching and learning in higher education: Enhancing academic practice. Abingdon and New York: Routledge.

[8] Igbape, E., & Idogho, P. (2014). Performance evaluation model for quality assurance in Nigeria higher education. Lecture Notes in Engineering and Computer Science, 2213(1), 334-343.

[9] Litchfield, B., & Dempsey, J. (2015). Authentic assessment of knowledge, skills, and attitudes. New Directions for Teaching and Learning, 2015(142), 65-80.

[10] Luong-Orlando, K. (2003). Authentic assessment: Designing performance-based tasks for achieving language arts outcomes. Markham, ON; distributed in the U.S. by Stenhouse Publishers.

[11] Mann, S. (2011). Using findings from the performance appraisal literature to inform the evaluation of students in higher education. The Canadian Journal of Higher Education, 41(2), 1.

[12] McTighe, J. (2017, April 27). Performance task PD. Retrieved from Defined Learning: http://www.performancetask.com.

[13] Miles, S., Cromer, L., & Narayan, A. (2015). Applying equity theory to students' perceptions of research participation requirements. Teaching of Psychology, 42(4), 349.

[14] Prathap, G., & Ratnavelu, K. (2015). Research performance evaluation of leading higher education institutions in Malaysia. Current Science, 109(6).

[15] Mayes, R., & Myers, J. (2015). Quantitative reasoning in the context of energy and environment: Modeling in the real world. Rotterdam/Boston/Taipei: Sense Publishers.

[16] Szikora, P. (2015). The role and ineluctability of student performance evaluation in higher education. Managerial Challenges of the Contemporary Society. Proceedings, 8(2), 79.

[17] Wiggins, G., & McTighe, J. (2004). Understanding by design professional development workbook. Alexandria, VA: Association for Supervision and Curriculum Development.

[18] Wiggins, G., & McTighe, J. (2008). Understanding by design professional development workbook: UbD design guide worksheets, MOD M.

[19] Wiggins, G., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

[20] Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.
 

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

Cite this article:

Normal Style
Nuha Iter. Using Performance Task-GRASPS to Assess Student Performance in Higher Education Courses. American Journal of Educational Research. Vol. 5, No. 5, 2017, pp 552-558. http://pubs.sciepub.com/education/5/5/12
MLA Style
Iter, Nuha. "Using Performance Task-GRASPS to Assess Student Performance in Higher Education Courses." American Journal of Educational Research 5.5 (2017): 552-558.
APA Style
Iter, N. (2017). Using Performance Task-GRASPS to Assess Student Performance in Higher Education Courses. American Journal of Educational Research, 5(5), 552-558.
Chicago Style
Iter, Nuha. "Using Performance Task-GRASPS to Assess Student Performance in Higher Education Courses." American Journal of Educational Research 5, no. 5 (2017): 552-558.