Research Article
Open Access Peer-reviewed

A Comparison of Student Behavior and Performance between an Instructor-Regulated versus Student-Regulated Online Undergraduate Finance Course

Victor Wakeling, Patricia R Robertson
American Journal of Educational Research. 2017, 5(8), 863-870. DOI: 10.12691/education-5-8-5
Published online: August 10, 2017

Abstract

In a study conducted at a large, public university, the authors collected data to measure the relationship between student behavior and performance in an online undergraduate finance class under two different course formats: instructor-regulated versus student-regulated. The quantitative study indicated significant differences in the behavior of students given the self-regulated option, and these differences correlated with a deterioration in overall student performance. The study found that when students were given the flexibility to fully control course pacing, there was a statistically significant difference in their pattern of taking quizzes, especially missing quizzes entirely. These students also collectively exhibited statistically significantly lower overall exam scores. This suggests that some students either choose not to manage their time in a self-paced learning structure or have difficulty doing so.

1. Introduction

This study compares outcomes from an undergraduate finance course that is delivered online and is fully self-paced (i.e., student-regulated) with outcomes from the same course delivered in a group-based (i.e., instructor-regulated) format. The research compares these two course design options in order to evaluate student behavior and performance.

Most university courses are instructor-regulated. The professor leads the class in lockstep through a scheduled course curriculum parceled out over the semester. Learning is then measured through a series of scheduled, sequenced assessments occurring at intervals throughout the semester. In the instructor-regulated structure, the student is constrained by predetermined, communal deadlines with no flexibility to alter how the course is paced or when assessments occur.

The instructor-regulated structure has two shortcomings. First, adult students learn at different speeds [1]. Group-based instruction does not recognize the natural differences in adult paces of learning and tends to target the 'middle.' Second, college students have competing priorities for their time, such as course load, family/professional obligations, and lifestyle/social commitments. An alternative to the instructor-regulated structure is the student-regulated structure, in which each learner manages the pace and timing of content delivery [2]. Student regulation offers students the flexibility to customize their courses to accommodate their personal preferences, both to reflect their preferred speed of learning and their need to manage their individual schedules to dedicate time to learning.

In online courses, students typically work independently through the course curriculum by reading the textbook, viewing streaming and narrated videos, and completing other online assignments. Online courses allow students to dedicate time to a course based on individual preferences and schedules. Online classes might therefore appear to be intrinsically student-regulated. However, there is a caveat in how student-regulated is defined. While student-regulated learning implies that the learner is in full control of the course tempo, a course is not truly 'student-regulated' if all students enrolled in it move through the curriculum as a group. So, while it might seem counter-intuitive to characterize an online course as group-based, it usually is.

This is the difference between self-paced learning and a self-paced course. In the former, through the web-based learning management system (LMS), the online professor can still control how and when course content is made available to the entire group. For instance, the professor might publish and hide material and open and close exams on specific dates, meting out the modules across the semester (instructor-regulated), and/or reveal modules and exams only after the student completes previous ones in sequence (linear sequencing). The course design process necessitates deciding whether and how to use these technological tools to dictate student pacing [3]. In other words, the professor can control the timing and sequencing of the course, for the group, during the instruction period even though the course is online. With a self-paced course, the professor might upload and make available the entire course on day one (student-regulated). So, online courses can be designed as either group-based or self-paced. Diagram 1 illustrates the online classroom course structure options.

Regardless of when and how course content is made available, so long as the professor imposes restrictions on the timing and sequencing of assessments, through scheduled test dates or testing windows, the course is not self-paced. In fact, one university went out of its way to characterize its online classes as "not self-paced." Its website indicated, "There will be deadlines for assignments, threaded discussions, quizzes, tests and projects. You will have flexibility of working on your coursework as it fits in your daily schedule." [4] In other words, even though learning is asynchronous, assessments are usually synchronized if test-taking is restricted to narrow, prescheduled time intervals.

At this point we differentiate between self-paced learning and a fully self-paced course. We define a fully self-paced course as one that is structured such that the student has full control of the timing of both learning and assessment. Rather than being a passenger on a one-way road traveling at a constant speed with stop signs, the student steers the vehicle at any speed via any route. Therefore, we restate the online classroom course design options in Diagram 2.

Some universities do offer self-paced courses. Courses are offered as semester-based or self-paced, and students choose which structure they prefer. With semester-based, the course begins and ends concurrent with the university's semester-based calendar. Alternatively, self-paced (which we will refer to as 'time limit' in our illustrations) resides outside the standard university calendar, and the student has a specific amount of time (e.g., three months, six months, or one year) to complete the course. For the time-limit option, the student controls the start date, which starts the clock on the course and triggers the financial commitment.

The for-profit Capella University has advertised online programs with the student choosing between semester-based or self-paced options [5]. Capella calls the former (semester-based) GuidedPath. With regard to this option, the website reads: "With GuidedPath, you stay motivated and on track with pre-set deadlines. Assignments are due at the end of the week, with mid-week message-board discussions and with regular check-ins from program advisors." This option is therefore group-based. By contrast, Capella calls the self-paced option FlexPath. "With FlexPath, a class schedule is entirely controlled by you. Once you complete a course, whether that takes you two weeks or 12, you're free to move on to the next course without having to wait for a new term to begin. You can complete as many courses as you want, up to two simultaneously, each quarter for one flat fee [charged every 12 weeks]. The faster your pace, the more money you may save." Through FlexPath, Capella offers self-paced courses, but there is a finite length of time to complete each course (12 weeks).

A fully self-paced online course structure, where the student controls the timing of assessment, is the subject of this paper. This combination is highlighted in Diagram 3.

While we found several universities where online courses offered fluidity in the timing of learning and assessment, we did not find research on how this flexibility affected student behavior and performance. One study discovered that self-paced learning can improve memory performance [6]. A subsequent extension of this work showed that student self-pacing resulted in better recall performance [7]. Other research found that students allowed to choose the sequence of study had better exam performance [8]. Another study recommended that self-paced courses include learner control and self-direction in course design to increase the likelihood of successful completion [3]. Finally, a practitioner who researched the conversion of a course to self-paced concluded it had a positive effect on learning [9].

While studies have been conducted around group-based versus self-paced learning, the research cited above has not extended the definition of self-paced to include student control over the timing of assessments. Our empirical research includes the unique combination of a 15-week online university course that is fully student self-paced for the last half of the semester. Students had the opportunity to benefit from more time to learn, or could accelerate learning when preferred. Learning could be done at a steady pace or in bursts, and could be front-loaded or back-loaded. Students could also elect to follow the instructor-regulated schedule provided in the syllabus. Students controlled both the timing of learning and assessment with no constraints other than the beginning and end dates of the university's semester-based calendar. In our examination of a student-regulated course, the timing of test taking, in addition to the pace of learning, was at the student's full discretion until the end of the semester.

An online course is ideal to test these ideas. With an online course, the student is already learning in a remote, independent, self-study environment where they are free to utilize individual learning preferences. The nuance is that the student may take exams when they believe they have mastered the material, with no restrictions other than the conclusion of the instruction period (the end of the semester). We found no prior research on these specific criteria.

Self-paced learning shifts accountability to the student. "Intuitively, giving learners control over pacing of their own study seems the right thing to do. But is it really wise to give learners control?" [7] Online students are already the most vulnerable to failure since they are fully accountable for their own learning, with a limited learning community and no face-to-face time with classmates and professors. Interaction is limited to emails, posts, chat rooms, and discussion boards, thus removing the boundaries and structure that many students might need. "Many are used to the higher level of support typically available in a face-to-face classroom environment." [10] The same Capella University website also notes: "Some students have found it a little difficult to keep track of due dates with their online classes, because they don't attend a regular class with an instructor reminding them when things are due."

As intrinsic motivation to learn and expend effort decreases, procrastination increases [11]. Students lacking motivation, discipline, or time-management skills may struggle with the open-ended nature of a fully self-paced course, without pre-imposed deadlines that assist with planning and necessitate staying on track. There is a significant relationship between procrastination and academic performance [12]. Too much flexibility can lead to procrastination and failure to complete requirements, as students are prone to misassessing and mismanaging their learning [10, 13]. In the absence of a set schedule, some students may procrastinate, causing them to run short of time and either miss assessments or hastily complete assessments in a compressed time frame, likely to their detriment. The ability of students to self-regulate their learning is a key aspect of a well-run, self-paced classroom [9]. Students are rarely taught strategies for learning how to learn [13, 14].

Additionally, the setting and course structure (online, fully self-paced) require students to self-monitor progress; the professor is no longer in a position to ascertain if students are on track because there are no intermediate benchmarks against which to gauge progress. "The freedom and flexibility of self-paced learning programs can be a double-edged sword in terms of lack of accessibility of support mechanisms typically built into traditional learning environments." [10]

2. Materials and Methods

This paper compares student performance between an instructor-regulated versus student-regulated course structure. The study was conducted at a large, suburban, public university serving 33,000 students, 90% of whom are seeking undergraduate degrees and 21% of whom are seeking business discipline degrees. We selected the online course Personal Finance, an undergraduate, introductory survey, non-business elective. The course is non-technical and designed to develop an understanding of basic principles and techniques as they apply to personal income, spending, and investing.

Our research includes three sets of students in the control group and two sets of students in the study group. The control group includes 93 students from three semesters and the study group includes 65 students from two semesters. The groups were structured as follows:

• Control Group - The students participated in a group-based course structure. Specifically, not only did the syllabus outline the date ranges within which content should be completed, the professor imposed those same sequential date constraints in the LMS. Gradable components were sequential and opened/closed in the LMS within specific date ranges during which the student had to log in and complete them. This structure forced students to complete the instructor-paced course under linear sequencing.

• Study Group - For the first half of the course, the professor organized the syllabus and set up the LMS identically to the control group. At the mid-point of the semester, the instructor announced that, while the course schedule continued to be recommended, all modules and the gradable components had been 'opened' in the LMS and would remain so for the remainder of the semester. Further, students could follow any sequence they preferred. As in most courses, the material is progressive (linear sequencing), so sequenced completion was recommended. However, the LMS did not prevent students from working out of sequence.

• This converted the last half of the course to fully self-paced. It is the second half of the course, under the student-regulated format, that this study considers. Students could choose to front-load learning and complete all remaining course requirements early in the semester. Alternatively, students could choose to back-load learning and defer completion of all course requirements until the end of the semester. Or, they could follow the measured course pacing recommended in the syllabus, or stagger learning in phases or bursts. The student, not the professor, chose when and how to complete the remainder of the course, including testing. It is important to note that study group students were not made aware of the new schedule flexibility until it was announced midway through the semester, in order to minimize any potential enrollment biases. This protected the integrity of the study population so that students enrolled without self-selection, preventing some students from specifically choosing the course because they preferred this structure and/or deterring students from choosing the course because they disliked the structure.

Research Objective – The study aims to measure the behavior and performance of students under increased self-paced flexibility in an online course environment. The performance of the study group was analyzed against the control group to identify any significant differences in, first, the timing of taking quizzes and, second, quiz performance. Timing and performance data for 558 quizzes (instructor-regulated control group) and 390 quizzes (student-regulated study group) are compared using t-tests and chi-square tests. The research assesses whether scores change when students accelerate and/or delay taking quizzes under increased self-paced flexibility.

Subjects and Data Sources – Table 1 displays the semesters included in the research and the number of enrolled students per sixteen-week semester for the control group and the study group. The research includes data from five consecutive spring and fall semesters. The control group includes a total of 93 students from three semesters, while the study group comprises a total of 65 students from two semesters. All 158 students in the analysis completed the semester and received a grade (students who withdrew from the course were excluded).

Study Data and Course Structure Design – The study is based on data sourced from the same instructor teaching all five sections, covering identical topics, and using the same course materials, textbook, assignments, and quizzes/exams. The professor developed the course and administered it for the 10 years up to and including the semesters in the study. Course resources included narrated PowerPoint presentations, an array of outside reading and viewing assignments, and other assignments. In addition to consistency in the course learning resources and delivery, the assessments were identical as to topic, time length, and number of questions. Each quiz question was randomly assigned by computer from at least three alternatives in the same test bank.

The assessments for the course included (1) 12 individual quizzes, (2) two individual assignments, and (3) an individual comprehensive final exam. The quizzes, assignments, and final exam accounted for 57%, 10%, and 33%, respectively, of the course grade. Each quiz was associated with a separate learning topic. The quizzes were equally weighted and unevenly spaced (based on the amount of content covered and calendar considerations). No make-ups were permitted.
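As a point of reference, the grade weighting described above reduces to a simple weighted average. The sketch below illustrates the calculation with hypothetical component scores; only the weights are taken from the course design.

```python
# Hypothetical illustration of the course grade weighting described above:
# 12 equally-weighted quizzes (57%), two assignments (10%), and a final exam (33%).
quiz_scores = [82, 75, 90, 68, 71, 88, 0, 79, 85, 74, 0, 91]  # 0 = missed quiz (no make-ups)
assignment_scores = [92, 85]
final_exam_score = 78

quiz_avg = sum(quiz_scores) / len(quiz_scores)
assignment_avg = sum(assignment_scores) / len(assignment_scores)

course_grade = 0.57 * quiz_avg + 0.10 * assignment_avg + 0.33 * final_exam_score
print(f"Course grade: {course_grade:.1f}%")
```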

All quizzes were multiple-choice, administered online through the LMS, and ended at midnight on the deadline date. The quizzes ‘timed out’ at the announced time, which ranged from 30-50 minutes depending on the quiz. The instructor controlled the opening and closing timeframes to take quizzes. There was some overlap in the testing windows for chapters included in the same topic, which allowed students access to more than one quiz in the LMS at the same time. For example, student access to Quizzes 1, 2, and 3 had the identical start and end dates. Quiz grades were not curved.

For the study group, the first part of the course, which included the first six quizzes and the first assignment, was structured identically to the control group. After the sixth quiz, the professor announced changes to the course schedule. The study group was told that they no longer had intermediate, sequential deadlines for the remaining six quizzes, other than completing them by the end of the semester. Until then, sequential access dates and deadlines had prevented working ahead of the course schedule. Now, students learned they had individual discretion to advance or defer studying and to complete gradable components as late as the semester's end.

Quiz Availability – Table 2 presents information on the length of time quizzes were available in the LMS for students to access and complete. The timing and scores for Quizzes 7-12, now under the "flexible schedule," were used for this study to measure and compare student behavior and performance. In the three semesters for the control group, all 12 quizzes were 'opened' and 'closed' sequentially in the LMS throughout the semester as outlined in the syllabus. This meant students needed to complete quizzes in sequence and within a specific timeframe. Quizzes were often bundled in groups of two or three to fit the delivery of the course content to the semester calendar. For example, Quizzes 1 through 6 did not each have a separate 14-21 day completion timeframe; instead, some of the scheduled opening and closing dates coincided. For the study group, Quizzes 7-12 were all opened in the LMS at the same time at mid-semester and remained open for the remainder of the semester. Once opened, the last six quizzes were available for 50 total days, an access window more than twice as long as under the control conditions. The student-regulated study group now had discretion over when and in what order to prepare for and take the remaining six quizzes.
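To make the two availability designs concrete, the sketch below models them with hypothetical open/close dates (the actual dates and bundles are in Table 2); it illustrates the structure only and is not the course's real LMS configuration.

```python
from datetime import date

# Hypothetical open/close windows. Under the instructor-regulated design each
# quiz (or bundle of quizzes) has its own window; under the student-regulated
# design the last six quizzes share one ~50-day window ending with the semester.
instructor_regulated = {
    7: (date(2016, 10, 17), date(2016, 10, 30)),  # bundled with quiz 8
    8: (date(2016, 10, 17), date(2016, 10, 30)),
    9: (date(2016, 10, 31), date(2016, 11, 13)),
}
student_regulated = {q: (date(2016, 10, 17), date(2016, 12, 5)) for q in range(7, 13)}

def is_available(windows, quiz, today):
    """Return True if the quiz can be taken on the given date."""
    opens, closes = windows[quiz]
    return opens <= today <= closes

print(is_available(instructor_regulated, 9, date(2016, 10, 20)))  # False: window not yet open
print(is_available(student_regulated, 9, date(2016, 10, 20)))     # True: all six quizzes open
```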

3. Results

Student Behavior – This research measures whether and how students changed their timing behavior and quiz performance when provided with expanded discretion over the last half of the semester's schedule. Would students accelerate completing the course (taking some quizzes that would otherwise not yet have been opened) or slow completion of the course (completing some quizzes that would have otherwise already been closed)? Descriptive statistics and chi-square test results are presented in the sections below.

Table 3 presents the frequency with which a study group student took a quiz in advance of the original deadline published in the syllabus. Very few study group students chose to accelerate their studies to prepare for and take quizzes earlier.

Table 4 compares the frequency of total missed quizzes. A higher percentage of quizzes were missed in the study group than in the control group (16.9% versus 9.5%, respectively). Additionally, fewer students elected to conform to the original schedule's earlier deadlines for the last half of the course.

Table 5 presents data on individual students in the control and study groups who missed one or more of the last six quizzes. For Quizzes 7-12, a higher percentage of individual students in the student-regulated study group missed quizzes compared to the control group (47.6% versus 31.2%, respectively).

Table 6 displays the median timing of quiz taking, measured in days ahead of the quiz deadline. For the first half of the semester, both the instructor-regulated control and student-regulated study groups had the same instructor linearly-sequenced structure for Quizzes 1-6. The collective median for the control group (in days ahead of deadline that students took quizzes) was 0.44 days (10.5 hours ahead). For the student-regulated study group's last six quizzes, the collective median days ahead of deadline was 0.47 days (11.2 hours ahead).
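The "days ahead of deadline" metric used in Table 6 and Table 7 can be computed directly from submission timestamps. A minimal sketch is shown below, assuming hypothetical deadline and submission times rather than the study's actual LMS records.

```python
from datetime import datetime
from statistics import median

# (quiz deadline, time the student submitted the quiz) -- hypothetical records
records = [
    (datetime(2016, 10, 14, 23, 59), datetime(2016, 10, 14, 21, 30)),
    (datetime(2016, 10, 28, 23, 59), datetime(2016, 10, 27, 18, 0)),
    (datetime(2016, 11, 11, 23, 59), datetime(2016, 11, 11, 23, 10)),
]

days_ahead = [
    (deadline - submitted).total_seconds() / 86400  # 86,400 seconds per day
    for deadline, submitted in records
]
print(f"Median days ahead of deadline: {median(days_ahead):.2f}")
```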

Table 7 summarizes how far ahead of a quiz deadline control and study group students opted to take the last six quizzes. The timing data indicates that most students routinely wait to take quizzes near the deadline: 60.9% of control group students waited until the last 12 hours on the deadline day, compared to 44.6% of the study group students. While the category is measured as 0-12 hours, most students took the quiz toward the end of that range, in the last few hours. There was an increase in missed quizzes for the study group, at 16.9% versus 9.5% for the control group. If the two categories are combined (0-12 hours and missed quiz), the result is 70.4% for the control group and 61.5% for the study group. Study group students waited until the last few hours to complete quizzes, and many did not take the quiz at all. The data also shows that only 9.4% of control and 16.7% of study group students completed their quizzes at least 4.5 days ahead of the quiz deadlines.

A chi-square goodness-of-fit test was applied to analyze whether the study group's quiz-timing distribution matches the control group's timing pattern. The control group's timing distribution is used as the expected distribution. Table 8 presents this information. Missed quizzes contribute the largest share of the χ² statistic. The P-value is very small, so the null hypothesis that the study group's quiz timing distribution matches the expected distribution is rejected.

H0: The study group's distribution matches the expected distribution.

HA: The study group's distribution differs from the expected distribution.
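A sketch of this goodness-of-fit test using scipy.stats.chisquare is shown below. The category counts are approximations back-calculated from the percentages reported above and collapsed into four illustrative bins (Table 8 contains the actual figures), so the resulting statistic is illustrative only.

```python
import numpy as np
from scipy.stats import chisquare

# Timing bins (illustrative): took quiz 0-12 hrs before deadline,
# 12 hrs to 4.5 days before, 4.5+ days before, or missed the quiz.
control_counts = np.array([340, 112, 53, 53])  # approximated from 60.9% / 9.4% / 9.5%, N = 558
study_counts = np.array([174, 85, 65, 66])     # approximated from 44.6% / 16.7% / 16.9%, N = 390

# Expected counts for the study group, scaled from the control group's proportions.
expected = control_counts / control_counts.sum() * study_counts.sum()

chi2_stat, p_value = chisquare(f_obs=study_counts, f_exp=expected)
print(f"chi-square = {chi2_stat:.2f}, p = {p_value:.2g}")
```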

Student Performance – The evidence of student behavior was reviewed above. Next, we examine whether the performance on the last six quizzes for the study group was significantly different from that of the control group.

Table 9 displays the results of a t-test applied to the individual student quiz scores, assuming unequal sample variances. The study group's mean score (68.7%, N = 390) is 5.98 percentage points lower than the control group's (74.7%, N = 558). This difference is significant, with a two-tailed P (T<=t) of 0.00282. A significant result (two-tailed P = 0.00202) also obtains if the t-test is rerun assuming the samples have equal variance. The 5.98-point drop in the study group's mean quiz score raises the question of how the student grade distribution is affected. Each student's average grade over the last six quizzes was computed; this six-quiz performance cumulatively accounts for approximately 29% of the course grade. Each quiz average was then categorized as an 'A', 'B', 'C', 'D', or 'F' grade on the standard scale (90% and above = 'A', 80% to below 90% = 'B', etc.).

H0: The study group's quiz performance matches the control group's.

HA: The study group's quiz performance does not match the control group's.
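The comparison in Table 9 is a standard two-sample t-test with unequal variances (Welch's test). The sketch below shows how such a test could be run with scipy, along with the letter-grade binning used to build the grade distribution; the score arrays are randomly generated stand-ins rather than the study's data, so only the mechanics carry over.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
control_scores = rng.normal(loc=74.7, scale=20, size=558).clip(0, 100)  # hypothetical scores
study_scores = rng.normal(loc=68.7, scale=22, size=390).clip(0, 100)    # hypothetical scores

# Welch's t-test (unequal variances); rerun with equal_var=True for the pooled version.
t_stat, p_two_tail = ttest_ind(control_scores, study_scores, equal_var=False)
print(f"t = {t_stat:.3f}, two-tailed p = {p_two_tail:.5f}")

def letter_grade(avg):
    """Map a six-quiz average to a letter grade on the standard scale."""
    if avg >= 90: return "A"
    if avg >= 80: return "B"
    if avg >= 70: return "C"
    if avg >= 60: return "D"
    return "F"

print(letter_grade(study_scores[:6].mean()))  # grade for one hypothetical student's six quizzes
```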

Table 10 and Chart 1 present the distribution of each student's quiz grade for the last six quizzes for the control and study groups. A lower percentage of A's, B's, and C's is shown for the study group, along with a higher percentage of D's and F's. The largest grade shifts for the study group are an 8.9% reduction in B's and a 10.5% increase in F's.

Table 11 presents a chi-square test comparing the grade distributions of the control and study groups. The test value of 7.36 is less than the critical value χ²(4, 0.05) = 9.488, so the null hypothesis about the grade distribution is not rejected. However, though falling short of statistical significance (P = 0.118), Chart 1 and Table 11 display marginal negative slippage across the 'A' to 'F' categories for the study group.

H0: The study group's grade distribution matches the control group's.

HA: The study group's grade distribution differs from the control group's.
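This comparison amounts to a chi-square test on a 2 x 5 contingency table of grade counts. A minimal sketch with hypothetical counts follows; Table 10 holds the actual distribution, and only the group sizes of 93 and 65 students are taken from the study.

```python
import numpy as np
from scipy.stats import chi2, chi2_contingency

#                         A    B    C    D    F
grade_counts = np.array([[22, 28, 20, 10, 13],   # control group (hypothetical, N = 93)
                         [12, 14, 13,  9, 17]])  # study group (hypothetical, N = 65)

chi2_stat, p_value, dof, expected = chi2_contingency(grade_counts)
critical = chi2.ppf(0.95, df=dof)  # 9.488 for df = 4
print(f"chi-square = {chi2_stat:.2f}, df = {dof}, p = {p_value:.3f}, critical (0.05) = {critical:.3f}")
```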

4. Discussion

The higher incidence of missed quizzes in the study group versus the control group was an unfavorable consequence of study group behavior when students were allowed increased discretion over studying and test timing. Few students elected to conform to the original schedule's periodic deadlines for the second half of the course. The authors presume that studying for and taking quizzes while the material is fresh in the student's mind is the norm, and that a practice of studying multiple topics first and deferring multiple quizzes to a later date is uncommon. The authors believe the higher incidence of missed quizzes among the study group is linked to student mismanagement of their time.

Many students procrastinated and took their quiz on its deadline date. This is evident in the research and varies little with different lengths of access time. Indeed, the instructor regularly received emails from students who missed a quiz and requested extensions. The most common explanations were that they forgot about the quiz, got tied up late at work that night, and/or were distracted by family difficulties. On more than one occasion, a student reported having misread the calendar and thinking the quiz was due the following day. The timing pattern of quiz-taking supports an interpretation that procrastination is routine for the majority of students. The instructor had hoped that student learning would improve when students gained discretion over their schedules for the second half of the semester. Instead, a higher frequency of missed quizzes is not surprising when students, who previously exhibited a tendency to procrastinate, mismanage their schedules while studying multiple topics and taking multiple quizzes with a common end-of-semester deadline.

The grade distribution between the control and study groups supports a conclusion that the performance of many students is jeopardized when they are offered greater discretion in meeting learning deadlines.

5. Conclusion

This evidence-based study evaluated differences in behavior and performance when students were given increased control of their learning and quiz timeframes. This study included students enrolled in the same undergraduate finance course in consecutive semesters taught by the same professor using the same materials. At semester midpoint, study group students were allowed expanded time discretion to complete learning and assessments.

Given very similar environmental conditions, the comparison of the instructor-regulated, group-based structure with the fully student-regulated, self-paced structure indicates significant differences in student behavior. Student performance declined in the student-regulated structure. The sample mean score across the 390 quizzes taken by students in the study group dropped by a statistically significant 5.98 percentage points. In reaction to the opportunity, many students postponed taking quizzes and waited until the last possible day to complete their work. Under the study, a large number of students appeared to mismanage the flexible timeframe and missed quizzes entirely.

Acknowledgements

We wish to thank Leo MacDonald, PhD for his technical contribution to this project and Justin Cochran, PhD for his peer review guidance.

References

[1]  Goodlad, J., "Principles of Adult Learning," Best Practice Resources, 2005; Teaching and Learning, University of Wisconsin, 2008.
[2]  Gureckis, T.M. and Markant, D.B., "Self-Directed Learning: A Cognitive and Computational Perspective," Perspectives on Psychological Science, 7(5), September 2012.
[3]  Lim, J., "The Relationship Between Successful Completion and Sequential Movement in Self-Paced Distance Courses," International Review of Research in Open and Distributed Learning, 17(1), 159-179, January 2016.
[4]  https://www.iowalakes.edu/.
[5]  https://www.capella.edu/capella-experience/.
[6]  Tullis, J. and Benjamin, A., "On the Effectiveness of Self-Paced Learning," Journal of Memory and Language, 64(2), 109-118, 2011.
[7]  de Jonge, M., Tabbers, H., Pecher, D., Jang, Y., and Zeelenberg, R., "The Efficacy of Self-Paced Study in Multitrial Learning," Journal of Experimental Psychology: Learning, Memory, and Cognition, 41(3), 851-858, May 2015.
[8]  Carvalho, P., Braithwaite, D., de Leeuw, J., Motz, B., and Goldstone, R., "An In Vivo Study of Self-Regulated Study Sequencing in Introductory Psychology Courses," PLoS ONE, 11(3), 2016.
[9]  Highland, C., "Self-Paced Individualized Learning," Master's Paper, 2015.
[10]  Magill, D., "What Part of Self-Paced Don't You Understand?" 24th Annual Conference on Distance
[11]  Rakes, G.C. and Dunn, K.E., "The Impact of Online Graduate Students' Motivation and Self-Regulation on Academic Procrastination," Journal of Interactive Online Learning, 9(1), 2010.
[12]  Kyung, R.K. and Eun, H.S., "The Relationship Between Procrastination and Academic Performance: A Meta-Analysis," Personality and Individual Differences, 82, 26-33, 2015.
[13]  Bjork, R.A., Dunlosky, J., and Kornell, N., "Self-Regulated Learning: Beliefs, Techniques, and Illusions," Annual Review of Psychology, 64, 417-444, 2013.
[14]  Kornell, N. and Bjork, R.A., "The Promise and Perils of Self-Regulated Study," Psychonomic Bulletin & Review, 14(2), 219-224, 2007.

Creative Commons: This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit https://creativecommons.org/licenses/by/4.0/
