J. COLLEGE STUDENT RETENTION, Vol. 14(4) 549-566, 2012-2013

A LOW-MAINTENANCE APPROACH TO IMPROVING RETENTION: SHORT ON-LINE TUTORIALS IN ELEMENTARY STATISTICS*

CAROL SPRINGER SARGENT, PH.D., CPA, Middle Georgia State College, Macon, Georgia
A. FAYE BORTHICK, DBA, CPA, CMA, CISA, Georgia State University, Atlanta, Georgia
AMY R. LEDERBERG, PH.D., Georgia State University, Atlanta, Georgia
REGINE HAARDÖRFER, PH.D., Emory University, Atlanta, Georgia

ABSTRACT

The struggle to get weak students to use learning support services plagues virtually all retention programs (Friedlander, 1980; Hodges, 2001; Karabenick & Knapp, 1988; Moore & LeDee, 2006; Simpson, Hynd, Nist, & Burrell, 1997; Webster & Dee, 1998). This study presents a cost-effective form of supplemental instruction (SI): on-line tutorials offering ultra-short digital instruction, suitable for high-enrollment courses and designed to help underprepared or poorly motivated students pass their required courses. Participation among the low achievers was 39.3%, and use improved their exam scores by about half a letter grade over non-users and students in control sections, even without course credit for using the SI resource. Surveys revealed which aspects of the tutorials were most effective and valued.

*The authors gratefully acknowledge financial support from the Georgia State University Provost’s Office and assistance from Bill Nelson.

© 2013, Baywood Publishing Co., Inc.
doi: http://dx.doi.org/10.2190/CS.14.4.g
http://baywood.com

550 / SARGENT ET AL.

INTRODUCTION

Improving student retention is one of the leading challenges facing higher education (Hsieh, Sullivan, & Guerra, 2007; Tinto, 2006b). As funding from state and federal sources declines, the need to improve student retention in cost-effective ways increases. Extra instruction, if it can be provided with modest resources, is one promising answer to this challenge because, in virtually all its formats, it works, with effect sizes depending on the extent of the resource and the match of the resource to the learner (Congo & Schoeps, 1993; Simpson et al., 1997). This article presents a new cost-effective kind of supplemental instruction (SI), in the form of on-line tutorials: ultra-short digital instruction suitable for high-enrollment courses, designed to engage underprepared or poorly motivated students, the population most important to improving retention.

SUPPLEMENTAL INSTRUCTION

Colleges and universities initially responded to underprepared students by creating developmental programs to remedy student deficits. Programs designed for at-risk students discovered that these populations are least likely to participate (Karabenick & Knapp, 1988; Moore & LeDee, 2006; Stansbury, 2001), ask for help too infrequently (Good, Slavings, Harel, & Emerson, 1987; Hodges, 2001; Karabenick & Knapp, 1991), and generally participate only after failure has occurred (Keimig, 1983). The broadening of learning support services to include all students in a nonremedial bridge-to-mastery model has succeeded in many contexts, generally targeting high-risk courses and most notably reducing D and F grades. One of the most widespread programs, started at the University of Missouri-Kansas City with medical students (Martin & Arendale, 1992), provides voluntary peer-led small-group sessions concurrently with content courses.
Peer-tutored programs open to all comers have been demonstrated to be effective in numerous formats and domains all over the globe (Congo, 2002; Congo & Schoeps, 1993; Martin & Blanc, 2001; Topping, 1996), although the literature calls for better controls over academic ability and motivation in analyzing outcomes (McCarthy & Smuts, 1997; Robbins, Allen, Casillias, & Peterson, 2006). Enrolling students in SI, creating the materials, arranging the space and equipment for the sessions, and recruiting and training SI leaders annually requires an impressive level of resources. Despite the effectiveness of SI programs, few faculty participate in the creation, organization, and implementation of these programs (Tinto, 2006b); they have little incentive to get involved (Tinto, 2006a) and often do not understand the impact of SI on grades, study skills, and retention (Congo, 2002; Keimig, 1983). The reality on most campuses is that learning support programs are at the mercy of faculty willingness to entertain adjunct programs (Commander & Stratton, 1996). The SI innovation studied in this article, on-line tutorials, targets the most prominent concern, getting weak students to participate, while minimizing two common barriers: eliciting cooperation from content professors and the initial and on-going resources needed.

APPEALING TO THE LOW ACHIEVERS

The struggle to get weak students to use learning support services plagues virtually all retention programs (Friedlander, 1980; Hodges, 2001; Karabenick & Knapp, 1988; Moore & LeDee, 2006; Simpson et al., 1997; Webster & Dee, 1998). Even when participation is mandatory, as few as half of the weak students attend (Hodges, Dochen, & Joy, 2001). Discouraged and poorly motivated students show little interest in typical supplemental instruction programs, because such programs require considerable extra effort (Linnenbrink & Pintrich, 2002; Simpson et al., 1997) and the students have low confidence (self-efficacy) that effort will lead to success (Kim, Baylor, & Group, 2006). Further, in interviews, low achievers indicated that embarrassment about their academic weakness kept them from participating (Stansbury, 2001). Given that students may be both poorly motivated and reluctant to reveal their academic weaknesses, any instructional innovation intended to improve retention needs to address both sources of non-participation. This SI innovation was first developed for introductory accounting (Sargent et al., 2011). The current study used the same key features to create an SI intervention for an elementary statistics course.
KEY FEATURES OF THE ON-LINE TUTORIALS

Motivational Components

The on-line tutorials in this study aimed to attract poorly motivated or weak students by providing instruction that (1) delivers quick results with small effort, (2) maps directly to mastery of course goals (i.e., high task value) (Hsieh et al., 2007; Kim et al., 2006; Linnenbrink & Pintrich, 2002), and (3) minimizes the social embarrassment surrounding needing help (Keimig, 1983; Stansbury, 2001). The main attention-getting promise designed to attract less able students was learning efficiency: tutorials that deliver results in just three minutes. In addition, easy, discreet access (available on-line 24/7) minimized both the effort to participate (students could view tutorials anytime without notice) and the potential social stigma associated with participating.


Cognitive Components

Each three- to six-minute tutorial taught one or two central concepts, using clear everyday language, helping novices build mastery one nugget at a time (Ayres, 2006; Mayer, Mathias, & Wetzell, 2002; Sweller & Chandler, 1994). Novices are particularly vulnerable to flawed ideas and partial understandings (Smith, diSessa, & Roschelle, 1993). College students with low prior knowledge gain more from explicit instruction about misconceptions than from additional lectures (Korner, 2005; Muller, Bewer, Sharma, & Reimann, 2007). Therefore, the presentations in the on-line tutorials discussed why certain approaches and terms were deficient rather than just explaining the concept. The goal of the tutorials was not to entertain but to offer efficient learning. Instruction was voice-over PowerPoint, avoiding animation and other visual effects that create busy screens competing for limited short-term memory resources, a typical problem for novice learners (Clark, Nguyen, & Sweller, 2006).

Low Faculty Resistance and Involvement

On-line tutorials require neither grading nor coordination of facilities or SI leaders, so faculty need only make the resource known to students, which minimizes instructor involvement and resistance to implementation. Because these tutorials teach core concepts in everyday language, they match virtually any text selected by faculty, eliminating maintenance in future semesters and preventing their shelf life from being limited to the current text. Once created, they provide a perpetual resource without tracking or grading burden.

PRIOR STUDY

This SI innovation was first developed and offered to students in an introductory accounting class in response to a persistently low pass rate. The first tutorial implementation improved course pass rates from an average of 57.2% in the six terms prior to implementation to 79.8% in the six terms after implementation (Sargent et al., 2011).
Tutorial use during the first semester averaged 71.4% (61.0% for low achievers), even without course credit for their use. The high participation rates in the first implementation raised questions about which features of the tutorials appealed most to students, especially the low achievers, and whether tutorial use would persist if other course resources, such as a full-featured textbook Website, were available. Because the accounting study was a campus project to increase pass rates, it lacked an experimental design. The present study improved upon the accounting implementation: it was in a different content area, where extensive resources were already available, and it included random group assignment.


THE CURRENT INVESTIGATION

This article reports an implementation of digital SI created for a freshman elementary statistics course with a historical 20-30% failure rate, a course for which a vast array of extra instructional resources was already available to students (a math lab staffed by graduate students four days a week for one-on-one assistance and on-line resources from the textbook publisher, including quizzes, a diagnostic study manager, slides, outlines, and extra practice problems with worked-out solutions). The tutorials were loaded onto the campus learning management system for randomly selected sections of elementary statistics (intervention sections), but not for others (control sections). This created three groups: (a) control students, without access to tutorials; (b) users, students enrolled in the intervention sections who chose to view the tutorials; and (c) non-users, students enrolled in intervention sections who had access to the tutorials but chose not to use them. This study addresses two questions: (1) Would high-risk students (low achievers1) use the tutorials? (2) If high-risk students used the tutorials, would those users be less likely to drop the course and earn better exam scores than the other high-risk students?

METHODS

Participants

Participants were 1,411 students enrolled in spring 2009 in 31 sections of freshman elementary statistics, a math core elective required for business and science majors, at a large urban university with freshman SAT scores averaging 1050 and an average acceptance rate of 53%. Attributes of participants by users, non-users, and control students are summarized in Table 1. Users had significantly more college credit hours than non-users and control students (F = 3.79, p = .03), but the three groups did not differ significantly on any of the other attributes.

Procedure

All 31 sections of elementary statistics offered in spring 2009 were randomly assigned to either intervention or control conditions.
The first author visited all sections to obtain institutionally approved (IRB) consent and to describe the features of one learning support service provided by the department. For the intervention sections, this was the on-line tutorials; for the control sections, it was the math lab.

1 Low achievers were defined in this work as having a cumulative GPA in the bottom 25% of the class. As discussed in the Measures section, this was a cumulative GPA below 2.4 for our participants.

Table 1. Participant Attributes: Mean (SD)

Attribute                 Users^a          Non-Users        Intervention     Control
Number of participants    320              375              695              716
Percent female            58.4%            59.7%            59.1%            58.4%
SAT verbal^b              517.46 (70.40)   521.77 (77.03)   518.98 (74.30)   521.09 (74.84)
SAT math^b                525.17 (73.09)   527.30 (72.73)   526.41 (72.81)   527.06 (76.00)
Cumulative GPA^c          2.93 (0.65)      2.85 (0.81)      2.88 (0.74)      2.83 (0.78)
College credit hours*     69.40 (35.12)    62.11 (35.12)    65.47 (35.56)    65.74 (34.33)

a Opened two or more tutorials during term.
b Excludes transfer students, for whom SAT scores are not required (n = 430: 54 non-users, 138 users, and 238 control).
c Includes 11 transferred students who withdrew from all their classes, leaving no GPA, so their transfer GPA was used.
* p < .05.
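As a quick consistency check, the Intervention column of Table 1 should equal the enrollment-weighted average of the Users (n = 320) and Non-Users (n = 375) columns. A minimal Python sketch (values transcribed from the table; this check is illustrative and not part of the original analysis):

```python
# Consistency check: the Intervention column of Table 1 should be the
# enrollment-weighted average of the Users and Non-Users columns.
# Values transcribed from the table; illustrative only.
n_users, n_nonusers = 320, 375
n_total = n_users + n_nonusers  # 695 intervention participants

def weighted(users_value, nonusers_value):
    """Enrollment-weighted mean of the two intervention subgroups."""
    return (n_users * users_value + n_nonusers * nonusers_value) / n_total

credit_hours = weighted(69.40, 62.11)  # table reports 65.47 for Intervention
pct_female = weighted(58.4, 59.7)      # table reports 59.1% for Intervention

print(round(credit_hours, 2))  # 65.47
print(round(pct_female, 1))    # 59.1
```

Both recomputed values match the printed Intervention column, which supports the column assignments in the reconstructed table.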


A set of 21 tutorial files was loaded into each intervention section’s learning management system: 15 core concept tutorials (see topics in Figure 1) and six practice exams containing multi-chapter practice. Students did not receive any course credit for tutorial use and were not aware that the experiment tracked their tutorial activity. During the last week of the semester, the first author again visited all sections and asked students to complete a survey on course resources used and course goals.

Tutorial Contents

The first author selected the 15 tutorial topics (see Figure 1) and created the tutorial contents based on interviews with seasoned statistics faculty and the supervisor of the campus math lab, a review of the course syllabus, and an examination of major elementary statistics textbooks. In the interviews, she probed the content experts for central topics that affected cumulative understanding, common misconceptions, traditionally difficult areas, and helpful strategies. All interviewees named eight of the 15 final topics, suggesting general agreement on central topics and areas of difficulty. The script and slides, spanning three to six minutes in length, taught terms and procedures using everyday language, minimizing the need for prior learning and long attention spans. The instruction included insights about common misconceptions, strategies for learning and remembering, and practice problems to reinforce the concepts. A graduate student recommended by the math lab supervisor reviewed each tutorial for errors, technical correctness, and instructional clarity. The six practice exam tutorials included a series of practice problems with worked-out solutions but no content instruction.

Measures

Tutorial Use

This work defines a “user” as a student who viewed two or more tutorials. A review of tutorial launches showed that 82 students merely previewed the tutorials, viewing only one; previewers were not counted as users. The length of time a file was open was not captured, so the study defined “use” as the number of file launches (i.e., number of views) rather than minutes viewed.

Math Aptitude

Math aptitude was measured with Math SAT scores, which were available for students entering the university as first-term freshmen (n = 981, 69.5%) but not for students transferring from other institutions (n = 430, 30.5%).

Figure 1. Number of tutorial views by topic in order of course syllabus.


Achievement

Cumulative grade point average (GPA) measured academic achievement. The GPA cut-offs separating students into low and high achiever groups were chosen to place approximately 25% of the participants in the low group and 25% in the high group. The cut-off for the low group was 2.4 or less (25.3% of the students), and the cut-off for the high group was 3.4 or higher (26.3% of the students). The correlation between GPA and Math SAT was low enough to permit including both variables in the same model (Pearson correlation = .174).

Average Exam Grades

Students took three or four exams during the semester and a cumulative final exam. Average exam grades were the average across all exams taken. Instructors wrote their own exams, but the department specified that all exams include at least 50% worked problems. All but one instructor indicated that they took some or all of their exam questions from the textbook publisher’s test bank.

RESULTS

The data met the basic assumptions of normality, with no range restrictions or outliers noted. The computer grade file for one control section was unreadable, so the 47 students in that section were omitted from the analyses. Because users had significantly more college-level credit hours than non-users and control students, cumulative credit hours earned was included as a control variable in the analysis of exam scores.

Participation

In intervention sections, 39.3% of the low achievers used the tutorials (Table 2). A greater proportion of middle achievers used tutorials (51.0%) compared to both low achievers (39.3%) and high achievers (43.1%), χ2(2, N = 695) = 7.15, p = .03, but the average number of views was higher for low-achieving users than for middle- and high-achieving users, F(2, 317) = 3.01, p = .05. Use of tutorials by topic in semester sequence (see Figure 1) shows an initial surge of interest followed by strategic use by topic rather than consistent use across the term. Students viewed practice exams less often than instructional topics, with the typical user viewing 2.49 practice exams. The average number of practice exam views did not differ across achievement levels, F(2, 81) = 0.45, p = .64.

Impact of Tutorial Use on Drop Rate and Grades for Low Achievers

For low achievers, the drop rate for users (18.2%) was significantly lower than the drop rate for both non-users (37.3%) and control students (33.3%),


Table 2. Participation by Achievement Level for Students in Intervention Sections: Mean (SD)

                                                        Cumulative GPA
Attribute                                  Low: < 2.4     Middle         High: > 3.4
Participants in intervention sections      168            339            188
Participants using tutorials^a             66             173            81
Percent using tutorials                    39.3%          51.0%          43.1%
Percent of users who used practice exams   31.3%          29.2%          26.1%
Average number of views per user*          13.03 (12.5)   10.15 (9.2)    9.16 (8.9)

a Opened two or more tutorials during term.
* p < .05.

χ2(2, N = 357) = 7.26, p = .03. Low-achieving users had a significantly higher pass rate (63.7%) compared to both low-achieving non-users (36.3%) and low-achieving control students (44.4%), χ2(2, N = 357) = 12.33, p = .002. Figure 2 presents the grade distribution for low achievers by group. A regression analysis with average exam scores as the dependent variable, tutorial use as the predictor, and cumulative credit hours earned, cumulative GPA, and Math SAT as control variables showed that tutorial use significantly predicted average exam scores for low achievers, t(4) = 2.41, p = .02. The practical significance of this finding was moderate: 0.379 points higher per tutorial view. Since the average number of views was 13.03 for low achievers (see Table 2), this translates into an average of 4.94 points per exam, about half a letter grade.

Persistence Rate

As a post hoc test, we reviewed the re-enrollment rate the week after the final exam.2 Users re-enrolled at a significantly higher rate (51.4%) than non-users and control students (48.6%), χ2(1, N = 695) = 5.43, p = .02.

Survey Results

Table 3 lists the numbers of students who completed the end-of-course surveys. There was a significant difference among users, non-users, and control students in

2 Students could still enroll after this date for the following semester. This was the date of the system download from the institutional records for this study.
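The chi-square statistics reported for the low achievers can be reproduced from counts back-calculated from the percentages (12, 38, and 63 drops and 42, 37, and 84 passes among 66 users, 102 non-users, and 189 control students; these counts are inferred from the reported rates, not given in the text). A minimal standard-library Python sketch, using the fact that a chi-square variable with 2 degrees of freedom has survival function exp(-x/2):

```python
import math

# Counts back-calculated from the reported rates for the 357 low achievers
# (66 users, 102 non-users, 189 control); inferred for illustration only.

def chi2_test(table):
    """Pearson chi-square test of independence on a contingency table.
    The p-value is exact only for df = 2, where sf(x) = exp(-x/2)."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    chi2 = sum((obs - rt * ct / n) ** 2 / (rt * ct / n)
               for row, rt in zip(table, row_tot)
               for obs, ct in zip(row, col_tot))
    df = (len(table) - 1) * (len(table[0]) - 1)
    p = math.exp(-chi2 / 2) if df == 2 else None
    return chi2, p

drops = [[12, 54], [38, 64], [63, 126]]   # [dropped, stayed]: users, non-users, control
chi2_drop, p_drop = chi2_test(drops)      # reported: chi2(2, N = 357) = 7.26, p = .03

passes = [[42, 24], [37, 65], [84, 105]]  # [passed, failed]: users, non-users, control
chi2_pass, p_pass = chi2_test(passes)     # reported: chi2(2, N = 357) = 12.33, p = .002

# Translating the regression coefficient into exam points for low achievers:
# 0.379 points per view * 13.03 average views = about 4.94 points per exam.
gain = 0.379 * 13.03

print(round(chi2_drop, 2), round(p_drop, 2))   # 7.26 0.03
print(round(chi2_pass, 2), round(p_pass, 3))   # 12.33 0.002
print(round(gain, 2))                          # 4.94
```

Both statistics and p-values match the reported values, which supports the inferred group counts and the half-letter-grade arithmetic.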


Figure 2. Grade distribution by group for low achievers (cumulative GPA below 2.4).

Table 3. Course Resources Reported as Important to Achievement: Count (Percent)

Attribute                             Users         Non-Users     Control
Completed end-of-course survey        196           189           396
Percent completing survey             61.25%        50.40%        55.31%
Class lecture                         89 (45.5%)    127 (67.2%)   274 (69.2%)
Course notes provided by instructor   42 (21.4%)    48 (25.4%)    118 (29.8%)
On-line tutorials                     111 (56.6%)   NA            NA
Math Assistance Center (Math Lab)     22 (11.2%)    10 (5.3%)     28 (7.1%)
Office hours                          16 (8.2%)     17 (9.0%)     50 (12.6%)
Course textbook                       95 (48.5%)    98 (51.9%)    205 (51.8%)
Course website resources              14 (7.1%)     10 (5.3%)     18 (4.5%)


attendance rates on the last day of the semester, when surveys were distributed, χ2(2, N = 1,115) = 14.87, p = .001. Uneven attendance was concentrated, however, in the middle and high achievers; the attendance rate was even across all three groups for the low achievers, χ2(2, N = 315) = 5.56, p = .06. Of the students completing end-of-course surveys, 297 (37.9%) described themselves as “satisficers,” meaning they “worked only until they achieved a sufficient grade goal” or “worked to just pass.” Among the low achievers, 63.0% reported themselves as satisficers, a much higher rate than among the middle (43.4%) or high achievers (24.6%), χ2(4, N = 385) = 33.67, p < .001. The percentage of satisficers among users (39.0%), non-users (39.5%), and control students (36.6%) did not differ significantly, χ2(2, N = 781) = 0.586, p = .75. Student survey responses indicating the resources important to their achievement are shown in Table 3, and the percent of students reporting each tutorial feature as important is shown in Figure 3. Low-achieving users were more likely than their middle- and high-achieving counterparts to cite tutorials as important to their achievement when compared to other resources (see Table 4).

DISCUSSION

Even though at-risk students do not self-select into supplemental instruction at high rates (Blanc, DeBuhr, & Martin, 1983; Karabenick & Knapp, 1988; Kenney & Kallison, 1994), 39% of the low achievers used the tutorials without

Figure 3. Percent reporting feature as important to their course achievement.


Table 4. Low Achievers Only: Course Resources Reported as Important to Achievement: Count (Percent)

Attribute                             Users        Non-Users    Control
Completed end-of-course survey        31           23           70
Lecture                               14 (45.2%)   14 (60.9%)   46 (65.7%)
Course notes provided by instructor   0            0            0
On-line tutorials                     19 (61.0%)   NA           NA
Math Assistance Center (Math Lab)     3 (9.7%)     3 (13.0%)    6 (8.6%)
Office hours                          4 (12.9%)    3 (13.0%)    12 (17.1%)
Course textbook                       15 (48.4%)   14 (60.9%)   33 (47.1%)
Course website resources              1 (3.2%)     1 (4.3%)     3 (4.3%)

any course credit for their work, suggesting that low achievers are amenable to this form of convenient extra instruction. The low achievers’ use of this innovation translated into better grades, on average raising exam grades half a letter grade. The end-of-course survey reflected students’ enthusiasm for this resource. One of the goals for this work was to replicate the prior study, where the participation rate for low achievers (61.0%) was much higher than in this study (39.3%). Unlike the prior study, where the instructors reminded students about the tutorials periodically, statistics instructors were not asked to remind students, and only two of the 21 instructors previewed a tutorial, indicating their non-involvement with the resource. The average grades in this study were a full letter grade higher than in the prior work, likely reducing the need for participation. On average, low achievers opened 13.03 tutorials, equivalent to about 40-70 minutes of extra lecture, depending on the topics viewed. While some topics were not of interest (see Figure 1), the low achievers might have viewed more instruction had different topics been provided, potentially boosting the learning effect. Participation was higher at the start of the semester and then dropped significantly after the first exam, following the fourth chapter (see Figure 1). The drop-off could have resulted from students reaching their grade goals early in the semester, which diminished their motivation to learn more (Hsieh et al., 2007). A high percentage of students, 37.9% overall and 63.0% of low achievers, self-reported on the end-of-course survey that they were not trying to maximize their grade in this course. It is unlikely that the drop-off was


due to low perceived usefulness of the innovation, because tutorials were cited on student surveys as “important to course achievement” more often than any other course resource. Students at all levels of achievement who used the tutorials ranked them higher than other course resources on the course survey. Student enthusiasm for the resource may reflect a benefit to student well-being (Fries, Schmid, Dietz, & Hofer, 2005); that is, perhaps users spent less time studying to achieve their grade goals than they believed would be needed using other tactics. Users ranked the 24/7 availability as an important feature more often than any other feature, confirming that convenience is important to participants (McGuire, 2006). Users also reported high value for clarity (“clear,” “better instruction than the book”) and efficiency (“quick way to learn”), that is, learning either quicker or with less mental effort (Clark et al., 2006). The themes of convenience and efficiency confirm that students value resources that reduce the time investment needed to achieve their grade goals (Fries et al., 2005). The surprise in this ranking of important features was the relatively low rank of “short length” (fifth out of 10 qualities). Among the low achievers, short length ranked seventh out of 10 features, lower even than for middle and high achievers, with only 9 of the 121 (7.4%) who completed the end-of-course survey indicating that short length was important to them. Only 18 (< 1%) of users reported that the tutorials were motivating; students may already have had a reasonable level of motivation by the time they logged on to view them. From the students’ perspective, however, the tutorials’ primary value was developing the needed subject mastery quickly and conveniently, not giving them compelling reasons to move to another tutorial or work harder on course materials.
While this work does not provide insight into what might prompt non-users to use the SI, non-users were less experienced college students (fewer total credit hours) and were more likely to cite their lecture as useful. Understandably, the SI innovation may be used more frequently when students find the class instruction less effective. An alternate explanation for the survey results is that the clarity of the tutorials provided a comparison point that reduced students’ rating of the class lecture.

LIMITATIONS AND AREAS FOR FURTHER WORK

While this work asked students if they had taken a prior course in statistics, it did not include a pre-test of statistics knowledge. In addition, the measures of learning (exams) differed across all 21 instructors. Using a pre-test and department-wide exams (as post-tests) would have strengthened the study. Because of data limitations, this work defined “use” as the number of files opened. After the short lesson, the tutorials asked students to freeze the frame and attempt


the problems before reviewing the worked-out solutions. Students who stopped the tutorial and worked the sample problems before viewing the solutions spent much more time working the tutorials than those who just viewed the short lesson. Time spent viewing tutorials might be a better measure in future studies. The participation rate was lower than in the prior accounting study. It is not clear whether participation was lower due to better exam scores (less need for extra help), differences in instructor encouragement to use the resource, competition from other course resources, fewer tutorial topics available, or a combination of these factors. Lack of information from instructors concerning how the tutorials connected with course topics or how they might help students prepare for exams may have diminished student awareness of, or interest in, the resource. Future work might include more topics and some sections where instructors alert students to the resource. Other studies might track participation in more difficult courses (with lower grades), where the need for extra instruction is higher. Although short length was considered an important design feature of the tutorials, even low-achieving students did not cite short length as an important feature. Future work might investigate how participation and learning change with longer tutorials. In an attempt to please the researcher, students may have reported using resources they never used; 11 students reported on the survey that they used the tutorials when in fact they had not opened a file. Sending someone other than the principal investigator to the classroom to promote the innovation may reduce this effect.

CONCLUSION

This study presents good news about low achievers, a group critical to retention improvements. Low achievers used the tutorials as frequently as higher achievers, even without course credit for the effort. Not only did they use them, they were enthusiastic about them.
At-risk students rated tutorials higher in importance than any other course resource, and tutorial use translated into higher grades for low achievers, on average about half a letter grade. This work affords new insight about low achievers: they may be poorly motivated, but they respond to instructional resources that suit them, those that are clear, convenient, efficient, and confidential. This work revealed that 15 ultra-short tutorials and six practice exams, about 40-70 minutes of extra instruction, helped weak students improve their exam scores. Remarkably, low achievers benefitted without customization of the instruction or involvement of faculty in grading or promotion of the resource. And the survey results indicate that low achievers may be open to even more instruction. While more elaborate supplemental instruction models may achieve stronger results, this resource may provide a low-cost, low-maintenance way to bring up the lower end of the grade curve.


ACKNOWLEDGMENTS

For helpful comments, the authors are indebted to Nannette Commander, Daphne Greenberg, Shari Obrentz, Dennis Thompson, and an anonymous reviewer.

REFERENCES

Ayres, P. (2006). Impact of reducing intrinsic cognitive load on learning in a mathematical domain. Applied Cognitive Psychology, 20, 298-298.
Blanc, R., DeBuhr, L. E., & Martin, D. C. (1983). Breaking the attrition cycle: The effects of supplemental instruction on undergraduate performance and attrition. Journal of Higher Education, 54(1), 80-90.
Clark, R., Nguyen, F., & Sweller, J. (2006). Efficiency in learning: Evidence-based guidelines to manage cognitive load. San Francisco, CA: John Wiley & Sons, Inc.
Commander, N. E., & Stratton, C. B. (1996). A learning assistance model for expanding academic support. Journal of Developmental Education, 20(2), 8-14.
Congo, D. H. (2002). How supplemental instruction stacks up against Arthur Chickering’s 7 principles for good practice in undergraduate education. Research and Teaching in Developmental Education, 19(1), 75-83.
Congo, D. H., & Schoeps, N. (1993). Does supplemental instruction really work and what is it anyway? Studies in Higher Education, 18(2), 165-176.
Friedlander, J. (1980). Are college support programs and services reaching high-risk students? Journal of College Student Personnel, 21(1), 23-28.
Fries, S., Schmid, S., Dietz, F., & Hofer, M. (2005). Conflicting values and their impact on learning. European Journal of Psychology of Education, XX(3), 259-273.
Good, T. L., Slavings, R. L., Harel, K. H., & Emerson, H. (1987). Student passivity: A study of question asking in K-12 classrooms. Sociology of Education, 60, 181-199.
Hodges, R. (2001). Encouraging high-risk student participation in tutoring and supplemental instruction. Journal of Developmental Education, 24(3), 2-9.
Hodges, R., Dochen, C. W., & Joy, D. (2001). Increasing students’ success: When supplemental instruction becomes mandatory. Journal of College Reading and Learning, 3(2), 143-156.
Hsieh, P.-H., Sullivan, J. R., & Guerra, N. S. (2007). A closer look at college students: Self-efficacy and goal orientation. Journal of Advanced Academics, 18, 454-476.
Karabenick, S. A., & Knapp, J. R. (1988). Help seeking and the need for academic assistance. Journal of Educational Psychology, 80(3), 406-408.
Karabenick, S. A., & Knapp, J. R. (1991). Relationship of academic help seeking to the use of learning strategies and other instructional achievement behavior in college students. Journal of Educational Psychology, 83(2), 221-230.
Keimig, R. T. (1983). Raising academic standards: A guide to learning improvement. In J. D. Fife (Ed.), ASHE-ERIC Higher Education Report (Vol. 4, pp. 1-100). Washington, DC: National Institute of Education.

Kenney, P. A., & Kallison, J. M., Jr. (1994). Research studies on the effectiveness of supplemental instruction in mathematics. New Directions for Teaching & Learning, 60, 75-82.
Kim, Y., Baylor, A. L., & PALS Group. (2006). Pedagogical agents as learning companions: The role of agent competency and type of interaction. Educational Technology Research and Development, 54(3), 223-243.
Körner, C. (2005). Concepts and misconceptions in comprehension of hierarchical graphs. Learning and Instruction, 15, 281-296.
Linnenbrink, E. A., & Pintrich, P. R. (2002). Motivation as an enabler for academic success. School Psychology Review, 31(3), 313-327.
Martin, D. C., & Arendale, D. (1992). Supplemental instruction: Improving first-year student success in high-risk courses. Columbia, SC: Center for the Study of the Freshman Year Experience, University of South Carolina.
Martin, D. C., & Blanc, R. (2001). Video-based supplemental instruction. Journal of Developmental Education, 24(3), 12-19.
Mayer, R. E., Mathias, A., & Wetzell, K. (2002). Fostering understanding of multimedia messages through pre-training: Evidence for a two-stage theory of mental model construction. Journal of Experimental Psychology: Applied, 8(3), 147-154.
McCarthy, A., & Smuts, B. (1997). Assessing the effectiveness of supplemental instruction: A critique and a case study. Studies in Higher Education, 22(2), 221-231.
McGuire, S. Y. (2006). The impact of supplemental instruction on teaching students "how" to learn. New Directions for Teaching and Learning (Vol. 106, pp. 3-10). Hoboken, NJ: John Wiley and Sons, Inc.
Moore, R., & LeDee, O. (2006). Supplemental instruction and the performance of developmental education students in an introductory biology course. Journal of College Reading and Learning, 36(2), 9-20.
Muller, D. A., Bewes, J., Sharma, M. D., & Reimann, P. (2007). Saying the wrong thing: Improving learning with multimedia by including misconceptions. Journal of Computer Assisted Learning, 24, 144-155.
Robbins, S. B., Allen, J., Casillas, A., & Peterson, C. H. (2006). Unraveling the differential effects of motivational and skills, social and self-management measures from traditional predictors of college outcomes. Journal of Educational Psychology, 98(3), 598-616.
Sargent, C. S., Borthick, A. F., & Lederberg, A. R. (2011). Improving retention for principles of accounting students: Ultra-short online tutorials for motivating effort and improving performance. Issues in Accounting Education, 26(4), 657-679.
Simpson, M. L., Hynd, C. R., Nist, S. L., & Burrell, K. I. (1997). College academic assistance programs and practices. Educational Psychology Review, 9(1), 39-87.
Smith, J. P., III, diSessa, A. A., & Roschelle, J. (1993). Misconceptions reconceived: A constructivist analysis of knowledge in transition. The Journal of the Learning Sciences, 3(2), 115-163.
Stansbury, S. (2001). Accelerated learning groups enhance supplemental instruction for at-risk students. Journal of Developmental Education, 24(3), 20-27.
Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12(3), 185-233.
Tinto, V. (2006a). Enhancing student persistence: Lessons learned in the United States. Análise Psicológica, 1(XXIV), 7-13.


Tinto, V. (2006b). Research and practice of student retention: What next? Journal of College Student Retention, 8(1), 1-19.
Topping, K. J. (1996). The effectiveness of peer tutoring in further and higher education: A typology and review of the literature. Higher Education, 32, 321-345.
Webster, T. J., & Dee, K. C. (1998). Supplemental instruction integrated into an introductory engineering course. Journal of Engineering Education, 87(4), 377-383.

Direct reprint requests to:

Carol Springer Sargent, Ph.D., CPA
School of Business
Middle Georgia State College
100 College Station Drive
Macon, GA 31206-5145
e-mail: [email protected]