Applicative Personalized Learning

International Journal of Knowledge Society Research Volume 7 • Issue 4 • October-December 2016

Applicative Personalized Learning: How Gamification is Driving Learning

Leyla Zhuhadar, Western Kentucky University, Bowling Green, KY, USA
Phillip Coleman, Western Kentucky University, Bowling Green, KY, USA
Scarlett Marklin, Department of Sociology, Florida State University, Tallahassee, FL, United States

ABSTRACT
As technology has advanced, educational establishments face a question: are we investing enough funds and resources to provide college students with effective learning tools that help them succeed, not only by alerting them when they are at risk but also by offering personalized remedies while they study? Students have recently become familiar with recommendation systems through their experience with social media tools, but does education fit this paradigm? In this paper, the authors review recent advances in personalized learning technologies. They also examine the role of course redesign, in which instructors can teach students what they need to learn as effectively as possible while tailoring feedback to students based on automated performance dashboards at the highest level of granularity needed.

Keywords: Adaptive Learning, Applicative, Learning Analytics, Personalized Learning, Predictive Analytics, Prescriptive Analytics

DOI: 10.4018/IJKSR.2016100103
Copyright © 2016, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.

INTRODUCTION
Since 2011, the U.S. Department of Education (ED) and the U.S. National Science Foundation (NSF) have made increasing investments in education research and development, seeking to establish guidelines for improving the quality and pace of knowledge development in science, technology, engineering, and mathematics (STEM) education as well as other content areas to increase student achievement and engagement (Snyder & Dillow, 2015). The key focus areas of research are foundational, exploratory, design and development, efficacy, effectiveness, and scale-up research (Aud, Wilkinson-Flicker, Nachazel, & Dziuba, 2013). The rise of open/online distance learning has continuously changed the relationship between education and information and communication technologies. Education has been crucial to the success of communities in the 21st century and will continue to face challenges regarding population demographics, analytic methods, advanced technology, and limited funding. These changes will require the current educational system to continuously adapt (Kaplan, Slivecko, Gardner, & Turner, 2014). On one hand, students have witnessed a significant evolution in E-commerce recommender systems such as those of Amazon, Netflix, and Facebook. As early as 1994, Amazon began integrating various concepts, methods, and technical architectures of recommender systems into its E-commerce storefront (Zhuhadar & Nasraoui, 2010). On the other hand, for almost a decade, faculty members at a mid-western university have been working on designing a variety of intelligent systems, such as CaseGrader (Crews & Murphy, 2007) and HyperManyMedia (Zhuhadar, Nasraoui, & Wyatt, 2008), to personalize student learning experiences. In CaseGrader, Crews et al. used intelligent methods to provide personalized automated scoring to students based on their performance in solving mathematical or business problems within Microsoft Excel, whereas the HyperManyMedia platform provided recommendations to students based on their previous browsing activities. These recommendations are generated by artificially intelligent algorithms in which an ontology is defined and semantic web techniques are utilized to provide the most accurate recommendations to students based on their level of understanding within the course. For more details, refer to our previous research (Zhuhadar, Nasraoui, & Wyatt, 2007; Zhuhadar, Nasraoui, & Wyatt, 2009a, 2009b; Zhuhadar, Nasraoui, Wyatt, & Romero, 2009; Zhuhadar, Nasraoui, Wyatt, & Yang, 2010; Zhuhadar & Yang, 2012). As educators in higher education, we have noticed over the last 2-3 years that well-known publishing companies such as Pearson, McGraw-Hill, Wiley, and CengageBrain have begun to invest heavily in the design of intelligent personalized learning systems specifically for higher education. Within these systems, not only can faculty predict whether a student is at risk, but students can get instant notifications (alerts) about their performance on specific tasks, no matter the type (i.e., homework, quizzes, etc.). This alert mechanism could apply to a task as simple as answering a reading comprehension question or as complex as solving an involved equation. In this paper, a pilot project at a mid-western university's Information Systems Department entitled The Impact of Adaptive Personalized Learning on Student Outcome will be analyzed. In addition, detailed information about new trends in adaptive learning technologies in education, such as MindTap, CourseMate, and the Cengage SAM system, will be reviewed.
As a starting point, during the Spring semester of 2016, five faculty members were assigned to teach a course titled Principles of Management Systems in the Information Systems department. Two of the five faculty members decided to experiment with intelligent systems -- more specifically, with the concept of Adaptive Personalized Learning. At the end of this paper, some preliminary results of this study are discussed.

LITERATURE REVIEW
In 2007, the program Changing the Equation (CTE), designed by the National Center for Academic Transformation (NCAT) and supported by the Bill and Melinda Gates Foundation, was created specifically to address the attrition of community college students in introductory algebra and other math courses. This program redesigned remedial and developmental math sequence courses using instructional software such as MyMathLab or MyLabPlus, which requires lab attendance, participation, deadlines, monitoring of student progress, and intervention for those lagging behind. Today, online learning courses are popular with students whose schedules no longer fit traditional norms. With the increased enrollment in online learning, the field of learning analytics (LA) aims to optimize learning through examination of dynamic processes that occur within the student context (i.e., measurement, collection, analysis, and reporting). For example, Moissa, Gasparini, and Kemczinski (2015) present a systematic mapping of learning analytics in the MOOC context along with its perspectives and challenges.

Personalized Learning and Learning Analytics
While Feldstein, Hill, & Cavanagh (2015) note that the definition of personalized learning is not agreed upon, the definition matters when addressing how personalized learning can help students. Even though the term personalized learning is inadequate, it has enhanced student learning when used in tandem with properly implemented educational practices.
The field of LA provides key information that educators need to accurately describe the learning process and offers suggestions focused on students who need extra guidance in personalized learning (Moissa et al., 2015). The goal of LA is to enhance the learning process through analysis of student-generated data of various kinds (e.g., navigational data, demographics, environment) to address and change the way education is delivered to students. Moissa and colleagues (2015) describe three dimensions of learning analytics: what (the data collected), who (the stakeholders), and why (the reasoning behind the analysis). By using LA, one can monitor and analyze the actions of students, predict students' future actions/performance, and intervene where appropriate, as well as offer tutoring/mentoring resources for students who need them (Moissa et al., 2015). Furthermore, information and communication technologies (ICT) have increased the educational attainment options available to students of any age. Classrooms are no longer constrained by geographical location (space) or time; the Internet Generation of students is now more likely to embrace online learning platforms offering an array of services that fit their needs, abilities, and means (Maggio, Saltarelli, & Stranack, 2016). Through LA, students are given feedback that helps improve learning via adaptation (telling students how to proceed according to their needs), personalization (students choose how they will learn), recommendation (offering learning resources to prevent overload), and reflection (allowing comparison of the effectiveness of practices) (Moissa et al., 2015). So, why the hype? Educational institutions and providers now offer online learning as a way to cater to the Internet Generation, offering individual learning opportunities via courses covering a broad range of topics. These programs and services, tailored to student needs, lifestyles, and preferences, create the student consumer (Kaplan et al., 2014).
Yet, contrary to popular narratives, software plays a supporting role. Problem-based learning courses also benefit from having such software incorporated into the course design. Therefore, personalized learning needs to be viewed as a practice, not a product, for these consumers by 1) moving content broadcast out of the classroom, 2) turning homework time into contact time, and 3) providing tutoring (Feldstein et al., 2015; Hill, 2013). Researchers have noted how technology can be used to increase understanding of the student learning process by improving research techniques in higher education (Maggio et al., 2016). This transformation creates more personalized learning paths in which students seek out materials that help them master course concepts for greater understanding and comprehension (Kaplan et al., 2014). In the following paragraphs, we introduce a series of personalized learning products available for faculty to adopt.

Cengage Learning
Cengage Learning, a leading provider of innovative teaching, learning, and research solutions for global markets, designs tools that foster academic excellence and professional development by bridging education and technology. Cengage's MindTap application is one such tool, which promises enhanced student learning through online course management and instructional support. Findings from a mixed-methods study of MindTap found that students in classes that used MindTap substantially increased their course knowledge and skills (Marketing Works Inc., 2013). Cengage also offers cloud-based interactive learning environments such as SAM (Skills Assessment Manager), which teaches, trains, and tests essential Microsoft Office and computer concept skills vital to real-world academic and career success (Keiffer, 2015).
Cengage's SAM platform increases student engagement as students watch, listen, and practice in a hands-on platform, applying skills to real-world situations while limiting redundancies and inefficiencies (Keiffer, 2015). Real-time data identifies struggling students and focuses on the weaknesses noted in individual performance. Also notable is increased academic performance as course quality improves using the actionable data SAM generates. Cengage 4LTR Press also provides online learning products to instructors for increased classroom success. These online platforms allow for increased proficiency in math readiness and address anxiety around STEM classes by allowing students to reach out directly to their instructor and peers within the app; a prime example noted by Cengage is the cloud-based Enhanced WebAssign. Students' lack of desire to read assigned textbooks is also problematic when assessing success within a course.


Because of this, 4LTR Press has changed the look of traditional textbooks so that students can get through chapters efficiently with an online component; this aids students and instructors by indicating how much students are studying and how much time they are spending on certain tools such as videos, quizzes, and games (Perosio, 2016).

Acrobatiq Smart Courseware
Acrobatiq Smart Courseware is another emerging technology, based on methodology developed over 10 years of learning science research from Carnegie Mellon's Open Learning Initiative, which blends the art of teaching with the science of personalized learning. The tool helps educators and students be more successful in the classroom via an adaptive learning platform and services, allowing easy customization (text, video, assessments, and adaptive elements) and delivering interactive, adaptive lessons to students that align with the teaching goals of each class. The tool offers students personalized feedback, when needed, on activities and assessments.

Pearson
Pearson offers MyLab, which provides simulations that give students much-needed hands-on experience in the application environment, which in turn aids students' ability to understand how each application works in the real world. This allows students to perform better in a course with instructor-generated assignments and exams. MyLab is used by a multitude of disciplines, including Computer Information Systems, Business Administration, and Mathematics. All report increased performance and overall success of students in classes using MyLab compared to those without such cloud-based technology offering hands-on experience and self-directed learning advantages.
With regard to efficacy and effectiveness, Pearson offers the following best practices for making the most of its cloud-based platforms: 1) train adjuncts, tutors, and other instructors -- everyone needs implementation training to ensure consistency; 2) position students for success -- hold an orientation, provide class structure, and set clear expectations; 3) require completion of assignments for credit -- do not make anything optional; 4) connect and engage with students -- circulate the classroom, send weekly emails, and intervene with those not completing assignments; 5) employ personalized learning -- offer immediate feedback for active learning and enhanced assessment; 6) conduct frequent assessments -- assessments give students an account of what they know and what they do not; 7) require mastery learning -- mastery ensures that skills are solidly understood and build upon one another, reinforcing previously acquired skillsets; and 8) track learning gains -- tracking allows the instructor to make informed decisions about programmatic shifts and increase effectiveness through comparisons of homework, quizzes, exams, final grades, etc. After reviewing and testing the above personalized learning products, Cengage CourseMate was selected. In the next sections, we share the methodology used and the preliminary findings.

DATA METHODS
During the spring semester of 2016, five faculty members were assigned to teach a course titled Principles of Management Systems in Information Systems. Two of the five faculty members decided to experiment with intelligent systems, more specifically, with the concept of Adaptive Personalized Learning. These five faculty members taught an identical curriculum, including the same textbook, homework assignments, case studies, videos, quizzes, and exams. Each faculty member was assigned two to three sections of the course Principles of Information Systems to teach.
Some students taking this course were enrolled in a face-to-face class environment, while others were enrolled in the online environment (no face-to-face meetings). In total, we had 325 students enrolled in 10 classes. A total of 157 students were introduced to this Adaptive Personalized Learning environment in both settings: face-to-face (81%) and online learning (19%). The current study used two methods of data collection to answer the research questions. Before analyzing the impact of using CourseMate on students' performance, we had to obtain information regarding the technological environment students were exposed to while studying, such as access to a computer, the Internet, or university labs. This information was collected through a survey administered to students before taking the exam. Of the 157 students asked to fill out the survey, 137 completed it (Dataset-1). Figures 3 through 6 illustrate the answers collected from students. The second dataset (Dataset-2) is based on CourseMate's tracking system. The system archives the interactions between students and CourseMate on multiple levels, from a simple interaction while playing a "Beat the Clock" game to reading activities per day and time (detailed information about this dataset is provided in the next sections).

Measurement of Variables

• Final Grade: coded 0 (F), 1 (D), 2 (C), 3 (B), 4 (A);
• ScoreAct: the average score on quizzes within CourseMate-traced activities, using the same coding schema as Final Grade;
• eBook: average number of accesses to eBook chapter readings (0 to 8);
• Flashcard: average number of accesses to flashcards (0 to 7);
• Games: average number of accesses to game quizzes as practice before the exam (0 to 9);
• NonGradedQs: average number of accesses to non-graded quizzes as practice before the exam (0 to 12).
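The coding schema above can be sketched as a small mapping; the helper function below is hypothetical and only illustrates the 0-4 scale described for Final Grade and ScoreAct:

```python
# Hypothetical sketch of the grade coding schema listed above;
# the helper name and record layout are invented for illustration.
GRADE_CODES = {"F": 0, "D": 1, "C": 2, "B": 3, "A": 4}

def encode_grades(final_letter, score_act_letter):
    """Map letter grades onto the 0-4 scale used for Final Grade and ScoreAct."""
    return GRADE_CODES[final_letter], GRADE_CODES[score_act_letter]

print(encode_grades("B", "A"))  # -> (3, 4)
```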

While CourseMate provides faculty with information about their students' performance through its Engagement Tracker, the platform provides students with a dashboard of instant information, including required readings, tests, and the ability to interact with simulated games. Faculty received quick overall analyses of their students; students at risk, in particular, were easily identified through the dashboard. The risk indicator could apply to a specific resource (i.e., games, flashcards, practice quiz generators, traceable activities per resource, usage of videos/audio, time spent per activity, etc.), as shown in Figure 1, or to overall risk, as shown in Figure 2. This scatter chart plots student grade performance (on the y-axis) against student time performance (on the x-axis) across all resources.

Data Set 1 Descriptive Statistics
Of the participants surveyed, a majority of female students enrolled in this course via the online setting (70%), whereas a majority of male students enrolled via the face-to-face setting, as illustrated in Table 1. Table 2 and Table 3 show the distribution of students by ethnicity and age. Approximately two thirds of the students in these courses were white and within the 18-21 age range. As noted in Figure 3, only 2 of the 137 students did not have access to a personal computer or the Internet at home to practice with CourseMate; these two students were dropped from the study. Between 85% and 90% of students who took the survey knew how to use CourseMate and found it easy to use, whether at home or in the lab (Figure 4). More than 85% of students agreed that CourseMate was useful for understanding the textbook materials (Figure 5). Furthermore, even though students indicated having no technical problems accessing CourseMate, Figure 6 shows that some students resisted the idea of using CourseMate to practice for the exam.
As a result, only 61% of students actually used CourseMate before taking the exam.
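A risk flag of the kind the dashboard surfaces could be sketched as follows; the thresholds and field names here are invented for illustration and are not CourseMate's actual logic:

```python
# Hypothetical sketch of an "at risk" indicator like the one surfaced by
# the engagement dashboard; thresholds and field names are invented.
def flag_at_risk(activity, min_quiz_avg=2.0, min_accesses=3):
    """Flag a student whose quiz average or resource usage falls below thresholds."""
    low_score = activity["quiz_avg"] < min_quiz_avg
    low_usage = activity["ebook"] + activity["games"] < min_accesses
    return low_score or low_usage

student = {"quiz_avg": 1.5, "ebook": 1, "games": 0}
print(flag_at_risk(student))  # -> True
```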



Figure 1. CourseMate Engagement Tracker with deep granularity level

Figure 2. CourseMate Engagement Tracker



Table 1. Learning environment

Setting      | Total # of Students | % Female | % Male
Face-to-face | 127                 | 30%      | 79.8%
Online       | 30                  | 70%      | 20.2%

Table 2. Ethnicity distribution

Ethnicity                 | Percentage
Asian                     | 2.55%
Black or African American | 7.01%
Hispanic (of any race)    | 4.46%
Nonresident Alien         | 15.29%
Two or More Races         | 3.18%
White                     | 67.52%
Total                     | 100%

Table 3. Age distribution

Age     | Percentage
< 18    | 2.52%
18 - 21 | 67.96%
22 - 25 | 18.31%
> 25    | 11.21%
Total   | 100%

Figure 3. How often do you do the following activities?



Figure 4. How easy was CourseMate to use?

Figure 5. How useful was CourseMate in understanding the textbook materials?



Figure 6. Did you use CourseMate before taking the Exam?

Data Set 2 Descriptive Statistics
The focus of this study is not only to test the new trends in personalized learning resources that can help students achieve better performance, but also to discover the best pedagogy to help professors redesign their courses to better administer these tools. Of the 157 students registered in these 5 courses, 153 accessed CourseMate (at least once) during the semester. In this part of the study, we examined the extent to which personalized learning influences the prediction of academic success in these 5 undergraduate courses offered in face-to-face and online learning models. We were interested in answering the following questions:

1. Does practicing more with CourseMate improve students' grades?
2. Can we construct a "good" model to predict the impact of using CourseMate on students' grades?

Table 4 presents descriptive statistics for the sample. Of the students observed, we found that students mostly used the following tools within CourseMate: 1) general quiz activities (ScoreAct), 2) reading eBook chapters (eBook), 3) reading flashcards (Flashcard), 4) practicing on quizzes by using "Beat the Clock" (Games), and 5) practicing on non-graded quizzes (NonGradedQs). The distribution of grades was slightly skewed to the right, with higher grades of A's and B's (Figure 7). In a correlation analysis, only Games and ScoreAct are highly positively correlated with Grade (Table 5).

MULTIVARIATE ANALYSIS
We use a linear regression model to study the relationship between a single dependent variable y (Grade) and independent variables x (ScoreAct, eBook, Flashcard, Games, NonGradedQs).

General Linear Regression Model
Refer to Equation 1 below for a general linear regression model:

y = b_0 + b_1 x_1 + b_2 x_2 + … + b_k x_k (1)
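The coefficients b_0 … b_k of Equation (1) are typically estimated by ordinary least squares. A self-contained sketch solving the normal equations in pure Python, with invented toy data (a real analysis would use a statistics package):

```python
def ols(X, y):
    """Estimate [b0, b1, ..., bk] for y = b0 + b1*x1 + ... + bk*xk by
    solving the normal equations (X'X)b = X'y with Gaussian elimination."""
    rows = [[1.0] + list(r) for r in X]  # prepend the intercept column
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for c in range(k):  # forward elimination with partial pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p], b[c], b[p] = A[p], A[c], b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    coef = [0.0] * k
    for i in reversed(range(k)):  # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, k))) / A[i][i]
    return coef

# Invented toy data generated exactly from y = 1 + 2*x1 + 3*x2
X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1]]
y = [1, 3, 4, 6, 8]
print([round(c, 6) for c in ols(X, y)])  # -> [1.0, 2.0, 3.0]
```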


Figure 7. Grade distribution on exam (Blackboard)

Table 5. Correlation matrix

                   | Grade  | Quiz Scores | Chapters | Flashcards | Games | Non-Graded Quizzes
Grade              | 1.000  | --          | --       | --         | --    | --
Quiz Scores        | 0.744  | 1.000       | --       | --         | --    | --
Chapters           | 0.058  | 0.053       | 1.000    | --         | --    | --
Flashcards         | -0.115 | -0.101      | 0.327    | 1.000      | --    | --
Games              | 0.875  | 0.796       | 0.073    | -0.042     | 1.000 | --
Non-Graded Quizzes | 0.193  | 0.072       | 0.088    | -0.023     | 0.121 | 1.000
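The pairwise values in Table 5 are Pearson correlation coefficients. A minimal sketch of the computation, using invented toy data rather than the study's records:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented toy data: coded grades vs. game-quiz practice counts
grades = [0, 1, 2, 3, 4, 4]
games = [0, 1, 2, 3, 4, 3]
print(round(pearson(grades, games), 3))  # -> 0.971
```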

We pose the following question: Is there evidence to support improvement of a student's grade after controlling for all other available characteristics in the datasets? Furthermore, if there is support, what is the magnitude of the effect? As a strong correlation between Games and Grade was observed (Table 5), the current study uses ordinary least squares regression models to address the question: Does practicing more with CourseMate improve a student's grade? As noted in Table 6, Model 1 indicates that the intercept (1.154) is the mean Grade for those students who did not practice (at all) with game quizzes within CourseMate. The slope (0.320) is the mean Grade for students who practiced (once on average for each chapter) with game quizzes minus the mean Grade for those who did not practice (i.e., the difference between the two means).


Therefore, the equation for Model 1 (Table 6) is:

y = 1.154 + 0.320 (Games) (2)
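Equation (2) can be read as a simple prediction rule. A minimal sketch using the published Model 1 coefficients (the function name is invented for illustration):

```python
# Predicted (coded 0-4) grade from average game-quiz practice per chapter,
# using the published Model 1 coefficients; the helper name is invented.
def predict_grade_model1(games):
    return 1.154 + 0.320 * games

# One practice per chapter yields the mean grade of 1.474 reported in the paper
print(round(predict_grade_model1(1), 3))  # -> 1.474
# Three practices raise the predicted grade by roughly one letter-grade unit
print(round(predict_grade_model1(3) - predict_grade_model1(0), 2))  # -> 0.96
```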

Cut-off p-value = 5%. The mean Grade for students who used CourseMate games on average once per chapter is 1.474. The t-statistic is identical to what we would get from a pooled t-test for a difference between two means; no other bivariate model was significant. So, in answer to the question posed: if a student practiced on average 3 times with game quizzes, his or her grade would increase by one unit (i.e., from B to A, or from F to D). The F-test (Model 1) tests the null hypothesis that the game quizzes coefficient is 0 and that practicing with game quizzes has no effect on students' final grades. The highly significant F-statistic in Model 1 (495.96) suggests a strong effect on receiving a higher grade when students practiced with game quizzes before the exam. The magnitude of this effect can be obtained from the R-squared measure, which represents how well we can predict (or explain) the dependent variable (Grade). Model 1 (Table 6) explains 76.6% of the variation in Grades. Therefore, we conclude with high confidence that practicing with game quizzes impacts students' grades. Model 2 tests the association between all quizzes and grades, accounting for all other available controls (Table 6). Therefore, the equation for Model 2 is:

y = 0.989 + 0.279 (Games) + 0.102 (ScoreAct) + 0.008 (eBook) − 0.051 (Flashcard) + 0.043 (NonGradedQs) (3)

The intercept (0.989) is the mean Grade for those students who did not practice (at all) with any of these tools (ScoreAct, eBook, Flashcard, Games, NonGradedQs) within CourseMate. The F-test in this model tests the null hypothesis that all 5 coefficients are 0. The highly significant F-statistic of 107.64 for Model 2 indicates that at least one coefficient is not 0. The high R-squared value (R² = 0.785) indicates that this model explains 78.5% of the variation in Grades. Model 2 indicates that if a student practiced on average 4 times with game quizzes, his or her grade would increase by one unit (i.e., from B to A, or from F to D). Also, a student who practiced with non-graded quizzes needed on average 30 more tries to increase his or her grade by one letter grade (unit). These findings clearly support graded game quizzes as helping increase student performance, which is reflected in final grades.

CONCLUSION AND FUTURE WORK
In this paper, we reviewed recent advances in personalized learning technologies. We also introduced the role of course redesign, in which instructors can teach students what they need to learn as effectively as possible while tailoring feedback based on automated performance dashboards at the highest level of granularity needed. Within this preliminary study, we presented findings related to one particular platform, CourseMate. More specifically, we found that using games can improve students' grades on finals. As a result, faculty will consider this finding in redesigning the instruction of their courses next semester to better improve student outcomes. So what lies ahead? The future of higher education will be based on the role technology will have in shaping its purpose (Carey, 2015). In the coming years, students and workers will have greater


Table 6. Ordinary least squares regression on grades using CourseMate resources (N=153)

Independent Variables    | Model 1 β  | SD   | Model 2 β  | SD
Quizzes (ScoreAct)       | --         | --   | 0.10 ᶧ     | 0.05
Reading (eBook Chapters) | --         | --   | 0.01       | 0.03
Flashcards               | --         | --   | -0.05      | 0.03
Games                    | 0.32 ***   | 0.01 | 0.28 ***   | 0.02
Non-Graded Questions     | --         | --   | 0.04 *     | 0.02
Constant                 | 1.15 ***   | 0.07 | 0.99 ***   | 0.13
F-statistic              | 495.96 *** |      | 107.64 *** |
R-Squared                | 0.767      |      | 0.785      |

ᶧp
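The R-Squared rows of Table 6 report the share of grade variation each model explains. A minimal sketch of the computation, with toy values invented for illustration:

```python
def r_squared(y, y_hat):
    """R² = 1 - SSE/SST: the share of variation in y explained by predictions."""
    mean_y = sum(y) / len(y)
    sse = sum((a - p) ** 2 for a, p in zip(y, y_hat))
    sst = sum((a - mean_y) ** 2 for a in y)
    return 1 - sse / sst

# Invented toy values: coded grades and model predictions
y = [0, 1, 2, 3, 4]
y_hat = [0.2, 1.1, 1.8, 3.0, 3.9]
print(round(r_squared(y, y_hat), 2))  # -> 0.99
```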