Management of an eLearning Evaluation Project: e3Learning

Paul Lam & Carmel McNaught
Centre for Learning Enhancement And Research, The Chinese University of Hong Kong
[email protected] & [email protected]

Abstract: This paper describes the evaluation of purpose-built course websites for teaching and learning at university level, developed by a funded project (e3Learning) in Hong Kong that supports teachers in three universities in supplementing classroom teaching with eLearning. Previous papers (e.g. Lam & McNaught, 2004) described the customized, flexible nature of the large number of evaluations conducted over the last two years. This paper concentrates on the procedural mechanisms and management strategies that have considerably facilitated the process and safeguarded the continued quality of the evaluations. These mechanisms and strategies have ensured, externally, good communication between the evaluation team and the teachers and, internally, a smooth workflow in which the responsibility of each member of the evaluation team is well defined.

Background

This paper describes the evaluation of purpose-built course websites for teaching and learning at university level, developed by the e3Learning project, funded by the University Grants Committee of Hong Kong to support teachers in three universities in supplementing classroom teaching with eLearning. The paper builds on an earlier paper on the evaluation model used in the same project, reported at ED-MEDIA 2004 (Lam & McNaught, 2004). The focus of the first paper was on the requirements and characteristics of the evaluation, while this paper concentrates on the procedural mechanisms and management strategies that have considerably facilitated the process and safeguarded evaluation quality.

The e3Learning (enrich, extend, evaluate learning; e3L) project has been designed to help teachers better exploit the possibilities of web-assisted teaching by offering a range of services: introducing teachers to practical ideas for using the web in education, helping them make better use of the functions of teaching and learning platforms such as WebCT, and developing complete course websites for them. Full details of the design of this project are in James et al. (2003) and on the project website http://e3learning.edc.polyu.edu.hk/main.htm. The e3L project operates across three universities: the Hong Kong Polytechnic University, the City University of Hong Kong and The Chinese University of Hong Kong.

Over the two-year period, the e3L project has supported the web development of nearly ninety sub-projects designed to actively assist teaching in courses. Evaluation designs have been developed for 76 of these sub-projects. At the time of writing (December 2004), 53 of the sub-projects are in active use and their evaluations have been completed. A total of 4,951 students have been involved in the evaluation studies. The number of accesses to these websites is now over 67,000.

Characteristics of e3Learning

Combining Consultation, Development and Evaluation

The e3Learning project is not just an IT technical support project; it also provides comprehensive educational support for eLearning. The project treats each teacher's request for eLearning development and evaluation as a sub-project. Each sub-project has its own lifecycle: planning, design and development, implementation, and evaluation. The stages of this lifecycle, with the types of support offered at each, are portrayed in Figure 1.

[Figure 1 depicts the sub-project lifecycle as four stages, annotated with both academic and e3Learning involvements:
• Planning stage: proposal; a one-stop shop meeting to analyze requirements, brainstorm educational ideas, canvass technical possibilities in teaching, and discuss the evaluation strategy.
• Design & development stage: design learning and teaching activities and the evaluation plan; create eLearning materials; review, revise and enhance.
• Implementation stage: hand-over, assisting teachers to use the materials; solving day-to-day web-use problems.
• Evaluation stage: collection of student feedback and learning outcome data.]

Figure 1: Lifecycle of a sub-project

In the initial stage, academics receive educational advice. If teachers have only rough ideas, the initial meeting (the one-stop shop meeting) provides the opportunity to brainstorm. This is particularly useful for teachers because, in this meeting, they can see concrete examples developed by academics from the same or other departments; ideas generated from viewing different practices across departments can be cross-fertilized. Other teachers come with concrete ideas, in which case the development staff explore the technicalities of the site development, or suggest modifications if the ideas are not feasible. Discussion about how to evaluate the experience begins as early as this stage. In the development stage, the project staff provide technical support. In the implementation stage, academics are welcome to seek just-in-time advice on emerging issues. In the final stage, evaluation is carried out in cooperation with the teachers.

Diversified Evaluation Needs

Evaluating the web materials thus built is difficult because of the highly diversified ways in which individual teachers use the web in their teaching. The overall design of the evaluation is a reflection-improvement model in which the findings of the evaluation contribute to further improvements in each of the web-assisted courses under investigation. Evaluation resources are already available, such as toolkits (e.g. Oliver et al., 2002) and 'cookbooks' (e.g. Learning Technology Dissemination Initiative, 1998), but we chose a process mediated by an evaluation officer in order to fairly rapidly build up a set of cases of good evaluation practice for Hong Kong university teachers to refer to. Our system (like all others) is not value-free and tends towards a naturalistic model (Guba & Lincoln, 1981; Alexander & Hedberg, 1994).
As discussed in detail in Lam & McNaught (2004), five main sources of diversity have shaped the evaluation approach of the e3L project. They are briefly explained below.

• Diversity in evaluation purpose: Formative evaluation and effectiveness evaluation are the two main evaluation purposes.
• Diversity in the nature of web-assisted courses: The teaching and learning functions of the web can be grouped into four main categories – content delivery, engaging in communication, conducting assessment, and giving learning support (McNaught, 2002).
• Diversity in evaluation questions: The evaluation questions vary with the design purposes of the web elements, and also with the stages of the learning experience on which the evaluation focuses.
• Diversity in evaluation data types: There are 'feel', 'know' and 'do' types of data, and the evaluation participants include teachers, students, and third-party content/eLearning 'experts'.
• Diversity in evaluation instruments: Many evaluation instruments can be used, including individual and group interviews, satisfaction questionnaires, the Study Process Questionnaire (SPQ), which measures students' approaches to learning (Biggs, Kember & Leung, 2001), site access counters, classroom observations, reflective journals, and expert reviewers' reports.

Flexible Evaluation

To cope with this diversity, the main feature of our evaluation mechanism is close cooperation with the teachers from the beginning to the end of the whole evaluation process. A five-step model, illustrated in Figure 2, is adopted:
a. meet with the teachers at the very beginning of the evaluation process to understand their needs and their current and desired use of the web, and then suggest evaluation questions, strategies and instruments;
b. finalize the evaluation questions, data types, evaluation instruments and schedule, and write an evaluation plan;
c. design the evaluation instruments needed to address the questions in the evaluation plan, and then collect the data according to the timing set down in the plan;
d. analyze the feedback and write reports; and
e. meet with the teachers to plan further actions based on the evaluation results.

[Figure 2 depicts the evaluation process as a cycle: (a) frame evaluation questions and select strategies; (b) develop a plan with a timeline; (c) collect data; (d) analyse the data; leading to (e) decisions and action.]

Figure 2: The evaluation process
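The five steps above can be sketched as a simple plan record and workflow. This is an illustrative sketch only: all field and function names are hypothetical, and the actual e3L evaluation plans are prose documents, not code.

```python
# Hypothetical sketch of the five-step evaluation cycle (steps a-e).
# Field names are illustrative, not the project's actual plan template.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    course: str
    evaluation_questions: list[str]          # step a: framed with the teacher
    data_types: list[str]                    # e.g. 'feel', 'know', 'do'
    instruments: list[str]                   # e.g. surveys, interviews, site logs
    schedule: dict[str, str] = field(default_factory=dict)  # instrument -> timing

def run_cycle(plan: EvaluationPlan) -> list[str]:
    """Walk the five steps (a-e) for one sub-project and return them in order."""
    return [
        "a. frame questions and select strategies with the teacher",
        "b. finalize the plan: " + ", ".join(plan.instruments),
        "c. collect data per schedule: " + ", ".join(plan.schedule.values()),
        "d. analyse data and write reports",
        "e. meet the teacher to decide further action",
    ]

plan = EvaluationPlan(
    course="ENG101",  # hypothetical course code
    evaluation_questions=["Do students find the forum useful?"],
    data_types=["feel", "do"],
    instruments=["student survey", "site logs"],
    schedule={"student survey": "week 12", "site logs": "weekly"},
)
for step in run_cycle(plan):
    print(step)
```

The point of the sketch is that steps b and c are driven entirely by decisions recorded in step a, which mirrors how the written evaluation plan anchors the rest of the process.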

Procedures and Strategies

The project requires each case under evaluation to go through the above-mentioned process, and this generates a great deal of work, particularly since the evaluation team comprises only one supervisor, one team leader, one further full-time research assistant, and two to three part-time student helpers. Over time, the project has built a good evaluation management model which stresses the effectiveness of both the external working relationship between teachers and the evaluation team, and the internal workflow among the team members.

Team-teacher Strategies

The key strategy for ensuring cooperation between teachers and the evaluation team is good communication. Hodgson & Lam (2004) discussed how good communication is essential in web development projects to ensure that products both meet the requirements of the teachers and fully utilize the potential of the web. Just as development staff and teachers should talk, it is of equal importance that the evaluation team and the teacher exchange their needs and suggestions. This situation is depicted in Figure 3. Teachers need to convey their evaluation needs to the evaluation team, while the team in return informs teachers about common evaluation strategies – their potential, possibilities and challenges – to help the teachers reach useful decisions. The team then translates these decisions into workable plans and prepares the evaluation instruments as required. The teacher helps to arrange the data collection. Lastly, the team and the teacher communicate again when the team hands over the evaluation results in the form of an evaluation report.

[Figure 3 depicts the exchanges: the teacher conveys needs and content to the development team, which returns a strategy and the website; the teacher likewise conveys needs to the evaluation team, which returns a strategy, gathers data, and delivers a report.]
Figure 3: Interactions between the teacher and the evaluation team

Initial discussions about evaluation needs include the evaluation team explaining the common evaluation themes that we have found teachers to be most interested in. The themes are grouped into those about the learning environment, the learning process, and learning outcomes: students work within learning environments, going through learning processes in order to achieve learning outcomes. One recurring evaluation focus on web learning environments is how the students feel about the learning materials and/or activities. The themes we have articulated over the last two years are listed in Table 1.

Another strategy, sometimes used to help teachers with multiple evaluation foci, is to go through an evaluation decision matrix with them (Table 2). The matrix is usually effective in making teachers aware that data may come from multiple sources (teachers, students, and third-party reviewers); may look at one or more aspects of the learning environment, learning processes and learning outcomes; and may make use of many different types of instruments. The matrix also serves as a checklist on which preliminary decisions are recorded. The evaluation decisions are later transferred into a detailed evaluation plan in which the website design, the evaluation questions/themes, the evaluation instruments, and the timelines of the evaluation are all recorded. (See http://e3learning.edc.polyu.edu.hk/evaluate_scenarios2_plan.htm for a sample evaluation plan.) The plan also explicitly sets down the policy for the use of the evaluation data, which guarantees that the teachers will have access to all the evaluation data and have the right to use the data for their own research purposes.

The evaluation team also believes that some real examples of how evaluations are planned and conducted may assist teachers to work out what they need and what they can expect from us.
Because of this, apart from general information about eLearning evaluation and common evaluation strategies and tools, a few authentic evaluation stories have been put up on the project website (http://e3learning.edc.polyu.edu.hk/evaluate.htm).

The team has kept a good record of the survey questions asked in our various cases, and has gathered them into a question pool grouped according to the evaluation themes mentioned above. The survey question pool now comprises more than 200 items and has become increasingly useful to the team in preparing questionnaires for the teachers. The basic structure of the themes and sub-themes used in the survey question pool can be seen in Table 1; items exist for all sub-themes. To facilitate data collection, online surveys are widely used. The team is, however, prepared to administer paper surveys whenever the online option is ruled out by practical limitations.

Lastly, communication between teachers and the team is rounded off with a comprehensive report. The format of the evaluation report has evolved slowly throughout the two years of the project, and it now contains sections such as an executive summary, a description of the website, the evaluation plan, and many appendices in which the raw and analyzed data collected by each of the evaluation instruments are recorded.

Internal Team Strategies

Effective management of evaluation also includes building team spirit. This is achieved by breaking the evaluation tasks down into manageable components and clearly assigning responsibilities to team members, while at the same time making sure that no member feels that s/he is working alone. The method we use is illustrated in Figure 4. Dual responsibilities are assigned to each team member, so that s/he is in charge of a few teacher-cases and one or two evaluation tools at the same time. For example, Helper A may be responsible for collecting and analyzing all student survey data and writing the corresponding reports. Helper A is also responsible for Case 'a', and so needs to compile a grand report for that case by consulting the other helpers for the other component reports, such as the focus-group meeting report and the site logs report. The team leader oversees the smooth running of the whole process and constantly reviews the standards of the grand reports and the component reports.
Other strategies to facilitate the team's internal workflow include using the web for the exchange of files (FTP) and of ideas and instructions (ICQ). Moreover, the team has set up templates for each type of report produced; these are reviewed at half-year intervals for progressive improvement and upgrades.

[Figure 4 depicts the dual-responsibility scheme: under the team leader, each helper (Helper A, Helper B, Helper C, ...) is responsible both for one or more cases (Case a, Case b, ...) and for one or more instruments (student surveys, focus group meeting reports, site logs analyses, teacher surveys).]

Figure 4: Dual responsibilities of the evaluation team members
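The dual-responsibility scheme can be sketched as two small lookup tables: one mapping cases to owners, one mapping instruments to owners. The case and instrument assignments below are invented for illustration; only the Helper A / Case 'a' example comes from the text above.

```python
# Illustrative sketch of Figure 4's dual responsibilities.
# Assignments below (beyond Helper A / case_a) are hypothetical.
case_owner = {"case_a": "Helper A", "case_b": "Helper B"}
instrument_owner = {
    "student survey": "Helper A",
    "focus group": "Helper B",
    "site logs": "Helper C",
}
case_instruments = {
    "case_a": ["student survey", "focus group", "site logs"],
    "case_b": ["student survey", "site logs"],
}

def grand_report(case: str) -> dict[str, str]:
    """The case owner compiles the grand report by consulting each
    instrument's owner for the matching component report."""
    return {
        inst: f"component report from {instrument_owner[inst]}"
        for inst in case_instruments[case]
    }

# Helper A, as owner of case_a, assembles three component reports,
# two of which come from other helpers.
report = grand_report("case_a")
```

The design point is that no report depends on a single person: every grand report forces the case owner to consult the instrument owners, which is how the scheme keeps members from working alone.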

Conclusion: The Next Steps

This paper has described the effort the evaluation team has invested in making the evaluation process run as smoothly as possible, so that the evaluations suit the needs of the teachers and, as Figure 1 implies, serve to confirm teachers' efforts, reveal new directions, or even inspire the teachers towards further eLearning initiatives. The ultimate aim of the project is to promote good-quality eLearning. The project can provide good support in minimizing the initial technical barrier and in giving teachers informed feedback on which parts of their eLearning attempts work and which do not. However, more successful integration of teaching and technology depends on a number of other critical factors: institutional policy, leadership, culture, support, infrastructure, reallocation of resources, and staff training and development (e.g. Thompson, 1999; Robinson, 2001). While the project is able to support academics from the planning stage through to evaluation, these factors need to be managed at the institutional and departmental levels. The next phase of the e3L project is to achieve some level of institutional commitment to the models and processes of all aspects of the e3L project. We are building a base of research reports and papers on aspects such as design (including the use of media, assessment, etc.), development, costing, mobile projects, and evaluation (see http://e3learning.edc.polyu.edu.hk/publications.htm) for use in our own universities and elsewhere.

Acknowledgements

Funding support from the University Grants Committee in Hong Kong and the Hong Kong Polytechnic University is gratefully acknowledged, as is the collaborative support of many colleagues in the three universities associated with the e3Learning Project.

References

Alexander, S., & Hedberg, J. (1994). Evaluating technology-based learning: Which model? In K. Beattie, C. McNaught & S. Wills (Eds.), Multimedia in higher education: Designing for change in teaching and learning (pp. 233-244). Amsterdam: Elsevier.

Biggs, J., Kember, D., & Leung, D. Y. P. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71, 133-149.

Guba, E. G., & Lincoln, Y. S. (1981). Effective evaluation. San Francisco: Jossey-Bass.

Hodgson, P., & Lam, P. (2004, July). Quality management of a joint-university e-learning project: e3Learning. Global Education. Retrieved 15 December 2004 from http://www.globaled.com/articles/PaulaHodgeson2004.pdf

James, J., McNaught, C., Csete, J., Hodgson, P., & Vogel, D. (2003). From MegaWeb to e3Learning: A model of support for university academics to effectively use the Web for teaching and learning. In D. Lassner & C. McNaught (Eds.), ED-MEDIA 2003, Proceedings of the 15th annual World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 3303-3310), Honolulu, Hawaii, USA, 23-28 June. Norfolk VA: Association for the Advancement of Computers in Education.

Lam, P., & McNaught, C. (2004). Evaluating educational websites: A system for multiple websites at multiple universities. In L. Cantoni & C. McLoughlin (Eds.), ED-MEDIA 2004, Proceedings of the 16th annual World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 1066-1073), Lugano, Switzerland, 21-26 June. Norfolk VA: Association for the Advancement of Computers in Education.

Learning Technology Dissemination Initiative. (1998). Evaluation cookbook. Retrieved 15 December 2004 from http://www.icbl.hw.ac.uk/ltdi/cookbook/contents.html

McNaught, C. (2002). Adopting technology should mean adapting it to meet learning needs. On The Horizon, 10(4), 14-18.

Oliver, M., McBean, J., Conole, G., & Harvey, J. (2002). Using a toolkit to support the evaluation of learning. Journal of Computer Assisted Learning, 18, 199-208.

Robinson, B. (2001). Innovation in open and distance learning: Some lessons from experience and research. In F. Lockwood & A. Gooley (Eds.), Innovation in open & distance learning: Successful development of online and web-based learning (pp. 15-20). London and Sterling, VA: Kogan Page and Stylus Publishing.

Thompson, D. (1999). From marginal to mainstream: Critical issues in the adoption of information technologies for tertiary teaching and learning. In A. Tait & R. Mills (Eds.), The convergence of distance and conventional education: Patterns of flexibility for the individual learner (pp. 150-160). London and New York: Routledge.

Main themes:
1. Improvement during development (formative/developmental)
2. Student learning of concepts in the discipline (concepts)
3. Students' feelings about the experience (opinions)
4. Students' approaches or attitudes to learning change (approach/attitude)
5. Effects of the experience on capability development (capabilities)
6. Students' affection towards the subject (affection)
7. Communication with teachers and classmates (communication)
8. Time spent on learning activities (engagement)

Sub-themes include: need analysis; formative evaluation; usability evaluation; evaluation of previous version; design and layout; navigation; error-free and smooth-running; quality; quantity; improvement; promotion of learning; learning better than previously; achieve learning outcomes; assist remembering material; assist understanding; evaluate correct/incorrect ideas; apply knowledge to situations; analyze situations; enjoyment; self-study readiness; deep approach to learning; preference for active learning; preference for self-directed learning; reflect upon study progress; preparation before classes; revision after classes; amount of time online; usage patterns; make critical comments; ask meaningful questions; create and invent; play a professional role; solve problems; appreciate other people's ideas; judge quality of work; work in groups; self-study; manage time; articulate; revise work based on comments received; personal involvement; emotional association; sense of belonging; love for the subject; high motivation to learn; devotion to the subject; teacher-student communication; student-student communication; linkage with professionals.

Table 1: Themes and sub-themes used in the question pool
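A question pool keyed by the Table 1 themes could be sketched as a simple mapping from theme to items. This is a minimal illustration: the items below are invented, and the real e3L pool holds more than 200 items across all the sub-themes.

```python
# Hypothetical sketch of a survey question pool keyed by Table 1 themes.
# All question texts are invented for illustration.
question_pool = {
    "opinions": [
        "The website was enjoyable to use.",
        "The materials were clearly explained.",
    ],
    "engagement": [
        "I used the website to prepare before classes.",
        "I used the website to revise after classes.",
    ],
    "communication": [
        "The forum helped me communicate with classmates.",
    ],
}

def build_questionnaire(themes: list[str]) -> list[str]:
    """Assemble a draft questionnaire from the pooled items for chosen themes."""
    return [q for theme in themes for q in question_pool.get(theme, [])]

draft = build_questionnaire(["opinions", "engagement"])
# draft now holds the four items drawn from the two chosen themes
```

Organizing items by theme is what makes the pool reusable: a questionnaire for a new case is assembled by selecting themes rather than writing questions from scratch.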

[Table 2 is laid out as a matrix. Its columns cross the eLearning components with the three aspects of learning:
• Environment – delivery design (interactivity; ease of use; speed; clarity of instruction; graphic design; animation); content (coverage; clarity of explanation; adequacy of exercises); pedagogy (use of site in class; as self-study materials).
• Process – site usage information (counter on front page; counters on individual resources; self-report; observation); student use information (WebCT logs; tailor-made student info logs; self-report; observation).
• Outcome – better performance (assignments; exams; tests; self-report; observation); learning approach (SPQ); capabilities (self-directed study; self-report; observation); motivation to learn (interest in subject; more time spent; higher enjoyment; greater participation; attendance; inventory; self-report; observation).
Its rows are the evaluation participants in sub-projects: students, teachers and others.]

The dot points noted in this matrix contain a mixture of aspects which might be considered and evaluation strategies which might be used. They are used as triggers for discussion in the development of evaluation plans. There may be several phases of evaluation, with different participants involved in each.

Table 2: e3Learning evaluation decision matrix
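As a checklist, the decision matrix amounts to recording (aspect, strategy) choices per participant group. The sketch below is a hedged illustration of that structure only; the specific entries recorded are invented, not taken from any actual e3L sub-project.

```python
# Illustrative sketch of Table 2 as a checklist data structure:
# participant group -> list of (aspect, strategy) decisions.
# The recorded entries below are hypothetical examples.
from collections import defaultdict

matrix = defaultdict(list)

def record(participant: str, aspect: str, strategy: str) -> None:
    """Tick one cell of the decision matrix."""
    matrix[participant].append((aspect, strategy))

record("students", "environment/content", "satisfaction survey")
record("students", "process/site usage", "WebCT logs")
record("others", "environment/pedagogy", "expert review")

# The completed checklist later feeds the detailed evaluation plan.
for participant, cells in sorted(matrix.items()):
    print(participant, cells)
```

Treating the matrix as data underlines its role in the process: it captures preliminary decisions during the discussion with the teacher, which are then transferred into the detailed evaluation plan.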