The Proceedings of the 11th European Conference on e-Learning
University of Groningen, The Netherlands
26-27 October 2012
Edited by Dr. Hans Beldhuis, University of Groningen, The Netherlands


Copyright The Authors, 2012. All Rights Reserved.
No reproduction, copy or transmission may be made without written permission from the individual authors.
Papers have been double-blind peer reviewed before final submission to the conference. Initially, paper abstracts were read and selected by the conference panel for submission as possible papers for the conference.
Many thanks to the reviewers who helped ensure the quality of the full papers.
These Conference Proceedings have been submitted to Thomson ISI for indexing.
Further copies of this book and previous years' proceedings can be purchased from http://academic-bookshop.com
CD version ISBN: 978-1-908272-74-4
CD version ISSN: 2048-8637
Book version ISBN: 978-1-908272-73-7
Book version ISSN: 2048-8637
Published by Academic Publishing International Limited, Reading, UK
+44 118 972 4148
www.academic-publishing.org

Contents

Paper Title | Author(s) | Page No.

Preface | vi
Committee | vii
Biographies | x

Enhancing Second Language Skills Development Using Students Response System | Alaba Olaoluwakotansibe Agbatogun | 1
The Role of Smart Board in Improving English Language Skills in Jordanian Universities | Mohammad Akram Alzu'bi | 9
Teaching Diplomatics in 2.0 Web Environments: An Innovative Experience to Promote Interaction Among Students From Different Countries and With Different Learning Needs | Antonella Ambrosio, Maura Striano, Corinna Freda, Stefania Fiorentino and Luca Aiello | 15
Experiences With use of Various Pedagogical Methods Utilizing a Student Response System | Ketil Arnesen, Guri Sivertsen Korpås, Jon Eirik Hennissen and John Birger Stav | 20
Why and how Academic Staff Adopt e-Assessment in a Higher Education Institution (HEI) | Zwelijongile Gaylard Baleni | 28
Interactive Feedback in Virtual Learning Environment - Video Skills | Josep Maria Batalla-Busquets, Carolina Hintzmann, María Jesús Martínez-Argüelles, Dolors Plana-Erta and Marc Badia-Miró | 37
Do we owe Them? The Impact of eLearning on Disadvantaged Communities | Catherine Beaton | 47
Using Peer Assessment and Electronic Voting to Improve Practical Skills in Masters Students | Steve Bennett and Trevor Barker | 53
The Role of Computational Thinking and Critical Thinking in Problem Solving in a Learning Environment | Sheryl Buckley | 63
New Approach for Virtual Master's Final Projects: Didactic Guidelines for Students and Tutors | Nati Cabrera, Ana Rodera and Montse Guitert | 71
Materials Development in Language Training: Online Course of Military English | Ivana Cechova, Dana Zerzanova and Jana Berankova | 80
Social Software Applications and Their Role in the Process of Education from the Perspective of University Students | Miloslava Černá and Petra Poulová | 87
How to Design and Implement a Validation Methodology for Virtual Educational Games | Carmen Mihaela Cretu, Nicoleta Rogoz and Diana Chihaia | 97
An Evaluation of Online Distance Learning Programmes Through the Lens of Students' Expectations | Marija Cubric, Mariana Lilley and Karen Clark | 106
University Preparation via Self-E-Assessment and Self-Study: First Findings and Implications From Evaluating an eLearning Platform | Katja Derr, Reinhold Hübl and Mohammed Zaki Ahmed | 117
eLearning and Social Media in Higher Education With an Interactional Approach | Fan Ding | 126
Wiki Strategies for a More Participatory Learning Environment in Czech Education | Jiří Dlouhý and Jana Dlouhá | 134
If you are not Modeling Good Teaching, you are Teaching Something Else: The HUCC Model of Teaching | Jane Eberle | 142
Peer to Peer: The Full Cycle: Investigating Online Peer Assessment Through Action Research | Julia Fotheringham and Elaine Mowat | 150
Online Exams: Practical Implications and Future Directions | Gabriele Frankl and Sofie Bitter | 158
The Changing Roles of Staff and Student Within a Connectivist Educational Blog Model | Elaine Garcia, Mel Brown and Ibrahim Elbeltagi | 165
Training e-Tutors in Romania: Validating the Theory | Maria Goga | 174
How are Web 2.0 Technologies Affecting the Academic Roles in Higher Education? A View From the Literature | Sue Greener | 183
Media use for Learning by Students in Higher Education: An International Survey | Michael Grosch | 188
Using Web 2.0 in Teaching and Research: Insights From Trainings and User-Driven Research | Brigitte Grote | 197
"Digital Futures in Teacher Education" Project: Exploring Open Approaches towards Digital Literacy | Anna Gruszczynska, Guy Merchant and Richard Pountney | 206
Introducing Blended eLearning Course Design: A Pre-Implementation Assessment of Students' Basic ICT Skills | Samuel Adu Gyamfi and Thomas Ryberg | 214
"Why Should I?" Engaging Learners in Digital Literacy Skills Development | Marion Hall, Ingrid Nix and Kirsty Baker | 220
The Yin/Yang of Innovative Technology Enhanced Assessment for Promoting Student Learning | Maggie Hutchings, Anne Quinney, Kate Galvin and Vince Clark | 230
Using Blogs as an Assessment Tool in Higher Education: An Experience in the Catalan Higher Education Context | Georgeta Ion, Elena Cano, Blanca-Patricia Silva and Pilar Iranzo | 238
Ethnographic Sensibility: A Method for Studying Lurking as eLearning | Hannakaisa Isomäki, Ulla Pohjamo and Johanna Silvennoinen | 249
Rethinking Creativity in Learning and Teaching With Technology in Romanian Higher Education | Cornel Nicolae Jucan and Mihaela Sabina Dolf Baier | 256
Factors Influencing the Use of Web 2.0 and Free Open Source Software to Optimize E-Learning | Billy Kalema, Oludayo O. Olugbara and Ray M. Kekwaletswe | 265
Peer Review as an Activating Learning Method Within University Education | Elisabeth Katzlinger and Michael Herzog | 274
eLearning and Freedom - Mainstreaming e-Diversity in Higher Education | Martin Lu Kolbinger and Gisela Prey | 283
A Web 3.0 Mashup to Promote the eLearning Platform "Unibook" | Ioannis Kostis, Michail Basios, Konstantinos Chimos, Theodoros Karvounidis, Christos Douligeris and Ioannis Katsanakis | 290
The Three Worlds of Live, Virtual and Constructive Environments | John Lacontora | 299
New Module for Statistical Evaluation in LMS Moodle | Martin Magdin and Martin Cápay | 304
Intro to Positive Psychology: Blending Interactive Behaviour in e-Lecture in a Private University | Lee Mah-Ngee | 311
eLearning Modalities: A Framework for Selecting Audio | Bride Mallon, JJ Quinlan and Kieran Nolan | 319
Rights Management and Technological Protection Measures in the Educational Field | Joaquim Marques and Carlos Serrão | 327
E-Enablement in Distance Education – Engineering Growth: A Case Study of IMT-CDL | Tushar Marwaha and Anita Mathew | 338
Hard Skills on the Cloud – Certification Skills by Blended Learning | Michael Marxer, Luís Rocha, Nuno Araújo and Reto Goetti | 350
The Role of the Online Learning Personal Tutor | David Mathew | 358
Improving Student Success Rates Through a Semi-Personalized Feedback System | Maedeh Mosharraf and Fattaneh Taghiyareh | 364
Education on Wheels – Mobile Dissemination of e-Services and Computer Based Learning in Rural Gujarat, India | Peter Mozelius and Amit Roy | 370
Learning Analytics Artefacts in a Cloud-Based Environment: A Design Science Perspective | Phelim Murnion and Markus Helfert | 379
Note Taking Skills and Student's Characteristics in Online Courses | Minoru Nakayama, Kouichi Mutsuura and Hiroh Yamamoto | 388
Assess the Assessment: An Automated Analysis of Multiple Choice Exams and Test Items | Michaela Nettekoven and Karl Ledermüller | 397
"Why Bother?" Learner Perceptions of Digital Literacy Skills Development - Learning Design Implications | Ingrid Nix, Marion Hall and Kirsty Baker | 406
Three by Three by Three: A Model for eLearning Evaluation | Donatella Persico, Stefania Manca and Francesca Pozzi | 416
Students Attitudes, Opinions and Perception of eLearning | Petra Poulová, Ivana Šimonová and Miloslava Černá | 425
Democratization of Knowledge and the Promise of Web 2.0: A Historical Perspective | Ali Raddaoui | 435
eLearning Funding Programme: A Successful Measure for the Sustainable Modernisation of Daily University Teaching? | Jeelka Reinhardt and Harriet Hoffmann | 442
"What my Guidance Councillor Should Have Told me": The Importance of Universal Access and Exposure to Executive-Level Advice | Brittany Rockwell, Joanne Leck, Michael Luthy and Catherine Elliott | 452
An Approach to Assessing Stress in eLearning Students | Manuel Rodrigues, Florentino Fdez-Riverola and Paulo Novais | 461
Digital Natives ... are Changed: An Educational Scenario With LAMS Integration That Promotes Collaboration via Blended Learning in Secondary Education | Eleni Rossiou | 468
eLearning: Tool to Ensure Growth and Sustainability of SMEs | Andrée Roy | 480
Self-Directedness and Critical Reading Skills in a Blended Learning Course: Analysis and Challenges | Zuzana Šaffková and František Tůma | 489
Design and Redesign of Online Discussion: Comparison of Lessons Learned | Rowena Santiago, Amy Leh and Minoru Nakayama | 497
Challenges in Extending the Virtual Learning Environment Into Social Networks - Keeping Staff and Students Together | Guy Saward | 505
The Efficacy of eLearning for Information Retrieval Skills in Medical Education | Katherine Schilling | 515
Sharing and Shaping Effective Institutional Practice in TEL Through the 3E Framework | Keith Smyth | 523
Designing and Developing Effective Peer Learning Assessment Services | John Stav, Trond M. Thorseth and Gabrielle Hansen | 532
Applying the Prototyping Methodology to Develop a Student-Centred, Integrated eLearning Resource | Iain Stewart, James Devon, William McKee, David Harrison and Malcolm Allan | 541
Frame-Based Semantic Retrieval of Education Supportive Materials | Maryam Tayefeh Mahmoudi, Fattaneh Taghiyareh, Kambiz Badie and Mohammad Reza Basirati | 552
Multipurpose eLearning Course With Moodle Tools: First Steps in Design Research | Anne Villems, Taavi Duvin and Rudolf Elbrecht | 561
Implementation of Netbooks in the Teaching of Mathematics in the Primary Schools | Naďa Vondrová and Antonín Jančařík | 567
Employing the S-P Diagnostic Table for Online Qualitative Comments on Test Results | Chien-hwa Wang and Cheng-ping Chen | 575
A Swarm-Based Approach to Adaptive Learning: Selection of a Dynamic Learning Scenario | Inga Zilinskiene, Valentina Dagiene and Eugenijus Kurilovas | 583
From Expert to Novice ..... and back again? A practical and personal account of transferring a face-to-face taught Master's course in leadership into a fully online asynchronous one for distance learners | Maggie Carson | 594

PhD Paper | 601
Auto Grading and Providing Formative Feedback to Students on Documents Submitted Electronically | Fatema Al-Yazeedi, Annette Payne and Timothy Cribbon | 603

Work in Progress Papers | 613
Cooperation for Knowledge: An On-Line Framework to Promote Academic Skills in Economic History in Bolivia | Anna Carreras-Marín, Marc Badia-Miró and César Yáñez Gallardo | 615
Using Technology to Enhance Formative Assessment | David Comiskey | 618
Teaching and Learning Economy Through new Technologies: The EE-T Platform | Georgeta Ion, Cristina Raluca Popescu and Ilaria Mares | 621
Harnessing Open Source Technology to Address the Learning Needs of SMEs | Rosario Kane-Iturrioz and Anne Dickinson | 625
Acceptance of Blog: Engagement of Students Towards Positive Learning in Higher Learning Institution | Inderjit Kaur Lally, Mah Ngee Lee and Ser Zian Tan | 634
Some Policy Implications of Bridging the Knowledge gap in Clinical Transfusion Medicine Through eLearning | Maruff Akinwale Oladejo and Cees Smit Sibinga | 638
A Learner-Centred Induction to Moodle | Morris Pamplin and Marie Poirier | 641
Contextualizing Distance Business and Management Education: A Case of the UK Open University and IIM LINK in Russia | Ruslan Ramanau and Lioubov Krasheninnikova | 645
Glogs as new Learning Products in our Universities | Ana Rodera | 649
How to Teach Science in 1-to-1 Primary Schools: An Experimental Study About Learning Objects Efficacy in Critical Contexts | Fabio Serenelli, Enrico Ruggeri and Andrea Mangiatordi | 653
Perceptions of Staff and Students' use of Digital Media | Angela Shapiro and Aidan Johnston | 658
Technology Enhanced Learning at the University of Glamorgan With a Focus on Students' Mobile Technologies | Florica Tomos, Trevor Price, Haydn Blackey, Paul Jones and Chris Miller | 661

Preface

These Proceedings represent the work of contributors to the 11th European Conference on e-Learning, ECEL 2012, hosted this year by the University of Groningen, The Netherlands. The Conference Chair is Prof. Cees Smit Sibinga, and the Programme Chair is Dr Hans Beldhuis.

The conference will be opened with a keynote address by Prof. Eric Mazur from Harvard University, on the topic of Confessions of a Converted Lecturer. Also on the first day we have a presentation from Dan Peters of Blackboard, Europe. We would especially like to thank Blackboard for their support of the conference this year. The second day will be opened by Prof. dr. Fred Mulder, UNESCO Chair in Open Educational Resources at Open Universiteit, The Netherlands, and later that day we have Prof. Johannes Cronjé from Cape Peninsula University of Technology in South Africa.

As usual the papers cover a wide spectrum of issues, all of which are pertinent to the successful use of e-learning. The papers represent current thinking on e-learning issues and, within the five conference themes, contributions cover:

- open educational resources (OER) and use of online materials and learning analytics;
- (social) media use in e-learning, with a special focus on mobile learning;
- electronic assessment, with a special focus on automated assessment;
- didactics and e-learning, following Eric Mazur's work going deeper into the topic of peer instruction.

Other conference highlights cover literacy skills and the changing role of teachers.

It is clear that the role being played by e-learning in the pedagogical process is considerable and that there is still ample scope for further development. For this conference, the focus is on looking beyond the gadgets, and the broad spectrum of papers demonstrates when, where and how e-learning is selected for its true value.

The ECEL Conference constitutes a valuable platform for individuals to present their research findings, display their work in progress and discuss conceptual advances in many different branches of e-learning. At the same time, it provides an important opportunity for members of the community to come together with peers, share knowledge and exchange ideas.

With an initial submission of 165 abstracts, after the double-blind peer review process there are 59 academic papers, 1 PhD paper and 12 Work in Progress papers in these Conference Proceedings. These papers reflect the truly global nature of research in the area, with contributions from Austria, Canada, Czech Republic, Denmark, Estonia, Finland, Germany, Greece, India, Iran, Ireland, Italy, Japan, Jordan, Lithuania, Malaysia, New Zealand, Northern Ireland, Portugal, Romania, Slovakia, South Africa, Spain, Sweden, Switzerland, Taiwan, The Netherlands, United Kingdom and the United States.

A selection of papers, those agreed by a panel of reviewers and the editor, will be published in a special conference edition of the EJEL (Electronic Journal of e-Learning, www.ejel.org).

I wish you a most interesting conference.

Dr. Hans Beldhuis
University of Groningen, The Netherlands
Programme Chair
October 2012


Conference Committee Conference Executive Prof.!Cees!Th.!Smit!Sibinga,!University!of!Groningen,!The!Netherlands! Jaap!Westerhuijs,!Groningen!Convention!Bureau,!Groningen,!! Dr!Hans!J.A.!Beldhuis,!University!of!Groningen,!The!Netherlands! Jetse!Goris,!Wenckebach!Institute,!University!Medical!Center!Groningen! Dr!Koos!Winnips,!University!of!Groningen,!The!Netherlands! Mini Track Chairs Prof.!Dr.!Valentina!Dagiene,!Institute!of!Mathematics!at!Vilnius!University,!Vilnius,!Lithuania! Dr!Hans!Beldhuis,!University!of!Groningen,!Groningen,!The!Netherlands! Dr!Sue!Greener,!Brighton!Business!School,!University!of!Brighton,!UK! Committee members The!conference!programme!committee!consists!of!key!people!in!the!e learning!community!around!the!world.!The!following!people! have!confirmed!their!participation:! Ariffin! Abdul! Mutalib! (Universiti! Utara! Malaysia);! Siti! Aishah! Abdullah! (University! Technology! Mara,! Kelantan,! Malaysia);! Tofan! Cezarina!Adina!(Spiru!Haret!University,!Romania);!Wilfried!Admiraal!(Universiteit!van!Amsterdam,!Netherlands);!Shafqat!Ali!(Uni versity! of! Western! Sydney,! Australia);! Abdallah! Al Zoubi! (Princess! Sumaya! University! for! Technology,! Jordan);! Margarida! Amaral! (University!of!Porto,!Portugal);!Antonios!Andreatos!(Hellenic!Air!Force!Academy,!Greece);!Alla!Anohina!(Riga!Technical!University,! Latvia);!Jane!Ardus!(Stevenson!College,!Edinburgh,!UK);!Mohamed!Arteimi!(7th! of!April!University,!Tripoli,!Libya);!William!Ashraf! (University! of! Sussex,! UK);! Bunyamin! Atici! (Firat! University,! Turkey);! Anders! Avdic! (Orebro! University,! Sweden);! Simon! Bachelor! (Gamos,! Reading,! UK);! Joan! Ballantine! (University! of! Ulster,! UK);! Trevor! Barker! (University! of! Hertfordshire,! UK);! Josep Maria! Batalla! (Universitat! Oberta! de! Catalunya,! Spain);! Orlando! Belo! (University! of! Minho! Campus! de! Gualtar,! Portugal);! David! Benito! (Public!University!of!Navarre,!Pamplona,!Spain);!Yongmei!Bentley!(University!of!Luton,!UK);!Daniel!Biella!(University!of!Duisburg Essen,!Germany);!Radu!Bilba!(George!Bacovia!University,!Romania);!Eric!Bodger!(University!of!Winchester,!UK);!Stephen!Bowman! (Ravensbourne!College!of!Design!and!Communication,!UK);!Willem Paul!Brinkman!(Delft! University!of!Technology,!Netherlands);! Ann!Brown!(CASS!Business!School,!London,!UK);!Mark!Brown!(Massey!University,!Palmerston!North,!New!Zealand);!Giuseppe!Can navina! (University! of! Sheffield,! UK);! Sven! Carlsson! (Lund! University,! Sweden);! James! Carr! (University! of! Newcastle,! UK);! Maggie! Carson!(Edinburgh!University,!UK);!Antonio!Cartelli!(University!of!Cassino,,!Italy);!Rommert!Casimir!(Tilburg!University,!The!Nether lands);! Ivana! Cechova! (University! of! Defence,! Czech! Republic,);! Maria! Celentano! (University! of! Lecce,! Italy);! Athina! Chatzigavriil! (LSE,!London,!UK);!Satyadhyan!Chickerur!(M.S.!Ramaiah!Institute!of!Technology,!Bangalore,!India);!Burhan!China!(PDSA,!Somalia);! Barbara!Class!(University!of!Geneva,!Switzerland);!Lynn!Clouder!(Coventry!University,!UK);!Thomas!Connolly!(University!of!West!of! Scotland,!UK);!Ken!Currie!(Edinburgh!University,!UK);!Valentina!Dagiene!(Institute!of!Mathematics!and!Informatics,!Vilnius,!Lithua nia);!Mark!De!Groot!(Leeds!Metropolitian!University,!UK);!Antonio!De!Nicola!(ENEA,!Italy);!Carmen!De!Pablos!Heredero!(Rey!juan! Carlos!University,!Spain);!Rajiv!Dharaskar!(GH!Raisoni!College!of!Engineering,!Nagpur,!India);!Vicenzo!Di!Lecce!(Politecnico!di!Bari,! 
Italy);!Martina!Doolan!(University!of!Hertfordshire,!UK);!Christopher!Douce!(Institute!of!Educational!Technology,!Walton!Hall,!UK);! Yanqing!Duan!(University!of!Luton,!UK);!Jane!Eberle!(Emporia!State!University,!USA);!Colin!Egan!(University!of!Hertfordshire,!Hat field,!UK);!Bulent!Gursel!Emiroglu!(Eskisehir!Yolu!Baglica!Mevkii,!Turkey);!Chew!Esyin!(University!of!Glamorgan,!UK,);!Ariwa!Ezendu! (London! Metropolitan! University,! Uk);! Bekim! Fetaji! (South! East! European! University,! Tetovo,! Macedonia);! Andrea! Floros! (Ionian! University,! Greece);! Duncan! Folley! (Leeds! Metropolitian! University,! England);! Katie! Goeman! (Free! University! of! Brussels! (VUB),! Belgium);!Colin!Gray!(Edinburgh!Napier!University,!Scotland);!Susan!Greener!(University!of!Brighton,!UK);!David!Guralnick!(Kaleido scope!Learning,!New!York,!USA);!Richard!Hall!(De!Monfort!University,!Leicester,!UK);!Patricia!Harvey!(Greenwich!University,!Lon don,!UK);!Thanos!Hatziapostolou!(International!faculty!of!the!university!of!sheffield,!Greece);!Rose!Heaney!(University!of!East!Lon don,!UK,);!Alan!Hilliard!(University!of!Hertfordshire,!Hatfield,!UK);!Uwe!Hoppe!(Bildungswerk!der!Sächsischen!Wirtschaft!gGmbH,! Germany);! Md.! Fokhray! Hossain! (Daffodil! International! University,! Bangladesh);! Stefan! Hrastinski! (Uppsala! University,! Sweden);! Balde!Idiatou!(Noble!Group!Organised!Solutions,!Guinea);!Antonin!Jancarik!(Charles!University,!Czech!Republic);!Amanda!Jefferies! (University!of!Hertfordshire,!Hatfield,!UK);!Runa!Jesmin!(Global!Heart!Forum,!UK);!Aidan!!Johnston!(Glasgow!Caledonian!University,! UK);!Paul!Jones!(University!of!Glamorgan,!UK);!Geraldine!Jones!(University!of!Bath,!UK,);!Jowati!Juhary!(National!Defence!University! of!Malaysia,!Malaysia);!Tuomo!Kakkonen!(University!of!eastern!Finland,!Finland);!Michail!Kalogiannakis!(School!of!Pegadogical!and! Technicological!Education,!ASPETE,!Crete);!Clifton!Kandler!(University!of!Greenwich,!UK,);!Jana!Kapounova!(University!of!Ostrava!,! Czech!Republic);!Andrea!Kelz!(University!of!Applied!Sciences!Burgenland,Campus!Pinkafeld,!Austria);!Saba!!Khalil!(Virtual!University! of!Pakistan,!Lahore,!Pakistan);!Jasna!Kuljis!(Brunel!University,! UK);!Sunaina!Kumar!(Indira!Gandhi!National!Open!University,!New! Delhi,!INDIA);!Swapna!Kumar!(University!of!Florida,!USA);!venkata!Durga!kumar!(Sunway!University!College,!Malaysia,);!Blair!Kuntz! (University!of!Toronto,!Canada,);!Eugenijus!Kurilovas!(Vilnius!Gediminas!technical!university/institute!of!mathmatics!and!informat ics!of!Vinius!University,!Lithuania);!Eleni!Kyza!(Cyprus!University!of!Technology,!Lemesos,!Cyprus);!Maria!Lambrou!(University!of!the!


Aegean!Business!School,!Greece);!Andy!Lapham!(Thames!Valley!University,!UK);!Mona!Laroussi!(Institut!National!des!Sciences!Ap pliquées! et! de! la! Technologie,! Tnis! and! Lille,! Tunisia);! Deepak! Laxmi! Narasimha! (University! of! Malaya,! Malaysia);! Fotis! Lazarinis! (Applied!Informatics!in!Management!and!Finance,!Greece);!Denise!Leahy!(Trinity!College,!Dublin,!Ireland);!Kate!Lennon!(Glasgow! Caledonian!University,!UK);!Mariana!Lilley!(University!of!Hertfordshire,!UK);!Jorgen!Lindh!(Jonkoping!International!Business!School,! Sweden);! Gi Zen! Liu! (National!Cheng!Kung!University,!Taiwan);! Ying!Liu! (Cambridge!University,! UK);!Jenny!Lorimer!(University! of! Hertfortshire,!UK);!Sam!Lubbe!(University!of!South!Africa,!South!Africa);!Nick!Lund!(Manchester!Metropolitan!University,!England,);! Alejandra!Magana!(Purdue!University,!United!States!of!America,!United!States!of!America);!Adnan!Mahmood!(University!of!Jinan,! P.R.China,);!Francis!Maietta!(Real!Thinking!Company,!UK);!Christina!Mainka!(Heidelberg!University!,!Germany);!Chittaranjan!Mandal! (School!of!IT,IIT!Kharagpur,!India);!Augostino!Marengo!(University!of!Bari,!Italy);!Maria!J!Martinez Arguelles!(Universitat!Oberta!de! Catalunya,!Spain);!David!Mathew!(University!of!Bedfordshire,!England);!Sephanos!Mavromoustakos!(Cyprus!College,!Cyprus);!Erika! Mechlova!(University!of!Ostrava,!Czech!Republic);!Cherifa!Mehadji!(University!of!Strasbourg,!France);!Rosina!Merry!(the!school!of! Education! The! University! of! Waikatio,! New! Zealand);! Linda! Joy! Mesh! (Universita! degli! Studi! di! Siena,! Italy);! Jaroslava! Mikulecka! (University! of! Hradec! Kralove,! Czech! Republic);! Peter! Mikulecky! (University! of! Hradec! Kralove,! Czech! Republic);! Mike! Mimirinis! (Middlesex!University,!London,!UK);!Julia!Mingullon!(Universitat!oberta!de!catalunya,!Spain);!Ali!Moeini!(University!of!Tehran,!Iran);! Jonathan!Moizer!(University!of!Plymouth,!UK);!Johann!Moller!(University!of!South!Africa!(UNISA),!South!Africa);!Peter!Monthienvi chienchai!(Insitute!of!Education,!London,!UK);!Pam!Moule!(University!of!the!West!of!England,!Bristol,!UK);!Radouane!Mrabet!(EN SIA,!Morocco);!Minoru!Nakayama!(Tokoyo!Institute!of!Technology,!Japan);!Julian!Newman!(Glasgow!Caledonian!University!,!UK);! Chetsada!Noknoi!(Thaksin!University,!Songkhla,!Thailand);!Abel!Nyamapfene!(University!of!Exeter,!United!Kingdom);!Maruff!Akin wale!Oladejo!(Federal!College!of!Education!(Special),!Nigeria);!Sinead!O’Neill!(Waterford!Institute!of!Technology,!Ireland);!Kamila! Olsevicova!(Univeristy!of!Hradec!Kralove,!Czech!Republic);!Rikke!Orngreen!(Aalbourg!University,!Denmark);!Jalil!Othman,!(Univer sity! of! Malaya! ,Malaysia);! Kutluk! Ozguven! (Dogus! University,! Turkey);! Ecaterina! Pacurar! Giacomini! (Louis! Pasteur! University,! FRANCE);! Alessandro! Pagano! (University! of! Bari,! Italy);! Vasileios! Paliktzoglou! (University! of! eastern! Finland,! Finland);! Stefanie! Panke!(University!of!Ulm,!Germany);!George!Papadopoulos!(University!of!Cyprus,!Cyprus);!Iraklis!Paraskakis!(South!East!European! Research!Centre!(SEERC)!Research!Centre!of!the!University!of!Sheffiled,!Thessaloniki,!Greece);!Vivien!Paraskevi!(TECFA FPSE,!Edu cational! Technology! Unit,! University! of! Geneva,! Switzerland,! Switzerland);! Angie! Parker! (Anthem! College! Online,! USA,);! Paul! Peachey!(University!of!Glamorgan,!Treforest,!UK);!Arna!Peretz!(Ben!Gurion!Univeristy!of!the!Negev,!Omer,!Israel);!Christine!Perry! (University! of! the! West! of! England,! Bristol,! UK);! Donatella! Persico! (Istituto! 
Tecnologie! Didattiochje Consiglio! Nazionale! Ricerche,! Genova,!Italy);!Pit!Pichappan!(Annamalai!University,!India);!Selwyn!Piramuthu!(University!of!Florida,!Gainesville,!USA);!Michel!Plais ent!(University!of!Quebec!in!Montreal,!Canada);!Lubomir!Popelinsky!(Masaryk!University,!Czech!Republic);!Andy!Pulman!(Bourne mouth! University,! UK);! Muhammad! Abdul! Qadir! (Mohammad! Ali! Jinnah! University,! Islamabad,! Pakistan);! Ricardo! Queirós! (ESEIG/KMILT!&!CRACS/INESC,!Portugal);!Susannah!Quinsee!(City!University,!London,!UK);!Abdul!Rafay!(Asia!Pacific!University!Col lege!of!Technology!&!Innovation,!Malaysia);!Liana!Razmerita!(Copenhagen!Business!School,!Denmark);!Christopher!Reade!(Kingston! University,!UK);!Hugo!Ribeiro!(University!of!Porto,!Portugal,);!Vivien!Rolfe!(De!Monfort!University,!Leicester,!UK);!Asher!Rospigliosis! (University!of!Brighton,!UK);!Eleni!Rossiou!(University!of!Macedonia,!Greece);!Florin!Salajan!(North!Dakota!State!University!,!Can ada);!David!Sammon!(Univesity!College!Cork,!Ireland);!Gustavo!Santos!(University!of!Porto,!Portugal);!Venkat!Sastry!(Defence!Col lege!of!Management!and!Technology,!Cranfield!University,!UK);!Guy!Saward!(University!of!Hertfordshire,!UK,);!Brian!Sayer!(Univer sity! of! London,! UK);! Jeanne! Schreurs! (Hasselt! University,! Diepenbeek,! belgium);! Jane! Secker! (London! School! of! Economics,! UK);! Angela!Shapiro!(Glasgow!Caledonian!University,!Scotland);!Aileen!Sibbald!(Napier!University,!Scotland,!UK);!Petia!Sice!(University!of! Northumbria,! Newcastle upon Tyne,! UK);! Ali! Simsek! (Anadolu! University,! Turkey);! Gurmeet! Singh! (The! University! of! The! South! Pacific,! Suva! ,! Fiji,! Fiji);! Cees! Th.! Smit! Sibinga! (Academic! insitute! for! the! international! development! of! transfusion! medicine,! The! Neverlands);! Alisdair! Smithies! (Manchester! Medical! School,! UK);! Keith! Smyth! (Napier! University,! Edinburgh,! UK);! Bent! Soelberg! (Copenhagen!Business!School,!Denmark);!Or!Kan!Soh!(University!Tunku!Abdul!Rahman!(UTAR),!Malaysia);!Yeong Tae!Song!(Towson! University,!Maryland,!USA);!Michael!Sonntag!(FIM,!Johannes!Kepler!University,!Linz,!Austria);!Rumen!Stainov!(University!of!Applied! Sciences,!Fulda,!Germany);!John!Stav!(Sor Trondelag!University!College,!Norway);!Roxana!Taddei!(Université!Clermont!Ferrand!2,! Montpellier,!!France);!Yana!Tainsh!(University!of!Greenwich,!UK);!Heiman!Tali!(The!Open!University,!Israel);!Bénédicte!Talon!(Uni versité!du!Littoral,!France);!Marian!Theron!(False!Bay!College,!Tokai,!South!Africa);!John!Thompson!(Buffalo!State!College,!USA);! Claudine!Toffolon!(Université!du!Mans! !IUT!de!Laval,!France);!Eulalia!Torras Virgili!(Open!University!of!catalonia,!Spain);!Kathryn! Trinder!(Glasgow!Caledonian!University,!UK);!Christopher!Turner!(University!of!Winchester!,!UK);!Karin!Tweddell!Levinsen!(Danish! University!of!Education,!Denmark);!Aimilia!Tzanavari!(University!of!Nicosia,!Cyprus);!Huseyin!Uzunboylu!(Near!East!University,!CY PRUS);!Linda!Van!Ryneveld!(Tshwane!University!of!Technology,!Pretoria,!South!Africa);!Carlos!Vaz!de!Carvalho!(Porto!Polytechnic,! Portugal);!Andreas!Veglis!(Aristotle!University!of!Thessaloniki,!Greece);!Bruno!Warin!(Université!du!Littoral,!Calais,!France);!Fahad! Waseem! (University! of! Northumbria,! Middlesbrough,! UK);! Garry! Watkins! (University! of! Central! Lancashire,! UK);! Anne! Wheeler! (Aston!University,!UK);!Steve!Wheeler!(Faculty!of!Education,!University!of!Plymouth,!UK);!Nicola!Whitton!(Manchester!Metropoli tan! University,! UK);! Roy! Williams! 
(University! of! Portsmouth,! UK);! Shirley! Williams! (University! of! Reading,! UK);! Stanislaw! Wrycza! (University! of! Gdansk,! Poland);! Rowena! Yeats! (University! of! Birmingham,! UK);! Panagiotis! Zaharias! (University! of! the! Aegean,! Greece);!Mingming!Zhou!(Nanyang!Technological!University,!Singapore);!Chris!Zielinski!(External!relations!and!Governing!Bodies,! World!Health!Organization,!Geneva,!Switzerland);!Anna!Zoakou!(Ellinogermaniki!Agogi,!Greece).!!!


Three by Three by Three: A Model for eLearning Evaluation

Donatella Persico, Stefania Manca and Francesca Pozzi
Institute for Educational Technology, National Research Council, Genova, Italy
[email protected]
[email protected]
[email protected]

Abstract: This paper describes the approach adopted to carry out an evaluation of the STEEL eLearning system, developed for an Italian online university. The aim of the evaluation study was to identify the strengths and weaknesses of the system in view of its innovative potential and impact. The core of the adopted approach is based on an adaptation of the Technology Acceptance Model (TAM), originally proposed by Davis (1989) as a theory to model how users of a new technological system accept it and take it on, based on two main acceptance factors: perceived usefulness and perceived ease-of-use. Although the TAM was not developed specifically for educational contexts, several authors have subsequently proposed and demonstrated how adaptations of the TAM theory can be used to evaluate the impact of technology even in these contexts. This paper is a step forward along this line of work to perform formative evaluation of eLearning systems. Though based on the TAM, our approach is also grounded on a couple of assumptions. The first is that data concerning actual use of the system and its effectiveness, in terms of concrete actions and achievements of its users, should also be considered, at least to complement and verify the data concerning users' perceptions. A second assumption is that the evaluation should consider all the phases of use of the system (course design, running and evaluation), all the users of the system (students, teachers, and eLearning management), and all the components of the system (the eLearning platform, the learning resources and, last but not least, the pedagogical approach underlying the eLearning system). The resulting model is a three-dimensional one (phases of use, users and components), with three aspects to be considered on each axis. For each of the 27 combinations of these aspects, indicators of usefulness and ease-of-use have been identified, as well as, when available, data concerning actual use (derived from the tracking functions of the platform) and data regarding effectiveness (based on teachers' adoption of new tools and students' exam results). The paper describes the model and its indicators and discusses its pros and cons based on its field test. A summary of the results of the evaluation is presented, along with considerations about future research in the field.

Keywords: eLearning evaluation, technology acceptance model (TAM), tracking, learning outcomes

1. Introduction

Research on the evaluation of formative systems has demonstrated the need for suitable methodologies and tools (Bloom et al., 1971; Flagg, 1990; Macdonald, 2003; Scriven, 1991). In the context of eLearning, much of the literature points out that evaluation should consider the system under study as a comprehensive set of components comprising virtual and/or real learning environments, human resources and learning material (Ardito et al., 2006; Giannakos, 2010; Jeffels, 2010; Lanzilotti et al., 2006; Rovai, 2003). In addition, the evaluation of a formative system cannot ignore the underlying methodological approach that, explicitly or implicitly, pervades the whole system and influences the work of those who use it (institutions, teachers and learners).

Another aspect that should be considered is the aim of the evaluation. While some approaches are directed towards achieving a summative evaluation of the system, meaning that their output is the formulation of a comprehensive judgment on such a system, often with certifying purposes, other approaches have a formative nature, i.e. their aim is to pinpoint the strengths and weaknesses of the system so that it is possible to devise strategies to improve it. The study presented in this paper belongs to the latter category, thus aiming to investigate the quality of an eLearning system in terms of its suitability to the needs of the various types of users. The identification of its critical issues is also pursued in order to find ways for further development and improvement. Consequently, the approach chosen favoured the adoption of tools that supply information on users' acceptance (in terms of perceived ease-of-use and usefulness), on the impact determined by actual use of the system, and on the effectiveness of the formative processes it allows to be put into place.

As far as users' acceptance is concerned, one of the most well-known models to study user acceptance of innovations based on the introduction of new technology is the Technology Acceptance Model (TAM), originally proposed by Davis (1989). Although the TAM was not specifically developed for the evaluation of eLearning systems, its application in this area is now quite a consolidated practice (Edmunds, Thorpe and Conole, 2012; Park, 2009; Teo, 2009). However, starting from the definition above of a formative system as one that includes human resources and educational material, as well as a methodological approach, its use in this study has been stretched to cover non-technological components of eLearning systems as well.

In digital environments, the actual use of a learning system can be specifically assessed thanks to the tracking capabilities featured by most eLearning platforms. All the actions performed by users inside the system can be tracked, and their analysis (automatic or semi-automatic) provides both quantitative and qualitative information about system use. Indicators of system use may relate to the use of material, completion of and performance in learning activities, communication exchanges with other participants, sharing of material, resources and ideas, but also motivational and emotional aspects (Aviv et al., 2003; Daradoumis, Martinez-Monés and Xhafa, 2004; Garrison and Anderson, 2003; Pozzi et al., 2007; Schrire, 2006).

The effectiveness of the formative processes is also considered by many researchers as important information on the quality of a learning system (e.g. Lee, 2005). This indicator is rather difficult to measure empirically, unless all the variables involved can be controlled, which is rather difficult to do in real-life environments. However, some indications can also be drawn from an analysis of the students' learning outcomes, provided that these data are taken with due caution.

In the following, an evaluation approach that mainly comprises the TAM, combined with tracking features and indicators from learning assessment, is presented. The paper is organized into a number of sections discussing the context of the research; the evaluation approach; the main results achieved in the testing phase; and conclusions and recommendations for further research.

2. Context of the study: The STEEL project

The work described in this paper was carried out as part of STEEL (Systems enabling technologies and methods for online learning, http://steel.cilea.it), an Italian project involving the following partners: CNIT (National Interuniversity Consortium for Telecommunications), CILEA (Interuniversity Consortium for Automatic Computation), UNISANRAFFAELE (San Raffaele International Online University in Rome), and ITD-CNR (Institute for Educational Technology of the National Research Council of Italy). The aim of STEEL was the development of an eLearning system integrating different media and different communication technologies, and its evaluation through a pilot study involving a significant number of courses and students of UNISANRAFFAELE. While a thorough description of the STEEL system is provided elsewhere (Del Re et al., 2010), this paper specifically concerns the approach adopted to carry out its evaluation.

The STEEL project (2007-2011) developed along two intertwining lines of work: a technological and a methodological one. The first line aimed to develop a technological platform for online learning consisting of two integrated and tightly interconnected macro-components: an asynchronous open source Learning Management System (Moodle, http://moodle.org) and a Virtual Synchronous Classroom based on Elluminate live! (http://www.elluminate.com), allowing audio-video-textual videoconferencing enhanced with some collaboration functions (sharing of applications, guided tours of the Web, etc.). The STEEL platform makes it possible for teachers to design several types of learning activities for their students, to support them in the study of their subject, and to devise and make available online assignment tasks, simulations, tools for e-portfolio management and formative assessment tests. Teachers can also release audio-visual as well as other types of learning material and structure educational contents into learning units; as to tutoring, they can use several tools to monitor students' online work and to manage evaluation activities. As to collaborative learning, the system features both asynchronous tools such as forums and wikis, and synchronous tools such as textual chats, instant messaging and audio-visual communication (mainly via Elluminate live!) (Del Re et al., 2010).

The methodological line aimed to define a flexible instructional model that allows for personalised learning paths according to users' needs and preferences. From the instructional point of view, the system allows the design and implementation of different eLearning approaches, from self-instructional solutions (audio or video lessons, individual exercises, etc.) to collaborative activities (asynchronous discussion, collective tasks, brainstorming, problem solving). The transition from the instructional mode in use at UNISANRAFFAELE before the project (mostly transmissive) to the STEEL approach was facilitated by the project partners through a blended course called DID@STEEL. This course was run in the second half of 2009 and lasted 5 months. Its main aim was to get the university teaching staff acquainted with the new system and, in particular, with those teaching and learning methods that STEEL makes possible, thus allowing for a smooth transition to the new system. During the final part of this training course, the university teachers tried to put into practice what they had learnt by (re)designing their courses. After the DID@STEEL course, the final stretch of the project was devoted to a pilot test of the new courses and consequently to an evaluation of the STEEL system, which is described in the following section.

3. Evaluating the STEEL eLearning system

The overall evaluation process took place from April 2010 to July 2011, beginning with a pilot test (11 months, from April 2010 to March 2011) and then devoting the remaining months to data collection and analysis. The field test consisted of using the system to deliver eight pilot courses, a selection of the teaching subjects from the sports science programme of UNISANRAFFAELE, each involving about 100 active students (active meaning students who accessed the course at least once). This field test was aimed at verifying the impact of the new system from both the standpoint of methodological effectiveness and that of technological quality. The methodological aspects mostly concern the soundness and effectiveness of the instructional design of the courses, while the technological aspects primarily regard the suitability and reliability of the software platform.

The eight teachers were in different positions: only one had already designed and delivered various editions of her course; the others were newly hired and their courses had been taught by others or, in one case, never taught before. After the DID@STEEL course, all the teachers chose to (re)shape their courses by introducing new methods and tools, except for the teacher of the new course, who designed her course from scratch. Hence, the pilot test consisted of using the platform to run pre-existing courses enhanced with STEEL methods and tools in all cases but one.

As already mentioned, the approach adopted was a formative one, aimed at improving the system from inside the project, possibly minimizing wasted effort and anticipating problems in a constructive manner. To this end, the evaluators could contribute to the project from its early stages, continuously monitoring project development and trying to anticipate possible problems. For example, the decision to run the DID@STEEL course derived from an analysis of the context where the pilot courses were to be run, which highlighted the need to create more favourable conditions for the introduction of STEEL and the adoption of its tools.

Many choices made in this project are in line with the findings of Laurillard et al. (2009). These authors underlined the need for active participation of teachers in educational innovation processes and the importance of getting them involved in the design of any such initiative. Moreover, they suggest implementing changes gradually rather than imposing sudden alterations of current practices. In the case of STEEL, the UNISANRAFFAELE teachers were introduced to the new system and its methodological innovations and then supported during the re-design phase of their courses, knowing that new tools and methods cannot be forced on teachers and learners but need to be tried out so that appreciation of their advantages can bring about the motivation needed to face possible problems. In STEEL, the engagement of teachers was essential not only to implement innovation but also to collect the evaluation data.

3.1 The evaluation approach

The evaluation was carried out by a team of researchers from ITD-CNR. This project partner was not involved in the development of the system except for what pertained to its role of formative evaluator.

The core of the adopted approach is based on an adaptation of the TAM, where user acceptance of a new technology is measured in terms of two factors: perceived usefulness and perceived ease-of-use. The former is defined as "the degree to which a person believes that using a particular system would enhance his or her job performance", while the latter is "the degree to which a person believes that using a particular system would be free from effort" (Davis, 1989: 320).

In addition to the basic ideas of TAM, the approach proposed is also grounded on a couple of assumptions. The first is that, in view of assessing the impact of an eLearning system on its users, information about perceived usefulness and perceived ease-of-use is not sufficient: data concerning the actual use of the system and its effectiveness, in terms of concrete actions and achievements of its users, should also be considered, at least to complement and verify users' perceptions. A second assumption is that the evaluation should consider all the phases of use of the system, all of its users, and of course all of its components. In our case, the phases of use are the development phases of the courses: course design, running (or delivery) and evaluation. The users are not only the students, but also the teachers and the eLearning managers (experts who provided technical support to teachers and students concerning use of the platform, and university employees who handled enrolments and supported other users on administration matters). As to the components of the system, these are the eLearning platform, the learning resources and, last but not least, the underlying pedagogical approach, be it explicitly defined or tacitly accepted.

So, the model resulting from the above considerations is a three-dimensional one (phases of use, users and components), with three aspects to be considered on each axis (see figure 1). For the 27 combinations of these aspects, indicators of usefulness and ease-of-use have been identified, as well as data concerning actual use (derived from the tracking functions of the platform) and data regarding effectiveness (based on teachers' adoption of the new tools and students' exam results).
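As a purely illustrative sketch of how such a three-dimensional indicator space can be represented in code, the following Python fragment enumerates the 27 (phase, user, component) cells and attaches usefulness and ease-of-use indicators to one of them; the axis labels and the example indicators are ours, not taken verbatim from the STEEL evaluation materials.

```python
from itertools import product

# The three aspects on each axis of the evaluation model.
PHASES = ("design", "running", "evaluation")
USERS = ("student", "teacher", "elearning_manager")
COMPONENTS = ("platform", "learning_resources", "pedagogical_approach")

# One cell per (phase, user, component) combination: 3 x 3 x 3 = 27 cells,
# each holding the usefulness and ease-of-use indicators probed for it.
cells = {
    combo: {"usefulness": [], "ease_of_use": []}
    for combo in product(PHASES, USERS, COMPONENTS)
}

# Hypothetical indicators for one cell: the platform as seen by teachers
# during course design.
cells[("design", "teacher", "platform")]["usefulness"].append(
    "support for structuring content into learning units")
cells[("design", "teacher", "platform")]["ease_of_use"].append(
    "effort needed to set up formative assessment tests")

print(len(cells))  # 27
```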

Figure 1: The three-dimensional model used for the evaluation of the STEEL system

3.1.1 Evaluation means

According to the STEEL evaluation approach, the three-dimensional space of figure 1 can be split into three bi-dimensional planes, one for each type of user. We will call these user-views. For each type of user, suitable means were identified to gather data concerning their opinions on the relevant indicators. These means were chosen according to suitability criteria, such as expected number of respondents, ease of contact and richness of expected information.

More specifically, the students were addressed through questionnaires requiring them to rate each indicator on a 5-point Likert scale, so as to obtain quantitative information about qualitative aspects such as perceived ease-of-use and perceived usefulness of the different system components. Given that there were only eight teachers involved in the pilot study, it was felt that a structured individual interview would allow us to gather more in-depth information, in addition to quantitative assessment of some aspects that teachers were required to rate. The interviews were recorded to allow further analysis.
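The Likert ratings collected in this way lend themselves to simple aggregation. A minimal sketch, assuming each item is scored from 1 to 5 and stored per student (the item names and values below are invented for illustration):

```python
from statistics import mean

# Hypothetical 1-5 ratings from three students for two items of the
# student questionnaire (running phase, platform component).
responses = {
    "platform_running_usefulness": [4, 5, 3],
    "platform_running_ease_of_use": [2, 3, 3],
}

# Mean score per item; in the evaluation these figures would populate the
# corresponding cells of the student-view.
scores = {item: round(float(mean(values)), 2) for item, values in responses.items()}
print(scores)  # {'platform_running_usefulness': 4.0, 'platform_running_ease_of_use': 2.67}
```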

As for the six eLearning managers, they were asked to take part in a one-day meeting structured with the 'meta-plan' technique. This meeting allowed us to collect their complementary views on the problems and strengths of the system. They were also asked to rate the relevant indicators.

3.1.2 Indicators and items

The two dimensions of the system components and course development phases determine a matrix consisting of nine cells, shown in figure 2.

Figure 2: The 3x3 matrix identified by the phases of course development and the system components

The matrix identifies the nine categories of indicators used to determine the user-views and therefore to build the data-collection tools. Each user-view is a subset of the matrix, containing those aspects for which it makes sense to probe the relevant user group. For example, since the design phase of a course mostly concerns teachers and eLearning managers, the first column of the student-view is empty. Similarly, the last column of the student-view is almost empty, except for those aspects that have to do with self-monitoring and self-assessment.

According to our model, each of the three user-views derived from the 3x3 matrix is an instance containing the relevant aspects for that type of user. The tools actually used to probe the users involved in the pilot study were built on the basis of the relevant view, so the student questionnaire was based on the student-view, the content of the teacher interview derived from the teacher-view, and the structure of the meta-plan for the eLearning managers came from the eLearning-manager-view. Since it would take too long here to include all the user-views and all the tools derived from them, only one example is reported in detail, focusing on the first column of the teacher-view and showing how the relevant section of the interview structure was derived from it (figure 3). Section 4 provides a summary of the data obtained with this adaptation of the TAM, integrated with those coming from tracking and learning outcome analysis.
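To illustrate the idea of a user-view as a partially empty copy of the matrix, here is a rough Python sketch (our own reconstruction, not the project's actual tooling) that derives a student-view by dropping the design column and restricting the evaluation column to self-assessment aspects:

```python
PHASES = ("design", "running", "evaluation")
COMPONENTS = ("platform", "learning_resources", "pedagogical_approach")

# Full 3x3 matrix: every (component, phase) cell can in principle carry indicators.
full_matrix = {(c, p): f"indicators for {c} during {p}"
               for c in COMPONENTS for p in PHASES}

def student_view(matrix):
    """Keep only the cells it makes sense to probe students about: nothing
    from the design phase, and from the evaluation phase only the aspects
    related to self-monitoring and self-assessment."""
    view = {}
    for (component, phase), indicators in matrix.items():
        if phase == "design":
            continue  # students are not involved in course design
        if phase == "evaluation":
            indicators += " (self-assessment aspects only)"
        view[(component, phase)] = indicators
    return view

print(len(student_view(full_matrix)))  # 6 of the 9 cells remain populated
```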

3.1.3 Complementing the TAM with tracking data and learning outcomes

As introduced above, the STEEL evaluation approach complemented the data deriving from the adapted TAM approach with data concerning actual use of the system, obtained thanks to the tracking functions of the platform, and data concerning learning outcomes, deriving from the normal learning assessment procedures of the online university.

More specifically, actual use during the pilot study was tracked for students (about 100), teachers (the eight teachers of the courses involved) and eLearning managers (six people). The STEEL system provides accurate and detailed tracking data, so a selection was made to identify the most useful indicators in view of the aim of complementing, or assessing the credibility of, the information deriving from the TAM study. Given that the transmissive approach dominating at UNISANRAFFAELE before this study was still rather strong after the introduction of the STEEL system, data about the use of textual, audio, video and interactive learning material were among those considered. However, data about message exchanges in forums and usage of synchronous communication were also collected, in order to understand to what extent interactions between students and teachers had been practiced.

Figure 3: Deriving the teacher interview from the teacher-view (design section)

Examples of tracking data for teachers therefore included the number of messages sent and read; the number of accesses and contributions to the glossary, the database and the wiki; the number of quiz revisions; the number of times teachers participated in chats; and the number of assignments evaluated. As for students, similar criteria were used, except that, instead of tracking student behaviour for each individual resource, the data gathered were limited to checking whether individual students had used one particular kind of resource at least once. Examples of this kind are the number and percentage of students who used at least one audio or video lesson, who read or sent a message in a forum, who accessed or contributed to the glossary/database/wiki, who accessed, used and completed quizzes, who accessed webpages, and who participated in chats.

Finally, an ad-hoc system was developed to evaluate the effectiveness and efficiency of the services provided by the eLearning managers. This system is based on the MantisBT open source bug-tracking system (http://www.mantisbt.org/), which keeps track of help requests sent, help requests processed, time to delivery of support, queues and priority of requests, and other relevant data.
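As an example of how the student-side indicators can be computed, the sketch below derives the "used at least once" counts and percentages from a list of hypothetical platform log records (the student identifiers, resource types and enrolment figures are invented):

```python
from collections import defaultdict

# Hypothetical raw tracking records: one (student_id, resource_type) pair per access.
log = [
    ("s01", "video_lesson"), ("s01", "forum"), ("s01", "forum"),
    ("s02", "quiz"), ("s02", "video_lesson"),
    ("s03", "forum"),
]
enrolled_students = {"s01", "s02", "s03", "s04"}

# For each resource type, the set of students who accessed it at least once.
used_at_least_once = defaultdict(set)
for student, resource in log:
    used_at_least_once[resource].add(student)

# Number and percentage of enrolled students who used each resource at least once.
for resource, students in sorted(used_at_least_once.items()):
    pct = 100 * len(students) / len(enrolled_students)
    print(f"{resource}: {len(students)} of {len(enrolled_students)} students ({pct:.0f}%)")
```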

As far as learning outcomes are concerned, these data traditionally comprise the drop-out percentage, data about success and failure in exams, average grades obtained, etc. These data were available for the evaluation of the STEEL system, even though student anonymity had to be preserved for legal and ethical reasons. However, since the new system replaced a previously existing eLearning system, a comparative study of pre-STEEL courses and pilot courses was not possible, given that many of the variables at play were not controllable: most of the teachers had changed, but one had not; some of the students had started studying with the old system and ended with the new one; some of the exams had a different form in the two phases; etc. Due to these limitations, two types of data were considered: the structure of the final exam in terms of content and assessment method, and the results (grades) obtained by the students. This information was collected through a form, filled in by the teachers after each exam session in the pilot period. The form requested information about the course aims and content, the assessment procedure, and the performance of individual students. These data allowed us to check what objectives were consistently achieved or not achieved by several students, thus pointing out potentially successful or critical aspects of the courses. Cross-referencing these data with those on use and with perceptions of ease-of-use and usefulness completed the picture and allowed the evaluators to draw some useful conclusions.
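The kind of cross-referencing described above can be prototyped in a few lines of code. The following sketch combines invented per-course figures for exam outcomes, tracked use and perceived ease-of-use, and flags the courses worth a closer look; the thresholds are illustrative assumptions, not values used in the STEEL evaluation.

```python
# Hypothetical per-course summaries combining the three data sources.
courses = {
    "course_A": {"pass_rate": 0.85, "forum_use_pct": 60, "ease_of_use": 4.1},
    "course_B": {"pass_rate": 0.55, "forum_use_pct": 15, "ease_of_use": 2.8},
}

def critical_signals(data, min_pass=0.6, min_use=25, min_ease=3.0):
    """List the reasons a course looks problematic when exam outcomes,
    actual use and perceived ease-of-use are read side by side."""
    reasons = []
    if data["pass_rate"] < min_pass:
        reasons.append("low pass rate")
    if data["forum_use_pct"] < min_use:
        reasons.append("little use of communication tools")
    if data["ease_of_use"] < min_ease:
        reasons.append("low perceived ease-of-use")
    return reasons

for name, data in courses.items():
    print(name, "->", critical_signals(data) or "no critical signals")
```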

4. Main results

In the following, the main elements emerging from the STEEL evaluation are discussed; further details are reported in Delfino et al. (2011). The data obtained are undoubtedly rich and complex, thanks to the variety of methods, which supplied information complementary to each other; for the sake of clarity, they are grouped here into educational, technological and organizational categories.

From the educational point of view, data from tracking and from the adapted TAM suggest that the introduction of the system brought about important changes in the approach adopted within each course: most teachers redesigned learning objectives, contents, presentation modes and other aspects of their teaching approach. Some also introduced the possibility of customizing learning paths, mainly in terms of timing, and set up self-assessment procedures and flexible final assessment methods. All of these activities concerning the instructional design of their courses were considered relatively easy to perform and quite useful by teachers, and went down well with students. In addition, the use of tools facilitating communication among course actors increased remarkably, the prevalent uses being asynchronous communication and one-to-many support, while interaction among students and group work were still not very common.

The degree of acceptance of the platform by students was, in general, quite high. However, consistently with what was just stated about the preferred modes of communication, tools supporting real-time communication (chat and videoconference) turned out to have lower degrees of acceptance than the other platform components, such as asynchronous communication tools. This lower acceptance was put down to difficulties of use, which generated a certain distrust of synchronous communication tools. In this regard, the opinion of the eLearning managers and teachers is that the use of synchronous tools is rather cumbersome to arrange due to the requirement of simultaneous access. It was also suggested that the choice of such a rich tool as Elluminate live! for synchronous communication led to the integration of functions which are too sophisticated for the needs of UNISANRAFFAELE users. Similarly, the limited use of the more specialized Moodle functions oriented to collaborative learning, such as the wiki, database and glossary, and the much higher uptake of forums, confirm the impression of a low degree of maturity of STEEL users, both teachers and students, towards the educational use of the STEEL functions. It should be noted that this type of collective maturity cannot be attained in a short time. In this sense, the DID@STEEL course was certainly useful, and direct, continuous and prolonged counselling for teachers could, in the long run, further promote the uptake of innovation. Moreover, ad-hoc initiatives aimed at teaching students how to use the less popular system tools could foster their wider use.

As for the organizational aspects, the introduction of STEEL has determined some important changes, such as teachers' increased autonomy in making changes to their course structure or producing new learning materials without external support.

5. Conclusion and future directions

In the following, the main lessons learnt from this experience are summarized. The evaluation model adopted is rooted in existing theory and practice, but it has some peculiarities that have proved important and could be usefully transferred to similar contexts. These peculiarities are the three dimensions considered to implement the TAM and the fact that the TAM results have been integrated with other data coming from the pilot experiment.

As for the three dimensions of the approach (users, course development phases and system components), the following comments can be made. Involving all types of users in the evaluation is essential, not only because they provide multiple points of view on the same aspects, but also because they actually see different kinds of problems. In general, each type of user needs to be reached with ad hoc tools, but since these tools derived from a common methodological framework, they were easy to merge into a comprehensive picture. The three phases of course development were also very important, mostly for teachers, because providing them with the most effective tools to deliver their courses would be useless if care were not also taken in supporting design, monitoring and evaluation effectively. Last but not least, the system components should not be limited to the hardware and software made available by the eLearning system: overlooking the underlying pedagogical approach would prevent the evaluators from understanding the intended use and thus from realizing when the system does not comply with the needs of the users.

Tracking the actual use of the platform and considering the students' learning outcomes helped greatly in finding some of the main weaknesses of the system: tracking pointed out platform functionalities that were under-used by students and/or teachers, while learning outcome analysis was necessary to check that the introduction of the system did not generate any macroscopic problems in terms of drop-outs or assessment results. As conceived, the outcome analysis should also have pointed out problematic areas inside individual courses, but the information provided by the teachers on the assessment results was not accurate enough to achieve this aim.

This evaluation approach has several strengths and some drawbacks. Among the strengths, its completeness and internal coherence are perhaps the most important. Once the indicator matrix is ready, the users' views can be derived and the tools to probe the users can be built. Even though the tools are different, they have the same structure, and this is very useful when the data coming from the different sources are compared. The presence of both qualitative and quantitative data is also an asset of the method: the users' opinions help to identify the causes of problems, while the ratings help to measure their importance. Among the drawbacks, it should be noted that the whole evaluation process required a great deal of effort, which was made possible, in STEEL, by the availability of funding devoted to this purpose. However, much of the work was devoted to defining the general framework of the approach; hence, its results could be partly reused in view of transferability to a different context. A second aspect that turned out to be problematic derives from the approach being heavily based on a pilot run of several courses in a real environment, with real teachers and students. Although this feature is an indispensable source of reliable information, it forces the evaluators to cope with a number of issues, such as ensuring a smooth transition from the old system to the new one, ensuring an acceptable quality of service during the experiment, and renouncing the well-known advantages of in-vitro experiments in which all variables except those under study are controlled.
During the STEEL pilot, UNISANRAFFAELE hired new teachers and made changes to the course programme, thus interfering with the evaluators' work at various levels. Moreover, although the juxtaposition of different data proved useful for the study, the need to respect the privacy of students' data did not allow the evaluators to cross-check some of the results: for example, it was not possible to relate a student's marks in the final exams to the activity that student performed on the platform.

Summing up, this approach shows that research in eLearning evaluation is facing a scenario characterized, on the one hand, by ever more accurate and effective tools for monitoring and mapping objective data on learning processes and, on the other hand, by challenges in the management, presentation, analysis and interpretation of these data. As a consequence, a research direction that deserves further effort is the definition of effective methods and tools to support this phase of the formative evaluation of eLearning systems.
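To make the indicator matrix mentioned in the conclusions more concrete, the following sketch shows one possible way of encoding indicators along the three dimensions of the approach (user type, course development phase, system component) and of deriving the questionnaire items addressed to a given type of user. The data structure and the sample items are invented for illustration; they are not taken from the STEEL evaluation material.

# Illustrative sketch only: one possible encoding of a three-dimensional indicator matrix.
from dataclasses import dataclass

@dataclass
class Indicator:
    user: str        # "student", "teacher", "eLearning manager", ...
    phase: str       # "design", "delivery", "evaluation"
    component: str   # "forum", "videoconference", "pedagogical approach", ...
    construct: str   # TAM construct probed: "ease_of_use" or "usefulness"
    question: str    # item used to probe this user's perception

# Two made-up entries; a real matrix would cover all the cells of interest.
matrix = [
    Indicator("teacher", "design", "course editor", "ease_of_use",
              "How easy was it to restructure your course materials in the new system?"),
    Indicator("student", "delivery", "forum", "usefulness",
              "How useful was the forum for obtaining support from your teacher?"),
]

def items_for(user, matrix):
    """Derive the questionnaire items addressed to one type of user."""
    return [ind.question for ind in matrix if ind.user == user]

print(items_for("teacher", matrix))

Because every item originates from the same (user, phase, component) cells, the answers collected from different types of users can be compared cell by cell, which is the property described above as the internal coherence of the approach.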

References

Ardito, C., Costabile, M.F., De Marsico, M., Lanzilotti, R., Levialdi, S., Roselli, T. and Rossano, V. (2006) "An approach to usability evaluation of eLearning applications", Universal Access in the Information Society, Vol 4, No. 3, pp 270-283.

Aviv, R., Erlich, Z., Ravid, G. and Geva, A. (2003) "Network analysis of knowledge construction in asynchronous learning networks", Journal of Asynchronous Learning Networks, Vol 7, No. 3, pp 1-23.
Bloom, B.S., Hastings, J.T. and Madaus, G.F. (1971) Handbook on Formative and Summative Evaluation of Student Learning, McGraw-Hill, New York.
Conole, G., White, S. and Oliver, M. (2007) "The impact of eLearning on organisational roles and structures". In G. Conole and M. Oliver (Eds.), Contemporary Perspectives in eLearning Research: Themes, Methods and Impact on Practice, Routledge Falmer, London and New York.
Daradoumis, T., Martinez-Monés, A. and Xhafa, F. (2004) "An integrated approach for analysing and assessing the performance of virtual learning groups", Lecture Notes in Computer Science, Vol 3198, pp 289-304.
Davis, F.D. (1989) "Perceived usefulness, perceived ease-of-use, and user acceptance of information technology", MIS Quarterly, Vol 13, No. 3, pp 319-340.
Delfino, M., Manca, S., Persico, D., Pozzi, F. and Sarti, L. (2011) Rapporto sulla qualità didattica del corso pilota [Report on the educational quality of the pilot course], STEEL Technical Report RT-2.4.1.
Del Re, E., Delfino, M., Limongiello, G., Persico, D., Scancarello, I. and Suffritti, R. (2010) "Systems, enabling technologies and methods for distance learning: the STEEL project", ISDM - Information, Savoirs, Décisions & Médiations - Information Sciences for Decision Making, No. 39, article No. 657, http://isdm.univtln.fr/PDF/isdm39/Article_Isdm_Ticemed09_Delre-et-al_OK.pdf
Dillenbourg, P., Jarvela, S. and Fischer, F. (2009) "The Evolution of Research on Computer-Supported Collaborative Learning". In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder and S. Barnes (Eds.), Technology-Enhanced Learning: Principles and Products, Springer, Dordrecht, pp 3-20.
Edmunds, R., Thorpe, M. and Conole, G. (2012) "Student attitudes towards and use of ICT in course study, work and social activity: a technology acceptance model approach", British Journal of Educational Technology, Vol 43, No. 1, pp 71-84.
Flagg, B.N. (1990) Formative Evaluation for Educational Technologies, Lawrence Erlbaum Associates, Hillsdale, NJ.
Garrison, R. and Anderson, T. (2003) ELearning in the 21st Century: A Framework for Research and Practice, Routledge Falmer, London and New York.
Giannakos, M. (2010) "The Evaluation of an ELearning Web-Based Platform", Proceedings of the 2nd International Conference on Computer Supported Education, pp 433-438, http://ionio.academia.edu/mgiannakos/Papers/396835/The_Evaluation_of_an_ELearning_WebBased_Platform
Lanzilotti, R., Ardito, C., Costabile, M.F. and De Angeli, A. (2006) "eLSE Methodology: a Systematic Approach to the eLearning Systems Evaluation", Educational Technology & Society, Vol 9, No. 4, pp 42-53.
Laurillard, D., Oliver, M., Wasson, B. and Hoppe, U. (2009) "Implementing Technology-Enhanced Learning". In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder and S. Barnes (Eds.), Technology-Enhanced Learning: Principles and Products, Springer, Dordrecht, pp 289-306.
Lee, K. (2005) "ELearning: The Quest for Effectiveness", Malaysian Online Journal of Instructional Technology, Vol 2, No. 2, pp 61-71.
Macdonald, J. (2003) "Assessing online collaborative learning: process and product", Computers & Education, Vol 40, No. 4, pp 377-391.
Park, S.Y. (2009) "An analysis of the technology acceptance model in understanding university students' behavioral intention to use eLearning", Educational Technology & Society, Vol 12, No. 3, pp 150-162.
Pozzi, F., Manca, S., Persico, D. and Sarti, L. (2007) "A general framework for tracking and analysing learning processes in CSCL environments", Innovations in Education & Teaching International, Vol 44, No. 2, pp 169-179.
Rovai, A.P. (2003) "A practical framework for evaluating online distance education programs", Internet and Higher Education, Vol 6, No. 2, pp 109-124.
Schrire, S. (2006) "Knowledge building in asynchronous discussion groups: Going beyond quantitative analysis", Computers & Education, Vol 46, No. 1, pp 49-70.
Teo, T. (2009) "Modelling technology acceptance in education: A study of pre-service teachers", Computers & Education, Vol 52, No. 1, pp 302-312.
