Journal of Information Literacy

Journal of Information Literacy ISSN 1750-5968 Volume 1 Issue 1 January 2007

Refereed Article Walton, G., Barker, J., Hepworth, M. and Stephens, D. (2007) "Using online collaborative learning to enhance information literacy delivery in a Level 1 module: an evaluation." Journal of Information Literacy, 1(1), 13-30.

Copyright for the article content resides with the authors, and copyright for the publication layout resides with the Chartered Institute of Library and Information Professionals, Information Literacy Group. These Copyright holders have agreed that this article should be available on Open Access. “By 'open access' to this literature, we mean its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself. The only constraint on reproduction and distribution, and the only role for copyright in this domain, should be to give authors control over the integrity of their work and the right to be properly acknowledged and cited.” Chan, L. et al (2002) Budapest Open Access Initiative. New York: Open Society Institute. http://www.soros.org/openaccess/read.shtml (Retrieved 22 January 2007)

Journal of Information Literacy, 1(1), January 2007, 13-30. Walton et al. Refereed paper.

Using online collaborative learning to enhance information literacy delivery in a Level 1 module: an evaluation

Geoff Walton, Subject & Learning Support Librarian, Staffordshire University. Email [email protected] (author to whom correspondence should be addressed)
Jamie Barker, Lecturer in Sport & Exercise, Staffordshire University. Email [email protected]
Mark Hepworth, Lecturer in Information Science, Loughborough University. Email [email protected]
Derek Stephens, Lecturer in Information Science, Loughborough University. Email [email protected]

Abstract

Purpose

The purpose of this study was to encourage Sport & Exercise Level 1 students to use the discussion board facility in the Blackboard Virtual Learning Environment (VLE) in order to engage them in online collaborative learning of information literacy. This was achieved by using notions of scaffolding, reflection and situated learning in delivering the information literacy (IL) elements of the programme. Delivery of the programme was carried out in a blended fashion (a mix of face-to-face and online interventions). The study is part of a PhD pilot study and a Learning & Teaching Fellowship project undertaken by the main author.

Methodology

This was a quasi-experimental design using both qualitative and quantitative strategies. Qualitative data was gathered from captured student postings and their content, from a questionnaire administered at the end of the module, and from Focus Group responses. Quantitative data was gathered via pre- and post-delivery tests and by counting postings and recording the time taken by students to make initial postings.

Findings

This paper indicates that it is possible to engage students in even the most detailed aspects of IL (for example, breaking down a URL as a criterion for evaluating a web site, or where to place commas in a reference) if the appropriate tasks (involving active, hands-on, collaborative working), settings (within a subject-based module during a timetabled session) and assessments (task-based with some form of evaluation and reflection) are used. Discussion board output captured via the VLE provides a rich insight into what students learn as they tackle IL online activities. From the tutors' perspective the process of iteration used in the evaluation activities was successful and was an unanticipated outcome of the delivery. It can be seen that by seeding online discussions with student comments, 'moments of iteration' were provided which enabled IL learning to be articulated in increasing detail.

Practical implications

We feel that the study shows that a productive collaborative relationship between support services and faculty underpins IL programmes and is central to successful delivery. Scaffolded learning has two benefits: it is a successful pedagogical technique within online collaborative learning (OCL) and a mechanism for realising the iterative process within IL itself. A new process map showing how to structure this within the Blackboard discussion board facility is put forward for the purposes of improving future delivery and providing the basis for further research. However, the research also revealed that more work is required both in terms of courseware development and in articulating more robust techniques for analysing discussion content.

Originality

Providing learning opportunities via online collaborative learning to Level 1 Sport & Exercise students is a novel approach to the delivery of IL in the university sector.


Keywords

Information Literacy; Higher Education; Undergraduate students; Teachers; Librarians; Sports faculty; Qualitative research; Quantitative research; Blended learning; Teaching methods; Online collaboration

1. Introduction

The study focuses on the impact that Information Literacy (IL) provision delivered via blended learning (a mixture of face-to-face and online interventions) had on students who attended Effective Learning, Information & Communication Skills in Sport & Exercise SHP91000-1 in 2005-06. The online intervention was delivered using the Blackboard discussion board facility. This is a 30-credit Level 1 core module taught by the second author and by academic staff from Sport and Exercise at Staffordshire University. The delivery of the whole module took place over two semesters and the IL element was covered in semester one. The IL element was taught by the first author.

Until the academic year 2005-06, IL provision had consisted of one 50-minute face-to-face workshop delivered to students in seminar groups, where they learned to use the Library Catalogue, e-journals and e-books. Student learning was assessed via a portfolio exercise (the IL component, entitled Introduction to e-resources, is Section E of the assessment) and a reflective practice statement. The change in the leadership of the module allowed us to thoroughly re-examine the delivery of IL; this re-examination was influenced by the findings of recent research and resulted in an expanded programme including the online activities detailed below.

This paper describes the structure of the IL programme by setting out its various elements in chronological order. It also provides a detailed evaluation using a range of data-gathering tools. Changes in students' knowledge were measured using pre- and post-delivery diagnostic tests. The level of activity in discussions was quantified by measuring the number of student postings made and by recording the time taken by students to make postings. The quality of discourse was analysed by examining the content of student postings made to a discussion board. Students' views regarding the programme were gathered via questionnaire. Additional material was gathered by convening a Focus Group of five students selected from the experimental group. Finally, based on the findings of the study, this paper recommends changes to the programme and a new process for managing online collaborative learning for further research.

2. Literature review

Recent IL scholarship (Bruce, 1995; ACRL, 2000; Bordinaro & Richardson, 2004; Walker & Engel, 2004; Bundy, 2004; Armstrong et al., 2005) recommends that learners need to become proficient problem-solvers and critical thinkers. This can be facilitated by setting students a subject-based problem to solve and then exposing them to tools such as an e-journals package and techniques such as how to construct a search strategy and how to evaluate the information that they have found. Problem-solving and critical thinking are notions which also feature in research on learning theory (Kolb et al., 1991; Moseley et al., 2004), studies on e-learning (Mayes and de Freitas, 2004), and information behaviour theory (Ford, 2004). The need to communicate information effectively, both in terms of exchanging information between peers and in completing assignments (by creating a bibliography, for example), also features in many IL models (Big Blue, 2002). It is argued that by engaging in these activities students will become information literate within their subject area.

In the teaching and learning arena, work by Laurillard (1993), Jonassen et al. (1995), Northedge (1997), Goodyear (2001) and Salmon (2004) suggests that learning is a social activity, in that students are more successful learners when they work and interact together. Furthermore, Goodyear (2001), JISC (2004), Mayes & de Freitas (2004) and Salmon (2004) suggest that online learning [1] is most effective, i.e., students are more engaged, when they are involved in online discussion and collaboration as well as reading web pages and using interactive e-resources. Features of e-learning include notions of collaboration (Mayes, 1995; Goodyear, 2001; Salmon, 2004), scaffolding (Mayes & de Freitas, 2004), reflection (Race in Hinett, 2002a & b) and situated learning (Mayes & de Freitas, 2004). The new module leader was concerned that

[1] In this paper the terms online learning and e-learning are used synonymously.


students were under-using Blackboard as a means of communicating ideas; it therefore seemed sensible to structure delivery of the module using these ideas. How this was achieved is discussed below.

3. Research project aims & objectives

To realise the specific goal of encouraging students to use the discussion board in order to engage in online collaborative learning, we recognised that a framework for delivering the IL elements of the programme in a blended fashion (a mix of face-to-face and online interventions) was required. To maximise learning in an online or blended setting we adopted the recommended pedagogic approaches of scaffolding, reflection and situated learning. The definition of scaffolding followed is that put forward by Mayes & de Freitas (2004), which states (in brief) that scaffolding involves four stages: learning activities which are real or simulated; structured interaction (collaboration) between participants, where opportunities are made for all students to contribute and the process rather than the task is the focus; guidance by an expert (e.g., tutor(s)); and a locus of control which gradually passes from tutor to learner. Hence, we attempted to devise activities which reflect this process. The definition of reflection for this project is taken from Race, particularly noting that 'reflection is about linking one increment of learning to the wider perspective of learning' (Race in Hinett 2002a & b, p. 2). This suggests that reflective practice exercises should ideally take place after each activity; our provisional model was designed to achieve this. The notion of situated learning emphasises the importance of context-dependent learning, where the learning activity is given as authentic a social context as possible (Mayes & de Freitas, 2004). A subject-specific problem-based scenario was developed to address this.

Courseware which facilitates this kind of pedagogic approach is put forward by Goodyear (2001). Based on the work of Mayes (1995), Goodyear (2001) argues that online material, or courseware, can be divided into three types: primary, secondary and tertiary. Primary courseware consists of subject matter such as web pages, online lecture notes and documents which convey information (for example, concepts and arguments) but do not necessarily engage students in actively learning about a topic. Secondary courseware consists of online tests or quizzes to question and apply new concepts to meaningful tasks; in being interactive, it is more engaging than primary courseware. Tertiary courseware, for example the Blackboard discussion board, enables the production of new material, generated by the online discourse, which is then captured and made available to all learners and tutors. The captured online discourse can then be read, analysed or re-used by all participants. Tertiary courseware provides an opportunity to engage in collaborative working with fellow students and tutors which is highly interactive and engaging, with a large learning payoff (Mayes, 1995). Hence, we constructed a number of activities (Online Discussions 2-6, shown in the Module Delivery section below) in tertiary courseware mode, with a view to using secondary and/or primary courseware to support these in order to maximise learning.

To take forward the project aim we devised the following objectives:

• Devise subject-specific IL activities which provide students with opportunities for online discussion and collaboration
• Create a pre-delivery test to determine baseline skills in certain IL areas
• Re-assess these skills after delivery to detect differences
• Try out a scaffolded delivery framework for online collaborative working
• Analyse online collaborative dialogue to gain a rich picture of the online conversational processes
• Devise a set of reflective questions to be used to encourage reflective learning
• Use findings to improve this framework for the next cohort and Main Study

In essence this research constitutes the Pilot phase of the first author’s PhD research where he (in close liaison with the module leader) has assembled an initial delivery package and tried out some suggested research techniques.

4. Methodology

This was a quasi-experimental design using both qualitative and quantitative strategies. Qualitative data was gathered via: capturing student postings and examining their content; a questionnaire administered at the end


of the module; and Focus Group responses. Quantitative data was gathered via pre- and post-delivery tests and by calculating the numbers of postings and the time taken by students to make initial postings. The participants were undergraduate students who had been organised into seminar groups for the Level 1 core module Effective Learning, Information & Communication Skills in Sport & Exercise (2005-06 entry) at Staffordshire University. The cohort for the year examined in this study consisted of 142 students organised into seminar groups of up to 25. There was a mixture of BA and BSc students in the cohort, but only BA students participated in this study. The participants (30 in total) were divided between two of these seminar groups:

1. An experimental group of 17 students who received the face-to-face IL workshop and the online collaborative learning intervention. The Virtual Learning Environment (VLE) used to facilitate learning was Blackboard.
2. A control group of 13 students who only received the face-to-face IL workshop.

5. Delivery

This section describes the tasks and activities that we designed and implemented, including examples of student response.

5.1 Face-to-Face Workshop

The whole session was now focused on answering the subject-specific problem-based scenario rather than showing students how to use individual e-resources. The learning outcomes, in summary, were that at the end of the session students would be able to:

• Recognise their information need by identifying appropriate keywords;
• Use the common Boolean terms 'and', 'or' and 'not' to construct a simple search strategy;
• Identify appropriate electronic resources such as the Library Catalogue, e-journals and e-books, and use them appropriately.
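For readers implementing a similar exercise, the Boolean construction taught in the workshop can be sketched programmatically. This is an illustrative sketch only: the keyword groups echo the module's soccer-violence scenario, but the helper function and grouping are a hypothetical example, not part of the module materials.

```python
# Hypothetical sketch of a simple Boolean search strategy of the kind
# taught in the workshop: synonyms are joined with OR inside parentheses,
# distinct concepts with AND, and unwanted terms excluded with NOT.

def build_search_strategy(keyword_groups, exclude=None):
    """Combine keyword groups into a single Boolean search string."""
    clauses = ["(" + " OR ".join(group) + ")" for group in keyword_groups]
    strategy = " AND ".join(clauses)
    for term in exclude or []:
        strategy += f" NOT {term}"
    return strategy

# Keywords drawn from the problem scenario; grouping is illustrative.
groups = [["soccer violence", "football hooliganism", "stadium violence"],
          ["causes", "prevention"]]
print(build_search_strategy(groups))
# (soccer violence OR football hooliganism OR stadium violence) AND (causes OR prevention)
```

Students were taught only the simple 'and' combination in the workshop itself; the OR/NOT handling above shows where the same idea extends.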

The scenario for student participants was as follows: "A local PE teacher asks you to do a talk about soccer violence (also known as football hooliganism, or stadium violence) to a class of 'A' level sports students. How would you go about gathering and selecting appropriate information for this? Please provide a selection of web sites for your audience along with a reading list to allow them to conduct further reading on the topic. Your list should include full references to the following: 2 web sites; 2 books (or e-books); 2 journal articles (at least one from an e-journal)." Delivery included identifying the information need by discussing with students how to identify keywords and turning these into a simple search strategy using Boolean 'and'; using the Library Catalogue; and using e-journals to answer the question. Students also received a handout containing instructions on how to use e-books (e-brary in particular). Each week, following the workshop, students were presented with an online task in the discussion board area as shown below:

• Online Reflective Practice Task (Online Discussion 1)
• Online Discussion 2 (general evaluation criteria)
• Online Discussion 3 (detailed evaluation criteria)
• Online Discussion 4 (evaluation criteria for web pages)
• Online Discussion 5 (decoding Uniform Resource Locators (URLs))
• Online Discussion 6 (referencing information sources)
• Online Reflective Practice Task (Online Discussion 7)

In brief, the learning outcomes for these activities are summarised below:

• In completing Online Discussions 2, 3, 4 and 5, students will be able to systematically evaluate web sites by using an agreed set of criteria identified via discussion.
• In completing Online Discussion 6, students will be able to use the APA referencing style to create a bibliography for common information sources (i.e., books, journals and web sites).


The provisional process map for managing the IL online activities is shown in Figure 1 below.

Figure 1: Initial process for managing OCL activities. [The figure shows a linear flow: a Forum contains a Thread and a Seed (Week 1), supported by a web resource; students respond with postings; a Tutor Summary (Week 2) closes the activity and leads into the next Forum, and so on. Each element is defined below.]

Forum: This constitutes the title of a discussion board activity, for example, Referencing your sources.

Thread: This contains the instructions for the task that students are expected to complete, for example, '1) Think about what you have covered today. 2) Answer the questions set out below […].' (Square brackets used in this fashion […] denote where some discourse has been omitted for clarity.)

Seed: This contains the starting point for the discussion. For instance, this was used to start a discussion on URLs: "It was mentioned in the previous discussion that you can check the URL of a web page to work out its origin […]. What is a URL and how do you work out its origin?"

Student Posting: This is the output from students once they have opened the Forum, read the Thread and Seed and engaged with the online activity, for example, "I would judge a book by determining how referenced the book is & what references it included. […]"

Tutor Summary: This is constructed by the tutors and contains salient points raised from student postings with additional comments from tutors, for example, "You have identified some excellent criteria - we have added some extra points […]. There are no guarantees that any web page is unbiased, error free or reputable […]"

Each activity concluded with a summary. This was envisaged as a linear process, with each activity concluded and closed with a summary before moving on to the next new Forum. The threads would be generated each week by tutors; hence, the content would be tutor rather than student led.

5.2 Online Reflective Practice Task

At the end of the workshop students in the experimental group were given their first online task. This was designed to enable them to actively reflect on the workshop and also encourage them to use the discussion board. The Forum that the students were directed to locate was entitled Online Reflective Practice Task. This constituted the thread of the discussion, where students were asked to respond in the following fashion: "1) Think about what you have covered today. "2) Answer the questions set out below. "3) Send your response to the discussion board. "4) Read the responses your fellow students have made and make at least one comment on any of the postings. Please be constructive in your comments and undertake this task by […]"


The questions provided on the discussion board were as follows: “1) What did I actually learn in this session today? “2) Which were the most difficult parts and why were they difficult for me? “3) Which were the most straightforward parts and why did I find these easy?” These three questions were adapted from Race (in Hinett 2002a & b) and constituted the seed for the discussion. This element of the programme will be discussed in more detail in section 6.

5.3 Discussion 2 (general evaluation criteria)

The next online collaborative working activity focused on general criteria for evaluating information. Students were given this activity in the second half of the workshop, following a session involving an unrelated strand of the module. They were all instructed to log on to Blackboard and find the discussion board activity. This took place within the timetabled workshop. The discussion thread asked students their views on what criteria they would use to evaluate information. A seed in the form of some initial general ideas was given on the discussion board as follows:

"For the problem based scenario in Section E of the portfolio you need to be able to evaluate the books, articles and web pages you have found and state your reasons (criteria) in the proforma so that you are sure that the information you have is of a good quality.

"We thought it would help if we had a discussion about what good evaluation criteria are. To get you started consider the issue of references in an information source and ask the questions: are there any? How are they being used in a particular source (book, article, web page)? What data is being used to back up an argument (easy in books and journal articles, not so easy in web pages)?

"We would argue that a good quality book will contain many references to other books and academic journals, there will be quite a bit of data e.g., statistics which will indicate where they were taken from (author, sporting or government body etc).

"So... What would you say are good criteria? Please have a think about this and post some reasons (criteria) for others to respond to. Once you have done this have a look at your colleagues' postings and post a response. We will summarise replies and give additional guidance as we proceed."

Students were then instructed to do the following: read the activity instructions; write their evaluation criteria; post these to the discussion board for all to see; read all postings made by group colleagues; and write a second posting in response to one of the postings made by a fellow student. It should be noted that for this and each subsequent online discussion, all students present at the time responded with at least one posting. Two selected student postings are shown here:

"A valid and effective book, as a resource, needs to be fairly up-to-date/recent material"

"A book or article by a well known author will seemingly add credibility to the resource."

Tutors analysed the responses and made selections for the tutor summary (using students' own words as much as possible to engender a sense of ownership of the material). Tutors made additional comments and indicated these by means of [square] brackets, as shown in the following extract. This was the only time that we used square brackets in this fashion, as it seemed to make the summary look unnecessarily complicated. The summary was couched in positive terms to create a motivational atmosphere.

"You have raised several important issues regarding the evaluation of sources and identified some excellent criteria.

"[…] Someone else's (unbiased) argument to back the original argument (argues the same point).

"[Bias is minimised by using]: Evidence to back up the points (lots of facts and figures) that are made with detailed explanations of the information that is needed.

"It should give any statistical information (on previous research) i.e. Graphs which help understand the information given. It should also contain definitions of words and principles stated by the author […]"


5.4 Discussion 3 (detailed evaluation criteria)

We scaffolded the activity in the following way: a statement was selected from the summarised responses within Discussion 2 and used to seed Discussion 3. The selected statement was used as part of the new question, to focus students on more detailed issues of evaluation. The student statement from the previous postings that we decided to use was as follows: "A book or article by a well known author will seemingly add credibility to the resource." We changed this into the following question: "What do you feel gives an author credibility (or authority) other than how well known he or she is?" The summary plus this new seed was posted in readiness for the next seminar. The scaffolded activity thus provided the seed for this forum. We also provided a web site (Berkeley University's evaluation of sources web page) as a resource. The subsequent procedure was identical to Discussion 2 (with the exception of the tutor summary). Hence, students were engaged in a 'moment of iteration' by articulating last week's task but in much greater depth, as shown in these examples:

"1. An author's reputation is important; positive comments/feedback from other authors who have read the text, also valid and reliable comments from others such as students who have actually read the book.

"2. Relevance of subject that the individual is researching. A good use of references - sound and up-to-date data provided - support from other secondary references.

"3. What the date was when published. Read the abstract prior to obtaining the book in order to make sure it is relevant to a specific subject/s."

"[…] I would judge a book by determining how referenced the book is & what references it included. It would probably be of a higher calibre if the references that are included are from authors that have been more highly publicised."

The summary was then posted to the discussion board following the procedure set out above.

5.5 Discussion 4 (evaluation criteria for web pages)

The scaffolded activity for Discussion 4 was drawn from a student posting made in Discussion 2: this enabled us to focus specifically (for this and the next discussion) on web pages and discuss how they can be evaluated and what criteria might be used. The student posting taken from Discussion 2 was as follows: "Ensure that the web page is reliable before using any information on it." Our question (constituting the thread of this discussion) raised as a result of this comment was: "But what makes a web page reliable?" We then seeded the discussion by instructing students to look at a web resource entitled The Good, the Bad and the Ugly and list at least four web evaluation criteria. The same procedure used in previous discussions was deployed. An example of a student posting is shown below:

"Is the page signed or sponsored?

"Is the author qualified or an expert?

"How reputable is the website, author, sponsor?

"Does the information contain errors?

"Is the website bias [sic]?

"How dated is the work?"

We provided a summary drawn from student postings, an extract of which is shown below:

"[…] You have identified some excellent criteria - we have added some extra points you should consider when using web pages. These are the questions we should have in mind when using web pages as an information source.

"General point - There are no guarantees that any web page is unbiased, error free or reputable but if you adhere to the criteria you have put forward - which is summarised below - then you


should get a reasonable idea as to how reliable the source is […] Is the author identified anywhere on the page? Is he/she reputable - qualified -academically - (and/or an expert)? […]”

5.6 Discussion 5 (decoding URLs)

This discussion was scaffolded by selecting a student comment regarding URLs (from Discussion 4). We then used this as the thread for Discussion 5. Again, we were re-iterating the previous discussion in more detail. This is an extract from the student posting that we wished to use: "[…] You can check the URL of a web page to see if it's an educational piece or a personal piece which contains someone's own opinion." This is our task derived from the previous example:

"It was mentioned in the previous discussion that you can check the URL of a web page to work out its origin, for example, educational, personal, commercial, governmental, sporting organisation etc. What is a URL and how do you work out its origin? Open this web resource in another window to find out how to do this. […]"

An example of a student posting is shown here:

"URL's [sic] specify the name and address of the resource of the internet, as well as the origin of the source for example www.sportsarena.gr this internet sport site (gr) comes from Greece. URL's also indicate whether the context of the resource is for commercial or academic use."

We then summarised the postings by bringing together the most salient points; an extract is shown here:

"A URL (Uniform Resource Locator) is the specific name and address of the resource on the internet. It gives human-readable names to the hidden numeric address used and understood by the computer (the Internet Protocol number). Also it can provide information about the organisation or individual […]"
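The kind of URL decoding practised in Discussion 5 can be sketched programmatically: split the URL into its parts and read the host's suffix to infer the origin. This is an illustrative sketch only; the suffix table is a small hypothetical sample, though the example URL mirrors the one quoted in the student posting.

```python
# Hypothetical sketch of decoding a URL's origin from its host suffix,
# as students were asked to do by hand in Discussion 5.
from urllib.parse import urlparse

# Small illustrative sample of suffix-to-origin mappings (not exhaustive).
SUFFIX_ORIGINS = {
    ".ac.uk": "UK academic",
    ".edu": "US academic",
    ".gov": "governmental",
    ".com": "commercial",
    ".gr": "Greece",
}

def describe_origin(url):
    """Return a rough origin label for a URL based on its host suffix."""
    host = urlparse(url).hostname or ""
    for suffix, origin in SUFFIX_ORIGINS.items():
        if host.endswith(suffix):
            return origin
    return "unknown"

print(describe_origin("http://www.sportsarena.gr"))  # Greece
```

In practice a country-code suffix such as .gr indicates only where the domain is registered, which is the caveat the tutor summary's 'work out its origin' phrasing glosses over.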

5.7 Discussion 6 (referencing information sources)

This was a new activity focused on the referencing of sources and was not linked (via student postings) to previous discussions. This activity took place over two weeks and preceded a face-to-face session delivered by the module leader on why it is important to cite and reference sources correctly. We took three types of references: book, journal and web page, and directed students to reference them in APA style using the web resource provided, as shown below:

"1.) Use this web resource to find out how to reference sources correctly. http://www.uwsp.edu/psych/apa4b.htm#A3

"2.) Now turn these items into the correct format and post them to the discussion board. a) The book; Statistics in Kinesiology, published by Human Kinetics (Champaign: Illinois) in 1999 and written by W. J. Vincent b) Researching collaborative working in sport & exercise, an article in the Journal of Information Literacy (volume 1, number 1, pages 155 - 169) by Walton, G. and Barker, J. c) Psychology with style: A hypertext writing guide (Version 5) written by M. Plonsky in 2004. Taken from the Internet January 10, 2004. It has the following web address; http://www.uwsp.edu/psych/apa4b.htm"

As this task spanned two weeks, students (for the first time) had enough time in the workshop to respond to initial postings. Two typical comments from the tutors are:

"The second one is incorrect. You do not need the volume of page number. It is meant to go Walton, G and Barker J. (2006), working in sport & exercise, Journal of Information literacy, 1 (1), 155-169

"Thank you!!! There are only a few mistakes in these references, including: punctuation errors when referencing the name of the author; wrong placement of italic references for the title of the book; no date for the journal; incorrect referencing of the version, chapter and page numbers; no version indicated in the website."
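The referencing exercise amounts to assembling fixed fields in a fixed order, which is why it suits a drill of this kind. As a simplified illustration (omitting the italics that real APA style requires for titles), the book reference from the task can be assembled as follows; the helper function is a hypothetical sketch, not the module's answer key.

```python
# Hypothetical sketch of assembling an APA-style book reference from its
# parts, using the Vincent (1999) book set in the Discussion 6 task.
# Simplified: real APA style also italicises the title.

def apa_book(authors, year, title, city, publisher):
    """Assemble a plain-text approximation of an APA book reference."""
    return f"{authors} ({year}). {title}. {city}: {publisher}."

ref = apa_book("Vincent, W. J.", 1999, "Statistics in kinesiology",
               "Champaign, IL", "Human Kinetics")
print(ref)
# Vincent, W. J. (1999). Statistics in kinesiology. Champaign, IL: Human Kinetics.
```

Laying the format out as fields in this way mirrors what the students were doing by hand: the peer comments quoted above are, in effect, reports of which field was misplaced or missing.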


Journal of Information Literacy, 1(1), January 2007, 13-30. Walton et al. Refereed paper.

6. Data analysis

6.1 Quantitative data gathering

Both the experimental and control groups of students were given a pre- and post-delivery test (Appendix 1) of eleven questions. Students were instructed not to confer. Questions focused on: the Library Catalogue (3 questions); e-journals, including questions on combining keywords (4 questions); and evaluation of sources (4 questions). Students were asked to consider each statement and respond true, false or don't know.

These tests were used to detail differences in knowledge pre and post delivery (in the control group). We anticipated that students would give more correct answers in the post-delivery test; it is argued that this would indicate the impact of the face-to-face workshop on students' learning. The comparison of post-delivery tests between the control and experimental groups was designed to indicate the level of impact online collaborative learning had made on students over and above the face-to-face delivery. It was anticipated that students in the experimental group would do statistically significantly better in the post-test than the control group.

The quantitative analysis of the discussion board activity focused on the number of student postings for each discussion, which was used to calculate the actual level of activity, and on the time taken by each student to complete and send an initial posting.

6.2 Qualitative data gathering

This data was gathered from the experimental group only, with the exception of the assessed work. Data gathering focused on five areas to gain a rich illustration of the online discussion board activities:

• Discussion board output, i.e. the content of student postings, was examined.

• Observations of students as they worked were made to gain some indication of their level of engagement.

• A questionnaire (Appendix 2) was administered at the end of the module to find out how students felt about the online collaborative working experience.

• A focus group of 5 participants (timetabled for Semester 2) was convened to discuss in more detail some of the responses given in the questionnaire.

• Analysis of assessed work (both groups) was undertaken in order to further ascertain whether online collaborative working had enhanced students' learning.

6.3 Analysis of Pre- and Post-Delivery Tests

Comparing results from the pre- and post-tests for the control group (percentages of correct answers for each question) indicated that the face-to-face delivery alone had had some impact, because students gave more correct answers in the post-test than in the pre-test (with the exception of question 10, which focused on evaluation of sources), as shown in Table 1 below. Question 1 was omitted from the analysis because it was later noticed that it was framed in an ambiguous fashion.


Table 1: Pre- and Post-Delivery Test (Control Group) – percentage of correct answers

Question number                 2    3    4    5    6    7    8    9   10   11
Percentage correct (pre)       84   84   10    6   16   35   81   41   63   70
Percentage correct (post)      85   92   54   31   23   62   92   46   54   85
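The pattern described in the text — improvement on every question except question 10 — can be checked directly from the percentages in Table 1. This is a simple arithmetic sketch, not part of the original analysis:

```python
# Percentages of correct answers per question, control group (Table 1).
pre  = {2: 84, 3: 84, 4: 10, 5: 6, 6: 16, 7: 35, 8: 81, 9: 41, 10: 63, 11: 70}
post = {2: 85, 3: 92, 4: 54, 5: 31, 6: 23, 7: 62, 8: 92, 9: 46, 10: 54, 11: 85}

gains = {q: post[q] - pre[q] for q in pre}          # post-delivery change per question
declined = [q for q, g in gains.items() if g < 0]   # questions that got worse
print(declined)
```

Only question 10 shows a decline; the largest gain is on question 4 (on Swetswise), consistent with the interpretation given below Table 1.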

It can be seen from the pre-delivery test scores that questions 2 and 3 (Library Catalogue) and question 8 (an evaluation question) gained a very high number of correct answers. This may be for two reasons: first, students had received an induction lecture in which the Library Catalogue was covered, so some prior knowledge was evident; second, students had had occasion to use the Library Catalogue to find books for seminars prior to this session.

The comparison of post-tests (percentages of correct answers) between the control group and the experimental group was designed to demonstrate to what extent OCL enhanced students' learning in addition to the face-to-face workshop. The higher scores gained by the experimental group are a possible indicator that more learning had taken place (with the exception of question 11, which focused on evaluation of sources), as shown in Table 2 below:

Table 2: Post-Delivery Test (Control and Experimental Groups) – percentage of correct answers

Question number         2    3    4    5    6    7    8    9   10   11
Control group          85   92   54   31   23   62   92   46   54   85
Experimental group     94  100   56   62   25   69  100   69   75   62

However, an analysis of these results using a Mann-Whitney ranking test revealed that the differences in scores were not statistically significant (p > .05, one-tailed, n = 29).
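To illustrate the statistic used, the sketch below computes Mann-Whitney U values from the question-level percentages in Table 2. This is illustrative only: the paper's test was run on student-level scores (n = 29), not on these ten question percentages, and judging significance would additionally require the U distribution or a normal approximation.

```python
def rank_map(values):
    """Map each value to its average 1-based rank, averaging over ties."""
    ranks, svals, i = {}, sorted(values), 0
    while i < len(svals):
        j = i
        while j < len(svals) and svals[j] == svals[i]:
            j += 1
        ranks[svals[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    return ranks

def mann_whitney_u(a, b):
    """Return (U_a, U_b) for two independent samples."""
    r = rank_map(a + b)
    u_a = sum(r[v] for v in a) - len(a) * (len(a) + 1) / 2
    return u_a, len(a) * len(b) - u_a

control      = [85, 92, 54, 31, 23, 62, 92, 46, 54, 85]   # Table 2, control group
experimental = [94, 100, 56, 62, 25, 69, 100, 69, 75, 62] # Table 2, experimental group
print(mann_whitney_u(experimental, control))  # larger U for the higher-ranking group
```

The experimental group attracts the larger U, reflecting its generally higher scores, even though (as the paper reports for the student-level data) this need not reach significance.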

6.4 Student postings (experimental group only)

For all activities, all students present at each workshop made an initial (primary) contribution and posted it to the discussion board. Secondary postings were only significant during Discussion 6, as shown in Table 3.


Table 3: Numbers of postings to the Discussion Board

Online Discussion             2    3    4    5    6
Number of students present   17   16   13   15   13
Primary postings             17   16   13   15   13
Secondary postings            2    7    2    8   21

Whilst the initial (primary) postings demonstrated an attempt to tackle the activity seriously, the secondary postings were a little frivolous, apart from Discussion 6. Discussion 6 secondary postings were greater in number than primary postings, focused on the task, and contained useful suggestions to fellow students on how to rectify mistakes made in the referencing exercise. In addition, we received two unexpected postings to the summary: in particular, one student had noticed a mistake in our summary and posted a response correcting it. It could be argued that this indicates a high degree of interactivity, learning and motivation.

All postings took place in timetabled class time. Tutors initially allowed 20 minutes for the IL online activity. Analysing the captured postings and timing the activity as it happened revealed that students were taking between 10 and 24 minutes to make an initial posting. This may indicate why secondary postings were low: students simply ran out of time.

6.5 Analysis of Qualitative Data

Discussion 2 elicited very brief responses (in some cases only one sentence), whereas Discussion 3 brought far more detailed and richer responses to the activity. It is believed that this is because we did not sufficiently socialise students into the online environment in the initial stages (as recommended by Salmon, 2004); this is supported by Focus Group evidence in which student participants reported nervousness about making a contribution which might be perceived as ‘wrong’. It is felt that a greater degree of online socialisation at the outset would have prevented this.

Discussion 3 (detailed evaluation criteria) and Discussion 5 (URLs) brought longer and more detailed responses. This may be partly because students felt more socialised within the online setting and partly because the activity was more clearly stated than in previous discussion threads. When students commenced the activity they appeared engaged (there was little talking, apart from those who, by necessity, were working together on one computer) in reading the instructions, finding and digesting the resources, and making a posting.

In Discussion 6 (referencing of sources) students readily carried out the task and needed little prompting the following week to make their postings. The module leader reported that, because students had been primed by the activity in Discussion 6, his subsequent face-to-face session on citing and referencing was the best session he had experienced. He found students were far more receptive to the notion of referencing and more successful at the task of correcting references in an example text and bibliography than previous students. Focus Group participants noted that they found Discussion 6 the most useful and could immediately see that the skills learned would be useful in the future. This was also supported by the majority (75%) of questionnaire responses (discussed below).
Two very clear sets of issues were revealed in the questionnaire (Appendix 2), which asked students to comment on the online discussions. Sixteen completed questionnaires were received. Responses revealed that students were very positive regarding Discussion 6 (referencing of sources); three typical responses were:

“I think I learning [sic] about references was useful and improved my understanding of it”

“[…] helps all across the board for modules”

“how detailed references have to be …”


Responses indicated that students felt they had learned something new and useful both for this module and for the future. Eleven of the sixteen respondents were negative in their comments regarding reflective practice, stating, for example:

“I did not find reflecting on the work of another useful.”

“I’m not really interested in other peoples [sic] comments, I’d rather be learning.”

These typical questionnaire responses strongly suggest that the online reflective practice activity was unpopular, particularly as students were not asked about this directly. However, the Focus Group participants agreed that they would have preferred a reflective task each week to “[…] remind us of what we did the previous week.”

In contrast to the problems with the reflective practice task done during the workshop, each student provided a rich reflective statement for their assessed work. In addition, the reflective practice statements made by the experimental group were more detailed and positive than those provided by the control group. For example:

a) The experimental group mentioned the online resources they had used in specific terms (for example, Library Catalogue, Swetswise and e-brary), in contrast to the control group, who tended to mention these resources in general terms (for example, Library and e-journals);

b) The experimental group commented in positive terms on their new skills, mentioning for example: “[…] I am now confident in using search connectors like ‘and’ and ‘not’ […]”;

c) They also mentioned changing keywords to refine a search and observing a positive outcome, for example: “[…] I have had no previous experience in using the various online services and initially found it difficult to understand. I did however persist with these problems and finally found a number of sources by changing the keyword in the search box […]”

Having completed two online reflections before the assessment may have enabled them to express themselves in this manner.

7. Discussion

It was clear from the number of postings made and the way in which students engaged with the activities that they were very willing (in a timetabled session) to complete the activities set, but were most reluctant to engage in online activity outside the classroom. The ‘fast and furious’ exchange of ideas and information mentioned by Salmon (2004) was not evident in this research.

When students could see a definite and direct benefit to their own learning they were far more inclined to regard learning activities positively. This was most clearly illustrated in Discussion 6 (the referencing activity), which was the most successful in terms of how students felt about the activity, the amount of activity (i.e. the large number of secondary postings) and the quality of student postings (evidenced in the amount of new knowledge they contained). This was possibly because the activity was straightforward and specific, and the benefits were tangible. In addition, it is argued that the content of student postings demonstrated that learning was apparent in all activities undertaken (even though students, through their questionnaire responses, disagreed with this).

The failure to cause the majority of students to realise the benefits of the evaluation of sources and reflective practice activities indicates that further work is required to prevent this occurring in the future. This failure may have a number of causes: the summaries were very long; the questions derived from the discussions could have been expressed with greater clarity; and tutors asked (for some activities) a great deal of students in a very short time. Summarising student postings into a meaningful whole took more time than anticipated (up to two hours initially), and this may have affected the quality of the subsequent scaffolding process, in that tutors were working to a very tight delivery timetable during a busy time of year.
Using square brackets to mark tutor contributions did not add to the intelligibility of the summary. The process of synthesising student postings into a coherent summary was also found to be more challenging than initially thought. It is argued that more work (research and training) is required in this area.


We have identified what we call a “moment of iteration.” This denotes the point at which the discussion is moved by tutors from a less to a more detailed level. For example, a student mentions a book author's credibility as an important criterion in the evaluation of an information source without qualifying what this means; the tutor, via seeding (i.e. using questions), brings about a more detailed level of discussion in which students qualify what they mean by the notion of credibility through the identification of specific criteria. This effectively unpacks the initial point in greater depth by drawing out specific issues such as the author's qualifications, reputation within the subject domain and/or some other explicit relevant criterion. Planning for such moments, and making them more explicit to students, may form part of future teaching strategy.

8. Conclusion and recommendations for future research

This paper demonstrates that it is possible to engage students in even the most detailed aspects of information literacy (for example, breaking down a URL as a criterion for evaluating a web site, or where to place commas in a reference) if the appropriate tasks (involving active, scaffolded, collaborative working), settings (within a subject-based module during a timetabled session) and assessments (task-based with some form of evaluation and reflection) are used. We contend that a productive collaborative relationship between support services and faculty is an essential prerequisite: the joint authorship of this paper indicates what can be achieved when this is the case.

What has also emerged from this research is that by using facilities such as discussion boards in a VLE, a rich insight can be gained into what students have learnt as they engage with particular activities. From the tutors’ perspective, the process of iteration used in the evaluation activities worked very well and was an unanticipated outcome of the delivery, leading us to develop the concept of the “moment of iteration”. This approach also creates a more student-centred ‘feel’ to the activities, in that it is students’ input that is used to drive further discussion and learning. Goodyear (2001) identifies this as the real benefit of online collaborative learning. This was particularly evident in Online Discussions 3, 4 and 5. This process of iteration is mentioned as an essential part of information literacy in, for example, SCONUL (1999), Big Blue (2002) and Andretta (2005); as part of the information behaviour process in, for example, Hepworth (2004) and Ford (2004); and as part of the learning process in, for example, Race (2001) and Moseley et al. (2004).
It is felt that this clearly shows that the scaffolding method was not only a successful pedagogical technique but also offers a mechanism for realising the iterative process within IL itself. In addition, the simplicity of the referencing task indicates how future activities could be constructed, in that instructions should be short and to the point, with a clear link to assessed work. However, the research also revealed that more work is required, both in terms of courseware development and in the online discussion processes which follow.

From the evidence gathered and discussed, it is argued that a new student-centred process model (for structuring IL online collaborative learning) is required in order to improve delivery and provide a model for future research. This is shown in Figure 2, overleaf.

This article provides an initial analysis of assessed reflective practice statements; future research will seek to carry out a more rigorous content analysis of these. In addition, the online discourse itself requires a more robust examination to identify emergent themes and the implications these have for student learning (these issues will be addressed in more detail in Andretta, 2007). It is envisaged that future research will focus on developing a different approach to online reflective practice activities which is less onerous for student participants and does not require a collaborative dimension. Findings from this work, in particular the new emergent process (see Figure 2), will be deliberately harnessed as a basis for managing online collaborative learning in the next round of fieldwork. The next phase of this research takes place in academic year 2006-07.


Figure 2: New process for managing student-centred OCL activities

[Process diagram linking the following elements: Forum; Thread; Web resource(s); Student postings; Tutor summary; Extracts from student postings; Seed; Tutor intervention; Iteration of initial thread in greater detail; Reflective task.]


Appendix 1 Questionnaire for Effective Learning, Information & Communication Skills

Please answer the following questions as honestly as you can. It should take no more than 5 minutes. Please return your completed form to Shaun. In your view, are the following statements TRUE or FALSE? (Please circle your answer.) If you don’t know, then please circle DON’T KNOW.

Library Catalogue
1. You can search for books on the Library Catalogue by title only.
2. When doing an author search it is best to put the surname first.
3. You can search for books on the Library Catalogue by using one or more keywords.

Swetswise
4. Swetswise is a full-text e-books service.
5. Swetswise is only available on-campus.
6. You can search for the singular and plural of a word at the same time by using ‘truncation’.
7. The term AND is used to combine two keywords together so that Swetswise retrieves articles containing both keywords.

Evaluating sources of information
8. All the information published on the Internet is sound.
9. Websites are always more up to date than periodicals.
10. Articles published in academic journals (including e-journals) are not as reliable as books.
11. To evaluate a source (book, journal, web page etc.) you only need to check the date.


Appendix 2 Online Discussion Board Draft Questionnaire 6/12/05

The purpose of this questionnaire is to find out what you thought of the online discussions (in which you participated) during the last few weeks. The online Discussion Board activities included:

1. Evaluating e-material (Online Discussions 2, 3, 4 and 5): In this activity you had an in-depth look at evaluating information. You set your own evaluation criteria for books, journal articles and web resources and posted these to the discussion board. You also looked at web addresses (URLs) and how to decode them. For each discussion you completed the task yourself and then commented on the work of at least one other.

2. Referencing your sources (Online Discussion 6): In this activity you looked at citing references to the APA standard. You completed the task yourself and then commented on the work of at least one other.

We value your honest and constructive thoughts – there are no right answers! Please answer the questions as fully as possible.

=============================================================

1a. Please state which online activity (or activities) you found useful.
1b. In what ways did you find the online activity (or activities) useful?
2a. Please state which online activity (or activities) you found NOT so useful.
2b. In what ways did you find the online activity (or activities) NOT so useful?
3. In general: what did you like about the online activities?
4. In general: what did you DISlike about the online activities?
5. In general: in what ways do you feel the online activities helped you to learn?
6. Each online discussion was summarised by the tutors. Have you looked back at previous summaries?
If you circled YES to question 6, please answer question 7 (otherwise go to question 12).
7. Did you find the summaries useful?
If you circled YES to question 7, please answer question 8 (otherwise go to question 12).
8. In what ways did you find the summaries useful?
9. In what ways did you find the summaries NOT so useful?
10. Will you be using any of the summaries in your assignment?
If you circled YES to question 10, please answer question 11 (otherwise go to question 12).
11. For which question in particular will you use the summary (or summaries)?
12. Please describe a key moment (when you were completing one of the online activities) when your understanding changed. (Please identify for which activity this was.)
13. What were the three most important things you learned in doing the online activities?
14. What new skill (or skills) did you develop as a result of participating in the online activities?
15. What surprised you about what you learned from doing the online activities?
16. Will you do anything differently as a result of doing the online activities?
17. What advice would you give a friend about to start on these online activities?
18. How much time would you suggest it would be worth putting into it?
19. What pitfalls would you advise to be well worth not falling into?
20. Please make any further comments you have (which aren’t covered in the questionnaire) in the space below.


References

Andretta, S. (2005). Information literacy: a practitioner's guide. Oxford: Chandos.

Andretta, S. (ed.) (2007). Change and challenge: information literacy for the 21st century. Adelaide: Auslib Press.

Armstrong, C., Abell, A., Boden, D., Town, S., Webber, S. & Woolley, M. (2005). “Defining information literacy for the UK.” Update, 4(1-2), 23-25.

Association of College & Research Libraries (ACRL): a Division of the American Library Association (ALA) (2000). Information literacy competency standards for higher education. Chicago: American Library Association. http://www.ala.org/ala/acrl/acrlstandards/standards.pdf (Retrieved 14 January 2007).

Big Blue Project (2002). The Big Blue: information skills for students: final report. http://www.library.mmu.ac.uk/bigblue/pdf/finalreportful.pdf (Retrieved 14 January 2007).

Bordinaro, V. & Richardson, G. (2004). “Scaffolding and reflection in course-integrated library instruction.” Journal of academic librarianship, 30(5), 391-401.

Bruce, C. S. (1995). “Information literacy: a framework for higher education.” Australian library journal, 44(3), 158-170.

Bundy, A. (ed.) (2004). Australian and New Zealand information literacy framework: principles, standards and practice. Adelaide: Australian and New Zealand Institute for Information Literacy. http://www.anziil.org/resources/Info%20lit%202nd%20edition.pdf (Retrieved 14 January 2007).

Ford, N. (2004). “Towards a model of learning for educational informatics.” Journal of documentation, 60(2), 183-225.

Goodyear, P. (2001). Effective networked learning in higher education: notes and guidelines. Networked Learning in Higher Education project: JISC Committee for Awareness, Liaison and Training (JCALT). Lancaster University: Centre for Studies in Advanced Learning Technology. (Volume 3 of the final report to JCALT.) http://csalt.lancs.ac.uk/JISC/guidelines_final.doc (Retrieved 14 January 2007).

Hepworth, M. (2004). “A framework for understanding user requirements for an information service: defining the needs of informal carers.” Journal of the American Society for Information Science and Technology, 55(8), 695-708.

Hinett, K. (2002a). Improving learning through reflection: part one. The Higher Education Academy. http://www.heacademy.ac.uk (Retrieved 14 January 2007).

Hinett, K. (2002b). Improving learning through reflection: part two. The Higher Education Academy. http://www.heacademy.ac.uk (Retrieved 14 January 2007).

JISC infoNet (2004). InfoKit: effective use of VLEs: introduction to VLEs. http://www.jiscinfonet.ac.uk/InfoKits/effective-use-of-VLEs/intro-to-VLEs/index_html (Retrieved 14 January 2007).

Jonassen, D., Davidson, M., Collins, M., Campbell, J. & Haag, B. B. (1995). “Constructivism and computer-mediated communication in distance education.” American journal of distance education, 9(2), 7-26.

Kolb, D. A., Rubin, I. M. & Osland, J. (1991). Organizational behavior: an experiential approach. 5th ed. Englewood Cliffs, NJ: Prentice-Hall.

Laurillard, D. (1993). Rethinking university teaching. London: Routledge.

Mayes, J. T. (1995). “Learning technology and Groundhog Day.” In: W. Strang, V. B. Simpson & J. Slater (eds.). Hypermedia at work: practice and theory in higher education. Canterbury: University of Kent Press. pp. 21-37.

Mayes, T. & de Freitas, S. (2004). JISC e-learning models desk study: stage 2: review of e-learning theories, frameworks and models (issue 1). Bristol: JISC. http://www.jisc.ac.uk/uploaded_documents/Stage%202%20Learning%20Models%20(Version%201).pdf (Retrieved 14 January 2007).

Moseley, D., Baumfield, V., Higgins, S., Lin, M., Miller, J., Newton, D., Robson, S., Elliot, J. & Gregson, M. (2004). Thinking skills frameworks for post-16 learners: an evaluation. A research report for the Learning & Skills Research Centre. Trowbridge: Cromwell Press.


Northedge, A. (1997). “Different ways of studying.” In: A. Northedge, A. Lane, A. Peasgood & J. Thomas. The sciences good study guide. Milton Keynes: Open University Press. pp. 155-180.

Race, P. (2001). The lecturer's tool kit: a resource for developing learning, teaching and assessment. 2nd ed. London: Kogan Page.

Salmon, G. (2004). E-moderating: the key to teaching and learning online. 2nd ed. London: Routledge.

Society of College, National and University Libraries (SCONUL): Advisory Committee on Information Literacy (1999). Information skills in higher education: a SCONUL position paper. http://www.sconul.ac.uk/groups/information_literacy/papers/Seven_pillars2.pdf (Retrieved 14 January 2007).

Walker, H. M. & Engel, K. R. (2004). “Research exercise: a sequenced approach to just-in-time information literacy instruction.” Research strategies, 19, 135-147.
