Original Article doi: 10.1111/jcal.12044


What’s going on behind the screens? Researching young students’ learning pathways using iPads

G. Falloon
The Faculty of Education, University of Waikato, Hamilton, New Zealand

Accepted: 15 October 2013
Correspondence: Garry Falloon, University of Waikato, School of Education, Hillcrest Rd, Waikato, Hamilton 3216, New Zealand. Email: [email protected]

Abstract

Since their introduction in 2010, much has been said and written about Apple’s iPad (Apple Inc., Cupertino, CA, USA) and its potential to transform when and how students learn. Much of this hype has focused on attributes of the device such as its touch screen interface, light and portable form factor, easy-to-use operating system, and large array of apps. Emerging studies mainly report positive outcomes for students from using iPads in specific learning situations. These studies have cited enhanced motivation and learner engagement, often linking this with improved learning outcomes. However, there is a dearth of research exploring how students interact with these devices, and the factors that affect the quality and learning value from that interaction. This article reports on the use of an innovative recording solution to capture video and audio data of young students using iPads to develop literacy, numeracy and problem-solving/ decision-making skills. The study offers insights into how students go about solving problems within apps, and highlights the importance of knowledge, affective and dispositional elements, and good app design to profitable interaction. It also flags considerations for teachers embarking on initiatives involving iPads, and suggests app developers need to work more closely with teachers to improve the learning value of their products.

Keywords

apps, iPads, literacy, numeracy, tablets, technology.

Introduction

Since their emergence in 2010, commentators have been heralding devices such as Apple’s iPad as educational ‘game changers’, citing unique affordances such as its form factor, portability, touch screen interface and wide array of apps as offering education unprecedented opportunity to usher in ‘a new style of invisible, matter of fact computing . . . engaging with this generation with a teaching approach fit for the future’ (Brown-Martin, 2010, p. 2). However, despite the hype
and rhetoric, few studies have emerged to date exploring how students interact with these devices, or investigating any influence they may have on their learning. While some studies have been published, many of these focus on device use in specific contexts such as learning English (Billings & Mathison, 2012), special education (McClanahan, Williams, Kennedy, & Tate, 2012), pre-service teacher education (Pegrum, Howitt, & Striepe, 2012), early years education (Lynch & Redpath, 2012) and literacy learning (Hutchison, Beschorner, & Schmidt-Crawford, 2012; Saine, 2012). Few reports beyond anecdotal accounts, newspaper articles (e.g., Foote, 2012; Ihaka, 2013) or qualitative perception-based narratives have been published researching their use in general education, especially at primary or elementary school level.


This article reports on a study that used an innovative digital tool to capture screen display data as 5-year-old students worked independently on a series of 45 iPad apps, carefully selected by their teacher to support foundation numeracy, literacy and problem-solving skills. Data were analysed to determine factors that influenced the effectiveness and quality of the students’ learning pathways while using the apps. The analysis identifies the complex interaction between knowledge, the design and content of apps, and the cognitive and dispositional elements shaping students’ responses to challenges and problems posed in the apps, as critical in determining the extent of ‘learning value’ students derive from them. Sample case studies are detailed and plotted against a typological framework, illustrating different levels of student learning performance in response to different combinations of these factors. Results provide teachers and app developers with information useful for advancing the design of apps, and improving their use in classrooms.

‘Hyped’ technology and the iPad

Because of their recent advent, research on the use of tablet technologies such as the iPad in education is just beginning to emerge. Despite this, their launch back in 2010 was surrounded by much hype and rhetoric, with some commentators and early researchers claiming the unique affordances of the device offered ‘game changing’ potential across the education spectrum (Cochrane, Narayan, & Oldfield, 2013; Geist, 2011). These claims included enhanced motivation and engagement likely to lead to improved learning (e.g., Geist, 2012; Manuguerra & Petocz, 2011; Saine, 2012), learning efficiency and cost gains associated with using paperless systems (Foote, 2012; Hall & Smith, 2011), particular benefits from niche use in special education (e.g., Aronin & Floyd, 2013; Hutchison et al., 2012; McClanahan et al., 2012) and greater learning personalization (e.g., Crichton, Pegler, & White, 2012; Heick, 2012). However, others sound a more cautionary note, pointing to lessons from the history of education’s fixation with technological fads and gadgets, commenting that it simply ‘lurch(es) from one fad to the next’ (Masters, 2002, p. 1). Maddux (1986) refers to this phenomenon as the ‘pendulum syndrome’ (p. 27), where educational innovations usually surrounded by hype or bold and
ambitious claims are hastily adopted by schools, only to be followed by disillusionment and eventual abandonment when they inevitably fail to meet overinflated expectations. With reference to technology innovations, Maddux and Cummings (2004) point out that often this repeated cycle has little to do with the educational potential of the innovations themselves. They identify poor understanding (usually by teachers) of the theoretical connections between the potentials of the innovation and what is known about how children learn – and researchers’ inadequate communication of this – as principal contributors to this situation. As they put it, ‘in the absence of theory, implementation can be based only on intuition, trial and error, superstition, popularity, or random means unlikely to be quickly productive in any way’ (Maddux & Cummings, 2004, p. 523). Possible reasons for the sometimes-frenzied adoption of hyped technologies such as the iPad are explored in an interesting study on iPhone uptake by Hedman and Gimpel (2010). Their study of a group of graduate students used the theory of consumption values (TCV) to understand the students’ motivations for adopting the then new iPhone 3G. The TCV was originally developed by Sheth, Newman, and Gross in 1991 as a theoretical model to help explain consumer selection and decision-making, and was chosen by Hedman and Gimpel due to its broad scope and ‘incorporat(ion) of hedonic factors and other decision drivers that tie in strongly with hyped technology’ (p. 162). The theory incorporates five core values: functional (perceived value for achieving a specific task); social (portraying a desired social image); epistemic (desire to learn about new technology); emotional (response to features such as aesthetics or design); and conditional (perceived value in a specific context). Contrary to what might be expected, their findings indicated the functional value of the iPhone rated low in initial adoption decisions, with most participants commenting that it offered little more than their existing device/s. Instead, participants prioritized epistemic, social and emotional values as key uptake drivers. They were principally motivated by an interest and curiosity to learn more about the device, and the perceived social value from ownership (or symbolic ‘cool factor’) that media and publicity vigorously promoted. Emotional responses to aesthetic features were also important, with features such as the large touch screen and smooth, contoured casing rating highly.

While Hedman and Gimpel’s study did not focus on the school sector, it does prompt questions as to why schools are so quick to latch on to new and usually expensive technology innovations, in the absence of any research-verified information relating to their effectiveness and performance in that context. It should be reasonable to assume that functional value (i.e., ‘value added’ to teaching and learning) would rate highly in such decisions. However, a study on the integration of iPads and iPod Touch devices in five K-12 classrooms (Crichton et al., 2012) suggests this may not necessarily be the case. Following the first phase of their project, they concluded, ‘the hype in the public press and media enticed principals to believe that iPods and iPads might just be the much hoped for “silver bullet” for school-based technologies’ (p. 27). Like Hedman and Gimpel’s study, this suggests a powerful influence from media and vendor promotional materials in school decision-making, ‘which can blur the distinction between what the technology can actually do, and what potential users imagine’ (Hedman & Gimpel, 2010, p. 161).

iPads in education

Much research to date on iPads in education has focused on ‘niche’ uses such as developing literacy skills (e.g., Billings & Mathison, 2012; Hutchison et al., 2012; McClanahan et al., 2012; Saine, 2012), enhancing learner engagement (e.g., Manuguerra & Petocz, 2011), promoting learner independence and personalization (e.g., Geist, 2011), and stimulating pedagogical change (e.g., Cochrane et al., 2013). Many of these are qualitative case studies based on self-report, interview or observational data, although some quasi-experimental designs are now emerging (e.g., Billings & Mathison, 2012). An extensive review of academic databases also revealed numerous media publications and ‘teacher stories’ reporting on device use in different learning scenarios (e.g., Foote, 2012; Ihaka, 2013; Webster, 2012). These were often associated with vendor promotional materials or located on developer websites as case studies illustrating claimed advantages from iPad use.1 While it would be fair to say outcomes from the studies and media reports are generally positive, due to the iPad’s relatively recent arrival on the scene, few studies have been completed that explore how students interact with the device, and factors that influence the ‘learning value’ they derive from that interaction. Addressing this gap has been identified by other researchers as a matter of priority (e.g., Geist, 2011; Pegrum et al., 2012), who cite the existence of ‘little empirical evidence on whether, or how they (iPads) facilitate student learning’ (Pegrum et al., 2012, p. 1). It is essential such studies be undertaken if the iPad is to avoid being the next device to fall victim to Maddux’s pendulum syndrome. The study this article reports on starts to explore these factors, and offers an innovative methodology to enable researchers to gather data that accurately represent students’ use of iPads as part of a normal classroom programme.

Research question

The research and data collection were guided by this question: What influences are there on the learning pathways of young students using iPads and apps for literacy, numeracy and problem-solving tasks, and what are the effects of these?

Research context

The study was undertaken between late June and early December 2012 in a class of eighteen 5-year-old (new entrant) students attending a moderately sized primary school (400 students) in a small town close to Hamilton, in the Waikato region of New Zealand. There were 11 girls and seven boys, most of whom had been at school for less than 4 months. The school was located in a mixed socio-economic area, rating at 5 on the national school decile scale.2 The research was established after the principal of the school approached the researcher indicating a desire to trial iPads as a precursor to what he described as ‘the possible establishment of an iPad class’ (Principal, personal correspondence, March 2012). He indicated a desire to learn more about the devices and their learning potential before committing to the expense of establishing the classroom in 2013. After holding an open selection process during which interested staff were invited to apply, the teacher of the new entrant class was selected based on ‘her history of receptiveness to innovation, and very sound, child-focused pedagogy’ (Principal, interview, June 2012). The teacher (Tanya) had 16 years of teaching experience, of which ten had been in the junior school (years 1–3).


Following a successful application to the University’s Research Committee, a small grant was secured that allowed for the purchase of eight iPad 3s and protective covers. When combined with the researcher’s own device, this meant that sufficient devices for paired use were available. The research complied with all requirements as outlined in the University of Waikato’s Human Research Ethics Regulations (2008), following standard procedures for informed consent and participant confidentiality.

Methodology and data collection

The study followed a conventional interpretive case study method. However, challenges existed to collecting accurate data through methods normally associated with case studies, principally due to the age of the students, their newness to school, and their limited capacity to communicate meaningfully in interview situations. Early trials of ‘over the shoulder’ video to record students’ interactions revealed a strong observer effect, with students appearing very conscious of being recorded, while researcher observation also proved problematic for the same reason. Following this, a screen-mirroring application, Air Server (App Dynamic Inc., CA, USA), was tested. This transmitted the content of a device’s display via WiFi to a laptop, which then recorded it using an application such as QuickTime Pro (Apple Inc., Cupertino, CA, USA). This solution met with limited success due to fluctuating WiFi speed, and an inability to record the display of more than one device at a time. A technical solution was needed that would enable the device itself to record and store data without network use or researcher presence. After an intensive search of Apple’s App Store revealed nothing, a number of Cydia (‘alternative’) app developer forums were extensively researched. This revealed a possible solution called Display Recorder (developed by Ryan Petrich) which, although limited in its capability, offered the potential for adaptation. The app required devices to be ‘jailbroken’, but it could be installed on each device and, when activated, record the display’s contents, user audio and finger placement as a .mov file (Figure 1). Video data could be recorded ‘in the background’ while students were using their apps, and then retrieved and stored on a laptop for later analysis.

Figure 1 Display Recorder Screenshot Showing Finger Placement (White Spot on Suction Cup)

However, after trialling, issues were identified with audio recording and the storage capacity of the iPads. When Display Recorder encountered an app that used a sound codec that conflicted with its own, external audio recording capability was disabled, meaning videos with app sound but no user audio were captured. Additionally, the basic compression codec used in Display Recorder limited the duration of video able to be stored on the 16 GB iPads, of which only about 2 GB was available after the iOS and student apps were installed. With the assistance of University technical support personnel, adjustments were made to Display Recorder that eventually enabled nearly an hour of video to be recorded per GB of storage space. This was sufficient to record a typical classroom use session. Unfortunately, the larger file size created by the longer recordings caused issues for Display Recorder’s normal delivery vehicle, Camera Roll, meaning files needed to be extracted directly from each device’s root directory. iExplorer (Macroplant LLC) was used for this purpose. The sound recording conflict proved more problematic. After unsuccessfully testing a range of different codecs, a decision was made to disable the app’s native recording capability in favour of a small USB-style digital voice recorder that was attached using Velcro to each iPad’s protective cover (Figure 2). The plain appearance of the USB recorder made it unattractive, meaning students did not interfere with it during use.

Figure 2 Typical Set-up Showing USB Voice Recorder Attached to the iPad’s Protective Cover

Separate video and audio files were therefore recorded, and later synced using iMovie (Apple Inc.). Over the 6 months, data were recorded on different days and at different times each week to represent normal classroom use patterns as closely as possible. A total of 24 h of video and audio data were collected, representing the interactions of 40 different pair combinations with all 45 apps selected by the teacher (Appendix I).
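To put the storage constraint in concrete terms, the short sketch below estimates recordable time per device from the round figures reported above (roughly 2 GB free on each 16 GB iPad and about an hour of video per GB after the adjustments). It is an illustrative calculation only; the function and variable names are invented and are not part of the study’s tooling.

```python
# Illustrative back-of-envelope check of recording capacity per iPad.
# Figures are approximations taken from the text, not measured values.
FREE_STORAGE_GB = 2.0            # approx. space left after iOS and the student apps
MINUTES_PER_GB = 60.0            # approx. video stored per GB after codec adjustment
TYPICAL_SESSION_MIN = (30, 40)   # reported daily iPad session length

def recordable_minutes(free_gb: float = FREE_STORAGE_GB,
                       minutes_per_gb: float = MINUTES_PER_GB) -> float:
    """Rough upper bound on minutes of screen video one device can hold."""
    return free_gb * minutes_per_gb

if __name__ == "__main__":
    capacity = recordable_minutes()
    print(f"~{capacity:.0f} min of video per device")   # ~120 min
    for session in TYPICAL_SESSION_MIN:
        print(f"a {session} min session uses ~{session / capacity:.0%} of capacity")
```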

App selection and organization

The teacher was responsible for selecting all apps used in the research, basing selections on her professional judgement of how well each app could support the learning needs of her students. This assessment was made after she had trialled each app herself (often several times), used them with her own primary-aged children, evaluated online reviews and other teachers’ opinions, assessed comments and ratings on Apple’s App Store, and appraised the affordability of provisioning to all devices. Despite apps being able to be installed on up to five devices associated with a single iTunes account, the cost of purchasing all apps for all devices was beyond the budget of the school, meaning a blend of ‘free’ (25) and paid-for (20) apps was selected. It is likely such a situation would exist for schools embarking on initiatives such as ‘iPad classrooms’ or ‘Bring Your Own Device’ programmes (e.g., Bilby, 2013), as the cost of paid-for app provision across a school could quickly become prohibitive. Four or five apps were organized into each of five separate folders – one per day of the week – and these were changed frequently during the research period. Typically, students worked with the apps daily for between 30 and 40 min.

Data coding

The researcher was fortunate to access a postgraduate research student to assist with data coding, as part of the University’s Summer Research Scholarship programme. A 4½ h data sample was copied onto separate computers and examined independently by the assistant and the researcher to identify initial themes relating to observed influences on the students’ learning pathways. The data were purposively selected to be representative of student use of the iPads across the three learning focuses – literacy, numeracy and problem-solving/decision-making. Care was taken to balance data according to the time and day of the week they were captured, and to ensure a variety of different pair combinations were included. The sample represented the interaction of seven different pairs who accessed four problem-solving/decision-making, two numeracy and four literacy-focused apps. Following examination, both coders met and compared assessments. While the precise wording used to describe observed influences differed, there was sufficient agreement for three broad themes to be negotiated. These were:
1. The influence of app design, content and structure (features).
2. The influence of student knowledge (procedural – related to device/app operation; and declarative – related to completing demands of the app).
3. The influence of students’ cognitive strategies, effort, dispositions and responses.

The data sample was then independently recoded using Studiocode3 video analysis software (Studiocode Business Group, Sydney, NSW, Australia). Studiocode enables incidents in multiple video samples to be ‘tagged’ using researcher-specified codes. All occurrences aligned with a code are entered into a timeline (Figure 3) that assembles data so it can be replayed as a ‘bundle’ or as individual samples. Subcategories can be generated and linked to primary codes, enabling deeper data mining and more precise data classification. Figure 3 illustrates this, with the macro codes of ‘Strategies’ and ‘Features’ being subcategorized into ‘Gamification’, ‘New App’, ‘Restart’, ‘Parameters’ (etc.), and ‘External Web Link’ and ‘Purchasing’, respectively. Subcategorization was used extensively to explore the specific influence of different app design and content features reported in an earlier article (Falloon, 2013).

Figure 3 Studiocode Window Showing Timeline with Macro and Subcoding

Coding templates were generated for each of the seven pairs’ samples, and the synced display capture videos imported into Studiocode. Each sample was recoded, and incidents each coder aligned with the three macro themes (knowledge, strategies, and design and content features) were logged on the timelines. Inter-rater agreement calculations were carried out on data ascribed to each of the code themes. As recommended by Gwet (2012), calculations were restricted to occurrences that both coders had identified, to avoid underestimation of agreement probability caused by the inclusion of data upon which no agreement existed. The agreement calculations (κ) are summarized in Table 1. Agreement strength ratings refer to Landis and Koch’s (1977) often-referenced guidelines for interpreting inter-rater agreement using Cohen’s κ.


Table 1. Inter-rater Agreement Aligned with Coding Themes

Theme        κ       Observed agreement   Confidence interval (95%)   Standard error   Agreement strength (from Landis & Koch, 1977)
Features     0.723   0.923                0.429–1.018                 0.150            substantial
Knowledge    0.668   0.864                0.401–0.935                 0.136            substantial
Strategies   0.635   0.886                0.467–0.804                 0.086            substantial

While agreement ratings were generally acceptable, results for Strategies in particular were at the lower end of Landis and Koch’s scale, suggesting the need for greater clarity before undertaking coding of all data. To facilitate this, a combined re-examination of sample data was completed so more precise characteristics aligned with each theme could be identified. This resulted in operational definitions that were used by the research assistant to code the remaining 19½ h of data, in addition to recoding the original 4½ h sample. These definitions were:
(a) Design and content influences: the effect of graphical, textual or multimedia content. These include instructions, feedback systems, teaching/practice/game balance, control accessibility, embedded parameters, learning scaffolds, advertising, resource access (etc.).
(b) Knowledge influences: the application of student knowledge (of some form) towards achieving the learning goal, solving problems or completing tasks required by the app. This typically included declarative (knowing that) and procedural (knowing how) knowledge.
(c) Strategy influences: how students responded to demands associated with solving or attempting to solve problems or tasks required by the app. Examples included direct knowledge application, trial and error, random guessing, repeating learned patterns, closing apps, and reverting to game play.
Subcategories based on these definitions (see Figure 3) were included in a revised coding template used to code all data, even though it was understood not all would be applicable to each video. Doing this meant that separate recoding for specific design and content feature influences, as required for the earlier article (Falloon, 2013), was not needed.
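As a minimal illustration of the agreement check described above, the sketch below computes Cohen’s κ for two coders’ theme tags, restricted (per the Gwet, 2012 recommendation noted in the text) to occurrences both coders identified. The incident identifiers, labels and helper function are invented for illustration; the study itself logged coded incidents on Studiocode timelines, and this is simply an illustrative re-implementation of the κ calculation.

```python
# Cohen's kappa over jointly identified incidents only (illustrative, not the study's tooling).
from collections import Counter

def cohens_kappa(coder_a: dict, coder_b: dict) -> float:
    """coder_a/coder_b map incident IDs to a theme label; kappa uses shared IDs only."""
    shared = sorted(set(coder_a) & set(coder_b))      # occurrences both coders identified
    if not shared:
        raise ValueError("no jointly identified incidents to compare")
    n = len(shared)
    observed = sum(coder_a[i] == coder_b[i] for i in shared) / n
    freq_a = Counter(coder_a[i] for i in shared)
    freq_b = Counter(coder_b[i] for i in shared)
    themes = freq_a.keys() | freq_b.keys()
    expected = sum((freq_a[t] / n) * (freq_b[t] / n) for t in themes)
    return (observed - expected) / (1 - expected)

# Toy data: timestamps of tagged incidents mapped to the three macro themes.
coder_1 = {"0:45": "features", "2:10": "knowledge", "3:05": "strategies", "5:40": "features"}
coder_2 = {"0:45": "features", "2:10": "knowledge", "3:05": "features", "5:40": "features",
           "7:15": "knowledge"}   # "7:15" is ignored: only one coder tagged it
print(round(cohens_kappa(coder_1, coder_2), 3))
```

On Landis and Koch’s (1977) scale, values in the 0.61–0.80 band are read as ‘substantial’ agreement, which is how the κ figures in Table 1 are interpreted.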

Results

Because of the volume of data generated from 24 h of video and audio involving 40 pairings, five cases have been selected to represent the range of student–app learning pathways, and to illustrate the effects on these of the various influences detailed above. Data from the Display Recorder video for each case have been organized in tables (Tables 2–6). Each table comprises a description of the apps accessed, a series of screenshots from Display Recorder, an explanation of student activity at each screenshot with time-log, and recorded student verbal interaction. At the foot of each table is an analysis of running time coded as selection and loading, learning (evidence of interaction focused on the app’s learning goal) and non-learning/entertainment (evidence of interaction not focused on the app’s learning goal – excluding any learning game component).
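The running-time analysis at the foot of each table can be thought of as a simple totalling of coded video segments. The sketch below shows one way such a summary could be produced; the category names, helper function and segment boundaries are assumptions made for illustration and are not the study’s actual pipeline or data.

```python
# Illustrative totalling of coded segments into the three running-time categories
# used in Tables 2-6. Segment times below are invented, not data from the study.
from datetime import timedelta

CATEGORIES = ("selection_and_loading", "learning", "non_learning_entertainment")

def time_analysis(segments):
    """segments: iterable of (category, start_s, end_s); returns totals and shares."""
    totals = {c: 0 for c in CATEGORIES}
    for category, start, end in segments:
        totals[category] += end - start
    run_time = sum(totals.values())
    return {c: (timedelta(seconds=s), s / run_time) for c, s in totals.items()}

toy_session = [
    ("selection_and_loading", 0, 105),
    ("learning", 105, 340),
    ("non_learning_entertainment", 340, 400),
    ("learning", 400, 620),
]
for category, (duration, share) in time_analysis(toy_session).items():
    print(f"{category}: {duration} ({share:.0%})")
```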

Discussion

Using the modified Display Recorder app enabled the capture of data revealing the complexity of influences on student interaction with the devices and apps. The sample cases have been plotted onto a typological framework (Figure 4) that illustrates each pair’s response to learning challenges and demands presented by app design and content, and how the cognitive and affective ‘resources’ they could access affected the extent of learning derived from the interaction. These will now be discussed with reference to the framework.

Cases I&A, J&N, N: application – amplification

These cases, located towards the upper end of the gradient on the typology, share common characteristics, and will be discussed collectively. However, where pertinent differences exist, these will be explained and described.


Table 2. I&A’s Learning Pathway
Columns for each app below, in order: Apps accessed and thumbnail | Description/stated learning focuses | Scenario/description/elapsed time. The fourth column, Surrounding student and/or teacher dialogue or interaction, is reproduced after the rows, followed by the time analysis.

Rocket Speller

Letter recognition (upper and lower case); expanding vocabulary; phonics development; letter sounds; ordering letters; spelling (a). Students build words using visuals by selecting letters. Lower levels do not require letters to be correctly sequenced. ‘Shadow’ letter scaffolds available at lower levels (see example).

1 min 45 s. Students taking turns to match scrambled letters to build word. At this level, correctly ordering letters not a requirement. Students negotiating tasks.

Smarty Pants School Rhymes

Letter knowledge; phonemic awareness; phonics; reading phonetically; decoding and sight word recognition. Students choose from a range of vocabulary building activities: rhymes, sight words, word finds, puzzles, letter recognition and sounds, etc.

5 min 55 s. Students enter Smarty Pants school app (rhymes). They select ‘Peter Pumpkin Eater’ and then ‘Two Little Dicky birds’. Each takes a turn to read a rhyme, copying the text to speech in the app. They continue with this until 8 min 25 s when they close page.

Smarty Pants School Word Finds

Letter knowledge; phonemic awareness; phonics; reading phonetically; decoding and sight word recognition (b). See previous description.

10 min 24 s. A suggests rhymes were becoming too difficult and prompts I to change. I closes the rhymes page and enters the word find. Students need to find correct letter sequence to match spoken word (in example: ‘how’).

Surrounding student and/or teacher dialogue or interaction (Rocket Speller; Smarty Pants School Rhymes; Smarty Pants School Word Finds, in order):

I: So, how about I do two and you do one. Is that fair? A: No! I know a good idea. You do those . . . I: You need to get the letters right (sounding letters) p . . . p . . . a. . . . a. . . . nnn . . . pan. Pan! Your turn. A: Yep . . . yep . . . yep . . . my turn . . . Fish . . . f . . . f . . . i.sh Fish (moving letters to correct position while sounding). I: Cool . . . to the rocket . . . blue . . . A: No, green! I: OK, green (selecting colours for building rocket components). A: They don’t have pink. Remember we are going to do a silly one (rocket colour scheme) Students continue until 5 min 38 s and then close app. A: Can we do a different one now? I: Shall we try one I think we’ve never heard? How about this? (tapping ‘Georgie Porgie’). A: Oh! I like this one, don’t go home . . . (referring to pressing home button). Students read rhyme following text to speech cues. A: I don’t want any sound. I just want us to read it. [A asks teacher (T) for advice on how to deactivate sound]. T scaffolds beginning of rhyme and encourages students to read along. A&I: ‘Georgie Porgie . . . pud . . . pud . . . ding and pie . . . kissed the girls and made them cry. . . . I: You do the next page by yourself . . . then I’ll have a turn . . . Students continue until 10 min 24 s. Students worked on this app for 7 min 15 s. A: We’ve only got to get two more until we’re on the next level . . . I: ‘How’. . . . Let’s see . . . H-O-W (drags finger across correct letters, speaking each as she goes). A: We might get a certificate . . . I: We’ve already got one . . . they’re dumb anyway . . . your turn. A: Yeah. Just a bit of paper. We need more points . . . ‘w- ill’ . . . Will. (drags finger across letters, sounding blend as she does so). One to go . . . Students complete level and change app at 17 min 39 s.

Time analysis – Total run-time (this sample): 20 min 40 s; Selection and loading: 2 min 21 s; Learning: 16 min 33 s; Non-learning/entertainment: 1 min 6 s

(a) Refer: https://itunes.apple.com/nz/app/rocket-speller/id492504689?mt=8. (b) Refer: http://smartypantsschool.com/wp/?page_id=21.


Table 3. J&N’s Learning Pathway
Columns for each app below, in order: Apps accessed and thumbnail | Description/stated learning focuses | Scenario/description/elapsed time | Surrounding student and/or teacher dialogue or interaction.

Jungle Time

A digital/analogue time telling practice app. Students required to convert analogue to digital time, with adjustable difficulty levels ranging from whole hour down to 1 min increments. Built in time teaching tutorial. Choice of jungle animal and associated sound reinforcement.

2 min 15 s. Students are taking turns to rotate digital time wheels to match clock. One student is inputting while other is discussing with them the correct response. Corrective feedback being provided by peer as each takes their turn.

N: You need to move it to the 9 . . . J: No, it’s 12 . . . N: No, that’s the big hand! The big hand . . . it’s got to be on 9 . . . that’s the little hand, see . . . J: OK (J moves time wheels towards the 9) N: Now do ‘done’. (J presses ‘done’ button) N: My turn now. J: Yeah, OK. How ‘bout changing the animal noise? N: Let’s do a monkey . . . (laughing) Students continue until 4 min 47 s then change app.

Pop Maths

Reinforces basic math facts (+, −, /, ×). Students required to match maths problems with correct answer by ‘popping’ the correct combination of balloons. Problems increase in difficulty (number difficulty and mixed operators) according to student success. Students can select single operator and number range if desired.

9 min 07 s. Students entered this app at 4 min 58 s. They are forced to start at level 1 as app does not allow user level set-up. They take turns to match balloons, ‘popping’ them with their fingers, although both contribute orally to solving the problems. Thumbnail shows equations from level 6. Few difficulties to this point.

N: Yeah, it’s 5 . . . (J presses ‘3 + 2’ and ‘5’ balloons simultaneously and they pop) N: Cool, we got 18 now (referring to points total) J: 14 take away 4. N: That’s a hard one . . . (N can be heard counting backwards from 14). Try 10 . . . it’s 10! (J presses ‘14-4’ and ‘10’ balloons simultaneously and they pop) N: YES! 19 now . . . Students continue and complete level.

Pop Maths

As above

17 min 33 s. By this time, the students had reached level 9 and had scored 26 points. Problems are getting more difficult with larger numbers and multiple operators.

J: These are getting tricky . . . do you wanna do a different one? N: But we’ve got 26! We’re getting into the hard ones, like a big kid. 14 plus 5 . . . (N begins to tap balloons randomly but after three incorrect attempts stops) J: I know . . . I’ll get a number thing (a cardboard number line from 0–30). (Pause). We’ll use this . . . (pause) (J can be heard counting 5 up from 14). N: It’s 19. (N taps ‘14 + 5’ and ‘19’ balloons simultaneously and they pop) 27! Yeah! Students continue and complete level.

Pop Maths

As above

21 min. Students reach level 13 but have only scored 28 points (only 2 more than at level 9). They had discarded the number line as problems were too large, and generally reverted to a ‘process of elimination’ (guessing) approach. This meant despite progressing levels, the mistake rate was high which affected their overall score.

J: My turn . . . I’ve no idea. This is really hard . . . N: Just touch the green ones . . . (J randomly taps different combinations of light green balloons to no effect) N: 7 . . . that’s 6 plus 1. Touch them . . . (J taps 7 and 6 + 1 and balloons pop) . . . good, that’s 29. Just guess the rest . . . J: OK. (J randomly taps different balloons until all correct combinations are guessed) N: We’ve got 31. We get to know the big numbers ‘cause we already know the little ones . . .

Pop Maths

As above

23 min 33 s. Students at level 14 and have accumulated 31 points. Speed of balloon movement increased. By this stage, problems are beyond student capabilities, with correct combinations resulting from guesswork. Effort is transferred from genuine problem-solving attempts to finding the quickest way to increase points total (gamified).

N: Have a race . . . (laughing) . . . see who can pop them the quickest (N randomly taps different combinations until a correct one is discovered) J: I’ll have a go (process is repeated). Students continue taking turns in this way until all balloon combinations are discovered. J: We’ve got 33. Students progress to level 15. N: (pause) I don’t think any of us can do it . . . J: Yeah, change . . . (N closes the app and begins to select a new one)

Time analysis – Total run-time (this sample): 25 min 16 s; Selection and loading: 2 min 36 s; Learning: 17 min 28 s; Non-learning/entertainment: 5 min 12 s


Table 4. N’s Learning Pathway
Columns for each app below, in order: Apps accessed and thumbnail | Description/stated learning focuses | Scenario/description/elapsed time | Surrounding student and/or teacher dialogue or interaction.

Dr Seuss – The Cat in the Hat: Green Eggs and Ham

The app is one of the read-a-long books in the Dr Seuss series. Students have the option of selecting from three interaction modes, easily accessed from the bottom of the title page. This ‘Lite’ version is free, while the full version needs to be purchased.

Dr Seuss – The Cat in the Hat: Green Eggs and Ham

As above

3 min 51 s. This student usually chose or was forced to work alone, as his learning and social difficulties often left him unpaired. He was often recorded ‘talking’ to the device, in a manner similar to if he was working with a peer. N opened this app and selected the ‘Read to Me’ option. 4 min 32 s. N has reached the end of the ‘free’ version, but appears confused as he indicates knowledge that it was not the end of the story.

N: Read-to-me . . . read to me. Yes, I want you to read to me please . . . (taps the button). N then moves through pages listening as the story is read. He repeats the words on each page after it has finished. N: ‘I am Sam . . . Sam-I-am . . . That Sam-I-am, that Sam-I-am . . . I do not like that Sam-I-am.! Do you like green eggs and ham? I do not like them Sam-I-am. I do not like green eggs and ham.’ App concludes at this point. N: Huh? Oh . . . this comes up. Where’s the rest? There’s s ’post (sic) to be more. I’ll start again . . . (closes app) ...

Dr Seuss – The Cat in the Hat: Green Eggs and Ham

As above

4 min 56 s. N re-enters app, this time selecting the ‘Read it myself’ option.

N: I want to read it this time. I want to try myself please . . . N reads the story as previously but without voice-over assistance. He self-corrects on five occasions during reading.

Dr Seuss – The Cat in the Hat: Green Eggs and Ham

As above

5 min 46 s. N reaches the end of the ‘free’ version as previously.

N: What the . . .? Why do you keep doing this to me? (prompts T for assistance). N: Mrs F., it keeps coming up with this! T: Oh. We have to buy it to get the rest N. We only have the free bit . . . N: Can we get it? T: Do you like it? N: It’s funny. It helps me learn my words. T: OK, I’ll put it on my list (T leaves). N closes app at 6 min 48 s and selects ‘Toy Story’ read-along book. He ignores the book (N: ‘it’s too hard’), immediately selecting the ‘Sheriff Woody’ colouring-in activity. N: OK . . . time to have a bit of fun with you . . .

Time analysis – Total run-time (this sample): 2 min 57 s; Selection and loading: 0 min 8 s; Learning: 2 min 49 s; Non-learning/entertainment: 0 min 0 s


Table 5. S&S’s Learning Pathway
Columns for each app below, in order: Apps accessed and thumbnail | Description/stated learning focuses | Scenario/description/elapsed time | Surrounding student and/or teacher dialogue or interaction.

Pirate Treasure Hunt

The app requires students to use numeracy and literacy skills to solve a range of eight ordered problems to find the missing treasure. Conceptual learning includes understanding shape, time, interpreting oral, word and visual cues, addition, ordering/sequencing and spelling.

0 min 53 s. Students need to arrange words on a ladder in correct order from coldest to hottest to reach lantern needed to explore treasure cave. The parrot speaks the instructions and the word labels. This point marks their second attempt.

S(1): It must be ‘warm’ . . . the word . . . S: Why? S(1): ‘Cos it’s a hot word and the hot words need to go at the top . . . (S selects ‘warm’ and it appears at the top rung of the ladder) S: Now check it . . . (S taps ‘Check’ button) App responds with parrot voice stating ‘Oops! That wasn’t it!’

Pirate Treasure Hunt

As above

0 min 55 s. App clears previous attempt and the students attempt the task for the third time in this session.

S: I’ll do it this time . . . S(1): OK. Maybe we need to put them the other way around . . . you know, with the cold words at the top. It doesn’t tell us . . . S: Shall we try? S(1): Start with boiling . . . it means really hot. Put it at the bottom (S taps ‘boiling’). S(1): Now do ‘warm’ . . . it’s not as bad as boiling . . . (continues in this way until rungs are completed)

Pirate Treasure Hunt

As above

1 min 25 s. Students have completed their third attempt, this time arranging the words in what they considered correct reverse order.

S taps ‘Check’ button when ladder completed. Parrot squawks ‘Only one more try!’ Ladder empties. S: Wrong again! S(1): I don’t know how to do this. I wish it gave you some clues . . . S: It just says you got it wrong! It doesn’t help you. S(1): It’s dumb . . . let’s do another one. S: Yeah . . . (S closes app).

Where’s my Water?

This app requires students to excavate underground pathways for water supply so Cranky the alligator can take a bath. It explores physics concepts of water flow and rate with ‘water following the rules of physics, so gravity makes it flow into any opening it would in the real world’ (a).

13 min 51 s. Students working on this app for approximately 9 min. They successfully completed level 1, selecting ‘Replay’ to try to accumulate more points at this level.

S(1): We got 20. Let’s do it again . . . S: Replay (S taps ‘Replay’ button). That was easy. See if we can get some more points. S(1): (Laughing). Do it again S . . .

Where’s my Water?

As above

16 min 30 s. Students re-selected level 1 two more times. At the conclusion of their fourth replay, they decided to move onto level 2, completing up to Level 2-1 and accumulating 250 points.

S: Wow . . . this looks hard! S(1): Level 2 . . . 2 S: What d’ya have to do here? What do those cross things do? S(1): The water’s in the big tank . . . S: Try to dig it so it gets to the duck down there . . . (S(1) ‘digs’ dirt with finger, making a path for it to the lower duck). It can’t get in! (to the Alligator). Start again shall we? S(1) quits app and selects level 2 from menu. Students successfully repeat level 2-1, but skip levels 2-2, 2–3, 2–4, 2–5, 2–6 and 2–7.

Where’s my Water?

As above

21 min 23 s. Students opt to skip levels through to Level 2-8, after briefly ‘dipping’ in to look at previous levels. At 2-8, students spend 22 s randomly tapping features before closing the app.

S(1): This is going to be hard . . . S: Oh . . . oh! Tricky! S(1): Look at those block things . . . how do those work? S: I have no idea . . . S(1): I know . . . tickle the Alligator. If you touch him sometimes he closes the curtain. (laughing). S touches Alligator. He moves but curtain stays open (both laugh). S(1): Oh! Oh well. Try another one . . . (S closes app).

Time analysis – Total run-time (this sample): 21 min 45 s; Selection and loading: 2 min 39 s; Learning: 10 min 07 s; Non-learning/entertainment: 8 min 59 s
(a) See http://www.slidetoplay.com/review/wheres-my-water-review/.


Table 6. S&H’s Learning Pathway
Columns for each app below, in order: Apps accessed and thumbnail | Description/stated learning focuses | Scenario/description/elapsed time | Surrounding student and/or teacher dialogue or interaction.

Rocket Maths

This app involves students in a range of ‘maths space missions’. They encounter math challenges they need to solve by tapping correct responses from space objects. These include telling the time, counting money, basic facts, shapes, odd/even numbers. Three difficulty levels.

2 min 10 s. Students enter app. After entering ‘space’ an instruction appears at bottom of screen (tap the EVEN numbers). Students appear to ignore it (no verbal recognition) but begin to randomly tap numbers.

H: Get as many as you can . . . S: It’s falling . . . ohhh . . . my rocket . . . (S increases rate of random tapping. When he taps even numbers, moons explode) H: (Laughing) . . . psssshhhh (explosion sound) S: We’ve got 7 . . . H: Come on S . . . faster . . . (rocket descends back to earth, crashing into pieces) H: Ohhh . . . look at all the bits . . .! S: It bounces. . . .

Rocket Maths

As above

3 min 10 s. Students start third mission. This requires matching of time in digital format (6.45) with analogue clock face. H randomly taps clocks as they emerge.

S: . . . go fast H . . . (pause) . . . it’s those ones . . . get those ones . . . (S appears to have identified the pattern to get the clocks to explode, without recognizing the actual time) H: Which ones . . .? S: These ones . . .! (S taps clock indicating 6.45) (H concentrates taps on clocks showing 6.45) H: We’ve got 9 this time . . . this is going to hurt! (referring to rocket crashing). Kapowww . . .! S: See how many bounces it does . . . it’s got more bits than the last one . . . (rocket disintegrates on impact with earth)

Rocket Maths

As above

4 min 21 s. Students required to tap coin combinations adding up to 20c. Coins are American currency. No coin values are visible. Both students appear to tap screen randomly attempting to accidentally discover correct combinations.

H: Money . . .! I like money . . .! S: They look different (referring to coins). That looks like a man . . . our stuff has a lady . . . H: I don’t know . . . let’s both touch . . . (four tap points appear, randomly tapping different coin combinations. After 12 s only one correct combination had been discovered) S: Come on . . . we need to go quicker . . . the rocket’s going . . . (rocket falls to earth). I don’t get it . . . H: Can we go on water?

Where’s my Water?

This app requires students to excavate underground pathways for water supply so Cranky the alligator can take a bath. It explores physics concepts of water flow and rate with ‘water following the rules of physics, so gravity makes it flow into any opening it would in the real world’ (a).

8 min 03 s. After skimming other apps for approximately 3 min, students open this app.

S: Ok. I sort of like this one . . . H: Me too, it’s fun! Look . . . look . . . the ducks . . . shall we can kill the ducks . . .? (S digs trench from water reserve – it flows into the duck’s cave) H: (Laughs) It’s having a shower! Do the other one . . . (S repeats with second duck) S: Oh, we didn’t get enough! H: Go on this one? (referring to repeating app) S: Yeah!

Where’s my Water?

As above

8 min 59 s. Students restart app at same level as previously.

H: My turn this time . . . (H starts to dig trench for water to flow into holding tank) S: We should be able to kill another duck . . . H: . . . now . . . just to dig down to their hole (digs hole to ‘drown’ first duck) (Laughs) . . . that’s 4 we’ve killed! S: This is cool . . . get the other one . . .! (Students repeat this level 5 more times over the next approximately 2 min accumulating ducks as they go)

Time analysis – Total run-time (this sample): 11 min 03 s; Selection and loading: 2 min 26 s; Learning: 0 min 0 s; Non-learning/entertainment: 8 min 37 s
(a) See http://www.slidetoplay.com/review/wheres-my-water-review/.


Figure 4 The Learning Journey Typology

Generally, these students represented ‘best performance’ scenarios for this class. Data indicate in each case students showed a willingness to engage cognitively with learning problems and challenges demanded by the app (and possessed some declarative and procedural knowledge supporting them to do so); displayed essential affective and cognitive attributes such as perseverance, determination and reflection; and in two cases were prepared to adjust the parameters of an app and use its built-in scaffolds to extend their learning (cases I&A; N). Recorded interaction indicated high ratios of learning time to non-learning/entertainment, and the recorded dialogue demonstrated some ability and motivation to exercise learned knowledge to solve problems. Examples of this include I&A’s use of taught phonics skills and syllabification to ‘sound out’ letters to spell words in Rocket Speller and complete the word finds in Smarty Pants School, and J&N’s use of materials (number line) to solve equations in Pop Maths. I&A also deactivated the ‘read along’ option in Smarty Pants Rhymes to enable them to read the rhymes themselves without text-to-speech support. This was also the case for N in his interaction with Dr. Seuss’s Green Eggs and Ham. In both of these cases, the ability to adjust the app’s settings enabled learning to be extended by providing the option of independent reading.

Another characteristic of this group was their ability to deal with distractive app content or features. The effect of these on other students has been comprehensively detailed in an earlier article (Falloon, 2013). However, despite the presence of enticing animations, colourful pop-ups and entertaining game components (reward/reinforcement), students in this group generally kept their focus on the app’s learning goal to the extent of their capability (and their willingness to use built-in scaffolds to extend that capability). Unlike most other pairs, potentially distractive content such as the responsive animations in J&N’s Pop Maths was largely ignored. I&A also appeared dismissive of rewards such as certificates that could be gained after certain levels of achievement had been reached (see comment in Smarty Pants Word Find). While not included in the slice of J&N’s pathway, they also demonstrated a ‘lukewarm’ response to these rewards, with J commenting, ‘anyone can get one, no matter how bad they are’ (J&N, capture 2, 7.05–7.09). Interestingly, a slight tendency towards this was also noted in other data, with the impact of similar rewards diminishing with repeated student use of the app.

However, while recordings for these groups demonstrated determination and persistence in learning engagement, in each case they eventually reached a point where their existing knowledge, and ability and/or willingness to use any app built-in scaffolds to extend that knowledge, effectively ‘ran out’. That is, the app’s content difficulty level incrementally crept beyond students’ capabilities due to its (usually) unalterable automatic promotion characteristic (i.e., content difficulty automatically increases in response to correct answers). The point at which this occurred (‘The Wall’ on Figure 4) varied, depending on the level of cognitive effort and dispositional qualities of each pair, enabling greater or lesser penetration of the ‘knowledge amplification’ space. When this point was reached a ‘bounce off’ effect occurred, generally resulting in the app being closed and a different option selected. App design also played a significant role in locating ‘The Wall’ for each pair, with the nature of feedback provided by the app being particularly important. Apps containing corrective feedback – for example, tutorials or exemplars responding to detected student deficits or multiple incorrect attempts – were most effective in supporting knowledge building. Those where feedback was limited to points on a score chart or visual or audio reinforcements – such as character hand clapping or dancing animals – appeared far less effective. Of the 40 pair combinations, 11 were coded as displaying performance characteristics aligned with knowledge application-amplification.

Case S&S: accommodation

This example is illustrative of the performance of the majority of pairings in this study (20 were coded as displaying characteristics aligned with accommodation). While displaying some willingness to engage cognitively with app challenges and problems, this group preferred to work within known capabilities and/or, if possible, adjust the parameters of the app (difficulty level, maths operator, available time, etc.) to match these capabilities. A typical strategy these students applied was multiple repetition of a level they had previously accomplished in order to accumulate ‘success points’. The example illustrates this in S&S’s interaction with ‘Where’s my Water?’ where they chose to repeat levels 1 and 2 several times, skipping higher levels judged to be too difficult.

Where apps lacked an obvious goal, understandable and accessible instructions, or failed to provide scaffolds or corrective feedback, students coded in this group did not display the level of cognitive persistence or determination of those described earlier. Instead, they became easily diverted into non-learning/entertainment interactions, or ‘bailed out’ to start a different app. S&S demonstrated this in their response to the temperature-ordering task in ‘Pirate Treasure Hunt’. While the app provided summative verbal and text feedback responding to each attempt, it offered no corrective mechanism that the students could use to improve on their previous performances. This frequently resulted in mistakes being repeated, and, as in the example, eventual abandonment. S&S noted frustration with this, commenting on the desirability of apps providing ‘clues’, rather than simply telling them that they had the wrong answer. A substantial number of pairs coded in this category (n = 13), when faced with similar predicaments, were recorded diverting into app features or content that they found in some way appealing, entertaining or less challenging. The S&S case illustrates this by their interaction with the alligator when they could not solve the puzzle at level 2-8 on ‘Where’s my Water?’ This tendency towards gamification when the going got tough (i.e., engagement with an app in an entertaining way not linked to its learning goal) emerged in this group, and took a number of forms. Some responded similarly to S&S, while others (n = 6) deliberately input incorrect responses or guesses, to see what would happen. Four pairs were recorded setting up a competition between themselves to see which could crash their rocket the fastest by offering random inputs in ‘Rocket Maths’.

The phenomenon of gamification really began to emerge in pairs at this point on the typology. It was triggered by a combination of diminished willingness or capacity to engage cognitively, poor instructions or a lack of formative feedback mechanisms or scaffolds (as for S&S), and a situation that was labelled ‘app fatigue’ (see Figure 4). App fatigue could best be described as boredom brought about through overfamiliarity or repeated use of an app, prompting a sense of ‘déjà vu’. Early recorded data indicated a tendency for these young students, if presented with a large number of different apps, to spend much of their time on surface-level skimming or sampling across the apps, usually failing to engage with any in a substantial or learning-oriented way. A decision was made to provide more structure by selecting and sorting apps into bundles of four or five according to the learning focus at the time, and then into separate folders for each day of the week. However, while this system went some way to solving one issue, it spawned another. Through overexposure, limiting the number of accessible apps contributed to the emergence and eventual prevalence of app fatigue in this group, fuelling their tendency towards gamification.

From the teacher’s perspective, difficulties existed in detecting skimming, app fatigue and gamification. At a glance from a distance (the usual scenario in a busy junior classroom), it appeared students were thoughtfully engaged in learning with the app. However, it was not until display recordings were reviewed that the nature of actual activity was revealed. This raises wider issues relating to the difficulties teachers face capturing evidence of student learning pathways with these devices. Such evidence is crucial if they are to be in a position to offer diagnostic or remedial support in the case of learning difficulties.

Case S&H: abdication

A smaller but still noteworthy number of pair interactions (n = 9) were coded under abdication. From the outset, students in this category displayed minimal cognitive engagement with the learning-oriented demands of the app, choosing instead to concentrate their efforts on any game component present, or convert the learning task into a game (gamification). In addition to ‘cognitive reluctance’, frequently this behaviour was linked to or triggered by poor app design. Factors such as the absence of an obvious learning goal or purpose, missing, overly complex (for 5-year-olds) or inaccessible instructions, or cultural mismatches in content, quickly caused students in this category to disengage from learning-focused interaction. Of the 45 apps used in this trial, only five had an option for text-to-speech instructions or explanation of content. Although conceptually app content may have been rated as suitable for young children, interacting with it profitably relied on students understanding what they needed to do and how to do it, and what the learning purpose or goal of the app was. To achieve this, instructions and goals needed to be accessible and understandable for students who had very limited reading ability, independent of teacher support. Unfortunately, this was seldom the case.

Such issues had a compounding and negative effect on the performance of students coded in this category, who also displayed a general reluctance to develop their own solutions or seek assistance. Typically, they responded in a combination of ways. Most (n = 6) sought out the game or entertainment component in the app and spent much of their session time engaged in completing it, or developing variations to it. They were also recorded challenging each other to see who could complete games the fastest or with the most impact. S&H’s interaction pathway for ‘Rocket Maths’ illustrates this in their efforts to see who could make the rocket disintegrate into the most pieces upon its return to earth, as does their quest to drown as many ducks as possible in ‘Where’s my Water?’. Interaction strategies used by students in this category lacked deliberation. Instead, they variously employed a combination of guesswork and random trial and error when attempting to solve problems, relying on chance to arrive at correct answers or combinations. S&H’s pathway provides a typical illustration of this. Affective feedback in the form of points scores motivated these students by providing a visual display of progress, and it allowed benchmarking against each other in competitions. Other commonly recorded responses for this group included skimming and bailing out. Where initial forays into an app suggested a requirement for cognitive effort beyond their inclination or capability – or if the game component held limited appeal – these students were quick to close the app and reselect, sometimes repeatedly. Likewise, if they encountered challenges within the app that they judged met the same criteria, they would respond similarly. Often they would repeat this process until the array of available apps was exhausted, eventually settling on what they considered their best option. As previously discussed, it was impossible for the teacher to detect this activity unless she observed the students closely. From a distance, it appeared they were busily focused on their learning task, and usually no retrievable record of their activity was available.

Implications from results

Some may criticize this study for its content-consumption focus in device use, claiming that the approach to learning this represents is outdated and does not take full account of the iPad's content-creation potential. However, it must be remembered that, being new to school, these students did not possess the literacy, numeracy or research capabilities, or sufficiently independent work habits, to undertake unsupported content-creation tasks. In fact, the reason this experienced teacher chose to use the devices in this way, and carefully selected the apps using robust criteria, was to assist in developing these very skills, which she saw as prerequisites for more advanced and independent use later in the students' schooling. She had also noted the motivational effect of the devices on her own children, and hoped to capitalize on a similar effect with her class. It is likely similar arguments would exist for many other teachers considering integrating these devices into their young students' curriculum. On this basis, the study highlights a number of implications for teachers and app developers.

Student knowledge and dispositions

Acknowledging that the apps used in this trial represent only a small sample of the thousands available, the majority selected offered few learning scaffolds, or formative or corrective feedback features, that students could use to build the knowledge needed to solve problems or advance through levels within apps. While most provided motivating environments where existing knowledge could be practised and reinforced, only five included features that provided even the most basic forms of this type of feedback. Students' lack of conceptual or content knowledge was not compensated for by any supportive features in most of the apps used. For teachers, this highlights the danger of assuming, by default, that learning is being extended while students are interacting with these devices, a judgement usually based on visibly high levels of motivation. Teachers have an important role to play, through deliberate acts of teaching and the fostering of skill development, in ensuring students know what they need to know and possess the dispositional traits essential to engaging profitably with these resources. App developers could also benefit from working alongside teachers to learn more about suitable content and features to better support and extend student learning.

Teachers selecting apps

While the apps used in this study all rated highly in online reviews, were recommended for young students (K-2), and were appraised by the teacher before selection, it was not until they were actually used with the students that problems became apparent. The main issues were absent, poor or inaccessible instructions; an unstated or obscure learning purpose; a lack of options other than difficult text for instruction or content delivery; inadequate structure guiding student interaction; an overemphasis on games of limited learning value; and limited or no built-in learning scaffolds or formative feedback systems. In selecting apps, teachers would be well advised to spend time evaluating each very carefully to ensure learning goals are apparent, accessible to students and matched to teaching objectives. Instructions and content should be available in a form students can understand and interact with reasonably independently, and, if possible, game or practice components should include some element of formative or corrective feedback. Ideally, user parameters should also be customizable to suit different learning contexts and student needs.

Collecting evidence and monitoring learning

The study highlighted difficulties in gathering evidence of student learning from these devices, due mainly to apps not storing results or providing any other easily accessible record of student interaction. While some offered the option of a result printout after each session, this required access to an AirPrint-capable printer (Apple Inc.), which this school did not have. Additionally, as devices were shared, students were not guaranteed to get the same device each time. This frequently resulted in students repeating levels or content, as they either got a different device, forgot where they were up to, or the app required them to start again from the beginning. The capacity for apps to store and retrieve results, and a system whereby students use the same device each time, appear desirable in scenarios where devices are shared. Ideally, one device per student would mitigate this issue, but such a scenario may prove challenging for many schools.

Motivation and engagement

As introduced earlier, a number of studies point to the motivational effects of these devices on students. Most speculate that this contributes in some way to enhanced learner engagement and, subsequently, improved learning outcomes. Some of these studies were quite intensive in nature, frequently using a single device or a small number of devices supported by substantial teacher (or equivalent) input, and involving an individual or small group of students focused on a narrow, often remedial task (e.g., McClanahan et al., 2012). Others reported on devices used in 'concentrated bursts' lasting only a few days or weeks. Some of these targeted specific student needs such as literacy (e.g., Hutchison et al., 2012), while others sought information on how students managed the devices (e.g., Geist, 2012). Although not disputing claims made in these reports about the motivational effects of using iPads with students, results from this study of a more general classroom context suggest a level of caution in automatically linking this motivation with thoughtful learning engagement. While strong motivational effects, as judged by visible 'on task' activity, were noted in this study, the difficulty was not knowing what this outwardly visible motivation was focused on. Unlike a more traditional teaching situation, where the teacher could monitor students as she moved around the classroom or worked with groups, and assess and offer formative input as needed, monitoring students working on iPads proved more difficult. At a glance, appearances suggested students were eagerly engaged with the cognitive challenges posed by the apps they were using. Even on closer inspection, unless the teacher spent more than a few minutes observing, it was almost impossible to determine accurately the depth of actual learning that was occurring. Furthermore, the poor logging characteristics of most apps, as discussed in the previous section, did not assist this situation. The modified Display Recorder app installed on each device was instrumental in revealing such issues.

Conclusion

It is important at this point to make clear that this study is not 'anti-iPad'. The author is firmly convinced that these devices, and ones like them, hold unrivalled potential as powerful learning tools in the hands of students working with skilful and diligent teachers. Their unique interface, simplicity, intuitive design, portability, connectivity, speed, range of apps and relative affordability mark them as a significant technological leap forward in the array of digital resources available to support learning at all levels. However, while acknowledging the limitations of this study in terms of the age and number of students, the relatively small number of apps trialled, and its concentration on content consumption, the data recorded using the modified Display Recorder app mapped an honest and accurate account of how these young students interacted with the device and its apps when working independently of teacher assistance. It revealed a number of issues related to app design and content, the importance of deliberate teaching to ensure students have the necessary knowledge and skills to use apps profitably, and difficulties with monitoring progress and assessing and recording achievement. It also offered a gentle challenge to the popularly held notion that, when students are using these devices, high motivation equates to thoughtful learning engagement. While not suggesting devices like iPads do not have an important role to play in our educational future, teachers would be well advised to be thoroughly diligent in monitoring their students' interaction with them, and ruthlessly critical in their selection of apps. Salient lessons such as those described by Maddux and Cummings (2004) and Masters (2002) should be heeded if the iPad is to fulfil its undoubted potential in schools.

References

Aronin, S., & Floyd, K. (2013). Using an iPad in inclusive preschool classrooms to introduce STEM concepts. Teaching Exceptional Children, 45(4), 34–39.

Bilby, L. (2013). IT classrooms of the future. The New Zealand Herald, June 9, 2013. Retrieved from http://www.nzherald.co.nz/nz/news/article.cfm?c_id=1&objectid=10889272


Billings, E., & Mathison, C. (2012). I get to use an iPod in school? Using technology-based advance organisers to support the academic success of English learners. Journal of Science Education and Technology, 21(4), 494–503.

Brown-Martin, G. (2010). Game changer: Is it iPad? Handheld Learning. Retrieved from http://www.handheldlearning.co.uk/content/view/64/

Cochrane, T., Narayan, V., & Oldfield, J. (2013). iPadagogy: Appropriating the iPad within pedagogical contexts. International Journal of Mobile Learning and Organisation, 7(1), 48–65.

Crichton, S., Pegler, K., & White, D. (2012). Personal devices in public settings: Lessons learned from an iPod Touch/iPad project. The Electronic Journal of e-Learning, 10(1), 23–31.

Falloon, G. W. (2013). Young students using iPads: App design and content influences on their learning pathways. Computers and Education, 68, 505–521.

Foote, C. (2012). Learning together: The evolution of a 1:1 iPad program. Internet@Schools, 19(1), 22–26.

Geist, E. A. (2011). The game changer: Using iPads in college teacher education classes. College Student Journal, 45(4), 758–768.

Geist, E. A. (2012). A qualitative examination of two-year-olds' interaction with tablet-based interactive technology. Journal of Instructional Psychology, 39(1), 26–35.

Gwet, K. L. (2012). Handbook of inter-rater reliability (3rd ed., pp. 15–46). Gaithersburg: Advanced Analytics.

Hall, O., & Smith, D. (2011). Assessing the role of mobile learning systems in graduate management education. In R. Kwan, J. Fong, L. Kwok, & J. Lam (Eds.), Hybrid learning: Proceedings of the 4th International ICHL Conference, Hong Kong, 10–12 August 2011 (pp. 279–288). Berlin: Springer-Verlag.

Hedman, J., & Gimpel, G. (2010). The adoption of hyped technologies: A qualitative study. Information Technology Management, 11, 161–175.

Heick, T. (2012). The past, present, and future of the iPad in learning. Emerging technologies and mobilisation. TeachThought. Retrieved from http://www.teachthought.com/ipad-2/the-past-present-and-future-of-the-ipad-in-learning/

Hutchison, A., Beschorner, B., & Schmidt-Crawford, D. (2012). Exploring the use of the iPad for literacy learning. The Reading Teacher, 66(1), 15–23.

Ihaka, J. (2013, January 30). Schools put tablets on stationery list. The New Zealand Herald, Section A4.

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33, 159–174.


Lynch, J., & Redpath, T. (2012). Smart technologies in early years education: A meta-narrative of paradigmatic tensions in iPad use in an Australian preparatory classroom. Journal of Early Childhood Literacy, 0(0), 1–28.

Maddux, C. D. (1986). The educational computing backlash: Can the swing of the pendulum be halted? Computers in the Schools, 3(2), 27–30.

Maddux, C. D., & Cummings, R. (2004). Fad, fashion and the weak role of theory and research in information technology in education. Journal of Technology and Teacher Education, 12(4), 511–533.

Manuguerra, M., & Petocz, P. (2011). Promoting student engagement by integrating new technology into tertiary education: The role of the iPad. Asian Social Science, 7(11), 61–65.

Masters, G. N. (2002). Towards a national school research agenda. Melbourne: Australian Council for Educational Research. Retrieved from http://www.aare.edu.au/99pap/mas99854.htm

McClanahan, B., Williams, K., Kennedy, E., & Tate, S. (2012). How use of an iPad facilitated reading improvement. TechTrends, 56(3), 20–28.

Pegrum, M., Howitt, C., & Striepe, M. (2012). Learning to take the tablet: How pre-service teachers use iPads to facilitate their learning. Manuscript submitted for publication.

Saine, P. (2012). iPods, iPads and the SMARTBoard: Transforming literacy instruction and student learning. The NERA Journal, 47(2), Computers in the Classroom, 74–79.

Sheth, J., Newman, B. I., & Gross, B. L. (1991). Consumption values and market choices: Theory and applications. Cincinnati: South Western.

Webster, M. (2012). The NZ iSchool. The New Zealand Herald, p. 8.

Notes

1. See http://www.apple.com/education/profiles/.
2. An explanation of this scale can be found here: http://www.minedu.govt.nz/NZEducation/EducationPolicies/Schools/SchoolOperations/Resourcing/ResourcingHandbook/Chapter1/DecileRatings.aspx.
3. See http://www.studiocodegroup.com/?page_id=77.

Appendix I: Apps Used During the Study

Play Square, Rocket Speller, Smarty Pants School series, BlobbleWriteHD, Mr Phonics Series, Pinocchio, Icky Bathtime, Cat in the Hat (Lite), Scramble, PCS 3 Letter Words, Hay Day, Pirate School, Where's My Water?, Cut the Rope, Treasure Hunt, Pet Shop, Green Farm, Kids' Puzzles, Pattern Game, Animal Fun, Matches, PickinStick Classic, Math Bingo, Game of Life, Dots for Tots, Jungle Time, PopMath, Rocket Maths Free, Geometry Maze, Toy Puzzles, Logos Quiz Blitz, Reading Comprehension, Princess Pea, The Three Pigs, Gingerbread Maker, Snow White, Hairy Maclary Golden Lite, Magnetic ABC, Talking Tom and Ben News, My Dogs, The Emperor, Little Mermaid, The Berenstain Bears Lite, Pirate Treasure Hunt.