IF YOU BUILD IT WILL THEY COME? SUPPORTING FACULTY IN ASSESSING PROBLEM-BASED LEARNING

Louise Yarnall, SRI International, USA, [email protected]
Jane Ostrander, Truckee Meadows Community College, USA, [email protected]

ABSTRACT

Problem-based learning (PBL) curricula involve students in acquiring the skills and knowledge needed to succeed in the world of work, but assessment is still needed to signal to students what is valuable to learn, provide evidence of student learning, and supply both students and instructors with feedback for continuous improvement of instruction. This paper describes an effort to develop a process to support community college and university faculty in learning to design and use assessments with one model of PBL—problem-based learning with scenarios (SBL). The findings indicate challenges to bringing SBL assessment professional development activities to scale. The paper recommends improvements that can assist SBL faculty in designing and implementing assessments.

INTRODUCTION

The Scenario-Based Learning (SBL) projects (2003-2013) were funded by the National Science Foundation to work with community college and university faculty to create and implement scenario-based learning and assessment in their science, technology, engineering, and mathematics (STEM) classrooms.
In the SBL approach to PBL (Schank, 1997), students are engaged in a loosely constructed, industry-inspired scenario intended to teach technical knowledge and meta-cognitive and professional skills. SBL curricula offer a solution to persistent complaints that U.S. graduates of technician programs lack the skills to apply knowledge flexibly (Kolmos & Holgaard, 2010; Levy & Murnane, 2005; Lynch, 2000). Efficacy studies focused on cognitive learning outcomes show that PBL is comparable and sometimes superior to traditional instruction in facilitating content learning and that it supports learning flexible problem solving, application of knowledge, hypothesis generation, and coherent explanation (Dochy, Segers, Van Den Bossche, & Gijbels, 2003; Hmelo, 1998; Schmidt et al., 1996). In a study of the range of knowledge and skills that students learn from SBL curricula, researchers found that such curricula provided opportunities to acquire technical problem-solving skills as well as a range of social skills around teamwork and social-technical skills (Yarnall & Ostrander, in press).

Accordingly, to assess such learning, an assessment needs to target more than a student's capacity for factual recall. This paper will describe the development and implementation of a process to support U.S. community college and university STEM educators in creating such assessments.

THE DESIGN PROBLEM: BUILDING FACULTY ASSESSMENT LITERACY

Facilitating and assessing PBL demands greater understanding of teaching and learning from faculty (Kolmos, Du, Holgaard, & Jensen, 2008). Yet, in the U.S., community college faculty are typically hired for their technical content knowledge and industry experience—not their pedagogical or andragogical training. Few educators receive any formal training in assessment (Guskey, 2006). Indeed, even K-12 educators, who face extensive credentialing requirements for pedagogy, lack extensive experience in the art and science of assessment design and use (Shavelson, 2003). Stefani (2004-2005) has recommended increased professional development in assessment to help higher education faculty teach students more sophisticated skills, such as analyzing and evaluating information. Past work in cultivating teachers' assessment literacy has encountered numerous barriers, including resistance based on fears of accountability, lack of time, and little clear vision of what to do with assessment results beyond entering a grade in a grade book.
Fostering change in assessment requires time to reflect and to integrate ideas into one's own work (Sato & Atkin, 2006/2007). In a review of 14 post-secondary institutions' efforts to improve assessment on their campuses, Maki (2010) describes "challenges to gain faculty momentum in assessment" and "patterns of faculty resistance that range from initial mild discomfort to outright denial based on the enduring belief that grades themselves are the best documentation of student learning" (p. 3).

The design of the SBL project's assessment process has passed through two phases. In Phase 1 (2006-2008), researchers with assessment expertise co-designed assessments with faculty who were developing new SBL tasks. In the next phase (2008-2010), researchers and instructors collaborated to develop and implement an evidence-centered assessment reflection (EC-AR) process and tools to help faculty prepare effective task assessments during professional development workshops (Yarnall & Ostrander, in press). The current faculty-led phase (2011-2013) will create online tools to guide faculty assessment design and implementation.

RESEARCH QUESTIONS

This paper will address the following research questions:

1. How have faculty resisted assessment design and use during the SBL project?
2. How have faculty endorsed assessment design and use during the SBL project?
3. What quality of assessment materials has been designed during these two phases of the SBL project?
4. When examining these patterns of faculty resistance and endorsement and the quality of assessment materials over the two phases, what are the core elements of the SBL assessment approach that must be maintained when bringing the professional development system to scale?

METHODOLOGY/ANALYSIS

In analyzing results from the SBL project, the team leaders reviewed several key documents, including annual evaluation reports, annual and final project reports, and SBL assessment materials produced during the project.
From these data sources, researchers focused on specific evidence of resistance and endorsement and on the actual assessment materials produced. They tracked the following data: a) evidence of faculty resistance, as indicated by alternative interpretations of a workshop assignment in the online community discussion threads and direct email feedback to workshop leaders; b) evidence of faculty endorsement, as indicated by consistent interpretations of a workshop assignment in online discussion threads and direct email feedback to workshop leaders; and c) evidence of differences in assessment materials produced in each phase of the project, as indicated by an assessment comprehensiveness coding system.

This analysis involved reviewing SBL assessment documentation created through an evidence-centered design approach (Mislevy & Riconscente, 2006), which captured the knowledge and skills to be learned—and measured—in a fine-grained manner to improve the assessment validity argument. A coding system was applied to the assessment documentation based on (1) the literature indicating the potential of PBL curricula to teach a range of technical problem-solving, teamwork, and communication skills (Downing, Kwong, Chan, Lam, & Downing, 2009; Hmelo, 1998; Hmelo-Silver, 2004; Reynolds & Hancock, 2010) and (2) prior project research that characterized the core learning outcomes of faculty-developed SBL modules as technical problem-solving skills and social and social-technical forms of knowledge and skill. The following activities in the modules and assessments were classified as technical problem solving: research and analysis, framing a problem, generating a product, using tools, and making inferences (Yarnall & Ostrander, in press). Activities around presenting or justifying solutions to technical problems were classified as social-technical, and activities involving teamwork were classified as social.

Accordingly, assessment materials produced during each phase were rated on a three-level scale (low to high) that characterized how comprehensively the materials targeted the technical, social-technical, and social skills. Materials received "high" ratings if they included at least one social and social-technical skill type and two phases of technical problem solving, which, by its nature, should be multi-step or multi-phase (Jonassen, 2000). Materials received "moderate" ratings if they included at least one social and social-technical skill and one phase of technical problem solving. Materials received a "low" rating if they included one type of social and social-technical skill or one phase of technical problem solving.
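To make the rating rule concrete, the minimal sketch below (in Python, not part of the project's coding instrument) encodes one plausible reading of this rubric; the function name, arguments, and counting scheme are illustrative assumptions rather than the project's actual procedure.

```python
# Illustrative sketch only: one reading of the three-level comprehensiveness
# rubric described above. Names and counting scheme are assumptions, not the
# SBL project's actual coding instrument.

def rate_comprehensiveness(social_types: int,
                           social_technical_types: int,
                           technical_phases: int) -> str:
    """Return 'high', 'moderate', or 'low' for one set of assessment materials.

    social_types           -- number of social (e.g., teamwork) skill types targeted
    social_technical_types -- number of social-technical (e.g., presentation) skill types targeted
    technical_phases       -- number of technical problem-solving phases targeted
                              (research/analysis, framing, generating a product,
                              using tools, making inferences)
    """
    covers_social_skill_types = social_types >= 1 and social_technical_types >= 1
    if covers_social_skill_types and technical_phases >= 2:
        return "high"
    if covers_social_skill_types and technical_phases == 1:
        return "moderate"
    return "low"


if __name__ == "__main__":
    # Materials targeting teamwork, a presentation, and two problem-solving
    # phases would be rated "high" under this reading of the rubric.
    print(rate_comprehensiveness(social_types=1,
                                 social_technical_types=1,
                                 technical_phases=2))
```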


FINDINGS PHASE 1: RESEARCHER-FACULTY ASSESSMENT CO-DESIGN

In the initial phase, researchers from SRI International engaged in assessment co-design with faculty who were designing SBL tasks. SRI researchers interviewed faculty about learning goals, evidence of learning in student behavior and work products, rubrics, teamwork and project management, and the complexity of problem solving. These findings were documented in "design patterns," templates based on an ECD-inspired assessment system called PADI, or Principled Assessment Designs for Inquiry (Mislevy & Haertel, 2006). A design pattern includes several attributes of an assessment: the rationale for teaching the knowledge/skills to be assessed, lists of the knowledge/skills to be assessed, prerequisite knowledge/skills, work products, observations, characteristic features of an assessment of the knowledge/skills, and variable features that make assessments easier or harder. Design patterns may be used repeatedly to create formative or summative assessments.

Elements that Faculty Endorsed

Some faculty used the design patterns to create a set of assessments to use with their SBL tasks. Below are the six findings from the work of five participating faculty:

Teamwork assessment (Social). Faculty conducted informal progress checks with teams, identifying problems around project management and team dynamics. Few faculty actually graded team participation, but those who did employed a "peer assessment" technique in which students rated all team members by how much they contributed.

Presentation assessment (Social-technical). Faculty focused on aspects such as content accuracy and thoroughness, presentation organization, professional appearance, and eye contact. Some brought in an outside professional to view the presentations to provide a more authentic experience.


Technical assessment (Technical problem solving). Faculty reported that SBL let them observe aspects of student processes not seen before. As a result, they engaged in more nuanced coaching of student technical processes.

Types of assessments. Faculty members preferred to focus on the type of assessment task they were using (e.g., presentation, final exam, or log book); these were specific artifacts they were accustomed to using in their classrooms.

Grading procedures. Grading procedures varied widely. Some faculty provided a holistic grade with broad feedback about student performance; some used detailed checklists of performance features assigned weighted numeric scores; some graded students simply on whether they handed in an assignment on time.

Rubrics. When asked to supply researchers with existing assessments, none of the faculty shared any rubrics. When working with researchers to develop assessments, however, faculty members shared good ideas for how to specify rubrics.

Faculty Resistance to Assessment

During the initial interview, resistance took the form of complaints about the length of the interview and of non-responsive answers to interview questions, such as descriptions of the knowledge and skills (learning objectives) that were too vague or too complicated to be useful for developing assessments. During SBL development, resistance took the form of refusals to create or implement assessments, usually because of a lack of time or a perception that the assessment process might alienate students.

The review of the comprehensiveness of the formative and summative assessments in Phase 1 is presented in Table 1. As the data indicate, researchers could usually ensure that the SBL assessments covered multiple phases of the technical problem-solving process and different types of social and social-technical knowledge. Researchers could also ensure that different modes of assessment were designed (formative and summative). The moderate ratings occurred when faculty designers declined to include specific assessment rubrics around teamwork or presentation skills or declined to assess more than one type of technical problem-solving skill.
Table 1. Phase 1 SBL Tasks and Assessments Produced, and Quality of Learning Outcomes Specification

SBL Task                                 Formative Assessment  Summative Assessment  Comprehensiveness of Learning Outcomes Specification
Computer programming: Ajax               Yes                   Yes                   High
Computer programming: Python             Yes                   Yes                   High
Computer programming: Network security   Yes                   Yes                   High
Environmental studies                    Yes                   Yes                   Moderate
Engineering: Structural collapse         Yes                   Yes                   High
Engineering: Rescue robot                Yes                   Yes                   High
Bioinformatics: Databases                Yes                   Yes                   Moderate
Bioinformatics: Search/Data Use          Yes                   Yes                   Moderate
Bioinformatics: Data Use/Analysis        Yes                   Yes                   Moderate
Bioinformatics: Full Analysis            Yes                   Yes                   Moderate

Adjustments Made

Most of the adjustments in this phase occurred around the information collection process. Researchers refined the interview process to link to faculty members' existing assessment vocabulary and to take less time. Researchers also developed systems to document faculty input to ensure that each SBL task assessed the core SBL knowledge and skills.

FINDINGS PHASE 2: ASSESSMENT DESIGN IN SBL DEVELOPMENT WORKSHOPS

To scale up SBL usage, the project team offered six different workshops that educators could attend from June 2008 to May 2010. Since the focus of this paper is on faculty design of assessments using the EC-AR approach, only the two workshops that included assessment design in their agendas will be discussed. These workshops devoted the initial three assignments to defining student learning outcomes (SLOs) and creating SBL tasks, and the fourth to creating assessments. To make the process manageable, faculty were asked to create a formative and a summative assessment for each of their top two SLOs. The process engaged faculty in a paired peer interview to select SLOs for their SBL task and identify relevant knowledge and skills, and in a series of interactive assessment design worksheets.
Elements that Faculty Endorsed

In the workshops, faculty endorsed certain activities and tools. Faculty filled out the SLO and assessment sections of the Task Development Form, sometimes putting a high level of detail into the assessment plans and, in one case, showing considerable assessment literacy. Faculty also endorsed the Project-Based Task Interview activity. They liked talking with each other about their respective SBL tasks and how to prioritize skills or characterize them according to the SBL categories of technical problem solving and social and social-technical skills. Those who endorsed the Assessment Menu and the Design Pattern Template said they liked considering how different assessments revealed different aspects of what students were learning. They also liked having a structure for understanding what and how to measure the various skills students were learning in an SBL task. Some faculty members found it easy to discern the two distinct levels of granularity in the assessment design process, from defining high-level SLOs in the Task Development Form to specifying finer-grained attributes of the assessments in the assessment design pattern templates.

Faculty Resistance to Assessment

When faculty resisted the Assessment Menu and Design Pattern Template, they complained that the work felt redundant. In reviewing their work, it appeared that some faculty members had decided that writing high-level SLOs in the Task Development Form led directly into assessment design, usually using the same approaches they had used for non-SBL lessons. The results were problematic in the following ways:

Central skills to be learned in an SBL task were not identified for assessment. Students doing an SBL task on survey creation were to be graded on how well they wrote a survey report and worked on a team—but criteria for quality around the process skills of survey creation were never specified. In another case, students creating online courses were to be graded on how well they completed a "Seven Components Model," but criteria for rating the instructor's seven components were never specified.
Critical forms of process knowledge and skills to be learned in an SBL task were not differentiated for assessment. In another workshop, only 6 of the 12 participants turned in assessment work, and few of those differentiated among the high-level categories of SBL learning outcomes: technical problem solving, social skills, and social-technical skills. As a result, faculty did not use the design patterns as intended to create distinct assessments for these specific SBL learning categories. Instead, they put all kinds of skills into one design pattern. The final assessments often focused narrowly on the quality of a final product, and it was not clear that students' process skills were being assessed.

The assessment comprehensiveness analysis of the assessment materials produced during the workshops appears in Table 2.

Table 2. Comprehensiveness of Phase 2 SBL Assessments

SBL Task                                          Formative Assessment  Summative Assessment  Comprehensiveness of Learning Outcomes Specification
Consumer Report Surveys                           No                    Yes                   Moderate
Financial Report                                  Yes                   Yes                   Low
Creating an Online Class                          No                    No                    Low
Public Health Presentation                        Yes                   No                    Moderate
Create Flier on Agricultural Industry with Data   No                    Yes                   Moderate
Blog Reviews of Browsers                          No                    Yes                   Low
Accounting Spreadsheets                           No                    No                    Low
Updating Websites                                 No                    Yes                   Low
Report on Novel                                   Yes                   Yes                   Moderate

Adjustments Recommended

A review of the data from Phase 2 indicates a decline in the comprehensiveness of the SBL skills assessed and in the use of both formative and summative assessment modes. Recommended adjustments include the following:




• Provide faculty with a clear understanding that SBL employs multiple forms of assessment, both formative and summative, which focus on different phases of technical problem solving and different types of social or social-technical skill.



• Clarify that assessment design does not flow directly from the identification of high-level SLOs.



• Clarify that the SBL assessment specification process expands faculty ideas about the range of assessments to use and the process skills involved in producing specific products.

CONCLUSION AND DISCUSSION

In any effort to bring an innovation to scale, designers make adjustments to their original approach to support wider dissemination (Coburn, 2003; Dede & Rockman, 2007). As the SBL project team continues its work, it will be important to foster faculty buy-in without undermining the central justifications for using the SBL curriculum. In this paper, we have identified the core categories of learning outcomes that need to be assessed in SBL tasks: a) process skills of technical problem solving and b) process skills of a social or social-technical nature.

The data indicate variation in the assessment literacy of faculty. Those with lower assessment literacy may have problems orchestrating the multiplicity of SBL learning outcomes. A common response is to toss together a mix of familiar assessment approaches, particularly those focused on holistic ratings of final products, without sufficient reflection on whether the assessments target the most important process learning outcomes of the SBL task. In future work, faculty will be engaged in using EC-AR online tools to develop SBL assessments. Our findings suggest that a clear framework defining the range of SBL task knowledge and skills needs to be incorporated into these tools to support faculty understanding and assessment development needs.

This material is based upon work funded by the National Science Foundation under grants DUE #0603297 and #0903276. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.


REFERENCES

Coburn, C. E., 2003, Rethinking scale: Moving beyond numbers to deep and lasting change. Educational Researcher, (32), 6, 3-12.

Dede, C. and Rockman, S., 2007, Lessons learned from studying how innovations can achieve scale. Threshold, (5), 1, 4-10.

Dochy, F., Segers, M., Van Den Bossche, P. and Gijbels, D., 2003, Effects of problem-based learning: A meta-analysis. Learning and Instruction, 13, 533-568.

Downing, K., Kwong, T., Chan, S-W., Lam, T-F. and Downing, W-K., 2009, Problem-based learning and the development of metacognition. Higher Education, 57, 609-621.

Guskey, T. R., 2006, April, “It wasn’t fair!” Educators’ recollections of their experiences as students with grading. Paper presented at the annual conference of the American Educational Research Association, San Francisco, CA.

Hmelo, C. E., 1998, Problem-based learning: Effects on the early acquisition of cognitive skill in medicine. Journal of the Learning Sciences, (7), 2, 173-208.

Hmelo-Silver, C. E., 2004, Problem-based learning: What and how do students learn? Educational Psychology Review, (16), 3, 235-266.

Jonassen, D. H., 2000, Toward a design theory of problem solving. Educational Technology Research and Development, (48), 4, 63-85.

Kolmos, A., Du, X., Holgaard, J. E. and Jensen, L. P., 2008, Facilitation in a PBL environment. Aalborg University, Denmark. Retrieved 26 September 2011 from http://www.euronet-pbl.net/wp-content/uploads/2009/11/Facilitation_in_a_PBL_environment.pdf

Kolmos, A. and Holgaard, J. E., 2010, Responses to problem based and project organised learning from industry. The International Journal of Engineering Education, (26), 3, 573-583.

Levy, F. and Murnane, R. J., 2005, The New Division of Labor: How Computers are Creating the Next Job Market. (Princeton, NJ: Princeton University Press).

Lynch, R. L., 2000, High school career and technical education for the first decade of the 21st century. Journal of Vocational Education Research, (25), 2, 155-198.


Maki, P. L., 2010, Coming to Terms with Student Outcomes Assessment: Faculty and Administrators’ Journeys to Integrating Assessment in Their Work and Institutional Culture. (Sterling, VA: Stylus).

Mislevy, R. and Haertel, G., 2006, Implications of Evidence-Centered Design for Educational Testing (Draft PADI Technical Report 17). (Menlo Park, CA: SRI International).

Mislevy, R. and Riconscente, M. M., 2006, Evidence-centered assessment design: Layers, structures, and terminology. In S. Downing and T. Haladyna (Eds.), Handbook of Test Development (Mahwah, NJ: Erlbaum), 61-90.

Reynolds, J. M. and Hancock, D. R., 2010, Problem-based learning in a higher education environmental biotechnology course. Innovations in Education and Teaching International, (47), 2, 174-186.

Sato, M. and Atkin, J. M., 2006/2007, Supporting change in classroom assessment. Educational Leadership, (64), 4, 76-79.

Schank, R., 1997, Virtual Learning: A Revolutionary Approach to Building a Highly Skilled Workforce. (New York, NY: McGraw-Hill).

Schmidt, H. G., Machiels-Bongaerts, M., Hermans, H., Ten Cate, T. J., Venekamp, R. and Boshuizen, H. P. A., 1996, The development of diagnostic competence: Comparison of a problem-based, an integrated, and a conventional medical curriculum. Academic Medicine, 71, 658-664.

Shavelson, R. J., 2003, On the integration of formative assessment in teaching and learning with implications for teacher education. Paper prepared for the Stanford Education Assessment Laboratory and the University of Hawaii Curriculum Research and Development Group.

Stefani, L., 2004-2005, Assessment of student learning: Promoting a scholarly approach. Learning and Teaching in Higher Education, 1, 51-66.

Yarnall, L. and Ostrander, J., in press, The assessment of 21st-century skills in community college career and technician education programs. In C. Secolsky and D. B. Dennison (Eds.), Measurement, Assessment, and Evaluation in Higher Education (New York, NY: Routledge).
