Diffusion of Engineering Education Innovations: A Survey of Awareness and Adoption Rates in U.S. Engineering Departments

MAURA BORREGO, JEFFREY E. FROYD(a), AND T. SIMIN HALL
Virginia Tech; (a) Texas A&M University

BACKGROUND
Despite decades of effort focused on improvement of engineering education, many recent advances have not resulted in systemic change. Diffusion of innovations theory is used to better understand this phenomenon.

PURPOSE (HYPOTHESIS)
Research questions include: How widespread is awareness and adoption of established engineering education innovations? Are there differences by discipline or institutional type? How do engineering department chairs find out about engineering education innovations? What factors do engineering department chairs cite as important in adoption decisions?

DESIGN/METHOD
U.S. engineering department chairs were surveyed regarding their awareness and department use of seven engineering education innovations. One hundred ninety-seven usable responses are presented primarily as categorical data with chi-square tests where relevant.

RESULTS
Overall, the awareness rate was 82 percent, while the adoption rate was 47 percent. Eighty-two percent of engineering departments employ student-active pedagogies (the highest). Mechanical and civil engineering had the highest rates, in part due to many design-related innovations in the survey. Few differences by institution type were evident. In the past, word of mouth and presentations were far more effective than publications in alerting department chairs to the innovations. Department chairs cited financial resources, faculty time and attitudes, and student satisfaction and learning as major considerations in adoption decisions.

CONCLUSIONS
The importance of disciplinary networks was evident during survey administration and in the results. Specific recommendations are offered to employ these networks and the engineering professional societies for future engineering education improvement efforts.

KEYWORDS
change, diffusion of innovations, faculty development

I. INTRODUCTION

Over the past two decades, tremendous effort has been invested in improving engineering education, producing advances such as student-centered pedagogies, the introduction of design and other engineering concepts and experiences earlier in the curriculum, better understanding of the role of assessment, and new ideas on how to recruit, retain, and graduate underrepresented groups. Sadly, "these changes…have not resulted in major systemic change within engineering education" (National Science Foundation, 2008). It is clear that propagating these innovations from the institutions at which they were developed to more widespread use requires additional attention (Brainard, 2007; Ertmer, 1999; George and Horne, 1998; NSF, 1996). A number of reports call for "collective action" to connect and build upon STEM education initiatives (Project Kaleidoscope, 2002), specifically to support rapid dissemination of successful educational innovations (Fox and Hackerman, 2003). Among organized efforts to disseminate engineering education findings are the National Science Digital Library (Zia, 2001), Peer Reviewed Research Offering Validation of Effective and Innovative Teaching (PR2OVE-IT) (Lovitts and Fortenberry, 2006), long-running workshop programs such as the National Effective Teaching Institute (NETI) (Felder and Brent, 2010), and The Science and Engineering Education Scholars Program (Matsumoto et al., 1998).

Experienced STEM education researchers know that disseminating their innovations, together with compelling assessment results, is necessary—but not sufficient—to stimulate faculty to change their teaching practices (Foertsch et al., 1997; Silverthorn, Thorn, and Svinicki, 2006). Within U.S. engineering education, this realization was translated into a call for more rigorous—and therefore more convincing—research (Borrego, 2007; Gabriele, 2005). However, widespread change, recommended in such widely cited works as Engineer of 2020 (National Academy of Engineering, 2004, 2005) and How People Learn (Bransford, Brown, and Cocking, 2000), requires more than compelling results, as evidenced by the experience of the U.S. National Science Foundation-funded engineering education coalitions. Over time, coalition leaders learned that focus on the innovation itself must be supplemented by gaining feedback, adjusting the innovation, and providing ongoing support to faculty members (Clark et al., 2004; Froyd, 2001; Froyd, Penberthy, and Watson, 2000; Malave and Watson, 1996). With engineering education innovations, then, we must be careful to avoid over-reliance on the quality of the innovation and
assessment results alone to compel engineering faculty to change their practices. According to Harvard professor Howard Gardner, research is just one of seven levers for Changing Minds (2004). Engineering education leaders are painfully aware that the reward system values research over teaching, learning, advising, and career development (Fairweather, 1993; Milem, Berger, and Dey, 2000; Soyster, 2008).

The purpose of this study is to understand and make recommendations to promote adoption of engineering education innovations with demonstrated value. The research questions which guided our study are:
• How widespread is awareness of established engineering education innovations? Are there differences by discipline or institutional type?
• How widespread is adoption of established engineering education innovations? Are there differences by discipline or institutional type?
• How do engineering department chairs find out about engineering education innovations?
• How do we expect adoption levels to change, if at all, in the future?
• What factors do engineering department chairs cite as important in adoption decisions?
• Which colleges or universities are considered leaders or innovators in engineering education?

We begin to address these questions using a national survey of U.S. engineering department chairs. These results provide important empirical data to quantify awareness and adoption levels across a range of innovations, disciplines, and institution types. By definition, the awareness level will be equal to or greater than the adoption level for a given innovation. But the gaps between these two measures and the relative gaps for various innovations will provide some insight as to why adoption levels are not higher. The absolute values will help identify whether "the problem" lies at the awareness stage or later in the adoption process. Thus, our results describe current awareness and adoption levels to help review prior efforts at diffusing engineering education innovations and provide deeper understanding of dissemination mechanisms leading to specific recommendations for the continued improvement of engineering education.

II. LITERATURE REVIEW

Because this research is informed by two different bodies of literature, diffusion of innovations in engineering education and the engineering education innovations themselves, the literature review has two parts.

A. Part 1: Research on Diffusion of Innovations in Engineering and Higher Education
A number of perspectives and frameworks have been applied in studying and promoting change in undergraduate STEM instructional practices. Henderson et al. (2008a, 2008b, 2010) systematically analyzed nearly 400 journal articles published in 176 journals across a range of fields to identify approaches to change. First, they identified four categories of change strategies; then, they noted a pattern of authors from particular disciplines favoring one approach over the others. Higher education researchers tended to focus on the culture and policies of
organizations, e.g., valuing teaching in promotion and tenure. Faculty developers frequently sought to support educators in becoming reflective and systematic in their teaching. STEM faculty members sought to "disseminate (and, perhaps develop or compile) a set of 'best practices' instructional strategies or materials" (Henderson et al., 2008a, p. 133). A much smaller proportion of articles, written primarily by administrators, focused on developing a shared vision for an organization, such as an academic department, that values teaching (Henderson et al., 2008a). While the engineering education literature includes some faculty development studies (e.g., McKenna, Yalvac, and Light, 2009), the primary change strategy has been to develop and disseminate pedagogies and curricula. Thus, among the candidate theoretical perspectives to guide this study, the broad interdisciplinary research area referred to as 'diffusion of innovations' (Rogers, 2003) is most relevant.

A diffusion of innovations perspective emphasizes characteristics of the innovators, the innovation(s), the potential adopters, and their context to understand and predict levels and rates of adoption. Steps, phases, and trends over time are also frequently part of this type of study. According to Rogers (2003), the adoption process occurs in five stages:
1. Awareness—Awareness of the innovation, but lacking complete information about it.
2. Interest—Growing interest and information seeking.
3. Evaluation—Decision whether or not to try the innovation based on present and future situation (process may end here if negative decision).
4. Trial—Making use of the innovation. (If use does not continue, this is called "reneging" on adoption.)
5. Adoption—Continued full use of the innovation.

The implications of this stage model for engineering education are that different types of interaction and information are needed at each stage to encourage adoption. To assist in making recommendations for change agents, we measure both awareness and adoption levels in our survey. Rogers also identifies five categories of independent variables which influence the rate or level of adoption of a given innovation (which is the dependent variable):
1. Perceived attributes of the innovation
2. Type of innovation-decision
3. Communication channels
4. Nature of the social system
5. Extent of change agents' promotion efforts

In the following, we focus on the meaning of these variables in an engineering education context; those variables that are not considered applicable (or at least do not vary within the engineering education context) will not be mentioned further. For example, widespread adoption is more likely the greater the extent of change agents' promotion efforts. All of the innovations in our study have evidence of some effort, but quantification of this effort is beyond the scope of our analysis. We return to this topic in our discussion and recommendations.

Within perceived attributes of the innovation, there are two subcategories of direct relevance. Compatibility refers to consistency of the innovation with values, experiences, and needs of the potential adopter; the value of teaching innovation or improvement is most relevant across engineering education. Between engineering disciplines, there may also be variations in the value or definitions of
certain skills, e.g., design, which would impact adoption of a specific innovation. Second, complexity refers to the perceived (or actual) difficulty of adopting the innovation. Some innovations in our study can be adopted by a single faculty member, while others require coordination across departments and other academic units. Rogers explains that the higher the perceived complexity of an innovation, the lower its rate of adoption. To explore these factors, we included survey questions on seven innovations of varying complexity.

The type of adoption decision can be optional, collective, or authority. Many engineering education innovation adoption decisions are optional among faculty members, particularly those that take place in one course in one department. Adoption of more complex innovations, in this case those requiring coordination across academic units, may need to be a combination of collective and authority decisions. Authority decisions can be made more rapidly than collective decisions, but may be undermined in actual implementation, while collective decisions may lead to embodiments of an innovation that may be sustained. Again, these factors will be considered in survey questions on seven different innovations.

Communication channels can be mass media or interpersonal. In engineering education, mass media includes journal articles, conference publications, and professional society publications such as ASEE's Prism. These are contrasted with interpersonal channels, such as having an informal conversation with someone describing his or her positive experience with an engineering education innovation. Rogers explains that mass media channels are more important at the awareness stage, while interpersonal channels are critical at the evaluation stage. Thus, the survey was intended to gather information about both awareness and adoption levels to inform recommendations about strategies to promote awareness and adoption.

Nature of the social system refers to characteristics of the relationships between change agents, opinion leaders, and potential adopters. In engineering education, engineering schools and/or influential individuals may act as opinion leaders who influence the adoption decisions of others. Rogers explains:

…the heart of the diffusion process is the modeling and imitation by potential adopters of their near peers' experiences with the new idea. In deciding whether or not to adopt an innovation, individuals depend mainly on the communicated experience of others much like themselves who have already adopted a new idea. These subjective evaluations of an innovation flow mainly through interpersonal networks (Rogers, 2003, pp. 330–331).

Thus, interpersonal networks play a key role in diffusion of innovations. One important factor in network communication is similarities between members of a network. Heterophilous networks are more likely to have an influx of novel ideas, but homophilous networks have better communication: "When two individuals share common meanings, beliefs, and mutual understandings, communication between them is more likely to be effective" (Rogers, 2003, p. 306). This explains, for example, why engineering educators, like those in other disciplines, tend to discount the results of research studies which did not include engineering students; they desire information on settings similar to their own (Lattuca and Stark, 1995; Wankat et al., 2002). Foertsch et al.
(1997) describe common understandings between individuals as a necessary, but insufficient, condition for change in chemistry education.
Networks in higher education are complex, but likely to be influenced by specific institution, institutional type, and discipline. In specific disciplines (e.g., civil engineering, computer science), networks are a strong factor in professional success, and therefore much effort is expended in building and maintaining them (Crane, 1972; de Solla Price and Beaver, 1966; Lemaine et al., 1976). Thus, our survey compares the responses by institutional type and by engineering discipline. Additionally, we ask respondents to list specific institutions they view as opinion leaders.

However, there are also limitations to a diffusion of innovations perspective. Adapting the innovation to the local environment is taken into account to some extent, but by and large, faculty members are treated as though they do not have the agency, creativity, or skill to develop or adapt their own teaching innovations (even though the innovators themselves are likely focusing on time constraints when they package curricula for easy dissemination). The more an innovation is changed, adapted, and even developed independently in different locations, the more challenging it is to study using a diffusion of innovations framework. The framework acknowledges that this is a good sign of widespread adoption, but the innovation also becomes more challenging to define, delineate, or trace. Indeed, as an innovation becomes more widely adopted, the term "innovation" itself can become problematic.

The preceding point also suggests another challenge in applying the diffusion of innovations framework. Adoption of an innovation can be apparent, e.g., has a household purchased an HDTV? However, in the case of each of the innovations selected for this paper, discerning individuals might disagree about whether a faculty member, department, or college has actually adopted an innovation. A department chair might state that a faculty member is applying student-active pedagogies, while the faculty member, peer faculty members, and experts in student-active pedagogies may or may not agree. Similar differences of opinion could arise in the case of the other six innovations. For innovations over which individuals might have different opinions about whether an agent has adopted them, it would be helpful to have multiple perspectives. As is the case with this study, resource limitations may constrain the diversity of perspectives that can be solicited, and researchers must decide which points of view will be sought. For this study, a limited set of viewpoints was requested, and the rationale for the decision is included in the methodology section.

B. Part 2: Innovations in Engineering Education
We based our selection of innovations for this study on four criteria:
1. The innovations should be adopted broadly enough and sufficiently documented in the literature that an adoption decision can be informed by applications at diverse institutions and several years of experience.
2. The innovations should not have been adopted almost universally, so that the study can provide some information on the process of adoption across the engineering education community.
3. The innovations have been shown to have some positive influence on student learning and/or retention.
4. Each innovation can be distinguished from the other innovations so that respondents are not confused. We acknowledge some overlap, but it is somewhat mitigated by the way the innovations are labeled and discussed in the literature.

Table 1. The seven engineering education innovations, descriptions, and order used in the final version of the survey.

The following sections present concise overviews of the innovations to demonstrate that they satisfy these criteria. In this section, they are presented in chronological order based on the earliest published descriptions we could find. Table 1 summarizes the innovations and definitions in the order used in the survey.

1) Student-Active Pedagogies: The earliest innovation in our study is student-active pedagogies (Bonwell and Eison, 1991; Froyd, 2008; Prince, 2004). At its core, an instructor using a student-active pedagogy designs the class meeting so that students are routinely involved in tasks other than taking notes from a lecture. Lecture activities are an integral part of student-active pedagogies, but so are other activities such as think-pair-share (Lyman, 1981) and jigsaws (Doymus, 2008), in which students become experts in a specific topic and then teach each other in small groups, working on problems in problem-based learning, or formulating and addressing questions in inquiry-based learning. (We note that problem- or project-based learning (PBL) is seeing increased application in the United States; however, lack of a common definition suggested that creating a separate category for this innovation at this time might be problematic. Most PBL approaches would be captured in this category by our broad definition.) Non-lecture activities can be as short as 1–3 minutes or as long as entire class periods. Concerns that faculty members have about adopting student-active pedagogies have been well documented. These include sufficient content coverage (Cooper, 1995; Cooper et al., 2000); loss of control, especially in large enrollment
courses (Cooper, 1995; Cooper et al., 2000); student resistance (Cooper et al., 2000; Felder and Brent, 1996; Keeney-Kennicutt, Gunersel, and Simpson, 2008); and students who depend on other students in the group (Cooper, 1995). Applications of student-active pedagogies have been sufficiently documented in the literature, as evidenced by meta-analyses (Dochy, Van den Bossche, and Gijbels, 2003; Johnson, Johnson, and Smith, 1998; Prince, 2004; Springer, Stanne, and Donovan, 1999). Numerous studies, including the meta-analyses cited above, document positive influence on student learning (Armstrong, Chang, and Brickman, 2007; Lewis and Lewis, 2005; Strobel and Barneveld, 2009). One characteristic that may impact adoption is the relatively low complexity; an individual faculty member can adopt this innovation through an optional decision. However, Dancy and Henderson found that faculty decisions are also influenced by peer support, department climate, and institutional structures and policies (Henderson and Dancy, 2007).

2) Engineering Learning Communities and Integrated Curricula: Engineering student learning communities intend to help students concurrently build social and conceptual connections. Students build social connections in academic contexts because they take multiple courses with roughly the same set of peers. Students build conceptual connections because faculty members teaching the multiple courses construct mechanisms to help students link concepts in one course to those in another. These learning communities, or integrated curricula, take various forms (linked courses, freshman interest groups,
clustered or federated courses, integrated curricula, and coordinated studies), all intended to help students connect with faculty as well as with each other (Froyd and Ohland, 2005; Gabelnick et al., 1990; Tinto, 1998). Froyd and Ohland (2005) reviewed integrated first-year engineering curricula and found implementations of engineering learning community programs at 14 of approximately 350 engineering institutions. Engineering learning communities have demonstrated that they (a) enhance success in the first-year courses, (b) address professional skills, (c) improve student retention, and (d) promote diversity (Froyd and Ohland, 2005). Because implementing learning communities requires registration coordination and may compel instructors to communicate with each other, implementation logistics are complex and by nature span different departments, frequently beyond engineering. As a result, they require the active support of administrators, as well as consensus across departments, if a large percentage of an institution's engineering enrollment is to participate.

3) Artifact Dissection: Artifact dissection refers to classroom or laboratory activities in which students dissect a common product currently in use (e.g., a carburetor, sewing machine, or bicycle) to explore function and design. Students usually prepare a final report describing the design and may describe historical or social aspects influencing the product design (Agogino, Sheppard, and Oladipupo, 1992). Agogino et al. cite Sheppard's conference paper (1992) as the first describing this learning activity. At least three universities involved in the Synthesis Coalition are documented to have used artifact dissection (Agogino, Sheppard, and Oladipupo, 1992), while other publications describe these activities at seven others (Ambrose and Amon, 1997; Barr et al., 2000; Carlson et al., 1997; Demetry and Groccia, 1997; Lamancusa et al., 1997; Lamancusa et al., 1996; Michelson, Jenison, and Swanson, 1995; Ollis, 2004). In most cases, artifact dissection is employed in mechanical engineering courses, but there are also a few instances of materials and chemical engineering dissection experiences. Like student-active pedagogies, this innovation might be adopted by a single instructor or by a group responsible for first-year engineering courses.

4) Summer Bridge Programs: Summer bridge programs in engineering are typically residential programs for incoming (freshman) students to assist with the transition to college and engineering (Kezar, 2001). They are often run by an engineering diversity programs office (but not by all of these offices); however, they include a significant academic component which requires participation from faculty (Nave et al., 2006) and administrators. This level of coordination, or nature of the social system, has implications for adoption. One conference paper (Matanin et al., 2007) provides significant detail on how one of these programs is organized in engineering. During five weeks in the summer before their freshman year, admitted students live on campus and attend short courses in chemistry with lab, math, and engineering problem-solving "designed to be similar to those taught in the fall semester," in which students are "treated as if they were enrolled as freshmen," complete with tests and grades (p. 2). Additional activities for social development took place in class, the dormitories, and on weekends.
Historically targeted to underrepresented students, many of these programs are now offered to all incoming students to comply with federal and state laws. Surprisingly little is written about summer bridge programs (Kezar, 2001), but their design is based upon sound research into the major barriers to success of first-generation college students
(Terenzini et al., 1996). Most common are individual programs presenting their assessment data without comparison to a control group (Kezar, 2001), e.g., (Matanin et al., 2007; Nave et al., 2006). Only a few institutions have reported on program design and assessment results in recent years (Matanin et al., 2007; Nave et al., 2006).

5) Design Projects in First-Year Engineering Courses: A major initiative and legacy of the NSF-funded engineering education coalitions was bringing engineering to the first year, especially in the form of team engineering design projects (Dally and Zhang, 1993; Froyd, 2005; Richardson et al., 1998; Sheppard and Jenison, 1997a, 1997b). In his summary of the coalitions program, Froyd (2005) identifies variations on design and first-year curricula as major contributions for three of the six coalitions. For example, "Most of the partner institution[s] in the ECSEL coalition developed and subsequently institutionalized a first-year engineering course that emphasized engineering design as a process and enabled student teams to engineer meaningful prototypes." Conference and journal publications throughout the 1990s emphasized teamwork, the design process, and institutionalization, with continued interest in describing new types of first-year design projects and configurations, e.g., King and Richter (2007). Bazylak and Wild (2007), in their review of best practices of first-year engineering design in Canada and the United States, identified six different instructional methods in common use: full-scale projects, small-scale projects, case-study analysis, reverse engineering (which is related to artifact dissection), design tools and methods instruction, and integration. A survey of first-year engineering programs by Brannan and Wankat (2005) identified 19 percent of responding institutions (n = 68) with first-year design courses and 45 percent integrating design with other topics. Marra, Palmer, and Litzinger (2000) showed that first-year engineering design courses had a positive influence on the intellectual development of students compared with students who did not participate in these courses. Knight, Carlson, and Sullivan (2007) showed "significant retention gains" for the groups of students that participated in their "first-year, project-based curriculum" when compared to similar groups of students who did not participate in the curriculum. Other studies of first-year engineering courses also reported improvements in retention (Picket-May and Avery, 2001; Pomalaza-Raez and Groff, 2003; Razzaq, 2003). As Bazylak and Wild indicated, first-year engineering design courses may incorporate reverse engineering projects. Also, first-year engineering learning communities or integrated curricula often include a first-year engineering design course. Despite these overlaps, first-year engineering design courses can be distinguished from the other six innovations. Finally, first-year engineering courses may either be offered by one or more departments or coordinated as a common experience for all engineering majors. Therefore, they may require an administrative adoption decision for institutionalized implementation.

6) Curriculum-based Engineering Service-Learning Projects: Oakes (2009) defines service learning as

…the intentional integration of service experiences into academic courses to enhance the learning of the core content and to give students broader learning opportunities about themselves and society at large.
It is not an add-on volunteer activity but rather an integrated learning experience that creates curricular efficiency.

Prominent examples of service learning in engineering curricula include EPICS (Coyle, Jamieson, and Oakes, 2006), the SLICE program at the University of Massachusetts, Lowell (Duffy et al., 2009), and Engineers Without Borders (Amadei, 2003). Under the EPICS program, for example, teams of undergraduates partner with local not-for-profit community organizations to define, design, build, test, deploy, and support engineering-centered projects that significantly improve the organization's ability to serve the community. In a national longitudinal study of 8,474 students, Astin et al. (2006) found that service learning had positive effects on nine of 13 measured outcomes, including civic leadership, charitable giving, and overall political engagement. Duffy et al. (2009) report positive influences of service learning on (a) recruitment in engineering, especially among Hispanic students, (b) student self-reports of their motivation, (c) subject matter comprehension, and (d) student self-reports of their abilities in teamwork and communication. Jiusto and DiBiasio (2006) report positive effects on lifelong learning in the Global Studies Program at Worcester Polytechnic Institute. While an individual faculty member can decide to implement service learning in a course, additional coordination is still required to develop relationships with clients.

7) Interdisciplinary Capstone Design Projects: Interdisciplinary capstone design project courses are capstone courses in which student project teams are formed among students from multiple disciplines. In a 1994 survey, Todd et al. (1995) found that 21 percent (n = 71) of the responding departments "reported that students are organized onto inter-departmental project teams." Most involved collaboration between engineering departments, but some included business, social sciences, and other departments. More recently, Richter and Paretti (2009) conducted an extensive review of engineering education literature. They found:

In the past 8 years, for instance, the International Journal of Engineering Education has published 17 articles describing interdisciplinary projects and courses. Another dozen have appeared in the European Journal of Engineering Education, particularly in relation to sustainability issues. A search of the American Society for Engineering Education (ASEE) Conference Proceedings from 2003 to 2006 returned 624 citations on "multidisciplinary" and 834 on "interdisciplinary" by authors from around the world. An analysis of papers from the 2007 conference illustrates the scope of current efforts. …Eighty-six papers from a broad spectrum of conference divisions were identified as addressing interdisciplinary themes (p. 31).

Ollis (2004) described the multidisciplinary design courses that had been established at six of the institutions in the SUCCEED coalition. Institutions in the Manufacturing Engineering Education Partnership (Pennsylvania State University, the University of Puerto Rico-Mayagüez, and the University of Washington) had interdisciplinary, inter-institutional student teams working on "open-ended hardware-oriented projects provided by industry" (Lamancusa et al., 1995). Assessment of interdisciplinary learning, including identifying specific learning outcomes, remains a challenge (Shuman, Besterfield-Sacre, and McGourty, 2005), but new research is seeking to identify and measure interdisciplinary student learning outcomes (Mansilla and Duraisingh, 2007; Borrego et al.,
2009; McNair, Paretti, and Kakar, 2008; Richter and Paretti, 2009). The motivation for implementing new interdisciplinary design experiences remains grounded in global and societal forces changing the nature of engineering (as described by the Committee on Facilitating Interdisciplinary Research, 2005; Friedman, 2005; National Academy of Engineering, 2005). However, the major barriers to widespread adoption are reaching more than a small percentage of students, collaboration across departments, balancing faculty teaching loads, and financial resources; Ollis describes these and potential solutions in the context of interdisciplinary engineering design experiences (2004).

III. RESEARCH METHOD

A. Methodology
We employed a primarily quantitative research design using a Web survey constructed by the research team. Most survey items were multiple choice or otherwise quantitative, but open-response items were also included to allow clarification. In mixed methods terminology, we followed an embedded (Creswell and Plano Clark, 2007) or concurrent nested (Creswell et al., 2003) research design, in which the data were analyzed separately according to their respective traditions (quantitative or qualitative) and mixed during the analysis phase. The quantitative data were analyzed using descriptive statistics and Chi-squared tests. The qualitative responses were categorized using thematic analysis (Boyatzis, 1998) to understand influences on respondents' awareness and adoption of engineering education innovations. The research questions were answered using both sets of data, and are combined in our Discussion.

B. Population and Pilot Tests
All research procedures were approved by Virginia Tech's Institutional Review Board (IRB #08-398). Personally identifying information, such as that collected for raffle purposes, was removed prior to data analysis. The population for this survey was all engineering department chairs at U.S. institutions; however, as described below, identifying this population and encouraging its members to respond was not a trivial task. Retrospective diffusion studies of this type often focus on identified "opinion leaders"—those who tend to adopt the studied innovation(s) before others. This was not our intention in targeting department chairs. From many iterations of proposing this type of study, it was clear that peer reviewers perceived department chairs to be the optimal target group for dissemination activities. Department chairs are important allies in supporting changes (innovation, adoption) in engineering education because they control material resources and influence the culture in an engineering department. In many other settings, department chairs serve as spokespeople or representatives for what is happening in their departments. The chairs themselves are often not the adopters, nor are they opinion leaders; however, since many of the innovations require their support, they were selected as the best informants regarding whether engineering education innovations are adopted widely. Focusing on department chairs rather than faculty members also reduces (but does not completely remove) the risk of self-selection bias among respondents. We conducted two pilot tests to determine strategies to maximize the clarity and response rate of our survey. We note here and elsewhere that compiling an accurate and inclusive list of all U.S. engineering
department chairs and encouraging them to respond was a significant challenge, which is indicative of the broader challenges of improving engineering education. We conducted two pilot tests: one with former department chairs at one institution and a second with the current department chairs at three institutions. For the first pilot, we sent e-mails to former department chairs and asked if they were willing to participate. Those who agreed were sent an e-mail invitation with the survey URL. Four department chairs and one assistant department chair in charge of undergraduate curriculum participated in the first pilot test. Four provided face-to-face feedback as well as written comments on aspects of the survey comprising the e-mail invitation, descriptions of the innovations, and survey items. We used their comments to revise the survey and invitation procedures for the second pilot.

The second pilot test included more authentic invitation and raffle procedures. The participants in the second pilot test were all engineering department chairs from two research institutions and one baccalaureate institution, 32 individuals total. In an effort to demonstrate the national salience of the survey and to encourage a higher response rate by associating the survey with a nationally respected organization, an e-mail introducing the survey was sent by the director of the Center for the Advancement of Scholarship on Engineering Education at the National Academy of Engineering. This was followed a few days later by our e-mail invitation including the survey URL. These were followed by three additional reminders. Each message referenced a $100 raffle prize designed to increase the response rate. Six people responded, resulting in a response rate of 19 percent. We used the feedback from both pilot surveys to further refine the questions and limit repetitiveness, reducing the number of questions from 109 to 81. Respondents to both pilot surveys indicated that although they care about engineering education, they are unlikely to respond unless asked by their dean or a professional contact. In other words, the names of the investigators, the National Science Foundation, or the National Academy of Engineering would not be sufficient to ensure a high response rate.

C. Survey Instrument Development
1) Seven Engineering Education Innovations: The literature review describes our selection criteria and explains why each innovation meets these criteria. Table 1 summarizes the innovations and the brief descriptions that were used in the survey. Originally, the research team included Engineering Diversity Programs (e.g., Women in Engineering) as one of the innovations to be studied; however, during pilot testing, we eliminated it from our list, since responses indicated that virtually all department chairs had known about these programs for over five years and did not consider them to be an innovation. While this is a sign of positive change over time, we removed this innovation to shorten the survey and because we did not expect to observe much variation in the data. Table 1 lists the innovations in the order they were presented in the survey, while the remaining sections of this paper present them in order of decreasing adoption level.

2) Survey Instrument: The research team, with expertise in engineering education, sociology, education, and statistics, developed the survey of 81 questions. The survey was organized into seven sections (one for each innovation in Table 1) followed by
general items. In each section we asked multiple-choice as well as open-ended questions about awareness and adoption of each innovation, and about the sources from which respondents obtained information about these innovations. All respondents were asked about the first five innovations in Table 1. Only respondents who influenced decisions about first-year courses (identified by a filtering question embedded in the survey) were asked about Learning Communities/Integrated Curricula and Design Projects in First-year Engineering Courses. Innovation-related questions are listed in Table 2. Question 1 was used primarily to determine awareness, and question 2 provided supplementary awareness data. Depending on the innovation, we used between one and three questions to determine adoption levels. For all of the innovations, we asked question 3, worded as "students" or "faculty," depending on the nature of the innovation. Question 5 (percentage of students) was also asked for Design Projects in First-year Engineering Courses and Interdisciplinary Capstone Design Projects. Question 6 (percentage of faculty) was also asked for Design Projects in First-year Engineering Courses, Interdisciplinary Capstone Design Projects, and Student-active Pedagogies. Note that questions 5 and 6 are in multiple-choice format with ranges of percentages. SNAP© assigned a code ranging from one to five to these response categories. We calculated an average response percentage by first removing the non-responses from the data, then finding the mean code of the coded responses, subtracting this mean from five (the highest code), multiplying the result by 20 (the categories are graduated by 20), and adding 10 to get the average percentage participation (a worked sketch of this conversion appears below, after Table 2). For example, 33 responses with a mean code of 3.5 corresponded to 40 percent faculty or student participation. Finally, regardless of adoption status, all respondents were asked in question 7 about the likelihood of offering each innovation in the future. In each of the seven innovation sections, two open-ended items (questions 8 and 9) elicited comments on factors that would impact department plans for using the innovation in the future and general comments or clarifications about the innovation. General questions followed the seven sections and included the name of the institution, the Carnegie classification of the institution, disciplinary degrees offered by the department, current position, length of time in that position, and the names of institutions respondents looked to for innovative engineering education practices. Lastly, we asked the respondents to provide comments about the financial or personnel resources that could influence the adoption of these education innovations.

Table 2. A subset of survey questions. With exceptions noted through annotations, these survey questions were repeated for each of the innovations.
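To make the coded-response conversion concrete, the following is a minimal sketch in Python. The function name and the sample codes are ours for illustration only; they are not part of the survey instrument or of the SNAP software. The sketch assumes, as described above, that each response was coded 1 through 5 and that non-responses are marked as None.

def average_percentage(codes):
    """Convert coded percentage-range responses (1-5) to an average
    participation percentage, following the procedure in the text:
    drop non-responses, take the mean code, subtract it from 5
    (the highest code), multiply by 20, and add 10."""
    valid = [c for c in codes if c is not None]  # remove non-responses
    mean_code = sum(valid) / len(valid)
    return (5 - mean_code) * 20 + 10

# A mean code of 3.5 corresponds to 40 percent participation,
# matching the example in the text.
print(average_percentage([2, 3, 4, 5, None]))  # -> 40.0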
D. Data Collection
Based on our pilot experience, to increase the response rate for the national survey, we approached engineering professional societies directly and inquired about using their e-mail lists. Unfamiliar with our reputations and our research, society personnel were understandably reluctant. We then approached the engineering department chairs at one of our own institutions (Virginia Tech) to assist in distributing the survey through the professional societies. For some societies, for example, anyone on the department chair listserv could post to the list. In addition, we offered raffle incentives (ten $500 and ten $100 cash prizes) to increase the response rate.

The following professional societies forwarded survey announcements to their department chair e-mail lists: American
Society of Mechanical Engineers (ASME), American Society of Civil Engineers (ASCE), Institute of Industrial Engineers (IIE), American Institute of Aeronautics and Astronautics (AIAA), and the Electrical and Computer Engineering Department Heads Association. Whether or not department chairs are active members of these professional societies, they are recognized as important networks for a range of professional, research, and educational information, so we have every reason to expect that these are the most complete cross-institution contact lists of U.S. department chairs in these disciplines. Smaller disciplines did not have department chair e-mail lists organized by their professional societies. The first-year engineering department chair at Virginia Tech sent the invitation e-mail with the survey link directly to similar program chairs. Computer and software engineering, chemical engineering, and materials science and engineering department chairs at Virginia Tech provided names of department chairs at institutions across the U.S. Other disciplines, such as biomedical engineering, were not included in the study because undergraduate programs, and thus department chairs, were not available at our institution to provide access to these networks. We used SNAP© software and purchased a one-year subscription to its Web host service. We used this software to send individualized invitations to the department chairs in these three disciplines and followed up with one additional e-mail reminder to non-respondents. Of the 257 total responses (overall response rate = 16 percent), 197 resulted in usable data (response rate = 12 percent). This response rate is consistent with a precipitous decline in electronic survey response rates since the mid-1980s (Sheehan, 2001). The average length of time respondents had served in their current department chair position was approximately 2.5 years. Table 3 shows the response rate from each discipline for the 197 respondents. The three smallest disciplines were combined to allow for statistical analysis by discipline. The total possible respondents are reported in column 2. The entries for chemical engineering, computer science and software engineering, first-year engineering, and materials science and engineering were obtained from the list provided to us
by the department chairs at Virginia Tech. The total possible participants for the rest of the disciplines were approximate numbers provided by the respective societies and/or department chairs at Virginia Tech. Since these values are estimates, it would be inappropriate to statistically analyze representativeness across engineering disciplines.

Table 3. Survey response rates by discipline.

Of the 197 respondents, 72 percent (n = 143) were at extensive/intensive research institutions, 13 percent (n = 25) at masters, and 15 percent (n = 29) at others (Carnegie Foundation for the Advancement of Teaching, 2000). The breakdown of 376 U.S. accredited engineering programs from ABET.org (accessed July 2009), based on the year 2000 Carnegie classification of the institution, is 52 percent (n = 196) extensive/intensive research, 34 percent (n = 126) masters, and 14 percent (n = 54) baccalaureate and others. Therefore, research institutions were overrepresented and masters institutions were underrepresented in our study. The sample in our survey is not a random sample; therefore, all inferences apply to circumstances that match the characteristics of the respondents in this study.

E. Limitations
Finally, we acknowledge a number of limitations to the methods employed. Self-selection bias is embedded in any survey of educational innovation and/or teaching, as individuals and organizations that identify themselves as innovative or dedicated to quality teaching are more likely to respond. Similarly, this survey relies on self-reports of adoption decisions, including in some cases estimates of numbers of faculty members and students involved. Some of the open-ended comments indicate that although we provided definitions of the innovations, some respondents made assumptions or associations with which we would not agree, e.g., that active learning requires expensive software. The innovations are not mutually exclusive, for example, because artifact dissection is a special case of student-active pedagogy. Respondents may have interpreted the innovations differently than intended, and they might have overestimated adoption levels if they
perceived the innovations to be socially desirable. Henderson and Dancy (2005, 2009) surveyed introductory physics instructors who claimed to have adopted specific active learning innovations, but found that few actually included all the behaviors defined by the developers, sometimes limiting the active engagement of students. Defining adoption in most cases to mean that at least one faculty member or course employs the innovation privileges larger departments, which tend to be at larger (doctoral) institutions. Depending on the size of the departments and the distribution of administrative responsibilities, department chairs may not be fully aware of all educational initiatives. For example, chairs are much more likely to be aware of innovations that require them to coordinate with other departments than they are to be aware of how many of their faculty members currently practice active learning techniques. Finally, the mean time in position is 2.5 years, which limits historical perspective on whether the innovations were ever tried in the past. Taken together, these limitations make the results for student-active pedagogies the least valid of all the innovations. However, through the peer review process, we learned that many reviewers felt that this class of innovations is significant (perhaps the single most important means of improving engineering education), and thus we felt compelled to include it in this survey.

The process of identifying all U.S. department chairs and encouraging them to respond also proved challenging. ASEE maintains address lists of engineering department chairs, which we purchased for use in the study. In the process of identifying e-mail addresses for the labels, we realized that the list, compiled primarily for marketing purposes, was incomplete and inaccurate. Professional societies would not give free access to their lists; most would only forward an announcement to protect their community from spurious contacts. While these procedures sacrificed complete identification of the population, response rate, and representativeness of certain disciplines, we felt they were absolutely necessary to elicit as many responses as possible.

Finally, we reiterate that a diffusion of innovations framework also bears limitations. While the framework allows for adopters to change the innovation to accommodate their local environment, defining and delineating the innovation is necessary to study it. This framework also assumes that adopters heard about the innovation from somewhere else, rather than developing the idea independently.

IV. RESULTS

A. How Widespread is Awareness of Established Engineering Education Innovations? Are There Differences by Discipline or Institutional Type?
Table 4 reports awareness levels of each innovation by discipline, as well as totals. Overall, awareness levels are very high (82 percent). Interdisciplinary capstone design and design projects in first-year courses had the highest levels of awareness, while artifact dissection had the lowest. Awareness was highest for civil engineering, mechanical engineering, and other engineering (first-year engineering, industrial engineering, aerospace engineering, and materials science and engineering combined). We performed a Chi-squared test of association between department chairs' awareness and their disciplinary affiliation, controlling for the innovations ("Total" row of Table 4). There is convincing evidence that the differences in awareness across disciplines are statistically significant (p-value = 0.0003). We note that this is most likely due to the wide range of awareness rates for artifact dissection in particular. Figures 1 and 2 include plots of aggregate data.

Table 4. Proportions of respondents from each discipline reporting their awareness of each innovation.

Figure 1. Proportions of respondents' awareness and adoption by innovation. Values are listed in Tables 4 and 6. Numerical values are the ratio of adoption to awareness for each innovation.

Figure 2. Proportions of respondents' awareness and adoption by disciplines. Values are listed in Tables 4 and 6.

Table 5 shows the proportions of respondents' awareness of each innovation by Carnegie classification of the institution (Carnegie Foundation for the Advancement of Teaching, 2000). The highest awareness level corresponds to masters institutions (82 percent), but controlling for innovation type, the differences in awareness rate due to institution type were not statistically significant.

B. How Widespread is Adoption of Established Engineering Education Innovations? Are There Differences by Discipline or Institutional Type?
Table 6 lists adoption levels for each innovation by discipline, including totals. At 47 percent, the overall average proportion of adoption across disciplines and innovations is much lower than awareness, suggesting that most engineering department chairs are at intermediate stages of the adoption process. Adoption level was highest for student-active pedagogies and lowest for artifact dissection. Similar to awareness levels, the highest adoption levels were reported by mechanical engineering and civil engineering department chairs. We performed a Chi-squared test of association between department chairs' adoption responses and their disciplinary affiliation, controlling for the innovations ("Total" row of Table 6). There is convincing evidence that the differences in adoption across disciplines, namely the extremes of mechanical engineering and computer science and software engineering, are statistically significant (p-value = 0.02). Figures 1 and 2 include plots of aggregate data.
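To illustrate the kind of test reported above, the sketch below runs a chi-square test of association between discipline and adoption on a small contingency table. The counts, the three discipline columns, and the use of SciPy are our assumptions for illustration only; they are not the study's data or software, and a simple unstratified test does not reproduce the paper's "controlling for the innovations" analysis.

# A minimal sketch (hypothetical counts) of a chi-square test of
# association between discipline and adoption of an innovation.
from scipy.stats import chi2_contingency

# Rows: adopted / not adopted; columns: three illustrative disciplines.
# These numbers are made up for demonstration, not survey results.
observed = [
    [30, 22, 11],  # departments reporting adoption
    [10, 18, 25],  # departments not reporting adoption
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")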

Table 5. Proportions of respondents reporting their awareness of each innovation by institution type.

Table 6. Proportion of respondents from each discipline reporting their adoption of each innovation.

Table 7. Proportion of respondents reporting their adoption of each innovation by institution type.

Table 7 shows adoption levels of each innovation by Carnegie classification of the institution. Again, the highest level was reported by masters institutions (47 percent), and again, controlling for innovation type, the differences in adoption rate due to institutional type were not statistically significant.

Figure 1 compares awareness and adoption rates for each innovation (last column in Tables 4 and 6). Included in this figure is the calculated ratio of adoption to awareness. The innovation with the highest awareness level is interdisciplinary capstone design projects, perhaps because it is an obvious response to ABET EC2000 criteria. However, the innovation with the highest adoption level is student-active pedagogies. We believe this is because, among the innovations, it requires the least amount of coordination between multiple faculty members—low complexity and an optional innovation-decision, which both predict a high adoption rate. It is also, as evidenced by the literature, the oldest innovation, and as such has been tested in many different disciplines. Across innovations, there are large differences in the ratio of awareness to adoption. For example, nearly all department chairs who have heard of student-active pedagogies report at least one faculty member currently practicing it. On the other hand, while 79 percent of department chairs had heard of service learning, only 23 percent of departments currently offered it. This highlights important differences in the resources and coordination (complexity and type of innovation-decision) required to adopt the various innovations. We explore this observation later.

Figure 2 compares awareness and adoption rates for each discipline (last row in Tables 4 and 6). Across disciplines, adoption scales with awareness. However, the history and nature of the innovations help explain the disciplinary trends. Mechanical engineering department chairs reported the highest adoption rates and second highest awareness rates, followed closely by civil engineering. As demonstrated by the engineering education and design literature cited earlier, artifact dissection has been developed and investigated almost exclusively by scholars who affiliate themselves with design and mechanical engineering professional communities. This innovation appears to have a high level of compatibility with mechanical engineering values and to have taken advantage of this disciplinary network. Similarly, the strong ties between mechanical
Similarly, the strong ties between mechanical engineering and design explain why mechanical engineering department chairs also report the highest adoption rates for interdisciplinary capstone design experiences and design projects in first-year engineering courses. While electrical engineering, chemical engineering, and computer science include design in their curricula, they are less likely to emphasize dissection of physical artifacts or early design experiences for first-year students. In other words, the compatibility of these innovations with the values of these disciplines is low.

As a more accurate measure of adoption, department chairs also estimated the percentage of faculty using student-active pedagogies, design projects in first-year courses, and interdisciplinary capstone projects. The average of these responses is presented in Table 8. (The procedure used to average these values is described in the Research Method section.) Further, the adoption levels of design projects in first-year engineering courses and interdisciplinary capstone design projects were also determined from the average percentage of students using these innovations, as reported by department chairs (Table 9). As expected, there are some differences among these measures of adoption level. For example, student-active pedagogies are adopted by at least one faculty member in 71 percent of departments (Table 7) but are used by only 36 percent of the faculty in those departments (Table 8). Similarly, interdisciplinary capstone design projects are offered by 56 percent of departments (Table 7), but only 45 percent of senior students (Table 9) and 24 percent of faculty in these departments (Table 8) participate.
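The exact procedure used to average the chair-reported percentages is described in the Research Method section and is not reproduced here; the snippet below is only a generic sketch of a weighted average, with headcounts and percentages invented for illustration rather than taken from the survey.

# Hypothetical weighted average of chair-reported faculty percentages.
# Numbers are invented for illustration; this is not the paper's procedure.

def weighted_average(percentages, weights):
    """Return the weight-adjusted mean of the reported percentages."""
    return sum(p * w for p, w in zip(percentages, weights)) / sum(weights)

# Each tuple: (percent of faculty using the innovation, faculty headcount).
responses = [(50, 20), (25, 40), (40, 15)]
pcts, sizes = zip(*responses)

print(f"Unweighted mean: {sum(pcts) / len(pcts):.1f}%")        # 38.3%
print(f"Weighted mean: {weighted_average(pcts, sizes):.1f}%")  # 34.7%

Weighting larger departments more heavily, as in this toy example, shifts the estimate toward the practices of the departments that employ the most faculty.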

Table 8. Weighted average percentage of faculty using each innovation as reported by the department chairs from each discipline.

Table 9. Weighted average percentage of students using each innovation as reported by the department chairs from each discipline.

C. How Do Engineering Department Chairs Find Out About Engineering Education Innovations?

Table 10 summarizes combined responses for how department chairs heard about each of the innovations. The most common method was word of mouth (28 percent), followed closely by presentations on campus or at conferences (not including technical professional societies) (23 percent). These relatively high percentages are in stark contrast to the very low rates of reading about the innovations (at least initially) or hearing about them through technical professional societies. These results indicate that engineering department chairs are not generally engaging with the engineering education literature, or are not recalling it. However, given the relatively high awareness rates, coupled with Rogers' predictions regarding communication channels at various stages of the adoption process, interpersonal interactions are most likely to encourage adoption in the future. These results are therefore consistent with theory and should encourage further use of interpersonal interactions.

D. How Do We Expect Adoption Levels to Change, If at All, in the Future?

These data provide an important snapshot of current awareness and adoption levels, but we also wanted to know whether adoption levels are likely to rise or fall in the near future. We asked about future plans regarding each of the innovations and compared the responses of those who are and are not currently using the innovation. These responses are listed in Table 11, combined across innovations for clarity because there were no substantial differences among them. The only major difference between the two groups was that 60 percent of current adopters were likely to continue, while only 10 percent of non-adopters were likely to adopt in the future. It is disappointing not to see more positive data, but the following section highlights the many considerations in adoption decisions.


Table 10. Respondents' initial information sources.

The current climate of budget cuts, combined with these data, leads us to conclude that adoption rates of these engineering education innovations may fall in the near future. Also of note is the unusually high non-response rate (68 percent) for department chairs not currently using the innovation. For this question in particular, department chairs were not comfortable with, or interested in, commenting on possible engineering education changes in their futures. Their comments, including some that focus on individual faculty control over teaching decisions, provide some insight.

E. What Factors Do Engineering Department Chairs Cite as Important in Adoption Decisions?

Table 11. Combined proportions of current adopters versus non-adopters reporting their future plans for using the innovations.

Through open-ended items for each innovation, department chairs were asked to offer comments on factors that impact their plans for adopting or continuing the innovations in the future. Respondents' comments were categorized using thematic analysis (Boyatzis, 1998). Three major themes emerged: resources, students, and faculty members. Overall, the open responses were similar across innovations. Consistent with our mixed methods design, in which the qualitative data provide supplemental information and are not deep enough to stand alone or to be quantified (Creswell et al., 2003), we do not report actual counts of these responses but rather describe their meanings, relationships, and relative frequencies.

More than half of the comments fell into the category we label as resources, including funding, computers, other educational technologies, classroom and laboratory space, and instructional staff. For example, one department chair listed annual software updates as a prohibitive cost. Laboratory space and safety were cited as barriers to artifact dissection. Another offered, "It is an expensive and time consuming process. It is not clear how to sustain it with respect to financial situations." In many cases, the decision was framed as a cost-benefit analysis (Wejnert, 2002), particularly when the cost to continue an innovation exceeded the benefits. One department chair indicated student-active pedagogies were unlikely to continue because they are "more resource intensive than had been claimed." However, more optimistic comments indicated that progress can be made despite limited resources: "Always we have financial constraints to implement all innovative education. But we have made a lot of progress over the years."

There were also many comments relating to student learning or satisfaction with the innovations as considerations in adoption. One department chair explained, "We have found students learn concepts better with an active 'learning-focused' approach." Another responded, "We find that students enjoy being more involved and they seem to retain the information better." Others offered student-related factors as barriers, such as "Some students like the methods, others not so much." In other words, student resistance was a consideration in some adoption decisions.

Faculty issues were more complex and nuanced (as well as more frequently cited) than student issues. Sometimes faculty members and their time were described as limited resources, while at other times they were described as actors making their own adoption decisions. In both cases, department chairs stressed that adoption of educational innovations is heavily reliant on the participation of faculty members. Faculty time for preparation and management of labor-intensive innovations was mentioned frequently. Others described the culture of engineering higher education.

There were many comments related to faculty resistance to change, marginalization of teaching in promotion and tenure, and skepticism regarding evidence of improved student learning. At least one department chair described increasing familiarity with the innovation as a key strategy: "We do not force faculty to use the active pedagogies, but once used, most stick with it and we all try to use new active learning methods and report back to the rest of the faculty." She or he goes on to assert that poor teaching evaluations during this process should not deter the faculty from using the innovations (implying that in some cases, they might be a disincentive).

A few differences across the innovations are worth noting. For the innovations that by definition involve coordination across departments (interdisciplinary capstone design projects and learning communities or integrated curricula), there were more comments about faculty leadership and coordination and about different grading and requirements for students from different departments. This complexity was acknowledged as a significant challenge, but some positive comments also indicated it was surmountable.

F. Which Colleges or Universities Are Considered Leaders or Innovators in Engineering Education?

Past research suggests that high-status innovators and early adopters can influence the adoption of others (Rogers, 2003). In engineering education, engineering schools and/or influential individuals may act as opinion leaders who influence the adoption decisions of others. To explore this, we asked department chairs to name colleges and universities they look to for innovative engineering education practices. Then, as a proxy for prestige, we compared these to U.S. News & World Report rankings. (These rankings are admittedly flawed, but the main criticism is an overreliance on reputation, which is precisely what we are trying to measure in this case.) Responses are listed in Table 12; 18 institutions were cited five or more times by 76 department chairs.

The relative frequencies with which baccalaureate and master's institutions were cited align well with national rankings as a comparison measure of prestige. This is to be expected, as these institutions focus on undergraduate education. On the other hand, a concept from diffusion of innovations is necessary to interpret the results for doctoral institutions: monomorphic and polymorphic opinion leadership. Polymorphs are opinion leaders in multiple areas, e.g., research and teaching, or research in several engineering disciplines. Some of the opinion leaders in Table 12 are polymorphic (highly cited as opinion leaders and highly ranked), while others are monomorphic (highly cited as engineering education opinion leaders, but with lower rankings, which focus on research). MIT, Stanford, and UC-Berkeley appear at the top of most engineering disciplinary rankings as well as this list, and are therefore polymorphic opinion leaders. On the other hand, Purdue University and, to a lesser extent, Virginia Tech were no doubt cited more frequently than their rankings would predict because of the engineering education departments and Ph.D. programs offered by both institutions; they are more monomorphic opinion leaders.

An additional explanation for the results in Table 12 is disciplinary networks. For a number of institutions, one particular discipline accounts for most of its opinion leader citations.

Table 12. Number of times the following colleges and universities were cited by department chairs for innovative engineering education practices. A few department chairs provided only vague comments, such as "Top undergraduate universities, with or without doctoral program."

In computer science, institutions with top-ranked graduate programs were cited more often than others (Carnegie Mellon, University of Washington, and Georgia Tech). Additionally, we believe the Center for Advancement of Engineering Education increased the visibility of its lead institution, the University of Washington. North Carolina State University was cited many more times than would be expected from its overall rank of 26th. However, the high concentration of citations by chemical engineering department chairs indicates that the work and reputation of Richard Felder account for much of the engineering education reputation of NCSU.


While Stanford University boasts the #2 mechanical engineering department in the U.S., we believe the work of Sheppard and colleagues (cited herein under artifact dissection, coupled with more recent publications) also contributes to the engineering education reputation of this institution. We note that in these two cases, these individuals and their coworkers maintained close ties within their engineering disciplines. Thus, our findings reinforce the diffusion of innovations principles regarding the importance of disciplinary networks and of opinion leaders who are similar to (i.e., practicing in the same discipline as) potential adopters.

Table 13 shows the opinion leaders (Table 12) by their Carnegie institution type. Although the literature suggests that institution type would be an important consideration in diffusion of innovations, the distributions across Table 13 are very similar.

Table 13. Distribution of colleges and universities most cited (Table 12) by Year 2000 Carnegie Institution Type.

Department chairs at master's institutions made 33 percent of their references to other master's institutions, which is slightly higher than the corresponding figures for department chairs at doctoral institutions (21 percent) or baccalaureate institutions (23 percent). Department chairs at all three types of institutions referenced leaders in all three categories. This finding is surprising, since the literature suggests opinion leaders would more often be cited by followers from the same types of institutions.

V. DISCUSSION

Considering that some U.S. reports on the adoption of innovations in STEM education suggest only limited success (Bok, 2006; Handelsman et al., 2004; Seymour, 2001; The Boyer Commission on Educating Undergraduates in the Research University, 1998), the adoption rates reported by respondents may be overestimated. Two possible reasons may be traced to limitations of the survey methodology. First, self-selection bias among respondents may contribute to the discrepancy; that is, institutions that have been more active in adopting innovations may have been more likely to respond to the survey. Second, department chairs' perceptions may differ from actual adoption by faculty members; in particular, department chairs may hold more optimistic perceptions of adoption than faculty members do. Nevertheless, this survey study provides important descriptive data on awareness and adoption of seven engineering education innovations. The overall level of awareness was 82 percent, which is expected given the tremendous effort invested in improving engineering education over the last few decades. In spite of the high levels of awareness, the average adoption rate across the seven innovations was 47 percent, and, if the results are to be believed, nearly one-third (29 percent) of engineering departments are not practicing any form of student-active pedagogies. Diffusion of innovations theory both provides explanations for this gap and suggests productive directions for future efforts directed at increasing adoption of engineering education innovations as a means of improving engineering education.

First, it is important to have realistic expectations for adoption goals. The research base presented in the literature review supports the efficacy of the seven innovations in promoting student learning and retention in a variety of (but by no means all) settings.

Nonetheless, reaching 100 percent adoption for any innovation (educational, technological, or otherwise) may be impractical or infeasible. Rogers (2003) cites many examples, across a range of contexts, of innovations that do not reach 100 percent adoption, including cell phones, birth control methods, and agricultural technologies. In his S-curve of adoption rates over time, the rate only approaches 100 percent asymptotically. On the other hand, awareness rates of 100 percent are desirable, because awareness is a necessary (but insufficient) condition for making an informed adoption decision. Potential adopters seek and require different types of information at different stages: during awareness, basic definitions and some evidence of success are necessary, but during the trial and decision stages, more specific details about successful implementation become more important. These ideas are elaborated upon in our recommendations.
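Rogers does not give a formula in the passage cited here; purely as an illustration, the S-shaped cumulative adoption curve he describes is commonly idealized as a logistic function,

A(t) = \frac{A_{\max}}{1 + e^{-k\,(t - t_{0})}}, \qquad 0 < A_{\max} \le 1,

where k governs how quickly adoption accelerates, t_0 marks the time of fastest growth, and A(t) approaches its ceiling A_max only as t grows without bound, which matches the asymptotic behavior noted above.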

Open-ended responses were most helpful in clarifying the reasons department chairs offered for decisions not to adopt one or more innovations. Financial resources, class sizes, space, technology, instructional staff time, and student learning and satisfaction were directly cited as considerations in adoption decisions. Adoption decisions are frequently made by weighing the benefits against the costs (Wejnert, 2002), particularly in times of declining budgets. However, some comments also indicated that the innovations were perceived to be more complex than necessary. For example, student-active pedagogies were associated with technology costs in some comments by chairs who may have been thinking of classroom response systems (Fies and Marshall, 2006). They seemed unaware of many other, less resource-intensive approaches for implementing student-active pedagogies, such as cooperative learning (Johnson, Johnson, and Smith, 1998). Another reason for the gap may be that department chairs, while aware of the general nature of an innovation, may be less familiar with the research base documenting the positive influence of the innovations on student learning, or may consider it weak or otherwise not generalizable to their students. Finally, they may not know how to address faculty concerns about adopting a particular innovation. This is the job of change agents (which can include, but are not limited to, department chairs), a group we address in our recommendations.

Another trend in the findings was significant differences in rates of adoption across the seven innovations. One of the primary sources of difference was complexity and the type of innovation-decision; in other words, whether intraorganizational cooperation was required for adoption. Student-active pedagogies and artifact dissection could easily be implemented by individual faculty members (low complexity and an optional decision), while interdisciplinary capstone design projects and integrated curricula by definition require coordination across multiple departments (high complexity requiring collective and/or authority decisions). The lowest ratios of adoption to awareness were observed for complex innovations requiring significant coordination across departments and/or significant material resources (curriculum-based engineering service-learning projects, artifact dissection, and learning communities or integrated curricula). Variation in the open-ended comments supports the conclusion that complexity and coordination are more critical concerns for some innovations than for others.

We also observed differences in awareness and adoption across the engineering disciplines, including lower rates for electrical and computer engineering departments. We found this result odd because faculty members from these disciplines founded and supported the Frontiers in Education (FIE) Conference, which has promoted conversations about innovations in engineering education, primarily among these disciplines. However, we note that several of the innovations in our study were originally developed by mechanical engineers, and differences in the educational context between mechanical and electrical/computer engineering (coupled with the disciplinary networks discussed later) may be one reason for the differences. Different adoption levels can, in part, be attributed to differences in the compatibility of the design-related innovations with the values of electrical, computer, and chemical engineering and computer science. This is not to say that these disciplines do not value design, but simply that they do not define it as artifact dissection or as projects that focus on producing a working physical prototype.

In terms of interpersonal communication, the disciplinary networks of the various engineering branches are an important consideration. Beginning with pilot testing, we found that department chairs were more likely to respond to survey requests from their professional societies or disciplinary peers. Statistically significant differences between awareness and adoption rates in the various disciplines can be traced through disciplinary networks, particularly the strong relationship between mechanical engineering and the design-related innovations. While the engineering education literature documents a longstanding strategy of presenting convincing assessment evidence in mass media publications (Borrego, 2007; Clark et al., 2004), open-ended comments citing skepticism, coupled with a strong preference for personal interactions, suggest that dissemination efforts should focus elsewhere. This finding is reinforced by the literature, which emphasizes different information types and sources at different stages of the adoption process:

But increasingly at the persuasion stage, and especially at the decision stage, an individual seeks innovation-evaluation information in order to reduce uncertainty about an innovation's expected consequences. Here an individual wants to know the innovation's advantages and disadvantages for his or her own particular situation. Interpersonal communication networks with near peers are particularly likely to convey such evaluative information about an innovation.
Mass media channels are not very important at this stage because their messages are general in nature, and an individual deciding to adopt wants to know specific information: Will the innovation be beneficial to me in my particular situation? Subjective evaluations of a new idea by other individuals are especially likely to influence an individual at the decision stage, and perhaps at the confirmation stage (Rogers, 2003, p. 21).


Survey responses indicated that presentations in education-focused venues have been successful in informing department chairs about engineering education innovations. The diffusion literature confirms this is an efficient use of mass media. However, specific information about implementation, revision, assessment, and institutionalization presented exclusively in publications is less effective at promoting adoption than at raising awareness. After awareness has been achieved, local and interpersonal networks become more important (Rogers, 2003, p. 21). Although partnership with technical professional societies has been advocated, it is clear from our results that these meetings have historically not been a good source of information on educational innovations. Because engineering departments, particularly within a given sub-discipline, are so similar to one another, information about adoptions is likely to travel more quickly through disciplinary networks than through channels directed more broadly at all engineering disciplines. This is doubly true if the network is open (non-competitive) and considers the topic legitimate for discussion (Wejnert, 2002). These are precisely the considerations addressed by our recommendations for engineering professional societies (beyond ASEE, SEFI, AaeE, etc.) to open a dialog about engineering education.

VI. RECOMMENDATIONS

Low adoption rates and sizeable gaps between rates of awareness and adoption, coupled with the emergence of newer engineering education innovations not included in this study, suggest that there is more work to be done. Several recommendations from our findings follow.

The primary way diffusion of innovations research can inform change efforts is by guiding different strategies aimed at each stage in the adoption process (Froyd, 2001; Rogers, 2003). Current engineering education dissemination efforts, e.g., Web sites, conference papers and presentations, workshops, and journal publications, appear to be successful at creating awareness of innovations, but not at promoting their adoption. Dissemination initiatives, such as those required in NSF STEM education proposals, need to devote more creativity and resources to formulating plans that promote transitions to stages of adoption beyond awareness. Based on responses from department chairs, these dissemination initiatives must address concerns about material resources as well as the time and effort required to adopt the innovations.

Results in Table 10 suggest that department chairs learn about innovations primarily through word of mouth, conference presentations, and presentations on their campuses. Other propagation mechanisms, e.g., journal articles, were much less significant in department chairs' learning about innovations, even though this is the stage at which diffusion of innovations theory suggests publications should be most effective.

To promote adoption of engineering education innovations, workshops, conferences, and on-campus presentations need to address the factors that influence adoption, i.e., resources, faculty development, and student resistance (see Section IV.E). Findings in this study suggest that presentations need to support department chairs in their thinking about how to implement innovations without significant resource investments, how faculty members can adapt innovations to their teaching approaches in small increments of effort, and how faculty members can anticipate and address student resistance to change. This represents a fundamental shift from many presentations, which focus on the nature of the innovation and evidence of its efficacy.

The literature also predicts that adoption levels will be higher in situations where change agents focus on clients' (i.e., faculty members' and administrators') needs rather than on promoting adoption of a specific innovation. Innovators who are highly invested in a particular innovation can demonstrate empathy with potential adopters in order to increase adoption levels. Other change agents who are less committed to specific innovations can work with faculty and administrators to select innovations and pedagogies that meet the needs of their specific context. This recommendation points to a faculty development approach already practiced by teaching and learning centers at many institutions. When these centers (or some of their staff members) are dedicated to engineering, this increases homophily, which in turn can increase adoption rates.

In addition, disciplinary networks in the various branches of engineering are very strong but remain largely untapped as a resource for change agents to encourage engineering education improvement efforts. Sources such as the recent report Creating a Culture for Scholarly and Systematic Innovation in Engineering Education (Jamieson and Lohmann, 2009) include specific and extensive recommendations for professional engineering societies. These include rewarding faculty members for educational activity, creating "education-focused interest groups, publications, and meetings," and integrating student and professional activities (Jamieson and Lohmann, 2009, p. 25). Additionally, they suggest that ASEE (and by extension, other engineering education professional societies) take the lead in coordinating strategies and partnerships between and among the engineering professional societies for the purpose of engineering education innovation. While some of these activities are the purview of society leaders, others, such as interest groups and paper sessions at annual meetings, are frequently grassroots activities that can be initiated by any member or group of members. The basic argument is that education should be a concern for the future of the profession. If it is not, then perhaps stronger connections and arguments need to be made between better preparation of students, on the one hand, and, on the other, the ways these and other educational innovations cultivate professional skills.

Although the focus of this study was on department-level decisions and department chair perceptions, some of the open-ended comments touched upon the compatibility of the innovations with the values of the organization. Clearly, faculty members (and not just the administrators we surveyed) must be involved in adopting engineering education innovations, and they respond to the values of their environment. Faculty are unmotivated to adopt engineering education innovations when they perceive that teaching innovation is marginalized in promotion and tenure considerations and that their colleagues are skeptical of assessment evidence.
Like many others, Jamieson and Lohmann (2009) rally faculty, chairs, and deans to "support and recognize educational innovation" in promotion, tenure, and merit reviews, and by including professional development funds in startup packages.

Again, however, research also shows that faculty attitudes play an important role in peers' willingness to adopt new pedagogies, including active learning; individual instructors should therefore also show support for their colleagues through more open discussion of teaching (Dancy and Henderson, 2004; Henderson and Dancy, 2007).

Finally, we acknowledge the role of students in changing engineering education: they can "vote with their feet" by selecting engineering departments, majors, and institutions with innovative programs. We already know that we are losing top students to other majors (Salzman and Lowell, 2007). Several of the newer innovations in this study, for example service learning, resonate with Generation Net, which is focused on meaningful work such as helping others (Chubin, Donaldson, Olds, and Fleming, 2008). On a smaller scale, students can begin to request innovative formats and technologies that they have used in previous courses, signaling to their instructors that at least the student resistance barrier is sometimes easily overcome.

VII. FUTURE WORK

We recognize that, when studying diffusion of innovations in engineering education, department chairs provide only one perspective. Other valuable perspectives can be provided by engineering faculty members and engineering students. One obvious direction for future research would be to administer a similar survey to engineering faculty members and compare the results to those presented here. This research could also be better contextualized with respect to similar efforts outside the United States, perhaps through international sampling. Future studies of similar design (i.e., surveys) must carefully consider how to identify the population and promote higher response rates. Response bias threatens validity in studies focusing on faculty and administrator experiences and attitudes toward teaching, as those who see themselves as valuing innovative or effective teaching are more likely to respond. Administrators claim that teaching changes are faculty-level decisions; on the other hand, faculty members (including peer reviewers, in their feedback to us) emphasized the important role of administrators. Rather than making the simplifying assumption of focusing on one group or the other, future studies should consider the complex system of actors, which includes interactions among administrators, faculty members, students, and other stakeholders. Taken together with the difficulties of identifying an unbiased group of respondents and obtaining higher response rates, we suggest that at least one line of future research should focus more deeply on a smaller number of organizations (i.e., a carefully selected sample of departments or colleges/schools of engineering) to more closely investigate the complex relationships between faculty members and administrators in making adoption decisions. Finally, our specific recommendations for dissemination could be implemented and then evaluated or tested in a variety of quasi-experimental designs primarily focused on achieving high levels of awareness and adoption.

ACKNOWLEDGMENTS

This work was funded by the U.S. National Science Foundation under grants EEC-0835711 and EEC-0835816.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the National Science Foundation. We are grateful to Erin Leahey for sharing her research expertise, Norman Fortenberry for his partnership on this project, Bill Oakes for assisting with early versions of the proposal, and Chris Strock and anonymous peer reviewers for their constructive criticism. We would also like to thank the professional society staff and Virginia Tech department heads who forwarded survey invitations, and the department chairs who participated in the survey.

REFERENCES Agogino, A.M., S. Sheppard, and A. Oladipupo. 1992. Making connections to engineering during the first two years. In Proceedings of the American Society for Engineering Education Annual Conference. Nashville, TN. Amadei, B. 2003. Program in engineering for developing communities viewing the developing world as the classroom of the 21st century. In Proceedings of the Frontiers in Education Conference. Boulder, CO. http:// ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber1264725 (last accessed, February 2010). Ambrose, S.A., and C.H. Amon. 1997. Systematic design of a firstyear mechanical engineering course at Carnegie Mellon University. Journal of Engineering Education 86 (2): 173–81. Armstrong, N., S.-M. Chang, and M. Brickman. 2007. Cooperative learning in industrial-sized biology classes. CBE–Life Sciences Education 6: 163–71. Astin, A.W., L.J. Vogelgesang, K. Misa, J. Anderson, N. Denson, U. Jayakumar, V. Saenz, and E. Yamamura. 2006. Understanding the effects of service-learning: A study of students and faculty. Los Angeles: University of California, Los Angeles. Barr, R.E., P.S. Schmidt, T.J. Krueger, and C.-Y. Twu. 2000. An introduction to engineering through an integrated reverse engineering and design graphics project. Journal of Engineering Education 89 (4): 413–18. Bazylak, J., and P. Wild. 2007. Best practices review of first-year engineering design education. In Proceedings of the Canadian Design Engineering Network and Canadian Congress on Engineering Education. http://cden2007.eng.umanitoba.ca/resources/papers/65.pdf (last accessed, July 2009). Bok, D. 2006. Our underachieving colleges: A candid look at how much students learn and why they should be learning more. Princeton, NJ: Princeton University Press. Bonwell, C.C., and J.A. Eison. 1991. Active learning: Creating excitement in the classroom. Washington, DC: George Washington University Press. Borrego, M. 2007. Development of engineering education as a rigorous discipline: A study of the publication patterns of four coalitions Journal of Engineering Education 96 (1): 5–18. Borrego, M., C.B. Newswander, L.D. McNair, S. McGinnis, and M.C. Paretti. 2009. Using concept maps to assess interdisciplinary integration of green engineering knowledge. Advances in Engineering Education 2 (1): 1–26. Boyatzis, R.E. 1998. Transforming qualitative information: Thematic analysis and code eevelopment. Thousand Oaks, CA: Sage Publications. Brannan, K.P., and P.C. Wankat. 2005. Survey of first-year programs. In Proceedings of the ASEE Annual Conference and Exposition. Austin, TX. http://soa.asee.org/paper/conference/paper-view.cfm?id=21601 (last accessed, June 2009).



Bransford, J.D., A.L. Brown, and R.R. Cocking. eds. 2000. How people learn: Brain, mind, experience, and school. Washington DC: National Academies Press. Carlson, B., P. Schoch, M. Kalsher, and B. Racicot. 1997. A motivational first-year electronics lab course. Journal of Engineering Education 86 (4): 357–62. Carnegie Foundation for the Advancement of Teaching. 2000. Standard Listings Retrieved 7/1/09, from http://www.carnegiefoundation. org/classifications/index.asp?key783 Chubin, D., K. Donaldson, B. Olds, and L. Fleming. 2008. Educating generation net—Can U.S. engineering woo and win the competition for talent? Journal of Engineering Education 97 (3): 245–57. Clark, M.C., J. Froyd, P. Merton, and J. Richardson. 2004. The evolution of curricular change models within the foundation coalition. Journal of Engineering Education 93 (1): 37–47. Committee on Facilitating Interdisciplinary Research. 2005. Facilitating interdisciplinary research. Washington, DC: National Academies Press. Cooper, M.M. 1995. Cooperative learning: An approach for large enrollment courses. Journal of Chemical Education 72 (2): 162–64. Cooper, J.L., J. MacGregor, K.A. Smith, and P. Robinson. 2000. Implementing small-group instruction: Insights from successful practitioners. New Directions in Teaching and Learning 81: 64–76. Coyle, E.J., L.H. Jamieson, and W.C. Oakes. 2006. Integrating engineering education and community service: Themes for the future of engineering education. Journal of Engineering Education 95 (1): 7–11. Crane, D. 1972. Invisible colleges: Diffusion of knowledge in scientific communities. Chicago, IL: University of Chicago Press. Creswell, J.W., and V.L. Plano Clark. 2007. Designing and conducting mixed methods research. Thousand Oaks, CA: Sage Publications. Creswell, J.W., V.L. Plano Clark, M.L. Gutmann, and W.E. Hanson. 2003. Advanced mixed methods research designs. In Handbook of mixed methods in social and behavioral research, eds. A. Tashakkori and C. Teddlie, 209–240. Thousand Oaks, CA: Sage Publications. Dally, J.W., and G.M. Zhang. 1993. A freshman engineering design course. Journal of Engineering Education 82 (2): 83–91. Dancy, M.H., and C. Henderson. 2004. Beyond the individual instructor: Systemic constraints in the implementation of research-informed practices. In Proceedings of the Physics Education Research Conference. Sacremento, CA. http://homepages.wmich.edu/~chenders/Publications/ PERC2004Dancy.pdf (last accessed, February 2010). de Solla Price, D.J., and D. Beaver. 1966. Collaboration in an Invisible College. American Psychologist 21 (11): 1011–18. Demetry, C., and J.E. Groccia. 1997. A comparative assessment of students’ experiences in two instructional formats of an introductory materials science course. Journal of Engineering Education 86 (3): 203–10. Dochy, F., M. Segers, P. Van den Bossche, and D. Gijbels. 2003. Effects of problem-based learning: A meta-analysis. Learning and Instruction 13: 533–68. Doymus, K. 2008. Teaching chemical equilibrium with the jigsaw technique. Research in Science Education 38 (2): 249–60. Duffy, H., C. Barry, C, L. Barrington, and M. Heredia. 2009. Servicelearning in engineering science courses: Does it work? In Proceedings of the ASEE Annual Conference and Exposition. Austin, TX. Fairweather, J.S. 1993. Faculty reward structures: Toward institutional and professional homogenization. Research in Higher Education 34 (5): 603–23. Felder, R.M., and R. Brent. 1996. Navigating the bumpy road to student-centered instruction. 
College Teaching 44 (2): 43–47.


Felder, R.M., and R. Brent. 2010. The national effective teaching institute: Assessment of impact and implications for faculty development. Journal of Engineering Education 99 (2): 121–34. Fies, C., and J. Marshall. 2006. Classroom response systems: A review of the literature. Journal of Science Education and Technology 15 (1): 101–09. Foertsch, J.A., S.B. Millar, L.L. Squire, and R.L. Gunter. 1997. Persuading professors: A study of the dissemination of educational reform in research institutions. Madison: University of Wisconsin-Madison, LEAD Center. Fox, M.A., and N. Hackerman. 2003. Evaluating and improving undergraduate teaching in science, technology, engineering, and mathematics. Washington, DC: National Academies Press. Friedman, T.L. 2005. The world is flat. New York: Farrar, Straus and Giroux. Froyd, J.E. 2001. Developing a dissemination plan. In Proceedings of the ASEE/IEEE Frontiers in Education Conference. Reno, NV. Froyd, J.E. 2005. The engineering education coalitions program. In, Educating the engineer of 2020: Adapting engineering education to the new century, ed. National Academy of Engineering. Washington, DC: National Academies Press. Froyd, J.E. 2008. White paper on promising practices in undergraduate STEM education. Commissioned paper presented at NRC Workshop on Evidence on Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education. http://www7.nationalacademies.org/bose/Froyd_Promising_Practices_CommissionedPaper.pdf Froyd, J.E., and M. Ohland. 2005. Integrated engineering curricula. Journal of Engineering Education 94 (1): 147–64. Froyd, J.E., D. Penberthy, and K. Watson. 2000. Good educational experiments are not necessarily good change processes. In Proceedings of the ASEE/IEEE Frontiers in Education Conference. Kansas City, MO. Gabelnick, F., J. MacGregor, R.S. Matthews, and B.L. Smith, eds. 1990. Learning communities: creating connections among students, faculty, and disciplines, Vol. 41. San Francisco, CA: Jossey-Bass. Gabriele, G. 2005. Advancing engineering education in a flattened world. Journal of Engineering Education 94 (3): 285–86. Gardner, H. 2004. Changing minds: The art and science of changing our own and other people’s minds. Boston, MA: Harvard Business School Press. Handelsman, J., D. Ebert-May, R. Beichner, P. Bruns, A. Chang, R. DeHaan, J. Gentile, S. Lauffer, J. Stewart, S. Tilghman, W. Wood. (2004). Scientific teaching. Science 304 (5670): 521–22. Henderson, C., A. Beach, N. Finkelstein, and R.S. Larson. 2008a. Facilitating change in undergraduate STEM: Initial results from an interdisciplinary literature review. Proceedings of the Physics Education Research Conference. Alberta, Canada. Henderson, C., A. Beach, N. Finkelstein, and R.S. Larson. 2008b. Preliminary categorization of literature on promoting change in undergraduate STEM. In Proceedings of the Facilitating Change in Undergraduate STEM Symposium. Augusta, MI. http://www.wmich.edu/science/facilitatingchange/PreliminaryCategorization.pdf (last accessed, February 2010). Henderson, C., N. Finkelstein, and A. Beach. (2010). Beyond dissemination in college science teaching: An introduction to four core change strategies. Journal of College Science Teaching 39 (5). Henderson, C., and M. Dancy. 2005. When one instructor’s interactive classroom activity is another's lecture: Communication difficulties between faculty and education researchers. In Proceedings of the American Association of Physics Teachers Winter Meeting. Albuquerque, NM. 
Henderson, C., and M. Dancy. 2007. Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics: Physics Education Research 3 (2): 020102.


Henderson, C., and M. Dancy. 2009. The impact of physics education research on the teaching of introductory quantitative physics in the United States. Physical Review Special Topics: Physics Education Research 5(2): 020107. Jamieson, L.H., and J.R. Lohmann, eds. 2009. Creating a culture for scholarly and systematic innovation in engineering education: Ensuring U.S. engineering has the right people with the right talent for a global society. Washington, DC: American Society for Engineering Education. Jiusto, S., and D. DiBiasio. 2006. Experiential learning environments: Do they prepare our students to be self-directed, life-long learners? Journal of Engineering Education 95 (3): 195–204. Johnson, D.W., R.T. Johnson, and K.A. Smith. 1998. Cooperative learning returns to college: What evidence is there that it works? Change 30 (4): 26–35. Keeney-Kennicutt, W., A.B. Gunersel, and N. Simpson. 2008. Overcoming student resistance to a teaching innovation. International Journal for the Scholarship of Teaching and Learning 2 (1): 1–26. http://academics.georgiasouthern.edu/ijsotl/v2n1/articles/Keeney-Kennicutt_Gunersel_Simpson/ Article_Keeney-Kennicutt_Gunersel_Simpson.pdf (last accessed, February 2010). Kezar, A. 2001. Summer bridge programs: Supporting all students. ERIC digest. http://www.ericdigests.org/2001-1/summer.html (last accessed, February 2010). King, P., and M. Richter. 2007. Current topics in rehabilitation engineering. In Proceedings of the ASEE Annual Conference and Exposition. Austin, TX. http://soa.asee.org/paper/conference/paper-view. cfm? id3413 (last accessed, February 2010). Knight, D.W., L.E. Carlson, and J.F. Sullivan. 2007. Improving engineering student retention through hands-on, team based, first-year design projects. In Proceedings of the International Conference on Research in Engineering Education. Honolulu, HI. http://www.asee.org/publications/jee/ icree/2007/papers/9.pdf (last accessed, February 2010). Lamancusa, J.S., J.E. Jorgensen, and J.L. Zayas-Castro. 1997. The learning factory—A new approach to integrating design and manufacturing into the engineering curriculum. Journal of Engineering Education 86 (2): 103–12. Lamancusa, J.S., J.E. Jorgensen, J.L. Zayas-Castro, and J. Ratner. 1995. The Learning Factory—A new approach to integrating design and manufacturing into engineering curricula. In Proceedings of the American Society for Engineering Education Annual Conference. Anaheim, CA. Lamancusa, J.S., M. Torres, V. Kumar, and J. Jorgensen. 1996. Learning engineering by product dissection. In Proceedings of the American Society for Engineering Education Annual Conference. Washington, DC. Lattuca, L.R., and J.S. Stark. 1995. Modifying the major—discretionary thoughts from 10 disciplines. Review of Higher Education 18 (3): 315–44. Lemaine, G., R. Macleod, M. Mulkay, and P. Weingart. 1976. Introduction: Problems in the emergence of new disciplines. In Perspectives on the emergence of scientific disciplines, eds. G. Lemaine, R. Macleod, M. Mulkay and P. Weingart. Mouton, France: Parex. Lewis, S.E., and J.E. Lewis. 2005. Departing from lectures: An evaluation of a peer-led guided inquiry alternative. Journal of Chemical Education 82 (1): 135–39. Lovitts, B., and N. Fortenberry. 2006. Documenting the research base underlying educational practices. In Proceedings of the American Society for Engineering Education Annual Conference. Chicago, IL. Lyman, F. 1981. The responsive class discussion. In Mainstreaming digest, ed. A.S. Anderson. 
College Park, MD: College of Education, University of Maryland.


Malave, C.O., and K.L. Watson. 1996. Cultural change at Texas A&M: From the engineering science core to the foundation coalition. In Proceedings of the ASEE/IEEE Frontiers in Education Conference. Salt Lake City, UT. Mansilla, V.B., and E.D. Duraisingh. 2007. Targeted assessment of students’ interdisciplinary work: An empirically grounded framework proposed. The Journal of Higher Education 78 (2): 215-37. Marra, R.M., B. Palmer, and T.A. Litzinger. 2000. The effects of a first-year engineering design course on student intellectual development as measured by the Perry Scheme. Journal of Engineering Education 89 (1): 39–45. Matanin, B., T. Waller, J. Kampe, C. Brozina, and B. Watford. 2007. Step in the right direction: Student transition to engineering program. In Proceedings of the American Society for Engineering Education Conference. Honolulu, HI. Matsumoto, E., C. Masters, A. Akyurtlu, D. Hill, M. Ivory, A. Regan, E. Tutumluer, K. Coppock, S. Courter, K. Luker, S. Pfatteicher. 1998. The engineering education scholars program-preparing a new generation of faculty. In Proceedings of the American Society for Engineering Education Annual Conference. Seattle, WA. McKenna, A.F., B. Yalvac, and G.J. Light. 2009. The role of collaborative reflection on shaping engineering faculty teaching approaches. Journal of Engineering Education 98 (1): 17–26. McNair, L.D., M.C. Paretti, and A. Kakar. 2008. Case study of prior knowledge: Expectations and identity constructions in interdisciplinary, cross-cultural virtual collaboration. International Journal of Engineering Education 24 (2): 386–99. Michelson, S.K., R. Jenison, and N. Swanson. 1995. Teaching engineering design through product dissection. In Proceedings of the American Society for Engineering Education Annual Conference. Anaheim, CA. Milem, J.F., J.B. Berger, and E.L. Dey. 2000. Faculty time allocation: A study of change over twenty years. The Journal of Higher Education 71 (4): 454–75. National Academy of Engineering. 2004. The engineer of 2020. Washington, DC: National Academies Press. National Academy of Engineering. 2005. Educating the engineer of 2020: Adapting engineering education to the new century. Washington, DC: National Academies Press. National Science Foundation. 2008. Innovations in engineering education, curriculum, and infrastructure (IEECI) (NSF 08-542). Arlington, VA. Nave, F., S. Frizell, P. Obiomon, S. Cui, and J. Perkins. 2006. Prairie View A&M University: Assessing the impact of the STEM-enrichment program on women of color. In Proceedings of the 2006 WEPAN Conference. Pittsburgh, PA. Oakes, W.C. 2009. Creating effective and efficient learning experiences while addressing the needs of the poor: An overview of service-learning in engineering education. In Proceedings of the ASEE Annual Conference and Exposition. Austin, TX. Ollis, D.F. 2004. Basic elements of multidisciplinary design courses and projects. International Journal of Engineering Education 20 (3): 391–97. Picket-May, M., and J. Avery. 2001. Service learning first year design retention results. In Proceedings of the Frontiers in Education Conference. Boulder, CO. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber963738 (last accessed, July 2009). Pomalaza-Raez, C., and B.H. Groff. 2003. Retention 101: Where robots go…students follow. Journal of Engineering Education 92 (1): 85–09. Prince, M.J. 2004. Does active learning work? A review of the research. Journal of Engineering Education 93 (3): 223–31.



Project Kaleidoscope. 2002. Recommendations for action in support of undergraduate science, technology, engineering and mathematic Report on Reports. Washington, DC. Razzaq, Z. 2003. An effective teaching strategy for motivation and retention of engineering and technology freshmen. In Proceedings of the ASEE Annual Conference and Exposition. Nashville, TN. http://soa.asee. org/paper/conference/paper-view.cfm?id18012 (last accessed, July 2009). Richardson, J., C. Corleto, J.E. Froyd, P.K. Imbrie, J. Parker, and R. Roedel. 1998. Freshman design projects in the Foundation Coalition. In Proceedings of the Frontiers in Education Conference. Tempe, AZ. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber736801 (last accessed, June 2009). Richter, D.M., and M.C. Paretti. 2009. Identifying barriers to and outcomes of interdisciplinarity in the engineering classroom. European Journal of Engineering Education 34 (1): 29–45. Rogers, E.M. 2003. Diffusion of innovations. New York: The Free Press. Salzman, H., and B.L. Lowell. 2007. Into the eye of the storm: Assessing the evidence on science and engineering education, quality, and workforce demand. The Urban Institute. http://www.urban.org/UploadedPDF/411562_salzman_Science.pdf (last accessed, February 2010). Seymour, E. 2001. Tracking the processes of change in US undergraduate education in science, mathematics, engineering, and technology. Science Education 86 (1): 79–105. Sheehan, K. 2001. E-mail survey response rates: A review. Journal of Computer-Mediated Communication 6(2). Sheppard, S.D. 1992. Mechanical dissection: An experience in how things work. In Proceedings of the Conference on Engineering Education: Curriculum Innovation and Integration. Santa Barbara, CA. Sheppard, S.D., and R. Jenison. 1997a. Examples of freshman design education. International Journal of Engineering Education 13 (4): 248–61. Sheppard, S.D., and R. Jenison. 1997b. Freshman engineering design experiences: An organizational framework. International Journal of Engineering Education 13 (3): 190–97. Shuman, L.J., M. Besterfield-Sacre, and J. McGourty. 2005. The ABET “professional skills”—Can they be taught? Can they be assessed? Journal of Engineering Education 94 (1): 41–55. Silverthorn, D.U., P.M. Thorn, and M.D. Svinicki. 2006. It’s difficult to change the way we teach: Lessons from the integrative themes in physiology curriculum module project. Advances in Physiology Education 30 (4): 204–14. Soyster, A.L. 2008. Guest editorial: The business of engineering education. Journal of Engineering Education 97 (1): 3–4. Springer, L., M.E. Stanne, and S.S. Donovan. 1999. Effects of smallgroup learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research 69 (1): 21–51. Strobel, J., and A.V. Barneveld. 2009. When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. The Interdisciplinary Journal of Problem-based Learning 3 (1): 44–58. Terenzini, P., L. Rendon, L. Upcraft, S. Millar, K. Allison, P. Gregg, and R. Jalomo. 1996. The transition to college: Diverse students, diverse stories. In ASHE reader on college students: The evolving nature of research, eds. F. Stage, G. Anya, J. Bean, D. Hossler and G. Kuh, 54–79. Needham Heights, MA: Ginn Press. The Boyer Commission on Educating Undergraduates in the Research University 1998. Reinventing undergraduate education: A blueprint for America’s research universities. http://naples.cc.sunysb.edu/Pres/boyer.


nsf/673918d46fbf653e852565ec0056ff3e/d955b61ffddd590a852565ec00 5717ae/$FILE/boyer.pdf (last accessed, February 2010). Tinto, V. 1998. Learning communities: Building gateways to student success. The National Teaching and Learning Forum 7 (4). http://www.ntlf. com/html/lib/suppmat/74tinto.htm (last accessed, February 2010). Todd, R.H., S.P. Magleby, C.D. Sorensen, B.R. Swan, and D.K. Anthony. 1995. A survey of capstone engineering courses in North America. Journal of Engineering Education 84 (2): 165–74. U.S. News & World Report. 2009. Best graduate schools. http:// grad-schools.usnews.rankingsandreviews.com/best-graduate-schools (last accessed, August 2009). Wankat, P.C., R.M. Felder, K.A. Smith, and F.S. Oreovicz. 2002. The scholarship of teaching and learning in engineering. In Disciplinary styles in the scholarship of teaching and learning: Exploring common ground, eds. M.T. Huber and S.P. Morrealle, 217–237. Sterling, VA: Stylus Publishing. Wejnert, B. 2002. Integrating models of diffusion of innovations: A conceptual framework. Annual Review of Sociology 28: 297–326. Zia, L.L. 2001. The NSF national science, technology, engineering, and mathematics education digital library (NSDL) program: New projects and a progress report. D-Lib Magazine 7 (11). http://www.dlib.org/ dlib/november01/zia/11zia.html (last accessed, February 2010).

AUTHORS' BIOGRAPHIES

Maura Borrego is an associate professor and director of the graduate program in the Department of Engineering Education at Virginia Tech. She holds U.S. NSF CAREER and Presidential Early Career Award for Scientists and Engineers (PECASE) awards for her engineering education research. Her research interests include research and scholarship communities in engineering education and interdisciplinary graduate education. She teaches graduate-level courses in engineering education research methods and assessment. All of Dr. Borrego's degrees are in Materials Science and Engineering: M.S. and Ph.D. from Stanford University and B.S. from the University of Wisconsin-Madison. Address: Engineering Education (0218), Blacksburg, VA 24061; telephone: (1) 540.231.9536; e-mail: [email protected].

Jeffrey E. Froyd is the director of Faculty Climate and Development in the Office of the Dean of Faculties and Associate Provost at Texas A&M University. He served as project director for the Foundation Coalition, one of six Engineering Education Coalitions that were supported by NSF. The six partner institutions in the Coalition systematically renewed their undergraduate engineering curricula, institutionalized many of their innovations, and extensively shared their results with the engineering education community. He co-created the Integrated, First-Year Curriculum in Science, Engineering and Mathematics at Rose-Hulman Institute of Technology; the project was recognized with a Hesburgh Certificate of Excellence in 1997. He also served as project director for "Changing Faculty through Learning Communities," which was intended to create compelling learning experiences on gender equity for faculty members. He has authored or co-authored over 50 papers on curriculum innovation, curriculum integration, assessment of curricular innovations, processes of curricular change, and faculty development. Dr. Froyd is a senior associate editor for the Journal of Engineering Education, has served for ten years as an ABET Program Evaluator for electrical and computer engineering, and served as a program co-chair for the 2003 and 2004 Frontiers in Education Conferences and as general chair for the 2009 Frontiers in Education Conference. Address: Office of the Dean of Faculties and Associate Provost, TAMU 1126, Texas A&M University, College Station, TX 77843-1126; telephone: (1) 979.845.7574; e-mail: froyd@tamu.edu.

Simin Hall is a research assistant professor in the Department of Mechanical Engineering (ME) at Virginia Tech. Dr. Hall completed a postdoc in the Department of Engineering Education at Virginia Tech before joining the ME department. Her academic experience includes designing and teaching online courses at Winston-Salem State University, research in information security at the University of North Carolina, and work in risk management and medical education at the Wake Forest University School of Medicine. Her industry experience is in nuclear power generation with Westinghouse Electric Company in Pittsburgh, PA; the Nuclear Technology Division at Babcock and Wilcox, System Design and Equipment Engineering, in Lynchburg, VA; and risk management with B&C Associates, a public relations company in Jamestown, NC. She holds degrees from Virginia Tech: a B.S. in ME with a minor in Mathematics and an M.S. in Engineering Science and Mechanics. Her Ph.D. is in Education and Statistics from the University of North Carolina. Address: Department of Mechanical Engineering, 117 Randolph, College of Engineering (0238), Virginia Polytechnic Institute and State University, Blacksburg, VA 24061; telephone: (1) 540.231.7270; e-mail: [email protected].

