AAH GRAPHICS, INC. / (540) 933-6210 / FAX 933-6523 / 03-04-2002 / 17:27

A Living-Systems Design Model for Web-based Knowledge Management Systems

Jan L. Plass
Mark W. Salisbury

Most of the currently available instructional design models were conceptualized to develop instructional solutions to needs and requirements that remain relatively stable over time. Faced with the problem of designing a knowledge management (KM) system that needed to accommodate continuously changing requirements over its fielded lifetime, we developed a new design model that is based on a living-systems approach. In this article, we briefly review currently available instructional systems design models and describe this new model and the mechanisms it contains for accommodating change and growth. We illustrate the application of the phases of the model (analyze initial requirements, design the information architecture, develop the information design, develop the interaction design, implement the Web-based system, and conduct a developmental evaluation of the system) in the development of a KM system with living-system features.

ETR&D, Vol. 50, No. 1, 2002, pp. 35–57 ISSN 1042–1629

The systematic design of instruction has long been recognized as a key to the successful development of effective and appropriate materials (Dick & Carey, 1990; Gagné & Briggs, 1979; Rothwell & Kazanas, 1997). Early developers of computer-based instruction (CBI) recognized the difficulty in successfully taking a CBI project from conception to delivery. To help manage the complexity of the development process, they adopted methods and techniques created in the computer industry for managing software development projects.

Many of these early software development methods were adaptations of the "waterfall approach" (Yourdon & Constantine, 1978), a highly structured method for project management that delineates the design process into clearly defined phases: analysis, design, implementation, testing, and delivery. The metaphorical implications of the waterfall approach are obvious: Each phase has a clearly defined output, and this output provides the input to the next phase. The purpose of the analysis phase is to produce the specification of the new application. It states what the new application should do, but does not describe how it will be accomplished. The design phase results in the specification being translated into a paper-based design of the system. In the implementation phase, this design is finally realized as a computer program—then tested in the testing phase—before it is delivered to the customer in the delivery phase.

The waterfall approach did improve project management for CBI developers, but it was a method adopted from software engineering that did not sufficiently focus on the instructional aspects of computer applications. As the CBI field matured, development methods appearing in the literature began to incorporate concepts and terminology taken from the instructional systems design (ISD) approach (Gagné & Briggs, 1979). Since ISD was conceptualized and accepted for instructional projects before the popularity of CBI, it was a natural progression for developers to adopt an ISD approach for CBI projects to incorporate the instructional focus of ISD. Most popular adaptations used ISD front-end methods to establish the purpose of the application and techniques adopted from software development methods to implement the application (Alessi & Trollip, 1991; Fenrich, 1997; Roblyer, 1988; Wager & Gagné, 1988). These ISD-based approaches quickly became popular because CBI applications were resource intensive and CBI project managers were looking for more effective alternatives to the waterfall approach. The typical steps of such an instructional design (ID) approach for the development of CBI projects are (Alessi & Trollip, 1991):

• Determine needs and goals, including learner characteristics and entry knowledge.

• Collect resources, regarding content, design, and delivery of instruction.

• Learn the content, that is, become familiar with the subject matter.

• Generate ideas, that is, brainstorm and generate creative ideas.

• Design instruction. Perform concept and task analyses, reassess goals, evaluate design.

• Flowchart the lesson. Describe operations the computer will perform, interactions, lesson structure.

• Storyboard displays. Prepare textual and pictorial displays, content and form of presentation.

• Program the lesson. Select programming or authoring language, program instruction.

• Produce supporting materials. Develop student and instructor manuals, a technical manual, and adjunct instruction.

• Evaluate and revise. Assess how well lessons work and how much students learn.

Using approaches based on the ISD process was an improvement over the basic waterfall approach because of the added focus on instructional aspects in the design process, but it was still perceived by many as a rather linear method of development, with each phase—more or less—completed before the next one was begun (Boyle, 1997).

Meanwhile, in the software-engineering world, another revolution in development methods was taking place (de Hoog, 1994). Flexible methods of project management based on iterative prototyping—sometimes called evolutionary system development—were becoming increasingly popular for commercial software projects (e.g., Jones & Richey, 2000; Moody, Hudson, & Salisbury, 1988). Collectively, they have become known as rapid application development (RAD) methods (Boar, 1984). Illustrating the popularity of this approach, in 1995 a standard nonproprietary RAD method, the dynamic systems development method (DSDM), was introduced by a consortium of nearly 100 companies. The manual that resulted from this effort, Dynamic Systems Development Method (1995), defines DSDM as a method that provides a framework for building and maintaining systems that meet tight time constraints through the use of incremental prototyping in a controlled project environment.

Because of the success of iterative prototyping methods such as RAD and DSDM for developing general software applications, combining ISD methods with iterative prototyping became the next breakthrough in development methods for CBI (Koper, 1995). Figure 1 shows the iterative-prototyping process. Combining the iterative approach with an ISD approach brings an instructional focus to the iterative design of materials. This approach differs from previous methods by explicitly supporting an interleaved form of project development on different levels. In the early stages of prototyping the application, for example, interleaving is used to refine the objectives of the system (Boyle, 1997).
During this process, the prototype becomes the object of communication between the designer and the users. Requirements are clarified and misconceptions between the designer and the clients come to light at this stage. For example, Salisbury (1988) used iterative prototyping to define human-computer interface requirements of a natural language interface. Using a prototyping tool, developers implemented a natural language interface that was able to process a subset of the end-user actions for an application. After an evaluation of the acceptance of the natural language interface by end users, developers added more grammar rules and vocabulary words to the interface to enable it to process more end-user actions and to support requests for the same actions with differing sentence structures and vocabulary. This process continued until end users were able to perform most of the required actions for the application through the natural language interface.

Figure 1  Iterative-prototyping approach to software development
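The refinement cycle Salisbury (1988) describes can be sketched as a loop that adds grammar rules, then re-evaluates coverage of end-user actions until users can perform most required actions. Everything below (the rule sets, the coverage threshold, the evaluation stub) is a hypothetical illustration of the process, not the original system:

```python
# Hypothetical sketch of iterative prototyping for a natural-language
# interface: each cycle adds grammar rules or vocabulary, then the
# prototype is re-evaluated until an acceptance threshold is reached.

def evaluate_with_users(rules, required_actions):
    """Stand-in for a user evaluation: the fraction of required actions
    the current rule set can express. A real evaluation would involve
    end users, not simple membership tests."""
    covered = [a for a in required_actions if a in rules]
    return len(covered) / len(required_actions)

def iterative_prototyping(required_actions, rule_backlog, acceptance=0.9):
    rules = set()                   # the evolving prototype
    history = []                    # coverage measured after each cycle
    for new_rules in rule_backlog:  # each iteration adds capability
        rules.update(new_rules)
        coverage = evaluate_with_users(rules, required_actions)
        history.append(coverage)
        if coverage >= acceptance:  # "most of the required actions"
            break
    return rules, history

actions = ["open file", "save file", "search text", "print report"]
backlog = [{"open file"}, {"save file", "search text"}, {"print report"}]
rules, history = iterative_prototyping(actions, backlog)
# coverage grows across iterations until the threshold is met
```

The essential point the sketch captures is that the prototype itself, not a paper specification, is what is evaluated and extended on each cycle.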

DESIGNING SYSTEMS WITH CHANGING NEEDS AND REQUIREMENTS

Combining ISD methods with iterative prototyping has addressed many of the limitations of using a more traditional approach for developing CBI applications. However, we found that this hybrid approach, and those that came before it, were not well suited for the challenges that we were facing in a project where we developed an integrated and evolving knowledge management (KM) system. While these methods were able to address some changes in needs and requirements during the development of the project, they were not designed for building systems that have continuously changing requirements over their fielded lifetime, caused by the changing needs of the organization that is using the system. In other words, the approaches discussed so far were able to accommodate certain changes in the system requirements that occur during the project development cycle. A design method for the development of a KM system, however, needs to address two additional levels of change.

The first level of change concerns the organization that will use the system. Here, during the fielded lifetime of the KM system, new requirements and needs may arise that result from changes in the organization. These changes may include new characteristics of the target audience for the system due to retirements and new hires. New products and markets result in changing amount, content, type, and structure of the information included in the system, and, as a

consequence, in changing forms and methods of learning and instruction that the system has to be able to facilitate.

The second level of change is a consequence of the first one and concerns the KM system itself. Most instructional materials are not designed to be modified by the users through their interaction with the system. In the case of a KM system, however, this functionality constitutes one of the fundamental system requirements: Acting as learners, users retrieve information from the system to solve a particular problem. Once they have solved this problem, however, they act as authors and add their newly constructed solution to the system. The resulting growth of the KM system will depend on the needs of the organization and will change over time. Therefore, mechanisms have to be designed and implemented that can accommodate these changes.

These two types of change closely resemble characteristics we can observe in living systems (Maturana & Varela, 1980). Using this analogy, a KM system needs to be able to adapt to its environment, that is, the changing organizational needs. It also needs to be able to grow and learn, that is, to respond to the individual needs of learners. Traditional ID models, however, were not conceptualized to include features that can accommodate this change and growth. In the ID methods described so far, once a need was established and the parameters and requirements for the instructional solution were defined and implemented, no procedures were in place that would allow for their revision and the subsequent modification of the system after it had been deployed. We found that other, more recent development methods and models do not explicitly address this problem of building systems with evolving requirements either.
Object-oriented design approaches (e.g., Coad & Yourdon, 1991), while solving many previously unresolved issues such as the recycling and reusing of project components, do not offer a built-in solution for accommodating evolving requirements during the lifetime of the system. Similarly, methods such as user-design approaches (e.g., Banathy, 1991; Carr, 1997; Reigeluth, 1993; Schuler & Namioka, 1993), where users are empowered to make critical design and development decisions, include an initial phase to define parameters and requirements but do not explicitly support building dynamically changing systems. Even the more recently proposed constructivist models of ID (Schwier, 1999; Tennyson, 1997; Willis, 1995), while allowing more learner input in the design of the materials and in the definition of the learning goal and objectives than previous models, still aim for the development of a relatively static final product and do not adequately address the need for a mechanism or procedure that allows for changing requirements and specifications after the development of the system has been completed. In fact, Dick (1996) observed, "when constructivist models are proceduralized, they look very much like traditional design models" (p. 62).

Therefore, given the limitations of these methods for building systems with continuously changing requirements, and the specific situation of our development project that called for such changes, we were faced with the need to create a new design model. In this article, we describe this model and the living-systems approach it is based on, and show how it was applied to build an integrated and evolving KM system for a large government organization and its affiliates. First, however, we will provide some information on the project that inspired the development of this new design model.
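Before turning to that project, the learner/author mechanism that motivates the living-systems view (users retrieve solutions as learners and contribute new ones as authors, so the system grows through use) can be sketched in a few lines. The class name, method names, and the TBP identifier below are all hypothetical:

```python
# Minimal sketch of the learner/author cycle in a KM system: users
# retrieve existing solutions and contribute new ones back, so the
# system's content grows through use. All names here are hypothetical.

class KnowledgeStore:
    def __init__(self):
        self.solutions = {}            # problem -> contributed solution

    def retrieve(self, problem):
        """Acting as a learner: look up an existing solution."""
        return self.solutions.get(problem)

    def contribute(self, problem, solution):
        """Acting as an author: add a newly constructed solution,
        making it available to later users."""
        self.solutions[problem] = solution

    def size(self):
        return len(self.solutions)

store = KnowledgeStore()
assert store.retrieve("apply TBP-101") is None   # nothing captured yet
store.contribute("apply TBP-101", "captured case example")
assert store.retrieve("apply TBP-101") is not None
```

The design point is that `contribute` is a first-class system requirement, not an administrative back door: the store is expected to change shape over its fielded lifetime.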

CASE EXAMPLE—THE PRODUCT REALIZATION PROCESS

Over the last several years, the large government organization for which the KM system was developed has streamlined its operations to make production more efficient in a variety of coordinated engineering, manufacturing, assembly, and management activities. In so doing, eight separate laboratories and plants around the United States have agreed to utilize the Product Realization Process with a common set of technical business practices (TBPs) that prescribe and guide operations. The goal of the project was to get a potential user community of 1,000 to 2,000 individuals at these eight sites to understand and apply the TBPs, related documents, and terminology to their projects.

The user community at these sites is an aging population, not unlike the rest of the organization. While these experienced users are highly knowledgeable about how business has been or should be conducted, others are being asked for the first time to subscribe to the common set of business practices. Employees who are experienced with the TBPs and approaching their retirement have a large amount of tacit knowledge about the TBPs that would be lost to their organization if it were not captured. In addition, newcomers to these eight facilities need to be provided with an orientation while, at the same time, getting a more complete picture of processes, procedures, and practices.

Initial efforts to address this problem with introductory, classroom-based training, created with input from subject-matter experts and distributed to representatives at the eight facilities for them to customize and deliver to their own staff, had not been sufficient in promoting the use of TBPs. Therefore, a new approach was taken: the development of a KM system that would provide access to the TBPs and related documents, support searching those documents, offer decision support to help users through the maze of TBPs, and deliver short, focused, just-in-time instruction on a variety of related topics.

A LIVING-SYSTEMS APPROACH FOR INSTRUCTIONAL DESIGN MODELS

The design and development of a system with changing requirements calls for a design model that can reveal the complex and changing requirements of the system and provide a means to control further development and maintenance costs. One possible approach to the development of such a design model is to view the resulting system as a living and adapting organism. That is, since growing and sharing knowledge is, by definition, an ongoing and self-modifying process, the goal is to design and build a system that is adaptable to its environment—a living system. This view of a system as a living entity falls under research that has been labeled autopoiesis theory. This concept was developed more than 30 years ago in biology through the work of Maturana and Varela (1980), enabling them to make a distinction between living and nonliving systems. Autopoiesis is Greek for self-production, and an autopoietic system is one that has within its own boundaries the mechanisms and processes that enable it to produce and reproduce itself. Applying this concept to human social systems such as those facilitated by a KM system, Luhmann (1986) described such a system as one that is both open and closed to the environment. As such, it is closed to information and knowledge, but open to data. The system constructs its own knowledge through the process of accommodating data from the environment, shaping and changing the very structure and nature of the system in the process (von Krogh, Roos, & Slocum, 1996).

In the case of the KM example, the requirements for the system are changing for a number of reasons. While the target audience currently consists primarily of experienced engineers, a wave of upcoming retirements, which will be followed by new hiring, leads us to expect a change in that population to include an increasing number of new, less experienced, and younger employees. At the same time, the information made available through the system is in a process of continuous updates and revisions. TBPs and their documentation are frequently modified according to new technical developments and the changing needs of the organization.

Aside from these externally generated causes for change, the design of the system's own capabilities and features will result in its change and growth. The developers of early KM systems and expert systems realized that it was difficult to get enough subject-matter experts to contribute their knowledge to the system, a difficulty that became known as the knowledge-acquisition bottleneck. A second, related challenge, often referred to as the knowledge-cliff problem, concerns the breadth of knowledge within a system. While the knowledge contributed to the system by experts may have considerable depth, it often has very narrow breadth, covering only a small area of expertise. Users who attempt to access knowledge outside of this narrow area fall off the knowledge cliff (Salisbury & Plass, 2001). A


KM system, therefore, needs to include capabilities that allow users to make use of a variety of methods and features of learning and instruction, including means of contributing their own materials to the site: Once the user constructs a solution, it can be added to the system. These features collectively provide a rich learning environment for users and allow them to avoid the knowledge-cliff problem. These capabilities also contribute to the changing and evolving content, which in turn will result in changing and evolving ways of learning. As a practical mechanism for the design of KM systems that can incorporate such change on different levels, we developed a design model with the following phases (see Figure 2):

• Analyze End-User Requirements. Define the initial end-user requirements, instructional need, target audience, goal, and objectives for the system.

• Design Instructional Information Architecture. Specify the content and functionality of the system as well as its structure, including a living-system functionality to accommodate system change and growth; define the navigation system.

• Develop Instructional Interaction Design. Design the human-computer interaction for each system feature, including instructional strategies, methods of learning and instruction, and navigation.

• Develop Instructional Information Design. Define the appearance of interface elements; specify the forms of representation of information in the different presentation modes and for different sensory modalities with a focus on the cognitive science foundations of learning from text, sound, animation, maps, graphs, and other multimedia information; develop methods of "wayfinding" for navigation.

• Implement System Design. Develop a prototype KM system based on the design of the capabilities specified in the information design, interaction design, and information architecture.

• Conduct Developmental Evaluation. Evaluate the product of each of the phases to determine if it attained the intended objectives for the current needs and requirements of the organization; evaluate usability and effectiveness of the features for the described objectives.

Below, we describe the purpose of each phase, the input required, steps performed, and output generated, and provide a theoretical background that informs the execution of the steps in the phase. We also discuss how each of these phases was realized in the KM case example by describing the most relevant aspects of the output of each phase. Because the focus of this article is on the description of the design model and not on a detailed description of the KM system we implemented, we will not provide an exhaustive description of the outcome of each phase here. More information about the KM system itself can be found in Salisbury and Plass (in press).
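The phases, with developmental evaluation applied to each phase's output, can be pictured as a simple pipeline. This sketch is our own illustration of the model's structure, not an implementation of it; the evaluation stub and the `perform_phase` callback are hypothetical:

```python
# Sketch: the design phases as a pipeline in which each phase's output
# is checked by a developmental evaluation before it feeds the next
# phase. Phase names follow the article; the stubs are hypothetical.

PHASES = [
    "Analyze End-User Requirements",
    "Design Instructional Information Architecture",
    "Develop Instructional Interaction Design",
    "Develop Instructional Information Design",
    "Implement System Design",
]

def developmental_evaluation(phase, output):
    # Stand-in check: did the phase produce any output at all?
    # A real evaluation compares the output against the organization's
    # current needs and the stated objectives.
    return bool(output)

def run_model(perform_phase):
    """perform_phase(name, prior_output) -> output for that phase."""
    artifact, log = None, []
    for phase in PHASES:
        artifact = perform_phase(phase, artifact)
        if developmental_evaluation(phase, artifact):
            log.append((phase, "ok"))
        else:
            log.append((phase, "revise"))   # loop back before moving on
    return artifact, log

artifact, log = run_model(lambda phase, prev: {"phase": phase, "based_on": prev})
```

Unlike a waterfall, the evaluation is attached to every phase rather than reserved for the end, which is what allows revision before later phases build on a flawed output.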

Analyze End-User Requirements

The purpose of this phase is the analysis of the initial end users' needs and requirements and the identification of the target audience, as well as the goals and objectives for the system. The analysis mostly follows the procedures specified in the other, more traditional ISD models described earlier. The input for this phase can be taken from interviews with stakeholders and performers in the organization the system is developed for, observation of the daily work performance of potential users, surveys, or document reviews. In many cases, a needs assessment will have to be conducted to identify what the exact requirements are and what approaches to a solution might be feasible.

Outcomes of this phase include (a) the decision whether a KM system can indeed meet the organization's needs and, if this is the case, (b) the initial end-user requirements, that is, the goals and objectives regarding the function and performance of the system, (c) the target audience that will use the system and its learner characteristics, and (d) a description of the minimum configuration of the delivery platform for the system, that is, the resources and constraints of the available computer hardware and software, as well as network connectivity for the delivery of the materials.
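Outcomes (a) through (d) of this phase can be collected in a single record that later phases consume and that is periodically reassessed. The field names and sample values below are hypothetical, chosen only to mirror the four outcomes:

```python
# Sketch: the outcomes (a)-(d) of the requirements analysis as one
# record for later phases. Field names and values are our own.

from dataclasses import dataclass, field

@dataclass
class RequirementsAnalysis:
    km_system_feasible: bool                # (a) can a KM system meet the need?
    goals_and_objectives: list = field(default_factory=list)   # (b)
    target_audience: dict = field(default_factory=dict)        # (c)
    delivery_platform: dict = field(default_factory=dict)      # (d) minimum configuration

analysis = RequirementsAnalysis(
    km_system_feasible=True,
    goals_and_objectives=["promote use of TBPs throughout the organization"],
    target_audience={"groups": ["members", "managers", "customers"]},
    delivery_platform={"client": "web browser", "network": "intranet"},
)
```

Treating these outcomes as a mutable record, rather than a one-time specification, matches the needs-reassessment idea: the record is reviewed and updated as the organization changes.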


Figure 2

Living-systems approach to the development of knowledge management systems

Note that the goals and objectives do not necessarily determine specific learning outcomes, as they usually do in traditional ISD models. They may instead describe the system features and functionality that will allow learners to construct knowledge according to their needs. Another issue in which this phase differs from other models is in the use of the outcome of the analysis. In many models, the results of the needs analysis are used as input for several other phases of the design process, but are not reassessed or updated after they are initially obtained. In our approach of treating the system we develop as constantly changing, we follow the idea of a needs reassessment outlined by Tessmer, McCann, and Ludvigsen (1999) by recognizing that key outcomes of this analysis have to be reviewed frequently and updated in order to provide valid input for the next phases.

In the example of the KM system, the assessment of the organization's needs was performed in a series of meetings with different members of the organization, in which we used structured interviews with some open-ended questions. We identified three different groups as the target audience: (a) members, (b) managers, and (c) customers. Members are the individuals employed in the organization who are directly responsible for correctly applying the TBPs on an everyday basis. Managers are supervisors of the members who apply the TBPs, but they do not use them as part of their own daily work. Managers are, however, responsible for the correct application of the TBPs by the members of the organization that they supervise. Customers take possession of the products that result from the application of the TBPs by the members of the organization. Customers have a vested interest in the correct application of the TBPs since it affects the quality of the product that they receive.

After the primary goal of the project was established—developing a system to promote the use of the TBPs throughout the organization—we identified secondary goals, which were (a) capturing knowledge from experienced members of the organization and (b) training new, inexperienced employees. Constraints of importance were other systems (e.g., document management systems) that were currently in use in the organization, as well as

legacy solutions that had to be integrated with the new system. The objectives we identified for this system included the availability of access to the TBP documents, decision support in applying the TBPs, just-in-time instruction in the use of TBPs, a mechanism to share case examples of the actual use of TBPs in the work environment, and forums for collaboration among users that help build a community of users.
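These objectives are what the information-architecture phase later maps onto concrete system features. A hypothetical sketch of such a mapping, using the capability names from Table 1 as feature labels (the objective keys are our own paraphrases, and the mapping itself is illustrative, not the project's actual design record):

```python
# Sketch: mapping identified objectives onto candidate system features,
# as done in the information-architecture phase. Objective keys are
# paraphrases; feature labels follow the Table 1 capability names.

objective_to_features = {
    "access to TBP documents": ["reference materials", "search"],
    "decision support in applying TBPs": ["decision support"],
    "just-in-time instruction": ["on-line instruction"],
    "share case examples": ["knowledge acquisition"],
    "community of users": ["communication tools"],
}

def features_for(objectives):
    """Collect the de-duplicated features needed to cover objectives,
    preserving the order in which objectives are considered."""
    needed = []
    for obj in objectives:
        for feat in objective_to_features.get(obj, []):
            if feat not in needed:
                needed.append(feat)
    return needed

feats = features_for(["access to TBP documents", "just-in-time instruction"])
```

One objective may require several features, and one feature may serve several objectives, which is why the mapping step precedes any commitment to a particular structure.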

Design Instructional Information Architecture

Based on the results of the analysis of the end-user requirements, the purpose of the information architecture phase is to define the content and functionality of the system and its general structure, as well as the navigation system (Rosenfeld & Morville, 1998). We use the term instructional information architecture for a conceptual design that emphasizes the instructional function of the system and that focuses on the facilitation of the users' learning processes. This implies that the design of the functionality and structure of the system is informed by appropriate learning theories and frameworks, and that the navigation supports the instructional function of the system.

Based on the learner characteristics, end-user requirements, and goals and objectives determined in the previous phase, the first step of this phase is to specify the functionality that will realize the objectives of the system. In this process of mapping system functionality onto objectives, one or more system features are chosen that can help achieve a particular objective within the resources and constraints of the culture of the organization and the characteristics of the learners. Next, the content is mapped onto the features, that is, the content covered by or contained within each feature is specified. In cases where the model is applied to the design of an environment that relies on content contributed by the users, the type, format, and structure of this content are defined. The content itself can be organized based on the most appropriate organization scheme (e.g., alphabetical, chronological, geographical, topical, or task-specific) and structure (e.g., hypertext,
hierarchical, or hybrid) (Rosenfeld & Morville, 1998; Rothwell & Kazanas, 1997).

The output of this phase is an information architecture flow chart that shows the major components of the system and how they are connected, and that provides initial specifications for their functionality. One way in which this phase differs from other ISD approaches is that the functionality of the system also includes features that allow for the accommodation of change and growth. These features include administrative tools, KM capabilities, and living-system capabilities (see Table 1).

Table 1  Examples of capabilities that allow for accommodation of change in the knowledge management system

Administrative Tools
• User-Management Tools
• Content-Management Tools
• Project-Management Tools

Knowledge Management Capabilities
• Knowledge Acquisition
• Decision Support
• On-line Instruction
• Communication Tools
• Reference Materials
• Search

Living System Capabilities
• Overall System Status and Growth
• Current Demand on System Resources
• Usage Statistics and Growth Rate for Features
• Tracking of Usage Patterns (Log File Information)

Administrative tools allow for the easy update and modification of user information and of certain periodically changing content that would otherwise require changes to the hypertext markup language (HTML) documents. They also include project-management tools for planning and communication between the organization and the developers. KM capabilities allow users not only to retrieve information from the system, but also to capture and store relevant information in the system, making it available to others. Living-systems capabilities
can be described as the digital nervous system of the KM system that provides real-time information, such as overall system status, usage statistics for features, demand on resources, and system growth. This information can be used for automatic, semiautomatic, or manual adjustments to the system to accommodate the changing requirements and system growth.

In our case example, the design of the information architecture is based on situated learning theory (Lave, 1988; Lave & Wenger, 1990; Wilson, 1993) and cognitive flexibility theory (Spiro & Jehng, 1990), which we integrated in a collaborative cognition model (Plass, Salisbury, & March, 2000) that describes the construction of knowledge on an individual level, team level, and organizational level. Using this model, we identified the categories of knowledge that each of the objectives included—factual, conceptual, procedural, and metacognitive—which are based on a revision of Bloom's taxonomy (Bloom, 1956) developed by Anderson et al. (1998).1 Factual knowledge concerns terminology, specific details, and elements of the technical business processes. Conceptual knowledge relates to theories, models, principles, and generalizations used in the TBPs. Procedural knowledge includes skills, algorithms, techniques, and other methods that are specific to a product or process within the TBPs. Finally, metacognitive knowledge, an addition to Bloom's (1956) taxonomy by Anderson et al. (1998), is knowledge about knowledge and involves general strategies for learning, thinking, and problem solving. Metacognitive knowledge also includes knowledge concerning the appropriate contexts and conditions for the use of the strategies themselves.

1 One of the major differences between the structure of this revised taxonomy (Anderson et al., 1998) and the original taxonomy is that knowledge was moved into a separate dimension that distinguishes between factual, conceptual, procedural, and metacognitive knowledge. The other categories of the original framework are grouped into a "Process Dimension" that describes the cognitive processes of the learner for retention and transfer. Another major change is that the revised taxonomy renames the categories from Bloom's original "knowledge, comprehension, application, analysis, synthesis, and evaluation" to "remember, understand, apply, analyze, evaluate, and create." In the revision, "create" is now the highest level of cognition; it describes individuals putting elements together to form a novel, coherent whole or making an original product.

We defined the functionality of the system by choosing system capabilities that were able to capture these different categories of knowledge, and defined the structure and organization of the system as task oriented. We also included KM features that allow for the accommodation of change and growth (Table 2). A checkmark in the intersection between the row of a capability and the column of an end-user category in Table 2 indicates that this capability is designed for that group of end users. We designed at least one capability for each of the groups in the target audience: members, managers, and customers:

• Knowledge acquisition capability includes two types of features that were designed to benefit mainly the members of the organization while being useful for the decision-making process of managers. One feature allows organizing and storing of case studies,2 that is, cases that illustrate, in depth, the correct application of a TBP, mainly created by subject-matter experts for use as examples during instruction. The best-practices feature enables the members of the organization to capture and share actual real-life case examples that illustrate solutions to problems of the correct application of TBPs in the field. With these two features, the knowledge-acquisition capability facilitates the capturing and sharing of factual and procedural knowledge as it is created by end users to solve unique problems—each solution is represented as a one-of-a-kind case. This knowledge contains the facts and steps that were used in solving their unique problems. Metacognitive knowledge is used in this feature to assist end users in finding a previous solution that will help them with their current problem.

2 The term case study is used here for an in-depth example of a concept or procedure, not as a reference to the qualitative method of inquiry.

ETR&D, Vol. 50, No. 1

Table 2  Functionality of the Knowledge Management system

Knowledge Acquisition: Case Studies; Real-Life Examples (Best Practices)
Decision Support: Decision Support by Subject; Decision Support by Activity
Instruction: Instructor Led (Synchronous); Online Tutorials (Asynchronous)
Communication: Points of Contact (e-mail); Threaded Discussion; Frequently Asked Questions; User Feedback
Reference Materials: Technical Business Practices (TBPs); Related Materials (Manuals, Documents)
Search: Full Text Search; Search Index

• Decision support capability consists of two types of features that were designed to benefit mostly the organizational members. The first decision-support tool provides guidance in finding where a particular subtopic is discussed in the TBPs. The second decision-support tool provides guidance for finding where a specific activity in the Product Realization Process is discussed in the TBPs. Both features facilitate knowledge transfer in the organization: Experienced members provide the expertise for the design of the decision-support capability, while members with less experience use the embedded expertise to find and complete the TBPs. Managers with less detailed knowledge of the TBPs may also benefit from use of the expertise in their supervisory role. Knowledge in the decision-support capability is the step-by-step information needed to complete a procedure from start to finish. The decision of which step to complete under what conditions is a form of conceptual knowledge. This means that the decision-support capability also embodies a high level of conceptual knowledge concerning the general application of the TBPs. This capability also utilizes metacognitive knowledge to assist end users in finding and applying this knowledge in a just-in-time manner.

• Instruction includes instructor-led classroom instruction and just-in-time on-line tutorials. Members of the organization can benefit from all the instruction that is available—instructor-led and on-line. One advantage of instructor-led instruction for members is that it provides an opportunity to ask questions concerning specific details about applying the TBPs. Managers and customers, in general, can meet their needs with on-line instruction that provides just-in-time tutorials for all levels of TBPs, from a general overview to the specifics of applying TBPs in a particular context, such as software application development. On-line instruction includes some factual, but mostly conceptual, knowledge. It is high-level knowledge concerning the general principles of a complex system. Hence, the on-line instruction does not focus on the intimate characteristics of the TBPs, but rather on the general application of the policies and procedures.

• Communication consists of three features: (a) a “points of contact” e-mail feature, allowing users to contact experts within a subject area; (b) a “threaded discussion” list that allows all users to participate in ongoing discussions of issues regarding TBPs that are of concern to them; and (c) an “FAQ” list, providing a means to collect frequently asked questions and responses to them. These communication features were designed to extend the methods available for end users of the KM system to capture and share knowledge by building a community of learners and users.

• On-line reference materials is a capability that provides access to on-line documentation describing the TBPs for all end users. These materials, containing mostly factual and procedural knowledge, describe the details and step-by-step knowledge needed to complete a procedure from beginning to end. The easy access and cross-referencing of related on-line materials will mainly benefit organizational members by helping them to ensure that the application of the TBPs is consistent with the other regulations they must follow.

• Search capability provides all users of the KM system with quick access to reference materials that relate to specific topics. It includes a full-text search in which end users can input keywords or phrases to the search engine and receive customized references to all occurrences that match the search criteria. The search results can, depending on the user’s request, be presented at different levels of “granularity,” such as the paragraph, section or subsection, or document level. In addition, a topical search, based on an index of terms, can be employed by users who are less familiar with the TBP terminology.
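The levels of granularity described for the search capability can be sketched as a small function over documents modeled as sections of paragraphs. This is a minimal illustration under assumed data shapes and names (searchTBPs, the section/paragraph structure), not the system's actual implementation.

```javascript
// Sketch of granularity-aware full-text search (illustrative only; the
// fielded system searched the actual TBP documents). A document is modeled
// as a title plus sections, each section holding an array of paragraphs.
function searchTBPs(documents, keyword, granularity) {
  const needle = keyword.toLowerCase();
  const hits = [];
  for (const doc of documents) {
    doc.sections.forEach(section => {
      section.paragraphs.forEach((text, p) => {
        if (!text.toLowerCase().includes(needle)) return;
        if (granularity === "paragraph") {
          hits.push({ doc: doc.title, section: section.title, paragraph: p });
        } else if (granularity === "section") {
          hits.push({ doc: doc.title, section: section.title });
        } else {
          hits.push({ doc: doc.title }); // document-level granularity
        }
      });
    });
  }
  // Coarser granularities produce duplicate references; collapse them.
  const seen = new Set();
  return hits.filter(h => {
    const key = JSON.stringify(h);
    return seen.has(key) ? false : (seen.add(key), true);
  });
}
```

At the paragraph level every match is reported separately; at the section or document level, repeated matches collapse into a single reference, mirroring the user-selectable granularity described above.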

Develop Instructional Interaction Design

The purpose of the interaction-design phase is to design the user interaction with the (visual) interface of the system capabilities and features, that is, to design the communication between the user and the system functionality. Based on the information architecture flow chart and the initial specifications for the system features provided in the instructional-information-architecture phase, in this phase the details of the functionality are defined. The steps to be performed in this phase include (a) a task analysis; (b) a cognitive task analysis; (c) the design of the steps of the human-computer interaction for each feature and the selection of interface elements that support them; and (d) some general screen layout decisions. They further include (e) the definition of the logic of the feature, that is, the way it processes and manipulates the user input; and (f) the type of system response that will be provided. This includes questions about the input required and how it is given, as well as about the output and its format. For example, should a set of radio buttons or a pull-down menu be used for making a selection? How will the system respond, especially in the case of an error? How can it be assured that meaningful output will be provided?

As in the previous phase, we use the term instructional interaction design to put special emphasis on the users’ learning processes and on the cognitive processes involved in the learning task. The goal of instructional interaction design is the design of instructional strategies that are appropriate for the target audience and the instructional content, and that take advantage of the unique characteristics of the delivery medium. This can be achieved, for example, by applying principles derived from cognitive load theory (Sweller, 1994; Sweller, Chandler, Tierney, & Cooper, 1990), which distinguishes between the intrinsic cognitive load imposed by the instructional material itself and the extraneous cognitive load imposed by the instructional activities, as well as principles derived from media attribute theory (Salomon, 1979), which provides a framework to describe the unique features and capabilities of a particular medium. This instructional focus also concerns the design of interactions with the system that reduce the cognitive overhead they themselves impose, allowing the users to allocate more of their cognitive resources to the processing of the instructional material. This may require a cognitive task analysis to help identify the cognitive demands of a task performed in the system (Lesgold, 2000; Schraagen, Chipman, & Shute, 2000).

The output of the interaction-design phase is detailed specifications of the way the interaction of the user with the system is performed. In many cases, these specifications are completed in the form of written descriptions along with

nonfunctional implementations in HTML that show the type and placement of the interface elements. As an alternative, storyboards can be used, in which the interaction design is visualized.

In the example of the KM system, the interaction design was where we defined the details of the functionality of the features included in the information architecture. For the decision-support feature, for example, we defined the user interactions and interface elements as shown in Figure 3a. For the topic listed at the top of the screen (Product Realization Process), subtopics are presented for the user to choose from. When the user moves the cursor over a subtopic, detailed information designed to assist the user in the decision-making process is presented in the lower part of the screen. When the user clicks on a subtopic, it becomes the new topic for which further subtopics are presented.

Figure 3a  Interaction design of the decision support feature of the sample knowledge management system

Figure 3b  Information design of the decision support feature of the sample knowledge management system
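The drill-down interaction just described (hover over a subtopic to preview its guidance, click to make it the new topic) can be sketched as operations on a topic tree. The function name, topic names, and data shape below are illustrative assumptions, not the system's actual content or code.

```javascript
// Sketch of the decision-support drill-down logic (illustrative; the real
// feature was implemented as Web pages over the TBP content). Each topic
// node has a name, guidance text, and optional subtopics.
function makeTopicBrowser(root) {
  let current = root;
  const find = name => (current.subtopics || []).find(t => t.name === name);
  return {
    // Topic currently shown at the top of the screen.
    topic: () => current.name,
    // Subtopics offered as choices for the current topic.
    subtopics: () => (current.subtopics || []).map(t => t.name),
    // Hovering over a subtopic shows its guidance in the lower screen area.
    hover: name => { const t = find(name); return t ? t.guidance : ""; },
    // Clicking a subtopic makes it the new topic for further drill-down.
    click: name => { const t = find(name); if (t) current = t; return current.name; },
  };
}
```

Separating this navigation logic from its visual presentation mirrors the model's own split between the interaction design (Figure 3a) and the information design (Figure 3b) of the same feature.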

Develop Instructional Information Design

The purpose of the information-design phase is to specify the appearance of the interface elements defined in the interaction design and to define the representation mode of the information presented on the screen, for example, as text, picture, video, animation, or sound. Information design also includes the color palette used for the visual interface, color coding, background images, typefaces and their forms (attributes) used for on-screen text, and the appearance of system controls such as buttons, scroll bars, and navigational elements.

Appearance goes beyond the “look and feel” of the visual interface, however. Based on the input from all three previous phases, the information design also includes design decisions that will have an impact on the users’ learning processes (Plass, 1998). Such decisions concern the presentation mode and sensory modality for a particular piece of information and the combination of information in different presentation modes. We use the term presentation mode for the format used to represent instruction (e.g., text or pictures), and the term sensory modality to describe the sensory channel used to perceive the information (e.g., auditory, visual, tactile).

For example, the information designer might have to decide whether an instructional text should be presented visually as text printed on the screen or in auditory format as narration. If instructional animation is to be included, text labels could be added in the animation or separately in a legend. Visual information could be presented as a photographic image, line drawing, or video. The combination of visual and verbal information raises a number of questions as to their cognitive processing as well: Some forms of combining visual and verbal information have been shown to result in improved learning, while other forms have negative effects caused by the high cognitive load imposed by the materials (Mayer, 2001; Plass, Chun, Mayer, & Leutner, 1998; Sweller, 1994). We refer to information design that takes into account such considerations of the cognitive impact of the choice of the presentation mode of information as instructional information design.

The information-design phase is conducted based on the needs, requirements, and learner characteristics identified in the analysis phase; the specifications of the content and features, as well as the structure of the system, defined in the information architecture; and the specification of their functioning described in the interaction phase. Information design is informed by the cognitive science principles of learning from text and pictures (Mandl & Levin, 1989), from maps (Schnotz & Kulhavy, 1994), and from multimedia (Mayer, 2001). To define the visual appearance of the navigation system, information designers employ methods of wayfinding, that is, strategies and design elements that assist users in navigating the system and in orienting themselves within it and in relation to the desired location or function (Passini, 1999). The output of this phase is the specification of the appearance of these features, either in the form of nonfunctional HTML pages or as storyboards.
In the example of the KM system, the information design had to be performed under the organization’s strict constraints on acceptable browser plug-ins and streaming media. Because of the settings of the existing firewall, it was not possible to use any streaming media or plug-in-based components such as Macromedia Flash™ or Director™. The information designer had to work within these constraints to design the visual appearance of the Web pages. Figure 3b shows the visual design of the decision-support tool. In comparison to the interaction design in Figure 3a, the information design in Figure 3b shows how color was applied and how the information was organized.

Implement System Design

Based on the input from the previous phases, the purpose of the implementation phase is to produce a prototype of the KM system. Ideally, the development and programming tools used for the implementation of the system are chosen in this phase based on the features that need to be implemented. In reality, however, these tools are often specified as part of the initial requirements of the organization. Often referred to as the production phase, this phase includes the production of graphics and other media elements such as sound and video, the scripting and programming, and the design and implementation of the database backend, a set of tables in a database management system (DBMS) that comprise the content of the system.

This separation of content and code enables the technical implementation of the features and capabilities that allow the system to grow and accommodate change: Users can add their own examples, case studies, and solutions to the existing knowledge base. Authors can use forms to add, delete, or change information in the DBMS tables without taking the system offline and without making modifications to the source code. The system administrator can use an analysis of the number of new records in specific tables to gauge the rate of growth for each feature. These instruments to measure growth in the various components of the system, combined with tools that analyze the user log files, constitute the digital nervous system of the KM system.

The output of this phase is a functioning prototype of the KM system that includes all of the features specified in the information architecture. The reason the system is classified as a prototype, however, is that some features may not be fully implemented, the need for user testing and evaluation remains, and most features do not yet contain extensive amounts of captured knowledge. The evaluation of the prototype will be the content of the next phase; the capturing of knowledge into the system will be done by the users of the system once it is deployed.

In our example, we used our conceptual system design—the information architecture—as well as the interaction design and information design to develop a prototype KM system. The system was implemented as a Web application using HTML, the JavaScript scripting language, and the Macromedia ColdFusion Web application architecture, with a backend implemented using the Microsoft SQL Server DBMS.
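The growth gauge described above, counting new records in specific tables to measure each feature's rate of growth, can be sketched as follows. The table names and record shape are assumptions for illustration; the actual system queried its SQL Server backend directly.

```javascript
// Sketch of the growth gauge in the KM system's "digital nervous system"
// (illustrative). Each feature table is an array of records whose `created`
// field is an ISO date string, so dates compare lexicographically.
function growthSince(tables, cutoff) {
  const growth = {};
  for (const [feature, records] of Object.entries(tables)) {
    // Count records added on or after the cutoff date for this feature.
    growth[feature] = records.filter(r => r.created >= cutoff).length;
  }
  return growth;
}
```

Run periodically (say, monthly), such counts show which features the user community is actively growing and which lie fallow, informing the adjustments described in the evaluation phase.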

Conduct Developmental Evaluation

The purpose of the evaluation phase is to assess whether the features and functionality of the product meet the stated objectives. One level of evaluation is performed on the output of each phase before the next phase can begin. After one design cycle has been completed and all phases have been realized, an evaluation of the resulting product is conducted before the next development cycle can begin. This ongoing evaluation is at the heart of the system’s capability to accommodate changes in user needs and requirements.

Tessmer, McCann, and Ludvigsen (1999) described how a needs reassessment, conducted after training has been delivered, makes it possible to determine whether the training need was met or whether the instruction resulted in overtraining or undertraining. Following the same spirit as the needs reassessment, this phase of the design model is concerned with assessing whether the needs of the organization and of individual users have been underserved or overserved, in order to allow the developers to modify the system accordingly. The nature of the resulting evaluation is therefore both formative and developmental. The ongoing process of monitoring whether user needs are being met, overserved, or underserved gives the evaluation a developmental character, since it is seen as an essential part of the design process without which a successful product could not be achieved and maintained. Developmental evaluation is used here to describe the long-term partnering relationships with the organization for the purpose of collaboratively developing and maintaining the system (Patton, 1994).

In order to identify changing needs of the organization in a timely manner, a formative evaluation of each phase is necessary before the next phase can be approached, which is visualized using dotted lines in Figure 2. This evaluation should always include a reassessment of the requirements of the users and the organization. In the first phase of our model, Analyze End-User Requirements, the results of the analysis need to be evaluated. Methods for this type of evaluation include the use of focus groups or an expert review of the goals, objectives, and target audience specifications by stakeholders in the organization. The living-system features of the KM system provide another method for identifying changes in these areas. When users are required to complete a one-time log-in procedure that asks for demographic information, such as their level of expertise with the subject matter and in their job, changes in the user demographics can be identified. Beyond that, the analysis of the functionality used by members of a particular group within the target audience provides initial information regarding the kinds of needs that the members of this group have. If combined with surveys and interviews, this method allows for a timely accommodation of changes in the target audience or in the needs of certain groups of users.

The purpose of the evaluation of the output of the next phase, Design Instructional Information Architecture, is to validate that the features and capabilities included in the information architecture, and their content and structure, will indeed be able to meet the requirements specified in the analysis phase.
As mentioned above, for this evaluation to be meaningful, a reassessment of these requirements has to be performed first. The evaluation of this high-level conceptual design is not aimed at achieving certainty that any of the features in the information architecture will indeed be effective in the desired way. What is more important in this phase is to verify that the instructional theories and frameworks, as well as the epistemic and philosophical beliefs, that informed the design of the features and their structure are appropriate for the organization and its culture, and that the features have the potential to sufficiently support the intended type and format of learning. In addition, the appropriateness of the navigation system is assessed. Therefore, an evaluation of the information architecture might include a review by experts of the mapping of features onto objectives and content onto features, as well as of the navigation, and the approval of the features by the client. The evaluation of the effectiveness of the functionality, structure, and navigation can only be performed once a prototype has been implemented.

The Develop Instructional Interaction Design phase and the Develop Instructional Information Design phase can also best be evaluated once a prototype is available. However, after each phase is completed, the interaction design specifications and information design specifications can be evaluated in terms of how the design was informed by the appropriate theoretical foundation. In other words, an expert review can reveal whether the required conditions for the application of a particular design principle were met and whether this design principle or theory was applied in an appropriate way. Another focus of this evaluation is whether the chosen instructional strategies, and the way they are designed, have the potential to support learners’ cognitive processes and facilitate the intended type of learning. Again, this is no assurance that the interactions and appearance of the features designed are effective, but it does allow for the elimination of a first level of potential problems.
A promising method for this review of the application of design principles or theories is the use of a pattern language that articulates and communicates the design of the entire system in a coherent, formal way (Alexander, Ishikawa, & Silverstein, 1977; Tidwell, 1999). The units of such a language are design rules, or patterns, that capture the solutions to specific issues or problems in the design process in a particular context, and are therefore neither too abstract nor too specific (Tidwell, 1999). Applied to the design process of a complex system such as a KM system, the specification of these patterns and their use in the design of the various features of the system can assist a team of designers in agreeing on a common and coherent way of communicating with the user, and in evaluating whether the appropriate pattern was applied.

Once one cycle of the design process has been completed and a prototype of the system is implemented, the usability and learning outcome of the resulting system are evaluated. Usability testing includes the assessment of the actual ease of use of the system and of learning with it, which is evaluated using criteria such as the time needed to learn specific functions of the system, user retention of these commands over time, the speed of task performance, the error rate in such performance, and the number of clicks or steps for the task performance (Shneiderman, 1992). It also includes user acceptance testing that assesses the perceived usefulness of the system and of particular features, the perceived ease of navigation, the appropriateness of the terminology used, and the consistency and match of the features with user needs and tasks. Methods of inquiry used for usability assessment include surveys and interviews, but also think-aloud protocols, walk-through techniques, field observations, videotaping, and the recording of log files of user actions. Applied to our model, usability evaluation means the evaluation of specific design decisions made in the information architecture, interaction design, and information design.

The evaluation of the learning outcome can be conducted on any of Kirkpatrick’s (1994) four levels of evaluation. On Level I, the reaction of the users and how they perceive the usefulness and value of the system is assessed, which corresponds to a usability assessment specific to learning.
While it serves only a limited function in actually assessing learning and in determining the effectiveness of the system, this level is used most often for formative evaluation purposes. Level II is concerned with the learning outcome and instructional effectiveness of the system. If the system has well-defined instructional objectives, this effectiveness is measured using the criterion-referenced test items that were developed in the design phase of more traditional ISD models. For systems that do not have such well-defined instructional objectives, alternative methods for the assessment of the learning outcome have to be considered. These could include interviews or observations of the performance of users while they are using the system, and the analysis of journal entries, projects, and portfolios. In many cases, however, the absence of well-defined learning objectives will result in the use of a Level III evaluation, which measures transfer of learning. In other words, the application of the knowledge gained from the system in the users’ daily work performance is measured, for example, using observations, interviews, surveys, or the analysis of work results. The most difficult outcome to measure is the impact of the system on the organization, which is described in Level IV. On this level, the impact of the system as a change agent is assessed, and the resulting impact on the organizational level is described (Preskill & Torres, 1999).

In the example of the KM system, we performed evaluations of each of the phases of the design process. We especially focused on the reassessment of the user needs and requirements before we began work in a new phase. This reassessment was performed on a formal level by inviting a group of “power users,” experts in the Product Realization Process and the TBPs, to a focus group meeting in which the results of our needs assessment and a draft of the proposed information architecture were discussed. Later, we conducted the reassessment on a more informal level by involving subject-matter experts, potential users, and instructors from the training department in the process of design and assessment. This combination of participatory design and formative evaluation has been referred to as developmental evaluation (Patton, 1994).
This approach of using a developmental evaluation to compare the features and functionality of the implemented system with the current end-user requirements, by closing the circle from Implement System Design to Analyze End-User Requirements (Figure 2), gives the model an inherent capability to accommodate change and growth. If the requirements of the end users have changed, modifications to the system can be made automatically (e.g., scheduled automatic updates to frequently changing information), semiautomatically (e.g., updates of information based on administrative tools), or manually (e.g., the development of new features based on new needs and requirements).

The ongoing evaluation is based on a multilevel approach to data collection: The living-system capabilities allow us to analyze the status and use of each feature in the system at any time, for particular (unidentified) users and across groups of users. On-line surveys, conducted periodically, provide us with information about specific emerging areas of interest. Field visits are conducted by members of the organization, in which they follow up on trends detected in the surveys and living-system features. Finally, we receive numerous unsolicited responses from users through the e-mail feedback feature in the communication section. Together, the information collected on these different levels provides us with a powerful tool for continuously reassessing user needs.
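The analysis of feature use across groups of users could be sketched as a simple aggregation of user-action log entries by feature and end-user group. The log-entry fields below are assumptions for illustration; the system's actual log format is not documented here.

```javascript
// Sketch of per-feature, per-group usage analysis from user-action logs
// (illustrative field names). Returns counts such as
// { "Search": { members: 2, managers: 1 } }.
function usageByFeatureAndGroup(logEntries) {
  const counts = {};
  for (const { feature, group } of logEntries) {
    counts[feature] = counts[feature] || {};
    counts[feature][group] = (counts[feature][group] || 0) + 1;
  }
  return counts;
}
```

Because the tally is keyed by user group rather than by individual, it supports the kind of anonymous, group-level trend detection described above.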

Evaluation of the Case Example KM System

After the prototype KM system was implemented, we conducted a first evaluation of the system capabilities and features. The purpose of this evaluation was to obtain feedback regarding the perceived usability of the system and its perceived usefulness, that is, to evaluate user acceptance of the system. Similar to the method used in iterative prototyping, we implemented some of the features with limited depth (e.g., on-line tutorials) and others with limited breadth (e.g., decision-support features). We were therefore interested in receiving feedback from potential users as to which features should have priority in the implementation of the system. Once the findings of this evaluation are incorporated into the system design, and once the amount of content included in the system has reached a threshold that makes learning experiences with the KM system meaningful, the effectiveness of the features for learning can be evaluated.

For the evaluation of perceived usefulness and usability, we developed a questionnaire with items soliciting demographic information and items in which users rated the importance of the system’s features for their work and the accuracy of the information in the system, and chose which features of the site they used most and which they used least. We posted a link to the survey on the main page of the KM system and, by e-mail, asked all users of the system to participate in the survey. After a three-week period, 49 of the estimated 150 users across the participating facilities had participated in the survey, a response rate of about 33%.

We found that, of the participants in the survey, 45% were engineers, 31% managers, 14% support staff, and 10% were classified as other. Of the respondents, 31% rated themselves as being highly experienced or experts in TBPs, 51% had medium experience, and 18% low or no experience; 33% had been working for the government organization for more than 25 years, 39% for 16–25 years, 16% for 6–15 years, and only 12% for 5 years or less. In other words, the respondents were mainly seasoned engineers and managers with many years of experience.

On a scale from 1 (lowest) to 5 (highest), the means of the responses regarding the ease of use and accuracy of the information of each feature were consistently above 4.0, with the exception of the on-line training tutorials, which had a mean of M = 3.97 (SD = .71) (see Appendix A). On the same scale, the responses regarding the quality of the overall design of the Web site had a mean of M = 4.35 (SD = 1.01), the ease of navigating the site a mean of M = 4.12 (SD = 1.01), and the likelihood that users would use the site to find needed information a mean of M = 4.37 (SD = 1.07).
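The reported figures follow from simple formulas. The helper below recomputes the response rate and shows a mean/SD computation with illustrative ratings; the raw survey data are not reproduced here, and the study's exact SD convention (sample vs. population) is not stated, so the population form is used as an assumption.

```javascript
// Recomputing the reported survey arithmetic (sketch). responseRate gives
// a percentage rounded to the nearest whole number; meanAndSD computes the
// mean and population standard deviation of a set of 1-to-5 ratings.
function responseRate(responses, population) {
  return Math.round((responses / population) * 100);
}
function meanAndSD(ratings) {
  const n = ratings.length;
  const mean = ratings.reduce((a, b) => a + b, 0) / n;
  const variance = ratings.reduce((a, r) => a + (r - mean) ** 2, 0) / n;
  return { mean, sd: Math.sqrt(variance) };
}
```

With 49 respondents out of an estimated 150 users, responseRate(49, 150) returns 33, matching the reported rate of about 33%.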
Aside from this positive feedback regarding the acceptance of the system and its features, the most important result of the evaluation was that we were able to determine a rank order of the importance of the features for the users by asking which of the features they used most, second, third, and last (see Appendix B). As the responses show, access to the TBP documents, Decision Support by Subject, and the Search function, along with the Dictionaries, were ranked highest. Tutorials, Decision Support by Activity, and Case Studies were among the lowest ranks. Considering the high level of experience of the current users, it is not surprising that tutorials or case studies would rank very low and that access to the documents and search tools would rank very high. Of particular interest, however, was the clear preference for one type of decision support, organized by topic, over another, organized by the task the user has to complete, even though both features received very similar responses with regard to their accuracy, ease of use, and general importance (Appendix A). These results show that a frequent reassessment of the user population is necessary to ensure that changes in the required functionality, which can be expected, for example, as a result of new hires who are less experienced in the subject matter, can be detected and accommodated.

The results of this first evaluation of the user acceptance of the system provided us with initial feedback on the accuracy, ease of use, and importance of the features for the respondents of the survey. Interviews with managers and engineers in the organization indicated that the distribution of the respondents to the survey was a fairly accurate reflection of the current user population of the system, and that the capabilities that had received the highest ranks (Document Access, Decision Support by Subject, Search Engine, Dictionaries, and Real-life Examples) were indeed a priority for the organization and should be included in the fielded system. Other features, such as Decision Support by Activity and Case Studies, were given a lower priority. Based on these priorities, we were able to allocate more resources to the features ranking very high, and fewer to those that were ranked lowest.
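The rank order reported here was derived from a simple weighted-score scheme (described in the note to Appendix B: 10 points for a feature chosen as used most, 9 for second, 8 for third, and -10 for least). The sketch below illustrates that aggregation with hypothetical responses and shortened feature names; it illustrates the scoring rule only and is not the survey software itself.

```python
from collections import defaultdict

# Scoring scheme from the note to Appendix B.
WEIGHTS = {"most": 10, "second": 9, "third": 8, "least": -10}

# Hypothetical responses: each respondent names one feature per slot.
responses = [
    {"most": "Documents", "second": "Search", "third": "Dictionaries", "least": "Tutorials"},
    {"most": "Documents", "second": "Dictionaries", "third": "Search", "least": "Case Studies"},
    {"most": "Search", "second": "Documents", "third": "Dictionaries", "least": "Tutorials"},
]

def rank_features(responses):
    """Aggregate weighted scores across respondents; return features best-first."""
    scores = defaultdict(int)
    for answer in responses:
        for slot, feature in answer.items():
            scores[feature] += WEIGHTS[slot]
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

for feature, score in rank_features(responses):
    print(f"{feature}: {score}")
```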

DISCUSSION AND CONCLUSIONS

While all of the existing methods for the development of CBI described earlier have been successfully used to develop applications through the phases of analysis, design, development, implementation, and evaluation, they were not conceptualized for building systems with continuously changing requirements, such as changes in the characteristics of the target audience; in the amount, content, type, and structure of the information; and in the forms and methods of learning and instruction used in the system. In this article, we therefore describe a new design model that provides explicit mechanisms for designing KM systems that can accommodate growth and change by applying a living-systems approach.

The ability to grow and change and, therefore, to adapt to the changing requirements of its users is the most important characteristic of the systems for which this approach was developed. In our opinion, this capability has fundamental consequences for the design and development of systems that take advantage of it. While most other systems may reach a point where they are considered to be a finished product, these systems are, by design, never completed. The consequence of such a constant state of change is that the development process is never completed either. The evaluation of such a system is therefore always of a formative nature, and since it involves ongoing collaboration with the organization that owns the system, it can also be considered a developmental evaluation (Patton, 1994). A comparison of this cycle of feedback and accommodation in the KM system we designed with the corresponding cycle in an autopoietic system shows that both contain within their own boundaries the mechanisms and processes that enable them to produce and reproduce themselves, which is characteristic of living systems.

The design process we employed is cyclic in nature, reflecting the philosophy of accepting change as a factor in the definition of the KM system. While the execution of the first cycle of the design involves the initial analysis and assessment of need, design of the information architecture, interaction design, information design, and system implementation, the next design cycle can take advantage of the living-systems features implemented in the system.
In other words, much of the information that had to be collected in the needs assessment using interviews, document reviews, and observation of performers can now be captured using the digital nervous system of the KM system, which collects information on the use of the system and its features and gauges its growth. The information obtained from the digital nervous system, combined with other sources of information (surveys, field visits, and e-mail feedback from users), results in a multilevel approach to formative evaluation in which data collection and evaluation are ongoing. Ethical issues involved in this data collection do not differ from those in the collection of any other research data and therefore require measures to keep the information confidential.

The living-systems approach of the design model is realized in the form of digital nervous system functionality that monitors and records changes in the system; a developmental evaluation that continuously assesses whether the needs and requirements of the users have changed, using the data from the digital nervous system and from other means of data collection; and a design model that explicitly takes advantage of this information in order to modify the resulting KM system.

A number of unresolved issues remain for the KM system that was implemented based on the design model described in this article. A first issue is that more research has to be conducted on the instructional components of the system. For example, the living-systems approach is designed to provide more explicit support for facilitating situated learning (Brown, Collins, & Duguid, 1989) than is found in traditional ISD approaches. The KM system described in this article supports learning at the individual and team levels from constructivist (Duffy & Jonassen, 1991) and cognitive flexibility (Spiro & Jehng, 1990) perspectives by allowing learners to construct and share their own meaning in an ill-structured domain through the use of minicases (Plass, Salisbury, & March, 2000). It is also designed to support learning at the organizational level (Nonaka & Takeuchi, 1995). However, research is needed to evaluate the effectiveness of this design method in supporting individual, team, and organizational learning.
For our example KM system, an evaluation of the effectiveness of the features for learning would require that the system be in use for a sufficient period of time, so that enough information has been captured to allow meaningful learning to take place.
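The data capture that the digital nervous system performs (recording which features are used, and by whom) can be illustrated with a minimal sketch. The class, event fields, and aggregation below are our own illustrative choices; the article does not specify the implementation.

```python
import json
import time

class UsageLog:
    """Minimal sketch of the digital nervous system's usage capture."""

    def __init__(self):
        self.events = []

    def record(self, user_role, feature):
        """Append one feature-access event with a timestamp."""
        self.events.append({"ts": time.time(), "role": user_role, "feature": feature})

    def counts_by_feature(self):
        """Per-feature access counts: raw material for the next evaluation cycle."""
        counts = {}
        for event in self.events:
            counts[event["feature"]] = counts.get(event["feature"], 0) + 1
        return counts

log = UsageLog()
log.record("engineer", "Search")
log.record("engineer", "Documents")
log.record("manager", "Search")
print(json.dumps(log.counts_by_feature(), indent=2))
```

In a later design cycle, such counts would supplement, not replace, surveys and field visits as input to the developmental evaluation.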

Another issue concerns the cultural setting in which the system will be used. The effectiveness of a KM system that is based on the idea of users sharing their knowledge depends on the readiness of the organization to change and to become a learning organization. A KM system can act as a change agent, supporting the process of transforming a hierarchically structured organization into a learning organization. If the organization is not ready for this change, however, even the best KM system is bound to fail. The analyze-end-user-requirements phase should, therefore, include an assessment of whether the culture of the organization supports the use of a KM system.

The reliance of the system on user contribution of knowledge poses another challenge. One issue is the problem of motivating users to share their knowledge and make it available to others. Because of the additional effort involved in contributing information to the KM system, meaningful incentives are required to get learners to share their knowledge with others. Some organizations assess how often the knowledge contributed by a particular employee was accessed by others and determine incentives based on this impact factor. A related issue concerns quality control of the information that is being contributed. Methods for such quality control include review by expert panels or rating by peers; these often cannot easily be scaled to larger systems and therefore require new mechanisms.

Several issues also remain to be solved to make the design model described in this article more easily applicable to other projects. The model was developed for the specific context of the design and development of KM systems.
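The impact-factor incentive mentioned above (crediting an employee for accesses of his or her contributions by other users) amounts to a small aggregation over an access log. The names and log below are hypothetical; the article does not describe a specific implementation.

```python
# Hypothetical access log: (contribution author, user who accessed it).
access_log = [
    ("alice", "bob"),
    ("alice", "carol"),
    ("alice", "alice"),  # self-access should earn no credit
    ("bob", "alice"),
]

def impact_factors(log):
    """Count accesses of each author's contributions by other users."""
    impact = {}
    for author, reader in log:
        if reader != author:  # only credit access by users other than the author
            impact[author] = impact.get(author, 0) + 1
    return impact

print(impact_factors(access_log))
```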
Following the conceptual framework for the comparison of ISD models described by Edmonds, Branch, and Mukherjee (1994), our model can be classified as a descriptive and prescriptive model (orientation) with procedural and declarative knowledge structures that require an expert level of experience of the designer. The theoretical origin of the model is a living-systems approach, and it is conceptualized for the design of KM systems for business training and government (instructional context) at the institutional and team level (level of implementation).



ETR&D, Vol. 50, No. 1

Although its general phases should be applicable to the development of other instructional systems of a similar nature, we have not yet used this model for such a development. To assure a clear and coherent communication of the system's features and functionality to users, future versions of our model would benefit from adding the use of a pattern language to the design process (Alexander et al., 1977; Tidwell, 1999). As part of the information architecture, interaction design, and information design phases, the patterns of this language would be specified as design rules that capture the solutions to specific issues or problems in the design process of the project. Examples of patterns could describe the implementation of specific instructional strategies, the type of interaction with the decision support feature, or the use of the search tool. Specifying patterns that are neither too specific nor too general assures that they can be applied to similar design decisions in different parts of the program.

Hannafin (1992) laments the insulation of ISD from innovative design methods and models used in other fields and calls for the design of instruction that gives individuals a "greater role in regulating, not merely participating" in the learning process (p. 61). The living-systems approach we described in this article aims to support the development of environments that not only allow individuals to regulate their learning process, but that indeed grow and change in order to accommodate learners' needs.

Jan L. Plass is with the Educational Communication and Technology Program, New York University. Mark W. Salisbury is with the Organizational Learning and Instructional Technologies Program, University of New Mexico. All trademarks used are properties of their respective owners. Correspondence concerning this article should be addressed to Jan L. Plass at [email protected] or by mail to New York University, ECT Program, 239 Greene St., New York, NY 10003, or to Mark Salisbury at [email protected] or by mail to Educational Office Building, OLIT, University of New Mexico, Albuquerque, NM 87131.



REFERENCES

Alessi, S., & Trollip, S. (1991). Computer-based instruction: Methods and development (2nd ed.). Englewood Cliffs, NJ: Prentice-Hall.
Alexander, C., Ishikawa, S., & Silverstein, M. (1977). A pattern language: Towns, buildings, construction. New York: Oxford University Press.
Anderson, L.W., Krathwohl, D.R., Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J.D., & Wittrock, M.C. (1998). Taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
Banathy, B. (1991). Educational systems design: A journey to create the future. Englewood Cliffs, NJ: Educational Technology Publications.
Bloom, B.S. (Ed.). (1956). Taxonomy of educational objectives: Handbook I: Cognitive domain. New York: David McKay.
Boar, B. (1984). Application prototyping. New York: John Wiley & Sons.
Boyle, T. (1997). Design for multimedia learning. London: Prentice Hall.
Brown, J.S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.
Carr, A. (1997). User-design in the creation of human learning systems. Educational Technology Research and Development, 45(3), 5–22.
Coad, P., & Yourdon, E. (1991). Object-oriented design. Englewood Cliffs, NJ: Yourdon Press.
de Hoog, R. (1994). Constraint-driven software design: An escape from the waterfall model. Performance Improvement Quarterly, 7(3), 48–63.
Dick, W. (1996). The Dick and Carey model: Will it survive the decade? Educational Technology Research and Development, 44(3), 55–63.
Dick, W., & Carey, L. (1990). The systematic design of instruction (3rd ed.). Glenview, IL: Harper-Collins.
Duffy, T., & Jonassen, D. (1991). Constructivism: New implications for educational technology? Educational Technology, 31(5), 7–12.
Dynamic systems development method, Version 2. (1995). Tesseract Publishing.
Edmonds, G.S., Branch, R.C., & Mukherjee, P. (1994). A conceptual framework for comparing instructional design models. Educational Technology Research and Development, 42(4), 55–72.
Fenrich, P. (1997). Practical guidelines for creating instructional multimedia applications. Fort Worth, TX: The Dryden Press.
Gagné, R.M., & Briggs, L.J. (1979). Principles of instructional design. New York: Holt, Rinehart and Winston.
Hannafin, M. (1992). Emerging technologies, ISD, and learning environments: Critical perspectives. Educational Technology Research and Development, 40(1), 49–63.
Jones, T.S., & Richey, R.C. (2000). Rapid prototyping methodology in action: A developmental study. Educational Technology Research and Development, 48(2), 63–80.
Kirkpatrick, D. (1994). Evaluating training programs. San Francisco, CA: Berrett-Koehler.
Koper, R. (1995). PROFIL: A method for the development of multimedia. British Journal of Educational Technology, 26(2), 94–108.
Lave, J. (1988). Cognition in practice: Mind, mathematics, and culture in everyday life. Cambridge, UK: Cambridge University Press.
Lave, J., & Wenger, E. (1990). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.
Lesgold, A. (2000). On the future of cognitive task analysis. In J.M. Schraagen & S.F. Chipman (Eds.), Cognitive task analysis (pp. 451–465). Mahwah, NJ: Lawrence Erlbaum.
Luhmann, N. (1986). The autopoiesis of social systems. In F. Geyer & J. Van de Zouwen (Eds.), Sociocybernetic paradoxes: Observation, control, and evolution of self-steering systems. London: Sage.
Mandl, H., & Levin, J.R. (Eds.). (1989). Knowledge acquisition from text and pictures. New York: North-Holland.
Maturana, H., & Varela, F. (1980). Autopoiesis and cognition. London: Reidel.
Mayer, R.E. (2001). Multimedia learning. New York: Cambridge University Press.
Moody, S., Hudson, T., & Salisbury, M. (1988). RAPID: A prototyping environment for battle management information systems. Proceedings of the Third Annual User-System Interface Conference, Austin, TX.
Nonaka, I., & Takeuchi, H. (1995). The knowledge-creating company. New York: Oxford University Press.
Passini, R. (1999). Information design: An old hag in fashionable clothes? In R. Jacobson (Ed.), Information design (pp. 83–98). Cambridge, MA: MIT Press.
Patton, M.Q. (1994). Developmental evaluation. Special invitational volume: Past, present, future assessments of the field of evaluation. Evaluation Practice, 15(3), 311–319.
Plass, J.L. (1998). Design and evaluation of the user interface of foreign language multimedia software: A cognitive approach. Language Learning and Technology, 2(1), 35–45.
Plass, J.L., Chun, D.M., Mayer, R.E., & Leutner, D. (1998). Supporting visual and verbal learning preferences in a second language multimedia learning environment. Journal of Educational Psychology, 90, 25–36.
Plass, J.L., Salisbury, M.W., & March, J. (2000). Integrated instruction as a component of a knowledge management system: A case example. Proceedings of WebNet '00. Charlottesville, VA: Association for the Advancement of Computing in Education (AACE).
Preskill, H., & Torres, R.T. (1999). Evaluative inquiry for learning in organizations. Thousand Oaks, CA: Sage.
Reigeluth, C. (1993). Principles of educational systems design. International Journal of Educational Research, 19(2), 117–131.
Roblyer, M. (1988). Fundamental problems and principles of designing effective courseware. In D. Jonassen (Ed.), Instructional designs for microcomputer courseware (pp. 7–33). Hillsdale, NJ: Lawrence Erlbaum.
Rosenfeld, L., & Morville, P. (1998). Information architecture for the World Wide Web. Sebastopol, CA: O'Reilly.
Rothwell, W.J., & Kazanas, H.C. (1997). Mastering the instructional design process (2nd ed.). San Francisco, CA: Jossey-Bass.
Salisbury, M. (1988). PARGEN: A prototyping tool for natural language interfaces. Proceedings of the Third Annual User-System Interface Conference, Austin, TX.
Salisbury, M.W., & Plass, J.L. (in press). A conceptual framework for a knowledge management system. Human Resource Development International.
Salisbury, M.W., & Plass, J.L. (2001). Utilizing decision support and case examples for capturing and disseminating knowledge in organizations. Manuscript submitted for publication.
Salomon, G. (1979). Interaction of media, cognition, and learning. Hillsdale, NJ: Lawrence Erlbaum.
Schnotz, W., & Kulhavy, R.W. (Eds.). (1994). Comprehension of graphics. New York: North-Holland.
Schraagen, J.M., Chipman, S.F., & Shute, V.J. (2000). State-of-the-art review of cognitive task analysis techniques. In J.M. Schraagen & S.F. Chipman (Eds.), Cognitive task analysis (pp. 467–487). Mahwah, NJ: Lawrence Erlbaum.
Schuler, D., & Namioka, A. (1993). Participatory design: Principles and practices. Hillsdale, NJ: Erlbaum.
Schwier, R.A. (1999). Constructivist approaches to instructional design. Retrieved September 6, 2000, from http://members.home.net/rschwier/presentations/construct/
Shneiderman, B. (1992). Designing the user interface: Strategies for effective human-computer interaction. Reading, MA: Addison-Wesley.
Spiro, R.J., & Jehng, J.C. (1990). Cognitive flexibility and hypertext: Theory and technology for the nonlinear and multidimensional traversal of complex subject matter. In D. Nix & R. Spiro (Eds.), Cognition, education and multimedia: Exploring ideas in high technology. Hillsdale, NJ: Lawrence Erlbaum.
Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction, 4, 295–312.
Sweller, J., Chandler, P., Tierney, P., & Cooper, M. (1990). Cognitive load as a factor in the structuring of technical material. Journal of Experimental Psychology: General, 119, 176–192.
Tennyson, R.D. (1997). A system dynamics approach to instructional systems development. In R.D. Tennyson, F. Schott, N. Seel, & S. Dijkstra (Eds.), Instructional design: International perspectives. Volume 1: Theory, research and models (pp. 413–426). Mahwah, NJ: Lawrence Erlbaum.
Tessmer, M., McCann, D., & Ludvigsen, M. (1999). Reassessing training programs: A model for identifying training excess and discrepancies. Educational Technology Research and Development, 47(2), 86–99.
Tidwell, J. (1999). Common ground: A pattern language for human-computer interface design. Retrieved September 18, 1999, from http://www.mit.edu/jtidwell/common_ground.html
von Krogh, G., Roos, J., & Slocum, K. (1996). An essay on corporate epistemology. In G. von Krogh & J. Roos (Eds.), Managing knowledge: Perspectives on cooperation and competition. London: Sage.
Wager, W., & Gagné, R. (1988). Fundamental problems and principles of designing effective courseware. In D. Jonassen (Ed.), Instructional designs for microcomputer courseware (pp. 7–33). Hillsdale, NJ: Lawrence Erlbaum.
Willis, J. (1995). A recursive, reflective instructional design model based on constructivist-interpretivist theory. Educational Technology, 35(6), 5–23.
Wilson, A.L. (1993). The promise of situated cognition. New Directions for Adult and Continuing Education, 57, 71–79.
Yourdon, E., & Constantine, L. (1978). Structured design (2nd ed.). New York: Yourdon Press.




Appendix A

Means and standard deviations of the users’ rating of the importance, ease of use, and accuracy of the features of the KM system (N = 49).

Feature                                                Importance     Ease of Use    Accuracy
---------------------------------------------------    -----------    -----------    -----------
Decision Support by Subject                            3.87 (0.87)    4.17 (0.57)    4.10 (0.54)
Decision Support by Activity                           3.62 (0.89)    4.09 (0.59)    4.10 (0.55)
Reference Materials: TBP architecture visualization    3.66 (0.99)    4.16 (0.53)    4.20 (0.61)
Reference Materials: TBP numerical list                3.95 (1.01)    4.18 (0.58)    4.26 (0.51)
Case Studies for Online Tutorials                      3.03 (1.01)    4.13 (0.53)    4.00 (0.53)
Examples of Forms                                      3.33 (1.05)    4.07 (0.47)    4.07 (0.27)
Real-life Examples                                     3.63 (0.83)    4.15 (0.42)    3.92 (0.72)
Documents (access to TBPs, QC-1, D&P)                  4.41 (0.72)    4.26 (0.53)    4.27 (0.63)
Online Tutorials                                       3.54 (0.84)    3.97 (0.71)    3.88 (0.44)
Online Dictionaries—Acronyms & Definitions             4.02 (0.81)    4.24 (0.48)    4.10 (0.48)
Search                                                 4.20 (0.71)    4.22 (0.48)    4.25 (0.53)
Meeting Announcements                                  3.00 (0.93)    4.22 (0.47)    4.00 (0.86)
Communication-Points of Contact                        3.23 (0.89)    4.28 (0.60)    4.00 (0.67)
FAQs                                                   3.07 (0.71)    4.20 (0.51)    4.09 (0.53)
Descriptions of classroom training                     2.95 (0.82)    4.13 (0.46)    3.79 (0.72)
Announcements of other training                        3.05 (0.84)    4.05 (0.50)    3.89 (0.66)

Note. Responses were given on a 5-point Likert scale, with 1 as the lowest and 5 as the highest possible response. Standard deviations are printed in parentheses after the mean.

Appendix B

Scores and ranks for the evaluation survey question: Which features of this site would you use most, second, third, and last? (N = 49)

Rank    Score    Feature
----    -----    ---------------------------------------------------
  1      478     TBP and related Documents
  2      273     Decision Support by Subject
  3      211     Search—Topical Index
  4      206     Search—Full Text
  5      186     Online Dictionaries
  6      116     Real-life Examples: Form Examples
  7      109     Reference Materials: TBP numerical list
  8       91     Reference Materials: TBP architecture visualization
  9       74     Online Tutorials
 10       27     Communication-Points of Contact
 11       25     Decision Support by Activity
 12      –22     Case Studies for Online Tutorials
 13      –34     Description of Classroom Training
 14      –68     FAQ
 15      –97     Training Announcements
 16     –128     Communication Mail Room

Note. Ranks were computed based on responses to the question of which feature the user expected to use most, second, third, and last. Ten points were assigned for a feature selected as most, 9 for second, 8 for third, and –10 for least.