A Multidimensional Performance Model for Consolidating Balanced Scorecards ALAIN ABRAN École de Technologie Supérieure (ETS) Université du Québec à Montréal 1180 Notre-Dame Ouest, Montréal, Québec, Canada H3C 1K3 E-mail: [email protected] Tel: +1 (514) 396.8632 Fax: +1 (514) 396.8684

LUIGI BUGLIONE Software Engineering Management Research Laboratory Université du Québec à Montréal C.P. 8888, Succ. Centre-Ville Montréal, Québec, Canada H3C 3P8 E-mail: [email protected] Tel: (39) 338.95.46.917 Fax: (39) 06 41.536.385

Index
1. Introduction – 1.1. Context – 1.2. Generic performance management and measurement frameworks – 1.2.1. Balanced Scorecard (BSC) – 1.2.2. Intangible Asset Monitor (IAM) – 1.2.3. Skandia Navigator – 1.3. The Balanced Scorecard tailored for the ICT field – 1.4. An issue: determining the overall value in a BSC – 1.5. Requirements and evaluation criteria – 2. Candidate consolidation techniques – 3. QEST: a multidimensional model for Software Performance Measurement – 3.1. The 3-dimensional basic model – 3.2. The n-dimensional extension – 4. Integration of QEST and BSC – 4.1. First option: BSC(QEST nD) – 4.2. Second option: QEST nD(BSC) – 4.3. Integration of the options: BSC(QEST nD) + QEST nD(BSC) – 5. Conclusions – References

Abstract A Balanced Scorecard (BSC) presents the quantitative goals selected from multiple perspectives for implementing the organizational strategy and vision. However, in most current BSC frameworks, including those developed for the Information and Communication Technology (ICT) field, each perspective is handled separately. None of these perspectives is integrated automatically into a consolidated view, and so these frameworks do not tackle, either in relative or in absolute terms, the contribution of each goal to the whole BSC. Here, this issue is highlighted, candidate consolidation techniques are reviewed and the preferred technique, the QEST model, is selected; more specifically, three options are presented for incorporating the QEST model into a BSC framework. Key words – Software Performance Measurement; Balanced Scorecard; QEST model; Indicators; Strategy.

3rd International Workshop on Software and Performance, Rome, Italy, July 24-27, 2002


1. Introduction

1.1. Context
Evaluating the IT function remains a challenge: well-known financial measures such as “return on investment” (ROI), “internal rate of return” (IRR), “net present value” (NPV) and “payback” time (PB) have been demonstrated to be inadequate, both in explaining IT investment decisions and in assessing them. In assessing IT investments, it is crucial to understand how organizational and strategic goals are achieved, and how IT investments contribute to these goals. Since the financial dimension alone is not sufficient and hides the relations (e.g. cause and effect) among processes, an improvement step has been suggested which would introduce additional dimensions (or perspectives) of analysis. Performance Management is defined by the United States Office of Personnel Management as “the systematic process by which an agency (organization) involves its employees (resources), […], in improving organizational effectiveness in the accomplishment of agency (organization) mission and goals” [OPM – http://www.opm.gov/perform/overview.htm]. The following five phases are defined for managing organizational effectiveness:
• Planning work and setting expectations
• Continually monitoring performance
• Developing the capacity to perform
• Periodically rating performance in a summary fashion
• Rewarding good performance

1.2. Generic performance management and measurement frameworks
Over the years, several frameworks have been developed to address the management of organizational assets, both tangible and intangible. The three main ones are the Balanced Scorecard (BSC), the Intangible Asset Monitor (IAM) and the Skandia Navigator.

1.2.1. BALANCED SCORECARD (BSC) [KAPL96]
In 1993, Robert S. Kaplan of the Harvard Business School and consultant David Norton developed the Balanced Scorecard (BSC), an evolution of the concepts in the Tableau de Bord which emerged in France at the turn of the 20th century [EPST97]. The aim of the Tableau had been to translate each company's unitary vision and mission into a set of objectives, through the identification of Key Success Factors and Key Performance Indicators. Kaplan and Norton defined the Balanced Scorecard (BSC) as a multidimensional framework for describing, implementing and managing strategy at all levels of an enterprise by linking, through a logical structure, objectives, initiatives and measures to an organization’s strategy. The resulting scorecard provides an enterprise-wide view of an organization’s overall performance: it complements the financial measures with other key performance indicators around customer perspectives, internal business processes, and organizational growth, learning and innovation. It must be noted that the BSC is not a static list of measures, but rather a logical framework for implementing and aligning complex programs of change, and, indeed, for managing strategy-focused organizations. In summary, a scorecard is to be used to facilitate the translation of strategy into action. The four basic perspectives of the original BSC are: Financial, Customer, Internal Processes, Learning & Growth (Figure 1).
• Financial: this perspective typically relates to profitability – it is measured, for instance, by ROI (Return On Investment), ROCE (Return On Capital Employed) and EVA (Economic Value Added).
• Customer: this perspective includes several core or generic measures of successful outcomes from the company strategy, like, for instance, customer satisfaction, customer retention, and market share in targeted segments.
• Internal processes: this perspective focuses on the internal processes that will have the greatest impact on customer satisfaction and on achieving an organization's financial objectives.
• Learning and growth: this perspective identifies the infrastructure the organization has to build and manage to create long-term growth and improvement through people, systems and organizational procedures.
The execution of this strategy is then monitored through an internal performance measurement framework with a set of goals, drivers and indicators (both lag and lead types) grouped into each of the four perspectives. The other key objective of a BSC is to tell the story of the organization’s strategy. Three criteria help in determining whether or not this objective has been achieved:
• Cause-and-effect relationships: every measure selected should be part of a cause-and-effect relationship (causal relationship chain) which represents the strategy.
• Performance drivers: the drivers of performance (lead indicators) tend to be unique, since they reflect what is different about the strategy of a company.
• Links to financial indicators: the various strategic goals (such as quality, customer satisfaction and innovation) must also be translated into measures which are ultimately linked to financial measures.


Figure 1 – Balanced Scorecard: original perspectives [KAPL96]

1.2.2. INTANGIBLE ASSET MONITOR (IAM) [SVEI97][SVEI98]
Developed by Karl Sveiby, this framework classifies the intangible part of a balance sheet into a family of three thematic areas, each of them assessed through four categories (growth, renewal, efficiency, stability/risk):
• Internal structure: consists of a wide range of concepts, models, computer and administrative systems.
• External structure: relationships with customers and suppliers, brand names, trademarks and image.
• Individual competence: people's capacity to act in different situations, including skill, education, experience, values and social skills.
The basic tenet of the IAM framework is that people are the organization's only profit generators, since every human action is converted into both tangible and intangible knowledge structures – in other words, a "knowledge perspective".

1.2.3. SKANDIA NAVIGATOR [EDVI97][SKAN98]
Developed by Edvinsson and Malone, this framework combines the two previous framework structures, applying the BSC presentation format to the family of the three IAM categorizations and using them, under the Intellectual Capital label, in a supplement to the Skandia Annual Report. Also, the Navigator uses the same four perspectives ("focus") as the BSC1, with a central and pervasive "Human Focus".

1.3. The Balanced Scorecard tailored for the ICT field

A few attempts to adapt the BSC for software-intensive organizations (SIOs2) have been reported, such as:
• the Balanced IT Scorecard (BITS) proposed by the European Software Institute (ESI) [IBAÑ98] [REO99a] [REO99b]: provides a new version of the four original perspectives (financial, customer, internal process, infrastructure and innovation) and adds a fifth one, the People perspective;
• the BSC of Advanced Information Services Inc. [FERG99]: considers the "employee" element as a distinct perspective, thereby expanding the analysis to five perspectives (financial, customer, employee, internal business process, learning and growth).
The five distinct perspectives derived from the original scorecard and adapted to the ICT field are:
• Financial Perspective: How do our software processes and software process improvement initiatives add value to the organization?
• Customer Perspective: How do we know that our customers (internal and external) are delighted with our product?
• Process Perspective: Are our software development processes performing at sufficiently high levels to meet customer expectations?
• People Perspective: Do our people have the necessary skills to perform their jobs, and are they happy doing so?
• Infrastructure & Innovation Perspective: Are process improvement, technology and organizational infrastructure issues being addressed with a view to implementing a sustainable improvement program?
Table 1 summarizes how both tangible and intangible concepts are addressed in each of the generic frameworks presented, as well as in those tailored to ICT organizations.

1 The fourth perspective is called "Renewal and Development" instead of "Learning and Growth".
2 SIOs are organizations whose main objective is software development and sales; departments or organizations which develop software as an integral part of their end-products; and organizations which develop software for internal use to achieve better business results, or whose software department can be described as an independent organizational unit (European Software Institute definition, 1997).


| Assets | IAM | BSC | NAVIGATOR | ESI BITS | AIS BSC |
| Tangible | Tangible Net Book Value | Financial Perspective | Shareholders' Equity | Financial Perspective | Financial |
| Intangible | Internal Structure | Internal Process Perspective | Organizational Capital | Process Perspective | Internal Process |
| Intangible | External Structure | Customer Perspective | Customer Capital | Customer Perspective | Customer |
| Intangible | Individual Competence | Learning & Growth Perspective | Human Capital | Infrastructure & Innovation Perspective; People | Learning & Growth; Employee |

Table 1 - Comparison of Performance and Measurement Frameworks

1.4. An issue: determining the overall value in a BSC
Of these generic frameworks, Kaplan and Norton's BSC framework has the largest market penetration and tackles performance at several levels, from the organizational level to the small business unit, and down to the individual level. However, while providing a vast array of individual quantitative indicators, it does not provide consolidated performance values, either for the individual perspectives or for their combination. Although in a BSC the logical (i.e. causal) links are described textually, a BSC makes no provision for quantitatively consolidating, within each perspective, the various indicators representing the elements of the business strategy. Even though it is structured logically to represent the series of causal links between the goals and drivers, a BSC framework does not provide the support necessary to represent quantitatively how much each perspective contributes, either in relative or in absolute terms. In practice, the consolidation has to be carried out intuitively by the users of the BSC; this consolidation is, of course, subject to significant variability of assessment and interpretation depending on the expertise and background of the users.

Current BSC frameworks function as “dashboards”, in that they are made up of sets of individual indicators (like the various gauges on an automobile dashboard) which provide the elements necessary for monitoring the deployment of a (business) strategy. Knowing the business links (causal relationships) across the various indicators, the business executives must then, each time, figure out a consolidated assessment of current organizational performance. Since BSC frameworks do not include techniques for consolidating individual perspectives, this has to be done subjectively, without the benefit of a formal quantitative representation.
Therefore, consolidated assessments which can be communicated unambiguously and consistently to those with a knowledge of the consolidation rules are not possible. Using a BSC, a manager can typically gain access to detailed information about the goal, driver and indicator (GDI) elements for each BSC perspective and about the follow-up actions to take when results do not meet established itemized thresholds. But, as with any automobile dashboard or control panel, an overall value of that perspective is not defined. It is therefore challenging to answer concerns at aggregate levels above that of the details included in the BSC, whether for internal or external purposes. This issue can be restated in the following way: How can the consolidated management view of performance, and the data collection view, be expressed? How, for instance, can a question on process performance be tackled, a question which, when translated into BSC terms corresponds to: What is the overall value expressed by a set of indicators chosen in your organization for the Internal Process perspective? This paper addresses the consolidation issue and illustrates the way in which a software performance measurement model, namely the QEST model, can be used for a BSC for information and communication technology (ICT) organizations (referred to as “ICT BSC” in this paper).

1.5. Requirements and evaluation criteria
To tackle this issue, we have determined that the following mandatory high-level capabilities have to be added to the BSC framework and integrated into a performance measurement system (PMS):
• A model providing a multiperspective approach [FINK96];
• A model which expresses these multiple viewpoints in a single, consolidated value;
• A model respecting the dimensional principle: the «dimensional principle» refers to the direct relationship between the number of elements to be represented and the number of dimensions used for their representation. A proper representation of an entity must align the number of elements to be represented and the number of dimensions used to represent them; if this principle is not respected, the danger is a loss of information. A well-known visual example is the way in which the ancient Egyptians painted human subjects: they used a 2D representational style instead of a three-dimensional (3D) format. This misalignment – translated from paintings to operational data management – can cause similar misconceptions when data is analyzed for the decision-making process.
• A model with a formal and consistent way to express the outcomes.
In section 2, candidate consolidation techniques are identified and analyzed against the above evaluation criteria. In section 3, the QEST multidimensional performance measurement model is presented, with an emphasis on its consolidation features; both the original 3D version and the nD extension of the QEST model are presented. In section 4,


three options are presented for combining the QEST model in a BSC framework, thereby consolidating the ICT BSC final outcomes into a single overall value. The summary is presented in section 5.

2. Candidate consolidation techniques
Some PMSs proposed in the literature are reviewed next and analyzed against these consolidation requirements:
• The Performance Pyramid [LYNC95] approach, by Lynch and Cross, uses a pyramid-shaped “map” for understanding and defining the relevant objectives and measures for each level of the business organization (four levels are identified: business, core business processes, departments and workteams, individuals).
• The Performance Prism [NEEL], developed jointly by Accenture and the Cranfield School of Management (UK), allows two groups of stakeholders (investors and customers) and up to five groups of stakeholders (employees, suppliers, intermediaries, regulators and communities) to be handled in the performance management-measurement scheme through a 3D shape (a prism with 5 facets). This approach could be positioned midway between a value chain, as in the EFQM Excellence Model [EFQM99], and the BSC. For the purpose stated in this study, the following constraints have been identified: the calculation matrix does not use the geometry to derive the performance results, and it is limited to handling five perspectives.
• The General Framework for Performance Measurement [ROUS97], developed by Rouse, Putterill and Ryan at the University of Auckland (NZ), uses DEA (Data Envelopment Analysis) models to represent performance graphically with a 3D pyramid having a square base, where each of the four faces corresponds to one of the original four BSC perspectives. Measures are linked to the critical success factors (CSF) and to the underlying processes or cost drivers.

Table 2 presents a high-level comparison of these three performance measurement approaches and the BSC framework, including the following evaluation criteria: the number of perspectives taken into account, whether the approaches have a fixed structure or a dynamic one allowing for a variable number of perspectives, the representational format used, and the provision of a performance value for each perspective and of a consolidated value for all perspectives.

| Candidate PMS | Perspectives handled | Model type | Dimensions handled | Dimensional principle respected | Performance value for each perspective | Consolidated performance value |
| Reference: Balanced Scorecard (R. Kaplan & D. Norton) | 1. Financial; 2. Customer; 3. Internal process; 4. Learning & growth | Closed | N/A (Map) | N/A | No | No |
| Performance Pyramid (R. Lynch & K. Cross) | 1. Business; 2. Core business processes; 3. Departments and workteams; 4. Individuals | Closed | N/A (Map) | N/A | N/A | N/A |
| Performance Prism (A. Neely & C. Adams) | 1. Stakeholder satisfaction; 2. Strategies; 3. Processes; 4. Capabilities; 5. Stakeholder contribution | Closed | G3 | N | No | Yes |
| General Framework for Performance Measurement (P. Rouse, M. Putterill & S. Ryan) | Can be used with any 4-perspective set (typically the ones in Kaplan & Norton’s BSC) | Closed | N/A (Map) | N/A | Yes | Yes |

Table 2 - Comparison of Performance Management-Measurement Models

From Table 2, it can be observed that:
• none of these three PMSs provides for the derivation and presentation of a consolidated performance value;
• a geometrical representation, where used, only presents the distinct results visually;
• the details are not consolidated into a single outcome;
• no geometrical formula is associated with the particular shape used for deriving performance indicators, and the dimensional principle is not respected.

3. QEST: a multidimensional model for Software Performance Measurement
In addition to the above PMSs proposed for the generic measurement of performance, a measurement model originally designed for measuring the quality and performance of ICT projects, the QEST model, has been proposed by Buglione and Abran. It is to be noted that this model is generic and not restricted to the quality domain: it could be generalized to the generic measurement of performance. We will see that this QEST model does indeed meet all the requirements and evaluation criteria for a consolidated representation of performance.

3.1. The 3-dimensional basic model
The initial QEST (Quality factor + Economic, Social and Technical dimensions) model proposed a geometrical representation of quality performance in which the number of dimensions corresponds to the number of viewpoints considered; in addition, this number of dimensions can vary, depending on the organization's choice of the number of dimensions it wishes to take into consideration in its performance measurement system. The QEST model is thus not limited to 3 or 4 dimensions, but provides a multidimensional structured shell which can then be populated with the criteria specified for any project, on the basis of management objectives: it is therefore referred to as an open model. Such a topology in a performance model also makes it possible to handle multiple and distinct viewpoints, all of which can exist concurrently in any software project. The various features of the QEST model have been documented:
• Theoretical considerations [BUGL99a] [BUGL99b];
• Geometrical and statistical foundations [BUGL99c];
• Implementation of the model [BUGL01c];
• Quality Factor [BUGL99b] and Quality Factor through QFD [BUGL01a].
The basic purpose of the structured shell of this open model is to express performance as the combination of the specific measures (or sets of measures) selected for each of the specified dimensions, these values being derived from both an instrument-based measurement of productivity and a perception-based measurement of quality in the initial QEST quality model. For demonstration purposes, when three perspectives are selected for assessing quality (and performance), a 3-dimensional geometrical representation of a regular tetrahedron is selected as the basis for the QEST model, as illustrated in Figure 2:
• the three measurement dimensions (E, S, T) in the spatial representation correspond to the corners of the pyramid’s base, and the convergence of the edges at the P vertex corresponds to the maximum performance level – the normalized target;
• when the three sides are of equal length, the solid shape that represents this 3-dimensional concept is a pyramid with a triangular base and sides of equal length (a tetrahedron).
The use of this pyramid-type geometrical representation imposes the following constraint: the sides must be equal.
In practice, this constraint is met by normalising the target values to 1 for each of the three different dimensions chosen – and with the representation of corresponding sides with a length value of 1 (regular tetrahedron). This allows the representation of the dimensions through a normalized set of values between 0 and 1 for each axis and on a ratio scale.

Figure 2 - QEST model and its hyperplane sections corresponding to two assessments at different points in time, relative to normalized targets

With this 3-dimensional (3D) representation, it is possible to determine and represent performance using the three classic geometrical concepts: distance, area (the hyperplanes in Figure 2) and volume; furthermore, the ratio between the volume of the lower part of the truncated tetrahedron and the total volume of the tetrahedron itself represents the normalized performance level of a project being assessed against its performance target which is specified as the apex of the tetrahedron, as demonstrated in [BUGL99c]. Thus, the volume within this 3D shape immediately shows in a visual way the assessment of quality at a point in time against the specified target represented, in a normalized way, by the apex of the tetrahedron; similarly, when performance is reassessed throughout a project life cycle, the sloped section of the hyperplane represents how performance is progressing in the three dimensions, and whether or not quality is progressing within each dimension. This is the added visual value that a geometrical representation can provide, together with the support of the corresponding geometrical formula, to adequately consolidate the quantitative values of the concepts measured individually for each edge. By analogy, a management dashboard's power is limited to a series of individual gauges and controls providing a feel for the distinct elements; it does not provide for a combined representation or for the overall status of a given phenomenon. To summarize, the geometrical approach of the QEST model makes it possible to represent the measurement of performance in a simple and visual way for an immediate grasp of the current status of project performance at any time.
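The volume-ratio reading above can be sketched numerically. The snippet below is a minimal illustration under one simplifying assumption, not the published QEST algorithm: it assumes the sloped section cuts each lateral edge of the tetrahedron at its normalized value, so that the region above the section is itself a small tetrahedron with apex P; the names `simplex_volume` and `performance_3d` are illustrative.

```python
import math
import numpy as np

def simplex_volume(vertices):
    """Volume of a d-simplex from its d+1 vertices, via the Gram determinant."""
    v = np.asarray(vertices, dtype=float)
    edges = v[1:] - v[0]                    # d edge vectors from the first vertex
    gram = edges @ edges.T                  # d x d Gram matrix
    d = len(edges)
    return math.sqrt(max(float(np.linalg.det(gram)), 0.0)) / math.factorial(d)

# Regular tetrahedron with unit edges: base corners E, S, T and apex P.
E = np.array([0.0, 0.0, 0.0])
S = np.array([1.0, 0.0, 0.0])
T = np.array([0.5, math.sqrt(3.0) / 2.0, 0.0])
P = np.array([0.5, math.sqrt(3.0) / 6.0, math.sqrt(2.0 / 3.0)])

def performance_3d(q):
    """q: normalized values in [0, 1] measured along the E-P, S-P and T-P edges.
    p = 1 - V(upper tetrahedron near P) / V(whole tetrahedron)."""
    cuts = [b + qi * (P - b) for b, qi in zip((E, S, T), q)]
    return 1.0 - simplex_volume([P, *cuts]) / simplex_volume([E, S, T, P])
```

Under this reading, equal values q on all three edges reduce to p = 1 - (1 - q)^3; for example, q = 0.5 yields p = 0.875.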


Furthermore, an extension of the QEST model to the whole software life cycle has been developed, namely the LIME (LIfecycle MEasurement) model [BUGL00a]; this QEST/LIME model is the QEST instantiation to the whole software development life cycle, and is applied to each development phase.

3.2. The n-dimensional extension
Of course, other profiles could be included in a QEST/LIME model, including quality models such as the EFQM or Malcolm Baldrige models, or others like the Balanced Scorecard or the Skandia Navigator, where the number of stakeholder viewpoints can be significantly increased3. While they have been described and illustrated using three dimensions, the QEST/LIME models, being open models, are not restricted to the representation of three dimensions of performance. Their geometrical approach is generic and allows for the handling of any number of dimensions, through the use of the appropriate algebra corresponding to the number of dimensions selected, that is, the use of n dimensions. This was demonstrated in [BUGL02]: the mechanism to be used for n dimensions is a simplex, definable as the convex hull of any d+1 affinely independent points in some ℜⁿ (n > d), where d represents the number of dimensions taken into account. There are, of course, both advantages and disadvantages to this. For instance, by using matrix calculations, it is possible to extend the underlying concepts of the QEST/LIME models to n dimensions, and we have demonstrated the backward-forward validity of this mechanism for any number of dimensions. The models are thus able to take into account a greater number of stakeholders. However, from the fourth dimension on, it is no longer possible to visualize the related concepts in a direct way. In Table 3, a summary review of the characteristics of both the QEST 3D and QEST nD models is presented, using the same criteria as those listed in Table 2: both QEST models respect the dimensional principle and allow for consolidation within each individual perspective, as well as for consolidation across all perspectives considered.

| Candidate PMS | Perspectives handled | Model type | Dimensions handled | Dimensional principle respected | Performance value for each perspective | Consolidated performance value |
| QEST 3D (Buglione & Abran) | 1. Economic; 2. Social; 3. Technical | Open | G3 | Y | Y | Y |
| QEST nD (Buglione & Abran) | Can be used with the desired set of perspectives | Open | Gn | Y | Y | Y |

Table 3 - Comparison of Performance Management-Measurement models: the case for QEST models

Table 3 highlights the relevance of investigating the use of the QEST models, in particular QEST nD, the n-dimensional extension and generalization of the basic model, to a generic ICT BSC.
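The same edge-cut reading sketched above for three dimensions generalizes to n dimensions via a determinant-based simplex volume. The snippet below is again an assumption-laden illustration, not the published UM/GM matrix formulation: it takes the regular d-simplex spanned by the standard basis of ℜ^(d+1) as the shell, with the last vertex playing the role of the target apex P; the function names are illustrative.

```python
import math
import numpy as np

def simplex_volume(vertices):
    """d-volume of a simplex from its d+1 vertices, via the Gram determinant."""
    v = np.asarray(vertices, dtype=float)
    edges = v[1:] - v[0]
    d = len(edges)
    return math.sqrt(max(float(np.linalg.det(edges @ edges.T)), 0.0)) / math.factorial(d)

def performance_nd(q):
    """q: d normalized values, one per perspective. The shell is the regular
    d-simplex spanned by the standard basis of R^(d+1); the last vertex is
    the target apex, and each q cuts the corresponding lateral edge."""
    d = len(q)
    corners = np.eye(d + 1)                  # d+1 affinely independent points
    base, apex = corners[:d], corners[d]
    cuts = [b + qi * (apex - b) for b, qi in zip(base, q)]
    return 1.0 - simplex_volume([apex, *cuts]) / simplex_volume(corners)
```

Under the same assumption, five equal values of 0.8 (one per ICT BSC perspective) give p = 1 - 0.2^5 = 0.99968, and the d = 3 case agrees with the tetrahedron computation of section 3.1.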

4. Integration of QEST and BSC
The two options for integrating QEST and BSC are discussed next. For simplicity's sake, in this text examples are limited to the specific case of n=3.
1. BSC(QEST nD): every ICT BSC perspective applies the basic QEST model as a measurement system using the related nD extension (i.e. QEST 5D for 5 perspectives, and so on);
2. QEST nD(BSC perspective): each ICT BSC perspective represents one of the dimensions of the tetrahedron (taking into account the n-dimensional extension of QEST).

4.1. First option: BSC(QEST nD)
Here, the BSC would manage separate, and often disjointed, performance values. This would produce n different quantitative outcomes, one per perspective, with only descriptive linkages at the strategy level and no clear traceability of quantitative contributions to the overall strategy. The mapping of the elements of the BSC and QEST frameworks is presented in Table 4.

3 Atkinson, Waterhouse & Wells [ATKI97] distinguish between two groups of stakeholders:
• Environmental stakeholders, who belong to the external environment of the organization, which defines the critical elements of its competitive strategy (customers, owners, community);
• Process stakeholders, who are employees and suppliers working within the environment defined by the external stakeholders to plan, design, implement and operate the processes that make the company's product and deliver it to its customers.
In addition, Sharp, Finkelstein & Galal [SHAR99] propose a 5-step procedure for the identification of stakeholders in the requirements engineering process, with four baseline stakeholders: users (mapping to the social perspective), developers (mapping to the technical perspective), decision-makers (mapping to the economic perspective) and legislators (not considered in the ISO/IEC 9126 model).


| BSC | QEST |
| Perspective | – |
| Goal + Driver | Perspective |
| Indicator | Indicator |
| Measure | Value collected in the field; Weight |
| Target | Upper Threshold; Lower Threshold; Normalized value |
| Initiatives | SPI-related process area for improvement |

Table 4 - Mapping of BSC and QEST structural elements

Some differences can be noted between the two frameworks. First of all, what in QEST is referred to as a perspective corresponds in a BSC to the logical sum of a goal and one of its related drivers within a BSC perspective – a more granular subdivision than in QEST. Thus, this group of elements (a goal plus its related drivers) is directly linked to a list of n indicators, in the same way as, in the QEST model, the indicators are directly linked to a perspective (i.e. economic, social or technical). Another difference is the ability of the QEST model to handle normalized values; this is also, of course, a constraint. Since QEST handles normalized values, some additional BSC framework elements must be gathered: lower thresholds for each measure, in order to derive a normalized value for each goal+driver (G+D) topic, and its weight, where the sum of the G+D weights within each QEST perspective will be equal to 1; in Figure 3, G1D2 stands for goal no. 1, driver no. 2 within a given BSC perspective.

Figure 3 - QEST application to a single BSC perspective

Extending the concept to n possible tuples of GxDy elements, the application of the QEST nD calculation mechanism will produce two (n+1)-square matrices, UM (upper measure) and GM (grand measure), from which the p values (the ratio of UM to GM) will be derived. The final outcome will be a normalized measure indicating the performance value for a specific perspective. Repeating this calculation for all the perspectives taken into account, the ICT BSC(QEST nD) will return a “first-level value”4 for internal benchmarking analysis. This value is useful provided that the software-intensive organization maintains the same GDI (goal-driver-indicator) architecture over (at least) two subsequent time periods, in order to monitor its evolution over time. We illustrate in Figure 4 how, for a given BSC perspective (e.g. Infrastructure & Innovation), the QEST approach can be applied; the consolidated normalized performance value, following the QEST application, would be equal to 0.9808.
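The threshold-and-weight mechanics just described can be sketched as follows. This is a hypothetical illustration: `normalize`, `perspective_value` and the sample figures are assumptions for demonstration, not the data behind the 0.9808 example of Figure 4.

```python
def normalize(value, lower, upper):
    """Map a raw measure into [0, 1] between its lower and upper thresholds."""
    return min(max((value - lower) / (upper - lower), 0.0), 1.0)

def perspective_value(items):
    """items: (raw value, lower threshold, upper threshold, weight) per G+D
    tuple; the weights within a perspective are assumed to sum to 1."""
    assert abs(sum(w for _, _, _, w in items) - 1.0) < 1e-9
    return sum(w * normalize(v, lo, hi) for v, lo, hi, w in items)

# Hypothetical Infrastructure & Innovation perspective with three G+D tuples
# (e.g. G1D1, G1D2, G2D1), each carrying a weight:
iandi = [
    (14.0, 0.0, 15.0, 0.5),   # G1D1: raw value 14 against thresholds [0, 15]
    (0.97, 0.90, 1.00, 0.3),  # G1D2
    (8.0, 5.0, 10.0, 0.2),    # G2D1
]
print(round(perspective_value(iandi), 4))  # prints 0.7967
```

The lower thresholds are exactly the extra elements the BSC must supply; with them, each G+D tuple contributes its weighted normalized value to the perspective total.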

(4) In this paper, the term "first-level value" is used to express the consolidated value from a tree of values, as a result of the QEST calculation algorithm.
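The ratio-of-UM-to-GM step can be sketched as follows. The construction of the two matrices from the simplex geometry belongs to the QEST nD model itself, so the matrices below are hypothetical placeholders, and reading p as a ratio of determinants is an illustrative assumption, not the model's exact formula:

```python
# Sketch of the ratio step: two small placeholder matrices stand in for UM
# (upper measure) and GM (grand measure), and p is taken, as an illustrative
# assumption, to be the ratio of their determinants.

def det(m):
    """Determinant via Laplace expansion (fine for small (n+1) matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0.0
    for col in range(n):
        minor = [row[:col] + row[col + 1:] for row in m[1:]]
        total += ((-1) ** col) * m[0][col] * det(minor)
    return total

UM = [[1.0, 0.2], [0.3, 0.8]]  # hypothetical "upper measure" matrix
GM = [[1.0, 0.1], [0.2, 1.0]]  # hypothetical "grand measure" matrix

p = det(UM) / det(GM)
print(round(p, 4))
```

For real use, the determinant would be computed with a numerical library rather than by Laplace expansion, but for the (n+1)-sized matrices produced here the naive recursion is adequate.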

3rd International Workshop on Software and Performance, Rome, Italy, July 24-27, 2002

8

Figure 4 - QEST application to a single BSC perspective: map and values

4.2. Second option: QEST nD(BSC)

Here, with the joint application of QEST and BSC, the QEST model represents the "glue" of the framework and provides a unique consolidated value for all the perspectives considered. This second option addresses the following concern: after considering the internal performance value for each perspective, the next logical question an executive asks when assessing the overall business view is: what is the consolidated view of the whole BSC value? That is, is it possible to obtain an overall performance value first, and then carry out an in-depth drill-down (i.e. top-down) analysis from there, perspective by perspective, looking for the single indicators and the relationships hypothesized across the goals selected in the initial design of the organizational strategic BSC map? In this case, the mapping between QEST and BSC is easier, as shown in Table 5.

BSC                                            QEST
Perspective                                    Perspective
(with its value from the QEST application)
Measure / Target / Initiatives                 Indicator / Value collected in the field /
                                               Weight / Upper Threshold / Lower Threshold /
                                               Normalized value / SPI-related process area
                                               for improvement

Table 5 - Mapping of BSC perspectives and QEST structural elements

The concept of "perspective" is the same in this case, while the value for each of the perspectives taken into account (Figure 5) is the one derived from the calculations shown in Figure 4. This means that each perspective will have only one performance value, from the BSC(QEST nD) calculation proposed in section 4.1. In this case, the QEST model simply takes these perspective values as they are.

Figure 5 - QEST application to the whole BSC framework


Extending the concept to n possible perspectives, the application of the QEST nD calculation mechanism again produces two (n+1) square matrices, UM (upper measure) and GM (grand measure), the p values being their ratio. The final outcome is a normalized measure representing the consolidated value of the whole BSC. Repeating this calculation at regular time intervals (per project or per business period), a "consolidated top-level value" for internal benchmarking analysis becomes available to management, provided, of course, that the organization maintains the same GDI architecture across successive intervals.

4.3. Integration of the options: BSC(QEST nD) + QEST nD(BSC)

From the two previous sub-sections, it can be observed that these two options are not mutually exclusive, but complementary to one another. An implementation sequence is shown in Figure 6.

Figure 6 - Sequence for integrating the QEST model into the ICT BSC framework

The first phase consists in the classic construction of the BSC strategic map, which describes the causal links among the organization's goals. The next phase consists in its "translation" into a set of consolidated performance values, one for each perspective (through the "QEST nD[BSC perspective]" option). The output will be five performance values, which will in turn represent the inputs for the overall BSC calculation (Phase 3). Here also, the application of the QEST nD calculation mechanism, through the use of the simplex, will produce a (6 x 6) matrix, whose result will be the final consolidated performance value p for the whole BSC.
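A minimal sketch of this phase sequence follows, assuming five hypothetical perspective names and replacing the full QEST nD simplex calculation with simple weighted sums at both levels, purely for illustration:

```python
# Sketch of the Figure 6 sequence: phase 2 turns each BSC perspective into one
# normalized performance value, and phase 3 consolidates those five values into
# the final p for the whole BSC. The QEST nD simplex calculation is replaced
# here, as a simplifying assumption, by weighted sums; all perspective names,
# values, and weights are hypothetical.

def consolidate(values_and_weights):
    """Weighted sum of normalized values; weights must sum to 1."""
    assert abs(sum(w for _, w in values_and_weights) - 1.0) < 1e-9
    return sum(v * w for v, w in values_and_weights)

# Phase 2: one consolidated value per ICT BSC perspective (five perspectives).
perspectives = {
    "Financial":                   consolidate([(0.70, 0.6), (0.90, 0.4)]),
    "Customer":                    consolidate([(0.80, 1.0)]),
    "Process":                     consolidate([(0.60, 0.5), (0.70, 0.5)]),
    "Infrastructure & Innovation": consolidate([(0.95, 0.5), (1.00, 0.5)]),
    "People":                      consolidate([(0.85, 1.0)]),
}

# Phase 3: overall BSC value, here with equal weight per perspective.
overall_p = consolidate([(v, 1.0 / len(perspectives)) for v in perspectives.values()])
print(round(overall_p, 4))
```

Because every intermediate value stays normalized in [0, 1], the final p is directly comparable across successive measurement periods, which is what enables the internal benchmarking discussed above.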

5. Conclusions

The Balanced Scorecard (BSC) framework provides senior executives with a powerful analytical tool for putting their organizational strategy into action. Causal relationships are stressed in the BSC structure, and this makes it possible to aggressively pursue, through close monitoring, the detailed actionable items supporting the corporate strategy. However, the BSC framework does not formally specify "how" a given BSC perspective (through its GDI elements) contributes to the whole BSC value, nor does it provide a consolidated performance value for the whole organization and its global strategic target.

The availability of such consolidation rules would provide executives with single values, as well as the corresponding information model for interpreting the information provided. These rules would also allow for a top-down decomposition, so that an understanding could be gained of the background forces driving each consolidated view. The availability of consolidated values could, in turn, lead to the establishment of standard sets of consolidated measures and to the institutionalization of internal-external benchmarking practices.

Key concepts of the original Kaplan & Norton BSC framework were presented, together with highlights of the tailoring available for the ICT field. Some candidate PMSs for improving a BSC framework were identified and analyzed against a list of relevant criteria, pointing to some of their limitations in this context of intended use. The QEST model was presented next and discussed with respect to the same set of evaluation criteria. While the basic initial QEST model can be used for three perspectives at once, its extension and generalization – called QEST nD – makes it possible to handle any number of perspectives simultaneously, including their multidimensional consolidation. Options for integrating the BSC and the QEST nD model were presented next.
The first option consisted in managing the BSC perspectives separately, while the second one consisted in consolidating the perspective performance values to derive the overall BSC value. Both options are valuable and valid for capturing different measurement viewpoints: the partial and overall performance values expressed by an ICT BSC system. It is possible to read the data and analyze them with a "layer" logic, from top to bottom, considering each BSC perspective as a separate layer contributing to the final result. The three levels of analysis a manager should perform could therefore be:

1. Look at the overall BSC value (the "QEST nD(BSC)" option);
2. Look at the single perspective values (the "BSC(QEST nD)" option);


3. Analyze each single perspective for follow-up action on the basis of the usual BSC "strategic map", looking at the results in the single "layer" as well as at the contribution resulting from the causal links across the goals from the different perspectives.

In this way, QEST nD becomes an n-dimensional performance measurement system providing a consolidated "value" to the ICT BSC and to the proxies it takes into account, with a holistic focus on the whole BSC.

The research work presented here has put the emphasis on the measurement aspect of Performance Management. Undoubtedly, further investigation is required to tackle more specialized issues, such as the following: is it possible to measure and "weight" the contribution, in terms of performance, that goals from one perspective provide to another, with a view to capturing the inner value of each perspective in the value chain and how much is contributed by the other perspectives?

