IJICIS, Vol.13, No. 3 JULY 2013

International Journal of Intelligent Computing and Information Science

TOWARDS AN APPROACH FOR EVALUATING THE IMPLEMENTATION OF EXTREME PROGRAMMING PRACTICES

Nagy Ramadan Darwish
Assistant Professor, Department of Computer and Information Sciences, Institute of Statistical Studies and Research, Cairo University, Egypt.
[email protected]

Abstract: This paper is concerned with evaluating the degree to which various eXtreme Programming (XP) practices are implemented. It clarifies the essential concepts, core values, and life cycle phases of the XP approach. In addition, this paper presents and focuses on the twelve XP practices. Software organizations adopt XP practices to different degrees. The Goal Question Metric (GQM) method is one of the software measurement methods, and it is presented and utilized in this paper. This paper presents a proposed approach to evaluate the degree to which various XP practices are implemented. For each XP practice, the researcher adapts the GQM method to elaborate its goal, questions, and metrics. The calculation of the elaborated metrics provides an efficient indicator of the degree of implementation of XP practices. Software organizations and XP teams can use the proposed approach as a guide to monitor the work environment of software projects in order to control and enhance the adoption of XP practices.

Keywords: Extreme Programming, XP Practices, Evaluation, Goal Question Metric Method, Metrics

1. Introduction

In recent years, agile software development methods have gained much attention in the field of software engineering [37]. Agile software development is an iterative and incremental approach that is performed in a cooperative manner to produce high quality software that meets the changing requirements of the users. Agile software development methods offer a viable solution when the software to be developed has fuzzy or changing requirements [6]. The XP approach is one of the most popular agile development methods. The XP process is characterized by short development cycles, incremental planning, continuous feedback, and reliance on communication and evolutionary design [37]. It is designed for use with small teams that need to develop software quickly and in an environment of rapidly changing requirements. The XP approach was developed at Chrysler by Kent Beck while working on a payroll project as a member of a 15-person team. Beck continued to refine and improve the XP approach after the project was completed, until it gained worldwide acceptance in 2000 and 2001 [12]. The XP approach is based on four values: simplicity, communication, feedback, and courage. These values are implemented with twelve core practices: planning game, small releases, metaphor, simple design, testing, refactoring, pair programming, collective code ownership, continuous integration, 40-hour week, on-site customer, and coding standard. Software organizations are progressively adopting the development practices associated with the XP approach [13]. XP practices can be used to create a community that supports and sustains a culture that includes the XP values [7]. On the other hand, the XP approach can be viewed as a life cycle that includes six phases: exploration, planning, iterations to release, productionizing, maintenance, and death.


Figure (1) illustrates the XP values, practices, and phases.

Figure (1): The Values, Practices, and Phases of XP.

The values and practices of the XP approach help the team to realize several benefits, such as:

• The XP approach is designed for use in an environment of rapidly changing requirements. It helps to reduce the cost of changes by being more flexible to changing requirements.
• The focus on the customer increases the possibility that the software produced will actually meet the needs of the users. XP is heavily reliant on continuous communication between stakeholders and tight feedback loops to clarify and specify feature implementation and to respond to change [21].
• Increasing communication in discussing the software before development decreases the chance of misunderstanding between the team and the users, and thereby decreases the chance of rework.
• The focus on small and incremental releases decreases the risks of the project by putting functionality in the hands of the users, helping them to provide timely feedback regarding the new software.
• Continuous testing and integration enable the team to increase the quality of the new software.
• Continuous reviewing of the software by two different individuals is more effective than reviewing by one individual, whose bias may go undetected. Pair programming decreases the possibility of producing defective software.
• Refactoring enables the team to increase the effectiveness of the software, because the code is continuously edited and enhanced.

Because of the many advantages of the XP approach, the progressive usage of XP in software organizations, and the importance of XP practices for the success of applying this approach, there is a need to ensure the proper implementation of XP practices. Therefore, this paper focuses on evaluating the degree to which various XP practices are implemented. To achieve this objective, the researcher uses the GQM method. The GQM method is a goal-oriented approach for measurement that was developed by Victor Basili et al. at the University of Maryland [4]. This method was designed for multi-purpose evaluation of software. The GQM method consists of three steps: determining a goal, determining a set of questions which have possible answers, and determining one or more metrics for each question.


The final step is the analysis of the set of metrics, which assigns a weight coefficient to each set of answers [38]. The researcher proposes an approach for evaluating the implementation of XP practices. The proposed approach depends on defining a goal or a set of goals for each XP practice, decomposing each goal into several questions, defining a metric or a set of metrics that answer the questions, and calculating the value of the metrics. This approach is discussed in section 5.

2. Life Cycle Phases of XP Approach

The XP software development process focuses on iterative and rapid development. The XP approach stresses communication and coordination among the team members at all times, and requires cooperation between the customer, management, and development team to form the supportive business culture for the successful implementation of XP [1]. The XP approach can be viewed as a life cycle that includes six phases: exploration, planning, iterations to release, productionizing, maintenance, and death [1]. Each phase can be achieved through a set of activities. Figure (2) illustrates the phases of the XP life cycle, which are explained in the following paragraphs.

Figure (2): XP Life Cycle [26].

1. In the exploration phase, the customers write the story cards that they want to be included in the first release. Each story card presents a feature to be added to the release. During this phase, the development team gets familiar with the development environment and the addressed technology [33].
2. In the planning phase, the customers prioritize the stories and agree on the features of the first small release. The developers estimate the necessary effort and time for each story. Then, the schedule of the first release is defined and approved.
3. In the iterations to release phase, the actual implementation is done. This phase includes several iterations of development before the first release. The schedule is broken down into a number of iterations that each take one to four weeks to complete [26]. At the start of an iteration, the customer selects the smallest set of most valuable stories that make sense together, and the programmers produce the functionality. After the tests are written, the code is developed and continuously integrated and tested. At the end of the iteration, all functional tests should pass before the team can continue with the next iteration [17]. The system becomes ready for the next phase when all iterations scheduled for a release are achieved.


4. The productionizing phase includes extra testing and checking of the functionality and performance of the system before the system can be released to the customer [26, 32]. During this phase, new changes may still be found, and a decision has to be made as to whether they are included in the current release. The postponed ideas and requirements are documented and scheduled for later implementation during the maintenance phase.
5. After the first release is productionized, the system must be kept running in production while the remaining stories are implemented in further iterations. In addition, the maintenance phase includes customer support efforts. Development stays in this phase until the system satisfies all customers' requirements.
6. Finally, development enters the death phase when the customer has no more stories to be developed and all the necessary documentation of the system is written. Death may also occur if the system is not delivering the desired outcomes, or if it becomes non-feasible for further development.

3. Values and Practices of XP Approach

The XP approach is based on a common understanding of fundamental values and on a disciplined application of best practices [22].

3.1. XP Values

The XP approach is based on four main values: simplicity, communication, feedback, and courage. It expresses the necessity to overcome rigid conventions that have accumulated within the area of software engineering over the last decades [23].

• Communication: XP encourages the team members and users to own a shared view of the requirements. As a result of continuous communication between the team members as well as with the user, the knowledge about the new system becomes unified. Therefore, there are fewer possibilities of ambiguities and misunderstandings regarding requirements. Projects developed with XP show that good results can be obtained using sheets of paper to collect user requirements, wall boards to show diagrams and other project-relevant information, and shared workspaces to maximize face-to-face communication [22].
• Feedback: Developers must always have a way of obtaining information regarding the development process. Feedback includes several dimensions: the system, the customer, and the team members. Feedback from the system and the team members aims to provide project leaders with quick indicators of the project's progress, whereas feedback from the customer includes the functional and acceptance tests.
• Simplicity: Simplicity is one of the values supported explicitly by XP [8]. A simple design always needs less time to finish than a complex one. Therefore, XP encourages developers to start with the simplest solution; extra functionality can be added later. Programmers do the simplest thing that could work and leave the system in the simplest condition. This improves the general speed of development while still retaining an emphasis on working software.
• Courage: XP encourages the team members to make decisions that support the implementation of XP practices. The team members need courage to refactor the software code [14]. The team members review the existing system and modify it to facilitate the implementation of future changes. In addition, courage may include removing parts of the source code that are obsolete, no matter how much effort was used to create them.



3.2. XP Practices (Rules)

The XP approach is a software development discipline in the family of agile methodologies that contributes towards quality improvement using a dozen practices [15]. This approach consists of twelve practices: planning game, small releases, metaphor, simple design, testing, refactoring, pair programming, collective code ownership, continuous integration, 40-hour week, on-site customer, and coding standard [17]. XP practices cannot be classified as merely being "implemented" or "not implemented"; there are several degrees to which the various practices are adopted. For example, a team may choose to program in pairs for complex parts of the code and program individually when writing routine code [27]. No XP practice stands well on its own; each requires the other practices to keep it in balance [17]. Figure (3) illustrates the mutual relationships among XP practices.

Figure (3): XP Practices Support Each Other [17].

1. Planning Game: The planning game is a collaborative affair involving all team members [8]. At the start of development, customers, managers, and developers meet to create, estimate, and prioritize requirements for the next release. Story cards are used to capture the requirements. The planning game and the story cards offer the means to perform planning at the most detailed level for very short periods of time [16]. The customers are aware that the first release will provide the selected functionalities and that many releases may be necessary before all functionalities are implemented [22].
2. Small Releases: The development is divided into a sequence of small iterations, each implementing new features separately testable by the customer [6]. This practice is sometimes called short releases. XP accelerates software delivery by having short releases of 3-4 weeks. At the end of each release, the customer reviews the software product to determine defects and adjust future requirements. Then, a new iteration is started. This process is repeated a few times until an initial version of the software is put into production. Small releases help customers gain confidence in the project's progress and come up with suggestions based on real experience.



3. Metaphor: The system metaphor is a story or view that expresses the overall way in which the system will operate [14]. The system metaphor is an effective way of getting all members of the project team to visualize the project. It provides inspiration, a vocabulary, and the basic architecture of the system.
4. Simple Design: XP requires programmers to use the simplest possible design that will meet the current needs [2]. They should not make any effort for future needs. Kent Beck stated that the right design of the software at any given time is the one that runs all the tests, has no duplicated logic, states every intention important to the programmers, and has the fewest possible classes and methods [17].
5. Testing: Traditionally, testing is a phase of development that is carried out after the main coding effort [23]. In XP, however, tests must be created prior to coding. All code must have automated unit tests and acceptance tests, and must pass all tests before it can be released [6]. Programmers write unit tests, whereas customers write functional tests. The result is a program in which confidence grows over time (a small test-first sketch is given after this list).
6. Refactoring: Refactoring is the process of improving the code by removing redundancy, eliminating unused functionalities, improving code readability, reducing complexity, improving maintainability, adapting it to patterns, or trying to make it work in an acceptable way [24]. Refactoring throughout the entire project life cycle saves development time and increases quality.
7. Pair Programming: Pair programming is a technique in which two programmers work together on a development task while sharing one computer. The task may include analyzing data, creating the data model, programming, etc. The advantages of pair programming are improved productivity, quality of the solution, and job satisfaction [19]. Moreover, it accelerates task completion and is particularly useful in complex tasks and training.
8. Collective Code Ownership: This practice means that all developers own and share the code. It helps ensure correctness of implementation by encouraging developers to look at and improve each other's code [11]. It tends to spread knowledge of the system around the team. The code should be subject to configuration management.
9. Continuous Integration: This practice means that code for each story is integrated into the evolving system as soon as it is ready [2]. All tests are run and must pass before the changes to the code are accepted. Thus, XP teams integrate and build the software system multiple times per day. Continuous integration reduces development conflicts and helps the team create a natural end to the development process.
10. 40-Hour Week: This practice indicates that software developers should not work more than 40-hour weeks. People perform best and most creatively when they are rested, fresh, and healthy. Therefore, requirements should be selected for an iteration such that developers do not need to put in overtime.
11. On-Site Customer: A customer works with the development team at all times to answer questions, perform acceptance tests, and ensure that development is progressing as expected. This customer-driven software development leads to a deep redefinition of the structure and features of the system [6]. It supports customer-developer communication [16].
12. Coding Standards: This practice indicates that the developers must agree on a common set of rules enforcing how the system shall be coded. This makes understanding the code easier and helps produce consistent code. Coding standards are almost unavoidable in XP, due to the continuous integration and collective ownership practices.
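To make the test-first idea in practice 5 concrete, the following is a small illustrative Python sketch; the price_with_discount function, its test values, and the acceptance of a 0-1 discount rate are hypothetical examples, not part of the paper's approach. In XP, the test class would be written before the function exists, and the function is then implemented just enough to make the tests pass.

import unittest

def price_with_discount(price, rate):
    """Return the price after applying a discount rate between 0 and 1."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(price * (1 - rate), 2)

class PriceWithDiscountTest(unittest.TestCase):
    # Under test-first development, these tests are written before the function above.
    def test_ten_percent_discount(self):
        self.assertEqual(price_with_discount(200.0, 0.10), 180.0)

    def test_invalid_rate_is_rejected(self):
        with self.assertRaises(ValueError):
            price_with_discount(100.0, 1.5)

if __name__ == "__main__":
    unittest.main()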



4. Applying GQM Method to XP Practices

The GQM method depends on three main components: goal, question, and metric. Each goal generates a set of quantifiable questions that attempt to define and quantify this goal. A question can only be answered relative to, and as completely as, the available metrics allow. In the GQM method, the same question can be used to define multiple goals. In addition, metrics can be used to answer more than one question. A metric is a quantitative measure of the degree to which a system, component, or process possesses a given attribute [10]. Metrics should be objective, timely, simple, accurate, useful, and cost-effective. This approach is best suited to mature and well-understood problem areas. The GQM method consists of three steps:
1. Set goals specific to needs in terms of purpose, perspective, and environment.
2. Refine the goals into quantifiable questions that are tractable.
3. Deduce the metrics and data to be collected (and the means for collecting them) to answer the questions.
Each XP practice has a goal. For example, "Planning Game" aims to create, estimate, and prioritize requirements for the next release. The requirements are captured on "story cards" in a language understandable by all parties. The developers estimate the effort needed for the implementation of the customers' stories, and the customers then decide on the scope and timing of releases. Each goal is decomposed into several questions, and each question can be answered by one or more metrics. Figure (4) illustrates how to apply the three steps of the GQM method to each XP practice.

Figure (4): GQM Method and XP Practice.
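As a concrete illustration of this goal-question-metric decomposition, the following minimal Python sketch represents one XP practice as a GQM tree. The class layout is only one possible encoding (an illustrative assumption); the fragment shown for "Planning Game" reuses the goal from table (1) and the first question and metric from table (2).

from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str           # e.g. "M1: % of participators in writing stories"
    value: float = 0.0  # calculated value, as a percentage

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Practice:
    name: str   # one of the twelve XP practices
    goal: str   # the single goal assumed for the practice
    questions: list = field(default_factory=list)

# A fragment of the GQM tree for "Planning Game" (see tables (1) and (2)).
planning_game = Practice(
    name="Planning Game",
    goal="Creating, estimating, and prioritizing requirements for the next release.",
    questions=[
        Question(
            text="Q1: Do all persons related to this process participate in it?",
            metrics=[Metric("M1: % of participators in writing stories vs. all who must be involved")],
        ),
    ],
)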

For the purpose of this paper, the researcher elaborates a list of goals for XP practices assuming that each XP practice has one goal. Table (1) illustrates the elaborated list of goals of XP practices.

Darwish, Towards An Approach For Evaluating The Implementation of Extreme Programming Practices

Table (1): The Elaborated List of Goals of XP Practices.

Planning Game: Creating, estimating, and prioritizing requirements for the next release.
Small Releases: Developing small releases.
Metaphor: Obtaining an agreed story about how the system works, including providing inspiration, suggesting a vocabulary, and a basic architecture.
Simple Design: Designing only what is needed to support the functionality being implemented and keeping the design as simple as possible.
Testing: Writing and applying all software tests to ensure confidence in the operation of the program.
Refactoring: Changing the code in order to improve it.
Pair Programming: Two programmers work together on the same computer at solving a development task to improve productivity, improve the quality of the solution, improve job satisfaction, and reduce the time needed for task completion.
Collective Code Ownership: Enabling all developers to own, share, and edit the code and see the changes made by others.
Continuous Integration: Integrating a new piece of code into the system and ensuring that all tests are run and passed before the changes to the code are accepted.
40-Hour Week: The software developers should not work more than 40-hour weeks; if there is overtime one week, the next week should not include more overtime.
On-Site Customer: A customer works with the development team at all times to answer questions, perform acceptance tests, and ensure that development is progressing as expected.
Coding Standard: The developers must agree on a common set of rules enforcing how the system shall be coded.

In addition, the researcher selects "Planning Game" as an example and elaborates a set of questions and metrics for it. Table (2) illustrates the elaborated list of questions and metrics for "Planning Game".

Table (2): The Elaborated List of Questions and Metrics for "Planning Game".

XP Practice: Planning Game
Goal: Creating, estimating, and prioritizing requirements for the next release.

Q1: Do all persons related to this process participate in it?
  M1: Percentage of participators in writing stories vs. all who must be involved in this process.
  M2: Percentage of participators in estimating stories vs. all who must be involved in this process.
  M3: Percentage of participators in prioritizing stories vs. all who must be involved in this process.

Q2: Is there a writing standard for creating stories?
  M1: Percentage of participators who know of the existence of this standard vs. all participators.
  M2: Percentage of participators who understand this standard vs. all participators.
  M3: Percentage of participators who agree on this standard vs. all participators.
  M4: Percentage of stories which were prepared according to the writing standard vs. all stories.

Q3: Are the customers' stories created and reviewed?
  M1: Percentage of stories which were reviewed vs. all created stories.
  M2: Percentage of stories which have conflicts vs. all created stories.
  M3: Percentage of conflicts which were solved vs. all conflicts.
  M4: Percentage of stories which were merged vs. all created stories.
  M5: Percentage of stories which were decomposed vs. all created stories.

Q4: Are the customers' stories estimated?
  M1: Percentage of stories which were estimated (effort and time) vs. all created stories.

Q5: Are the customers' stories prioritized?
  M1: Percentage of stories which were prioritized vs. all estimated stories.
  M2: Percentage of stories which will be in the next release vs. all prioritized stories.

Q6: Is the process achieved within the planned time?
  M1: Percentage of the actual time of this process vs. the planned time.
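To illustrate how such a percentage metric could be calculated and checked against an accepted range, the following is a minimal Python sketch. The helper names, the 70-100% acceptance range, and the example counts are hypothetical assumptions used only for illustration; in practice each metric would use the range defined for it.

def percentage_metric(observed_count, total_count):
    """Return a percentage metric, e.g. participators who wrote stories vs. all who must be involved."""
    if total_count == 0:
        return None  # treat as Not Applicable (NA) and exclude from the evaluation
    return 100.0 * observed_count / total_count

def check_metric(value, accepted_range=(70.0, 100.0)):
    """Compare a calculated metric with its accepted range; return a note if it falls outside."""
    if value is None:
        return "NA: metric not applicable"
    low, high = accepted_range
    if low <= value <= high:
        return "accepted"
    return f"special note: value {value:.1f}% is outside the accepted range {low}-{high}%"

# Example for Q1/M1 of "Planning Game": 6 of the 9 required persons wrote stories.
m1 = percentage_metric(observed_count=6, total_count=9)
print(round(m1, 1), check_metric(m1))  # 66.7 is outside 70-100%, so a special note is produced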

The definition of each metric should include the mathematical or statistical techniques for calculating this metric. In addition, the definition of each metric should include the accepted range of the metric value and the time check points for calculating this metric.


The metrics can be calculated weekly as a part of the project progress report. The metrics must be known, clarified, and negotiated with all team members. Previous experience from similar projects can be useful in this process. In addition, this process can be achieved with the assistance of external consultants to define and validate the metrics.

5. The Proposed Approach for Measuring the Quality of Adopting XP Practices

The researcher proposes an approach for evaluating the implementation of XP practices. This approach aims to evaluate the degree to which various XP practices are implemented. The researcher adapts the GQM method to achieve this objective. Software organizations and XP teams can use the proposed approach as a guide to monitor the work environment of software projects in order to control and enhance the implementation of XP practices. Figure (5) illustrates the proposed approach as a flowchart: starting from the selection of an XP practice, its goal, questions, and metrics are recalled; each metric is calculated in turn and, if its calculated value is not accepted, a special note is added to the final report; the loop repeats over the remaining metrics and questions, and finally the final report is interpreted and analyzed.

Figure (5): The Proposed Approach.


In other words, the researcher re-presents the proposed approach in the form of steps, as shown in figure (6) and explained in the following.

1. Select an XP practice to measure the implementation of it.
2. Recall the goal, list of questions, and list of metrics of the selected practice. Note that there is one goal for each XP practice.
3. Select a question from the list of questions.
4. Select a metric from the list of metrics related to the selected question.
5. Collect the data that are necessary to calculate the selected metric.
6. Calculate the selected metric.
7. If the calculated value of the metric is NOT accepted (according to the acceptable value range of this metric), then add a special note to the final report.
8. If the list of metrics is NOT empty, then go to step 4.
9. If the list of questions is NOT empty, then go to step 3.
10. Interpret and analyze the final report.

Figure (6): The Proposed Approach in the Form of Steps.
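The following minimal Python sketch shows one possible way to automate the loop of steps 1-10 for a single practice. The data layout, the collect_value callable, the accepted ranges, and the report format are hypothetical assumptions for illustration, not part of the proposed approach itself.

def evaluate_practice(practice, collect_value, accepted_ranges):
    """Run steps 2-10 of the proposed approach for one selected XP practice.

    practice        -- dict: {"name": ..., "goal": ..., "questions": {question: [metric, ...]}}
    collect_value   -- callable(practice_name, question, metric) -> percentage or None (NA)
    accepted_ranges -- dict mapping a metric name to its (low, high) accepted range
    """
    report = {"practice": practice["name"], "goal": practice["goal"], "results": [], "notes": []}
    for question, metrics in practice["questions"].items():           # steps 3 and 9
        for metric in metrics:                                         # steps 4 and 8
            value = collect_value(practice["name"], question, metric)  # steps 5 and 6
            if value is None:                                          # NA metrics are eliminated
                continue
            report["results"].append((question, metric, value))
            low, high = accepted_ranges.get(metric, (0.0, 100.0))
            if not low <= value <= high:                               # step 7: out-of-range value
                report["notes"].append(
                    f"{metric}: {value:.1f}% outside accepted range {low}-{high}%")
    return report                                                      # step 10: interpret and analyze

# Hypothetical usage with a fragment of the "Planning Game" GQM tree.
planning_game = {
    "name": "Planning Game",
    "goal": "Creating, estimating, and prioritizing requirements for the next release.",
    "questions": {"Q6: Is the process achieved within the planned time?":
                  ["M1: actual time vs. planned time"]},
}
ranges = {"M1: actual time vs. planned time": (0.0, 110.0)}
print(evaluate_practice(planning_game, lambda p, q, m: 125.0, ranges))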

1. Select an XP practice from the twelve XP practices to measure its implementation. In the previous section, the researcher prepared an elaborated list of goals for all XP practices, assuming that each XP practice has one goal. In addition, the researcher selected "Planning Game" and prepared an elaborated list of questions and metrics for it.
2. Recall the goal, list of questions, and list of metrics of the selected practice. For "Planning Game", there are six questions and sixteen metrics. The definition of each metric must include:
• The data that must be collected to calculate the metric.
• The mathematical or statistical technique for calculating the metric.
• The time check points for calculating the metric.
• The acceptable value range of the metric.
3. Select a question from the list of questions related to the goal of the selected practice. The elaborated list of questions for "Planning Game" was listed in table (2).
4. Select a metric from the list of metrics related to the selected question. Some metrics may be Not Applicable (NA) in specific cases [36], so during the computation of the metric values, the NA metrics are eliminated.
5. Collect the data that are necessary for calculating the selected metric. The data can be collected by direct observation and questionnaires. These data may be affected by the time check points that were predetermined for calculating the metric.
6. Calculate the selected metric. Calculating metrics is a simple process because it depends on simple and well-known statistical or mathematical formulas, so the researcher does not focus on how the metrics are calculated.
7. If the calculated value of the metric is NOT accepted (according to the acceptable value range of this metric), then add a special note to the final report. In other words, after calculating the value of the metric, the calculated value is compared with the acceptable value range of the metric. If the metric value is out of the accepted value range, a special note must be added to the final report.
8. If the list of metrics is NOT empty, then go to step 4. Because a question can be answered by one or more metrics, the list of metrics may include one metric or more. Therefore, this list must be checked to know whether it is empty. If it is not empty, we must go to step 4 to select another metric and repeat steps 4 to 8.



9. If the list of questions is NOT empty, then go to step 3. Because the goal includes many questions, the list of questions must be checked to know whether it is empty. If it is not empty, we must go to step 3 to select another question and repeat steps 3 to 9.
10. Interpret and analyze the final report. The final report must include the name of the practice, its goal, questions, metrics, and the value of each metric. In addition, the final report must include a special note beside any unaccepted metric value. Top management and the project team must interpret and analyze the final report to discover the weaknesses and strengths. They may take corrective actions to solve the problems that led to the unaccepted metric values. In addition, they may take supportive actions to maintain and increase the strengths of the XP practices.

6. Conclusion

This paper aimed to propose an approach to evaluate the degree to which various XP practices are implemented. Therefore, the researcher proposed an approach for this purpose using the GQM method. The proposed approach includes: identifying the goal of each XP practice, identifying the questions that attempt to define and quantify this goal, identifying the metrics that can answer each question, calculating the metrics, and comparing the resulting value of each metric with its acceptable value range. If the metric value is out of the accepted value range, a special note must be added to the final report. Top management and the project team must interpret and analyze the final report to discover the weaknesses and strengths. They may take corrective actions to solve the problems that led to the unaccepted metric values. In addition, they may take supportive actions to maintain and increase the strengths of the XP practices. We conclude that metrics of the implementation of XP practices can play an important role in monitoring, controlling, enhancing, and improving the work environment of software projects. In addition, these metrics can help the team and management to discover and avoid the weaknesses of applying XP practices.

7. Future Work

In the future, the researcher intends to refine the proposed approach and build a software tool to automate the process of evaluating the implementation of XP practices. In addition, the researcher will focus on the following issues:
• Enhancing the evaluation of the quality of applying pair programming and refactoring.
• Studying the relationships among XP practices.
• Using the XP approach to achieve higher levels of Capability Maturity Model Integration (CMMI) for IT companies.
• Proposing a framework for applying XP practices to large-scale projects.
• Using the XP approach to improve the quality of web sites.
• Proposing a framework for applying XP practices to Cloud Application Development.

References

1. A. Qumer and B. Henderson-Sellers, "An Evaluation of the Degree of Agility in Six Agile Methods and its Applicability for Method Engineering", Information and Software Technology, Vol. 50, Issue 4, pp. 280-295, 2008.
2. Alan S. Koch, "Agile Software Development - Evaluating the Methods for Your Organization", Artech House Inc., 2005.



3. Dean Leffingwell, "Scaling Software Agility - Best Practices for Large Enterprises", The Agile Software Development Series, Pearson Education Inc., 2007.
4. Norman Fenton, Robin Whitty, and Yoshinori Iizuka, "Software Quality Assurance and Measurement", International Thomson Computer Press, 1995.
5. G. Gordon Schulmeyer, "Handbook of Software Quality Assurance", 4th edition, Artech House Inc., 2008.
6. Giulio Concas, Marco Di Francesco, Michele Marchesi, Roberta Quaresima, and Sandro Pinna, "An Agile Development Process and Its Assessment Using Quantitative Object-Oriented Metrics", 9th International Conference, XP 2008, Limerick, Ireland, Proceedings, June 2008.
7. H. Robinson and H. Sharp, "XP Culture: Why the twelve practices both are and are not the most significant thing", presented at the 1st International Agile Development Conference (ADC '03), Salt Lake City, UT, pp. 12-21, 2003.
8. H. Sharp and H. Robinson, "Collaboration and co-ordination in mature eXtreme programming teams", International Journal of Human-Computer Studies, Vol. 66, Issue 7, pp. 506-518, 2008.
9. Hamid Mcheick, "Improving and Survey of Extreme Programming Agile Methodology", International Journal of Advanced Computing (IJAC), Vol. 3, Issue 3, July 2011.
10. IEEE Standard 610, "Glossary of Software Engineering", 1990.
11. Ioannis G. Stamelos and Panagiotis Sfetsos, "Agile Software Development Quality Assurance", Information Science Reference, Idea Group Inc., 2007.
12. Jeffrey A. Livermore, "Factors that Significantly Impact the Implementation of an Agile Software Development Methodology", Journal of Software, Vol. 3, No. 4, April 2008.
13. Jennifer Stapleton, "DSDM: The Method in Practice", Addison-Wesley, 1997.
14. John Hunt, "Agile Software Construction", Springer, 2006.
15. K. Usha, N. Poonguzhali, and E. Kavitha, "A Quantitative Approach for Evaluating the Effectiveness of Refactoring in Software Development Process", International Conference on Methods and Models in Computer Science, Delhi, India, Dec. 2009.
16. Karlheinz Kautz and Sabine Zumpe, "Just Enough Structure at the Edge of Chaos: Agile Information System Development in Practice", 9th International Conference, XP 2008, Limerick, Ireland, Proceedings, June 2008.
17. K. Beck, "Extreme Programming Explained: Embrace Change", Addison-Wesley, 1999.
18. K. Beck and C. Andres, "Extreme Programming Explained: Embrace Change", Addison-Wesley, 2005.
19. L. Williams, R. Kessler, W. Cunningham, and R. Jeffries, "Strengthening the Case for Pair Programming", IEEE Software, Vol. 17, pp. 19-25, 2000.
20. Laurie Williams, William Krebs, Lucas Layman, Annie I. Antón, and Pekka Abrahamsson, "Toward a Framework for Evaluating Extreme Programming", Proceedings of the 8th International Conference on Empirical Assessment in Software Engineering, Edinburgh, Scotland, May 24-25, pp. 11-20, 2004.
21. Lucas Layman, Laurie Williams, Daniela Damian, and Hynek Bures, "Essential communication practices for Extreme Programming in a global software development team", Information and Software Technology, Vol. 48, Issue 9, pp. 781-794, September 2006.
22. M. Angioni, D. Carboni, S. Pinna, R. Sanna, N. Serra, and A. Soro, "Integrating XP Project Management in Development Environments", Journal of Systems Architecture, Vol. 52, Issue 11, pp. 619-626, 2006.
23. Markus Rittenbruch, Gregor McEwan, Nigel Ward, Tim Mansfield, and Dominik Bartenstein, "Extreme Participation - Moving Extreme Programming Towards Participatory Design", Proceedings of the Participatory Design Conference, Malmo, Sweden, pp. 29-41, June 23-25, 2002.
24. Nagy Ramadan Darwish, "Improving the Quality of Applying eXtreme Programming (XP) Approach", International Journal of Computer Science and Information Security (IJCSIS), ISSN 1947-5500, Vol. 9, No. 11, November 2011.
25. Noura Abbas, Andrew M. Gravell, and Gary B. Wills, "Historical Roots of Agile Methods: Where Did Agile Thinking Come From?", 9th International Conference, XP 2008, Limerick, Ireland, Proceedings, June 2008.
26. Pekka Abrahamsson, Outi Salo, Jussi Ronkainen, and Juhani Warsta, "Agile Software Development Methods - Review and Analysis", VTT, 2002.
27. Peter Hearty, Norman Fenton, David Marquez, and Martin Neil, "Predicting Project Velocity in XP Using a Learning Dynamic Bayesian Network Model", IEEE Transactions on Software Engineering, Vol. 35, No. 1, Jan/Feb 2009.
28. Paul Goodman, "Software Metrics: Best Practices for Successful IT Management", Rothstein Associates, 2004.
29. R. C. Martin, "Extreme Programming - Development Through Dialog", IEEE Software, pp. 12-13, 2000.
30. Ridi Ferdiana and Paulus Insap, "Improving Mobility in eXtreme Programming Methods through Computer Support Cooperative Work", International Journal of Computer Science and Information Security (IJCSIS), ISSN 1947-5500, Vol. 10, No. 2, February 2012.
31. Seiyoung Lee and Hwan-Seung Yong, "Agile Software Development Framework in a Small Project Environment", Journal of Information Processing Systems, Vol. 9, No. 1, 2013.
32. Sukhpal Singh and Inderveer Chana, "Enabling Reusability in Agile Software Development", International Journal of Computer Applications, Vol. 50, No. 13, July 2012.
33. Tobias Hildenbrand, Michael Geisser, Thomas Kude, Denis Bruch, and Thomas Acker, "Agile Methodologies for Distributed Collaborative Development of Enterprise Applications", International Conference on Complex, Intelligent and Software Intensive Systems, 2008.
34. V. Basili, G. Caldiera, and H. D. Rombach, "Goal Question Metric Approach", Encyclopedia of Software Engineering, pp. 528-532, John Wiley & Sons, Inc., 1994.
35. Venkata V. K. Padmanabhuni and Hari P. Tadiparthi, "Effective Pair Programming Practice - An Experimental Study", Journal of Emerging Trends in Computing and Information Sciences, ISSN 2079-8407, Vol. 3, No. 4, April 2012.
36. William E. Perry, "Quality Assurance for Information Systems: Methods, Tools, and Techniques", QED Technical Publishing Group, 1991.
37. Yang Yong and Bosheng Zhou, "Evaluating Extreme Programming Effect through System Dynamics Modeling", International Conference on Computational Intelligence and Software Engineering (CiSE), Wuhan, China, Dec. 2009.
38. Zdena Dobešová and Dagmar Kusendová, "Goal-Question-Metric method for evaluation of cartographic functionality in GIS software", Proceedings GIS Ostrava, 2009.
