Advanced Software Engineering
Ramakrishna Reddy
[email protected] | irfan@email.com | 9966484777
Software Process Improvement
“Never Stop Learning”
What is SPI?
SPI implies that:
- elements of an effective software process can be defined in an effective manner,
- an existing organizational approach to software development can be assessed against those elements, and
- a meaningful strategy for improvement can be defined.
The SPI strategy transforms the existing approach to software development into something that is more focused, more repeatable, and more reliable (in terms of the quality of the product produced and the timeliness of delivery).
Information Systems Development
Resources (equipment, hardware, software, documentation) are applied to Activities (planning, analysis, design, construction, testing, training, implementation, follow-up, enhancements, etc.), which produce Products.
Software Process Improvement Efforts
- Carnegie Mellon University Software Engineering Institute's Capability Maturity Model (SEI CMM)
- International Organization for Standardization's ISO 9001 specification
- Proprietary SPI frameworks from consulting firms
SPI Framework
An SPI framework defines:
- a set of characteristics that must be present if an effective software process is to be achieved,
- a method for assessing whether those characteristics are present,
- a mechanism for summarizing the results of any assessment, and
- a strategy for assisting a software organization in implementing those process characteristics that have been found to be weak or missing.
An SPI framework assesses the “maturity” of an organization's software process and provides a qualitative indication of a maturity level.
Process Improvement Cycle
A continuous cycle of Measure, Analyse, and Change.
Elements of an SPI Framework
Constituencies
- Quality certifiers: Quality(Process) --> Quality(Product)
- Formalists: process modeling languages
- Tool advocates
- Practitioners: little formal process modeling
- Reformers: organisational change
- Ideologists: particular software processes for specific organisations
Maturity Models
A maturity model is applied within the context of an SPI framework. The intent of the maturity model is to provide an overall indication of the “process maturity” exhibited by a software organization:
- an indication of the quality of the software process,
- the degree to which practitioners understand and apply the process, and
- the general state of software engineering practice.
Four Levels of Immaturity
Schorsch suggests four levels of immaturity:
- Level 0: Negligent – failure to allow processes
- Level -1: Obstructive – counterproductive processes are imposed
- Level -2: Contemptuous – disregard for good software engineering
- Level -3: Undermining – total neglect of own charter
Is SPI for Everyone?
Can a small company initiate SPI activities and do it successfully? Answer: a qualified “yes.” It should come as no surprise that small organizations are more informal, apply fewer standard practices, and tend to be self-organizing. SPI will be approved and implemented only after its proponents demonstrate financial leverage.
The SPI Process—I
Five activities: assessment and gap analysis, education and training, selection and justification, installation/migration, and evaluation.
Assessment and Gap Analysis
Assessment examines a wide range of actions and tasks that will lead to a high-quality process:
- Consistency. Are important activities, actions, and tasks applied consistently across all software projects and by all software teams?
- Sophistication. Are management and technical actions performed with a level of sophistication that implies a thorough understanding of best practice?
- Acceptance. Is the software process and software engineering practice widely accepted by management and technical staff?
- Commitment. Has management committed the resources required to achieve consistency, sophistication, and acceptance?
Gap analysis—the difference between local application and best practice represents a “gap” that offers opportunities for improvement.
The SPI Process—II
Education and Training
Three types of education and training should be conducted:
- Generic concepts and methods. Directed toward both managers and practitioners, this category stresses both process and practice. The intent is to provide professionals with the intellectual tools they need to apply the software process effectively and to make rational decisions about improvements to the process.
- Specific technology and tools. Directed primarily toward practitioners, this category stresses technologies and tools that have been adopted for local use. For example, if UML has been chosen for analysis and design modeling, a training curriculum for software engineering using UML would be established.
- Business communication and quality-related topics. Directed toward all stakeholders, this category focuses on “soft” topics that help enable better communication among stakeholders and foster a greater quality focus.
The SPI Process—III
Selection and Justification
- Choose the process model (Chapters 2 and 3) that best fits your organization, its stakeholders, and the software that you build.
- Decide on the set of framework activities that will be applied, the major work products that will be produced, and the quality assurance checkpoints that will enable your team to assess progress.
- Develop a work breakdown for each framework activity (e.g., modeling), defining the task set that would be applied for a typical project.
Once a choice is made, time and money must be expended to install it within an organization, and these resource expenditures should be justified.
The SPI Process—IV
Installation/Migration
These are actually software process redesign (SPR) activities. Scacchi [Sca00] states that “SPR is concerned with identification, application, and refinement of new ways to dramatically improve and transform software processes.”
Three different process models are considered: the existing (“as-is”) process, a transitional (“here-to-there”) process, and the target (“to-be”) process.
The SPI Process—V
Evaluation
Evaluation assesses:
- the degree to which changes have been instantiated and adopted,
- the degree to which such changes result in better software quality or other tangible process benefits, and
- the overall status of the process and the organizational culture as SPI activities proceed.
From a qualitative point of view, past management and practitioner attitudes about the software process can be compared to attitudes polled after installation of process changes.
Risk Management for SPI
Manage risk at three key points in the SPI process [Sta97b]:
- prior to the initiation of the SPI roadmap,
- during the execution of SPI activities (assessment, education, selection, installation), and
- during the evaluation activity that follows the instantiation of some process characteristic.
In general, the following categories [Sta97b] can be identified for SPI risk factors: budget and cost; content and deliverables; culture; maintenance of SPI deliverables; mission and goals; organizational management and organizational stability; process stakeholders; schedule for SPI development; SPI development environment and process; SPI project management and SPI staff.
Critical Success Factors
The top five CSFs are [Ste99]:
- Management commitment and support
- Staff involvement
- Process integration and understanding
- A customized SPI strategy
- Solid management of the SPI project
CMMI: The Presequel
The CMMI model
An integrated capability model that includes software and systems engineering capability assessment. The model has two instantiations:
- Staged, where the model is expressed in terms of maturity levels;
- Continuous, where a capability rating is computed for each process area.
SEI Capability Maturity Model
- Level 1: Initial (~45% of organizations)
- Level 2: Repeatable – Basic Management Control (~30%)
- Level 3: Defined – Process Definition (~20%)
- Level 4: Managed – Process Measurement (~2-3%)
- Level 5: Optimizing – Process Control (<1%)
CMM - Initial (Level 1) “BASICALLY NO CONTROL” The software process is characterized as ad hoc, occasionally even chaotic Few processes are defined Success depends on individual effort and heroics
CMM - Repeatable (Level 2) Basic project management processes are established to track cost, schedule, and functionality The necessary process discipline is in place to repeat earlier successes on projects with similar applications Success achieved through basic project management; not advanced technologies “BASIC MANAGEMENT CONTROL”
CMM - Defined (Level 3) The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization All projects use an approved, tailored version of the organization’s standard software process for developing and maintaining software Formality lends itself to improvement “PROCESS DEFINITION”
CMM - Managed (Level 4) Detailed measures of the software process and product quality are collected Both the software process and products are quantitatively understood and controlled A software metrics program is in use “PROCESS MEASUREMENT”
CMM - Optimizing (Level 5) Continuous process improvement is enabled by quantitative (metrics) feedback from the process Continuous process improvement is enabled by piloting innovative ideas and technologies “PROCESS CONTROL”
The continuous CMMI model This is a finer-grain model that considers individual or groups of practices and assesses their use. The maturity assessment is not a single value but a set of values showing the organisation's maturity in each area. The CMMI rates each process area from levels 1 to 5. The advantage of a continuous approach is that organisations can pick and choose process areas to improve according to their local needs.
CMMI model components Process areas 24 process areas that are relevant to process capability and improvement are identified. These are organised into 4 groups. Goals Goals are descriptions of desirable organisational states. Each process area has associated goals. Practices Practices are ways of achieving a goal - however, they are advisory and other approaches to achieve the goal may be used.
CMMI process areas 1
CMMI process areas 2
CMMI goals
CMMI practices
Goal: The requirements are analysed and validated, and a definition of the required functionality is developed.
Associated practices: Analyse derived requirements to ensure that they are necessary and sufficient. Validate requirements to ensure that the resulting product will perform as intended in the user's environment, using multiple techniques as appropriate.
Goal: Root causes of defects and other problems are systematically determined.
Associated practices: Select the defects and other problems for analysis. Perform causal analysis of selected defects and other problems and propose actions to address them.
Goal: The process is institutionalised as a defined process.
Associated practices: Establish and maintain an organisational policy for planning and performing the requirements development process. Assign responsibility and authority for performing the process, developing the work products and providing the services of the requirements development process.
CMMI assessment Examines the processes used in an organisation and assesses their maturity in each process area. Based on a 6-point scale: Not performed; Performed; Managed; Defined; Quantitatively managed; Optimizing.
A process capability profile
An example profile plots a capability rating (1 to 5) for each process area, e.g. requirements management, configuration management, verification, validation, project monitoring and control, supplier agreement management, and risk management.
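As a rough illustration of what such a profile amounts to in data terms (the process-area ratings below are made up for the example):

```python
# Hypothetical capability ratings (1-5) per CMMI process area.
profile = {
    "Requirements management": 4,
    "Configuration management": 3,
    "Verification": 2,
    "Validation": 3,
    "Project monitoring and control": 4,
    "Supplier agreement management": 2,
    "Risk management": 1,
}

# Render a crude text "bar chart" of the capability profile.
for area, rating in profile.items():
    print(f"{area:32s} {'#' * rating} ({rating})")
```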
The People CMM
“a roadmap for implementing workforce practices that continuously improve the capability of an organization's workforce.” [Cur02]
Defines a set of five organizational maturity levels that provide an indication of the relative sophistication of workforce practices and processes.
P-CMM Process Areas
Other SPI Frameworks
- SPICE—an international initiative to support the International Standard ISO/IEC 15504 for (Software) Process Assessment [ISO08]
- Bootstrap—an SPI framework for small and medium-sized organizations that conforms to SPICE [Boo06]
- PSP and TSP—individual and team-specific SPI frameworks ([Hum97], [Hum00]) that focus on process in-the-small, a more rigorous approach to software development coupled with measurement
- TickIT—an auditing method [Tic05] that assesses an organization's compliance to ISO Standard 9001:2000
SPI Return on Investment
“How do I know that we'll achieve a reasonable return for the money we're spending?”
ROI = [Σ(benefits) − Σ(costs)] / Σ(costs) × 100%
where
- benefits include the cost savings associated with higher product quality (fewer defects), less rework, reduced effort associated with changes, and the income that accrues from shorter time-to-market;
- costs include both direct SPI costs (e.g., training, measurement) and indirect costs associated with greater emphasis on quality control and change management activities and more rigorous application of software engineering methods (e.g., the creation of a design model).
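A minimal worked example of the ROI formula above, using made-up benefit and cost figures purely for illustration:

```python
def spi_roi(benefits, costs):
    """ROI (%) = [sum(benefits) - sum(costs)] / sum(costs) * 100."""
    total_benefits = sum(benefits)
    total_costs = sum(costs)
    return (total_benefits - total_costs) / total_costs * 100

# Hypothetical figures: savings from fewer defects, less rework, and earlier
# time-to-market vs. SPI training, measurement, and quality-control costs.
benefits = [120_000, 45_000, 60_000]
costs = [80_000, 25_000]
print(f"SPI ROI: {spi_roi(benefits, costs):.1f}%")  # -> SPI ROI: 114.3%
```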
SPI Trends
- Future SPI frameworks must become significantly more agile.
- Rather than an organizational focus (which can take years to complete successfully), contemporary SPI efforts should focus on the project level.
- To achieve meaningful results (even at the project level) in a short time frame, complex framework models may give way to simpler models.
- Rather than dozens of key practices and hundreds of supplementary practices, an agile SPI framework should emphasize only a few pivotal practices.
SPI AFTERTHOUGHTS “...according to the SEI model, Apple Computer should not exist.” Tom DeMarco Small organizations may not be able to afford the overhead required by an SEI-type model You can’t skip levels It takes time (2 to 3 years/level) to move from one level to the next Not many organizations are beyond Level 1 New organizations are unlikely to start at Level 3 Levels are important in some contracts
Requirements engineering The process of establishing the services that the customer requires from a system and the constraints under which it operates and is developed. The requirements themselves are the descriptions of the system services and constraints that are generated during the requirements engineering process.
Functional and non-functional requirements
- Functional requirements: statements of services the system should provide, how the system should react to particular inputs, and how the system should behave in particular situations.
- Non-functional requirements: constraints on the services or functions offered by the system, such as timing constraints, constraints on the development process, standards, etc.
- Domain requirements: requirements that come from the application domain of the system and that reflect characteristics of that domain.
Requirements Engineering
Tasks: Inception, Elicitation, Elaboration, Negotiation, Specification, Validation, and Requirements Management.
Requirements Engineering-II Inception —ask a set of questions that establish … basic understanding of the problem the people who want a solution the nature of the solution that is desired, and the effectiveness of preliminary communication and collaboration between the customer and the developer Elicitation —elicit requirements from all stakeholders Elaboration —create an analysis model that identifies data, function and behavioral requirements Negotiation —agree on a deliverable system that is realistic for developers and customers
Requirements Engineering-III
Specification—can be any one (or more) of the following:
- A written document
- A set of models
- A formal mathematical specification
- A collection of user scenarios (use-cases)
- A prototype
Validation—a review mechanism that looks for
- errors in content or interpretation
- areas where clarification may be required
- missing information
- inconsistencies (a major problem when large products or systems are engineered)
- conflicting or unrealistic (unachievable) requirements
Requirements management
Inception
- Identify stakeholders: “who else do you think I should talk to?”
- Recognize multiple points of view
- Work toward collaboration
- The first questions:
  - Who is behind the request for this work?
  - Who will use the solution?
  - What will be the economic benefit of a successful solution?
  - Is there another source for the solution that you need?
Eliciting Requirements
- Meetings are conducted and attended by both software engineers and customers.
- Rules for preparation and participation are established.
- An agenda is suggested.
- A “facilitator” (can be a customer, a developer, or an outsider) controls the meeting.
- A “definition mechanism” (work sheets, flip charts, wall stickers, or an electronic bulletin board, chat room, or virtual forum) is used.
- The goal is to identify the problem (objects, services, constraints, performance, mini-specifications, issues list), propose elements of the solution, negotiate different approaches, and specify a preliminary set of solution requirements.
Eliciting Requirements (overview)
Conduct FAST meetings; make lists of functions, classes, constraints, etc. If formal prioritization is required, use QFD to prioritize requirements; otherwise prioritize them informally. Then create use-cases: define actors, write the scenario, complete the template, and draw the use-case diagram.
Quality Function Deployment
- Function deployment determines the “value” (as perceived by the customer) of each function required of the system: normal, expected, and exciting requirements.
- Information deployment identifies data objects and events.
- Task deployment examines the behavior of the system.
- Value analysis determines the relative priority of requirements (see the sketch after this list).
- Customer Voice Table
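A minimal sketch of the idea behind value analysis, using made-up requirement names, value scores, and a simple weighting scheme (this is an illustration, not the formal QFD matrices):

```python
# Hypothetical requirements with customer-perceived value scores (1-5)
# and a coarse QFD category (normal, expected, exciting).
requirements = [
    {"name": "arm/disarm via keypad", "category": "expected", "value": 5},
    {"name": "configure sensors via web", "category": "normal", "value": 4},
    {"name": "voice-controlled arming", "category": "exciting", "value": 3},
]

# Assumed weighting: exciting features get a small boost over baseline.
category_weight = {"expected": 1.0, "normal": 1.0, "exciting": 1.2}

# Rank requirements by weighted value (highest priority first).
for r in sorted(requirements,
                key=lambda r: r["value"] * category_weight[r["category"]],
                reverse=True):
    priority = r["value"] * category_weight[r["category"]]
    print(f"{r['name']:30s} priority = {priority:.1f}")
```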
Elicitation Work Products a statement of need and feasibility. a bounded statement of scope for the system or product. a list of customers, users, and other stakeholders who participated in requirements elicitation a description of the system’s technical environment. a list of requirements (preferably organized by function) and the domain constraints that apply to each. a set of usage scenarios that provide insight into the use of the system or product under different operating conditions. any prototypes developed to better define requirements .
Building the Analysis Model
Elements of the analysis model:
- Scenario-based elements: functional—processing narratives for software functions; use-case—descriptions of the interaction between an “actor” and the system
- Class-based elements: implied by scenarios
- Behavioral elements: state diagram
- Flow-oriented elements: data flow diagram
Use-Cases
A collection of user scenarios that describe the thread of usage of a system. Each scenario is described from the point-of-view of an “actor”—a person or device that interacts with the software in some way. Each scenario answers the following questions (one way to record the answers is sketched after this list):
- Who is the primary actor, the secondary actor(s)?
- What are the actor's goals?
- What preconditions should exist before the story begins?
- What main tasks or functions are performed by the actor?
- What extensions might be considered as the story is described?
- What variations in the actor's interaction are possible?
- What system information will the actor acquire, produce, or change?
- Will the actor have to inform the system about changes in the external environment?
- What information does the actor desire from the system?
- Does the actor wish to be informed about unexpected changes?
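One lightweight way to capture the answers to these questions is a small record per scenario. The field names and the SafeHome example values below are illustrative assumptions, not a prescribed template:

```python
from dataclasses import dataclass, field

@dataclass
class UseCaseScenario:
    name: str
    primary_actor: str
    secondary_actors: list = field(default_factory=list)
    goal: str = ""
    preconditions: list = field(default_factory=list)
    main_tasks: list = field(default_factory=list)
    extensions: list = field(default_factory=list)
    information_exchanged: list = field(default_factory=list)

# Example drawn loosely from the SafeHome system used in the slides that follow.
arm_system = UseCaseScenario(
    name="Arm the system",
    primary_actor="homeowner",
    goal="Put the home security system into the armed state",
    preconditions=["System is installed and powered", "No alarm condition active"],
    main_tasks=["Enter password", "Select 'stay' or 'away' mode", "Confirm arming"],
    extensions=["Incorrect password entered", "Sensor left open"],
    information_exchanged=["System status", "Error/confirmation messages"],
)
print(arm_system.name, "-", arm_system.goal)
```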
Use-Case Diagram
From the SafeHome system: actors homeowner, system administrator, and sensors; use-cases include Arms/disarms system, Accesses system via Internet, Reconfigures sensors and related system features, Responds to alarm event, and Encounters an error condition.
Class Diagram
From the SafeHome system: class Sensor with attributes name/id, type, location, area, and characteristics, and operations identify(), enable(), disable(), and reconfigure().
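A minimal sketch of how the Sensor class above might look in code; the attribute types, the enabled flag, and the method bodies are assumptions for illustration only:

```python
class Sensor:
    """Class-based analysis element from the SafeHome example."""

    def __init__(self, sensor_id: str, sensor_type: str,
                 location: str, area: str, characteristics: dict):
        self.sensor_id = sensor_id          # name/id
        self.sensor_type = sensor_type      # e.g. "motion", "window"
        self.location = location
        self.area = area
        self.characteristics = characteristics
        self.enabled = False                # assumed internal state

    def identify(self) -> str:
        return f"{self.sensor_id} ({self.sensor_type}) at {self.location}"

    def enable(self) -> None:
        self.enabled = True

    def disable(self) -> None:
        self.enabled = False

    def reconfigure(self, **characteristics) -> None:
        self.characteristics.update(characteristics)
```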
State Diagram
Example state: state name Reading Commands; state variables: System status = “ready”, Display msg = “enter cmd”, Display status = steady; state activities: entry/ subsystems ready; do/ poll user input panel; do/ read user input; do/ interpret user input.
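A rough sketch of how the Reading Commands state could be represented behaviourally. The class name, the helper methods, and the placeholder input handling are assumptions for illustration, not part of the original model:

```python
from enum import Enum, auto

class SystemState(Enum):
    READING_COMMANDS = auto()
    # ... other states of the control panel would go here

class ControlPanel:
    def __init__(self):
        self.state = SystemState.READING_COMMANDS
        self.system_status = "ready"      # state variables from the diagram
        self.display_msg = "enter cmd"
        self.display_status = "steady"

    def on_entry(self):
        """Entry action: subsystems ready."""
        self.system_status = "ready"

    def do_activities(self):
        """Do-activities while in the Reading Commands state."""
        raw = self.poll_user_input_panel()
        cmd = self.read_user_input(raw)
        return self.interpret_user_input(cmd)

    # The three helpers below are placeholders for real panel I/O.
    def poll_user_input_panel(self):
        return "arm"

    def read_user_input(self, raw):
        return raw.strip().lower()

    def interpret_user_input(self, cmd):
        return {"command": cmd}
```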
Analysis Patterns
- Pattern name: A descriptor that captures the essence of the pattern.
- Intent: Describes what the pattern accomplishes or represents.
- Motivation: A scenario that illustrates how the pattern can be used to address the problem.
- Forces and context: A description of external issues (forces) that can affect how the pattern is used and also the external issues that will be resolved when the pattern is applied.
- Solution: A description of how the pattern is applied to solve the problem, with an emphasis on structural and behavioral issues.
- Consequences: Addresses what happens when the pattern is applied and what trade-offs exist during its application.
- Design: Discusses how the analysis pattern can be achieved through the use of known design patterns.
- Known uses: Examples of uses within actual systems.
- Related patterns: One or more analysis patterns that are related to the named pattern because (1) it is commonly used with the named pattern; (2) it is structurally similar to the named pattern; or (3) it is a variation of the named pattern.
Negotiating Requirements
- Identify the key stakeholders: these are the people who will be involved in the negotiation.
- Determine each of the stakeholders' “win conditions”: win conditions are not always obvious.
- Negotiate: work toward a set of requirements that lead to “win-win.”
Validating Requirements - I
- Is each requirement consistent with the overall objective for the system/product?
- Have all requirements been specified at the proper level of abstraction? That is, do some requirements provide a level of technical detail that is inappropriate at this stage?
- Is the requirement really necessary or does it represent an add-on feature that may not be essential to the objective of the system?
- Is each requirement bounded and unambiguous?
- Does each requirement have attribution? That is, is a source (generally, a specific individual) noted for each requirement?
- Do any requirements conflict with other requirements?
Validating Requirements - II
- Is each requirement achievable in the technical environment that will house the system or product?
- Is each requirement testable, once implemented?
- Does the requirements model properly reflect the information, function, and behavior of the system to be built?
- Has the requirements model been “partitioned” in a way that exposes progressively more detailed information about the system?
- Have requirements patterns been used to simplify the requirements model? Have all patterns been properly validated? Are all patterns consistent with customer requirements?