Agenda
- Why use a test plan
- How to plan for testing
- Inputs for the test plan
- Test items
- Risk assessment
- Test strategy
- Resources
- Test scheduling
- Test deliverables
- Generating the test plan
- Types of test plan
Why use a test plan
- It guides our thinking.
- It serves as a means of communication with other members of the project team.
- It helps to manage changes.
How to plan for testing
Inputs for the test plan
Test planning draws on a review of existing system materials, such as:
- Project plan
- Requirements document
- Design document
- Use-case model
- Supplemental specifications
- Client minutes of meeting (MOM)
- Existing systems
Reviewing these materials feeds the test-planning activity, whose output is the testing plan.
Define test items
Why identify test items:
- To identify what is being tested.
- To determine the overall test effort.
- To serve as the basis for test coverage.
Items to be tested should be verifiable: they have an observable, measurable outcome.
Example: "The home page needs to load fast" is not verifiable; "The home page loads within a maximum of 10 seconds once the home page link is clicked" is.
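A minimal sketch of how the verifiable item above could become an automated check, in Python; the URL is a hypothetical placeholder, not from the source:

```python
# Turn the verifiable test item into an automated check.
import time
import urllib.request

HOME_PAGE_URL = "https://example.com/"  # hypothetical URL for illustration
MAX_LOAD_SECONDS = 10                   # threshold from the test item

def test_home_page_load_time():
    start = time.monotonic()
    with urllib.request.urlopen(HOME_PAGE_URL) as response:
        response.read()                 # fetch the full page body
    elapsed = time.monotonic() - start
    assert elapsed <= MAX_LOAD_SECONDS, (
        f"Home page took {elapsed:.1f}s, limit is {MAX_LOAD_SECONDS}s"
    )
```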
Output:
- A hierarchy of features to be tested, which can be grouped by:
  - Use case
  - Business case
  - Type of test (functional, performance, etc.)
Note: each use case should derive at least one test item.
Example: an online public access catalog. Patrons of the library can search the library catalog online to locate various resources - books, periodicals, audio and visual materials, or other items under control of the library. Patrons may reserve or renew items, provide feedback, and manage their accounts.
Use cases: Search Catalog, Reserve Item, Renew Item, Provide Feedback, Manage Account.
Features not to be tested
Why some features may be excluded from testing:
- Not included in this release of the software.
- Low risk: has been used before and is considered stable.
- Out-of-the-box (OOB) component.
- Will be tested by the client.
Risk assessment and establishing test priority
What is risk? Risk is a future, uncertain event with a probability of occurrence and a potential for loss.
Why assess risk:
- To ensure the most critical, significant, or riskiest requirements are addressed as early as possible.
- To ensure the test effort is focused on the most appropriate requirements for testing.
- To ensure that any dependencies (sequence, data, etc.) are accounted for in the testing.
Examples of project risks:
- Wrong time estimation
- Resources not tracked properly
- Failure to identify complex functionality
- Wrong budget estimation
- Cost overruns
- Project scope expansion
- Unresolved responsibilities
- Lack of proper subject training
- No communication within the team
- Continuously changing requirements
- Difficult integration of project modules
- Running out of funds
- Market developments
- Changing customer product strategy and priorities
Three steps to assessing risk and establishing test priorities:
1. Assess risk
2. Determine the operational profile
3. Establish test priority
Assess risk
a) Identify and describe the risk magnitude indicators that will be used, such as:
   H - high risk
   M - medium risk
   L - low risk
b) For each item in your test items list, define the expected risks, select a risk magnitude indicator, and justify (in a brief statement) the value you selected.
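A minimal sketch of recording step (b) as one record per test item; the field names and the sample entry are illustrative assumptions, not values from the source:

```python
# A per-item risk record for step (b) above.
from dataclasses import dataclass

@dataclass
class RiskAssessment:
    test_item: str        # an item from the test items list
    risk_indicator: str   # "H", "M", or "L"
    justification: str    # brief statement for the chosen value

risks = [
    RiskAssessment("Search Catalog", "H",
                   "Core patron use case; a failure blocks most activity."),
]
```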
Three perspectives can be used for assessing risk:
- Effect - the consequence if a specified test item fails.
- Cause - an undesirable outcome brought about by the failure of a test item.
- Likelihood - the probability that a test item fails.
Effect
To assess risk by effect, identify a condition, event, or action and try to determine its impact. Ask the question: "What would happen if ___________?"
For example: "What would happen if, while installing the new software, the system runs out of disk space?"
Example:
Description: Insufficient disk space during install
Risk factor: H
Justification: Installing the software gives the user a first impression of the product. Undesirable outcomes such as those below would degrade the user's system and the installed software, and communicate a negative impression to the user:
- the software is partially installed (some files, some registry entries), leaving the installed software in an unstable condition, or
- the installation halts, leaving the system in an unstable state.
Cause
Assessing risk by cause is the opposite of assessing by effect. Begin by stating an undesirable event or condition, and identify the set of events that could have permitted that condition to exist. Ask a question such as: "How could ___________ happen?"
For example: "How could an order be replicated?"
Example:
Description: Replicated orders
Risk factor: H
Justification: Replicated orders increase the company's overhead and diminish profits via the costs associated with shipping, handling, and restocking. Possible causes include:
- the transaction that writes the order to the database is replicated due to user intervention (the user enters the order twice because there is no confirmation of entry), or
- the transaction that writes the order to the database is replicated due to non-user intervention (recovery from a lost Internet connection, restore of the database).
Likelihood
Assessing risk by likelihood means determining the probability that a test item will fail. The probability is usually based on external factors such as:
- Failure rate(s)
- Rate of change
- Complexity
- Origination / originator
Example: "Historically we've found many defects in the components used to implement use cases 1, 10, and 12, and our customers requested many changes in use cases 14 and 19."
Example:
Description: High failure discovery rates / defect densities in use cases 1, 10, and 12 - Risk factor: H
Description: Change requests in use cases 14 and 19 - Risk factor: H
Justification: Due to the previous high failure discovery rates and defect densities, use cases 1, 10, and 12 are considered high risk. A high number of changes to use cases 14 and 19 increases the probability of injecting defects into the code.
Determine the operational profile
a) Identify and describe the operational profile magnitude indicators that will be used, such as:
   H - quite frequently used
   M - frequently used
   L - infrequently used
b) For each item in your test items list, select an operational profile magnitude indicator and state your justification for the indicator value.
Examples: ordering items from the on-line catalog; customers inquiring on-line about their order after it is placed; the item selection dialog.
Description: Ordering items from the catalog
Operational profile factor: H
Justification: This is the most common use case executed by users.
Establish test priority
a) Identify and describe the test priority magnitude indicators that will be used, such as:
   H - must be tested
   M - should be tested, but only after all H items are tested
   L - might be tested, but not until all H and M items have been tested
b) For each item in your test items list, select a test priority indicator and state your justification.
Consider the following:
- the risk magnitude indicator value you identified earlier
- the operational profile magnitude value you identified earlier
- contractual obligations (will the target-of-test be acceptable if a use case or component is not delivered?)
Strategies for establishing a test priority include:
- Use the highest assessed factor.
- Identify one assessed factor as the most significant and use that factor's value as the priority.
- Use a combination of assessed factors to identify the priority, e.g. a weighting scheme where individual factors are weighted and the priority is calculated from the weighted values (see the sketch below).
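A minimal sketch of the weighting-scheme strategy, in Python. The weights, the H/M/L-to-number mapping, and the score thresholds are illustrative assumptions, not values from the source:

```python
# Combine the three assessed factors into one priority via weights.
LEVEL = {"H": 3, "M": 2, "L": 1}

# Hypothetical weights: risk counts most, then operational profile, contract.
WEIGHTS = {"risk": 0.5, "operational_profile": 0.3, "contract": 0.2}

def weighted_priority(risk: str, operational_profile: str, contract: str) -> str:
    score = (WEIGHTS["risk"] * LEVEL[risk]
             + WEIGHTS["operational_profile"] * LEVEL[operational_profile]
             + WEIGHTS["contract"] * LEVEL[contract])
    if score >= 2.5:
        return "H"
    if score >= 1.5:
        return "M"
    return "L"

# Note: a weighting scheme can rank an item differently than the
# highest-assessed-value strategy shown in the next example.
print(weighted_priority("L", "H", "L"))  # -> "M"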
Examples: ordering items from the on-line catalog; customers inquiring on-line about their order after it is placed; the item selection dialog.
Priority when the highest assessed value is used to determine priority:

Item                         Risk  Operational profile  Contract  Priority
Ordering items from catalog  H     H                    H         H
Customer inquiries           L     L                    L         L
Item selection dialog        L     H                    L         H
Examples of common risks:
- Delivery of a third-party product
- New version of interfacing software
- Ability to use and understand a new package/tool, etc.
- Extremely complex functions
- Modifications to components with a past history of failure
- Poorly documented modules or change requests
- Misunderstanding of the original requirements
Test strategy
- Define the types of testing that will be used and their objectives
- Define which testing techniques will be used
- Define entrance criteria
- Define suspension criteria and resumption requirements
- Define exit criteria
- Define testing stages
Define the types of testing that will be used and their objectives:
- Functional testing
- Integration testing
- System testing
- Regression testing
- Performance testing
- Security testing
- Acceptance testing
Define testing stages
Clearly state the stage in which each test will be executed. To decide which testing types will be used, a matrix maps each testing type (functional, integration, system, regression, performance, security, acceptance) to the application stages at which it runs:
- Individual components are implemented
- Individual components are integrated
- All system components are integrated
- The system is delivered to the client
Define which testing techniques will be used
For each testing type and each test item, specify:
- how the test will be implemented
- who will execute it
- which method(s) will be used to evaluate the results
Example:
Testing type: Functional testing
Test item: Registration form
How the test will be implemented: There will be a set of test cases, each representing the actions taken by the actor when the test item is executed. A minimum of two test cases will be created for each test item: one to reflect the positive condition and one to reflect the negative (unacceptable) condition.
Who will execute it: QE / automated tool (for security/performance testing: tool / SME; for acceptance testing: QE / client).
Which method(s) will be used to evaluate the results:
- Test case execution: the function was executed successfully and as desired.
- Window existence / object data verification (UI / data): the expected windows/data were displayed during test execution.
- Database reflection testing: the database is examined before the test and again after it to verify that the changes made during the test are accurately reflected in the data.
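A minimal sketch of the positive/negative test-case pair described above, written as pytest-style tests. The register() function and its validation rule are hypothetical placeholders for the registration form under test:

```python
def register(username: str, password: str) -> bool:
    """Hypothetical stand-in for the registration form's behavior."""
    return bool(username) and len(password) >= 8

def test_registration_positive():
    # Positive condition: valid input is accepted.
    assert register("pat", "s3cret-pass") is True

def test_registration_negative():
    # Negative (unacceptable) condition: invalid input is rejected.
    assert register("pat", "short") is False
```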
Define entrance criteria
- Components are developed and unit tested
- The test environment is ready
- Testing tools are available
- Testing resources are available
- All bugs are fixed (for regression testing)
Define suspension and resumption requirements
Example: if the number or type of defects reaches a point where follow-on testing has no value, it makes no sense to continue the test; you are just wasting resources. Testing past a truly fatal error generates conditions that may be identified as defects but are in fact ghost errors caused by the earlier defects that were ignored.
Define exit criteria
Why define exit criteria?
- To identify acceptable product quality
- To identify when the test effort has been successfully completed
A clear statement of exit criteria should include the following:
- what is being tested (the specific test item)
- how the measurement is being made
- what criteria are being used to evaluate the measurement
Example:
- All planned test cases have been executed.
- All identified defects have been addressed to an agreed-upon resolution.
- All planned test cases have been re-executed, all known defects have been addressed as agreed upon, and no new defects have been discovered.
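A minimal sketch of checking these example exit criteria mechanically; the field names are illustrative assumptions:

```python
# Evaluate the example exit criteria above from simple counters.
from dataclasses import dataclass

@dataclass
class TestStatus:
    planned_cases: int
    executed_cases: int
    open_defects: int        # defects without an agreed-upon resolution
    new_defects_last_run: int

def exit_criteria_met(status: TestStatus) -> bool:
    return (status.executed_cases >= status.planned_cases
            and status.open_defects == 0
            and status.new_defects_last_run == 0)

print(exit_criteria_met(TestStatus(120, 120, 0, 0)))  # -> True
```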
Identify the resources necessary for testing:
- Human resources (skills, knowledge, availability, training)
- Test environment (hardware and software requirements)
- Tools
- Data
Identify human resources who can do the following:
- Manage and plan the testing
- Design the tests and data
- Implement the tests and data (test case creation and test data preparation)
- Execute the tests and evaluate the results
- Manage and maintain the test systems (support team)
Define responsibilities: who is in charge?
A responsibility matrix maps each activity (user acceptance testing, system/integration testing, unit testing, system design reviews, test case creation, test case review, screen prototype reviews, regression testing) to the responsible roles (development team leader, project manager, development team, testing team, client).
Identify non-human resource needs (test environment)
Two different physical environments are recommended:
- Implementation environment
- Execution environment
Software needed
Minimum software needed:
- The application under test
- The client O/S
- The server O/S
- Internet browser
Additional software needed:
- Bug tracking system
- Test case repository tool
- Database management tool
- Microsoft Office, Outlook
Tools
Specify what software tools will be used, by whom, and what information or benefit will be gained from each tool.
Data
Data can be used as:
- input (creating or supporting a test condition), or
- output (to be compared to an expected result).
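A minimal sketch of data serving as both input and expected output, using pytest's parametrize; the add() function is a hypothetical item under test:

```python
# Each parametrized row pairs input data with its expected result.
import pytest

def add(a: int, b: int) -> int:
    return a + b

@pytest.mark.parametrize("a, b, expected", [
    (1, 2, 3),    # input side: data creating the test condition
    (-1, 1, 0),   # output side: expected value to compare against
])
def test_add(a, b, expected):
    assert add(a, b) == expected
```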
Test deliverables
- Test plan document
- Test cases
- Test data
- Traceability matrix
- Build status report
- Release notes
- Test design specifications
- Output of testing tools (performance, security, and automation reports)
Creating the schedule
Why create a test schedule: to identify and communicate test effort, schedule, and milestones.
Creating a schedule includes:
- Estimating the test effort
- Generating the test schedule
Estimate the testing effort
Why estimate: to avoid exceeding timescales and overshooting budgets.
Software testing estimation methods:
- Percentage of development effort
- Experience based
- Work breakdown structure
- Delphi technique
- Three-point estimation (see the sketch below)
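The three-point estimation method listed above commonly uses the PERT formula E = (O + 4M + P) / 6; the example figures below are illustrative, not from the source:

```python
# Three-point (PERT) estimate from optimistic, most-likely, and
# pessimistic values.
def three_point_estimate(optimistic: float, most_likely: float,
                         pessimistic: float) -> float:
    """E = (O + 4M + P) / 6 - the beta-distribution (PERT) mean."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# e.g. estimating test execution for one module, in person-days:
print(three_point_estimate(4, 6, 11))  # -> 6.5
```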
Estimates should cover:
- Reading, analyzing, and reviewing requirements
- Test design (test case creation, test data preparation, ...)
- Test implementation (recording test cases)
- Test execution
- Re-testing (verifying fixed issues)
- Regression testing
- Integration testing
- User acceptance testing
- Performance / security testing
- Compatibility testing (across different browsers, operating systems, ...)
- Language testing
Generate the test schedule
A test project schedule can be built from the work estimates and resource assignments. It is always best to tie all test dates directly to their related development activity dates.
Generate the test plan
Why: to organize and communicate the test-planning information to others. Prior to generating the test plan, review all existing project information to ensure the plan contains the most current and accurate information. The test plan should be distributed to at least the following:
- all test roles
- a developer representative
- the project leader
- a client representative
Approval
Identify who can approve the process as complete and allow the project to proceed to the next level.
Types of test plan
- Master test plan: created from project information.
- Detailed test plans: created from software information.
The master test plan encompasses the individual plans:
- Unit test plan
- Integration test plan
- System test plan
- Acceptance test plan