Automation testing

12 slides | Nov 11, 2020


AUTOMATION TESTING
Tomy Rhymond | Cloud Solutions Architect

TEST AUTOMATION TIMELINE
1. REQUIREMENTS: Gather business requirements and validate them.
2. TEST CASES: Testers write test cases for all scenarios: positive, negative, edge conditions, etc.
3. DEVELOP: Developers build automated tests for all happy-path scenarios; testers build automated tests for all other scenarios.
4. TEST: Developers test the happy-path scenarios; testers test all other scenarios.
5. INTEGRATE: Integrate the tests into the DevOps CI/CD pipeline and set up reporting.
6. MONITOR: Continuously monitor the pipelines for code coverage and other violations.

DEVOPS – TEST AUTOMATION
Process and tools for automating enterprise testing.

1. UNIT TESTING: Test all UI and service components.
   - Developers: all scenarios
   - Pipeline: CD
   - Frameworks: xUnit, Jasmine
   - Code coverage: 80%
2. END 2 END TESTING: Test UI browser compatibility and ADA compliance.
   - Developers: happy-path scenarios; Testers: all scenarios
   - Pipeline: CD / on-trigger
   - Frameworks: Selenium, Protractor
3. API TESTING: Test backend service integration with other services.
   - Developers: happy-path scenarios; Testers: all scenarios
   - Pipeline: CD / on-trigger
   - Frameworks: Postman, Newman
4. LOAD TESTING: Test application performance.
   - Developers: happy-path scenarios; Testers: all scenarios
   - Pipeline: CD / on-trigger
   - Frameworks: JMeter, BlazeMeter

TESTING PROCESS
Developers and testers collaborate to create quality code that can be continuously deployed to your dev and test (DevTest) environments.

SPRINT 1
- Developer: develop code; write unit/integration/E2E tests for their user stories.
- Tester: develop test cases for the features/user stories.
SPRINT 2
- Developer: develop code; write unit/integration/E2E tests for their user stories; fix bugs from the previous sprint.
- Tester: run the tests from Sprint 1; write more tests for their test cases; create bugs for failed tests (collaborating with developers); develop test cases for the next sprint.
SPRINT 3
- Developer: develop code; write unit/integration/E2E tests for their user stories; fix bugs from the previous sprint.
- Tester: run tests from earlier sprints; write more tests for their test cases; create bugs for failed tests (collaborating with developers); develop test cases for the next sprint.

Testing activities across the sprints: review of requirements; test planning / writing test cases; unit testing; end-to-end testing; API/integration testing; performance testing; security testing; cross-browser, cross-platform, and ADA testing; updating test cases; regression testing.

CODING GUIDELINES FOR BETTER TESTING
Unit tests are fast-running tests that are quick to debug. They should cover the bulk of your testing strategy and the majority of your code. Stacks: API in C#/.NET Core on Azure; UI in Angular/TypeScript.

- Write functional pieces without side effects. Think of each code block as a functional piece that shouldn't cause any side effects (like I/O); any side effects should be isolated and wrapped.
- Use dependency injection.
- Adhere to SOLID principles: each class and method should have a single responsibility.
- Write unit tests first.
- Extract all non-testable code into wrapper classes. I/O operations and external APIs are code at the boundaries of your system; isolate this code into narrow wrappers, and exclude as much logic from the wrappers as you can.
- Try not to use static methods and variables. Static methods can be hard to mock out, so avoid them if possible; static variables leave global state and should only be used in very special cases.

UI (Angular/TypeScript) specifics:
- Add a selector to all elements, e.g. <div id="lastname"></div>, and avoid using class names as selectors.
- Include an index with all for loops and add it to the selector, e.g. <div *ngFor="let subsection of subsections; index as i" id="questionaudiencetitle{{ i }}">
- Make sure your function does one thing, e.g. fetchCustomer should only get customer data; it should not update data and then get customer data.
- Make sure your TypeScript file (class) has one purpose, e.g. customerService should only be responsible for retrieving and updating customer data.
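The side-effect isolation and dependency-injection advice above can be sketched in TypeScript. The CustomerGateway and CustomerService names here are illustrative, not from the deck:

```typescript
// Hypothetical example: the I/O boundary (an HTTP call) is hidden
// behind a narrow wrapper interface, per the guidelines above.
interface CustomerGateway {
  fetchRaw(id: string): { name: string; active: boolean };
}

// Pure logic: no I/O, no static state; the boundary is injected.
class CustomerService {
  constructor(private gateway: CustomerGateway) {}

  displayName(id: string): string {
    const c = this.gateway.fetchRaw(id);
    return c.active ? c.name : `${c.name} (inactive)`;
  }
}

// In a unit test, the wrapper is replaced by an in-memory fake,
// so the logic can be tested without any real I/O.
const fakeGateway: CustomerGateway = {
  fetchRaw: () => ({ name: "Ada", active: false }),
};
const service = new CustomerService(fakeGateway);
console.log(service.displayName("42")); // "Ada (inactive)"
```

Because the only side-effecting call sits behind a one-method interface, the fake is trivial to write, which is the payoff the guideline is after.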

UNIT TEST GUIDELINES
Unit tests are fast-running tests that are quick to debug. They should cover the bulk of your testing strategy and the majority of your code.

What not to unit test:
- Constructors or properties that just return variables; test them only if they contain validations.
- Configuration: constants, read-only fields, configs, enumerations, etc.
- Facades that just wrap other frameworks or libraries.
- Exception messages.
- POCO classes, models, etc.
- Complex SQL queries (more than 3 joins, grouping, etc.); these are better covered manually or by a system test against a real DB.
- ASP.NET controller methods.
- Complex multi-threading code (better covered by integration tests).
- Private methods, and methods that just call another public method.

What to unit test:
- Reduce the number of test cases to the necessary minimum, and select the right test cases to cover all possible scenarios.
- Test all UI components and services in UI code.
- Test obscure edge cases.
- Write each test to test one thing.
- Use mock objects to clarify dependent contracts.
- Test repositories, services, and other business-logic components in backend API code.
- Write your tests in the language used for development: TypeScript/JavaScript for UI code, C# for backend code.
- The name of the test should be the first, most useful clue when looking at a failure report.
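A minimal sketch of the "test one thing, with a name that reads like a failure report" guideline. isValidZip is a made-up unit under test, and plain assertions stand in for a real xUnit or Jasmine runner:

```typescript
// Hypothetical unit under test: a small validation with edge cases.
function isValidZip(zip: string): boolean {
  return /^\d{5}(-\d{4})?$/.test(zip);
}

// Each entry tests exactly one behavior, and its name is the first
// clue in a failure report, per the guidelines above.
const tests: Record<string, () => boolean> = {
  isValidZip_acceptsFiveDigits: () => isValidZip("30301"),
  isValidZip_acceptsZipPlusFour: () => isValidZip("30301-1234"),
  isValidZip_rejectsLetters: () => !isValidZip("3O301"),
  isValidZip_rejectsShortCodes: () => !isValidZip("3030"),
};

for (const [name, test] of Object.entries(tests)) {
  if (!test()) throw new Error(`FAILED: ${name}`);
  console.log(`PASSED: ${name}`);
}
```

In a real Jasmine or xUnit suite each entry would be its own `it(...)` or `[Fact]`, so a failure pinpoints the single behavior that broke.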

END 2 END TEST GUIDELINES
An application is interconnected and integrated with multiple systems outside of the application environment, which complicates the flow through the whole application. End-to-end testing ensures that the application is tested across all layers, front end to backend, along with its interfaces and endpoints.
- Test from the perspective of an end user: think like a user, focusing on features rather than on how the functionality is implemented.
- Focus on the features of your application whose failure carries the highest risk.
- All business requirements should be covered by these tests.
- Initial E2E tests should run against consistent mock data, to isolate defects to the UI only.
- If all UI and API tests pass, additional test runs should be done in each environment: Dev, Test, Staging, Prod. Expected results are stored in JSON files in a directory for each environment.
- The only UI aspects not tested are styles and layout; if these tests are needed, Protractor can be configured to do visual regression testing.
- ADA/compliance testing is included in the E2E suite of tests.
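The per-environment expected results described above can be sketched as follows. The environment names match the deck; the JSON shape and landing-title check are assumptions, and the data is inlined here (a real suite would load it from a file per environment) so the example is self-contained:

```typescript
// Hypothetical sketch: one expected-results record per environment,
// as the E2E guidelines describe (normally one JSON file each,
// e.g. expected/dev.json, loaded by the suite at startup).
type Environment = "dev" | "test" | "staging" | "prod";

const expectedResults: Record<Environment, { landingTitle: string }> = {
  dev:     { landingTitle: "My App (DEV)" },
  test:    { landingTitle: "My App (TEST)" },
  staging: { landingTitle: "My App" },
  prod:    { landingTitle: "My App" },
};

// An E2E assertion helper: compare what the browser reported
// against the expectation for the environment under test.
function checkLandingTitle(env: Environment, actualTitle: string): boolean {
  return expectedResults[env].landingTitle === actualTitle;
}

console.log(checkLandingTitle("dev", "My App (DEV)")); // true
```

Keeping expectations in per-environment files lets the same Protractor/Selenium spec run unchanged in Dev through Prod, which is the point of the guideline.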

API TEST GUIDELINES
API testing exercises APIs directly, and as part of integration testing, to determine whether they meet expectations for functionality, reliability, performance, and security.
- API tests isolate the REST services without a UI.
- Data used to create the requests and validate the responses is stored in JSON files and categorized by environment, as data will change between environments.
- All services must have tests covering a variety of conditions: happy path, empty responses, all error conditions, etc.
- Critical services should have additional tests validating all response scenarios: different variables, numbers and values of variables, etc.
- As an option, Postman scripts can be converted to JMeter tests with conversion tools such as Loadium.
- Initial API tests should run with mock data, to isolate code defects to the API logic rather than to underlying data changes or connectivity issues.
- As with E2E tests, data in the environments must not change, or test failures will be false positives.
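The happy-path / empty-response / error-condition coverage listed above can be sketched as a response classifier an API test would assert against. The OrderResponse shape and classifyResponse helper are hypothetical, not part of Postman or Newman:

```typescript
// Hypothetical response shape for a REST service under test.
interface OrderResponse {
  status: number;
  body: { orderId?: string; error?: string };
}

// Classify a response into the scenario buckets the guidelines name:
// happy path, empty response, or an error condition.
function classifyResponse(res: OrderResponse): "happy" | "empty" | "error" {
  if (res.status === 200 && res.body.orderId) return "happy";
  if (res.status === 200) return "empty";
  return "error";
}

console.log(classifyResponse({ status: 200, body: { orderId: "A1" } }));      // "happy"
console.log(classifyResponse({ status: 404, body: { error: "not found" } })); // "error"
```

A test suite would pair each bucket with request data from the per-environment JSON files, so every service gets at least one test per scenario.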

LOAD TEST GUIDELINES
Load testing is a kind of performance testing that determines a system's performance under real-life load conditions. It helps determine how the application behaves when multiple users access it simultaneously.
- Load tests should focus on the typical user behaviors for a system, e.g. authentication, the landing page, application functionality, etc.
- Loads should be derived from user behavior during peak times. If the peak load is x concurrent users, tests should be run starting at 10% of peak, then incrementally up to 100%, and then on to milestones such as 125%, 150%, and 200%.
- A final test to determine the point at which the solution breaks is recommended.
- Test durations are determined from the actual production peak period, increased by 500%.
- Browsers are not involved in load tests, but the serving of UI assets is: HTML, CSS, JS, bundles, etc.
- Automated load tests should be run at off-peak times, not to stress the system but as a comparison showing whether recent code commits have significantly impacted performance. This eliminates major surprises when a full test is run at the end of a development cycle.
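The ramp schedule above can be computed from the peak user count. The deck fixes only the 10% starting point and the 125/150/200% milestones; the intermediate steps up to 100% used here are an assumed pattern:

```typescript
// Hypothetical helper: turn a peak concurrent-user count into the
// ramp stages described in the guidelines (10% start, incremental
// steps to 100%, then overload milestones at 125/150/200%).
// The 25/50/75 intermediate steps are an assumption, not from the deck.
function rampSchedule(peakUsers: number): number[] {
  const percents = [10, 25, 50, 75, 100, 125, 150, 200];
  return percents.map((p) => Math.round((p / 100) * peakUsers));
}

console.log(rampSchedule(1000));
// [100, 250, 500, 750, 1000, 1250, 1500, 2000]
```

Each stage would become a JMeter thread-group size (or a BlazeMeter concurrency setting), with the final break-point test pushing past the last milestone.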

SECURITY TEST GUIDELINES
Security testing ensures that software systems and applications are free from vulnerabilities, threats, and risks that could cause a big loss.

Example test scenarios:
- Passwords should be stored in encrypted format.
- The application or system should not allow invalid users.
- Check cookies and session timeouts for the application.
- Tokens should expire after a certain time.
- For financial sites, the browser Back button should not work.
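The "tokens expire after a certain time" scenario can be sketched as a validity check a security test would assert against. The Token shape and the 15-minute TTL are illustrative assumptions:

```typescript
// Hypothetical token model for the expiry scenario above.
interface Token {
  issuedAt: number; // epoch milliseconds
  ttlMs: number;    // time-to-live in milliseconds
}

// A token is valid only while 'now' is inside its lifetime; a
// security test asserts this flips to false once the TTL elapses.
function isTokenValid(token: Token, now: number): boolean {
  return now < token.issuedAt + token.ttlMs;
}

const token: Token = { issuedAt: 0, ttlMs: 15 * 60 * 1000 }; // assumed 15-minute TTL
console.log(isTokenValid(token, 10 * 60 * 1000)); // true: still fresh
console.log(isTokenValid(token, 20 * 60 * 1000)); // false: expired
```

A real suite would obtain a live token, wait past (or clock-skew past) its TTL, and verify the server rejects it, rather than checking timestamps locally.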

DEVOPS – DELIVERY MODEL
(Flattened flow diagram. Each model moves work through the same flow: Product Backlog → Priorities → Sprint Backlog → Sprint → Product (Usable) → Acceptance Tests → Deployment (production).)
- Continuous Integration: automation ends at the usable product; acceptance testing and production deployment remain to-do manual steps.
- Continuous Delivery: acceptance tests are automated; production deployment remains a to-do manual step.
- Continuous Deployment: the entire flow, through production deployment, is automated.

SUMMARY
Automation refers to the use of strategies and tools that augment or reduce the need for manual testing.

PROS:
- Reliable and repeatable
- Reusable
- Fast
- Cost effective
- Comprehensive
- Reduced testing time for bug fixes and maintenance

CONS:
- Learning curve
- Debugging test scripts
- Test maintenance
- Test data file maintenance
- Increase in development time

The automation process includes the creation of detailed test cases, including predictable "expected results" derived from business functional specifications and other design documentation. It also requires a standalone test environment, including a test database restorable to a known constant, so that the test cases can be repeated each time modifications are made to the application.