"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

April 08, 2010

Web Service - Performance Testing Fundamentals

[You may also like - Using open source tools for performance testing]
[Next Post in Series - Web Services and Web Testing using VSTS]
[You may also like - 7 Most Stunning Website Crashes/ Failures of 2011 predominantly due to high traffic]
Performance testing is all about ensuring the scalability and availability of a system to meet business needs. It verifies that the application can handle the desired load. Technically, we must be able to answer the questions below:
  • Response time for requests during average/peak load
  • Number of requests handled in XX hours
  • Average response time for each request; % of requests meeting or missing the SLA during different loads
  • System behavior monitored using counters, e.g. Database -> Transactions/Sec, Deadlocks/Sec
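The questions above reduce to simple arithmetic over the recorded response times. A minimal sketch (the timing values and the 2-second SLA are hypothetical examples):

```python
# Sketch: evaluating recorded response times against an SLA.
# The sample timings and the 2-second SLA are hypothetical.

def sla_report(response_times_sec, sla_sec):
    """Return average response time and % of requests meeting the SLA."""
    met = sum(1 for t in response_times_sec if t <= sla_sec)
    avg = sum(response_times_sec) / len(response_times_sec)
    pct_met = 100.0 * met / len(response_times_sec)
    return avg, pct_met

times = [0.8, 1.2, 2.5, 0.9, 3.1, 1.1, 0.7, 1.9]  # seconds, hypothetical
avg, pct = sla_report(times, sla_sec=2.0)
print(f"Avg response: {avg:.2f}s, SLA met: {pct:.1f}%")
```

The same report, run against average-load and peak-load recordings, answers the first three questions directly.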
“How do you performance test a website / web service?” - guidelines are listed below.
Functional Knowledge
Before you performance test your application, know all the workflows and business logic it implements. Identify the data access layer and business logic layer and understand how they work. Once you know the business logic flow in the application, identifying the core test scenarios and test mix becomes straightforward. The key is that you need to know the application well enough to evaluate the numbers provided by the business, e.g. how many orders per hour or site visits per peak hour.

On a typical business day
1. 20% is Sales through the Site
2. 30% is Orders Update through the Site
3. 20% of time No Access
4. 30% Enquiries (Search/Enquiries)
During festive Season
1. 60% is Sales through the Site
2. 20% is Orders Update through the Site
3. 5% of time No Access
4. 15% Enquiries (Search/Enquiries)

With this distribution in place you can identify the workflows for each of the transaction types (Order Placement/Enquiry/Search). The outcome of this phase is that you know the application's functionality well enough to identify the critical workflows in the system.
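The distribution above maps directly onto a weighted test mix for driving virtual users. A sketch using the typical-day percentages ("no access" is modelled as idle think time rather than a transaction):

```python
import random

# Typical-day test mix from the distribution above; "no access" (20%)
# is modelled as idle think time rather than a transaction.
TEST_MIX = {"sales": 20, "order_update": 30, "idle": 20, "enquiry": 30}

def pick_transaction(mix, rng=random):
    """Pick the next virtual-user action according to the weighted mix."""
    names = list(mix)
    return rng.choices(names, weights=[mix[n] for n in names], k=1)[0]

counts = {name: 0 for name in TEST_MIX}
for _ in range(10_000):
    counts[pick_transaction(TEST_MIX)] += 1
print(counts)  # roughly proportional to the weights
```

Swapping in the festive-season percentages gives the peak-load mix with no other changes.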

Identify Performance Test Scenarios (Performance Goals)
This is where business requirements are converted into performance test scenarios.
Ex: 19,000 orders a day is the expected volume, with a peak of 5,000 per hour -> this translates to roughly 1.39 requests/sec at peak and about 0.22 req/sec under average load.
All business scenarios are converted into test scenarios, and the expected result (test pass criteria) is also recorded in the test scenario document.
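The conversion above is just unit arithmetic, which is worth keeping explicit in the scenario document:

```python
# Converting the business volumes above into request rates.
peak_orders_per_hour = 5000
daily_orders = 19_000

peak_rps = peak_orders_per_hour / 3600   # orders/hour -> req/sec
avg_rps = daily_orders / (24 * 3600)     # orders/day  -> req/sec
print(f"peak: {peak_rps:.2f} req/sec, average: {avg_rps:.2f} req/sec")
# → peak: 1.39 req/sec, average: 0.22 req/sec
```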

Test Scenario 1 – Average Load
1. 20% is Sales through the Site
2. 30% is Orders Update through the Site
3. 30% Enquiries (Search/Enquiries)
Test Scenario 2 – Peak Load (Festive Season)
1. 60% is Sales through the Site
2. 20% is Orders Update through the Site
3. 20% Enquiries (Search/Enquiries)

The outcomes of this phase are:
• Test scenarios identified
• Test pass criteria identified
• Test scenarios reviewed and signed off by the application's stakeholders

Tools/Scripts, Environmental Setup
In the real world, the DEV/test environment rarely has the same hardware configuration and setup as the production environment. Production may have load-balanced (NLB) servers and high-end processors. It is very important to test in a production-like environment, so the performance test environment should mimic production in terms of hardware configuration and setup.

Test data is another crucial factor for performance testing. It's recommended that at least 90% of tests pass in a run before the results are considered for analysis. Test data must match the volume of data available in production. Using a copy of production data is ideal, but that may not be possible because of security or privacy concerns; in that case, sample data can be created by repeating a pattern of data.
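Creating sample data by repeating a pattern, as suggested above, can be sketched like this (the field names and sample values are hypothetical):

```python
import csv
import io
import itertools

# Sketch: production-volume test data built by cycling through a small
# pattern. Field names and sample values are hypothetical.
PATTERN = [
    ("WIDGET-A", "LONDON"),
    ("WIDGET-B", "PARIS"),
    ("WIDGET-C", "BERLIN"),
]

def generate_orders(n_rows):
    """Yield n_rows synthetic order records cycling through PATTERN."""
    cycle = itertools.cycle(PATTERN)
    for order_id in range(1, n_rows + 1):
        sku, city = next(cycle)
        yield {"order_id": order_id, "sku": sku, "ship_to": city}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["order_id", "sku", "ship_to"])
writer.writeheader()
writer.writerows(generate_orders(10))
print(buf.getvalue())
```

Scaling `n_rows` up to production volume gives a data set of the right size, even though the values repeat.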
  • Identify tools for running the tests; code the test scripts and generate test data.
  • Identify and set up performance counters (e.g. SQL Server counters, ASP.NET counters, BizTalk counters specific to the app).
The outcome of this phase is a ready environment and test scripts coded for the identified performance test scenarios.

Test Runs – Test Execution
In a performance test run, each request's response time is recorded; time is the crucial factor in performance testing. Short test runs normally last 30 minutes to 2 hours. During these tests the results are documented to give performance insights, and errors and problems are reported and fixed. It's recommended to do three test runs for each identified scenario with the same test data, and to capture how consistent the system's behavior and test results are across runs.
In the end, a test run over an extended period, such as 12 to 24 hours, should be done to check for excessive resource usage or resource leaks.
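A bare-bones version of such a test run looks like the sketch below, with the actual HTTP call stubbed out so the example stays self-contained (in practice `send_request` would be a real call, e.g. `urllib.request.urlopen` against your endpoint):

```python
import statistics
import threading
import time

def send_request():
    """Stub standing in for a real HTTP call to the system under test."""
    time.sleep(0.01)  # simulated service latency

def run_load(n_users, requests_per_user, timings):
    """Each virtual user fires requests and records per-request latency."""
    def user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            send_request()
            timings.append(time.perf_counter() - start)
    threads = [threading.Thread(target=user) for _ in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

timings = []
run_load(n_users=5, requests_per_user=10, timings=timings)
print(f"{len(timings)} requests, avg {statistics.mean(timings) * 1000:.1f} ms")
```

Repeating the run three times with the same test data, per the guideline above, lets you compare the recorded timings for consistency.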

Results Collection and Analysis
Logs are collected, and performance counter values are collected and analyzed to evaluate the test pass criteria.

The results would look like this:
  • Test Mix
  • Req/Sec Results
  • System Health Behavior / Observations
  • SLA Met, SLA Slipped during different Loads
Functional Testing -> Performance Testing -> Test pass criteria met -> Signoff for production release; otherwise reiterate the cycle.
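The signoff decision in the cycle above can be expressed as a simple gate over the collected results (the threshold values here are hypothetical examples of agreed pass criteria):

```python
# Sketch: pass/fail gate over collected results. Threshold values are
# hypothetical examples of the agreed test pass criteria.
CRITERIA = {
    "min_req_per_sec": 1.4,   # must sustain peak throughput
    "min_sla_met_pct": 95.0,  # % of requests inside the SLA
    "max_error_pct": 1.0,
}

def signoff(results, criteria=CRITERIA):
    """True only if every pass criterion from the scenario doc is met."""
    return (results["req_per_sec"] >= criteria["min_req_per_sec"]
            and results["sla_met_pct"] >= criteria["min_sla_met_pct"]
            and results["error_pct"] <= criteria["max_error_pct"])

run = {"req_per_sec": 1.6, "sla_met_pct": 97.2, "error_pct": 0.4}
print("signoff" if signoff(run) else "reiterate the cycle")  # → signoff
```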

A few real-world performance test scenarios:
  • Test the limits of a download server for large downloads: probe the maximum number of users it can serve without performance deterioration; identify CPU/IO/network bottlenecks.
  • Web server burn-in test: run a constant load for 8 hours to determine the stability of the system and its response times to user requests; identify CPU/IO/DB/network bottlenecks.
  • Probe a web server with 3000 users: run a test with up to 3000 users against an IIS6 server while monitoring performance and response times for requests; identify CPU/IO/DB/network bottlenecks.

What Does Performance Testing Mean?
Performance Testing Guidance for Web Applications
