
Performance Testing: Analyzing Differences of Response Time between Performance Testing Tools


Paper presented at the International Conference on Computer and Information Science 2012 (ICCIS2012), held as part of the World Engineering, Science and Technology Congress 2012 (ESTCON2012).


Page 1

ICCIS2012

Muhammad Dhiauddin Mohamed Suffian
Faculty of IT & Multimedia Communication

Open University Malaysia

Fairul Rizal Fahrurazi
Product Quality & Reliability Engineering

MIMOS Berhad

Performance Testing: Analyzing Differences of Response Time between Performance Testing Tools

Page 2

Presentation Outline

• Background
• Related Works
• Overview of Performance Testing Tools
• Test Environment Setup
• Findings and Discussion
  – Result of Tool A
  – Result of Tool B
  – Result of Tool C
  – Comparison of Performance Test Results between Tools
  – Potential Reasons for Response Time Differences
• Conclusion

Page 3

Background

• Several issues have been observed related to tools when conducting performance testing:
  – tool compatibility with the software under test
  – tool installation
  – tool setup
  – tool flexibility in testing both the client and server side
  – response time generated by the tools (the research focus)

Page 4

Related Works

• Most previous work comparing performance testing tools ignored the differing results reported by each tool:
  – VCAA uses pricing and user-friendliness as criteria for deciding which tool to use
  – JDS mentions the ability to emulate a complex business process and support for an unlimited number of concurrent users
  – Testingreflections.com: accuracy of load and response time is something to evaluate against our particular application, not something to compare when deciding which tool to use or buy
• No work so far tries to understand why the results differ between tools
• Should there be a framework or memorandum of understanding (MOU) on the uniformity of response time across all performance testing tools?
• Each tool claims to be better than the others, but none is able to justify its performance testing results against the real world

This effort aims:
• to prove whether there is a difference in response time between different performance testing tools and to identify the potential reasons contributing to it
• to make performance testers aware that no tool in the world is able to fully replace a human for performance testing

Page 5

Overview of Performance Testing Tools

Page 6

Test Environment Setup

Hardware Specification (Both Machines)
CPU/processor: Intel Pentium D 3.4 GHz
RAM/memory: 2 GB
HDD storage: 80 GB
Network card: Integrated 10/100/1000 Ethernet

Server Machine
Operating system: Windows Server 2003 Enterprise Edition SP1
Java JDK: JDK 1.6.0 update 21
Web server: Internet Information Services 6
HTML page size: 65.8 KB (Page: 7 KB; Image 1: 25.2 KB; Image 2: 33.6 KB)

Client Machine
Operating system: Windows XP SP2
Java JDK: JDK 1.6.0 update 21
Tools: Tool A (open source); Tool B (open source); Tool C (proprietary)

Page 7

Findings and Discussion: Result of Tool A

Page 8

Findings and Discussion: Result of Tool B

Page 9

Findings and Discussion: Result of Tool C

Page 10

Findings and Discussion: Comparison of Performance Test Results between Tools

Page 11

Findings and Discussion: Comparison of Performance Test Results between Tools

Page 12

Findings and Discussion: Comparison of Performance Test Results between Tools

Page 13

Findings and Discussion: Potential Reasons for Response Time Differences

• Some fundamental reasons:
  – capturing and simulating the load used for the performance test
  – the method of calculating the metrics gathered by each tool
  – the language used to develop the tools
  – the architecture of the respective tools

• Tool A and Tool C: Capturing and simulating the load plays the biggest role

  – Several extra items are recorded and simulated when generating the user load. Tool C by default uses the Internet Explorer browser when recording, and ASHX files (ASHX is a web handler file) were observed in its recorded list as additional items compared to Tool A, which did not record them.
  – JavaScript and CSS files show a higher response time in Tool C than in Tool A, by around 18 percent.
  – The remaining file types, consisting of images (GIF, JPG) and HTML, show only a small variation of 1 to 4 percent (see the timing sketch after this list).
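To illustrate how the set of recorded resources changes the aggregate figure, the following minimal Java sketch times each HTTP GET separately and sums the results. The URLs are hypothetical placeholders rather than the actual test assets, and the loop is only a simplified stand-in for what the tools do internally; a tool that also records handler, JavaScript, or CSS requests would simply have more entries to add up.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

/** Minimal sketch: time each recorded resource separately to see how the
 *  resource list influences the aggregate response time.
 *  The URLs below are hypothetical placeholders. */
public class ResourceTiming {

    // Time a single GET from issuing the request until the last byte
    // of the response body has been read.
    static long timeGetMillis(String url) throws Exception {
        long start = System.nanoTime();
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) {
                // drain the body; reading the last byte ends the measurement
            }
        }
        conn.disconnect();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical resource list; a tool that also records ASHX,
        // JavaScript, or CSS requests would add more entries here.
        String[] resources = {
            "http://server/page.html",
            "http://server/image1.jpg",
            "http://server/image2.gif"
        };
        long total = 0;
        for (String res : resources) {
            long ms = timeGetMillis(res);
            total += ms;
            System.out.println(res + " -> " + ms + " ms");
        }
        System.out.println("Aggregate response time: " + total + " ms");
    }
}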

• The method for calculating the metrics gathered by each tool contributes to the variation in response time (see the sketch after this list):
  – The fundamental formula for calculating response time is identical: it is based on the interval between the last byte sent and the last byte received.
  – However, Tool C introduces an Inter-Request Delay, where some requests may have delays associated with them.
  – Tool A does not automatically apply a delay; it is up to the user to configure one manually.
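A rough sketch of how such a delay interacts with the last-byte-sent to last-byte-received measurement is shown below. The sendAndReceive routine is a hypothetical placeholder returning fabricated per-request timings, and the 500 ms interRequestDelayMs value merely stands in for a Tool-C-style delay; neither is taken from any tool's actual configuration.

import java.util.concurrent.TimeUnit;

/** Sketch of how an inter-request delay relates to the
 *  last-byte-sent to last-byte-received measurement. */
public class InterRequestDelay {

    // Hypothetical stand-in for issuing one request and returning the
    // interval between the last byte sent and the last byte received (ms).
    static long sendAndReceive(int requestNo) {
        return 120 + (requestNo % 3) * 10; // placeholder timings
    }

    public static void main(String[] args) throws InterruptedException {
        long interRequestDelayMs = 500; // illustrative Tool-C-style delay; zero unless configured in Tool A
        long sumResponse = 0;
        long wallClockStart = System.nanoTime();

        for (int i = 0; i < 5; i++) {
            sumResponse += sendAndReceive(i);                  // delay excluded from per-request response time
            TimeUnit.MILLISECONDS.sleep(interRequestDelayMs);  // but it stretches the overall run
        }

        long wallClockMs = (System.nanoTime() - wallClockStart) / 1_000_000;
        System.out.println("Sum of response times: " + sumResponse + " ms");
        System.out.println("Wall-clock duration  : " + wallClockMs + " ms");
        // A tool that folded the delay into its reported figures, or averaged
        // over wall-clock time, would make the same page appear slower.
    }
}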

• Architecture differs greatly:
  – Tool A and Tool C are developed in Java and require a JVM to run, so the Java heap size setting plays a role in generating the intended user load without putting an extra burden on the client machine (see the heap-check sketch below).
  – Tool B's architecture relies on a web relay daemon facility that allows CORBA-based communication to be transmitted between machines while the performance test executes.
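Because Tool A and Tool C run on the JVM, the heap granted to the load-generating process bounds how many virtual users the client can simulate. The fragment below is a generic illustration of checking that heap (set via a flag such as -Xmx); the 512 MB threshold is an arbitrary example, not a value recommended by any of the tools.

/** Generic check of the heap available to a JVM-based load generator.
 *  Run with e.g. `java -Xmx1024m HeapCheck`; the threshold is illustrative. */
public class HeapCheck {
    public static void main(String[] args) {
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available to this JVM: " + maxHeapMb + " MB");

        // Arbitrary example threshold: too little heap on the client side can
        // throttle the generated load and distort the measured response times.
        if (maxHeapMb < 512) {
            System.out.println("Warning: heap may be too small to drive the intended user load.");
        }
    }
}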

Page 14

Conclusion

• Different performance testing tools do give different response times.
• The next critical research is on how the load is captured and simulated by each tool:
  – continue analyzing each HTTP request and response with a tool available in the market, for example Wireshark
  – fully understand, at the packet level, what is transferred and received by each tool and why particular items are included or excluded
• Currently, no tool is able to tell us whether an application is fast enough in terms of real-world user experience.
• It is crucial for performance testers to understand that no tool is able to automate and give the full picture of what the application's performance will be in the real world.
• It comes back to the human brain to analyze the information given; performance testing tools are just one of the tools that can be used to achieve that.

Page 15

THANK YOU
[email protected]@gmail.com

@MuhdDhiauddin

http://www.linkedin.com/in/dhiauddin