
Page 1

Microservices: A Performance Tester’s Dream or Nightmare?

Simon Eismann, University of Würzburg

Cor-Paul Bezemer, University of Alberta

Weiyi Shang, Concordia University

Dušan Okanović, University of Stuttgart

André van Hoorn, University of Stuttgart

https://research.spec.org/working-groups/rg-devops-performance.html

@simon_eismann @corpaul @swy351 @okanovic_d @andrevanhoorn

Page 2

What is Performance Regression Testing?

DevOps pipeline: a developer commits changes to GitHub, which triggers the pipeline stages build, unit test, and regression test.

Performance regression testing (sketched below):

1. Deploy the application
2. Perform a load test
3. Compare the results to the previous commit
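A minimal sketch of these three steps in Python, assuming step 1 already happened in the pipeline and the service is reachable at a hypothetical BASE_URL; baseline latencies from the previous commit are passed in. A real setup would use a dedicated load generator rather than sequential requests.

    import statistics
    import time
    import urllib.request

    BASE_URL = "http://localhost:8080/"  # hypothetical service endpoint

    def load_test(requests: int = 200) -> list[float]:
        """Step 2: fire requests and record per-request latency in seconds."""
        latencies = []
        for _ in range(requests):
            start = time.perf_counter()
            urllib.request.urlopen(BASE_URL).read()
            latencies.append(time.perf_counter() - start)
        return latencies

    def regressed(current: list[float], baseline: list[float],
                  tolerance: float = 0.10) -> bool:
        """Step 3: flag a regression if the median latency grew by more
        than `tolerance` relative to the previous commit's baseline."""
        return statistics.median(current) > (1 + tolerance) * statistics.median(baseline)

A fixed threshold keeps the sketch simple; the findings later in this deck show why a robust verdict needs repeated runs and statistical tests.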

Page 3

Requirements for Performance Testing

R1: A stable testing environment which is representative of the production environment

R2: A representative operational profile (including workload characteristics and system state) for the performance test

R3: Access to all components of the system

R4: Easy access to stable performance metrics

R5: Sufficient time

Page 4

Microservice traits

T1: Self-containment

T2: Loosely coupled, platform-independent interfaces

T3: Independent development, build, and deployment

T4: Containers and container orchestration

T5: Cloud-native

Page 5

Microservices: A Performance Tester's Dream?

Benefit 1: Containerization

• Containers package the execution environment
• Simplifies setup of the test environment

Benefit 2: Granularity

• Services are individually testable
• Dependencies are reached via HTTP calls
• Dependencies are easily mocked (see the sketch after this list)

Benefit 3: Easy access to metrics

• Orchestration frameworks simplify metric collection
• Application-level metrics are common

Benefit 4: Integration with DevOps

• Small service size reduces performance test duration
• Performance testing fits within the pipeline
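The mocking in Benefit 2 needs nothing more than a throwaway HTTP server. A minimal sketch using only the Python standard library; the service name, port, and payload are made up for illustration:

    import json
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class MockRecommender(BaseHTTPRequestHandler):
        """Stands in for a downstream microservice: returns a canned
        response so the dependency cannot distort the measured latency."""

        def do_GET(self):
            body = json.dumps({"recommendations": [1, 2, 3]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep load-test output quiet
            pass

    def start_mock(port: int = 9090) -> HTTPServer:
        """Start the mock in the background; point the service under
        test at http://localhost:9090 instead of the real dependency."""
        server = HTTPServer(("localhost", port), MockRecommender)
        threading.Thread(target=server.serve_forever, daemon=True).start()
        return server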

Page 6

Too good to be true? – Let’s test it!

RQ1: How stable are the execution environments of microservices?

RQ2: How stable are the performance testing results?

RQ3: How well can performance regressions in microservices be detected?

Page 7

Case Study

• TeaStore benchmarking application
• Scenarios
• Deployment platform

Page 8

Research Question 1 – Selected Findings

How stable are the execution environments of microservices across repeated runs of the experiments?

Finding 1: The non-deterministic behaviour of the autoscaler results in different numbers of provisioned microservice instances when scaling the same load.

Finding 2: Even when fixing the number of provisioned instances of a microservice, their deployment across VMs differs.
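One way to observe this instability is to log how many instances the autoscaler provisioned after each run. A minimal sketch, assuming kubectl is configured for the target cluster; the deployment names and namespace are hypothetical:

    import json
    import subprocess

    SERVICES = ["webui", "persistence", "recommender"]  # hypothetical names

    def replica_count(deployment: str, namespace: str = "default") -> int:
        """Ask Kubernetes how many replicas are currently provisioned."""
        out = subprocess.run(
            ["kubectl", "get", "deployment", deployment,
             "-n", namespace, "-o", "json"],
            capture_output=True, text=True, check=True,
        ).stdout
        return json.loads(out)["status"]["replicas"]

    # Log this per experiment run; differing counts across runs for the
    # same load are exactly the non-determinism described in Finding 1.
    print({svc: replica_count(svc) for svc in SERVICES})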

Page 9

Research Question 2 – Selected Findings

How stable are the performance testing results across repeated runs of the experiments?

Finding 1: There exist statistically significant differences between the performance testing results from different scenarios.

Finding 2: The total CPU busy time may not be statistically significantly different between scenarios.
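A minimal sketch of how two such result sets can be compared: a Mann-Whitney U test for statistical significance plus Cliff's delta as an effect size. The latency samples are made up for illustration; scipy is assumed to be available.

    from scipy.stats import mannwhitneyu

    def cliffs_delta(xs: list[float], ys: list[float]) -> float:
        """Cliff's delta: P(x > y) - P(x < y) over all pairs."""
        gt = sum(1 for x in xs for y in ys if x > y)
        lt = sum(1 for x in xs for y in ys if x < y)
        return (gt - lt) / (len(xs) * len(ys))

    run_a = [102.0, 98.5, 110.2, 105.7, 99.8]    # response times (ms)
    run_b = [121.3, 118.9, 125.0, 119.4, 123.1]  # response times (ms)

    stat, p = mannwhitneyu(run_a, run_b, alternative="two-sided")
    print(f"p = {p:.4f}, Cliff's delta = {cliffs_delta(run_a, run_b):.2f}")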

Page 10

Research Question 3 – Selected Findings

How well can performance regressions in microservices be detected?

Finding 1: Using only a single experiment run results in flaky performance tests.

Finding 2: Using ten experiment runs results in stable performance tests.
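A minimal sketch of the repeated-runs idea: instead of comparing two single runs, compare the per-run medians of ten runs per version. `run_load_test` is a stand-in for a function such as `load_test` from the earlier sketch.

    import statistics
    from scipy.stats import mannwhitneyu

    def per_run_medians(run_load_test, runs: int = 10) -> list[float]:
        """Execute the load test `runs` times; keep one median per run."""
        return [statistics.median(run_load_test()) for _ in range(runs)]

    def is_regression(old_medians: list[float], new_medians: list[float],
                      alpha: float = 0.05) -> bool:
        """Flag a regression only if the new version is significantly
        slower across runs, which is far less flaky than a single run."""
        _, p = mannwhitneyu(new_medians, old_medians, alternative="greater")
        return p < alpha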

Page 11

Microservices: A Performance Tester's Nightmare?

Nightmare 1: Stability of the environment

• Autoscaling/container orchestration is not deterministic
• The execution environment cannot be expected to be stable

Nightmare 2: Reproducibility of the experiments

• Repeated experiments may not result in the same performance measurements
• Multiple measurements are required for regression testing

Nightmare 3: Detecting small changes

• Variation between measurements can be quite large
• Detecting small changes is challenging

Page 12

Research Directions

Research Direction 1: Studying the stability of (new) performance metrics

Research Direction 2: Variation reduction in executing performance tests

Research Direction 3: Creating a benchmark environment for microservice-oriented performance engineering research

Page 13

Replication Package

Performance measurements

• Wrapped in a Docker container for platform-independent execution
• Requires only Google Cloud access keys as input
• Fully automated performance measurements
• Available online at: https://doi.org/10.5281/zenodo.3588515

Data set and analysis

• Measurement data from over 75 hours of experiments
• Scripts to reproduce any analysis, table, or figure from the manuscript
• One-click reproduction of the results as a CodeOcean capsule
• Available online at: https://doi.org/10.24433/CO.4876239.v1

Page 14

Summary

Page 15

Microservices: A Performance Tester’s Dream or Nightmare?

Simon Eismann, University of Würzburg

Cor-Paul Bezemer, University of Alberta

Weiyi Shang, Concordia University

Dušan Okanović, University of Stuttgart

André van Hoorn, University of Stuttgart

https://research.spec.org/working-groups/rg-devops-performance.html

@simon_eismann @corpaul @swy351 @okanovic_d @andrevanhoorn