Quality Attributes Testing: From Architecture to Test Acceptance

by Oleksandr Reminnyi

QUALITY ATTRIBUTES TESTING
FROM ARCHITECTURE TO TEST ACCEPTANCE

OLEKSANDR REMINNYI

8 years in IT

Solutions Architect

Automation Expert (Consultant, Trainer)

PhD in IT (automation testing)

orem@softserveinc.com

WHO IS IT?

TESTING ARCHITECTURE. WTF?

WHAT IS SOFTWARE ARCHITECTURE?

IT STARTS WITH ONE GEEKY GUY…

THEN IT MIGHT END UP LIKE THIS…

WHAT IS SOFTWARE ARCHITECTURE?

SO, WHAT FORMS SOFTWARE ARCHITECTURE?

BUSINESS

USERS

SYSTEM

WHAT CAN BE TESTED HERE?

THIS???

GO SMART

DEFINING QUALITIES

SEI (Software Engineering Institute)

THEY LIKE NON-FUNCTIONAL REQUIREMENTS

THEY CALL THEM QUALITY ATTRIBUTES

Category            Quality attributes
Design Qualities    Conceptual Integrity, Maintainability, Reusability
Run-time Qualities  Availability, Interoperability, Manageability, Performance, Reliability, Scalability, Security
System Qualities    Supportability, Testability
User Qualities      Usability

WHAT IS A QUALITY ATTRIBUTE?

ABILITY

WHAT QUALITY ATTRIBUTES DO YOU SEE?

LET’S DO THE SYSTEMATIC APPROACH

TAKING IT ALL TOGETHER

QUALITY ATTRIBUTE WORKSHOP

Specific (why/what/how)

Measurable

Achievable

Result-focused

Time-bound

S.M.A.R.T.

ARCHITECTURE SCENARIOS – SAMPLE UTILITY TREE

ACCEPTANCE: A SOFTWARE ERROR OCCURS AT HIGH VEHICLE SPEED. REBOOT WITHIN 50 MSEC.

Source

• System

Stimulus

• Software error

Artifact

• System

Environment

• High vehicle speed

Response

• Reboot occurs

Response Measure

• Time to reboot up to 50 msec.

EXPLAINING ACCEPTANCE: GHERKIN

Given

• Environment: High vehicle speed
• Artifact: System

As

• Source: System

When

• Stimulus: Software error

Then

• Response: Reboot occurs
• Response Measure: Time to reboot up to 50 msec.
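Taken together, the scenario above can be written down as an executable Gherkin feature. A minimal sketch follows; the feature name, description lines, and exact step wording are illustrative assumptions rather than text from the original slides:

Feature: Availability under software faults
  In order to keep the vehicle safe at high speed
  As the system
  I want a software error to trigger a fast reboot

  Scenario: Software error occurs at high vehicle speed
    Given the vehicle is moving at high speed
    When a software error occurs in the system
    Then the system reboots
    And the time to reboot is at most 50 msec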

IMPLEMENTATION

Gherkin stubs

Best practices

Base class reuse

Implementation monitoring

….

TRACEABILITY: ACCEPTANCE TREE

1. Implementation is directly linked to a quality attribute

2. The quality attribute is linked to a business driver

3. The priority of a scenario suggests a starting point for its automation (see the tagging sketch below)
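One lightweight way to keep these links visible in the test suite itself is to tag each Gherkin scenario with its business driver, quality attribute, and priority. The sketch below assumes a tag naming convention of its own; the deck does not prescribe one:

Feature: Availability under software faults

  @driver-vehicle-safety @qa-availability @priority-high
  Scenario: Software error occurs at high vehicle speed
    # Steps as in the feature sketch above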

REAL USE CASE

Project specifics: Disk-level data encryption system
Goal: Form acceptance for the system
• Support a large number of OS
• Test in the cloud
Test definitions: None
Technology: Amazon + LAMP
Resources: A senior developer part time and a junior developer full time

HOW IT MIGHT LOOK IN IDE

DIRECT JUMP TO CODE
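The IDE screenshots are not reproduced in this preview. As a rough idea of how the acceptance scenarios for the disk-encryption case might look in the editor, here is a hypothetical Gherkin sketch; the step wording and the OS values in the Examples table are placeholders, not details of the real project:

Feature: Disk-level data encryption

  Scenario Outline: Encrypted volume is usable after mounting with a valid key
    Given a cloud test machine running "<os>"
    And a disk volume encrypted by the system under test
    When the volume is mounted with a valid key
    Then files written to the volume can be read back
    And the raw disk blocks contain no plaintext

    Examples:
      | os      |
      | linux   |
      | windows |
      | macos   |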

Factor: Work with stakeholders
• QAW-based: It’s better to have everybody in one location, at least for a short period
• Classic analysis: You can have several separate sessions with technical representatives only

Factor: Documentation
• QAW-based: Should be part of the stakeholders’ knowledge only
• Classic analysis: Might be outdated

Factor: Domain knowledge
• QAW-based: Deep knowledge is not needed; the participants are the knowledge holders
• Classic analysis: Required

Factor: Rough estimate
• QAW-based: 2-3 days
• Classic analysis: 3-5 days

GETTING ANALYSIS DATA TOGETHER

Acceptance test suite definition (~30 tests)

Mini-framework implementation

Jenkins integration

RESULT

The project was DECLINED for release because it did NOT pass the automated acceptance tests.

GOT IT?

LET’S PRACTICE!

RECAP: PROCESS DEFINITION

Architecture
• Define Business Goals
• Gather Quality Attributes
• Generate Scenarios

Form Acceptance
• Break scenario into steps
• Define steps in QC language
• Generate test stubs

Test solution
• Prioritize
• Build timeline
• Implement

Trace and update

QUESTIONS?
