FP7-ICT-2013-10 D6.1
Coco Cloud – GA#610853
Page 1 of 45
30/04/2014
Version 1.0
Due date of deliverable: 30/04/2014
Actual submission date: 30/04/2014
Project co-funded by the European Commission within the Seventh Framework Programme
Dissemination Level
PU Public X
PP Restricted to other programme participants (including the Commission Services)
RE Restricted to a group specified by the consortium (including the Commission Services)
CO Confidential, only for members of the consortium (including the Commission Services)
Coco Cloud Confidential and Compliant Clouds
Establishment of the Development
Environment and Test-bed Requirements
WP6 – Test Bed
D6.1
The Coco Cloud project is supported by funding under the Seventh Framework Programme of
the European Commission, Objective ICT-2013.1.5 – Trustworthy ICT. GA #610853
Responsible partner: ATOS
Editor: Anil Ozdeniz
E-mail address: [email protected]
Authors: Anil Ozdeniz, Antonio García-Vázquez, Marina Egea
(ATOS), Laurent Gomez (SAP), Lazouski Aliaksandr,
Paolo Mori (CNR), Mirko Manea, Lorenzo Blasi (HP).
Approved by: Daniele Sgandurra (ICL), Laurent Gomez (SAP)
Revision History
Version | Date | Name | Partner | Sections Affected / Comments
0.1 | 29/01/2014 | Antonio García-Vázquez | ATOS | Initial draft version
0.2 | 01/04/2014 | Laurent Gomez | SAP | Section 2
0.3 | 04/04/2014 | Mirko Manea | HP | Section 5
0.4 | 07/04/2014 | Lazouski Aliaksandr | CNR | Section 3
0.5 | 07/04/2014 | Anil Ozdeniz | ATOS | Ready for internal review
0.6 | 17/04/2014 | Anil Ozdeniz | ATOS | Updated according to reviewers' comments
0.7 | 24/04/2014 | Laurent Gomez | SAP | Updated section 2
0.8 | 25/04/2014 | Lazouski Aliaksandr | CNR | Updated section 3
0.9 | 28/04/2014 | Anil Ozdeniz | ATOS | Version ready for release
0.10 | 29/04/2014 | Lorenzo Blasi | HP | Updated section 3
1.0 | 30/04/2014 | Lorenzo Blasi | HP | Final editing
Executive Summary

The main aim of WP6 is to define and set up the test bed for the Coco Cloud project. The test bed will host the development environment, will allow defining and running integration and regression tests, will package the released Coco Cloud components in a deployable format, and will be available as a deployment environment to run the use cases defined in WP7, which demonstrate basic business case scenarios.
This deliverable describes the work performed in WP6 during the first six months of the project. In particular, the deliverable starts by summarizing the requirements for the test bed that have been collected from the pilots' owners and from the developers of software components. Several types of requirements have been collected, namely development requirements, integration requirements and software requirements. Starting from these requirements, the deliverable
proposes a configuration of OpenStack to support the Coco Cloud framework, and defines the
characteristics of the physical and virtual environment of the test bed. The development
environment software is also defined in this document, in order to have a proper continuous
integration system for the Coco Cloud framework. In particular, a set of tool selection criteria
has been defined, and the most common revision control systems, build automation systems,
continuous integration systems, bug tracking systems and unit test systems have been evaluated
and compared according to these criteria. The resulting choice is the following: Subversion has
been selected as revision control system, Maven as build automation system, Jenkins as
continuous integration system, Trac as bug tracking system and TestNG as unit test system.
Finally, this deliverable defines the quality assurance and security procedures. The CISQ recommendations for automated measurement of Quality Characteristics have been adopted, and a set of static code analysers, each of which covers a subset of the CISQ Quality Characteristics, have been evaluated and compared according to the tool selection criteria defined in section 2. The tools selected for the Coco Cloud test bed are Android Lint, FindBugs, CheckStyle and JaCoCo, because their combined usage will help the project address most of the CISQ recommendations.
Future activities in WP6 will be focused on setting-up and managing the environments described
in this deliverable.
Table of Contents
Executive Summary ........................................................................................................................ 3
1. Introduction ............................................................................................................................. 7
1.1. Structure of the document ................................................................................................ 7
1.2. Definitions and abbreviations........................................................................................... 7
2. Test bed Requirements .......................................................................................................... 10
2.1. Healthcare scenario ........................................................................................................ 10
2.1.1. Development requirements ..................................................................................... 10
2.1.2. Integration requirements ......................................................................................... 11
2.1.3. Software requirements ............................................................................................ 11
2.2. Mobile scenario .............................................................................................................. 11
2.2.1. Development requirements ..................................................................................... 12
2.2.2. Integration requirements ......................................................................................... 12
2.2.3. Software requirements ............................................................................................ 12
2.3. Public administration scenario ....................................................................................... 12
2.3.1. Development requirements ..................................................................................... 12
2.3.2. Integration requirements ......................................................................................... 13
2.3.3. Software requirements ............................................................................................ 13
2.4. Integrity Verification and Attestation module requirements.......................................... 13
2.4.1. Development requirements ..................................................................................... 13
2.4.2. Integration requirements ......................................................................................... 14
2.4.3. Software requirements ............................................................................................ 14
2.5. Other requirements ......................................................................................................... 14
2.5.1. Development requirements ..................................................................................... 14
2.5.2. Integration requirements ......................................................................................... 14
2.5.3. Software requirements ............................................................................................ 15
2.6. Tools Selection Criteria .................................................................................................. 15
3. Integration and Testing Environment ................................................................................... 17
3.1. Physical resources .......................................................................................................... 18
3.2. OpenStack components .................................................................................................. 18
3.3. Component identification ............................................................................................... 20
3.4. Virtual Environment ....................................................................................................... 20
4. Development Environment Software Description ................................................................ 23
4.1. Revision Control Systems .............................................................................................. 24
4.1.1. Concurrent Versions Systems (CVS)...................................................................... 24
4.1.2. Apache Subversion (SVN) ...................................................................................... 25
4.1.3. GIT .......................................................................................................................... 26
4.2. Build Automation System .............................................................................................. 27
4.2.1. Apache Ant ............................................................................................................. 27
4.2.2. Apache Maven ........................................................................................................ 27
4.2.3. Gradle ...................................................................................................................... 28
4.3. Continuous Integration Software ................................................................................... 28
4.3.1. Hudson .................................................................................................................... 29
4.3.2. Jenkins..................................................................................................................... 29
4.3.3. Cruise Control ......................................................................................................... 29
4.4. Bug / Issue Tracking System .......................................................................................... 30
4.4.1. MantisBT ................................................................................................................ 30
4.4.2. Bugzilla ................................................................................................................... 31
4.4.3. Trac ......................................................................................................................... 31
4.5. Unit test system .............................................................................................................. 32
4.5.1. Junit ......................................................................................................................... 32
4.5.2. TestNG .................................................................................................................... 33
4.5.3. Spock....................................................................................................................... 33
4.6. Additional software components .................................................................................... 34
4.7. Final Selection of Development Environment Tools ..................................................... 34
5. Quality Assurance and Security Procedures ......................................................................... 36
5.1. Approach ........................................................................................................................ 36
5.2. Software Code Guidelines .............................................................................................. 36
5.3. Evaluated static code analysers ...................................................................................... 37
5.3.1. Android Lint............................................................................................................ 37
5.3.2. CheckStyle .............................................................................................................. 38
5.3.3. FindBugs ................................................................................................................. 38
5.3.4. PMD ........................................................................................................................ 39
5.3.5. OWASP Dependency Check .................................................................................. 40
5.4. Evaluated code coverage tools ....................................................................................... 40
5.4.1. Cobertura................................................................................................................. 41
5.4.2. JaCoCo .................................................................................................................... 41
5.5. Final selection of Quality and Security Assurance Tools .............................................. 42
6. Next steps .............................................................................................................................. 43
7. References ............................................................................................................................. 44
List of Figures
Figure 1 - Separation between Development and Testing Environments ..................................... 17
Figure 2 - OpenStack functionalities and their relationships ........................................................ 19
Figure 3 - Proposed OpenStack deployment architecture ............................................................. 21
List of Tables
Table 1: Healthcare scenario – Development requirements ......................................................... 11
Table 2: Healthcare scenario – Integration requirements ............................................................. 11
Table 3: Healthcare scenario – Software requirements ................................................................ 11
Table 4: Mobile scenario – Development requirements ............................................................... 12
Table 5: Mobile scenario – Software requirements ...................................................................... 12
Table 6: Public admin scenario – Development requirements ..................................................... 13
Table 7: Public admin scenario – Integration requirements ......................................................... 13
Table 8: Integrity Verification and Attestation module - Development requirements ................. 14
Table 9: Others - Development requirements ............................................................................... 14
Table 10: Others - Integration requirements ................................................................................. 15
Table 11: Tools selection criteria .................................................................................................. 15
Table 12: Tools selection criteria addressing CISQ Quality Characteristics ................................ 16
Table 13: Selection Criteria of Development Environment Tools ............................................... 35
Table 14: Selection Criteria of Quality and Security Assurance Tools ........................................ 42
1. Introduction

The main objective of WP6 Test bed is to put in place the tools, processes and activities for delivering a testable system implementing the Coco Cloud integrated framework architecture. In this respect, the current document aims to define the environment that will be used for developing the Coco Cloud framework architecture. This definition includes the specification of a continuous integration system that we will set up in the coming months. This environment will help to rationalize software development and integration by putting in place several groups of tools that will facilitate collaborative development. This common development environment will include, among others, a unit test system for running automated regression tests, a versioning system, a bug tracking system, and an issue tracking tool.
Additionally, this document provides the definition of the quality management processes and common development practices that will be enforced during the whole development life cycle. This process includes the preparation of a quality plan, the selection of the programming languages, and the provision of coding guidelines and programming tools.
In summary, this document describes the architecture of the cloud development and integration environments supported by Coco Cloud WP6 to allow the development and deployment of pilots and components by the other Coco Cloud work packages.
1.1. Structure of the document
The document is structured as follows:
After this introductory chapter, Chapter 2 summarizes the test bed requirements collected from
each pilot owner.
Chapter 3 provides a description of the physical and virtual elements that will compose both the
development and the integration environments.
Chapter 4 contains an assessment of the tools that could be included in the common development environment.
Chapter 5 describes the strategy for evaluating and measuring the overall software quality both
at and near specific project milestones.
Chapter 6 concludes with the lessons learned from this activity and the next steps to follow in
task T6.1.
1.2. Definitions and abbreviations
Term Meaning
AgID / AGID Agenzia per l’Italia Digitale (Agency for Digital Italy)
API Application Programming Interface
CA Certificate Authority
CAD Codice dell’Amministrazione Digitale (Digital Administration Code)
CI Continuous Integration
CISQ Consortium for IT Software Quality
CPD Copy-Paste-Detector
CPE Common Platform Enumeration
CVE Common Vulnerability and Exposure
DDOS Distributed Denial-of-Service
DDR3 Double Data Rate type 3
DSA Data Sharing Agreement
DSL Domain Specific Language
EU European Union
FTP File Transfer Protocol
FTPS File Transfer Protocol Secure (based on TLS/SSL)
GPL General Public License (GNU)
GUI Graphical User Interface
HDD Hard Disk Drive
IT Information Technology
ICT Information and Communications Technology
IDE Integrated Development Environment
JVM Java Virtual Machine
LGPL Lesser General Public License
LTS Long Term Support
PA Public Administration
PACS Picture Archiving and Communication System
PPL PrimeLife Policy Language
RAM Random Access Memory
SDK Software Development Kit
SSH Secure Shell
SSL Secure Sockets Layer
SVN SubVersioN (versioning system)
TLS Transport Layer Security
VM Virtual Machine
WSDL Web Services Description Language
2. Test bed Requirements

In this section, we list the test bed requirements collected from each Coco Cloud pilot owner and from software component developers. The pilot owners are Quiron for the healthcare use case, AGID for the public administration use case, and SAP for the mobile use case. Those use cases are detailed in deliverable D7.1. In addition, we have collected test bed requirements from software component developers such as ICL and HP.
The following subsections contain tables describing the requirements collected for the test bed,
ordered by priority1. Each requirement can be identified through its unique ID. We distinguish
three types of test bed requirements: development (identified as WP6-Dev-*), integration
(identified as WP6-Int-*) and software related (identified as WP6-Soft-*). Integration and
software requirements will guide the setup of the integration and testing environment, whereas
development requirements will help in the setup of the development environment.
Another type of requirement has been collected from pilot owners and component developers:
criteria for the selection of development tools. Tool selection criteria are listed in section 2.6 and
are identified as WP6-Cri-*.
2.1. Healthcare scenario
In the scope of the healthcare scenario, Quiron proposes to develop the following components:
- The PACS connection/configuration
- Gateway connection from the hospital (private cloud) outwards
- The patient and doctor portals
The requirements related to those components are detailed as follows.
2.1.1. Development requirements
ID | Description | Comment | Priority
WP6-Dev-1 | Code repository | Binary code ONLY, or a hybrid model committing part of the source code (we need to decide our business strategy) | 1
WP6-Dev-2 | Programming language | Java 1.7, with the DCM4CHE toolkit for the PACS connection/configuration implementation | 1
WP6-Dev-3 | Use of Maven as build tool | Both the pilot and the API for the gateway and PACS connection will be built with Maven. | 1
WP6-Dev-4 | Programming language convention | Oracle/Sun's Code Conventions for the Java Programming Language as a guideline, but not strictly | 2
WP6-Dev-5 | Unit testing | Unit tests required for the pilot itself. The gateway connection is merely a component of the entire pilot; however, providing a unit test to check the connection functionalities can be helpful. | 2
Table 1: Healthcare scenario – Development requirements

1 MoSCoW scale: 1 highest – 4 lowest
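Requirements WP6-Dev-2 and WP6-Dev-3 above combine Java 1.7, the DCM4CHE toolkit and a Maven build. As a rough illustration only, a pilot module's pom.xml might start as follows; the project coordinates are invented, and the DCM4CHE groupId/artifactId/version shown must be checked against the toolkit release actually adopted:

```xml
<!-- Illustrative sketch only: all coordinates below are assumptions -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>eu.cococloud</groupId>            <!-- hypothetical -->
  <artifactId>healthcare-pilot</artifactId>  <!-- hypothetical -->
  <version>0.1-SNAPSHOT</version>

  <properties>
    <!-- WP6-Dev-2: target Java 1.7 -->
    <maven.compiler.source>1.7</maven.compiler.source>
    <maven.compiler.target>1.7</maven.compiler.target>
  </properties>

  <dependencies>
    <dependency>
      <!-- DCM4CHE toolkit for the PACS connection/configuration;
           verify these coordinates against the actual release used -->
      <groupId>org.dcm4che</groupId>
      <artifactId>dcm4che-core</artifactId>
      <version>3.x</version>
    </dependency>
  </dependencies>
</project>
```

With such a layout, `mvn package` would build both the pilot and the gateway/PACS API, satisfying WP6-Dev-3.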
2.1.2. Integration requirements
ID | Description | Comment | Priority
WP6-Int-1 | Integration with DSA enforcement component | The public cloud will be hosted on the test bed. The private cloud part (e.g., images and reports) will remain on the hospital side. | 1
Table 2: Healthcare scenario – Integration requirements
2.1.3. Software requirements
ID | Description | Comment | Priority
WP6-Soft-1 | Swift/Cinder on OpenStack | Storage | 1
WP6-Soft-2 | Nova and Glance on OpenStack | Also, orchestration may be required for management and technical administration purposes. As the e-health pilot does not require high performance in calculation processing, the virtual machines may not be necessary. | 3
Table 3: Healthcare scenario – Software requirements
2.2. Mobile scenario
In the scope of the mobile scenario, SAP proposes to develop a cloud-based PPL engine. This component will be in charge of enforcing the access and usage control policies defined in the scope of the mobile scenario. In this scenario, the data subject defines a set of privacy preferences on data stored on a cloud-based service. The latter enforces the security policy defined by the data subject and enables data sharing with mobile users.
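As a concrete (and deliberately simplified) illustration of this enforcement step, the following Java 1.7-compatible sketch models a data subject's preferences and the permit/deny decision the PPL engine would take; the class and method names are assumptions for illustration, not the actual PPL API:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

/**
 * Illustrative sketch (Java 1.7 compatible) of the kind of decision a
 * cloud-based PPL engine makes. Class and method names are assumptions,
 * not the actual PPL API.
 */
public class PplSketch {

    private final Set<String> allowedUsers;
    private final Set<String> allowedPurposes;

    public PplSketch(Set<String> users, Set<String> purposes) {
        this.allowedUsers = users;
        this.allowedPurposes = purposes;
    }

    /** A request is permitted only if both user and purpose match the preferences. */
    public boolean permit(String user, String purpose) {
        return allowedUsers.contains(user) && allowedPurposes.contains(purpose);
    }

    public static void main(String[] args) {
        // The data subject allows only "alice" to access the data, and only for "sharing".
        PplSketch prefs = new PplSketch(
                new HashSet<String>(Arrays.asList("alice")),
                new HashSet<String>(Arrays.asList("sharing")));
        System.out.println(prefs.permit("alice", "sharing")); // true
        System.out.println(prefs.permit("bob", "sharing"));   // false
    }
}
```

The real engine evaluates machine-readable PPL policies rather than in-memory sets, but the permit/deny contract is the same.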
The requirements related to those components are detailed as follows.
2.2.1. Development requirements
ID | Description | Comment | Priority
WP6-Dev-6 | Code repository | Binary only | 1
WP6-Dev-7 | Programming language | Java 1.7 | 1
WP6-Dev-9 | Maven as build tool | | 2
WP6-Dev-10 | Programming convention | Oracle/Sun's code conventions for the Java Programming Language as a guideline, but not strictly. | 2
WP6-Dev-8 | Unit testing | No preference on unit testing technology | 3
Table 4: Mobile scenario – Development requirements
2.2.2. Integration requirements
No integration requirement has been identified for the mobile use case.
2.2.3. Software requirements
ID | Description | Comment | Priority
WP6-Soft-3 | Swift for repository | | 1
WP6-Soft-4 | Keystone for Identity Management | | 1
WP6-Soft-5 | Ubuntu Server or JCloud as application server | | 1
Table 5: Mobile scenario – Software requirements
2.3. Public administration scenario
AGID proposes a prototype that simulates data sharing between Public Administrations (PAs)
mainly through web-based or machine-to-machine interactions.
2.3.1. Development requirements
ID | Description | Comment | Priority
WP6-Dev-11 | Code repository | No constraints about the use of source code (but keep in mind the above issue about developing). | 3
WP6-Dev-12 | Programming language: Java | Latest version | 1
WP6-Dev-13 | Unit testing | | 3
WP6-Dev-14 | Programming convention | | 3
Table 6: Public admin scenario – Development requirements
2.3.2. Integration requirements
ID | Description | Comment | Priority
WP6-Int-2 | Integration with the DSA authoring tool | | 2
Table 7: Public admin scenario – Integration requirements
2.3.3. Software requirements
No software requirement has been identified.
2.4. Integrity Verification and Attestation module requirements
ICL proposes the implementation of a module for integrity verification and attestation, both for cloud and mobile environments. It is meant to run stand-alone, as an independent plugin.
2.4.1. Development requirements
ID | Description | Comment | Priority
WP6-Dev-15 | Code repository | Source code under GPL license | 1
WP6-Dev-16 | Programming language | C | 1
WP6-Dev-17 | Programming language | Java 1.7 | 1
WP6-Dev-18 | Programming language | Java for Android | 1
WP6-Dev-19 | Deployment | Not necessarily; a simple Makefile could be sufficient | 2
WP6-Dev-20 | Programming convention | Mostly Oracle/Sun's Code Conventions for the Java Programming Language, but with some differences (e.g., newline before/after curly braces, etc.). | 2
WP6-Dev-21 | Unit testing | | 3
Table 8: Integrity Verification and Attestation module – Development requirements
2.4.2. Integration requirements
No integration requirement has been identified.
2.4.3. Software requirements
No software requirement has been identified.
2.5. Other requirements
HP proposes to contribute to the whole Coco Cloud architecture by developing a Graphical User
Interface to create and manage Data Sharing Agreements. The component will be a web browser
GUI extensible with plugins to generate low level enforceable security policies. Other proposed
contributions to the architecture are mechanisms for usage control enforcement on both server
(OpenStack) and client (browsers) side. These components will enforce security policies defined
through the GUI and will enable data sharing with desktop users.
2.5.1. Development requirements
ID | Description | Comment | Priority
WP6-Dev-22 | Code repository | Yes, binaries for the DSA UI and possibly code for the other components | 1
WP6-Dev-23 | Programming language | Java 1.7 | 1
WP6-Dev-24 | Unit testing with JUnit | | 1
WP6-Dev-25 | Deployment tool | Maven | 1
WP6-Dev-26 | Programming convention | CISQ guidelines will be followed | 1
Table 9: Others – Development requirements
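WP6-Dev-24 mandates unit testing with JUnit. To illustrate the kind of check such a test expresses, the following framework-free Java sketch mirrors a JUnit assertion (a real test would put the same logic in a JUnit @Test method with Assert.assertEquals); the name-normalizing helper under test is hypothetical:

```java
/**
 * Framework-free sketch of a unit check; in the project this logic would
 * live in a JUnit @Test method. The helper under test is hypothetical.
 */
public class DsaNameTest {

    /** Hypothetical helper under test: normalizes a DSA display name. */
    public static String normalizeName(String raw) {
        // Trim the ends and collapse internal runs of whitespace.
        return raw.trim().replaceAll("\\s+", " ");
    }

    public static void main(String[] args) {
        // Equivalent of Assert.assertEquals(expected, actual) in JUnit.
        String actual = normalizeName("  Data   Sharing Agreement ");
        if (!"Data Sharing Agreement".equals(actual)) {
            throw new AssertionError("unexpected: " + actual);
        }
        System.out.println("test passed");
    }
}
```

Tests of this shape plug directly into the Maven build (WP6-Dev-25), where the surefire phase runs them on every build.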
2.5.2. Integration requirements
ID | Description | Comment | Priority
WP6-Int-3 | Direct integration of DSA UI with its plugins | | 1
WP6-Int-4 | Usage control enforcement components (PEP) may need integration with attribute servers (PIP), policy engines (PDP), and also with other security components (e.g. PKI infrastructure) | | 2
Table 10: Others – Integration requirements
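To make the PEP/PIP/PDP interplay of WP6-Int-4 concrete, the following minimal Java sketch wires the three roles together under assumed names; the toy policy and interfaces are illustrative, not the project's actual enforcement components:

```java
/**
 * Hedged sketch of the PEP/PIP/PDP interplay of WP6-Int-4. Names mirror
 * the usual usage-control terminology but are illustrative only.
 */
public class UsageControlSketch {

    /** PIP: resolves subject attributes needed by the policy engine. */
    public interface AttributeSource {
        String roleOf(String subject);
    }

    /** PDP: evaluates the policy against the resolved attributes. */
    public static boolean decide(AttributeSource pip, String subject, String action) {
        // Toy policy: only subjects with role "doctor" may "read".
        return "doctor".equals(pip.roleOf(subject)) && "read".equals(action);
    }

    public static void main(String[] args) {
        // PEP: intercepts the request, queries the PDP, and enforces the outcome.
        AttributeSource pip = new AttributeSource() {
            public String roleOf(String subject) {
                return "alice".equals(subject) ? "doctor" : "guest";
            }
        };
        System.out.println(decide(pip, "alice", "read")); // true
        System.out.println(decide(pip, "bob", "read"));   // false
    }
}
```

In the real architecture, the PIP call would be a query to a remote attribute server and the PDP a full policy engine, which is exactly why WP6-Int-4 flags their integration as a test bed concern.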
2.5.3. Software requirements
No software requirement has been identified.
2.6. Tools Selection Criteria
The development environment described in section 4 implements a continuous delivery pipeline in which code artifacts are regularly generated from the sources. Working in this context is a mandatory requirement for any tool used to assist the code quality process. Tools should also address as many as possible of the CISQ recommendations (see section 5.1 and [29]) for the quality characteristics of reliability, efficiency, security and maintainability. They should also support the chosen reference language (Java) and could provide valuable input for specific technologies (mobile with Android, and web frameworks).
The following table describes common criteria collected from both pilot owners and component
developers for the tools’ selection, ordered by priority2:
ID | Description | Priority
WP6-Cri-1 | Build automation environments support (code repository, build, CI and unit test tools) | 1
WP6-Cri-2 | Free or open source | 1
WP6-Cri-3 | Java support | 1
WP6-Cri-4 | Mobile platforms (Android) support. Note: the Mobile Business Case described in WP7 will use a mobile device. | 2
WP6-Cri-5 | IDE integration (e.g. Eclipse). Note: this helps find bugs earlier in the development lifecycle. | 3
WP6-Cri-6 | Specific web technologies/frameworks support (e.g. JSP, servlets, Struts, etc.) | 3
Table 11: Tools selection criteria
The following table lists the criteria to be taken into account in the selection process in order to address the CISQ Quality Characteristics (the most relevant characteristic(s) are marked):
2 MoSCoW scale: 1 highest –4 lowest
ID | Description | Priority | Characteristics addressed (X marks across Reliability, Efficiency, Security, Maintainability)
WP6-Cri-7 | Analyse common security (web) vulnerabilities (e.g. Cross-Site Scripting, SQL Injection, etc.). | 1 | X
WP6-Cri-8 | Coding standards (e.g. Java coding guidelines). | 2 | X
WP6-Cri-9 | Check for bad programming practices (e.g. unhandled exceptions, code copy and paste, etc.). | 1 | X X X
WP6-Cri-10 | Work even without source code. This allows evaluating third-party Java byte code or closed-source Java components. | 2 | X X
WP6-Cri-11 | Check testing coverage. This could work harmonically with the unit testing framework envisioned for the development environment. | 3 | X
Table 12: Tools selection criteria addressing CISQ Quality Characteristics
3. Integration and Testing Environment
Two different groups of physical/virtual machines are foreseen:
- The first group will host the Development environment and the software tools in charge of the continuous integration management system.
- The second will implement the Integration and Testing environment, where developed components and pilot applications are going to be deployed and evaluated.
Separation between the two environments will offer more stability for Integration and Testing, and the needed flexibility for Development.
The following Figure 1 shows the separation between Development and Testing environments.
The Testing environment is shown here with the preliminary deployment configuration of five
Virtual Machines proposed in section 3.4 to host the Swift (object storage) OpenStack
component. Other possible and more complex deployment configurations may be studied in the
future.
Figure 1 - Separation between Development and Testing Environments
3.1. Physical resources
The development environment is going to be hosted on an HP ProLiant DL385 G7 server already available in the CNR data centre. The server is designed to support a virtualization environment and has the following hardware specification:
- two 16-core AMD Opteron 6274 server processors (2.20 GHz per core) with 16 MB of level-3 cache per processor
- 32 GB of DDR3 registered random access memory
- 4 TB of HDD storage
- two HP NC382i dual-port multifunction Gigabit server adapters (network controllers)
- graphics: integrated ATI ES1000, 16-bit color and maximum resolution of 1600 x 1200
The integration and testing environment is going to be hosted on another HP ProLiant DL385p G8 server. The server has been ordered and should arrive by the end of April 2014. It has the following hardware specification:
- two 16-core AMD Opteron 6376 server processors (2.30 GHz per core) with 16 MB of level-3 cache per processor
- 32 GB of DDR3 registered random access memory
- 8 TB of HDD storage
Both servers are located in the data centre of the National Research Council of Italy (CNR) in Pisa, with the following environmental specification:
- temperature range: +10 to +28 °C
- humidity range: 10% – 90%
3.2. OpenStack components
OpenStack [1] is the IaaS cloud framework to be used as a basis for Coco Cloud Test bed.
OpenStack is an open source project backed by an independent foundation and supported by several corporate sponsors, such as Hewlett Packard, Rackspace, Ericsson and Intel, and by a global community with more than 7,000 members representing 850 unique organizations across 88 countries. The source code is available under the Apache 2.0 license.
OpenStack is composed of the following modules mapping the fundamental IaaS services:
Nova provides computation services (Virtual Machines)
Neutron (formerly Quantum) provides networking services (Virtual Networks).
Cinder provides block storage services (Virtual Disks)
Swift implements object storage services (files)³.
Some other modules are available implementing other important functionalities:
Horizon provides a web front-end for managing and controlling purposes
³ This service allows users to store ‘objects’ (i.e. files) in the cloud. It can be used, for example, for sharing photos, music, documents, etc.
Glance implements a catalogue for storing virtual machine images
Keystone implements authentication and authorization
Heat uses other components for orchestrating the creation/deletion of IaaS object
aggregations
Ceilometer monitors the usage of resources for metering and accounting purposes.
Communication between OpenStack modules relies on a message broker middleware (e.g.
RabbitMQ) and status information is stored in a centralized database (e.g. MySQL).
Figure 2 below shows the relationships between OpenStack components.
Figure 2 - OpenStack functionalities and their relationships
Most OpenStack modules are composed of sub-modules implemented as programs usually
running on physical machines (servers or nodes). For example Cinder, the component providing
block storage services, is composed of the cinder-api, cinder-scheduler and cinder-volume subcomponents.
Every OpenStack subcomponent can in principle be run on a dedicated node, but usually several
subcomponents, even of different modules, are co-located on a single node. The way the
subcomponents are installed on the nodes is described in the deployment architecture.
This deliverable identifies the OpenStack modules that will be installed in the Coco Cloud test
bed and proposes an initial deployment strategy; the exact deployment architecture and further
configuration details will be fixed after some experimentation with real usage and load.
3.3. Component identification
To select which OpenStack modules should be hosted in the Integration and Testing environment
the software requirements expressed in section 2 by pilot owners and component developers
have been considered.
The main requirement common to all pilots is to have storage space (storage as a service)
available in the test bed cloud, since Coco Cloud is centred on the concept of data sharing and
will provide protection of data resources with improved privacy for users. This means installing
the Swift [2] OpenStack module, but also Keystone [3] for basic authentication and
authorization. The other OpenStack storage module, Cinder, is typically used to provide storage
volumes for virtual machines but, since Nova is not needed, it will not be considered in the first
configuration.
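As an illustration of the storage-as-a-service requirement, the sketch below builds (but does not send) a Swift v1 “create object” request. The endpoint URL and token are hypothetical placeholders; a real client would obtain both from Keystone during authentication:

```python
from urllib import request

# Hypothetical values: a real client obtains the storage URL and an
# auth token from Keystone before talking to Swift.
STORAGE_URL = "http://swift.example.org:8080/v1/AUTH_demo"
TOKEN = "hypothetical-token"

def build_put_object(container: str, name: str, payload: bytes) -> request.Request:
    """Build a Swift v1 object-creation request:
    PUT /v1/<account>/<container>/<object> with the token in X-Auth-Token."""
    url = f"{STORAGE_URL}/{container}/{name}"
    return request.Request(url, data=payload, method="PUT",
                           headers={"X-Auth-Token": TOKEN})

req = build_put_object("reports", "d6.1.pdf", b"%PDF-...")
```

On a live cluster, `request.urlopen(req)` would perform the upload, with Swift answering 201 Created on success.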
A further module which can be installed is Horizon [4], offering easier web-based access to the
resources exported by the other components. Ceilometer [5] can also be installed if specific metering needs arise.
The deployment of the OpenStack components indicated above, in a typical configuration, will
be done on several nodes. Since only one physical server is available, each of the needed nodes
will be hosted on a different virtual machine; therefore a computing virtualization environment is
also needed. The already available physical server runs VMware vSphere 5.0 hypervisor software to allow creation of virtual machines. To have a uniform environment allowing easier management, the server dedicated to the integration and testing environment can also use the same virtualization environment; therefore, the installation of Nova may not be needed.
3.4. Virtual Environment
On the first physical server a virtual machine has been already created and is accessible via SSH
port 22. This virtual machine is dedicated to the development environment; it runs Ubuntu
12.04 LTS desktop version and has the following specification (however, it is possible to
customize the virtual machine and create new ones should the project require them):
- two 4-core processors (2.20 GHz per core)
- 8 GB of RAM
- 250 GB of hard disk
The second physical server is available to create virtual machines for the integration and testing
environment.
To deploy the Swift, Horizon and Keystone OpenStack modules, the following subcomponents are needed (only Keystone has no subcomponents).
Common components:
Message broker (e.g. RabbitMQ)
Database (e.g. MySQL)
For Horizon:
Horizon dashboard
Horizon database
For Swift:
Object storage API
Swift-proxy
Swift account/container databases
Storage nodes (with their connected disk partitions)
The proposed OpenStack deployment architecture for the Coco Cloud test bed is shown in Figure
3. The needed components will be deployed on five virtual machines: one VM acting as
controller node with all the common components and the dashboard GUI, and four VMs hosting
as many Swift storage nodes.
Figure 3 - Proposed OpenStack deployment architecture
Swift replicates data across several (by default three) storage nodes for high availability of data.
It allows defining different Zones of storage nodes to separate entities which are not supposed to
fail at the same time (e.g. different disks, different physical machines, even different sites). Each
replica is stored in a separate zone, if possible. The proposed configuration of the Coco Cloud test
bed will define two zones, each with two storage nodes. Alternatively the four storage nodes
could be defined as four different zones.
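The zone-aware placement described above can be sketched as follows. This is an illustrative toy, not Swift’s actual ring algorithm; the zone layout mirrors the proposed two-zones-of-two-nodes configuration, and the node names are invented:

```python
import hashlib

# Two zones with two storage nodes each, as in the proposed test bed.
ZONES = [["storage1", "storage2"], ["storage3", "storage4"]]

def place_replicas(object_name: str, zones, replica_count: int = 2):
    """Pick one node per replica, never reusing a zone (a toy version of
    Swift's rule that replicas should live in separate zones)."""
    digest = int(hashlib.md5(object_name.encode()).hexdigest(), 16)
    placement = []
    for i in range(replica_count):
        zone = zones[(digest + i) % len(zones)]  # distinct zone per replica
        node = zone[digest % len(zone)]          # node within that zone
        placement.append(node)
    return placement
```

With `replica_count` no larger than the number of zones, a disk or machine failure in one zone can never take out every copy of an object.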
The message broker middleware hosted on the controller node is used by OpenStack modules to
communicate with each other, and the databases allow each module to store status information.
Each virtual machine in the proposed configuration will be connected to two networks: the
Management network, which will be used for the communication between the OpenStack
elements, and the External and API network, which exposes all OpenStack API endpoints and is
used for the communication between VMs running in the cloud and any external entity (e.g. end
users on Internet).
Each of the five virtual machines will run Ubuntu 12.04 and will have at least the following
resources:
- a single-core processor (2.30 GHz)
- 2 GB of RAM
- 30 GB of hard disk
4. Development Environment Software Description
This chapter describes the tools that will be used in the management of Coco Cloud’s continuous integration system. Continuous Integration (CI) is the practice, in software engineering, that requires developers to integrate all working copies into a shared repository several times a day. Each integration is verified by an automated build and test to detect integration errors as quickly as possible [15]. According to [15], the following key practices should be followed in a continuous integration system:
- Maintain a code repository: It is strongly recommended to have a revision control system for
storing the source code of the project. All artifacts required to build the system should be placed
in this common repository. The most common and recommended practice in this context is that
the system should be “buildable” from a fresh checkout without requiring additional external
dependencies. Consequently, as highlighted above, all artifacts should be stored in the same
repository.
- Automate the build: It is also recommended to automate as much as possible, preferably with a
single command, the building of the system. Automation of the build should include automating
the integration, which often includes deployment into a production-like environment. In many
cases, the build script not only compiles binaries, but also generates documentation, website
pages, statistics and distribution media.
- Make the build self-testing: Once the system has been built, all tests should be executed to verify that the system behaves as expected. This set of tests should include unit and functional tests but also integration tests, performance tests, security tests, etc.
- Everyone commits to the baseline regularly: By committing regularly, the number of conflicts detected by each developer will be reduced and, since they have been detected at an early stage, they should be easy to resolve. Committing all changes at least once a day is generally considered part of the definition of Continuous Integration.
- Every commit should be built: After every main commit, the system should be completely built to verify that the components still integrate correctly after the new changes. Usually this check is done in an automated way, but it can also be done manually.
- Keep the build fast: The build procedure should be as fast as possible in order to detect conflicts and integration problems as quickly as possible.
- Test in a clone of the production environment: This testing environment should be a scaled version of the production environment, reducing the cost of setting it up and maintaining it while preserving the technology stack composition and its nuances. Differences between the testing and production environments can lead to failures, not detected in the testing environment, when deploying to production.
- Make it easy to get the latest deliverables: Making documentation easily available to all stakeholders during the whole development life cycle can reduce the amount of rework necessary when rebuilding a feature that does not meet requirements. Everyone can see the results of the latest build.
- Everyone can see what’s happening: Everyone should be able to find out whether the build breaks and, if so, who made the relevant change.
- Automate deployment: It is recommended to write a script to automate the deployment of the application to a live testing environment server that everyone can look at. A further refinement would be to automate deployment to the production environment, of course also including automated tests to prevent defects or regressions.
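The “make the build self-testing” and “keep the build fast” practices above boil down to running an ordered series of steps and failing fast so that feedback reaches developers quickly. A minimal, generic sketch (the step names are invented for illustration, not part of any real pipeline):

```python
def run_pipeline(steps):
    """Run named build steps in order; record results and stop at the
    first failure so that feedback reaches developers quickly."""
    results = []
    for name, step in steps:
        try:
            step()
            results.append((name, "ok"))
        except Exception as exc:
            results.append((name, f"failed: {exc}"))
            break  # fail fast: later steps are skipped
    return results

def failing_step():
    raise RuntimeError("1 test failed")

# Illustrative pipeline: 'compile' succeeds, 'unit-tests' fails, 'deploy' never runs.
outcome = run_pipeline([
    ("compile", lambda: None),
    ("unit-tests", failing_step),
    ("deploy", lambda: None),
])
```

A real CI server such as those compared in section 4.3 implements this same loop, plus scheduling, notifications and reporting.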
In order to manage Coco Cloud’s continuous integration there should be a revision control system for versioning the project, a build automation system for compiling and building the project code, CI software for automating the CI process, a bug tracking system for tracking issues and, finally, a unit test system for testing the built code.
4.1. Revision Control Systems
A Revision Control System is the central element of a CI process. It keeps projects from spinning out of control by letting developers each tackle the project from a different angle, focusing only on the components they are developing, without getting in each other’s way and without doing damage that cannot be undone. In addition to storage of source code, these solutions also offer advanced capabilities that help developers track changes easily:
- Merge and version tracking.
- Feature-based workflow.
- Full or partial change upload.
- File locking (exclusive check-out).
Possible solutions in this area are described and analysed in the following subsections.
4.1.1. Concurrent Versions Systems (CVS)
CVS [18] is a long-established (dating from the 1980s) and popular version control system. CVS is still maintained, but new features are no longer added. It is released under the GNU license [WP6-Cri-2].
CVS is a centralized version control system where the version history is stored on a single
central server and the client machines have a copy of all the files that the developers are working
on [6]. In order to perform CVS operations like check-ins and updates, a network connection must be available between client and server, but developers can edit files in their local copy. CVS can
handle branching projects so the developed software can diverge into different products with
unique features and will be reconciled at a later time. However, the usage of this branching
functionality usually introduces additional complexity to the integration and building process if it
is not managed carefully.
The CVS server runs on Unix-like systems and client software runs on multiple operating
systems. There is also a project named CVSNT which was created to run CVS on Windows
servers.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Has been in use for many years and is considered mature technology.
↑ Excellent cross-platform support; ported to Windows, Linux, Mac.
↑ Good support from a huge CVS community.
↑ CVS is one of the older source-management systems, so serious bugs have been fixed.
X There is no integrity checking for the source-code repository.
X Check-outs and commits are not atomic.
X Branch operations are expensive [7].
X CVS does not support versioning of renamed and moved files.
X Security risks from symbolic links to files.
X No atomic operation support, leading to source corruption.
4.1.2. Apache Subversion (SVN)
SVN [19] was created as an alternative to CVS that would fix some bugs in the CVS system while maintaining high compatibility with it. Like CVS, SVN is free and open source, with the difference of being distributed under the Apache license [WP6-Cri-2] as opposed to the GNU license. It is one of the most popular version control systems.
SVN is also a centralized version control system like CVS. It has a central repository which stores all files, and users must communicate over the network with the central repository to obtain the history of files. The disadvantage of this centralized approach is that when the server is down, no clients are able to access the code. Because of this centralized approach, SVN is comparatively slower and lacks distributed revision control. Distributed revision control uses a peer-to-peer model rather than a centralized server to store code updates. While a peer-to-peer model works better for world-wide, open source projects, it may not be ideal in other situations, such as development in a distributed but controlled (in terms of size) community.
To prevent the database from being corrupted, SVN employs a concept called atomic operations: either all of the changes made to the source are applied or none are, meaning that no partial changes will break the original source.
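The same all-or-nothing idea can be illustrated at the file level: write the new content to a temporary file and atomically rename it over the old one, so a reader sees either the old version or the new one, never a partial write. This is a generic sketch of atomicity, not SVN’s actual implementation:

```python
import os
import tempfile

def atomic_write(path: str, text: str) -> None:
    """Replace the file at `path` with `text` atomically: the temporary
    file lives in the same directory, so os.replace() is a single rename."""
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "w") as f:
            f.write(text)
        os.replace(tmp, path)  # atomic rename on POSIX and Windows
    except BaseException:
        os.unlink(tmp)  # clean up the partial temporary file
        raise
```

SVN applies the same principle to whole changesets rather than single files: a commit either lands completely or not at all.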
Additionally, SVN addresses CVS’s problems with branching operations: SVN is designed to allow for them, lending itself better to large, forked projects with many directions. The SVN repository consists of three key areas: Trunk, Branches and Tags [9]:
- Trunk: the central area where stable code and files are stored. Users get updates from trunk.
- Branches: when working on a new feature, users can get an exact copy of the trunk and, once they are done, merge their changes back to trunk.
- Tags: tags are similar to branches; users can get an exact copy of the trunk. But tags are used in deployment, when the code is ready for release. They can also be used for reverting the code back to a stable state.
The SVN servers run on all modern flavours of UNIX, Win32, BeOS, OS/2, MacOS X [10] and
client software runs on multiple operating systems.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Single repository provides easy management of data.
↑ Wider range of user interface tools and plugins for IDEs.
↑ Automatic detection of binary files.
↑ Partial checkout.
↑ Shorter and predictable revision numbers.
↑ Cheaper branch operations when compared to CVS.
↑ Supports atomic checkouts and commits.
X Contains more bugs when compared to CVS.
X Still contains bugs relating to renaming files and directories.
X Slower speed when compared to Git.
X Insufficient repository management commands.
X Restricted access control.
4.1.3. GIT
Git [20] is another popular open source, GNU Licensed [WP6-Cri-2] version control system that takes a radical approach differing greatly from CVS and SVN. The original concept for Git was to make a faster, distributed revision control system that would openly defy the conventions and practices used in CVS. Git has multiple repositories: there is one central repository, but each developer also has their own. Network traffic is not needed to commit code into source control; developers can quickly perform versioning operations on their local repository and, when the code is ready, push it to the master repository. Git was primarily developed for Linux and achieves its highest performance there. It also runs on other Unix-like systems, and native ports of Git are available for Windows as msysgit.
Git is a distributed version control system which best suits large teams; however, as there is no mandatory central server, the code may not necessarily be available when using a non-repository computer, so Git lends itself less well to single-developer projects or small teams. Workarounds exist for this problem, and some see Git’s improved speed as a decent trade-off for the hassle.
Git also comes equipped with a wide variety of tools to help users navigate the history system.
Each instance of the source contains the entire history tree, which can be useful when developing
without an internet connection.
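Part of what makes each Git clone self-sufficient is that every object in the history tree is content-addressed: a file’s identifier is the SHA-1 of a small header plus its raw bytes, so any clone computes identical identifiers independently. A minimal sketch of the blob identifier computation:

```python
import hashlib

def git_blob_sha(content: bytes) -> str:
    """Compute the identifier Git assigns to a file's content (a 'blob'):
    SHA-1 over the header 'blob <size>\\0' followed by the raw bytes."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches the output of `echo hello | git hash-object --stdin`.
blob_id = git_blob_sha(b"hello\n")
```

Because identifiers are derived purely from content, two developers who commit the same file always produce the same object, which is what makes Git’s distributed merging workable.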
Git supports most major operating systems including BSD, Solaris, Linux, OS X, and Microsoft
Windows.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Better access control; you can decide when to merge what from whom.
↑ Extremely fast when compared to other version control systems.
↑ Repository and working directory sizes are extremely small when compared to SVN.
↑ Full history tree available offline.
↑ Cheap branch operations.
X Not optimal for single developers.
X Long and unpredictable revision numbers.
X Limited Windows support compared to Linux.
X Narrower range of user interface tools when compared to SVN.
X Partial checkout is not available.
X Learning curve for those used to SVN.
4.2. Build Automation System
Build automation systems are in charge of scripting or automating the process of compiling source code into binary code.
In the next subsections, an assessment of the most well-known tools in this area is provided.
4.2.1. Apache Ant
Ant [16] is an open source, Apache Licensed [WP6-Cri-2] software tool for automating software build processes. Ant provides built-in tasks to compile, assemble, test and run Java applications. It is implemented in Java and used mostly for Java projects; however, Ant can also be used effectively to build non-Java applications, such as C or C++ applications.
One of the strengths of Ant is its backwards compatibility. Ant can be used on various systems, ranging from machines that have not been updated for ages to brand-new computer systems. A disadvantage of maintaining this sort of support, however, is that new Java features cannot be used in the Ant core. On the other hand, knowing that one’s build can run pretty much everywhere is important.
Ant was the first of the so-called “modern build tools”, and is used for automating build process.
More generally, Ant can be used to pilot any type of process which can be described in terms of
targets and tasks [16].
Ant has been criticized for some drawbacks. Build scripts are written in XML, which is hierarchical by nature and not suitable for procedural programming, something often needed in build scripts. Ant is often claimed to be overly verbose, which means it can eventually make build scripts for large projects unmanageably big.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Fast builds.
↑ Wide community support.
↑ Backwards compatibility.
X No formal conventions: users have to configure Ant exactly where to find the source and where to put the output.
X Long setup time.
4.2.2. Apache Maven
Maven [21] is an open source, Apache Licensed [WP6-Cri-2] software tool for automating software build processes. It is not only a build automation tool; it can also be used to manage a
software project's build, reporting and documentation. Maven natively supports building Java
applications. It can also build other language projects with plugins.
Like Ant, Maven uses XML for its build file (pom.xml), but the structure is very different because it is declarative. Developers do not have to write down all the commands that lead to some goal; instead, they describe the structure of the project (if it deviates from the conventions) and Maven itself provides the available targets that a programmer can invoke. In addition, Maven is capable of downloading dependencies over the network.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Wide community support.
↑ Quick setup time due to the conventions.
↑ Wide plugin support.
X Long build time.
4.2.3. Gradle
Gradle [22] is a newer open source, Apache Licensed [WP6-Cri-2] software tool for automating software build processes. Gradle is the youngest build tool of the three; its developers tried to combine the power and flexibility of Ant with the dependency management and conventions of Maven into a more effective way to build [17].
The main difference between Gradle and Ant or Maven is the use of a Domain Specific Language (DSL) instead of XML. The DSL is based on the JVM language Groovy, which was invented so that developers could ditch the verbosity of XML and write simpler and clearer statements. This has sparked a debate among users as to whether the standard, easily understandable (but long-winded) style of XML is better or worse than a DSL.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Change detection: only tasks that have changed since the last run are executed.
↑ Fewer configurations when compared to Ant and Maven.
X Newest technology: less community support, more potential bugs.
X Performance problems.
4.3. Continuous Integration Software
Continuous integration software helps speed up and manage the CI process. It can also be used to run defined artifacts and to deploy software projects into both development and integration environments.
The selected tool should integrate with both the source code repository and the build automation system.
Some other features that should also be present are:
- Unit test reporting
- Multiple environments simultaneous management
- Monitoring executions of externally-run jobs
- Real-time notifications (e-Mail)
In the next subsections, descriptions of the most well-known tools in this area are provided.
4.3.1. Hudson
Hudson [23] is an open source MIT licensed [WP6-Cri-2] application that monitors executions
of repeated jobs. It can be used for building and testing software projects continuously or
monitoring executions of external jobs like cron jobs.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Hudson is more stable and has fewer bugs when compared to Jenkins.
X Less community support when compared to Jenkins.
X Hudson’s development is less active.
X Fewer plugins when compared to Jenkins.
4.3.2. Jenkins
Jenkins [24] is another open source, MIT licensed [WP6-Cri-2] continuous integration software. In 2011 the Hudson team moved to developing Jenkins [11]; both Jenkins and Hudson support common features. However, Jenkins is widely supported by the community and a wide variety of plugins have been developed for it.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Jenkins has a larger number of plugins when compared to Hudson.
↑ The community supports Jenkins [11].
↑ Jenkins has more frequent releases when compared to Hudson.
X Jenkins is less stable and has more bugs when compared to Hudson.
4.3.3. Cruise Control
Cruise Control [25] is both a continuous integration tool and an extensible framework for creating a custom continuous build process. It is an open source application licensed under a BSD-style license [WP6-Cri-2].
The main difference between Cruise Control and Jenkins or Hudson is that Cruise Control still uses XML for configuring jobs. Cruise Control is an older application when compared to the others and has better documentation.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Cruise Control has extensive documentation.
↑ Rich GUI reporting.
X Subversion support is only for change detection.
X Does not provide output analysis of JUnit or FindBugs.
4.4. Bug / Issue Tracking System
In order to track bugs and issues arising from the software components to be created in the Coco Cloud project, as well as from the software tools and the physical environments, different bug and issue sources must be supported, for example by defining multiple instances or sub-projects under the root Coco Cloud project.
The main benefit of a bug tracking system is to provide a clear centralized overview of
development requests (including bugs and improvements, the boundary is often fuzzy), and their
state. The prioritized list of pending items (often called backlog) provides valuable input when
defining the product road map, or maybe just "the next release". Additionally, a bug tracking
system may be used to generate reports on the productivity of programmers at fixing bugs.
However, this may sometimes yield inaccurate results because different bugs may have different
levels of severity and complexity. The severity of a bug may not be directly related to the
complexity of fixing the bug, and there may be different opinions among the managers and architects. The system must also provide users with an easy way to categorize defects (based on severity and/or priority levels), as well as tools to manage the bug/issue life cycle (from the detection stage to resolution).
Some other features that should also be present are:
- Unit test reporting
- Real time notifications (e-Mail) to both issues creator and developers.
- Test tools integration
Possible implementations are discussed in the next subsections.
4.4.1. MantisBT
MantisBT [12] is an open source, GNU GPL licensed [WP6-Cri-2] web-based bug/issue tracking system. It supports Linux, Windows and Mac OS X on the server side. The client application is supported by most common web browsers such as Chrome, Firefox, Safari, Opera and IE 7+ [12].
Mantis is written in PHP, and works on various databases including MySQL, MS SQL and
PostgreSQL. Mantis provides features such as: source code integration, time tracking, issue
relationship graph, custom fields and workflow and anonymous access.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Lightweight when compared to Bugzilla.
↑ Easy installation and usage.
↑ Source control integration (Git, SVN and CVS).
↑ Support on mobile devices.
X Fewer graphical reporting functions.
X No duplicate bug detection.
4.4.2. Bugzilla
Bugzilla [26] is an open source Mozilla Public Licensed [WP6-Cri-2] web-based defect tracking
system. It allows users to keep track of bugs and issues in projects effectively.
Bugzilla is written in Perl, and works on various databases including MySQL and Oracle. It
includes features such as: time tracking, private attachment and commenting, flexible reporting
and charting (including the ability to schedule reports and receive them via email) and the possibility
of adding custom fields and workflows.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Advanced search capabilities.
↑ Customized email notifications.
↑ Reports and charts.
↑ Duplicate bug detection.
↑ Customizable interface.
X Painful installation.
X Source control integration only through third-party support.
4.4.3. Trac
Trac [27] is an open source, BSD Licensed [WP6-Cri-2] project management and bug tracking system. Apart from issue tracking, it also provides a wiki and integration with Subversion. Trac is written in Python and its web interface is minimalist and easy to use. It also provides project management features including roadmap and milestone tracking.
Trac allows wiki markup in issue descriptions and commit messages, creating links and seamless
references between bugs, tasks, changesets, files and wiki pages. A timeline shows all current
and past project events in order, making the acquisition of an overview of the project and
tracking progress very easy. The roadmap shows the road ahead, listing the upcoming
milestones.
It provides an interface to Subversion and Git (or other version control systems), an integrated
Wiki and convenient reporting facilities.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Excellent integration with Subversion.
↑ User-friendly interface.
X Fewer features when compared to Bugzilla and Mantis.
4.5. Unit test system
To allow the automation of unit and individual regression tests, some additional artifacts will be defined. Strictly speaking, unit tests are tests at the "unit" level; in other words, the testing of individual methods or blocks of code without consideration of the surrounding infrastructure. In procedural programming, a unit could be an entire module, but it is more commonly an individual function or procedure. In object-oriented programming, a unit is often an entire interface, such as a class, but could be an individual method. In computer programming, unit testing is a method by which individual units of source code (sets of one or more computer program modules together with associated control data, usage procedures, and operating procedures) are tested to determine whether they are fit for use. Unit tests are typically run without the presence of physical resources that involve I/O, such as databases, socket connections or files. This is to ensure they run as quickly as possible, since quick feedback is important.
Unit tests are short code fragments, called test cases, created by programmers or occasionally by white-box testers during the development process. Additionally, developers should design the required testing methods so that automated testing is possible. Ideally, each test case is independent from the others. Substitutes such as method stubs, mock objects, fakes, and test harnesses can be used to assist testing a module in isolation. Unit tests are typically written and run by software developers to ensure that code meets its design and behaves as intended.
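As an illustration of testing a unit in isolation with a mock object, the sketch below uses Python’s standard unittest and unittest.mock modules; the function under test and its client interface are invented for the example:

```python
import unittest
from unittest import mock

def fetch_greeting(client):
    """Unit under test: it depends on an injected `client` object,
    so no real socket or server is needed to exercise it."""
    return client.get("/greeting").strip().upper()

class FetchGreetingTest(unittest.TestCase):
    def test_uses_a_mock_instead_of_the_network(self):
        fake = mock.Mock()                 # stand-in for a real HTTP client
        fake.get.return_value = " hello \n"
        self.assertEqual(fetch_greeting(fake), "HELLO")
        fake.get.assert_called_once_with("/greeting")

# unittest.main() would discover and run the test when executed as a script.
```

Because each test builds its own mock, the test cases stay independent and fast, in line with the requirements above.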
The most common implementations in this area are discussed in the next subsections.
4.5.1. Junit
JUnit [13] is an open source, Eclipse Public Licensed [WP6-Cri-2] regression testing framework used by developers to implement unit testing in Java, accelerate programming speed and increase the quality of code. It is an instance of the xUnit architecture for unit testing frameworks [13].
The JUnit test framework provides important features such as fixtures (a fixed state of a set of objects used as a baseline for running tests), test suites (a set of test cases that must be executed together), test runners (used for executing the test cases) and JUnit classes (important classes used in writing and testing JUnit tests).
The JUnit framework uses annotations to identify methods that specify a test. Typically these test methods are contained in a class which is only used for testing, typically called a Test class. JUnit assumes that all test methods can be executed in an arbitrary order; therefore, tests should not depend on other tests. To write a test with JUnit, you annotate a method with the @org.junit.Test annotation and use a method provided by JUnit to check the expected result of the code execution versus the actual result.
JUnit has a wide range of ports for different languages and is the established de-facto solution in
this area. The JUnit framework can be easily integrated with Eclipse, NetBeans, Ant and Maven.
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ More popular and with better tool support than TestNG.
↑ Uses annotations and marker interfaces.
X No support for group tests.
X No support for dependency checks.
4.5.2. TestNG
TestNG [14] is an open source, Apache Licensed [WP6-Cri-2] unit testing framework for the
Java programming language. While JUnit is the de-facto solution, TestNG attempts to offer the
additional features needed for enterprise applications. TestNG is inspired by JUnit and NUnit,
but adds new functionality that makes it more powerful and easier to use. TestNG introduces
new capabilities to unit testing such as:
Support for Java annotations
XML configuration file for test configuration
No required class extension or interface implementation
Support for dependent methods and groups
Support for parallel testing
Parameters for test methods
Arbitrary number of invocations plus success rate
TestNG is supported by a variety of tools and plugins like Eclipse, IDEA, Maven, etc. [14].
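As an illustration of TestNG's XML-based test configuration, a minimal testng.xml might look like the following sketch; the suite, group, parameter and class names are invented for the example:

```xml
<suite name="ExampleSuite">
  <parameter name="env" value="test"/>
  <test name="UnitTests">
    <!-- run only test methods belonging to the "unit" group -->
    <groups>
      <run>
        <include name="unit"/>
      </run>
    </groups>
    <classes>
      <class name="com.example.CalculatorTest"/>
    </classes>
  </test>
</suite>
```

Group selection and test parameters are configured here rather than in code, which is one of the flexibility points listed above.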
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Flexible test configuration.
↑ Support for data-driven testing.
↑ Support for parameters.
↑ Supported by a variety of tools and plug-ins.
↑ Support for group testing.
X No ports for other languages.
4.5.3. Spock
Spock [28] is an open source Apache Licensed [WP6-Cri-2] testing and specification framework.
Spock was conceived for Groovy applications; however it can be used for testing Java code as
well.
Spock is compatible with most IDEs, build tools, and continuous integration servers. There are
multiple ways to use Spock. Some of them are:
As part of a Gradle build (Groovy's build tool of choice).
In a Maven project (which can have both Java and Groovy code)
In an Eclipse project.
On the web with no installation at all.
There is a Groovy plugin in the Eclipse Marketplace that adds support (syntax highlighting,
code completion) for Groovy scripts, so once it is installed you are ready to go. However,
if Spock is expected to be used on a large enterprise application, the Maven way is recommended,
since it is also important to run Spock-powered unit tests as part of the build process (e.g. in
Jenkins).
Concerning the strengths (↑) and weaknesses (X) of this solution:
↑ Testing and mocking abilities in a single package.
X Spock is a newly developed tool.
X Less community support.
X More potential bugs compared to JUnit and TestNG.
4.6. Additional software components
Some of the tools described above require additional software to be installed first, such as:
- MySQL: database to support data repositories.
- Apache: open source web server.
4.7. Final Selection of Development Environment Tools
To decide on the final selection of tools, we used the selection criteria defined in
section 2.6.
We reviewed the CVS, SVN and GIT revision control systems. Each has its own versioning
architecture. Compared to the others, SVN has many features that make versioning operations
easy. It also has a variety of clients for the most common operating systems.
Gradle, Maven and Ant are the tools we reviewed for build automation. Gradle is the newest
of the three and has some nice features. Maven is not only a build tool, but also a management tool
for software projects. Build automation with Maven is simpler than with Ant; both require XML
configuration files.
It is important to choose tools that work well together. We reviewed three bug tracking
tools with similarly strong features, but Trac is the one with the best integration with
Subversion.
Jenkins, Hudson and CruiseControl are the CI tools we reviewed. CruiseControl is the
oldest and most stable one, but Jenkins and Hudson offer richer features. Jenkins has wider
community support and a greater variety of plugins than Hudson.
We investigated the JUnit, TestNG and Spock unit test systems. TestNG is the one supporting the
most features.
A summary of the evaluation of development tools with respect to applicable selection criteria is
shown in the table below.
Table 13: Selection Criteria of Development Environment Tools

Criteria Priority CVS SVN GIT Ant Maven Gradle Jenkins Hudson CruiseControl Mantis Bugzilla Trac JUnit TestNG Spock
WP6-Cri-1 1 X X X X X X X X X X
WP6-Cri-2 1 X X X X X X X X X X X X X X X
WP6-Cri-3 1 X X X X X X X X X
WP6-Cri-4 2 X X X X X X X X X X
WP6-Cri-5 3 X X X X X X X X X X X X X X X

After these considerations, the selected tools are the following:
Revision Control System: Subversion;
Build Automation System: Maven;
Continuous Integration: Jenkins;
Bug Tracking System: Trac;
Unit Test System: TestNG.
The selected tools support working together, and their combined usage helps in the
continuous integration process.
5. Quality Assurance and Security Procedures
5.1. Approach
The Coco Cloud development environment outlined in the previous sections needs a strategy for
evaluating and measuring the overall software quality, both at specific project milestones
(e.g. the deliverables) and continuously, in a day-to-day fashion, in order to
monitor progress or deficiencies and promptly react in case specific quality thresholds are
not met.
This chapter describes the steps taken in order to deliver a reliable, efficient, secure and
maintainable implementation of the Coco Cloud reference solution, mainly according to the
recommendations of the Consortium for IT Software Quality – CISQ [29], built upon ISO/IEC
25010 (and the older ISO/IEC 9126). We try to address as many recommendations as possible by
using automated processes during the continuous delivery pipeline of the build automation
system, with a particular focus on security aspects. In fact, security plays a major role in Coco
Cloud, and we must define mechanisms and actions to ensure that the built software meets a high
level of assurance.
For these reasons, both architects and developers shall follow architectural practices and good
coding techniques, by referring to well-known guidelines, as detailed in section 5.2; at the same
time, during the build phase, the quality of software is measured by inspecting the source code
by using automated static code analysis tools. This latter practice delivers continuous code
inspection that aims to find software issues as soon as possible (e.g. at every build or even at
every code commit).
By using this approach we expect to obtain the following results:
Allow developers to react immediately to reported issues in order to implement the
required/suggested fixes;
Measure the improvement of the project's code quality over time, by evaluating evolving KPI
trends during the whole development lifecycle (e.g. reduction of security issues);
Avoid critical bugs being found late in the SSDLC (Secure Software Development Life Cycle),
when development is (mostly) finished and resolution/rework costs are orders of
magnitude greater than when the bug was first introduced [30].
5.2. Software Code Guidelines
The CISQ Specifications for Automated Quality Characteristic Measures [31] provide helpful
hints on several quality rules that shall be evaluated during programming. These practices cover
sound principles at the unit, technological and system level [32] for the following Quality
Characteristics:
Reliability and resiliency: these factors are usually affected by failures in proper error-handling
mechanisms. In addition, data corruption can result from failures in the data-flow logic from one
software layer to another, where, due to bad practices, the proper interfaces, layers or
frameworks may be bypassed for direct and easier access.
Efficiency: performance issues usually appear at the system level, where the end user suffers
reduced productivity and the overall system consumes more IT resources than needed. Poor
object-oriented programming practices can also severely impact the solution.
Security: the prevention of security vulnerabilities is a prime concern for Coco Cloud. The CISQ
leverages the Common Weakness Enumeration (CWE) repository for a list of issues to watch for.
Other well-known sources that shall be followed are the OWASP Developer Guide [33], along
with their Top Ten lists of vulnerabilities (OWASP Top Ten 2013 [34], OWASP Top Ten Mobile
2014 [35] and OWASP Cloud Top Ten [36]) and the 2011 CWE/SANS Top 25 Most Dangerous
Software Errors [37].
Maintainability, Adaptability, Changeability: source code tidiness is a primary concern when it
comes to maintaining a piece of software over time. Moreover, failure to observe design and
architectural rules leads to all sorts of issues and contributes to creating "spaghetti code" or the
like [38].
In the following sections, a number of tools are analysed and selected in order to (partially)
automate the code quality inspection activity.
5.3. Evaluated static code analysers
Several static code analysers have been evaluated and are described in this section. They are
called static because they work while the code is not running. Most tools can (partially) address
more than one CISQ Quality Characteristic; sometimes they overlap, while at other times they
complement each other. They are oriented towards fighting bugs, security issues, design
flaws and code untidiness.
All the described tools are free or open source [WP6-Cri-2], support Java [WP6-Cri-3] (even if at
different version levels) and integrate with a build automation tool [WP6-Cri-1].
5.3.1. Android Lint
Android Lint is a static code analyzer tool from Google [39] with an Apache-based license
[WP6-Cri-2] which scans Android [WP6-Cri-4] Java projects [WP6-Cri-3]. It reports on:
Correctness [WP6-Cri-9] (potential bugs, e.g. checks for duplicate layout ids, hardcoded
references to sdcard, ProGuard config issues, etc.);
Security [WP6-Cri-7] (e.g. usage of world readable/writeable files, private keys packaged in the
app, bad usage of SecureRandom, etc.);
Performance (e.g. battery over-usage due to wakelocks keeping the screen always on, checks
for "overdraw" issues, detection of unused resources, etc.);
Usability and accessibility (e.g. issues with typography, icons, labels, etc.);
Translation (e.g. i18n issues like hardcoded texts, not using UTF-8, etc.).
This tool is bundled with the Android SDK, so the build automation system needs access to that
SDK (it supports Maven and Jenkins [WP6-Cri-1]). Being part of the Android SDK, it also
provides Eclipse and IntelliJ IDE integration [WP6-Cri-5].
5.3.2. CheckStyle
CheckStyle [40] is an open source LGPL-licensed [WP6-Cri-2] tool primarily aimed at enforcing
a Java coding standard for programmers. In particular, it can check for Oracle/Sun’s Code
Conventions for the Java Programming Language [WP-Qua-11], covering filenames, file
organization, indentation, comments, declarations, statements, white space, naming conventions
and programming practices [41]. It is a syntactical checking tool.
Checkstyle integrates with many IDEs like Eclipse, IntelliJ, NetBeans and BlueJ [WP6-Cri-5].
However, there is no off-the-shelf mobile (Android) support for checking the Google-suggested
guidelines.
It supports Java up to version 1.7 (as of CheckStyle v5.7) [WP6-Cri-3], but does not currently
support new features of Java 1.8 (e.g. lambda expression syntax). It supports build automation
tools (Maven, Ant, Jenkins/Hudson) [WP6-Cri-1].
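As an illustration, a minimal Checkstyle configuration enabling a few checks of the kinds mentioned above could look like the following sketch (module names are taken from the standard Checkstyle check set; the selection is arbitrary):

```xml
<?xml version="1.0"?>
<!DOCTYPE module PUBLIC
  "-//Puppy Crawl//DTD Check Configuration 1.3//EN"
  "http://www.puppycrawl.com/dtds/configuration_1_3.dtd">
<module name="Checker">
  <module name="TreeWalker">
    <module name="ConstantName"/>      <!-- naming conventions -->
    <module name="WhitespaceAround"/>  <!-- white space rules -->
    <module name="NeedBraces"/>        <!-- statement formatting -->
  </module>
</module>
```

A project would typically grow such a file towards the full Oracle/Sun convention set over time.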
5.3.3. FindBugs
FindBugs [42] is an open source LGPL-licensed [WP6-Cri-2] static code analyser that uses both
syntactical and simple data-flow (within one method) techniques [43]. It works on Java bytecode
and so does not require source code [WP6-Cri-10]. For this reason it is suitable for checking
third-party libraries and proprietary modules. However, in the context of a continuous build
system, where the code might not always compile cleanly, the availability of the bytecode
cannot be guaranteed, hence FindBugs might not be able to run at all times.
It is based on the concept of "bug patterns": code instances that are likely to be errors
[WP6-Cri-9]. The tool categorizes bug patterns into the following groups:
Bad practice: violations of recommended and essential coding practices (e.g. hash code and
equals problems, cloneable idiom, dropped exceptions, serializable problems and misuse of
finalize);
Correctness: apparent code mistakes (e.g. invocation of hashCode on an array, equals()
comparing different types, null pointer dereference, etc.);
Malicious code vulnerability: code that can be exploited (e.g. a finalizer method that should be
protected but is public, a field that should be final but is not, etc.);
Multithread correctness: issues in multithreaded applications (e.g. standard methods inherently
unsafe for multithreaded use, locks not released, etc.);
Performance: code that slows down execution (e.g. strings concatenated with +, explicit garbage
collection invocation, etc.);
Security: bugs that can result in security vulnerabilities (e.g. hardcoded passwords, path
traversal, possible SQL injections, etc.);
Dodgy code: code that is confusing, anomalous, or written in a way that lends itself to errors
(dead local stores, switch fall-through, unconfirmed casts and redundant null checks of values
known to be null).
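For illustration, the following self-contained snippet (class and method names are hypothetical) contains instances of two of the pattern categories above, a correctness issue and a performance issue, of the kind FindBugs is designed to flag:

```java
public class BugPatterns {

    // "Correctness" pattern: equals() across unrelated types is always
    // false, so such a comparison is almost certainly a mistake.
    public static boolean unrelatedEquals() {
        return "42".equals(Integer.valueOf(42)); // String vs Integer: false
    }

    // "Performance" pattern: concatenating with + in a loop re-copies
    // the whole string on every iteration...
    public static String slowConcat(int n) {
        String s = "";
        for (int i = 0; i < n; i++) s += i; // flagged concatenation pattern
        return s;
    }

    // ...while StringBuilder appends in place with the same result.
    public static String fastConcat(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append(i);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(unrelatedEquals());                   // false
        System.out.println(slowConcat(3).equals(fastConcat(3))); // true
    }
}
```

Both variants behave identically at small sizes, which is exactly why such patterns survive code review and are better caught by automated analysis.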
In addition, FindBugs can be extended with plug-ins; an interesting one is FindSecurityBugs
[44]. This plug-in provides a variety of checks for security vulnerabilities in the following areas:
Command Injection, XPath Injection, XML eXternal Entity (XXE), weak cryptography,
tainted inputs, predictable random numbers, specific library weaknesses, XSS in JSP pages,
SQL/HQL injection, ReDoS, path traversal [WP6-Cri-7].
It also supports several frameworks [WP6-Cri-6]: Spring MVC, Apache Tapestry, Struts,
JaxRS/WS, the classic J2EE web API, Apache Wicket.
It supports Java up to version 1.7 (as of FindBugs v2.0.3) [WP6-Cri-3], but does not currently
fully support Java 1.8 (due to changes in the classfile format).
FindBugs has native plug-in support for Eclipse, but it integrates also with other Eclipse-derived
IDEs [WP6-Cri-5].
Usually this tool works in conjunction with PMD and they complement each other, even though
some overlaps do occur. FindBugs integrates with build automation tools like Ant, Maven and
Jenkins/Hudson [WP6-Cri-1] and leverages their continuous integration dashboards (as PMD
does).
5.3.4. PMD
PMD [45] is an open source BSD-like licensed [WP6-Cri-2] source code analyser, focused on
finding bad programming practices, by using a syntactical approach. It needs source code to
perform its tasks. It finds common programming flaws [WP6-Cri-9] like:
Possible bugs: empty try/catch/finally/switch statements;
Dead code: unused local variables, parameters and private methods;
Suboptimal code: wasteful String/StringBuffer usage;
Overcomplicated expressions: unnecessary if statements, for loops that could be while loops;
Duplicate code: copied/pasted code means copied/pasted bugs.
PMD is based on the evaluation of a set of rules against the source code files. The PMD best
practices recommend starting with an incremental approach, by using the following rulesets:
Unusedcode: looks for unused or ineffective code (e.g. variable set but never used, private
method defined but never used, etc.);
Basic: collection of good practices (e.g. confused increments in loops, always true or false if
conditions, hard coded IP addresses, etc.);
Design: potential sub-optimal code implementations (e.g. define utility classes, default in switch
statements, deeply nested if, etc.);
Controversial (optional): rules on which consensus could not be reached; some of them may be
suppressed (e.g. constructor with empty body, usage of native code, etc.).
PMD has a small Android-specific set of rules as well [WP6-Cri-4] and a specific, but
minimal, set for security (Security Code Guidelines) [WP6-Cri-7].
It supports Java 1.8 (starting from v5.1.0) [WP6-Cri-3], JSP [WP6-Cri-6], JavaScript, XML,
XSL. Additionally it includes CPD, the Copy-Paste-Detector. CPD finds duplicated code in Java,
C, C++, C#, PHP, Ruby, Fortran, JavaScript.
PMD is integrated with several IDEs: JDeveloper, Eclipse, JEdit, JBuilder, BlueJ, CodeGuide,
NetBeans/Sun Java Studio Enterprise/Creator, IntelliJ IDEA, TextPad, Gel, JCreator, and Emacs
[WP6-Cri-5].
It integrates with build automation tools such as Maven, Ant and Jenkins/Hudson [WP6-Cri-1].
In Jenkins, a dashboard is available to graphically report the evolving trends of the discovered
issues. Jenkins also provides specific views (through the Jenkins DRY plug-in) for the Copy-
Paste-Detector (CPD).
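A sketch of how the incremental ruleset adoption described above might be wired into the Maven build (plugin version omitted; ruleset paths as used by the PMD 5.x distribution):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-pmd-plugin</artifactId>
  <configuration>
    <rulesets>
      <!-- incremental approach: start with these, add more later -->
      <ruleset>rulesets/java/unusedcode.xml</ruleset>
      <ruleset>rulesets/java/basic.xml</ruleset>
      <ruleset>rulesets/java/design.xml</ruleset>
    </rulesets>
  </configuration>
</plugin>
```

The controversial ruleset would be added (or individual rules suppressed) only after the team agrees on it.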
5.3.5. OWASP Dependency Check
OWASP Dependency Check [46] is a GPL-licensed [WP6-Cri-2] utility that identifies project
dependencies (e.g. third-party open source libraries) and checks whether there are any known,
publicly disclosed vulnerabilities [WP6-Cri-7]. This is in line with the OWASP Top Ten "A9 -
Using Components with Known Vulnerabilities"; since Java applications are usually built upon
many open source libraries that rarely get updated, this becomes a crucial security investigation.
It does not require library source code to be available [WP6-Cri-10].
This tool works by attempting to understand which dependencies are used in the application (by
gathering "evidence") and then uses this evidence to identify the Common Platform Enumeration
(CPE) [47] for each dependency. Once a CPE is identified, it gathers the associated Common
Vulnerability and Exposure (CVE) entries from the US NIST National Vulnerability Database [48].
The OWASP Dependency Check integrates with build automation tools, like Maven, Ant and
Jenkins/Hudson [WP6-Cri-1], supports Java [WP6-Cri-3], but currently does not integrate with
any IDE (as of v1.1.3).
Since it works on dependencies, it can be successfully used on mobile applications as well
[WP6-Cri-4], even if it has no specific Android support.
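A sketch of a Maven configuration enabling the dependency check during the build (plugin version omitted; `check` is the plugin's scanning goal):

```xml
<plugin>
  <groupId>org.owasp</groupId>
  <artifactId>dependency-check-maven</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- scans declared dependencies and reports known CVEs -->
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Bound this way, every build re-checks the dependency tree against the National Vulnerability Database.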
5.4. Evaluated code coverage tools
These kinds of tools attempt to measure the degree to which the developed system is
automatically tested by the unit test suite. The larger the coverage, the better the assurance of
having functionally exercised the application. Coverage tools work by running the code. The
most important code coverage criteria are:
Line, method and class coverage: respectively, a line (statement) has been executed, a method
has been invoked (i.e. at least one statement inside the method has been executed) and a class
has been exercised (i.e. at least one method in the class has been executed);
Branch coverage: whether each branch of a conditional control structure (e.g. if or
switch) has been executed.
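The distinction between the criteria can be illustrated with a minimal sketch (the method is hypothetical): a single call with b != 0 executes every line of safeDiv, yet covers only one of the two branches of the if; a second call with b == 0 is needed for full branch coverage:

```java
public class CoverageDemo {

    // Hypothetical method with a single conditional.
    public static int safeDiv(int a, int b) {
        int result = 0;
        if (b != 0) result = a / b; // one line, but two branches
        return result;
    }

    public static void main(String[] args) {
        // This call alone executes every line of safeDiv (full line
        // coverage) but takes only the true branch of the if.
        System.out.println(safeDiv(6, 3)); // 2
        // Full branch coverage additionally requires the false branch.
        System.out.println(safeDiv(6, 0)); // 0
    }
}
```

This is why branch coverage is a stricter criterion than line coverage.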
Code coverage addresses the CISQ Quality Characteristic of reliability [WP6-Cri-11].
5.4.1. Cobertura
Cobertura [49] is an open source GPL-licensed [WP6-Cri-2] tool for checking the code coverage
of a project's unit tests. It covers both line-level and branch-level coverage. It works by
instrumenting the unit-test Java bytecode off-line, so it does not depend on the unit testing
framework, but it modifies the bytecode before execution.
Cobertura supports Java [WP6-Cri-3], but currently does not support Java 1.8 (as of v2.0.3).
Furthermore, it integrates with Eclipse IDE [50] [WP6-Cri-5].
Cobertura integrates with build automation tools like Maven, Ant and Jenkins/Hudson [WP6-
Cri-1].
5.4.2. JaCoCo
JaCoCo [51] is an open source EPL-licensed [WP6-Cri-2] library for checking the code
coverage. Being a library it works as part of other tools, like Maven, Ant and Jenkins/Hudson
[WP6-Cri-1]. JaCoCo covers both line-level and branch-level coverage but, differently from
Cobertura, it works by instrumenting the unit-test Java bytecode on the fly via the Java agent
(JMX) technology: this makes JaCoCo faster than Cobertura.
JaCoCo supports Java 1.8 [WP6-Cri-3] as of v0.7.0 and has native integration with the Eclipse
IDE [52] [WP6-Cri-5].
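A sketch of the corresponding Maven configuration, attaching the on-the-fly agent before tests run and generating a coverage report afterwards (plugin version omitted):

```xml
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- attaches the on-the-fly instrumentation agent -->
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <execution>
      <id>report</id>
      <phase>test</phase>
      <goals>
        <!-- produces the HTML/XML coverage report -->
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Because the agent instruments classes as they are loaded, no separate off-line instrumentation step is needed.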
5.5. Final selection of Quality and Security Assurance Tools
The table below shows each tool and its ability to meet the selection criteria we defined:
| Criteria | Priority | Android Lint | CheckStyle | FindBugs | PMD | OWASP DepChk | Cobertura | JaCoCo |
| WP6-Cri-1 | 1 | X | X | X | X | X | X | X |
| WP6-Cri-2 | 1 | X (Apache) | X (LGPL) | X (LGPL) | X (BSD) | X (GPL) | X (GPL) | X (EPL) |
| WP6-Cri-3 | 1 | X | X | X | X+ (J8) | X | X | X+ (J8) |
| WP6-Cri-4 | 2 | X | | | X- (few) | X | | |
| WP6-Cri-5 | 3 | X | X | X | X | | X | X |
| WP6-Cri-6 | 3 | | | X | X | X | | |
| WP6-Cri-7 | 1 | X | | X+ (many) | X- (few) | X | | |
| WP6-Cri-8 | 2 | | X | | | | | |
| WP6-Cri-9 | 1 | X | | X | X | | | |
| WP6-Cri-10 | 3 | | | X | | X | | |
| WP6-Cri-11 | 3 | | | | | | X- (offline) | X+ (on-the-fly) |
Table 14: Selection Criteria of Quality and Security Assurance Tools
It is clear that each tool has its unique characteristics. While Android Lint is ideal for Android
development (a priority 2 requirement), FindBugs and PMD have the most overlaps.
FindBugs has a larger library of security checks (thanks to the FindSecurityBugs plug-in) and can
analyse Java bytecode, while PMD supports Java 1.8; we choose FindBugs because it better
covers the priority 1 requirements. CheckStyle helps maintain tidy code, which targets some
maintainability requirements, and is the only tool that addresses that priority 2 requirement
(WP6-Cri-8). For test coverage, JaCoCo guarantees support for Java 1.8 and instruments the
tests on the fly, while also having better documentation than Cobertura.
These considerations lead to the conclusion that the selected tools are: Android Lint, FindBugs,
CheckStyle and JaCoCo. The combined usage of these tools satisfies our stated selection
criteria and helps address (at least the most important) requirements of reliability, efficiency,
security and maintainability stated by the CISQ.
Even if other reviewed tools could play a role in improving software quality, we will start with
the selected pool and check during the project whether these tools fully satisfy our stated
needs or whether we need to look for improvements.
6. Next steps
This deliverable describes the WP6 test bed, which aims to put in place the tools, processes and
activities for delivering a testable system implementing the Coco Cloud framework architecture.
This document included the specification of the physical and virtual resources that will compose
the Development and the Integration/Testing environments. Software tools and integration
environment components have been selected. The selection has been mainly based on the
requirements collected from each pilot owner: Quiron for the healthcare use case, AGID for the
public administration use case, and SAP for the mobile use case. The common development
environment will include, among others, a unit test system for running automated regression
tests, a versioning system, a bug tracking system, and an issue tracking tool.
Additionally, this document provided the definition of the quality management processes
and common development practices that will be enforced during the whole development life
cycle. This process includes the preparation of a quality plan, the selection of the programming
languages, and the provision of coding guidelines and programming tools.
The next activities in WP6 for this first year will be focused on setting-up and managing the
environments described in this deliverable.
7. References
[1], OpenStack main site: http://www.openstack.org/
[2], Swift, http://docs.openstack.org/developer/swift
[3], Keystone, http://docs.openstack.org/developer/keystone/
[4], Horizon, http://docs.openstack.org/developer/horizon/
[5], Ceilometer, http://docs.openstack.org/developer/ceilometer/
[6], http://www.nongnu.org/cvs/#TOCintroduction
[7], Collins-Sussman, Ben; Greg Ward (September 2004). "Subversion Users: Re: Performance
(Subversion vs. CVS)". subversion-users. Retrieved 03.04.2014
[8], https://git.wiki.kernel.org/index.php/GitSvnComparsion, Retrieved 03.04.2014
[9], http://agile.dzone.com/articles/version-control-git-vs-svn, Retrieved 03.04.2014
[10], http://subversion.apache.org/faq.html#portability, Retrieved 03.04.2014
[11], https://wiki.jenkins-ci.org/pages/viewpage.action?pageId=53608972, Retrieved 03.04.2014
[12], http://www.mantisbt.org/, Retrieved 03.04.2014
[13], http://junit.org, Retrieved 03.04.2014
[14], http://testng.org, Retrieved 03.04.2014
[15], Martin Fowler. Continuous Integration. (2006) Article published online at:
http://martinfowler.com/articles/continuousIntegration.html#PracticesOfContinuousIntegration,
Retrieved 03.04.2014
[16], http://ant.apache.org, Retrieved 03.04.2014
[17], http://www.gradle.org, Retrieved 03.04.2014
[18], http://www.nongnu.org/cvs/, Retrieved 03.04.2014
[19], http://subversion.apache.org/, Retrieved 03.04.2014
[20], http://git-scm.com/, Retrieved 03.04.2014
[21], http://maven.apache.org , Retrieved 03.04.2014
[22], http://gradle.org , Retrieved 03.04.2014
[23], http://hudson-ci.org/ , Retrieved 03.04.2014
[24], http://jenkins-ci.org/ , Retrieved 03.04.2014
[25], http://cruisecontrol.sourceforge.net/ , Retrieved 03.04.2014
[26], http://www.bugzilla.org , Retrieved 03.04.2014
[27], http://trac.edgewall.org , Retrieved 03.04.2014
[28], http://code.google.com/p/spock , Retrieved 03.04.2014
[29], CISQ, http://it-cisq.org/wp-content/uploads/2012/09/CISQ-Specification-for-Automated-
Quality-Characteristic-Measures.pdf
[30], B. Boehm and V. Basili, “Software Defect Reduction Top 10 List,” IEEE Computer, vol.
34(1): 135-137, January 2001, http://www.cs.umd.edu/~basili/publications/journals/J81.pdf
[31], CISQ, http://it-cisq.org/wp-content/uploads/2012/09/CISQ-Specification-for-Automated-
Quality-Characteristic-Measures.pdf
[32], OMG, http://www.omg.org/CISQ_compliant_IT_Systemsv.4-3.pdf
[33], OWASP, https://www.owasp.org/index.php/Category:OWASP_Guide_Project
[34], OWASP, https://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project
[35], OWASP, https://www.owasp.org/index.php/Projects/OWASP_Mobile_Security_Project_-
_Top_Ten_Mobile_Risks
[36], https://www.owasp.org/index.php/Category:OWASP_Cloud_%E2%80%90_10_Project
[37], MITRE, http://cwe.mitre.org/top25/
[38], http://en.wikipedia.org/wiki/Spaghetti_code
[39], http://developer.android.com/tools/help/lint.html
[40], http://checkstyle.sourceforge.net/
[41], http://www.oracle.com/technetwork/java/codeconv-138413.html
[42], http://findbugs.sourceforge.net/
[43], Nick Rutar, Christian B. Almazan, and Jeffrey S. Foster. 2004. A Comparison of Bug
Finding Tools for Java. In Proceedings of the 15th International Symposium on Software
Reliability Engineering (ISSRE '04). IEEE Computer Society, Washington, DC, USA, 245-256,
http://www.cs.umd.edu/~jfoster/papers/issre04.pdf
[44], http://h3xstream.github.io/find-sec-bugs/
[45], http://pmd.sourceforge.net/
[46], https://www.owasp.org/index.php/OWASP_Dependency_Check
[47], CPE, http://nvd.nist.gov/cpe.cfm
[48], http://nvd.nist.gov
[49], http://cobertura.github.io/cobertura/
[50], http://ecobertura.johoop.de/
[51], http://www.eclemma.org/jacoco/
[52], http://www.eclemma.org/