© LDRA 2017 LDRA tool suite v9.7.0 August 2017
Simplifying Functional Safety Certification with the ARM Keil µVision®5 IDE and the LDRA tool suite®
1
2
Goals
▪Starting with some simple requirements written in Word and Excel, show how a project can be developed using the Keil µVision®5 IDE, targeting a low-cost STM32 Discovery board (ARM Cortex-M0)
▪Verify the traceability between all the requirements and the code
▪Verify every high level and low level requirement by running tests on the target
▪Verify that the code is MISRA C:2012/AMD1 compliant and also clear, maintainable and testable
▪Verify that 100% structural coverage has been attained
3
Prerequisites
▪ LDRA tool suite V9.7.0 or above
▪Keil µVision 5 IDE (Tested with V5.24a)
▪STM32F051RB-Discovery Board (ARM Cortex-M0) with on board ST-Link
Preparation
4
5
Keil µVision5
▪ It is assumed that Keil µVision5 is installed
▪A valid license is not required for Keil µVision5, since the generated code for the simple project that we will be testing does not exceed the 32K limit
6
Device Drivers
▪Check that the appropriate device drivers (32-bit or 64-bit) are installed
7
STM32F051-Discovery
▪Using the Pack Installer, verify that the STM32F051 Discovery board is loaded
8
LDRA tool suite
▪ It is assumed that the LDRA tool suite is installed and licensed
9
Environment Variables
▪ If you have both Keil µVision5 and ARM DS-5 installed, then there may be a conflict with the ARM_PRODUCT_PATH, ARM_TOOL_VARIANT and ARMLMD_LICENSE_FILE environment variables
▪ This can be resolved by running a batch file with the following commands:
setx ARM_PRODUCT_PATH ""
setx ARM_TOOL_VARIANT ""
setx ARMLMD_LICENSE_FILE ""
10
Install the TLP (Target License Package)
▪ The TLP allows the LDRA tool suite to work with Keil µVision5
▪Run the Keil µVision5 ARM TLP installer
▪Select the following options
11
Installation Path
▪Select the path where Keil µVision5 is installed
12
Project Extension
▪Accept the default project extension
13
Build Method
▪Select COMPILE
14
Device
▪Select STM32F051R8
If you don’t have a target, then select ARMCM0, which uses the ARM Cortex-M0 simulator.
15
CMSIS
▪Enter path of the CMSIS to be used
16
Keil_UV5_STM32F051_Safe_Utilities
▪Once installed, locate the following workspace
If you selected ARMCM0, then open the following project:
17
Project
▪Double-click on the .uvprojx file to load the project into Keil µVision5
18
LDRA Target
▪Ensure that the target “LDRA” is selected
19
ldra.ini
▪ The execution will be controlled via an initialization file called ldra.ini; this will be auto-generated by a batch file
20
Rebuild
▪Rebuild the executable
21
Settings
▪Open the Debug Settings
22
FLASH
▪ If the FLASH Programming Algorithm is not set, then add it
This is not required when the target is ARMCM0, which uses the ARM Cortex-M0 simulator.
23
Run the Executable
▪Connect the STM32F051-Discovery board to the PC via the USB port
▪Download and execute
24
ST-Link
▪ If there is a problem connecting to the ST-Link, then first check the following:
▪ The board is connected!
▪ The drivers have been installed
▪ If a problem still remains, then install the ST-LINK Utility and upgrade the ST-Link firmware
25
exit
▪ Finally stop the execution and close the Keil µVision5 IDE
Requirements
26
27
Requirements
▪Open the Requirements folder
28
System Level Requirements
▪ This is a very simple document that contains just one requirement that starts with SYS_
29
High Level Requirements
▪ This document contains High Level Requirements that all start with HLR_ and which have a link to a System Level Requirement
30
Styles
▪Note how in this document, styles have been used to help identify requirements ex:
31
Low Level Requirements
▪ The Low Level Requirements are in an Excel document and contain links to High Level Requirements
32
Restore_Initial_State.bat
▪Run the Restore_Initial_State.bat file; this will create a new TBmanager project and, in case the tutorial has already been started or completed, will delete any existing results
33
TBmanager
▪Open the newly created TBmanager project by double-clicking on the .tbp file
34
UniView
▪By default the UniView should be shown
35
Groups
▪ The following groups have been added:
▪ SLR: System Level Requirements
▪ HLR: High Level Requirements
▪ LLR: Low Level Requirements
▪ HLT: High Level Tests
▪ LLT: Low Level Tests
36
Import System Level Requirements
▪We want to import the System Level Requirements from the Word document into the Group SLR
▪ First switch to the documents view, where we can see that the Requirements documents have already been added
37
Import from Word Document…
▪Select the Safe_Utilities_SLR.docx file, right-click and select “Import from Word Document…”
38
Regular Expressions
▪Note how regular expressions have previously been created to extract the System Level Requirements
39
Preview
▪Click on Refresh Preview to see that the regular expressions have correctly identified the requirement: number, name and body
▪Note also that the requirements will be imported into the group SLR
40
Import
▪Click on OK to first view the requirements, then on OK again to import them
41
Imported Requirement
▪We should now be able to see the imported requirement in the Project Tree
42
Import High Level Requirements
▪Next we want to import the High Level Requirements from the Word document into the Group HLR
▪Select the Safe_Utilities_HLR.docx file and “Import from Word Document…”
43
Rule 1: Requirement_ID Style
▪ This time since styles have been used, it is much easier to identify the requirements. Different rules have been created for each style ex:
44
Rules 2 & 3
▪Similarly rules have been created for the styles Requirement_Text and Traceability_Data
45
Preview
▪As before, refresh the preview to check the rules
▪Check also that the requirements will be imported into the group HLR
46
Import
▪Click on OK to first view the requirements, then on OK again to import them
47
Imported Requirements
▪We should now be able to see the imported requirements in the Project Tree, nested below SYS_100, since they all have a link to that requirement
48
Import Low Level Requirements
▪Next we want to import the Low Level Requirements from the Excel document into the Group LLR
▪Select the Safe_Utilities_LLR.xlsx file, right-click and select “Configure .xlsx Format…”
49
Columns
▪Each column is identified as a specific attribute ex: requirement: number, name, body, …
50
Group LLR
▪ The first row is ignored and the requirements will be imported into the group LLR
51
Get Requirements from File
▪Click OK to close this menu, then to import the requirements, select “Get Requirements from File”
52
Imported Requirements
▪We should now be able to see the imported requirements in the Project Tree
53
Relationships View
▪Switch to the relationships view
54
SLR Requirements
▪Right-click in the 1st column and add all the requirements from the group: SLR
55
Traceability
▪Now we can observe the upstream and downstream traceability by clicking on any requirement
56
UniView
▪Switch to the UniView view
57
Traceability Matrix
▪Right-Click on the HLR group and select “Traceability Matrix Report to Requirement”, then select Group: LLR
58
High level Requirement <Not Covered>
▪We can see clearly that one High Level Requirement is not covered!
Traceability to Code
59
60
Source Code
▪We now need to look at the traceability between the source code and the Low Level Requirements
▪ First we need to analyse the source code
▪ In the Project Tree, right-click on Source Code and select “Add Compiler Project…”
61
Keil_UV5_STM32F051_Safe_Utilities
▪ Locate the Keil_UV5_STM32F051_Safe_Utilities project and open it
62
Set
▪ This will now create a “Set”; accept the following
63
Analysis
▪We now need to analyse this set of source files
▪ If we click on the menu “Analyse Procedures”, it would run just the minimum analysis needed to identify all the functions. However, in our case we want to do a deep analysis in order to also check compliance with MISRA C:2012/AMD1, so we need to select the option “Run LDRA Testbed Static Analysis”
64
Analysis
▪Click on the System Set, then hold the control key and right-click to select the following menu
65
Map Source View
▪When the analysis completes, select the Map Source View and expand each file
66
Map Source Code
▪Map each function (except main) to the appropriate Low Level Requirement by dragging and dropping
67
Relationships View
▪Switch to the Relationships view and note that every function should now trace back to a requirement
Tests
68
69
Tests
▪Before testing against the requirements, we also want to perform the following tests:
▪ Check that the code is compliant with MISRA C:2012/AMD1
▪ Check that the code is clear, maintainable and testable
▪ Then after testing against the requirements, we will want to perform the following test:
▪ Check that we have 100% Structural Coverage
70
TCI Grid View
▪Switch to the TCI Grid View
71
Map TCIs to Set
▪Map each TCI (Test Case Identifier) to the System Set by dragging and dropping ex:
72
Map Source View
▪Switch to the Map Source View and the TCIs should be shown
73
Code Review
▪Right-click on the TCI_CodeReview and select “Verify with LDRA tool suite…”
74
Code is Not Compliant
▪ The Green dots indicate that there are no Mandatory or Advisory violations, but there are some Required violations
75
Callgraph
▪Drag the System Set onto the Output Callgraph placeholder
76
Callgraph – Programming Standards
▪Double-click to open the callgraph
77
Violations
▪Clicking on a function highlights the coding standard violations
78
Code Review Report
▪Alternatively double-click on the Code Review Report
79
View Results with LDRA tool suite
▪Or view the results with the LDRA tool suite
80
Quality Review
▪Next verify the quality of the code
▪Right-click on the TCI_QualityReview and select “Verify with LDRA tool suite…”
81
Code is Clear, Maintainable and Testable
▪ The Green dots indicate that the quality of the code is good: it is clear, maintainable and testable. All the measured metrics are within the specified thresholds
82
Callgraph
▪Drag the System Set onto the Output Callgraph placeholder, then double-click to open the Callgraph
83
Maintainability View
▪Select the Maintainability View
84
Sort Metrics
▪Clicking on the column title sorts the metrics by value, making it easy to locate the most complex function
▪ The flowgraph for each function can be viewed ex:
85
Flowgraph
High Level Tests
86
87
High Level Tests
▪Next we want to verify the High Level Requirements
▪ The Safe_Utilities_HLT.xlsx file contains just a single High Level Test
88
High Level Tests
▪ For this test, we are simply going to execute main, which exercises every function, and afterwards check the structural coverage
89
Import High Level Tests
▪Switch to the Documents View, select the Safe_Utilities_HLT.xlsx file and “Get Test Cases from File”
90
Set External Task
▪Set the External Task to be the following
91
Build Instrumented
▪As the code executes, we need to measure the structural coverage. In order to do this, we first need to instrument the source code and then perform a build
▪ This can be done by executing the batch file Build_Instrumented.bat
▪Click on the following menu to do this
92
Verify with External Task
▪Now we can right-click on the HLT and verify with External Task
93
Execution History
▪At the end of the execution, the compressed execution history is uploaded to the host
▪ This can now be processed by running the batch file Get_Coverage.bat
▪Click on the following menu to do this
94
Code Coverage
▪Now switch to the Map Source View and verify the code coverage
95
Callgraph
▪Drag the System Set onto the Output Callgraph placeholder
▪Double-click to view the pass/fail coverage Callgraph
96
Callgraph
▪As expected, every function has been invoked, but of course we don’t have 100% coverage
97
Flowgraph
▪View the Coverage Pass/Fail Flowgraph for the function safe_uncompress
98
Defensive Programming
▪Since the code checks that the parameters aString and anArray are not NULL, we don’t have 100% structural coverage
▪ In order to test this “defensive programming”, we will need to perform unit testing
Low Level Tests
99
100
Low Level Tests
▪Next we want to verify the Low Level Requirements
▪ The Safe_Utilities_LLT.xlsx file contains the Low Level Tests
101
Import Low Level Tests
▪Switch to the Documents View, select the Safe_Utilities_LLT.xlsx file and “Get Test Cases from File”
102
Associated Test Case File (.tcf)
▪With each Low Level Test, there is an associated Test Case File which contains a sequence of test cases
▪ TCF files can be regressed using TBrun
▪Select the following Low Level Test and “Verify Test Interactively in TBrun...”
103
TBrun
▪ For each Test Case, we can see the list of inputs and expected outputs
104
Run
▪ The Test Cases can now be compiled, linked and executed on the target
▪ The Test Cases all pass with 100% coverage
105
Test Passes
▪Exiting TBrun will update the status of the Low Level Tests in TBmanager
▪Now verify the next Low Level Test
106
Failed Test Case
▪ This time the Test Case fails
107
Failed Test Case
▪ This is exactly why we unit test: the function safe_uncompress does not work correctly and must be modified
108
Regression Report
▪Viewing the Regression Report shows why the test case failed
109
Fail
▪Since the Test failed, it has a red dot
110
TCI Grid
▪Switch to the TCI Grid
111
Filter
▪Press “Control + L” to create the following filter
112
2nd Filter
▪Press “+” to create a second filter as shown
113
Regress All Unverified Tests
▪Press “Control + A” to select all the unverified Test Cases, then regress them
114
One Unverified Test Case Identifier
▪Once the filter is refreshed, there should now be just the single unverified Test Case Identifier
115
Code Coverage
▪Now that all the unit tests have been run, check that we have 100% structural coverage
116
Objectives
▪Next import the objectives from a standard
117
ISO 26262
▪Any standard can be imported, but for the moment, select ISO 26262 and click OK
118
Standards
▪ The existing standards can be customised or additional standards added ex:
119
Objectives View
▪Switch to the Objectives View
120
Placeholders
▪ For each objective, it is possible to add placeholders for all the various artifacts / assets that need to be created or produced in order to satisfy it
121
Placeholders
▪Expand the objective “Part 6: Section 5: Table 1: 1a”
▪Note the placeholders for Artifacts that need to be created/produced in order to satisfy this objective
122
Realising Objectives
▪As the Artifacts/Assets get produced, they can be used to satisfy the placeholders
▪Right-click on each placeholder and associate the appropriate document ex:
123
Satisfied Placeholders
124
Fulfilled Objective
▪Once the associated documents have been reviewed, the status of the objective can be changed to Fulfilled
125
Objectives Report
▪Generate an Objective Summary Report and navigate from it to view the documents
126
Reports
▪ Finally, reports such as the Project Coverage Detailed Report can be generated
127
Command Line
▪Everything that was performed manually in this tutorial can be automated from a batch file
▪Close TBmanager and try running the following batch file
128
Running From Jenkins
▪ The same batch file can be invoked from a continuous integration server such as Jenkins ex:
129
Jenkins
▪All the reports are published into the Jenkins workspace and can be easily viewed:
Summary
130
131
Summary
▪ In this simple example, we have seen:
▪ How the traceability from requirements to code can be verified
▪ How the code in a Keil µVision5 project can be analysed and checked for MISRA C:2012/AMD1 compliance as well as ensuring that it is clear, maintainable and testable
▪ How the High Level Requirements and Low Level Requirements can be verified by executing tests on the target
▪ How the Structural Coverage can be measured
▪ How everything can be automated from a batch file
132
For More Information
▪ For more information, please view the following tutorials