Effectiveness Measurement in ESA Space Projects - ISVV Process Overview

 
ISVV Effectiveness Measurement in ESA Space Projects
Pedro A. Barrios, Maria Hernek, Marek Prochazka
European Space Agency
NASA IV&V Workshop, 11-13 September 2012
 
Objective / Outline
Objective
Present the results of an ESA study to assess the effectiveness of the ISVV process carried out in the scope of ESA missions
Assessment of past ISVV projects, with the following final objectives:
Identify what is useful in the ISVV process (i.e. what brings results)
Identify what needs to be improved (i.e. added/removed/clarified/...)
Make unified metrics collection an integrated part of the process
Outline
ESA ISVV process: a quick overview
ISVV metrics definition
ISVV metrics collection & analysis
Conclusions and future work
 
Independent Software Verification & Validation (ISVV) by ESA
1. ISVV is required for Mission and Safety Critical software (ECSS-E-40/ECSS-Q-80)
2. ISVV tasks are additional and complementary to the nominal SW supplier's verification and validation tasks
3. ISVV tasks cover verification and validation of software requirements, design, code and tests (typically starting at SW-SRR and finishing before the SW-QR)
4. The ISVV supplier is required to be an organization independent of the software supplier as well as the prime/system integrator (full technical, managerial, and financial independence)
5. Most ESA projects implement the ISVV process as an industrial contract placed by the Prime contractor
 
ESA ISVV Process overview
6 activities/STAGES: Management (MAN), Verification (IVE) and Validation (IVA)
Activities are composed of TASKS, and these are further split into SUBTASKS
1. Management (MAN.PM and MAN.VV) is concerned with issues such as ISVV objectives and scope, planning, roles, responsibilities, budget, communication, competence, confidentiality, schedule and ISVV level definition (to limit the scope of ISVV)
2. Technical Specification Analysis (IVE.TA) is verification of the software requirements
3. Design Analysis (IVE.DA) is verification of the SW Architectural Design and the Software Detailed Design
4. Code Analysis (IVE.CA) is verification of the SW source code
5. Validation (IVA) is testing of the SW to demonstrate that the implementation meets the technical specification
 
ESA ISVV Process overview
Example of a Task/Subtask description
Activity: Technical Specification Analysis
Task: SW Requirements Verification
Subtasks: T1.S1, T1.S2 ... T1.S11
Start/End Events, Inputs/Outputs and Methods are identified for each subtask
Some numbers:
IVE.TA: 1 task, 11 subtasks
IVE.DA: 3 tasks, 15/12/5 subtasks
IVE.CA: 3 tasks, 10/5/3 subtasks
IVA: 3 tasks, 3/3/3 subtasks
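For reference, the stage/task/subtask counts quoted above can be captured in a small lookup structure like the sketch below (illustrative only; the short task keys are shorthand, not official identifiers):

```python
# Subtasks per task for each stage, as quoted above (task keys are shorthand).
ISVV_STRUCTURE = {
    "IVE.TA": {"T1": 11},                     # 1 task,  11 subtasks
    "IVE.DA": {"T1": 15, "T2": 12, "T3": 5},  # 3 tasks, 15/12/5 subtasks
    "IVE.CA": {"T1": 10, "T2": 5,  "T3": 3},  # 3 tasks, 10/5/3 subtasks
    "IVA":    {"T1": 3,  "T2": 3,  "T3": 3},  # 3 tasks, 3/3/3 subtasks
}

total_subtasks = sum(n for tasks in ISVV_STRUCTURE.values() for n in tasks.values())
print(total_subtasks)  # 70 subtasks in total across IVE.TA, IVE.DA, IVE.CA and IVA
```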
 
ESA ISVV Process overview
IVE: Technical Specification Analysis
TA.T1: Software Requirements Verification
Subtasks: To verify
Software Requirements external consistency with the system requirements
Interface Requirements external consistency with the system requirements
software requirements correctness
consistent documentation of the software requirements
software requirements completeness
dependability and safety requirements
readability of the software requirements
timing and sizing budgets of the software requirements
Identify test areas and test cases for Independent Validation
that software requirements are testable
software requirements conformance with applicable standards
 
ESA ISVV Process overview
IVE: Design Analysis
DA.T1: Architectural Design Verification
Subtasks: To verify
SW architectural design external consistency with Technical Specification
SW architectural design external consistency with Interface Control Documents
interfaces consistency between different SW components
architectural design correctness
architectural design completeness
dependability & safety of the design
readability of the architectural design
timing and sizing budgets of the software
Identify test areas and test cases for independent Validation
architectural design conformance with applicable standards
if models are produced by the SW suppliers:
Verify the test performed on the high level model
Verify the development and verification and testing methods and environment
then construct model test cases & model test procedures
then execution of model test procedures
 
ESA ISVV Process overview
IVE: Design Analysis
DA.T2: Detailed Design Verification
Subtasks: To verify
detailed design external consistency with Technical Specification
detailed design external consistency with Interface Control Documents
detailed design external consistency with Architectural Design
interfaces consistency between different SW components
detailed design correctness
detailed design completeness
dependability & safety of the design
readability of the detailed design
timing and sizing budgets of the software
accuracy of the model (in case models are produced by the SW suppliers)
Identify test areas and test cases for independent Validation
Verify detailed design conformance with applicable standards
DA.T3: Software User Manual Verification
Subtasks: To verify
timing and sizing budgets of the software
that dependability & safety aspects on the product are specified in the SUM
readability of the User Manual
completeness of the User Manual
correctness of the User Manual
 
ESA ISVV Process overview
IVE: Code Analysis
CA.T1: Source Code Verification
Subtasks: To verify
source code external consistency with Technical Specification
source code external consistency with Interface Control Documents
source code external consistency with Architectural Design and Detailed Design
interfaces consistency between different SW units
source code correctness with respect to technical specification, architectural design and detailed design
source code readability, maintainability and conformance with the applicable standards
dependability & safety of the source code
source code accuracy
Identify test areas and test cases for independent Validation
timing and sizing budgets of the software
 
ESA ISVV Process overview
IVE: Code Analysis
CA.T2: Integration Test Specification and Test Data Verification
Subtasks: To verify
consistency with Technical Specification
consistency with Software Architectural Design
integration test procedures correctness and completeness
If models are produced by the SW suppliers, then evaluate model verification and validation test results
integration test reports
CA.T3: Unit Test Procedure and Test Data Verification
Subtasks: To verify
consistency with Software Detailed Design
unit test procedures correctness and completeness
unit test reports
 
ISVV effectiveness metrics
The key goal of this activity is to estimate the effectiveness of the ISVV process carried out in the scope of ESA projects
The major objective is to provide measurements and conclusions to support identification and prioritization of ISVV activities based on their 'efficiency'
Improving the ISVV process is an additional objective
ISVV effectiveness is calculated based on the number of findings and their acceptance and impact
Based on the number of findings, the following metrics are computed: findings per ISVV stage / task / subtask; findings per severity; findings per type; and effective findings.
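As an illustration of how these counts could be derived from the raw findings, here is a minimal sketch (not the study's actual tooling; field names and example records are hypothetical):

```python
from collections import Counter

# Hypothetical finding records; the real data were collected in an Excel tool.
findings = [
    {"stage": "IVE.TA", "task": "TA.T2", "subtask": "S1", "severity": "major",
     "type": "correctness", "effective": True},
    {"stage": "IVE.DA", "task": "DA.T2", "subtask": "S4", "severity": "minor",
     "type": "completeness", "effective": True},
    {"stage": "IVE.CA", "task": "CA.T2", "subtask": "S5", "severity": "minor",
     "type": "consistency", "effective": False},
]

findings_per_stage    = Counter(f["stage"] for f in findings)
findings_per_task     = Counter((f["stage"], f["task"]) for f in findings)
findings_per_subtask  = Counter((f["stage"], f["task"], f["subtask"]) for f in findings)
findings_per_severity = Counter(f["severity"] for f in findings)
findings_per_type     = Counter(f["type"] for f in findings)
effective_findings    = sum(1 for f in findings if f["effective"])

print(findings_per_stage, findings_per_severity, effective_findings)
```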
 
Measurement Process
3-step activity: ISVV metrics definition / ISVV metrics collection / ISVV metrics assessment
Industrial context:
Measurement needs and processes initiated by ESA
Provision of metrics performed through different small contracts granted by ESA to different ESA ISVV suppliers
Data collection, data analysis and metrics calculation performed by an ESA contractor for this activity
 
Measurement Process
Data gathering, with the following contents:
SW product metrics (size in kLOC, number of requirements, criticality)
ISVV project metrics (ISVV level, ISVV scope and stages, documentation quality at reviews)
Findings (task, subtask, which document, type, severity, use of tools, acceptance, impact measured in number of changes)
Note: an Excel tool was used
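A minimal sketch of what one collected record might look like (field names are hypothetical; the study used an Excel sheet, not code):

```python
from dataclasses import dataclass

@dataclass
class FindingRecord:
    """One row of a (hypothetical) collection sheet describing a single ISVV finding."""
    project: str          # mission / project name
    product: str          # SW product the finding belongs to
    size_kloc: float      # SW product size in kLOC
    criticality: str      # SW criticality category
    isvv_level: str       # ISVV level applied to the product
    stage: str            # "IVE.TA", "IVE.DA", "IVE.CA", ...
    task: str             # e.g. "TA.T1"
    subtask: str          # e.g. "S5"
    document: str         # document the finding was raised against
    finding_type: str     # correctness, completeness, consistency, ...
    severity: str         # major, minor, comment / very low
    tool_usage: str       # manual, semi-automated, automated
    accepted: bool        # whether the finding was accepted
    impact_changes: int   # impact measured in number of changes
```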
 
Measurement Process
15 products from 5 projects
4 different ISVV suppliers
The IVE effectiveness metrics are assessed:
o per product
o per SW products of similar size
o in total, i.e. in all projects and SW products considered
Analysis is performed:
o per all stages
o per ISVV project stage
o per ISVV task / subtask
Findings per stage/task/sub-task, per severity, per type, effective findings & tools usage
Note: Only one product is classified as small
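The breakdowns above (per product, per size class, in total, per stage) amount to simple group-bys over the collected records, for example (pandas-based sketch with hypothetical columns and values):

```python
import pandas as pd

# Hypothetical findings table with the columns gathered during data collection.
findings = pd.DataFrame([
    {"product": "P1", "size": "big",    "stage": "IVE.TA", "severity": "major", "effective": True},
    {"product": "P1", "size": "big",    "stage": "IVE.CA", "severity": "minor", "effective": True},
    {"product": "P2", "size": "medium", "stage": "IVE.DA", "severity": "minor", "effective": False},
])

per_product    = findings.groupby("product").size()                    # findings per product
per_size_stage = (findings.groupby(["size", "stage"]).size()
                  .unstack(fill_value=0))                              # findings per size class and stage
total          = len(findings)                                         # total over all products and projects
print(per_product, per_size_stage, total, sep="\n")
```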
 
ISVV metrics collection & analysis  (1/10)
Total Findings
The total number of IVE findings for the 15 products within this analysis is 2492
No clear relationship between findings & product size
(Chart: findings per product with mean & standard deviation; red = big products, blue = medium, green = small)
 
ISVV metrics collection & analysis  (2/10)
Findings per stage
TA: Technical Specification Analysis; DA: Design Analysis; CA: Code Analysis
Although there is some variability per product, the share of findings is roughly one third for each of the three stages (TA 39%, DA 28%, CA 33%)
The majority of findings are at the TA stage for big products, and at the CA stage for small products
 
ISVV metrics collection & analysis  (3/10)
Findings per task (TA: Technical Specification Analysis)
The majority of findings at the TA stage are at the TA.T2 task (Software Requirements Verification) for all products, all projects and all product sizes, with only one exception.
As the size of products decreases, more findings are discovered at the TA.T2 task
 
ISVV metrics collection & analysis  (4/10)
Findings per task (DA: Design Analysis)
The majority of findings at the DA stage are either at the DA.T2 task (Architectural Design Verification) or at the DA.T4 task (Detailed Design Verification), depending on the product.
(Charts: total share of findings for the DA stage; findings per product for DA tasks; findings per size for DA tasks)
 
ISVV metrics collection & analysis  (5/10)
Findings per task (CA: Code Analysis)
Overall, the majority of findings at the CA stage are at the CA.T2 task (Source Code Verification); beyond that it varies product by product. CA.T3 (IT tests Verification) also represents a big share.
(Charts: total share of findings for the CA stage; share of findings per product for CA tasks; findings per size for CA tasks)
 
ISVV metrics collection & analysis  (6/10)
Findings per sub-task (e.g. TA subtasks)
T1: Requirements Traceability Verification; T2: Software Requirements Verification
Exact numbers are available for all the subtasks
Some subtasks produce a reduced number of findings. Three possible cases: the subtask was not performed within the ISVV project, the subtask did not produce findings, or data are not available for the subtask
 
ISVV metrics collection & analysis  (7/10)
Findings per severity
Most findings are minor (58%); major findings account for 36%
The proportions found across the three stages (TA, DA, CA) are similar to these numbers
 
ISVV metrics collection & analysis  (8/10)
Findings per type
Most findings are of type correctness, followed by findings of type completeness
 
ISVV metrics collection & analysis  (9/10)
Tools usage
The majority of findings were discovered manually (97% of the total findings) and only very few using tools, either automated (the tool directly discovers the finding) or so-called 'semi-automated' (tool output is further evaluated manually to discover the finding)
 
ISVV metrics collection & analysis  (10/10)
Effective findings (ISVV findings that implied a change, improvement or correction to the software product)
The majority of findings are effective regardless of product & size, except for the small product, for which the majority of findings are not effective
The majority of findings per stage are effective (72% at the TA & DA stages; 61% at the CA stage)
The majority of findings are effective for all severities (70% for major, 69% for minor)
 
Conclusions  (1/2)
Total number of findings
- Measurements based on number of findings
- Focus on IVE metrics
- No correlation found between number of findings & product size
Total number of findings per ISVV stage / task / subtask
- Stage: Roughly even distribution (39% TA, 28% DA, 33% CA)
- Task/Subtasks: Identified the tasks producing most of the findings for TA, DA, CA (e.g. 'Correctness'/'Completeness' subtasks are producing many findings; 'consistency' subtasks produce some)
Type of findings: The majority of findings are of type correctness (36%) & completeness (28%)
Effective findings: The majority of findings (69%) are effective (i.e. implying changes/corrections to the software product)
 
Conclusions  (2/2)
Severity: Most findings are minor at all stages, with 58% minor, 36% major and the remaining 6% of other severity (comment, very low)
Tools: The majority of the findings were discovered manually (97%) and only very few of them using tools. Tools were used for only 3% of findings (mainly at the CA stage)
Example: if we started an ISVV contract on a project today, we could expect, on average: 166 findings, of which 115 would be effective findings; out of those, 41 would be major findings, spread over the stages as: IVE: 16 TA, 11 DA, 14 CA
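A rough reconstruction of how these averages line up with the earlier figures (a sketch only; it assumes the 36% major share and the overall 39/28/33 stage split also apply to the expected effective findings, which the presentation does not state explicitly):

```python
TOTAL_FINDINGS = 2492                       # total IVE findings over the 15 products
PRODUCTS       = 15
EFFECTIVE_RATE = 0.69                       # share of effective findings
MAJOR_RATE     = 0.36                       # share of major findings
STAGE_SHARE    = {"TA": 0.39, "DA": 0.28, "CA": 0.33}

avg_findings = round(TOTAL_FINDINGS / PRODUCTS)      # ~166 findings per product
effective    = round(avg_findings * EFFECTIVE_RATE)  # ~115 effective findings
major        = round(effective * MAJOR_RATE)         # ~41 major findings among them
per_stage    = {s: round(major * share) for s, share in STAGE_SHARE.items()}
print(avg_findings, effective, major, per_stage)     # 166 115 41 {'TA': 16, 'DA': 11, 'CA': 14}
```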
 
Future work
Collect metrics for the upcoming ISVV projects
Analyze tasks/sub-tasks not producing many findings (they might need better explanations within the ISVV guide, a review of the methods and tools proposed for performing them, ...)
Analyze Independent Validation:
o Define useful metrics for IVA and assess IVA effectiveness
o Extend the scope of IVA to cover Qualification & Acceptance of the SW and the operational scenarios (i.e. having the operational view to create SW validation campaigns)
Modeling:
o Some model-related sub-tasks have not been 'profiled'
o Understand how models produced during SW development could be used during ISVV activities (e.g. Model-Based Testing techniques to produce validation campaigns)
ISVV effectiveness metrics:
o Are there other ways to measure effectiveness?
o Cost figures?
 
 
Thanks for your attention !!!!
For more information, please contact:
Pedro A. Barrios, 
European Space Agency
Pedro.Barrios@esa.int
 
 
Back-up Slides
 
Findings per sub-task  (1/3)
High level view of the number of findings per sub-task
Legend:
(+++): Subtask producing a considerable number of findings
(===): Subtask producing some findings
(---): Subtask producing a reduced number of findings
(xxx): Metrics not available for that subtask
IVE: Technical Specification Analysis
TA.T1: Software Requirements Verification
(===) IVE.TA.T1.S1: Verify Software Requirements external consistency with the system requirements
(===) IVE.TA.T1.S2: Verify Interface Requirements external consistency with the system requirements
(+++) IVE.TA.T1.S3: Verify software requirements correctness
(===) IVE.TA.T1.S4: Verify the consistent documentation of the software requirements
(+++) IVE.TA.T1.S5: Verify software requirements completeness
(+++) IVE.TA.T1.S6: Verify the dependability and safety requirements
(+++) IVE.TA.T1.S7: Verify the readability of the software requirements
(---) IVE.TA.T1.S8: Verify the timing and sizing budgets of the software requirements
(---) IVE.TA.T1.S9: Identify test areas and test cases for Independent Validation
(---) IVE.TA.T1.S10: Verify that the software requirements are testable
(---) IVE.TA.T1.S11: Verify software requirements conformance with applicable standards
 
Findings per sub-task  (2/3)
IVE: Design Analysis
DA.T1: Architectural Design Verification
(===) IVE.DA.T1.S1: Verify the SW architectural design external consistency with the Technical Specification
(---) IVE.DA.T1.S2: Verify the SW architectural design external consistency with the Interface Control Documents
(===) IVE.DA.T1.S3: Verify interfaces consistency between different SW components
(===) IVE.DA.T1.S4: Verify architectural design correctness
(===) IVE.DA.T1.S5: Verify architectural design completeness
(===) IVE.DA.T1.S6: Verify the dependability & safety of the design
(+++) IVE.DA.T1.S7: Verify the readability of the architectural design
(===) IVE.DA.T1.S8: Verify the timing and sizing budgets of the software
(---) IVE.DA.T1.S9: Identify test areas and test cases for independent Validation
(---) IVE.DA.T1.S10: Verify architectural design conformance with applicable standards
(xxx) IVE.DA.T1.S11: Verify the test performed on the high level model
(xxx) IVE.DA.T1.S12: Verify the development and verification and testing methods and environment
(xxx) IVE.DA.T1.S13: then construct model test cases
(xxx) IVE.DA.T1.S14: then construct model test procedures
(xxx) IVE.DA.T1.S15: then execution of model test procedures
DA.T2: Detailed Design Verification
(---) IVE.DA.T2.S1: Verify the detailed design external consistency with the Technical Specification
(---) IVE.DA.T2.S2: Verify the detailed design external consistency with the Interface Control Documents
(---) IVE.DA.T2.S3: Verify the detailed design external consistency with the Architectural Design
(+++) IVE.DA.T2.S4: Verify interfaces consistency between different SW components
(===) IVE.DA.T2.S5: Verify detailed design correctness
(===) IVE.DA.T2.S6: Verify detailed design completeness
(+++) IVE.DA.T2.S7: Verify the dependability & safety of the design
(---) IVE.DA.T2.S8: Verify the readability of the detailed design
(===) IVE.DA.T2.S9: Verify the timing and sizing budgets of the software
(xxx) IVE.DA.T2.S10: Verify the accuracy of the model (in case models are produced by the SW suppliers)
(---) IVE.DA.T2.S11: Identify test areas and test cases for independent Validation
(---) IVE.DA.T2.S12: Verify detailed design conformance with applicable standards
 
Findings per sub-task  (3/3)
DA.T3: Software User Manual Verification
(---) IVE.DA.T3.S1: Verify the timing and sizing budgets of the software
(---) IVE.DA.T3.S2: Verify that the dependability & safety aspects on the product are specified in the SUM
(---) IVE.DA.T3.S3: Verify the readability of the User Manual
(---) IVE.DA.T3.S4: Verify the completeness of the User Manual
(---) IVE.DA.T3.S5: Verify the correctness of the User Manual
IVE: Code Analysis
CA.T1: Source Code Verification
(---) IVE.CA.T1.S1: Verify source code external consistency with Technical Specification
(---) IVE.CA.T1.S2: Verify source code external consistency with Interface Control Documents
(---) IVE.CA.T1.S3: Verify source code external consistency with Architectural Design and Detailed Design
(---) IVE.CA.T1.S4: Verify interfaces consistency between different SW units
(+++) IVE.CA.T1.S5: Verify source code correctness with respect to technical specification, architectural design & detailed design
(+++) IVE.CA.T1.S6: Verify the source code readability, maintainability and conformance with the applicable standards
(+++) IVE.CA.T1.S7: Verify the dependability & safety of the source code
(---) IVE.CA.T1.S8: Verify the accuracy of the source code
(---) IVE.CA.T1.S9: Identify test areas and test cases for independent Validation
(===) IVE.CA.T1.S10: Verify the timing and sizing budgets of the software
CA.T2: Integration Test Specification and Test Data Verification
(===) IVE.CA.T2.S1: Verify consistency with Technical Specification
(---) IVE.CA.T2.S2: Verify consistency with Software Architectural Design
(+++) IVE.CA.T2.S3: Verify integration test procedures correctness and completeness
(xxx) IVE.CA.T2.S4: If models are produced by the SW suppliers, then evaluate model verification and validation test results
(xxx) IVE.CA.T2.S5: Verify integration test reports
CA.T3: Unit Test Procedure and Test Data Verification
(---) IVE.CA.T3.S1: Verify consistency with Software Detailed Design
(===) IVE.CA.T3.S2: Verify unit test procedures correctness and completeness
(xxx) IVE.CA.T3.S3: Verify unit test reports
 
IVA: Independent Validation
 
IVA.T1: Identification of Test Cases
IVA.T1.S1: Evaluate Task Input Inspection
IVA.T1.S2: Perform Analysis
IVA.T1.S3: Writing Independent Validation Test Plan
IVA.T2: Construction of Test Procedures
IVA.T2.S1: Achieve knowledge about the SVF
IVA.T2.S2: Implement Test Cases into Test Procedures
IVA.T2.S3: Updating the Independent Validation Test Plan
IVA.T3: Execution of Test Procedures
IVA.T3.S1: Execute the Test Procedures
IVA.T3.S2: Investigation of failed tests
IVA.T3.S3: Produce Test Report