Software Coding and Testing
by:
Dr. Bharat V. Chawda
Computer Engineering Department,
BBIT, VVNagar, Gujarat, India
Overview
 Introduction
 Code Review
 Software Documentation
 Testing
 Test Documentation
(As per GTU Curriculum – Diploma in Computer/IT Engineering)
Based on Books:
1. Fundamentals of Software Engineering – by Rajib Mall
2. Software Engineering: A Practitioner’s Approach – by Roger Pressman
Introduction: Coding
 When?
 After:
 Design phase is complete, and
 Design docs are successfully reviewed
 Objective
 Design of system -> code in a high-level language
 Unit test this code
 Coding Standards
 Coding Guidelines (a small coding-standard sketch follows this list)
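A minimal sketch of the kind of conventions a coding standard might prescribe (the file name, function, and specific rules shown are illustrative assumptions, not rules from the GTU curriculum or the referenced books):

/*
 * File    : billing.c    (illustrative header-comment format)
 * Author  : <name>
 * Purpose : Computes the total amount of a customer bill
 */
#define MAX_LINE_ITEMS 100                 /* constants in upper case          */

/* Functions named as verb_noun; one statement per line; 4-space indentation. */
int compute_total_amount(int unit_price, int quantity)
{
    int total_amount;                      /* meaningful variable names        */

    total_amount = unit_price * quantity;
    return total_amount;
}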
Code Review
 When?
 After: Module successfully compiles
 All the syntax errors have been eliminated
 Code Review v/s Testing
 CR: Cost-effective strategy for error elimination
 CR: Directly detects errors in the code
 Testing: Detects only failures, by running the code with different inputs under different circumstances
 Testing: Requires additional effort for Debugging (locating the error) and Error Correction (fixing the bug)
 CR: Two Types
 Code Walkthrough, Code Inspection
Code Walkthrough
 Informal code analysis technique
 When to review?
 After: Module is Coded, Compiled, and Syntax Errors
are eliminated
 How?
 A few members of the dev team are assigned this task
 Each member selects some test cases
 Simulates execution of the code by hand
 (Traces execution through the different statements and instructions of the code)
 Notes down findings; discusses them with the coder in the walkthrough (WT) meeting
Code Walkthrough (cont)
 Objective
 Discover the algorithmic and logical errors in
the code
 Guidelines
 Team size: Not too big, not too small: 3-7 members
 Focus on discovery of errors, not on how to fix
them
 Managers should not attend the WT meeting, to avoid giving engineers the feeling that they are being evaluated
Code Inspection
 Code is examined for the presence of some
common/classical programming
errors
 Use of uninitialized variables
 Incompatible assignments
 Non-terminating loops; Jumps into loops; Improper modification of loop variables
 Mismatch of arguments in procedure (function) calls
 Array indices out of bounds
 Improper storage allocation and de-allocation
 Use of incorrect logical operators; Operator-precedence errors
 Comparison of floating point values for equality
(Two of these classical errors are illustrated in the sketch below.)
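A minimal C sketch illustrating two of the classical errors above, with comments pointing at the safer form (the function names and the tolerance value are illustrative assumptions):

#include <math.h>

int sum_array(const int a[], int n)
{
    int i, sum = 0;          /* without "= 0" this would be use of an
                                uninitialized variable                        */

    for (i = 0; i < n; i++)  /* "i <= n" would take the array index
                                out of bounds                                 */
        sum = sum + a[i];
    return sum;
}

int nearly_equal(double x, double y)
{
    /* "x == y" would compare floating point values for equality;
       comparing the difference against a small tolerance is safer.           */
    return fabs(x - y) < 1e-9;
}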
Code Inspection (cont)
 Objective:
 Check for the common types of errors
 Check whether coding standards have been adhered to
 SW companies can maintain a list of commonly committed errors -> checklist for code inspection
Software Documentation
 SW Product
 Executable files + Source Code + Documents
 Documents: Users’ manual, SRS doc, Design
doc, Test doc, Installation manual, etc
 Why required?
 Enhances understandability of the SW product; reduces the effort & time required for maintenance
 Helps users to understand & effectively use the system
 Helps in effectively tackling manpower turnover
 Helps managers to effectively track progress
SW: Internal Documentation
 Code comprehension features provided in the source code itself (several are illustrated in the sketch after this list)
 Comments embedded in the source code
 Use of meaningful variable names
 Module and function headers
 Code indentation
 Code structuring (modules + functions)
 Use of constant identifiers
 Use of enumerated types
 Use of user-defined data types
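A minimal C sketch showing several of the internal-documentation features above in one place (the account example and all identifiers are illustrative assumptions):

/*
 * Function : update_balance
 * Purpose  : Applies one transaction to an account and returns the new balance.
 */
#define MAX_ACCOUNTS 500                        /* constant identifier        */

enum transaction_type { DEPOSIT, WITHDRAWAL };  /* enumerated type            */

typedef struct {                                /* user-defined data type     */
    int    account_number;
    double balance;
} account_t;

double update_balance(account_t *account,       /* meaningful names           */
                      enum transaction_type type, double amount)
{
    if (type == DEPOSIT)                        /* indentation + structuring  */
        account->balance += amount;
    else
        account->balance -= amount;
    return account->balance;
}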
SW: External Documentation
 Contains various types of supportive docs
 Users’ manual
 SRS doc
 Design doc
 Test doc
 Installation manual…
 Features: Good external documentation
 Consistency
 Understandability
Gunning’s Fog Index
 Metric to measure the readability of a document
 Fog(D) = [0.4 * words/sentences] +
[% of words having >=3 syllables]
 Example: “The Gunning’s fog index is based on
the premise that use of short sentences and
simple words makes a document easy to
understand”
 Fog(D) = [0.4 * 23 / 1] + [4 / 23 * 100]
= 9.2 + 17.4 ≈ 26.6
 Indicates the number of years of formal education required to comfortably understand the document
(A small computation sketch follows.)
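A minimal C sketch that computes the fog index exactly as the formula on this slide states it; the word, sentence, and syllable counts are taken from the example above (counting them automatically is outside the scope of this sketch):

#include <stdio.h>

/* Fog index as stated on this slide:
   0.4 * (words / sentences) + percentage of words having >= 3 syllables      */
double fog_index(int words, int sentences, int complex_words)
{
    return 0.4 * ((double) words / sentences)
           + ((double) complex_words / words) * 100.0;
}

int main(void)
{
    /* The 23-word example sentence above: 1 sentence, 4 words of >= 3 syllables */
    printf("Fog(D) = %.1f\n", fog_index(23, 1, 4));   /* prints about 26.6    */
    return 0;
}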
Testing: Introduction
 Testing:
 Aim: Identify all defects in a program
 Error / Defect / Bug / Fault:
 Mistake committed by development team
during any of the development phases.
 Failure:
 Manifestation of an error
 Symptom of an error
 Test case: Triplet [I, S, O]: I = input data, S = state of the system at which the data is input, O = expected output
(e.g., for a find_max routine: I = (x = 3, y = 5), S = initial state, O = 5)
 Test suite: Set of all test cases…
Testing: Levels/Stages
 Unit Testing
 Integration Testing
 System Testing
Unit Testing
 When?
 After: Module has been coded and reviewed
 How?
 Design test cases
 Develop Environment
 Do testing
 Environment
 Driver + Module under test + Stubs
(Stub: Dummy procedure with simplified behaviour, standing in for a module that is not yet available)
(Driver: Non-local data structures + code to call the functions of the module under test)
(A minimal sketch of such an environment follows the figure below.)
[Figure: Unit-test environment: Driver, Stub, Module under Test, Global Data]
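A minimal C sketch of such a unit-test environment, assuming a hypothetical module function compute_bill() that depends on a pricing routine get_unit_price() which is not yet available (all names and values are illustrative):

#include <stdio.h>

int units_consumed;              /* global (non-local) data, set up by the driver */

/* Stub: dummy procedure with simplified behaviour, standing in for the
   real pricing module that is not available yet.                              */
int get_unit_price(void)
{
    return 5;                    /* fixed value instead of a real lookup        */
}

/* Module under test */
int compute_bill(void)
{
    return units_consumed * get_unit_price();
}

/* Driver: sets up the non-local data and calls the function of the module.    */
int main(void)
{
    units_consumed = 10;
    printf("compute_bill() = %d (expected 50)\n", compute_bill());
    return 0;
}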
Black Box Testing
int find_max(int x, int y)
{
    int max;
    if (x > y)
        max = x;
    else
        max = y;
    return max;
}

[Figure: find_max viewed as a black box: inputs x, y; output max]
Black Box Testing
 Concept
 Based on functional specification of SW
 Based on functional behavior: Inputs/Outputs
 Also known as: Functional Testing
 No knowledge of design & code is required
 Two main approaches
 Equivalence Class Partitioning
 Boundary Value Analysis
Black Box Testing: Example
 SW: Computes the square root of integer values in the range 0 to 5000
(A test sketch follows the test cases below.)
 Test Cases: Equivalence Class Partitioning
 {-5, 500, 6000}
 Test Cases: Boundary Value Analysis
 {-1, 0, 5000, 5001}
White Box Testing
int find_max(int x, int y)
{
    int max;
    if (x > y)
        max = x;
    else
        max = y;
    return max;
}

[Figure: find_max viewed as a white box: inputs x, y; output max; internal code visible to the tester]
White Box Testing
 Concept
 Based on analysis of code
 Based on structure of the implementation
 Also known as: Structural Testing
 Knowledge of design & code is required
 Two Types
 Fault-based: Targets detection of certain types of faults
 Coverage-based: Targets execution (coverage) of certain elements of the program
White Box T: Coverage based
 Strategies
 Statement Coverage
 Each statement should be executed at least once
 Branch Coverage
 Each branch : traversed at least once
 Condition Coverage
 Each condition : True at least once and false at least
once
 Path Coverage
 Each linearly independent path : executed at least
once
White Box T: Example
int test(int x, int y)
{
    int z;
    z = -1;
    if (x > 0 && y > 0)
        z = x;
    return z;
}

Statement Coverage: {(x=1, y=1)}
Branch Coverage: {(1,1), (0,0)}
Condition Coverage: {(0,0), (0,1), (1,0), (1,1)}
(A small driver exercising these test cases follows.)
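A minimal C driver that exercises test() with the four (x, y) pairs listed above (the driver itself is illustrative; only the test() function comes from the slide):

#include <stdio.h>

int test(int x, int y)
{
    int z;
    z = -1;
    if (x > 0 && y > 0)
        z = x;
    return z;
}

int main(void)
{
    /* The four (x, y) pairs listed above give condition coverage;
       (1,1) alone gives statement coverage, and (1,1) with (0,0)
       gives branch coverage.                                       */
    int cases[4][2] = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
    int i;

    for (i = 0; i < 4; i++)
        printf("test(%d, %d) = %d\n",
               cases[i][0], cases[i][1], test(cases[i][0], cases[i][1]));
    return 0;
}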
White Box T: Path Coverage
 Concept
 All linearly independent paths in the program
are executed at least once
 CFG: Control Flow Graph
 Directed graph consisting of a set of Nodes (N) and Edges (E), where
 Node (N): corresponds to a unique program statement
 Edge (E): represents transfer of control from one node to another
White Box T: Path Coverage
 Example:
int gcd(int x, int y)
{
    while (x != y)
    {
        if (x > y)
            x = x - y;
        else
            y = y - x;
    }
    return x;
}
White Box T: Path Coverage
 Example:
int gcd(int x, int y)
{
1.  while (x != y)
    {
2.      if (x > y)
3.          x = x - y;
        else
4.          y = y - x;
5.  }
6.  return x;
}
[Figure: CFG for gcd, one node per numbered statement (1-6); edges 1->2, 2->3, 2->4, 3->5, 4->5, 5->1, 1->6]
Cyclomatic Complexity Metric
 V(G) = E – N + 2
 V(G) = Total number of Non-overlapping
Bounded Areas + 1
 V(G) = Total number of Non-overlapping
Areas
 V(G) = Decision Points + 1
 V(G) = Predicate Nodes + 1
Cyclomatic complexity of the previous GCD example: V(G) = E - N + 2 = 7 - 6 + 2 = 3 (equivalently, 2 decision points + 1 = 3)
Test Documentation
 When: Towards end of testing
 Represents: Test summary report
 Specifies:
 Total number of tests applied to a sub-system
 How many tests were successful
 How many tests were unsuccessful, and to what extent (degree) they failed: totally or partially
(A minimal sketch of such a report record follows.)
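A minimal sketch, in C, of the kind of record a test summary report might tabulate per sub-system (all field names are illustrative assumptions):

/* One row of a test summary report (field names are illustrative only).      */
struct test_summary {
    char subsystem[40];          /* sub-system the tests were applied to      */
    int  tests_applied;          /* total number of tests applied             */
    int  tests_successful;       /* how many tests were successful            */
    int  failed_totally;         /* unsuccessful tests that failed completely */
    int  failed_partially;       /* unsuccessful tests that failed partially  */
};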
Thank-U…!!!
