The document outlines various dynamic testing techniques in software testing, including black box and white box testing methods. It stresses the necessity of systematic testing techniques for effective fault detection and for measuring test effort, introducing concepts such as equivalence partitioning and boundary value analysis. It also discusses the role of non-systematic techniques such as error guessing and how they complement more structured approaches.
Contents
What is a testing technique?
Black and White box testing
Black box test techniques
White box test techniques
Error Guessing
Dynamic Testing Techniques
4.
Why dynamic test techniques?
Exhaustive testing (use of all possible inputs and conditions) is impractical
must use a subset of all possible test cases
must have high probability of detecting faults
Need thought processes that help us select test cases more intelligently
test case design techniques are such thought processes
5.
What is a testing technique?
a procedure for selecting or designing tests
based on a structural or functional model of the software
successful at finding faults
'best' practice
a way of deriving good test cases
a way of objectively measuring a test effort
Testing should be rigorous, thorough and systematic
6.
Using techniques makes testing much more effective
Advantages of techniques
Different people: similar probability of finding faults
gain some independence of thought
Effective testing: find more faults
focus attention on specific types of fault
know you're testing the right thing
Efficient testing: find faults with less effort
avoid duplication
systematic techniques are measurable
7.
Measurement
Objective assessment of thoroughness of testing (with respect to use of each technique)
useful for comparison of one test effort to another
E.g.
                          Project A   Project B
Equivalence partitions       60%         40%
Boundaries                   50%         45%
Branches                     75%         60%
8.
Contents
What is a testing technique?
Black and White box testing
Black box test techniques
White box test techniques
Error Guessing
Dynamic Testing Techniques
9.
Three types of systematic technique
Static (non-execution)
• examination of documentation, source code listings, etc.
Functional (Black Box)
• based on behaviour / functionality of software
Structural (White Box)
• based on structure of software
10.
Some test techniques
Static
• Reviews, Inspection, Walkthroughs, Desk-checking, etc.
• Static Analysis: Control Flow, Data Flow, Definition-Use, Symbolic Execution, etc.
Dynamic
• Structural (White Box): Statement, Branch/Decision, Branch Condition, Branch Condition Combination, LCSAJ, Arcs, etc.
• Behavioural
  - Functional (Black Box): Equivalence Partitioning, Boundary Value Analysis, Cause-Effect Graphing, State Transition, Random, etc.
  - Non-functional: Usability, Performance, etc.
11.
Black box versus white box?
Test levels: Component, Integration, System, Acceptance
Black box is appropriate at all levels but dominates the higher levels of testing
White box is used predominantly at the lower levels to complement black box
12.
Contents
What is a testing technique?
Black and White box testing
Black box test techniques
White box test techniques
Error Guessing
Dynamic Testing Techniques
13.
Black Box test design and measurement techniques
Techniques defined in BS 7925-2
Equivalence partitioning
Boundary value analysis
State transition testing
Cause-effect graphing
Syntax testing
Random testing
Also defines how to specify other techniques
(A legend on the slide marked, for each technique, whether it is also a measurement technique.)
14.
Equivalence partitioning (EP)
divide (partition) the inputs, outputs, etc. into areas which are the same
(equivalent)
assumption: if one value works, all will work
one from each partition better than all from one
(number line: below 1 invalid, 1 to 100 valid, above 100 invalid)
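As a minimal sketch (the 1 to 100 range and the function name are illustrative, not from a real system), one representative value per partition is usually enough:

def in_valid_range(x):
    # assumed rule: integers 1 to 100 are valid, everything else is invalid
    return 1 <= x <= 100

def test_one_value_per_partition():
    assert in_valid_range(50)          # valid partition
    assert not in_valid_range(-7)      # invalid partition below
    assert not in_valid_range(200)     # invalid partition above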
15.
Boundary value analysis (BVA)
faults tend to lurk near boundaries
good place to look for faults
test values on both sides of boundaries
(number line: boundaries at 1 and 100; test 0, 1, 100 and 101 to cover both sides of each boundary)
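Reusing the in_valid_range sketch above, boundary value analysis adds tests on both sides of each boundary:

def test_both_sides_of_each_boundary():
    assert not in_valid_range(0)       # just below the lower boundary
    assert in_valid_range(1)           # lower boundary
    assert in_valid_range(100)         # upper boundary
    assert not in_valid_range(101)     # just above the upper boundary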
16.
Example: Loan application
Customer name: 2-64 chars.
Account number: 6 digits, 1st non-zero
Loan amount requested: £500 to £9000
Term of loan: 1 to 30 years
Monthly repayment: minimum £10
Outputs: Term, Repayment, Interest rate, Total paid back
17.
Customer name
Number of characters: 1 invalid | 2 to 64 valid | 65 invalid
Valid characters: A-Z, a-z, -, ', space (any other character is invalid)

Condition: Customer name
Valid partitions: 2 to 64 chars; valid chars
Invalid partitions: < 2 chars; > 64 chars; invalid chars
Valid boundaries: 2 chars; 64 chars
Invalid boundaries: 1 char; 65 chars; 0 chars
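As a sketch of how the table above drives tests, assuming a hypothetical validator that enforces 2-64 characters drawn from A-Z, a-z, hyphen, apostrophe and space:

import re
import pytest

def is_valid_customer_name(name):
    # 2 to 64 characters, each one of A-Z, a-z, hyphen, apostrophe or space
    return bool(re.fullmatch(r"[A-Za-z' -]{2,64}", name))

@pytest.mark.parametrize("name, expected", [
    ("AB",      True),    # valid boundary: 2 chars
    ("A" * 64,  True),    # valid boundary: 64 chars
    ("A",       False),   # invalid boundary: 1 char
    ("A" * 65,  False),   # invalid boundary: 65 chars
    ("",        False),   # invalid boundary: 0 chars
    ("J0hn",    False),   # invalid partition: invalid character
])
def test_customer_name_partitions_and_boundaries(name, expected):
    assert is_valid_customer_name(name) == expected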
18.
Account number
Number of digits: 5 invalid | 6 valid | 7 invalid
First character: zero invalid | non-zero valid

Condition: Account number
Valid partitions: 6 digits; 1st non-zero
Invalid partitions: < 6 digits; > 6 digits; 1st digit = 0; non-digit
Valid boundaries: 100000; 999999
Invalid boundaries: 5 digits; 7 digits; 0 digits
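And similarly for the account number (the validator is again a hypothetical sketch: exactly six digits, first digit non-zero):

import re
import pytest

def is_valid_account_number(acc):
    # exactly six digits, the first of which is non-zero
    return bool(re.fullmatch(r"[1-9][0-9]{5}", acc))

@pytest.mark.parametrize("acc, expected", [
    ("100000",  True),    # valid boundary: lowest acceptable value
    ("999999",  True),    # valid boundary: highest acceptable value
    ("99999",   False),   # invalid boundary: 5 digits
    ("1000000", False),   # invalid boundary: 7 digits
    ("",        False),   # invalid boundary: 0 digits
    ("012345",  False),   # invalid partition: first digit zero
    ("12345X",  False),   # invalid partition: non-digit
])
def test_account_number_partitions_and_boundaries(acc, expected):
    assert is_valid_account_number(acc) == expected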
Design test cases
Test case 1
  Description: Name: John Smith; Acc no: 123456; Loan: 2500; Term: 3 years
  Expected outcome: Term: 3 years; Repayment: 79.86; Interest rate: 10%; Total paid: 2874.96
  New tags covered: V1, V2, V3, V4, V5, ...

Test case 2
  Description: Name: AB; Acc no: 100000; Loan: 500; Term: 1 year
  Expected outcome: Term: 1 year; Repayment: 44.80; Interest rate: 7.5%; Total paid: 537.60
  New tags covered: B1, B3, B5, ...
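Scripted, the two designed test cases might look like the sketch below. The loan_quote function and its return format are assumptions for illustration; the expected figures come from the table above.

def test_case_1_mid_range_values():
    # Name: John Smith, Acc no: 123456, Loan: 2500, Term: 3 years
    quote = loan_quote(name="John Smith", account="123456", amount=2500, years=3)
    assert quote == {"term_years": 3, "repayment": 79.86,
                     "interest_rate": 10.0, "total_paid": 2874.96}

def test_case_2_valid_boundaries():
    # Name: AB, Acc no: 100000, Loan: 500, Term: 1 year
    quote = loan_quote(name="AB", account="100000", amount=500, years=1)
    assert quote == {"term_years": 1, "repayment": 44.80,
                     "interest_rate": 7.5, "total_paid": 537.60}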
22.
Why do both EP and BVA?
If you do boundaries only, you have covered all the partitions as well
technically correct and may be OK if everything works correctly!
if the test fails, is the whole partition wrong, or is a boundary in the wrong place? - have to test mid-partition anyway
testing only extremes may not give confidence for typical use scenarios (especially for users)
boundaries may be harder (more costly) to set up
23.
Test objectives?
For a thorough approach: VP, IP, VB, IB
Under time pressure, depends on your test objective
- minimal user-confidence: VP only?
- maximum fault finding: VB first (plus IB?)
(table template: for each condition, record its Valid Partition, Invalid Partition, Valid Boundary and Invalid Boundary, each with a tag)
24.
Decision tables
explore combinations of inputs, situations or events
it is very easy to overlook specific combinations of input
start by expressing the input conditions of interest so that they are either TRUE or FALSE
– record found
– file exists
– code valid
– policy expired
– account in credit
– due date > current date
25.
Example: student access
A university computer system allows students an allocation of disc space depending on their projects.
If they have used all their allotted space, they are only allowed restricted access, i.e. to delete files, not to create them. This is assuming they have logged on with a valid username and password.
What are the input and output conditions?
26.
List the input and output conditions
• list the 'input conditions' in the first column of the table
• list the 'output conditions' under the input conditions
Input Conditions: Valid username; Valid password; Account in credit
Output Conditions: Login accepted; Restricted access
27.
Determine input combinations
add columns to the table for each unique combination of input conditions
each entry in the table may be either 'T' for true, 'F' for false
Input Conditions
Valid username T T T T F F F F
Valid password T T F F T T F F
Account in credit T F T F T F T F
28.
Rationalise input combinations
some combinations may be impossible or not of interest
some combinations may be ‘equivalent’
use a hyphen to denote “don’t care”
Input Conditions
Valid username F T T T
Valid password - F T T
Account in credit - - F T
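A small sketch of the two steps in code: enumerate every T/F combination of the three input conditions, then check that the rationalised columns (None marks "don't care") still account for all of them. The names are illustrative only.

from itertools import product

# full table: every unique T/F combination of the three input conditions (8 columns)
full_table = list(product([True, False], repeat=3))

# rationalised columns from the slide, in the order (username, password, in_credit);
# None means "don't care"
rationalised = [
    (False, None,  None),   # invalid username
    (True,  False, None),   # valid username, invalid password
    (True,  True,  False),  # valid login, account not in credit
    (True,  True,  True),   # valid login, account in credit
]

def covered_by(column, combination):
    # a combination falls under a column if every specified entry agrees
    return all(c is None or c == v for c, v in zip(column, combination))

# every one of the 8 combinations is matched by exactly one rationalised column
assert all(sum(covered_by(col, combo) for col in rationalised) == 1
           for combo in full_table)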
29.
Complete the table
determine the expected output conditions for each combination of input conditions
Input Conditions
Valid username F T T T
Valid password - F T T
Account in credit - - F T
Output Conditions
Login accepted F F T T
Restricted access - - T F
30.
Determine test case groups
each column is at least one test case
Input Conditions
Valid username F T T T
Valid password - F T T
Account in credit - - F T
Output Conditions
Login accepted F F T T
Restricted access - - T F
Tags A B C D
31.
Design test cases
usually one test case for each column but can be none or several
Test 1: Username BrbU -> Invalid username (tag A)
Test 2: Username usernametoolong -> Invalid username (tag A)
Test 3: Username BobU, Password abcd -> Invalid password (tag B)
Test 4: Valid user, no disc space -> Restricted access (tag C)
Test 5: Valid user with disc space -> Unrestricted access (tag D)
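The completed table and its test cases translate almost directly into code. The login function below is a hypothetical sketch of the access rules; where the table says "don't care" for restricted access (columns A and B) this sketch simply returns False:

import pytest

def login(valid_username, valid_password, account_in_credit):
    # returns (login_accepted, restricted_access)
    accepted = valid_username and valid_password
    restricted = accepted and not account_in_credit
    return accepted, restricted

@pytest.mark.parametrize("username_ok, password_ok, in_credit, accepted, restricted", [
    (False, False, False, False, False),  # column A: invalid username
    (True,  False, False, False, False),  # column B: invalid password
    (True,  True,  False, True,  True),   # column C: no disc space -> restricted
    (True,  True,  True,  True,  False),  # column D: disc space left -> unrestricted
])
def test_one_case_per_column(username_ok, password_ok, in_credit, accepted, restricted):
    assert login(username_ok, password_ok, in_credit) == (accepted, restricted)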
32.
Rationalising outputs
if outputs or effects are mutually exclusive, i.e. T occurs in only one place in each column, we can combine them
for example:
X  T F F
Y  F T F
Z  F F T
is equivalent to the single row:
Output  X Y Z
33.
Rationalising dangers
rationalising is based on assumptions
assumptions may be wrong!
assumptions should be stated
assumptions may change over time
be aware of the dangers
filling in the full table may find errors which will be missed if
you rationalise
it is possible to rationalise too far
34.
Extending decision tables
Entries can be more than just ‘true’ or ‘false’
completing the table needs to be done carefully
rationalising becomes more important
E.g.
Code = 1, 2, or 3 1 1 1 1 2 2 2 2 3 3 3 3
Exp.date < now T T F F T T F F T T F F
Class A product T F T F T F T F T F T F
35.
Decision Tables in relation to EP and BVA
(diagram: partitions of an input value and an output value expressed as TRUE / FALSE conditions)
36.
Contents
What is a testing technique?
Black and White box testing
Black box test techniques
White box test techniques
Error Guessing
Dynamic Testing Techniques
37.
White Box test design and measurement techniques
Techniques defined in BS 7925-2
Statement testing
Branch / Decision testing
Data flow testing
Branch condition testing
Branch condition combination testing
Modified condition decision testing
LCSAJ testing
Also defines how to specify other techniques
(A legend on the slide marked, for each technique, whether it is also a measurement technique.)
38.
Using structural coverage
(flow: derive Tests from the Spec and run them against the Software; ask 'Results OK?', 'What's covered?', 'Coverage OK?' and 'Enough tests?'; if not, add more tests, increasing coverage by moving to stronger structural techniques (different structural elements))
39.
The test coverage trap
(graph: functional testedness plotted against structural testedness (% Statement, % Decision, % Condition Combination); 'structure exercised, insufficient function' and 'function exercised, insufficient structure' are both traps; better testing increases both)
100% coverage does not mean 100% tested!
Coverage is not thoroughness
40.
Statement coverage
percentage of executable statements exercised by a test suite
statement coverage = number of statements exercised / total number of statements
example: a program has 100 statements; the tests exercise 87 statements; statement coverage = 87%
Typical ad hoc testing achieves 60 - 75%
Statement coverage is normally measured by a software tool.
41.
Example of statement coverage
Statement numbers:
1  read(a)
2  IF a > 6 THEN
3    b = a
4  ENDIF
5  print b
Test case 1: input 7, expected output 7
As all 5 statements are 'covered' by this test case, we have achieved 100% statement coverage
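A Python rendering of the snippet (the function and test names are illustrative). The single test exercises every statement, so a tool such as coverage.py would report 100% statement coverage, even though the path where a <= 6 (and b is never set) is untested:

def process(a):
    if a > 6:
        b = a
    return b            # fault: b is unset when a <= 6

def test_input_seven():
    # exercises statements 1-5: 100% statement coverage from one test
    assert process(7) == 7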
42.
Decision coverage (Branch coverage)
percentage of decision outcomes exercised by a test suite
decision coverage = number of decision outcomes exercised / total number of decision outcomes
example: a program has 120 decision outcomes; the tests exercise 60 decision outcomes; decision coverage = 50%
Typical ad hoc testing achieves 40 - 60%
Decision coverage is normally measured by a software tool.
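Continuing the statement-coverage sketch above (redefined here so the snippet stands alone), decision coverage also requires the False outcome of "a > 6"; the extra test needed to reach 100% decision coverage is exactly the one that exposes the fault:

import pytest

def process(a):
    if a > 6:
        b = a
    return b            # b is unset on the False branch

def test_true_outcome():
    assert process(7) == 7             # a > 6 is True

def test_false_outcome():
    # second test needed for 100% decision coverage; it exposes the fault
    with pytest.raises(UnboundLocalError):
        process(5)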
Paths through code with loops
1, 2, 3, 4, 5, 6, 7, 8, ... for as many times as it is possible to go round the loop (this can be unlimited, i.e. infinite)
45.
Example 1
Wait for card to be inserted
IF card is a valid card THEN
  display "Enter PIN number"
  IF PIN is valid THEN
    select transaction
  ELSE (otherwise)
    display "PIN invalid"
ELSE (otherwise)
  reject card
(flowchart: Wait -> Valid card? -> yes: Display "Enter PIN" -> Valid PIN? -> yes: Select transaction, no: Display "PIN invalid"; Valid card? no: Reject card; all routes then reach End)
46.
Example 2
Read A
IF A > 0 THEN
  IF A = 21 THEN
    Print "Key"
  ENDIF
ENDIF
Cyclomatic complexity: 3
Minimum tests to achieve statement coverage: 1
Minimum tests to achieve branch coverage: 3
(flowchart of the same logic)
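To make the counts concrete, a Python sketch of Example 2 (names illustrative): one test value (21) reaches every statement, while decision coverage needs three test values.

def classify(a):
    result = None
    if a > 0:
        if a == 21:
            result = "Key"
    return result

def test_statement_coverage_minimum():
    assert classify(21) == "Key"       # one test executes every statement

def test_branch_coverage_minimum():
    assert classify(21) == "Key"       # A > 0 True,  A = 21 True
    assert classify(5) is None         # A > 0 True,  A = 21 False
    assert classify(-1) is None        # A > 0 False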
47.
Example 3
Read A
Read B
IF A > 0 THEN
  IF B = 0 THEN
    Print "No values"
  ELSE
    Print B
    IF A > 21 THEN
      Print A
    ENDIF
  ENDIF
ENDIF
Cyclomatic complexity: 4
Minimum tests to achieve statement coverage: 2
Minimum tests to achieve branch coverage: 4
(flowchart of the same logic)
48.
Example 4
Read A
Read B
IF A < 0 THEN
  Print "A negative"
ELSE
  Print "A positive"
ENDIF
IF B < 0 THEN
  Print "B negative"
ELSE
  Print "B positive"
ENDIF
Cyclomatic complexity: 3
Minimum tests to achieve statement coverage: 2
Minimum tests to achieve branch coverage: 2
Note: there are 4 paths
(flowchart of the same logic)
49.
Example 5
Read A
Read B
IF A < 0 THEN
  Print "A negative"
ENDIF
IF B < 0 THEN
  Print "B negative"
ENDIF
Cyclomatic complexity: 3
Minimum tests to achieve statement coverage: 1
Minimum tests to achieve branch coverage: 2
(flowchart of the same logic)
50.
Example 6
Read A
IF A < 0 THEN
  Print "A negative"
ENDIF
IF A > 0 THEN
  Print "A positive"
ENDIF
Cyclomatic complexity: 3
Minimum tests to achieve statement coverage: 2
Minimum tests to achieve branch coverage: 2
(flowchart of the same logic)
51.
Contents
What is a testing technique?
Black and White box testing
Black box test techniques
White box test techniques
Error Guessing
Dynamic Testing Techniques
52.
Non-systematic test techniques
Trial and error / Ad hoc
Error guessing / Experience-driven
User Testing
Unscripted Testing
A testing approach that is only rigorous, thorough and systematic is incomplete
53.
Error-Guessing
always worth including
after systematic techniques have been used
can find some faults that systematic techniques can miss
a ‘mopping up’ approach
supplements systematic techniques
Not a good approach to start testing with
54.
Error Guessing: deriving test cases
Consider:
past failures
intuition
experience
brainstorming
“What is the craziest thing we can do?”
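For instance, error-guessing around the loan application's account number might add the checks below (reusing the hypothetical is_valid_account_number sketch from the equivalence partitioning section; the guessed inputs are illustrative):

import pytest

@pytest.mark.parametrize("account_number", [
    "000000",       # all zeros
    " 123456",      # leading space
    "123456\n",     # trailing newline
    "12 3456",      # embedded space
    "123456 ",      # trailing space
])
def test_guessed_account_number_oddities(account_number):
    assert not is_valid_account_number(account_number)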
55.
Summary: Key Points
Test techniques are 'best practice': help to find faults
Black Box techniques are based on behaviour
White Box techniques are based on structure
Error Guessing supplements systematic techniques
Dynamic Testing Techniques