Evolution of Test Automation and Modern Testing Practices

Test Automation and Modern Testing Practices
Jeff Offutt
George Mason University
cs.gmu.edu/~offutt

"Science can amuse and fascinate us, but it is engineering that changes the world."
-- Isaac Asimov

Software is a skin that surrounds our civilization
(Quote due to Dr. Mark Harman)
Why do we test? Test process maturity levels
- Level 0: There's no difference between testing and debugging
- Level 1: The purpose of testing is to show correctness
- Level 2: The purpose of testing is to show that the software doesn't work
- Level 3: The purpose of testing is to reduce the risk of using the software
- Level 4: Testing is a mental discipline that helps us develop high quality software
Put simply …
A software tester tries to spot software failures before they are inflicted on users
Why automate testing?
The usual reasons for automation: reduce cost and increase value (increase RoI)
- Reduce repetitive work
- Increase predictability
- Increase execution speed
- Better software reduces support costs
Pre-history: Manual testing
Before automation, testers entered inputs by hand (still common among CS students)
- Inputs often arbitrary
- Entirely system testing
- Slow
- Repeatability problems
- Testers made errors: mis-entered values, missed erroneous results
Early (partial) automation
- Tests designed by hand
- Tests were sequences of actions and inputs
- Sometimes executed all or partially by software
- Results checked by human
- Entirely system testing
- Somewhat repeatable
- Fewer errors
- Slow & expensive
Test automation in the 1990s
System testing: test automation tools
- Often project- or company-specific
- Test values usually designed by hand
- Result checking usually by hand
Capture-replay tools for GUIs
- Captured inputs and result screens from manual tests
- Replayed tests when software changed
Early unit-level test frameworks
- Not accessible to non-programmers
- Most were research demonstration tools
- Automatic checking of outputs via assertions
- JUnit integrated the best ideas and simplified
Then this happened
[Chart: fault origin (%), fault detection (%), and unit cost (X) across requirements, design, programming/unit testing, integration testing, system testing, and post-deployment]
Source: Software Engineering Institute, Carnegie Mellon University, Handbook CMU/SEI-96-HB-002
Evidence of RoI for unit testing
Assume $1000 unit cost per fault, 100 faults
[Chart: the same fault origin (%), fault detection (%), and unit cost (X) data across requirements, design, programming/unit testing, integration testing, system testing, and post-deployment]
Source: Software Engineering Institute, Carnegie Mellon University, Handbook CMU/SEI-96-HB-002
Unit testing and automation
Unit tests are easier to automate:
- More modularity, less re-design
- Faster to produce, cheaper
- Re-verification supports evolution
- Less repetitive work, more predictable
- Better software reduces support costs
Thus … more automation
Example automated unit test

    @Test (expected = ClassCastException.class)   // expected output
    public void testMutuallyIncomparable () {
        List list = new ArrayList();              // setup (prefix) values
        list.add("cat");
        list.add("dog");
        list.add(1);                              // test case values
        Object obj = Min.min(list);               // postfix values
    }
Categories of TA activities
[Diagram: a map of test automation]
- Generation (what): test values, designed human-based or criteria-based
- Execution (how): human scripts, capture-replay (GUIs), script languages, frameworks (JUnit), continuous integration (DevOps)
- Management: new tests, changed tests, deleted tests
Elements of test automation (generation)
(1) Model-based testing (tests from abstract models)
Model-based testing (abstract test design)
[Diagram:]
- Abstract level (model): software artifact → model / structure → test requirements → abstract test values
- Concrete level (implementation): input values → test cases → test scripts → test results → pass / fail
Raising our level of abstraction allows for more creative work
Software models
Models represent software at an abstract level
- Formal models have formal semantics and can often be executed
- Informal models omit details and thus cannot be executed
Uses of models: design & specification, documentation, maintenance and evolution, testing
Types of models:
1. graphs
2. logic expressions
3. sets
4. formal syntax (source code, BNF, XML, etc.)
Software to models
Abstract graphs: CFGs from code, statecharts from designs, use cases from requirements, FSMs and Petri nets from formal models
Logic expressions, e.g. (A & B) | (C & D): decisions from code, designs, and use cases; FSMs, PNs, and SMV from formal models
Model-based testing process
Model + Criterion → Test requirements → Abstract tests → (+ Extra information) → Concrete tests → Test execution → Test reports
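The first steps of this pipeline can be sketched mechanically. In this hypothetical example (the FSM and all names are illustrative, not from the talk), a small FSM model plus an edge-coverage criterion yields one test requirement per transition:

```java
import java.util.*;

// Hypothetical sketch: a small FSM model plus an edge-coverage criterion.
public class EdgeCoverage {
    // Transitions: from-state -> (action -> to-state). Illustrative model only.
    static final Map<Integer, Map<String, Integer>> FSM = new TreeMap<>(Map.of(
        1, Map.of("Coin", 2),
        2, Map.of("Coin", 3, "GetChoc", 1),
        3, Map.of("AddChoc", 4),
        4, Map.of("GetChoc", 3)));

    // Criterion: edge coverage. Each transition becomes one test requirement.
    static List<String> requirements() {
        List<String> reqs = new ArrayList<>();
        for (var from : FSM.entrySet())
            for (var edge : from.getValue().entrySet())
                reqs.add(from.getKey() + " --" + edge.getKey() + "--> " + edge.getValue());
        return reqs;
    }

    public static void main(String[] args) {
        requirements().forEach(System.out::println);
    }
}
```

Abstract tests would then be paths through the FSM that cover these requirements; the "extra information" step supplies whatever data is needed to make them concrete.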
Elements of test automation (generation)
(2) The mapping problem (abstract test to concrete test)
Mapping example: vending machine
Testers build abstract tests by hand, but each transition maps to specific concrete actions
Automate by assembling concrete test components

[FSM diagram: states 1-4 with transitions labeled Coin, GetChoc, and AddChoc]

Abstract test: [ 1, 3, 4, 1, 2, 4 ]
Refined abstract test: AddChoc, Coin, GetChoc, Coin, AddChoc
Concrete test (mapping):
1. c = "m&ms"; addChocolate(c);
2. v = "$1"; addCoin(v);
3. chooseChocolate(c); dispense();
4. v = "$1"; addCoin(v);
5. c = "m&ms"; addChocolate(c);
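The assembly step above fits in a few lines. This is a hypothetical illustration (component bodies are stored as strings; a real tool would emit or invoke code): each abstract action name looks up a reusable concrete component, and the concrete test script is their concatenation.

```java
import java.util.*;

// Hypothetical sketch: map each abstract action to a reusable concrete component.
public class TestAssembler {
    static final Map<String, String> COMPONENTS = Map.of(
        "AddChoc", "c = \"m&ms\"; addChocolate(c);",
        "Coin",    "v = \"$1\"; addCoin(v);",
        "GetChoc", "chooseChocolate(c); dispense();");

    // Assemble a concrete test script from an abstract action sequence.
    static List<String> assemble(List<String> abstractTest) {
        List<String> script = new ArrayList<>();
        for (String action : abstractTest)
            script.add(COMPONENTS.get(action));
        return script;
    }

    public static void main(String[] args) {
        assemble(List.of("AddChoc", "Coin", "GetChoc", "Coin", "AddChoc"))
            .forEach(System.out::println);
    }
}
```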
Assembling test components
[Diagram: test components C1-C7 assembled into Test 1, Test 2, and Test 3, with components reused across tests]
Each abstract test component is mapped to real code in concrete tests, many times
Elements of test automation (generation)
(3) Test oracles (expected result from a test)

A test oracle knows the correct behavior of the software
An automated test must include the expected output for that test
RIPR model
- Reachability: the test reaches the fault
- Infection: the fault infects the program state, creating an error
- Propagation: the error propagates to an incorrect final output
- Revealability: the test oracle observes the final output state (outputs plus final program state) and reveals the failure
What makes a good test oracle?
- Do test oracles need to observe the entire output space? (It is arbitrarily large)
- Do test oracles need to observe intermediate states?
- Do test oracles always return the same answer?
- How do we automate test oracles when we do not know the correct output?
- How do we model test oracles?
Lots of partial answers to these questions; researchers are still hard at work. Here are a few answers …
Good test oracles are consistent
Sometimes tests behave differently on different runs: Google says 16% of their tests are flaky
What makes a test flaky?
- Concurrency
- Asynchronous behavior
- Random inputs
- Resource leaks
- Test order dependency
- Library class assumptions
- Relying on external systems
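One of these causes, test order dependency, fits in a few lines. In this hypothetical example two "tests" share mutable state, so each passes only if it runs first:

```java
import java.util.*;

// Hypothetical illustration of test order dependency: both tests share one list.
public class FlakyOrder {
    static final List<String> shared = new ArrayList<>();

    static boolean testAddsCat() {   // assumes it runs on a fresh list
        shared.add("cat");
        return shared.size() == 1;
    }

    static boolean testAddsDog() {   // same assumption, so execution order decides which fails
        shared.add("dog");
        return shared.size() == 1;
    }

    public static void main(String[] args) {
        System.out.println(testAddsCat());   // leftover state now breaks the next test
        System.out.println(testAddsDog());
    }
}
```

Run in the order above, the second test fails; reverse the order and the first fails instead. The usual fix is per-test setup that resets shared state.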
Good test oracles look
Smoke (crash) tests miss about two-thirds of the failures
Why waste a good test? Tests are expensive to design, to implement, and to run

    @Test (expected = NullPointerException.class)
    public void addOneValue()
    {
        list.addFront("cat");
        Object obj = list.getFirst();
        // ??? no assertion; should check: [ cat ]
    }
Good test oracles can see
A blind test does not check the portion of the output that is incorrect

    @Test
    public void testTwoValues()
    {
        list.addFront("dog");
        list.addFront("cat");
        Object obj = list.getFirst();
        assertTrue("Two values", obj.equals("cat"));
    }

Passes on: [ cat ], [ cat, cat ], [ cat, null ]
Should check: [ cat, dog ]
Up to 40% of industry automated tests are blind
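A "seeing" version of that oracle checks the whole observable output state, not just the first element. This is a hypothetical sketch, with java.util.ArrayDeque standing in for the talk's list class:

```java
import java.util.*;

// Hypothetical sketch: a blind oracle vs. a seeing oracle on the same output.
public class SeeingOracle {
    // Blind oracle: only looks at the first element.
    static boolean blindOracle(Deque<String> list) {
        return "cat".equals(list.peekFirst());
    }

    // Seeing oracle: observes the entire list state.
    static boolean seeingOracle(Deque<String> list) {
        return new ArrayList<>(list).equals(List.of("cat", "dog"));
    }

    public static void main(String[] args) {
        Deque<String> wrong = new ArrayDeque<>(List.of("cat", "cat")); // faulty output
        Deque<String> right = new ArrayDeque<>(List.of("cat", "dog")); // correct output
        System.out.println(blindOracle(wrong));   // true: the blind oracle misses the fault
        System.out.println(seeingOracle(wrong));  // false: the seeing oracle reveals it
        System.out.println(seeingOracle(right));  // true
    }
}
```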
Blind tests in the RIPR model
The test reaches the fault, the fault infects the program state (an error), and the error propagates to an incorrect final output, but the test oracle does NOT reveal the failure: it never observes the incorrect part of the final output state
Categories of TA activities (revisited)
The same map of test automation (Generation, Execution, Management), now annotated by status: some techniques are used in industry, some are active research, and some are both
Four challenges and problems in test automation
TA challenges and problems (1): Education and scalability
- Most software developers learned very little about testing in their education
- Testing courses are rare at universities
- A two-week overview in a general software engineering course is not enough
- Google sends new developers to a six-week "testing boot camp"
Why do students major in Computer Science to become Software Engineers?
TA challenges and problems (2): Testing mobile apps
Many details are very different
- Test automation tools (execution) are very slow
- Most inputs are through the screen, with funny gestures and auto-fills: challenging to automate
- Every mobile device has its own ecosystem: how do we model these ecological communities, and how do we test entanglements with other apps?
- What does a test oracle look like in a mobile app?
TA challenges and problems (3): Test oracles
TOs are still usually created by hand; we need to automate generation
Old, current, and new approaches:
- Formal specifications: lots of research, but limited use
- Aggregating unit-level TOs into system-level TOs
- Impact analysis to identify which parts of the output state to check
- Screen capturing approaches
- Machine learning to "guess" expected results?
- Leveraging TDD tests to create expected results for similar tests?
TA challenges and problems (4): Non-deterministic software
We cannot know the result a priori
Examples: games, loan eligibility, scientific modeling, non-deterministic software, AI & ML, …
- How do we build automated tests?
- Inputs are often very complicated
- Test oracles: JUnit assertions are not enough
Instead of "correct behavior," we need acceptable and ethical behavior
Takeaways
(1) Test automation has had enormous impact: test scripts, test oracles, capture-replay, JUnit, test suite management, model-based testing, agile & TDD, continuous integration, DevOps
Takeaways
(2) Good models empower test automation
[Diagram: the model-based testing pipeline, the TA categories map, and the RIPR model, all built on models such as (A & B) | (C & D)]
Takeaways
(3) We have barely started! Every advance reveals two more tasks we can automate

"Computer science amuses and fascinates us, but automation changes the world."
(paraphrasing Isaac Asimov)

Jeff Offutt
cs.gmu.edu/~offutt