Testing in practice Bart Knaack Logica

Transcript of the presentation: "Testing in practice Bart Knaack Logica"

1 Testing in practice Bart Knaack Logica Bart.Knaack@logica.com
Guest lecture: Testing in practice, Bart Knaack, Logica

2 Who am I? Bart Knaack, Senior Test Advisor, Logica, The Netherlands
15 years of experience in IT, of which 12 in testing. Roles: developer, development lead, tester, test automator, test coordinator, test manager, test advisor. Trainer in test management.

3 Paper clippings

4 Test pyramid, Testgrip, RRBT, Test Frame

5 The goal of testing is to find errors

6 Testing is trying to build confidence in a system by trying to diminish that confidence (and hopefully not succeeding). D. Graham

7 Why testing? Prevent defects during operation of the system.
Verify intended functionality. Validation vs. verification. The cost of fixing a defect (€€€) rises through the phases: definition, design, development, production.

8 Test levels in the V-model
V-model diagram: specification levels (user needs, requirements and business processes; system specification; technical design and code) on the left, with the matching test levels (acceptance test; system test and system integration test; component test and component integration test) on the right. Static testing covers the specification side and dynamic testing the execution side; test specification starts early, test execution follows later.
Speaker notes: This slide shows the positioning of white-box and black-box testing techniques in the V-model of software testing. The techniques vary from mainly white-box during the component test to black-box only during the acceptance test. Definitions according to the ISTQB Glossary: white-box testing is testing based on an analysis of the internal structure of the component or system; black-box testing is testing, either functional or non-functional, without reference to the internal structure of the component or system.

9 Glass-box testing
Also called 'white-box' or 'structural' testing. Testers have access to the system design: they can examine the design documents, view the code, and observe at run time the steps taken by algorithms and their internal data. Individual programmers often informally employ glass-box testing to verify their own code.

10 Black-box testing
Testers provide the system with inputs and observe the outputs. They see none of: the source code, the internal data, or any of the design documentation describing the system's internals.
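
To make the contrast concrete, below is a minimal black-box test sketch in Python with pytest: the test only drives a public interface and checks observable outputs against the stated rules. The module payroll and the function must_pay_premium are assumptions made for illustration, not something from the lecture.

```python
import pytest

from payroll import must_pay_premium  # assumed module and interface, for illustration only


@pytest.mark.parametrize(
    "age, income, expected",
    [
        (40, 30000, True),   # working age, income above the threshold: pays premium
        (70, 30000, False),  # older than 65: receives a pension instead of paying premium
        (40, 4000, False),   # income below €5000: exempt from premium
    ],
)
def test_premium_obligation(age, income, expected):
    # Only inputs and observable outputs are used; no internal state is inspected.
    assert must_pay_premium(age, income) is expected
```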

11 Test Techniques
Boundary value analysis, equivalence partitioning and branch coverage testing.
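
The next slides work out the first two techniques. For the third, here is an illustrative branch coverage sketch: the aim is that every outcome of every decision is executed by at least one test. The function and threshold are invented for the example, and the coverage command in the closing comment assumes the pytest-cov plugin is available.

```python
# Illustrative function with one decision and therefore two branches.
def classify_income(income: float) -> str:
    if income < 5000:        # decision point
        return "exempt"      # branch taken for low incomes
    return "premium due"     # branch taken otherwise


def test_low_income_branch():
    assert classify_income(4000) == "exempt"        # exercises the true branch


def test_high_income_branch():
    assert classify_income(30000) == "premium due"  # exercises the false branch

# Branch coverage can then be measured with a coverage tool,
# e.g. `pytest --cov --cov-branch` if pytest-cov is installed.
```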

12 Equivalence Partitioning
Requirements divide data into ranges of equivalent behavior: all people between 25 and 65 have to pay a premium for their retirement; all people older than 65 get a pension; all people earning less than €5000 do not have to pay a premium. Each of these classes needs to be tested.
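
A minimal sketch of how these classes could become test cases, with one representative value per class. The interface determine_status(age, income) and the expected labels are assumptions made for illustration; the real expectations follow from the requirements.

```python
import pytest

from payroll import determine_status  # assumed interface under test

# One representative per equivalence class: every member of a class is expected
# to behave the same, so a single value covers the whole range.
REPRESENTATIVES = [
    (20, 40000, "no premium yet"),  # younger than 25
    (45, 40000, "pays premium"),    # between 25 and 65, income of €5000 or more
    (45, 3000, "exempt"),           # income below €5000
    (80, 40000, "pension"),         # older than 65
]


@pytest.mark.parametrize("age, income, expected", REPRESENTATIVES)
def test_one_case_per_equivalence_class(age, income, expected):
    assert determine_status(age, income) == expected
```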

14 Boundary Value Analysis
Requirements contain values: all people between 25 and 65 have to pay a premium for their retirement; all people older than 65 get a pension; all people earning less than €5000 do not have to pay a premium. These boundary values form the basis for setting up test cases.
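
A sketch of the resulting test cases: each boundary value plus one step on either side of it. The interface must_pay_premium(age, income) is the same assumed one as above, and whether 25, 65 and €5000 are themselves inclusive is an assumption the real requirements would have to settle.

```python
import pytest

from payroll import must_pay_premium  # assumed interface under test

BOUNDARY_CASES = [
    (24, 10000, False),  # just below the lower age boundary
    (25, 10000, True),   # on the lower age boundary
    (65, 10000, True),   # on the upper age boundary
    (66, 10000, False),  # just above it: pension rather than premium
    (40, 4999, False),   # just below the income threshold
    (40, 5000, True),    # on the income threshold (inclusiveness assumed)
]


@pytest.mark.parametrize("age, income, expected", BOUNDARY_CASES)
def test_boundary_values(age, income, expected):
    assert must_pay_premium(age, income) is expected
```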

16 You are done testing when all possible tests have been executed

17 No Risk, No Test

18 RRBT: risks versus requirements
Matching risks with requirements and requirements with risks. Risk without a requirement: add a requirement (earlier error detection) or delete the risk (no useless testing). Requirement without a risk: adapt the risk list (better coverage) or delete the requirement (no useless development, no "frills").
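
A minimal sketch of this matching step as a two-way cross-check between a risk list and a requirements list; all identifiers and example entries are invented.

```python
# Invented example data: product risks, and requirements with the risks they cover.
risks = {
    "R1": "Premium is calculated incorrectly",
    "R2": "System is too slow under month-end load",
}

requirements = {
    "REQ-10": {"text": "Premium is due for ages 25 to 65", "covers": ["R1"]},
    "REQ-11": {"text": "Screens follow the corporate style guide", "covers": []},
}

covered_risks = {risk for req in requirements.values() for risk in req["covers"]}

# Risk without a requirement: add a requirement or delete the risk.
for risk_id in risks.keys() - covered_risks:
    print(f"Risk {risk_id} is not covered by any requirement: {risks[risk_id]}")

# Requirement without a risk: adapt the risk list or delete the requirement.
for req_id, req in requirements.items():
    if not req["covers"]:
        print(f"Requirement {req_id} maps to no risk: {req['text']}")
```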

19 Combine product risks and requirements
Analyse requirements, analyse risks, prioritize requirements, assess the impact of the risks. The MoSCoW categories must have, should have, could have and won't have translate into must test, should test, could test and won't test. Finally, define the test cases.
Speaker notes: Why do we combine the product risks and the requirements? Traditional risk-based testing took the list of requirements and attached a risk with a priority to each requirement. That says nothing about the correctness or completeness of the requirements; perhaps there are only functional requirements. We want to go a step further. First, together with the various stakeholders, we draw up an overview of the product risks and, again together with them, give each risk a priority. We deliberately use the MoSCoW model for this (based on the MoSCoW rules from system development), which prevents excessive subjectivity (relative priorities) or false certainty (a formula). Next we link the requirements to these product risks. That quickly shows whether requirements are missing, contradict each other, or are unclear, which can be a real problem especially for high-priority product risks. A clear example is performance: requirements for it are often missing, or say something like "within a reasonable time". Problems with requirements are reported to the client, who decides what happens next. Once the information is complete (as complete as agreed), the testers write the test cases, starting with the product risks that have the highest priority.
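
A small sketch of the final step: deriving a test priority from the MoSCoW priority of the product risks a requirement covers. The one-to-one mapping (must have becomes must test, and so on) is the straightforward reading of the slide; the identifiers and data are invented.

```python
# MoSCoW priority per product risk, and the corresponding test priority.
RISK_PRIORITY = {"R1": "must have", "R2": "should have", "R3": "could have"}

TEST_PRIORITY = {
    "must have": "must test",
    "should have": "should test",
    "could have": "could test",
    "won't have": "won't test",
}

MOSCOW_ORDER = ["must have", "should have", "could have", "won't have"]


def derive_test_priority(covered_risks):
    """The highest-priority risk a requirement covers decides its test priority."""
    priorities = [RISK_PRIORITY.get(risk, "won't have") for risk in covered_risks]
    highest = min(priorities, key=MOSCOW_ORDER.index)
    return TEST_PRIORITY[highest]


print(derive_test_priority(["R2", "R1"]))  # -> must test
print(derive_test_priority(["R3"]))        # -> could test
```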

20 Quality attributes (ISO 9126)
Functionality: suitability, accuracy, interoperability, security, functionality compliance.
Reliability: maturity, fault tolerance, recoverability, reliability compliance.
Usability: understandability, learnability, operability, attractiveness, usability compliance.
Efficiency: time behavior, resource utilization, efficiency compliance.
Maintainability: analyzability, changeability, stability, testability, maintainability compliance.
Portability: adaptability, installability, co-existence, replaceability, portability compliance.

21 Types of testing
Functional testing, usability testing, performance testing, stress testing, penetration testing.
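
As a very small illustration of the performance angle, here is a sketch that checks one operation against an assumed response-time budget. The interface, budget and repetition count are placeholders; real performance and stress tests would use dedicated tooling and realistic load profiles.

```python
import time

from payroll import determine_status  # assumed interface under test

BUDGET_SECONDS = 0.05  # assumed response-time requirement per call
REPETITIONS = 100


def test_average_response_time_within_budget():
    start = time.perf_counter()
    for _ in range(REPETITIONS):
        determine_status(45, 40000)
    average = (time.perf_counter() - start) / REPETITIONS
    assert average < BUDGET_SECONDS
```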

22 Heuristic evaluation (Nielsen)
Visibility of system status; match between system and the real world; user control and freedom; consistency and standards; error prevention; recognition rather than recall; flexibility and efficiency of use; aesthetic and minimalist design; help users recognize, diagnose, and recover from errors; help and documentation.

23 Most of the time spent in testing is wasted effort.

24 Quality is always under pressure
Speaker notes: Title: testing is often put under pressure. Objective: visualize how testing comes under pressure. Contents: the end date of a project is often fixed before the project starts (year 2000, the euro, a project whose output is input for another); the business chooses. Being first to market really matters when the initiator of a product captures 60% of the market. Because specification and development do not meet their end dates while the project end date does not move, testing comes under pressure. That can cost quality, both in how testing is done and in what gets tested. Seen through the eyes of an end user, testing is not important (not visible), so it is economized on. The difference between a fixed and a planned end date makes the difference for test management: pro-active means that whatever happens, time and money are fixed (fixed date); reactive means you have a certain amount of money and time (planned date). In reality the last option is difficult to manage. Do the most important things first, so priority is very important. Tips: ask the trainees whether this is a known problem, to start a discussion; ask whether they would allow this and how they would prevent it; relate it to the V-model: starting early may result in less pressure. Animation: press the mouse button once to see the line move to the right as the pressure grows. Time: 5 minutes. Diagram phases: planning and specification, development, testing.

25 Tasks of a tester
Requirements analysis, risk analysis, test case preparation, stakeholder involvement, test execution, bug reporting, bug-fix meetings, retesting, reporting.

