All About Software Testing - A Primer

Anybody can test!

Saturday, March 14, 2015

Bug or Defect Life Cycle

What is a defect Life Cycle?

The defect life cycle is the series of states a bug or defect passes through, from the moment it is found until it is finally closed. The cycle includes:

1.   New
2.   Assigned
3.   Open
4.   Fixed
5.   Pending
6.   Retest
7.   Verified
8.   Reopen
9.   Closed
10. Duplicate
11. Rejected
12. Deferred
13. Not A Bug
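The states above can be sketched as a simple state machine. The following Python sketch is hypothetical: the state names follow the list above, but the allowed transitions are an assumption, since real trackers such as Bugzilla or Jira define their own workflows.

```python
# Hypothetical sketch of a defect life cycle as a state machine.
# State names follow the list above; the transition table is an
# assumption, not any specific bug tracker's workflow.
TRANSITIONS = {
    "New":       {"Assigned", "Rejected", "Duplicate", "Deferred", "Not A Bug"},
    "Assigned":  {"Open"},
    "Open":      {"Fixed", "Pending", "Rejected", "Deferred", "Not A Bug"},
    "Fixed":     {"Retest"},
    "Pending":   {"Open"},
    "Retest":    {"Verified", "Reopen"},
    "Verified":  {"Closed"},
    "Reopen":    {"Assigned"},
    "Closed":    {"Reopen"},
    "Duplicate": set(),
    "Rejected":  set(),
    "Deferred":  {"Assigned"},
    "Not A Bug": set(),
}

def move(state, new_state):
    """Return the new state if the transition is allowed, else raise."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state

# Walk one happy path from discovery to closure.
state = "New"
for nxt in ["Assigned", "Open", "Fixed", "Retest", "Verified", "Closed"]:
    state = move(state, nxt)
print(state)  # Closed
```

Modeling the workflow this way makes illegal jumps (say, New straight to Closed) fail loudly instead of silently corrupting a bug's history.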


Monday, June 2, 2014

GUI Testing and Checklists

GUI stands for Graphical User Interface: the layer through which the user interacts with the application using a mouse, keyboard, or other input devices.

GUI testing verifies whether your application meets graphical user interface standards and its GUI specifications.

GUI Testing can be done in 4 ways:

1. Manually
User inputs data and verifies whether application interacts as expected.

2. Record and Playback
The user records a set of actions with a record-and-playback tool such as Selenium, Microsoft Test Professional, or WinRunner, then plays the recorded actions back, usually tweaking the recorded scripts to add verifications for elements located via CSS selectors or XPath (for example, a particular image). In this approach the tester needs to monitor each step of the recorded playback.

3. Coded UI Testing
In coded UI testing the tester or developer writes code to check validation messages, navigation, UI element positions, sizes, fonts, and so on, and to generate reports on whether the coded tests passed or failed.

4. Model-Based (State-Transition) Testing
In this approach a graphical model of the system's expected behavior, such as a state transition diagram, is prepared. The application is then tested to verify that its actual behavior matches the modeled flow.
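The model-based approach can be sketched in a few lines: the expected behavior is a transition table of (screen, action) pairs, and a recorded action sequence is replayed against it. The screens and actions below are illustrative, not from any real application.

```python
# Hypothetical sketch of model-based GUI testing: expected behaviour
# is a transition table (state, action) -> next state; a sequence of
# user actions is replayed against the model. Screen and action
# names are illustrative only.
MODEL = {
    ("login", "submit_valid"):   "home",
    ("login", "submit_invalid"): "login_error",
    ("login_error", "retry"):    "login",
    ("home", "logout"):          "login",
}

def replay(start, actions):
    """Follow each action through the model; fail on undefined behaviour."""
    state = start
    for action in actions:
        key = (state, action)
        if key not in MODEL:
            raise AssertionError(f"model has no transition for {key}")
        state = MODEL[key]
    return state

print(replay("login", ["submit_invalid", "retry", "submit_valid", "logout"]))
# login
```

Any action the model does not define raises immediately, flagging either a gap in the model or unexpected application behavior.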

GUI Check Lists include:
1. Verify whether meaningful tool tips are available.
2. Verify whether proper field labels are present.
3. Verify Field value font, size, prompt message properties, alignment.
4. Verify Navigation.
5. Verify sorting of drop down values and default values.
6. Verify type ahead of drop down values.
7. Verify validation messages and appropriate icon used in message pop up dialog box.
8. Verify event coverage like click action, navigation, save and close.
9. Verify character limits and lengths.
10. Verify validations for special characters, blank values, alpha numerical values, only alphabets, only digits and so on.
11. Verify mandatory fields.
12. Verify error message, warning message, information message and validation message formats.
13. Verify hover effects, click effects, drag effects.
14. Verify hour glass when application is fetching some result.
15. Verify UI elements, alignment of elements for different standard screen resolutions.
16. Check readability, usability, user friendliness, etc.
17. Verify whether images are skewed or pixelated and that they have adequate resolution.
18. Verify scroll bars wherever necessary.
19. Verify if buttons, text fields and other UI elements are meant to be round cornered.
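Items 9 through 12 above (character limits, special-character and character-class validation, and mandatory fields) can be captured as data-driven checks. The field names and rules in this sketch are illustrative, not from any real specification.

```python
import re

# Hypothetical sketch of checklist items 9-12: length limits,
# character-class validation, and mandatory fields. The field
# names and rules below are illustrative only.
RULES = {
    "username": {"required": True,  "max_len": 20, "pattern": r"^[A-Za-z0-9]+$"},
    "zip_code": {"required": False, "max_len": 10, "pattern": r"^\d+$"},
}

def validate(field, value):
    """Return a list of validation messages (empty means the value passes)."""
    rule, errors = RULES[field], []
    if rule["required"] and not value:
        errors.append(f"{field} is mandatory")
    if value and len(value) > rule["max_len"]:
        errors.append(f"{field} exceeds {rule['max_len']} characters")
    if value and not re.match(rule["pattern"], value):
        errors.append(f"{field} contains invalid characters")
    return errors

print(validate("username", ""))         # ['username is mandatory']
print(validate("zip_code", "12a"))      # ['zip_code contains invalid characters']
print(validate("username", "tester1"))  # []
```

Keeping the rules in a table lets the same checklist drive both the manual test cases and any automated field checks.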

Saturday, December 7, 2013

Test Scenario

Test scenarios are hypothetical stories the tester uses to predict user behavior, problems, and system complexities. A scenario can span multiple steps, whereas a test case typically covers a single step; in this sense test scenarios differ from test cases.

The flow of documentation is as follows:

  • Use Case can be written using Functional Specification.
  • Test Scenario document can be written using Use Case or Functional specification or wire frame.
  • Test Cases document can be written using Test Scenario or Use Cases document.
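The flow above, where one scenario expands into several single-step test cases, can be sketched as data. The scenario, IDs, and steps below are illustrative, not from a real project.

```python
# Hypothetical sketch of the documentation flow: one test scenario
# (written from a use case) expands into single-step test cases.
# Scenario content, IDs, and steps are illustrative only.
scenario = {
    "id": "TS-01",
    "title": "User logs in to the application",
    "source": "UC-01 (use case: Login)",
    "steps": [
        "Open the login page",
        "Enter valid credentials and submit",
        "Verify the home page is displayed",
    ],
}

# Each scenario step becomes one test case.
test_cases = [
    {"id": f"TC-{i:02d}", "scenario": scenario["id"], "step": step}
    for i, step in enumerate(scenario["steps"], start=1)
]

for tc in test_cases:
    print(tc["id"], "-", tc["step"])
```

Carrying the scenario ID into each test case preserves traceability back to the use case or functional specification.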


|   Introduction to Software Testing   |   Roles and Responsibilities of a Software Tester   |   What is a Test Case   |   Software Testing types and Methods   |   STLC Process   |   Hierarchy Chart   |   Differences between Desktop application testing, Client Server application testing, and Web Based application testing   |   Most Common Interview Questions   |   Resume Preparation Tips   |   SDLC Models   |   Blog Index   |   Software Testing FAQs   |

Thursday, January 31, 2013

Severity and Priority Differences


Whenever testers report bugs to developers, they assign a "Severity" and a "Priority" to each bug so that developers can understand its seriousness and schedule the fix accordingly.

Severity: indicates the risk factor/impact of a bug on the application under test. Severity may be classified into the following types.

 Severity Types:

1) Blocker: A bug that prevents further testing or development of the software.
2) Critical: A bug that causes the application under test (AUT) to hang, or causes you to lose data.
3) Major: A major piece of functionality is not working or is broken.
4) Normal: A bug that should be fixed in the normal course of development.
5) Minor: A bug that causes a minor loss of function and has an easy workaround.
6) Trivial: A cosmetic problem, such as a typo or minor UI issue.
7) Enhancement: A request for new functionality or a suggestion.

Priority: indicates how urgently a bug should be fixed and is the main factor in scheduling bug fixes. A bug with high severity may still have low priority.

 Priority Types:

1) Blocker: The product cannot be developed further unless this bug is fixed.
2) Immediate: This bug should be fixed as soon as possible.
3) High: Fix it so that it does not cause shipping to be delayed.
4) Normal: Fix it after all the higher-priority bugs are done.
5) Low: Fix it so that the product is polished and properly finished.
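The scheduling role of priority (with severity as a tiebreaker) can be sketched as a sort over the bug backlog. The rank values and sample bugs below are illustrative assumptions, not from any tracker.

```python
# Hypothetical sketch: ordering a bug backlog by priority first,
# then severity, as described above. Rank values and sample bugs
# are illustrative only.
PRIORITY = {"Blocker": 0, "Immediate": 1, "High": 2, "Normal": 3, "Low": 4}
SEVERITY = {"Blocker": 0, "Critical": 1, "Major": 2, "Normal": 3,
            "Minor": 4, "Trivial": 5, "Enhancement": 6}

bugs = [
    {"id": 101, "severity": "Critical", "priority": "Low"},       # high severity, low priority
    {"id": 102, "severity": "Trivial",  "priority": "Immediate"}, # low severity, high priority
    {"id": 103, "severity": "Major",    "priority": "High"},
]

schedule = sorted(bugs, key=lambda b: (PRIORITY[b["priority"]], SEVERITY[b["severity"]]))
print([b["id"] for b in schedule])  # [102, 103, 101]
```

Note how bug 101 (Critical severity, Low priority) is fixed last: priority, not severity, drives the schedule.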


Monday, January 28, 2013

Desktop, Client Server, and Web Applications

What is the difference between Desktop application testing, Client Server application testing, and Web Based application testing?

In desktop application testing, the application runs on personal computers or workstations and the main focus is on a particular OS/environment. In this approach the application is tested across areas such as functionality, GUI, load, performance, and the back end.

In a client-server application, two dissimilar components have to be tested: the application itself is loaded on a single server machine, while the application executable (".exe") is installed on every client machine. In client-server testing, testers usually cover the GUI on both sides, functionality, client-server integration, the back end, and load and performance. This environment is generally used on intranet networks.

Web-based applications are different again, and more complicated to test, because testers have little control over the application: it is loaded on a web server whose location may or may not be known, and no executable is installed on the client machine. Tests are carried out on different web browsers and OS platforms. In brief, a web application is mainly tested for browser and operating system compatibility, error handling, functionality, static pages, links, risk factors, back-end behavior, and load.


Friday, December 28, 2012

Agile SDLC Model

  • Speeds up or bypasses one or more life cycle phases.
  • Usually less formal and reduced in scope.
  • Used for time-critical applications.
  • Used in organizations that employ disciplined methods.

Some Agile Methods:

  • Adaptive Software Development (ASD)
  • Feature Driven Development (FDD)
  • Crystal Clear
  • Dynamic Systems Development Method (DSDM)
  • Rapid Application Development (RAD)
  • Scrum
  • Extreme Programming (XP)
  • Rational Unified Process (RUP)



Spiral Model

Adds risk analysis and 4GL/RAD prototyping to the waterfall model.
Each cycle involves the same sequence of steps as the waterfall process model.

Typical Activities:
  • Create a design
  • Review design
  • Develop code
  • Inspect code
  • Test product

Spiral Model Strengths:
  • Provides early indication of insurmountable risks, without much cost.
  • Users see the system early because of rapid prototyping tools.
  • Critical high-risk functions are developed first.
  • The design does not have to be perfect.
  • Users can be closely tied to all life-cycle steps.
  • Early and frequent feedback from users.
  • Cumulative costs assessed frequently.

Spiral Model Drawbacks:
  • Time spent evaluating risks can be too large for small or low-risk projects.
  • Time spent planning, resetting objectives, doing risk analysis, and prototyping may be excessive.
  • The model is complex.
  • Risk assessment expertise is required.
  • Spiral may continue indefinitely.
  • Developers must be reassigned during non-development phase activities.
  • May be hard to define objective, verifiable milestones that indicate readiness to proceed through the next iteration.

When to use Spiral Model:
  • When creation of a prototype is appropriate.
  • When costs and risk evaluation is important.
  • For medium to high-risk projects.
  • Long-term project commitment unwise because of potential changes to economic priorities.
  • Users are unsure of their needs.
  • Requirements are complex.
  • New product line.
  • Significant changes are expected.

Incremental Model


  • Constructs a partial implementation of a total system.
  • Then slowly adds increased functionality.
  • The incremental model prioritizes requirements of the system and then implements them in groups.
  • Each subsequent release of the system adds function to the previous release, until all designed functionality has been implemented.

Incremental Model Strengths:

  • Develop high-risk or major functions first.
  • Each release delivers an operational product.
  • Customer can respond to each build.
  • Uses a “divide and conquer” breakdown of tasks.
  • Lowers initial delivery cost.
  • Initial product delivery is faster.
  • Customers get important functionality early.
  • Risk of changing requirements is reduced.

Incremental Model Drawbacks:

  • Requires good planning and design.
  • Requires early definition of a complete and fully functional system to allow for the definition of increments.
  • Well-defined module interfaces are required (some will be developed long before others).
  • Total cost of the complete system is not lower.

When to use Incremental Model:

  • Risk, funding, schedule, program complexity, or need for early realization of benefits.
  • Most of the requirements are known up-front but are expected to evolve over time.
  • A need to get basic functionality to the market early.
  • On projects which have lengthy development schedules.
  • On a project with new technology.

Thursday, December 27, 2012

RAD - Rapid Application Development Model

  • Requirements planning phase – a workshop using structured discussion of business problems.
  • User description phase – automated tools capture information from users.
  • Construction phase – productivity tools, such as code generators and screen generators, used inside a time-box (“do until done”).
  • Cutover phase – installation of the system, user acceptance testing, and user training.

RAD - Strengths:

  • Reduced cycle time and improved productivity with fewer people mean lower costs.
  • Time-box approach mitigates cost and schedule risk.
  • Customer involvement throughout the complete cycle minimizes the risk of not achieving customer satisfaction and business needs.
  • Focus moves from documentation to code.
  • Uses modeling concepts to capture information about business, data, and processes.

RAD - Drawbacks:

  • Accelerated development process must give quick responses to the user.
  • Risk of never achieving closure.
  • Hard to use with legacy systems.
  • Requires a system that can be modularized.
  • Developers and customers must be committed to rapid-fire activities in an abbreviated time frame.

When To Use RAD:

  • Reasonably well-known requirements.
  • User involved throughout the life cycle.
  • Project can be time-boxed.
  • Functionality delivered in increments.
  • High performance not required.
  • Low technical risks.
  • System can be modularized.



V-Model / V-Shaped Model

The V-Model is a variant of the waterfall model that emphasizes verification and validation of the product. Testing of the product is planned in parallel with each corresponding development phase.


  • Project and Requirements Planning: allocate resources.
  • Product Requirements and Specification Analysis: complete specification of the software system.
  • Architecture or High-Level Design: defines how the software functions fulfill the design.
  • Detailed Design: develop algorithms for each architectural component.
  • Coding: transform algorithms into software.
  • Unit Testing: check that each module acts as expected.
  • Integration Testing: check that modules interconnect correctly.
  • System and Acceptance Testing: check the entire software system in its environment.
  • Production, Operation and Maintenance: provide for enhancements and corrections.
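The unit testing step, checking that each module acts as expected in isolation, can be illustrated with a minimal example. The module under test and its test case below are hypothetical.

```python
import unittest

# A hypothetical module under test: the "unit" here is one function.
def apply_discount(price, percent):
    """Return the price after deducting a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

# Unit tests: check that the module acts as expected, in isolation.
class ApplyDiscountTest(unittest.TestCase):
    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Saved as a file, such a test case runs with `python -m unittest`; the same assertion-per-behavior pattern scales from one function to whole modules.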

V-Model Strengths:

  • Emphasizes planning for verification and validation of the product in the early stages of product development.
  • Each deliverable is easily testable.
  • Project management can track progress by milestones.
  • Easy to use.


V-Model Drawbacks:

  • Does not easily handle concurrent events.
  • Does not handle iterations or phases.
  • Does not easily handle dynamic changes in requirements.
  • Does not contain risk analysis activities.

When to use V-Model:

  • Excellent choice for systems requiring high reliability.
  • All requirements are known up-front.
  • When it can be modified to handle changing requirements beyond the analysis phase.
  • Solution and technology are known.


Topic: SDLC Model.


