Common Mistakes in Logged Bugs by Tester regarding a mobile phone app


While writing a defect report, it is important that the proper terms and terminology are used, so that both the developer and the reporter have the same understanding of the defect.
Listed below are a few common mistakes (along with their corrections) that are made while writing defect reports.
  • Never write ‘click’ in the defect report of a mobile application. Instead, write ‘Tap’ or ‘Double Tap’, as these are the correct terms.
  • A mobile application has ‘screens’ and not ‘pages’. Never use the term ‘page’ in the defect report of a mobile application as only web applications have ‘pages’.
         
  • The ‘keyboard’ that only contains numeric values is called ‘Keypad’. If this ‘Keypad’ is used for dialing numbers, it will be referred to as ‘Dial pad’. So always use the specific terms while reporting the defect.
  • There are two screen views: i) Landscape, ii) Portrait. Use these terms instead of writing ‘horizontal screen’ and ‘vertical screen’ for Landscape and Portrait respectively.
  • ‘Drag and drop’ is a term that is used for web applications. For mobile applications, ‘slide to rearrange’ should be used.
  • ‘Slide’ or ‘Swipe’ should be used when moving down the application screen, instead of ‘scroll’.
  • Particularly on iPhone, there is a three-line button at the top that is used to show/hide the side menu. This button is called the ‘hamburger’ button, though in defect reports it is mostly referred to as the ‘menu icon’ or ‘side menu button’.
  • Using the following gesture terms correctly will also make it easier for the developer to understand the defects properly.
  • Tap: Opens or launches whatever you tap.
  • Double Tap: Zooms in or out in stages.
  • Pan: Moves through screens or menus at a controlled rate.
  • Flick: Scrolls rapidly through menus or pages or moves sideways in hubs.
  • Pinch: Zooms gradually out of a website, map or picture.
  • Stretch: Zooms gradually in a website, map or picture.
  • Rotate: Moves a picture or other item(s) on the screen in a circular direction (clockwise or counter-clockwise).

Coded UI HTML Report Generator Utility API


I recently created a Coded UI HTML reporter for generating an HTML report of test execution.

You can download it from the link; it generates a report like the one in the image.

How to use:
Store the result of every test case in a data table and, at the end, render that table into an HTML report.
Create a method that calls the functions of the report-generation API:
public static void generateHTMLReport()
{
    string nameofTC = null, process = null, status = null, curDay = null;
    string ex = null;
    DateTime thisDay = DateTime.Today;
    string curDay2 = thisDay.ToString();
    // reportname and resultLOG are class-level fields populated during the test run
    reportname = "Coded UI script execution report " + curDay2;
    GenerateReports.Report.sReportName = "C:\\EOR data\\Results\\ReportGeneration.html";
    GenerateReports.Report.CreateReportHTML();
    foreach (DataRow row in resultLOG.Rows)
    {
        nameofTC = row["1"].ToString();
        process  = row["2"].ToString();
        status   = row["3"].ToString();
        curDay   = row["4"].ToString();
        ex       = row["5"].ToString();
        GenerateReports.Report.AppendReportHTML(nameofTC, process, status,
            "This test case was executed at: " + curDay +
            ". If the script failed, the Coded UI exception log is: " + ex);
    }
    GenerateReports.Report.CloseReportHTML();
}

GUI Testing Checklist


The purpose of this GUI testing checklist is to help you understand how your application can be tested against known and understood GUI standards. This checklist can give some guidance to both the development and QE teams. The development team can make sure that during development they follow guidelines related to compliance, aesthetics, navigation, etc., but the onus of testing the GUI is on the QE team, and as a tester it is your responsibility to validate your product against the GUI standards followed by your organization. This checklist can help ensure that all the GUI components are thoroughly tested. In the first part of this checklist, we will cover Windows compliance standards and some test ideas for field-specific tests.
Windows Compliance Standards
These compliance standards are followed by almost all Windows-based applications. Any variance from these standards can inconvenience the user, so they should be followed for every application. These compliances can be categorized according to the following criteria:
      i.        Compliance for each application
                     a.        Application should be started by double clicking on the icon.
                     b.        Loading message should have information about application name, version number, icon etc.
                     c.        Main window of application should have same caption as the icon in the program manager.
                     d.        Closing of the application should result in “Are you sure?” message.
                     e.        Behaviour for starting application more than once must be specified.
                       f.        Try to start the application while it is loading.
                      g.        Whenever the application is busy, it should show an hourglass or some other mechanism to notify the user that it is processing.
                     h.        Normally F1 button is used for help. If your product has help integrated, it should come by pressing F1 button.
                      i.        Minimize and restoring functionality should work properly
     ii.        Compliance for each window in the application
                     a.        Window caption for every application should have application name and window name. Specially, error messages.
                     b.        Title of the window and information should make sense to the user.
                     c.        If screen has control menu, use the entire control menu like move, close, resize etc.
                     d.        Text present should be checked for spelling and grammar.
                     e.        If tab navigation is present, TAB should move focus in forward direction and SHIFT+TAB in backward direction.
                      f.        Tab order should be left to right and top to bottom within a group box.
                      g.        If focus is present on any control, it should be indicated by dotted lines around it.
                     h.        User should not be able to select greyed or disabled control. Try this using tab as well as mouse.
                      i.        Text should be left justified
                      j.        In general, all the operations should have corresponding key board shortcut key for this.
                     k.        All tab buttons should have distinct letter for it.
    iii.        Text boxes
                     a.        Move mouse to textbox and it should be changed to insert bar for editable text field and should remain unchanged for non-editable text field.
                      b.        Test overflowing the textbox by inserting as many characters as you can in the text field. Also test the width of the text field by entering all capital ‘W’s.
                     c.        Enter invalid characters, special characters and make sure that there is no abnormality.
                     d.        User should be able to select text using Shift + arrow keys. Selection should be possible using mouse and double click should select entire text in the text box.
    iv.        Radio Buttons
                     a.        Only one should be selected from the given option.
                     b.        User should be able to select any button using mouse or key board
                     c.        Arrow key should set/unset the radio buttons.
     v.        Check boxes
                     a.        User should be able to select any combination of checkboxes
                     b.        Clicking mouse on the box should set/unset the checkbox.
                     c.        Spacebar should also do the same
    vi.        Push Buttons
                     a.        All buttons except OK/Cancel should have a letter access to them. This is indicated by a letter underlined in the button text.  The button should be activated by pressing ALT
                     b.        Clicking each button with mouse should activate it and trigger required action.
                     c.        Similarly, after giving focus SPACE or RETURN button should also do the same.
                     d.        If there is any Cancel button on the screen, pressing Esc should activate it.
   vii.        Drop down list boxes
                     a.        Pressing the arrow should give list of options available to the user. List can be scrollable but user should not be able to type in.
                     b.        Pressing Ctrl-F4 should open the list box.
                     c.        Pressing a letter should bring the first item in the list starting with the same letter.
                     d.        Items should be in alphabetical order in any list.
                     e.        Selected item should be displayed on the list.
                      f.        There should be only one blank space in the dropdown list.
  viii.        Combo Box
                     a.        Similar to the list mentioned above, but user should be able to enter text in it.
    ix.        List Boxes
                     a.        Should allow single select, either by mouse or arrow keys.
                     b.        Pressing any letter should take you to the first element starting with that letter
                      c.        If there is a View/Open button, double-clicking on an item should be mapped to the same behaviour.
                     d.        Make sure that all the data can be seen using scroll bar.

Web Application Testing Checklist


1. FUNCTIONALITY
1.1 LINKS

1.1.1 Check that the link takes you to the page it said it would.
1.1.2 Ensure to have no orphan pages (a page that has no links to it)
1.1.3 Check all of your links to other websites
1.1.4 Are all referenced web sites or email addresses hyperlinked?

1.1.5 If some pages have been removed from the site, set up a custom 404 page that redirects visitors to the home page (or a search page) when they try to access a page that no longer exists.
1.1.6 Check all mailto links and verify that they open the mail client correctly.

1.2 FORMS

1.2.1 Acceptance of invalid input
1.2.2 Optional versus mandatory fields
1.2.3 Input longer than field allows
1.2.4 Radio buttons
1.2.5 Default values on page load/reload(Also terms and conditions should be disabled)
1.2.6 Can command buttons be used for hyperlinks and Continue links?
1.2.7 Is all the data inside combo/list boxes arranged in chronological order?
1.2.8 Are all of the parts of a table or form present? Correctly laid out? Can you confirm that selected texts are in the “right place”?
1.2.9 Does a scroll bar appear if required?

1.3 DATA VERIFICATION AND VALIDATION

1.3.1 Is the Privacy Policy clearly defined and available for user access?
1.3.2 At no point should the system behave unexpectedly when invalid data is fed in.
1.3.3 Check to see what happens if a user deletes cookies while in site
1.3.4 Check to see what happens if a user deletes cookies after visiting a site

2. APPLICATION SPECIFIC FUNCTIONAL REQUIREMENTS
2.1 DATA INTEGRATION

2.1.1 Check the maximum field lengths to ensure that there are no truncated characters.
2.1.2 If numeric fields accept negative values can these be stored correctly on the database and does it make sense for the field to accept negative numbers?
2.1.3 If a particular set of data is saved to the database, check that each value gets saved fully, i.e. beware of truncation of strings and rounding of numeric values.

2.2 DATE FIELD CHECKS

2.2.1 Assure that leap years are validated correctly & do not cause errors/miscalculations.
2.2.2 Assure that Feb. 28, 29, 30 are validated correctly & do not cause errors/ miscalculations.
2.2.3 Is the copyright updated for all the sites, including Yahoo co-branded sites?

2.3 NUMERIC FIELDS

2.3.1 Assure that lowest and highest values are handled correctly.
2.3.2 Assure that numeric fields with a blank in the first position are processed or reported as an error.
2.3.3 Assure that fields with a blank in the last position are processed or reported as an error.
2.3.4 Assure that both + and – values are correctly processed.
2.3.5 Assure that division by zero does not occur.
2.3.6 Include value zero in all calculations.
2.3.7 Assure that upper and lower values in ranges are handled correctly (using boundary value analysis).
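
A minimal Java sketch of these range checks (the 1–100 limits and the `accepts` helper are hypothetical, for illustration only): boundary value analysis probes the lowest and highest legal values and their immediate neighbours.

```java
// Hypothetical numeric field accepting values in the range 1..100.
// BVA probes each boundary plus the value just inside and just outside it.
public class RangeCheck {
    static boolean accepts(int value) {
        return value >= 1 && value <= 100;
    }

    public static void main(String[] args) {
        int[] probes = {0, 1, 2, 99, 100, 101}; // boundaries and their neighbours
        for (int v : probes) {
            System.out.println(v + " -> " + accepts(v));
        }
    }
}
```

A field that silently truncates or mishandles 0 or 101 while accepting 1 and 100 would be caught by exactly these probes.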

2.4 ALPHANUMERIC FIELD CHECKS

2.4.1 Use blank and non-blank data.
2.4.2 Include lowest and highest values.
2.4.3 Include invalid characters & symbols.
2.4.4 Include valid characters.
2.4.5 Include data items with first position blank.
2.4.6 Include data items with last position blank.

3. INTERFACE AND ERROR HANDLING
3.1 SERVER INTERFACE

3.1.1 Verify that communication is done correctly, web server-application server, application server-database server and vice versa.
3.1.2 Compatibility of server software, hardware, network connections

3.2 EXTERNAL INTERFACE

3.2.1 Have all supported browsers been tested?
3.2.2 Have all error conditions related to external interfaces been tested when external application is unavailable or server inaccessible?

3.3 INTERNAL INTERFACE

3.3.1 If the site uses plug-ins, can the site still be used without them?
3.3.2 Can all linked documents be supported/opened on all platforms (i.e. can Microsoft Word be opened on Solaris)?
3.3.3 Are failures handled if there are errors in download?
3.3.4 Can users use copy/paste functionality? Is it allowed in password/CVV/credit card number fields?
3.3.5 Are you able to submit unencrypted form data?

3.4 ERROR HANDLING AND RECOVERY

3.4.1 If the system does crash, are the re-start and recovery mechanisms efficient and reliable?
3.4.2 If we leave the site in the middle of a task does it cancel?
3.4.3 If we lose our Internet connection does the transaction cancel?
3.4.4 Does our solution handle browser crashes?
3.4.5 Does our solution handle network failures between Web site and application servers?
3.4.6 Have you implemented intelligent error handling (from disabling cookies, etc.)?

4. COMPATIBILITY
4.1 BROWSERS

4.1.1 Is the HTML version being used compatible with appropriate browser versions?
4.1.2 Do images display correctly with browsers under test?
4.1.3 Verify the fonts are usable on any of the browsers
4.1.4 Are Java code/scripts usable by the browsers under test?
4.1.5 Have you tested Animated GIFs across browsers?

4.2 VIDEO SETTINGS

4.2.1 Screen resolution (check that text and graphic alignment still work, fonts are readable, etc.) at 1024×768, 800×600, 640×480 pixels, etc.
4.2.2 Colour depth (256, 16-bit, 32-bit)


4.3 CONNECTION SPEED

4.3.1 Does the site load in the viewer’s browser within 8 seconds?

4.4 PRINTERS

4.4.1 Text and image alignment
4.4.2 Colours of text, foreground and background
4.4.3 Scalability to fit paper size
4.4.4 Tables and borders
4.4.5 Do pages print legibly without cutting off text?

User Interface Testing Checklist
1. USER INTERFACE
1.1 COLORS

1.1.1 Are hyperlink colors standard?
1.1.2 Are the field backgrounds the correct color?
1.1.3 Are the field prompts the correct color?
1.1.4 Are the screen and field colors adjusted correctly for non-editable mode?

1.1.5 Does the site use (approximately) standard link colors?
1.1.6 Are all the buttons in a standard format and size?
1.1.7 Is the general screen background the correct color?
1.1.8 Is the page background (color) distraction free?

1.2 CONTENT

1.2.1 All fonts to be the same
1.2.2 Are all the screen prompts specified in the correct screen font?
1.2.3 Does content remain if you need to go back to a previous page, or if you move forward to another new page?
1.2.4 Is all text properly aligned?
1.2.5 Is the text in all fields specified in the correct screen font?
1.2.6 Are all the headings left-aligned?
1.2.7 Does the first letter of the second word appear in lowercase? E.g.:

1.3 IMAGES

1.3.1 Are all graphics properly aligned?
1.3.2 Are graphics being used the most efficient use of file size?
1.3.3 Are graphics optimized for quick downloads?
1.3.4 Assure that command buttons are all of similar size and shape, and same font & font size.
1.3.5 Banner style & size & display exact same as existing windows
1.3.6 Does text wrap properly around pictures/graphics?
1.3.7 Is it visually consistent even without graphics?

1.4 INSTRUCTIONS

1.4.1 Is all the error message text spelt correctly on this screen?
1.4.2 Is all the micro-help text (i.e. tooltips) spelt correctly on this screen?
1.4.3 Is there micro-help text (i.e. a tooltip) for every enabled field and button?
1.4.4 Are progress messages shown on load of tabbed (active) screens?

1.5 NAVIGATION

1.5.1 Are all disabled fields avoided in the TAB sequence?
1.5.2 Are all read-only fields avoided in the TAB sequence?
1.5.3 Can all screens accessible via buttons on this screen be accessed correctly?
1.5.4 Does a scrollbar appear if required?
1.5.5 Does the Tab Order specified on the screen go in sequence from Top Left to bottom right? This is the default unless otherwise specified.
1.5.6 Is there a link to home on every single page?
1.5.7 On opening a tab, focus should be on the first editable field.
1.5.8 When an error message occurs does the focus return to the field in error when the user cancels it?

1.6 USABILITY

1.6.1 Are all the field prompts spelt correctly?
1.6.2 Are fonts too large or too small to read?
1.6.3 Ensure command button and option box names are not abbreviations.
1.6.4 Assure that option boxes, option buttons, and command buttons are logically grouped together in clearly demarcated areas (“group boxes”).
1.6.5 Can the typical user run the system without frustration?
1.6.6 Do pages print legibly without cutting off text?
1.6.7 Does the site convey a clear sense of its intended audience?
1.6.8 Does the site have a consistent, clearly recognizable “look-&-feel”?
1.6.9 Can the user log in to the Member Area with both username and email ID?
1.6.10 Does the site look good at 640×480, 800×600, etc.?
1.6.11 Does the system provide or facilitate customer service, i.e. is it responsive, helpful, accurate?
1.6.12 Is all terminology understandable for all of the site’s intended users?

White Box Testing Techniques


White-Box Testing
White-box testing is testing that takes into account the internal mechanism of a system or component (IEEE, 1990).
Motivation behind White-box Testing
• Unit testing, which is testing of individual hardware or software units or groups of related units (IEEE, 1990).
• Integration testing, which is testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them (IEEE, 1990).
• Regression testing, which is selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements (IEEE, 1990).
Testing by Stubs & Drivers
A driver is a software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results (IEEE, 1990).
A stub is a computer program statement substituting for the body of a software module that is or will be defined elsewhere (IEEE, 1990), or a dummy component or object used to simulate the behavior of a real component (Beizer, 1990) until that component has been developed.
Deriving Test Cases
2.1 Basis Path Testing
Basis path testing (McCabe, 1976) is a means of ensuring that all independent paths through a code module have been tested.
Requirement (Monopoly game): if a player wants to purchase a house, he or she must land on an un-owned property cell and have enough money to buy the property. If so, the player’s money is decremented by the purchase price and the player obtains the house.
a) 1-2-7-8 (property owned, pay rent)
b) 1-2-7-9 (property owned, no money for rent)
c) 1-2-3-4-5-6 (buy house)
d) 1-2-3 (don’t want to buy)
e) 1-2-3-4 (want to buy, don’t have enough money)
Deriving Test Cases
2.2 Equivalence Partitioning
A technique that partitions the space of possible program inputs/outputs into a finite set of equivalence classes.
Guidelines for Partitioning
• Ranges: if a requirement specifies a range, then three equivalence classes are required: one valid and two invalid.
• Numbers: if a requirement specifies a number, then three equivalence classes are required: one valid and two invalid.
• Sets: if a requirement specifies a set, then two equivalence classes are required: one valid and one invalid.
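
As an illustration of the range guideline, here is a minimal Java sketch (the age field and its 18–65 limits are hypothetical, not from the original material): one valid class inside the range and two invalid classes on either side of it.

```java
// Hypothetical requirement: "age must be in the range 18..65".
// Equivalence partitioning yields one valid class and two invalid classes.
public class AgePartitioning {
    static boolean isValidAge(int age) {
        return age >= 18 && age <= 65;
    }

    public static void main(String[] args) {
        System.out.println(isValidAge(10)); // invalid class: below the range -> false
        System.out.println(isValidAge(40)); // valid class: inside the range  -> true
        System.out.println(isValidAge(70)); // invalid class: above the range -> false
    }
}
```

One representative value from each class is enough; any other value in the same class is assumed to behave identically.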
Deriving Test Cases
2.3 Boundary Value Analysis
Boundary value analysis extends equivalence partitioning by focusing attention on equivalence class boundaries.
Equivalence Partitioning / Boundary Value Analysis
Example: a person might want to buy a house, but may or may not have enough money. Considering EP/BVA, we would want to ensure our test cases include the following:
1. House costs $100, have $200 (equivalence class “have enough money”)
2. House costs $100, have $50 (equivalence class “don’t have enough money”)
3. House costs $100, have $100 (boundary value)
4. House costs $100, have $99 (boundary value)
5. House costs $100, have $101 (boundary value)
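
The five cases above can be sketched as Java calls against a hypothetical helper (`canBuy` is an assumption for illustration; it is not part of the original material):

```java
// Hypothetical helper: a purchase succeeds when funds meet or exceed the price.
public class HousePurchase {
    static boolean canBuy(int houseCost, int money) {
        return money >= houseCost;
    }

    public static void main(String[] args) {
        System.out.println(canBuy(100, 200)); // class "have enough money"       -> true
        System.out.println(canBuy(100, 50));  // class "don't have enough money" -> false
        System.out.println(canBuy(100, 100)); // boundary: exactly enough        -> true
        System.out.println(canBuy(100, 99));  // boundary: one dollar short      -> false
        System.out.println(canBuy(100, 101)); // boundary: one dollar over       -> true
    }
}
```

The boundary cases 3–5 are where off-by-one mistakes (e.g. using > instead of >=) would be caught.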
3 Control-flow/Coverage Testing
Coverage is a measure of the completeness of the set of test cases.
3.1 Method Coverage
3.2 Statement Coverage
3.3 Decision/Branch Coverage
3.4 Condition Coverage
Sample Code for Coverage Analysis
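The sample code appeared as a screenshot in the original slides. A plausible Java reconstruction, inferred from the line-3 and line-7 predicates and the test-case outcomes discussed below (the exact bodies of foo and bug are assumptions), is:

```java
// Reconstruction (assumption): pieced together from the predicates and
// test-case outcomes quoted in the coverage discussion below.
public class Coverage {
    public static int foo(int a, int b, int c, int d, float e) {
        if (a == 0) {                           // line 3: first decision point
            return 0;
        }
        int x = 0;
        if ((a == b) || ((c == d) && bug(a))) { // line 7: second decision point
            x = 1;
        }
        e = 1 / x;                              // fails (division by zero) when x is still 0
        return x;
    }

    // Deliberately faulty helper; here assumed to return true only for a == 1,
    // which is consistent with Test Cases 4 and 5 below.
    public static boolean bug(int a) {
        return a == 1;
    }
}
```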
3.1 Method Coverage
Method coverage is a measure of the percentage of methods that have been called by your test cases.
Test Case 1: the method call foo(0, 0, 0, 0, 0).
3.2 Statement Coverage
Statement coverage is a measure of the percentage of program statements that are run when your tests are executed.
Test Case 2: the method call foo(1, 1, 1, 1, 1), with an expected return value of 1. With this test case, 100% statement coverage is achieved.
3.3 Decision/Branch Coverage
Decision or branch coverage is a measure of how many of the Boolean expressions (decision points) of the program have been evaluated as both true and false during testing. There are two decision points: one on line 3 and the other on line 7.
Line 3: if (a == 0) {
Line 7: if ((a==b) OR ((c == d) AND bug(a) )) {

Line 3, (a == 0):
    True:  Test Case 1, foo(0, 0, 0, 0, 0), return 0
    False: Test Case 2, foo(1, 1, 1, 1, 1), return 1
Line 7, ((a==b) OR ((c == d) AND bug(a))):
    True:  Test Case 2, foo(1, 1, 1, 1, 1), return 1
    False: (not yet covered)

75% branch coverage.
Test Case 3 brings us to 100% branch coverage: foo(1, 2, 1, 2, 1).

Line 3, (a == 0):
    True:  Test Case 1, foo(0, 0, 0, 0, 0), return 0
    False: Test Case 2, foo(1, 1, 1, 1, 1), return 1
Line 7, ((a==b) OR ((c == d) AND bug(a))):
    True:  Test Case 2, foo(1, 1, 1, 1, 1), return 1
    False: Test Case 3, foo(1, 2, 1, 2, 1)

100% branch coverage.
3.4 Condition Coverage
Condition coverage reports the true or false outcome of each Boolean sub-expression of a compound predicate. For the line-7 predicate, the sub-expressions are (a==b), (c==d), and bug(a).

After Test Cases 1–3:
(a==b):
    True:  Test Case 2, foo(1, 1, x, x, 1), return value 1
    False: Test Case 3, foo(1, 2, 1, 2, 1), division by zero!
(c==d):
    True:  (not yet covered)
    False: Test Case 3, foo(1, 2, 1, 2, 1), division by zero!
bug(a):
    True:  (not yet covered)
    False: (not yet covered)

50% condition coverage.

Test Case 4, to exercise (c==d) as true: foo(1, 2, 1, 1, 1).
Test Case 5: foo(3, 2, 1, 1, 1).

(a==b):
    True:  Test Case 2, foo(1, 1, x, x, 1), return value 1
    False: Test Case 3, foo(1, 2, 1, 2, 1), division by zero!
(c==d):
    True:  Test Case 4, foo(1, 2, 1, 1, 1), return value 1
    False: Test Case 3, foo(1, 2, 1, 2, 1), division by zero!
bug(a):
    True:  Test Case 4, foo(1, 2, 1, 1, 1), return value 1
    False: Test Case 5, foo(3, 2, 1, 1, 1), division by zero!

100% condition coverage.
Code Coverage Tool – Java Screenshots
Cost of defect throughout the Life Cycle


GUI Testing Checklist


1. Windows Compliance Standards
2. Testers Screen Validation Checklist
3. Validation Testing – Standard Actions
Windows Compliance Testing
1.1. Application
• Start the application by double-clicking on its icon.
• Check the application main window caption.
• Closing the application should result in an “Are you sure?” message box.
• Attempt to start the application twice.
• Try to start the application twice as it is loading.
• A proper message should be shown while the application is busy.
• All screens should have a Help button; F1 should open help.
1.2. For each Window in the Application
• The window caption for every application should have the name of the application and the window name.
• If the screen has a control menu, then use all un-grayed options.
• Use TAB to move focus around the window; use SHIFT+TAB to move focus backwards.
• If a field is disabled (grayed), then it should not get focus.
• Never-updateable fields should be displayed with black text on a grey background with a black label.
• All text should be left-justified, followed by a colon tight to it.
• List boxes always have a white background with black text, whether they are disabled or not. All others are grey.
• All tab buttons should have a distinct letter.
1.3. Text Boxes
• Move the mouse cursor over all enterable text boxes; the cursor should change from an arrow to an insert bar.
• Enter text into the box.
• Try to overflow the text by typing too many characters.
• Enter invalid characters: letters in amount fields; try strange characters like +, -, * etc. in all fields.
• Test text selection in the field.
1.4. Option (Radio) Buttons
• Left and right arrows should move the ‘ON’ selection; so should up and down.
• Select with the mouse by clicking.
1.5. Check Boxes
• Click with the mouse on the box to set/unset it.
• SPACE should do the same.
1.6. Command Buttons
• If a command button leads to another screen, verify that the screen opens.
• Click each button once with the mouse; this should activate it.
• Tab to each button and press SPACE; this should activate it.
• Closing any action related to a running process must prompt with Yes/No options.
• The button should be activated by pressing ALT+letter.
Windows Compliance Testing
1.7. Drop-Down List Boxes
• Pressing the arrow should display the list of options.
• Pressing a letter should bring you to the first item in the list that starts with that letter.
• Spacing should be compatible with existing Windows spacing (Word, etc.).
• A drop-down with an item selected should open the list with the selected item at the top.
Windows Compliance Testing
1.8. Combo Boxes
• Should allow text to be entered; clicking the arrow should allow the user to choose from the list.
1.9. List Boxes
• Should allow a single selection to be chosen with the mouse.
• Pressing a letter should take you to the first item in the list starting with that letter.
• If there is a 'View' or 'Open' button beside the list box, then double-clicking a line should act as that command button.
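The letter-jump behaviour expected of list boxes can be sketched as a small helper; the function name and the sample data are illustrative, not from any real application:

```python
def first_index_starting_with(items, letter):
    """Return the index of the first item starting with `letter`
    (case-insensitive), or None if nothing matches - this mirrors the
    keyboard behaviour a Windows list box is expected to show."""
    letter = letter.lower()
    for i, item in enumerate(items):
        if item.lower().startswith(letter):
            return i
    return None

files = ["Apple", "banana", "Berry", "cherry"]
assert first_index_starting_with(files, "b") == 1   # jumps to "banana"
assert first_index_starting_with(files, "z") is None
```

Pressing the same letter repeatedly in a real list box usually cycles through all matching items; the sketch covers only the first jump.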
Tester’s Screen Validation Checklist
AESTHETIC CONDITIONS:
1. Is the general screen background the correct color?
2. Are the field prompts the correct color?
3. Are the field backgrounds the correct color?
4. Are all the screen prompts specified in the correct screen font?
5. Are all the field prompts aligned perfectly on the screen?
6. Are all the field edit boxes and group boxes aligned perfectly on the screen?
7. If the screen should be resizable or minimizable, does it behave correctly?
8. Are all the field prompts spelt correctly?
Tester’s Screen Validation Checklist
VALIDATION CONDITIONS:
1. Does a failure of validation on any field produce a sensible user error message?
2. Do any fields have multiple validation rules, and if so, are all rules applied?
3. If the user enters an invalid value, is the invalid entry identified and highlighted correctly, with an error message?
4. For all numeric fields, check whether negative numbers can be entered.
5. For all character/alphanumeric fields, check that the field's character limit exactly matches the one specified in the database.
6. Do all mandatory fields require user input?
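Conditions 1–6 above can be combined into one sketch of a field with several validation rules; the rule messages, the 10-character limit, and the no-negatives rule are illustrative assumptions:

```python
def validate_amount(value, max_len=10):
    """Apply a stack of validation rules to a hypothetical numeric
    field and return the list of error messages (empty = valid)."""
    errors = []
    if not value:
        errors.append("mandatory field: input required")
    elif len(value) > max_len:
        errors.append(f"too long: limit is {max_len} characters")
    elif not value.lstrip("-").isdigit():
        errors.append("invalid characters: numeric input only")
    elif value.startswith("-"):
        errors.append("negative values are not allowed")
    return errors

assert validate_amount("1234") == []
assert validate_amount("") == ["mandatory field: input required"]
assert validate_amount("-5") == ["negative values are not allowed"]
assert "invalid characters" in validate_amount("12a4")[0]
```

The tester's job is to confirm that each rule fires on the right input and that its message is sensible to the user, not just that some error appears.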
Tester’s Screen Validation Checklist
NAVIGATION CONDITIONS:
1. Can the screen be accessed correctly from the menu/toolbar?
2. Can a number of instances of this screen be opened at the same time, and is this correct?
Tester’s Screen Validation Checklist
USABILITY CONDITIONS:
1. Do the shortcut keys work correctly?
2. Is all date entry required in the correct format?
3. Is the cursor positioned in the first input field or control when the screen is opened?
4. When an error message occurs, does focus return to the field in error when the user cancels the message?
Tester’s Screen Validation Checklist
DATA INTEGRITY CONDITIONS:
1. Is the data saved when the window is closed?
2. If numeric fields accept negative values, can these be stored correctly?
MODES (EDITABLE / READ-ONLY) CONDITIONS:
1. Are all fields and controls disabled in read-only mode?
2. Are the screen and field colors adjusted correctly for read-only mode?
Tester’s Screen Validation Checklist
GENERAL CONDITIONS:
1. Assure the existence of the “Help” menu.
2. In drop-down list boxes, ensure that the names are not abbreviated / cut short.
3. Ensure that duplicate hot keys do not exist on any screen.
4. Ensure the proper usage of the Escape key.
5. Assure that command buttons work on each screen.
6. Assure that field labels/names are not technical labels, but rather names meaningful to system users.
7. Assure command buttons are all of similar size and shape, and use the same font and font size.
8. Assure that the color red is not used to highlight active objects (many individuals are red-green color blind).
Tester’s Screen Validation Checklist
Specific Field Tests
1. Date Field Checks
Assure that month codes 00 and 13 are validated correctly and are not accepted.
2. Numeric Fields
Assure that both + and - values, and both lower- and upper-boundary values, are handled correctly.
3. Alpha Field Checks
Use blank and non-blank data; include invalid characters and symbols.
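The date and numeric checks above can be sketched in a few lines. Month codes 00 and 13 must be rejected and 01–12 accepted; the ±9999 boundary values used for the numeric check are illustrative assumptions:

```python
def is_valid_month(code):
    """Accept only two-digit month codes 01-12; 00 and 13 must fail."""
    return code.isdigit() and 1 <= int(code) <= 12

def in_numeric_range(value, low=-9999, high=9999):
    """Boundary check for a hypothetical signed numeric field."""
    return low <= value <= high

assert not is_valid_month("00") and not is_valid_month("13")
assert is_valid_month("01") and is_valid_month("12")
assert in_numeric_range(-9999) and in_numeric_range(9999)
assert not in_numeric_range(10000)
```

The interesting test values sit exactly on and just past each boundary (00/01, 12/13, the minimum and maximum of the numeric range).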
Validation Testing – Standard Actions
On every Screen
• Add
• View
• Change
• Delete
• Continue
• Cancel
• Fill each field – Valid data
• Fill each field – Invalid data
• Different Check Box combinations
• Scroll Lists
• Help
• Fill Lists and Scroll
• Tab
• Tab Order
• Shift Tab
• Shortcut keys – Alt + F
Validation Testing – Standard Actions
• SHORTCUT KEYS / HOT KEYS
Validation Testing – Standard Actions
• CONTROL SHORTCUT KEYS
• Recommended CTRL+ Letter Shortcuts
• Suggested CTRL+ Letter Shortcuts