Requirements Traceability Matrix (RTM)


1. Requirements Traceability Matrix (RTM)
A document showing the relationship/mapping between test requirements and test cases; it is used to confirm that every requirement is covered by at least one test case.

Elements of RTM:
a. Requirement ID
b. Requirement Description
c. Test Case ID
d. Status [Open, Closed, Deferred (later), On Hold]
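
For illustration, a minimal RTM might look like the following (the requirement IDs, descriptions, and test case IDs are hypothetical examples):

Requirement ID | Requirement Description                | Test Case ID   | Status
REQ-001        | User can log in with valid credentials | TC-101, TC-102 | Closed
REQ-002        | User can reset a forgotten password    | TC-110         | Open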

2. Verification and Validation
Verification is the process of confirming that the software meets its specification. Validation is the process of confirming that it meets the user's requirements.

Difference between Verification and Validation:
Suppose you are going to buy a pair of size-9 shoes. You choose a pair and see the tag with 9 written on it. This is verification, because your requirement was a pair of size-9 shoes and the product matches that specification.

But when you try to wear them, you find that the shoes do not fit your feet. On inquiry, you find that the company tagged them as size 9 by mistake; they are actually size 7. Checking whether the shoes actually fit you is validation.

Example of Verification: Creating Traceability Matrix
Example of Validation: Executing Test Cases

3. Static and Dynamic Testing

Static black-box testing: testing the specification, without executing the software, is static black-box testing.

Two Types of Static black-box testing:

1. High-level review techniques
a. Research Existing Standards and Guidelines
b. Review and Test Similar Software

2. Low-level techniques
a. Specification Attributes Checklist (e.g. the spec must be complete, accurate, precise, consistent, etc.)
b. Specification Terminology Checklist (e.g. look for red-flag terms in the spec such as "If…Then…" with no "Else", or vague wording like "etc.", "and so forth", "and so on"); a small scanning sketch follows.
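
As a rough illustration of the Specification Terminology Checklist, the sketch below scans a specification text file for red-flag wording; the file name, the term list, and the if-without-else heuristic are assumptions made for this example, not part of any standard.

# Sketch: scan a specification text file for red-flag terms from the
# terminology checklist. "login_spec.txt" and the patterns are hypothetical.
import re

RED_FLAGS = [
    r"\betc\.?",                 # "etc." usually hides unstated requirements
    r"\band so on\b",
    r"\band so forth\b",
    r"\bif\b(?![^.]*\belse\b)",  # heuristic: an "If…Then…" sentence with no "Else"
]

def find_red_flags(path):
    findings = []
    with open(path, encoding="utf-8") as spec:
        for line_no, line in enumerate(spec, start=1):
            for pattern in RED_FLAGS:
                if re.search(pattern, line, flags=re.IGNORECASE):
                    findings.append((line_no, pattern, line.strip()))
    return findings

if __name__ == "__main__":
    for line_no, pattern, text in find_red_flags("login_spec.txt"):
        print(f"line {line_no}: matches {pattern!r}: {text}")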

Dynamic Black-Box Testing: testing the software by executing it, without knowledge of its code, is dynamic black-box testing; only the inputs and outputs are examined.
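
For illustration, the sketch below is a dynamic black-box test written with Python's unittest. The discount_price function stands in for a hypothetical unit under test; the test cases use only its specified inputs and expected outputs, with no knowledge of how it is implemented.

import unittest

def discount_price(price, percent):
    # Hypothetical unit under test: returns the price reduced by the given percentage.
    return round(price * (1 - percent / 100), 2)

class DiscountBlackBoxTest(unittest.TestCase):
    # Black-box: cases are derived from the specification (inputs/outputs only).
    def test_typical_discount(self):
        self.assertEqual(discount_price(100.0, 10), 90.0)

    def test_zero_discount(self):
        self.assertEqual(discount_price(50.0, 0), 50.0)

    def test_full_discount(self):
        self.assertEqual(discount_price(80.0, 100), 0.0)

if __name__ == "__main__":
    unittest.main()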

Static White-Box Testing: Static white-box testing is the process of carefully reviewing the software design, architecture, or code for bugs without executing it.

Three Types of Static White-Box Testing:

a. Peer Reviews: Peer Reviews are the least formal method. Peer reviews are often held with just the programmer who wrote the code and one or two other programmers or testers acting as reviewers.

b. Walkthroughs: In a Walkthrough, the programmer who wrote the code formally presents it to a small group of five or so other programmers and testers. The presenter reads through the code line by line, or function by function, explaining what the code does and why. The reviewers listen and question anything that looks suspicious.

c. Inspections: Inspections are the most formal type of review, more formalized than a walkthrough, typically with 3-8 people including a moderator, a reader, and a recorder who takes notes. The other participants are called inspectors.

Walkthrough:
1. It is a semi-formal review.
2. Attended by 2 to 7 people.
3. The author is the presenter.
4. Led by the author.
5. Reviewers are not expected to be familiar with the subject/topic in advance.

Inspection:
1. It is a fully formal review.
2. Attended by 2 to 10 or more people.
3. The author is not the presenter; someone else (the reader) presents the work.
4. Led by a moderator.
5. Reviewers are aware of and well prepared on the subject/topic.
6. A recorder notes down everything: defects, changes, improvements, etc.

Dynamic White-Box Testing: a method of testing software that exercises the internal structures or workings of an application while it runs, using knowledge of the code.

Difference between Dynamic White-Box Testing and Debugging:
The goal of dynamic white-box testing is to find bugs. The goal of debugging is to fix them.
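
By contrast with black-box testing, a dynamic white-box test is designed while reading the code, so cases can be chosen to exercise every internal branch. A minimal sketch, assuming a hypothetical shipping_fee function with two nested decision points:

import unittest

def shipping_fee(weight_kg, express):
    # Hypothetical unit under test with two internal decision points.
    if express:
        return 20.0 if weight_kg > 5 else 12.0
    return 8.0 if weight_kg > 5 else 5.0

class ShippingWhiteBoxTest(unittest.TestCase):
    # White-box: one case per internal path, chosen by reading the code (branch coverage).
    def test_express_heavy(self):
        self.assertEqual(shipping_fee(6, True), 20.0)

    def test_express_light(self):
        self.assertEqual(shipping_fee(1, True), 12.0)

    def test_standard_heavy(self):
        self.assertEqual(shipping_fee(6, False), 8.0)

    def test_standard_light(self):
        self.assertEqual(shipping_fee(1, False), 5.0)

if __name__ == "__main__":
    unittest.main()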

Desktop Application Testing


Level 1 - User Interface Testing (GUI Testing):
a. Content wording used in the application's pages/screens should be correct.

b. Text wrap-around should occur properly.

c. Instructions used in the application should be correct (i.e. if you follow each instruction, does the expected result occur?).

d. Image spacing: verify that images display properly alongside text.

Level 2 - Functional Testing
a. Check for broken links (a broken link is a hyperlink that does not work); a scripted check is sketched after this list.

b. Warning messages: user input should be validated at the system level according to business rules, and error/warning messages should be displayed to the user for incorrect inputs.

c. Resolution change effect on the application: ensure that the application's functionality and design are compatible with different screen resolutions.

d. Print: the following points must be verified

- Test the print functionality of the application when no printer is connected - the application should behave correctly (e.g. fail gracefully with a clear message) if no printer is available.

- Test the print functionality of the application when a printer is connected.

- Ensure that the application queues print jobs on the printer if the printer is out of paper.

- Ensure that a lengthy event description is not truncated in the print layout when printing a selected event.

e. Theme change: ensure the application launches successfully after the operating system theme is changed.

f. Installation Testing (Upgrade/Downgrade): verify that the application appears in the Programs and Features list after installation, and that it is removed from the Programs and Features list after uninstallation. Keep in mind that an older version of the application should not install over a newer version.

g. Testing with multiple user accounts: open Control Panel > User Accounts and add two user accounts (one Standard, one Administrator) to the system. With the application running, press Start, then Switch User to a newly created account.
Verify that the application launches and runs correctly on the newly created account. Switch back and forth between the user accounts and use the application in both. Watch for any performance decrease and check functionality.

h. Sleep: While the application is running, put the system to sleep (S3). Wake the system up after two minutes.
a) Verify the application is still running.
b) Verify there is no distortion or error.

i. Cache:
- Delete the application's cache, launch the application, and verify that it works properly.
- Delete the application's cache while the application is running and verify that it continues to work properly.
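
As referenced in point (a) above, a broken-link check can also be scripted. A minimal sketch, assuming the third-party requests and beautifulsoup4 packages and a hypothetical start URL; it checks the links found on a single page rather than crawling a whole site:

# Sketch: report broken links on one page.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

def broken_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        try:
            # Some servers reject HEAD; a GET request could be used instead.
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

if __name__ == "__main__":
    for link, status in broken_links("https://example.com/"):  # hypothetical URL
        print(f"BROKEN ({status}): {link}")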

Level 3 - Compatibility Testing
a. Test on different operating systems: some functionality in your application may not be compatible with every operating system, and newer technologies used in development (graphics libraries, API calls, etc.) may not be available on all of them.
Test your application on different operating systems such as Windows (XP, Vista, Windows 7, etc.), Unix, macOS, Linux, and Solaris, with different flavors/versions of each.
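
One way to keep OS-specific checks in a single automated suite that is run on each target operating system is to mark tests by platform. A minimal sketch, assuming pytest; the assertions are placeholders for real, application-specific checks:

# Sketch: platform-marked test cases, run unchanged on each target OS.
import sys
import pytest

@pytest.mark.skipif(sys.platform != "win32", reason="Windows-only behaviour")
def test_windows_specific_behaviour():
    assert sys.platform == "win32"  # placeholder for a Windows-specific check

@pytest.mark.skipif(sys.platform != "linux", reason="Linux-only behaviour")
def test_linux_specific_behaviour():
    assert sys.platform == "linux"  # placeholder for a Linux-specific check

@pytest.mark.skipif(sys.platform != "darwin", reason="macOS-only behaviour")
def test_macos_specific_behaviour():
    assert sys.platform == "darwin"  # placeholder for a macOS-specific check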

Level 4 - Performance Testing
a. Long period of continuous use: is the application able to run for a long period without downtime (soak/endurance testing)?


b. Memory: note down the application's average memory usage in the Comments column; see the memory-sampling sketch after this list.



c. Generate a "Power Efficiency Diagnostics Report" by running the command powercfg /energy (from an elevated/administrator command prompt; the report is saved as energy-report.html in the current directory).
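
For point (b) above, average memory usage can be sampled programmatically. A minimal sketch, assuming the third-party psutil package and a hypothetical process name MyApp.exe:

# Sketch: sample a running process's memory usage and print the average in MB.
# Requires: pip install psutil. "MyApp.exe" is a hypothetical process name.
import time
import psutil

def average_memory_mb(process_name, samples=30, interval_s=2.0):
    target = next((p for p in psutil.process_iter(["name"])
                   if p.info["name"] == process_name), None)
    if target is None:
        raise SystemExit(f"{process_name} is not running")
    readings = []
    for _ in range(samples):
        readings.append(target.memory_info().rss / (1024 * 1024))
        time.sleep(interval_s)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"Average memory usage: {average_memory_mb('MyApp.exe'):.1f} MB")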