					Application: Application X                                   -1-                                           Print Date: 05/29/12
Version:     1.00.00                                                                                       Tester:    John Doe




                                      GENERIC TEST CHECKLIST
                                                          Version 99-11-30




                                                Document Bookmarks:
                          Application:                    Application X
                          Version:                        1.00.00

                          Tester:                         John Doe
                          Company:                        Your Company
                          Group:                          Your Group

                          Printed:                        5/29/2012 7:59:00 PM



Checklist items are based predominantly on several texts (especially the bug-list appendix of Cem Kaner's book), plus experience and other sources.

I have found that when I miss a bug, it is convenient and effective to build a new check into the checklist to prevent a recurrence of the
missed bug while testing future builds or apps. (Every recognized test failure is a lesson learned; its solution is built into the
master checklist, so a mistake is not necessarily a bad thing when you can build upon it.)




Template Filename:     C:\Docstoc\Working\pdf\b0021c4d-1f8b-4e48-a2f7-17746c699210.rtf              Create Date:     11/30/1999 4:38:00 PM
Template By:           Pierce Business Systems (http://www.angelfire.com/wa/pbsystems)              FileSize:        113,428 Bytes



                                            Revision History
     Tester           Date                                        Comments




                People Responsible for App
 Test Lead:
 Tester:
 Developer:
 Developer Lead:
 Program
 Manager:


           Other People Associated with App
 Support Analyst:




                                              Brief Application Description




                                           Schedule
#                            Milestone Name                                 Sched’d     Actual
                                                                             Date        Date







Checklist Instructions
Document File Notes:
   RTF File: The document is saved as an RTF file to avoid compatibility confusion (is the doc saved as Word
    6.0, 95, or 97…and what version of Word do you have?).
   Read Only: The document has the Read Only attribute set. This is the ancient method of converting a document
    into a template: when the user saves the Read Only file, the Save As dialog pops up every time. (This was done
    because the file is saved as an .rtf.)
   Question Format: All checklist line items are written so that answers of yes are ‘a bad thing’, and answers of no
    are ‘a good thing.’

Preparation
Front End Tools
Tracking Tools:
   Bug Tracking: Testing must have a bug tracking tool. Use RAID, Access app, etc.
    Visit http://www.angelfire.com/wa/pbsystems for a free bug tracking tool.
   Test Case Manager: Used to track status of test cases, avoid duplication of efforts (re-use test cases with slight
    modification), etc. Use Access .mdb, Word .doc, etc.
    Visit http://www.angelfire.com/wa/pbsystems for a free test case management tool.
   Configuration Management: Use Visual SourceSafe (or a similar product) to save test cases, test plans, etc. Also use
    it to look for code changes, churn rate, version control, etc.

Diagnostic Tools:
   Virus Scan: Use F-Prot or similar virus scanner to check master disk sets. Scan test machines periodically.
   NT TaskManager: Tracks resource usage, threads, performance, etc.
   ExamDiff: Compare file differences. Registry dumps, dir c:\*.* /s listings, or .ini file snapshots change over time:
    take snapshots both before and after the broken state occurs, then compare. (Or use fc.exe, which comes with NT.)
    Visit http://www.nisnevich.com for the best freeware diffing tool available.
   WPS: Debug tool that lists all modules loaded into memory at a given point. (Applet comes with Windows NT.)
    Save list and WinDiff multiple lists over time, or across O.S.’s.
   PerfMon: NT Performance Monitor. Check out processor, RAM, disk access, network, and server performance. (In
    NT 4.0, type 'diskperf' at the command prompt to activate disk stats after reboot.)
   SysInfo: MSOffice app that lists all info about currently loaded modules in memory. (Comes with MSOffice.)
   Shotgun: Compares differences in registry between different snapshots over time. Does much more.
   Wmem: Checks system resources (user & GDI) and memory used by loaded apps
   Dr.Watson: debug tool that lists machine configuration, task list, module list, stack dump.
   Performance Profiling: Used to determine where bottlenecks occur in code, etc. Built-into VB4.0 Enterprise.
   Complexity Analysis: Cyclomatic complexity (McCabe's metric), etc., to determine the complexity of source
    code.
   Coverage Analysis: Ensure that testing hit all lines of code, all functions, etc.
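The snapshot-diff technique mentioned under ExamDiff can be sketched in a few lines. This is a minimal illustration, not ExamDiff itself: a snapshot is reduced to a hypothetical name-to-value dict (a parsed .ini file, a registry dump, or a directory listing would all fit), and the diff reports what was added, removed, or changed between the before and after states.

```python
# Sketch: compare two snapshots (name -> value) taken before and after a failure.
# The snapshot contents below are hypothetical examples.

def diff_snapshots(before, after):
    """Return sorted (added, removed, changed) keys between two snapshots."""
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    changed = sorted(k for k in set(before) & set(after) if before[k] != after[k])
    return added, removed, changed

before = {"win.ini": "load=", "odbc.ini": "driver=v1", "app.dll": "1.0"}
after  = {"win.ini": "load=", "odbc.ini": "driver=v2", "new.ocx": "1.0"}

added, removed, changed = diff_snapshots(before, after)
```

Whatever tool you use, the discipline is the same: capture the "known good" state first, or you will have nothing to diff against when the breakage appears.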

Automation Tools:
   Dialog Check: Runs standard battery of tests against dialog boxes.
   Smart Monkey: Excellent tools for quickly automating (they analyze app and randomly push buttons).
   Simulators: While testing modules, can use drivers or other simulators to provide inputs to functions / apps being
    tested. VB Code = good tool.
   Visual Test: Automation where appropriate (repeatable).
   Macro Recorder: Used to reproduce intermittent bugs easily without effort of Visual Test.
   Test Data Generator: Used to populate tables with test data.
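A test data generator need not be elaborate. The sketch below is an assumption of what such a tool might look like, not a specific product: it produces deterministic rows from a fixed seed so a failing run can be reproduced exactly, and the field names are hypothetical.

```python
import random

def generate_rows(n, seed=0):
    """Deterministically generate n rows for populating a test table."""
    rng = random.Random(seed)  # fixed seed => reproducible failures
    rows = []
    for i in range(n):
        rows.append({
            "id": i,
            "name": "Customer%04d" % i,
            # negatives included deliberately to exercise sign handling
            "balance": round(rng.uniform(-1000, 1000), 2),
        })
    return rows

rows = generate_rows(100)
```

Seeding is the important design choice: random data that cannot be regenerated makes every bug it finds an unreproducible bug.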






Backend Tools
   DB Native GUI Tools: Dig them up. Enterprise Manager, Security Manager, NT User Manager for Domains, etc.
   DB Command Line Tools: Dig them up. ISQL, etc.
   VB / Access Apps: Create these to test things out.
   Batch Scripts: Run long processes overnight. Business Rule Enforcement, etc.
   Schema Comparison Tool: Compare schemas between servers.
   DBA: Consult local DBA for other tools.
   Event Viewer: Look at NT Event Viewer for history of silent errors.
   Data Elements List: List of databases, tables, stored procedures, etc. with descriptions.



Project Risk / Contingency List
Many of the items here originated at http://www.azor.com/codeplan.html
Product:
   Reliability Required: Too much? Too little?
   Database Size: Too big? Too small?
   Complexity: Too complex / time consuming? Overly simplified?
   …
   …

Computer:
   Execution Time Constraint: Impossible objective given hardware platform? Too loose, users angry?
   Main Storage Constraint: Too much, unfeasible?
   Platform Volatility: OS too new? Too old?
   Development Code Turnaround Time: Builds take too long?
   …
   …

Personnel:
   Tester Capability: Any weak links?
   Automation Experience: If automation is necessary, any weak links here? Inexperienced? Spend too much time
    automating uselessly?
   Application Experience: New to app? Misunderstand or unaware of basic business rules / functionality / user
    scenarios?
   …
   …

Project:
   Latest Techniques: Still using ancient techniques / tools that are inefficient? Using techniques / tools that are
    brand new, kinks not worked out, inefficient?
   Schedule Tolerance: Is float time and resource loading too inflexible?
   …
   …








*Generic* Component Testing
Component Testing is concerned with testing the various modules that comprise the application. The Generic line
items in this section can be applied across many different applications with little modification.

Basic Functionality (Maintenance Testing Only)
Skip this entire subsection unless you are doing Maintenance Testing. Build turnaround times (install, learn, and test the app) are 2-4 days for
typical maintenance patches. These items do not apply to 95% of testers, and should be ignored by them.

Initial Research
    Existing Documentation: Dig up old manuals, test plans, test cases, materials in files, materials in Visual Source
     Safe, etc.
    Screen Shots: Where appropriate (maintenance testing, upgrades of existing products, etc.), take screen shots of
     all forms, dialogs, etc. They provide a compact view of app functionality.
    Fact Gathering: After reviewing existing documentation, go on fact gathering mission to fill in holes. Speak with
     developers and support analysts. Learn Server names, Server owners, passwords and logins, build version
     numbers, etc. that are pertinent to your testing.
    Learn App - Click Test: Click through all buttons, tabs, etc. to ensure they work properly. Work recursively from
     upper left corner through lower right of each form and dialog in app. (Objective is to learn the new app.)
    Smoke Test: Does the new build pass the standard smoke test? (Use VT4.0, Smart Monkey, Macro Recorder?,
     or Manual Tests.)


Front End:
Front End testing is concerned with testing through the application interface. This is standard black box testing.
Please note that the items in this section are listed in order of priority: items at the top are higher priority than items at the bottom. The prioritization
is based on Marv Parson's observations in a video lecture on the history of testing. His excellent observation is that, first and foremost, a
tester must be sure there are no Data Related Bugs. Second, the Setup must work. Third, there should be no 'visible' system crashes. Fourth,
everything else 'should' work (UI, Localization, etc.).

Data Testing
Accuracy / Integrity:
    Calculations - Reports: Calculation errors in reports? Wrong Data loaded? < Link to Backend Testing Section. >
    Calculations – App: Bad underlying functions? Overflow or underflow? Outdated constants? Unbalanced
     parentheses? Wrong order of operations?
    Calculations – Backend: Can tester generate queries that show calculated values from query differ from actual
     app calc’d values (in table)? (If yes, double check test query math.)
    Div by 0: Can the tester force this error condition?
    Truncate: Truncate instead of rounding precision of numbers? Incorrect formula / approximation for conversion?
    Compatibility: Work with existing data structures? (Print reports or other summaries with old and new systems.
     Compare.)
    Test Data: Generate sufficient amount to thoroughly test the app.
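The "Calculations – Backend" check above (re-deriving the app's stored values with an independent query) can be sketched with SQLite standing in for the real backend. The orders table, its columns, and the seeded bad row are all hypothetical; the point is that the SQL recomputes each total from its inputs rather than trusting the stored column.

```python
import sqlite3

# Hypothetical schema: the app stores a precomputed line total per order.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, qty INTEGER, price REAL, total REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, 2, 10.0, 20.0),   # consistent: 2 * 10.0 = 20.0
    (2, 3, 5.0, 14.0),    # seeded calculation bug: should be 15.0
])

# Independent recomputation: never trust the stored column.
mismatches = [row[0] for row in con.execute(
    "SELECT id FROM orders WHERE ABS(total - qty * price) > 1e-9")]
```

If this query flags rows, double-check the test's own math first, as the checklist warns.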

Database Connectivity:
    Save: Does it fail? Is all data saved?
    Retrieval: Does it fail? Is all data retrieved?








Installation Testing
Install Scenarios:
   Clean Machine: Does setup fail? Does app fail when run? (First wipe clean machine and reinstall. Write down
    setup specs.)
   New Bld Over Old Bld: Does setup fail? Does app fail when run? When the old app is run, does it recognize the
    need to upgrade the install (reinstall)?
   New Bld Over New Bld: Does setup fail? Does app fail?
   Uninstall: Does Start > Settings > Control Panel > Add/Remove Programs > Uninstall App fail? Does a fresh
    Install following an Uninstall fail?
   Reinstall: Does Reinstall fail?
   Install Path: Do long filenames fail? Do spaces in the path fail? Do other drives fail? Does a non-default path fail?

Version Control Tests:
   Build Number: Did wrong build number get placed on app? Wrong build number at file properties for .exe?
    Wrong build number at back end database build?
   Stomp Test (Matrix): If install other apps over this one, do newer/older DLLs/OCXs conflict?
   Install Component Variance: Any varying app components across varying install scenarios (WinDiff directories of
    multiple computers & check dates, versions, etc.)?
   Missing Required Components: Any required files missing? (Make list of all components [DLL’s, VBX’s, OCX’s,
    etc.] to compare w/what actually installed on Clean Machine.)

Misc. Install Tests:
   Stress Test: If Cancel is hit while installing, does the app crash? Does low disk space fail the install? Do disks in
    the wrong sequence fail the install?
   Icon / Workgroup: Is workgroup and/or icon not created? Is the Start button shortcut missing any necessary
    command line parms?
   Auxiliary Components: Does ODBC, or other third party supplemental products need to be installed? If yes, is it
    missing?
   DSN's Setup: Have ODBC DSN's been set up incorrectly?
   Setup Dialog: Does cancel at dialog crash setup.exe? Is there any residue left in registry? Ini’s left orphaned?
    Shortcuts (icons) left orphaned?
   Registry Changes: Did setup fail to perform all required registry changes? Did it break the registry? (Use Shotgun.exe or
    WinDiff.exe to check registry dumps before and after installs, then review the differences.) Do .ini files exist and need the
    same research (do not forget ODBC.ini and Win.ini)?


Boundary Condition Testing
Many items taken from Cem Kaner Video.


Data:
   Dataset: Max / Min Size problems?
   Numeric: Min’s / Max’s / Absurds problems?
   Alpha: Problems with ASCII 1-32? 128+? a-z? A-Z? (Remember to check 13 = Enter, 9 = Tab, 127 = Delete, etc.)
   Numerosity: Problem with number of elements in list?
   Field Size: Problems with field size (n chars, long in place of int, etc.)?
   Error Guessing: Any inputs that will be most likely to break the system that actually break it?
   Data Structures: Failure within data structures? Break at constraints for elements? (Analyze data structures for
    boundary conditions; try to break constraints.)
   Files: File Size problem?
   Timeouts: Query timeouts?
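For numeric fields, the classic boundary cases are mechanical enough to generate rather than enumerate by hand. A minimal sketch, assuming an integer field with an inclusive valid range:

```python
def boundary_values(lo, hi):
    """Just-outside, on-the-edge, and just-inside cases for an inclusive lo..hi range."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# e.g. a quantity field documented as accepting 1..100
cases = boundary_values(1, 100)
```

The two out-of-range values (lo - 1 and hi + 1) should be rejected cleanly; the other four should be accepted.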








App:
   Initial Uses: Does app fail or act peculiar at first run? Anything strange at second run?
   Loops: Boundary failure at loop counter (try mins, maxs, one less than max, etc.)?
   Repeats: If repeat same operations over and over 100,000 times, does app fail?
   Dates: Problems with min/max dates? Year 2000? Year 1899? Time boundary problems (0, 12, 24, etc.)?
   Space: Location boundary problems (too close, too far away)?
   Memory: Boundary failure in memory (not stress test)?
   Syntax: Any valid inputs that do not work? Any invalid inputs that do work?

Hardware:
   Monitors: Problems with old monitors? Too new monitors? Too old/new drivers? Color Problems?
   Hard Drive: Problems with old drives? Too new drives? Too old/new drivers? Problems with the
    number of drives? With the drive being full? With the size of the drive? Combinations of drives with CD-ROMs, etc.?
   CPU: CPU too old? Too new? Too slow? Too fast?
   Printers: Problems with old printers? Too new printers? Too old/new drivers? Color Problems? Shade
    problems at extremes?
   Miscellaneous: Mouse/trackball/touchpad too old/new? Keyboard too old/new? Pens too old/new? Drivers for
    each too old/new?
   Timeouts: Timeouts with Communications?


Environment (Multiple Configurations Hardware / Software)
Environment Matrices:
   Interoperability Matrix: Not run with Office 4.3? Office 95? Office 97? Not run with IE 3.1B? Not run with MS
    Works?
   Resource Matrix: Are there memory leaks? (Proceed through app and record system resources at various
    points, or graph depending on tool used.)
   Running With Matrix: Do Screen Savers, Virus Protection, Share, Stacker, Explore, Find, the MSOffice Toolbar,
    Exchange/Outlook, ATM, and other memory-resident applets cause problems? (Build a test matrix.) Does
    activation cause errors when switching to other running apps (Alt+Tab, Ctrl+Esc, double-clicking the taskbar icon)?
   O.S. Compatibility Matrix: App not run on Win95? Win NT 4.0? Win NT3.51? WFW? Novell Netware?
   Video Matrix: Screens look bad at varying resolution (640x480, 800x600, etc.)? How about varying colors (16,
    256, high, etc.)? How about VGA vs. SVGA? How about B/W monitors?
   Network Config’s Matrix: Not run on other computers? Fails against other Servers? If vary Network settings,
    anything break?
   ODBC Version/Make Matrix: User not able to run app with various versions and/or makes of ODBC drivers?
    (Make matrix)
   Printer Matrix: Output fail on some Printers? Fail on some Drivers? Any Configurations or Settings blow up app?
    (Be creative.)
   Input Matrix: Output fail on keyboard (standard vs. intl.)? Mouse? Pen? Trackball? Touchpad? Digital Pad?
   Disabled Matrix: Does app accommodate poor sight (preferences)? Deaf?
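The matrices above multiply out quickly, so it is worth generating the combinations rather than writing them by hand. A sketch with hypothetical axis values (substitute the real OS, video, and ODBC lists for your app):

```python
from itertools import product

# Hypothetical axes; substitute the real values from the matrices above.
oses = ["Win95", "WinNT 4.0", "WinNT 3.51"]
videos = ["640x480", "800x600"]
odbc_versions = ["2.5", "3.0"]

# Full cross-product: 3 * 2 * 2 = 12 configurations to cover.
matrix = [{"os": o, "video": v, "odbc": d}
          for o, v, d in product(oses, videos, odbc_versions)]
```

When the full cross-product is too large to run, prune it deliberately (e.g. pairwise coverage) instead of skipping cells at random.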

Typical Configuration Errors:
   Device: Wrong device? Wrong device address? Device unavailable? Device returned to wrong type of pool?
   Channel: Time out problems? Noisy channel? Channel goes down? Ignore / exceed throughput limits?
   Disk: Wrong storage device? Does not check directory of current disk? Doesn't close file? Unexpected end of
    file? Disk sector bugs? Other length- (or filesize-) dependent errors?
   Instructions / Return Codes: Wrong operation or instruction codes? Misunderstood status or return code?
    Device protocol error? Inappropriately assumes the device is / is not / should be / should not be initialized?
   Miscellaneous: Underutilizes device intelligence? Paging mechanism ignored or misunderstood?








UI Testing
Communication:
   Tool Tips & Status Bar: Missing command button help tips (yellow boxes) when the mouse pointer is in proximity?
    Missing status-bar / command-line tips and descriptions (at bottom of screen) when the mouse pointer is in proximity?
   Missing Info.: No instructions? Cursor not present? Cursor not switch to hourglass? No status box during long
    delays? States which appear impossible to exit?
   Enable/Disable Toolbar Buttons: Should toolbar buttons be enabled / disabled to provide clarity for available
    operations?
   Wrong, Misleading, Confusing Info.: Factual or spelling errors? More than one name for same feature?
    Information overload? Product Support email address or phone number invalid (Help About, splash screen)?
   Help Text & Error Messages: Inappropriate reading level? Verbose? Emotional? Inconsistent? Factual errors?
    Truncated message text? Missing Text? Duplicate words?
   Training Documents: Factual or spelling errors? Missing information? Incorrect information? Not properly
    translated? Spot check the Index: wrong page numbers? Spot check the Table of Contents: wrong page numbers?
    Wrong app name, version, or other significant datum?

Dialog Boxes:
   Keyboard: Any keys not work? (Move thru dialog with Tab, Shift-Tab and Hotkeys.)
   Mouse: Any mouse actions not work? (Click, double-click, right-mouse click, etc.)
   Cancelling: Any method unavailable? (Escape key, Close from Ctrl-Box menu, double-click Ctrl-Box, hit Win95
    X button.)
   OKing: Any method unavailable (Enter, double-click listbox item, etc.)?
   Default Buttons: No Default Button on dialog (Default at hit enter)? No Cancel Button on dialog (Default at hit
    Escape)?
   Layout Error: Wrong size? Non-standard format?
   Modal: Is the window set up improperly (modal vs. modeless)? Can the user click behind the dialog and cause problems?
   Window Buttons: Should Control menu or Min/Max buttons be visible?
   Sizable: Is window frame wrong (should dialog be fixed / sizable)?
   Title / Icon: Is dialog improperly titled? Is dialog missing or using wrong icon?
   Tab Order: Incorrect tab order (jumps all over)?
   Display Layout: Poor aesthetics? Obscured instructions? Misuse of flash and/or color? Heavy reliance on
    color? Layout inconsistent with environment? Screen too busy (need Tab controls, or toggle option buttons to
    reduce complexity/clutter)?
   Boundary Conditions: Any boundary problems? (Reference Boundary Conditions section of this checklist for all
    text boxes in every dialog.)
   Sorting: Are drop-down lists not sorted where they should be? (Check list boxes, combo boxes, menus, etc.)
   Active Window: Incoming mail kill app? Help / Quick Preview / Tip of Day / Cue Cards / Wizards / Other apps
    running in background or on top of app cause problem?
   MDI Forms: Are they unnecessary? I don't like them (unnecessarily complex).
   Memory Leak Test: Does app leak memory as user goes in and out of dialog boxes dozens of times? (Run Task
    Manager or other tool in conjunction with quick VT or macro recorder ‘script’ to check dialogs.)
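The dialog memory-leak test above boils down to: repeat the operation many times and see whether retained memory grows without bound. This sketch simulates a dialog open/close cycle in pure Python (the _cache leak is deliberate and hypothetical) and uses tracemalloc as a stand-in for Task Manager.

```python
import tracemalloc

_cache = []  # deliberate bug: a "dialog" that forgets to release what it allocated

def open_and_close_dialog(leak):
    """Stand-in for one open/close cycle of a dialog box."""
    data = [0] * 1000          # resources the dialog allocates
    if leak:
        _cache.append(data)    # leaked: reference kept after "close"
    # otherwise data is freed when the function returns

def growth_after(cycles, leak):
    """Bytes of traced memory still retained after repeating the cycle."""
    tracemalloc.start()
    base, _ = tracemalloc.get_traced_memory()
    for _ in range(cycles):
        open_and_close_dialog(leak)
    now, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return now - base

clean = growth_after(200, leak=False)
leaky = growth_after(200, leak=True)
```

The signal is the trend, not any single reading: a healthy app returns to roughly its baseline after each cycle, while a leak grows linearly with the number of repetitions.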

Command Structure:
   Time Wasters: Garden paths (deeply nested commands)? Choices that do not work or are stubs? "Are you really,
    really sure?" confirmations?
   Menus: Too complex? Too simplistic? Too many paths to the same place? Cannot get there from here? Hot key
    duplicates? Hot keys idiosyncratic (not standard with other apps)? Hot keys that do not work? Unrelated commands
    tossed under the same menu item?
   Popup Menus: Does right mouse button invoke popup menu? Should it for efficiency?
   Command Line Parameters: Forced distinction between upper/lower case? Unable to reverse parameters
    (locate anywhere in the parameter line)? Abbreviations/names that are not allowed? No batch input (for testing,
    faster runs, etc.)? Command line too complex?





   Keyboard: Failure to use cursor, edit, or function keys? Non-standard use of cursor, edit, or function keys?
    Failure to filter invalid keys at input? Failure to indicate keyboard state changes? Failure to scan for function and
    control keys?
   State Transitions: Cannot do nothing and leave? Cannot quit mid-program? Cannot stop mid-command? Cannot
    pause?



Program Rigidity:
   User Options: Cannot do what was done last time? Cannot find out what was done last time? Cannot execute a
    customizable command? Are there side effects to preference changes? Is there infinite tailorability?
   Control: Who is in control, computer or user, and is it appropriate? Is the system novice-friendly? Is it hostile to
    experienced users? Artificial intelligence and automated stupidity? Superfluous information requested? Unnecessary
    repetition of steps?
   Output: Limited to certain data or formats? Cannot redirect output? Cannot control layout? Cannot edit
    labeling (tables/graphs)? Cannot scale graphs?



Preferences:
   User Tailorability: Cannot toggle on/off noise, case sensitivity, hardware, automatic saves, etc.? Cannot
    change device initializations, scrolling speed, etc.?
   Visual Preferences: Cannot toggle: Scroll bars? Status bar (at bottom of screen)? Tool tips? Window
    maximize/minimize? Hidden windows? Default view? Dialog colors? Complex/simple menu structure?
   File: Default with the 4 most recent files? Default directory for Open/Save As?
   Localization: Defaults not matching locale? (Date/time, currency, etc. should match system defaults.)

Usability:
   Accessibility: Can users enter, navigate and exit the app easily?
   Responsiveness: Can user do what they want, when they want, easily?
   Efficiency: Can users do what they want in a minimum amount of steps and that is clear? (Wizard?)
   Comprehensibility: Do users understand product structure, help system and documentation (or too complex,
    incomplete, etc.)?
   User Scenarios: Be sure to write-up Test Cases that are User Scenarios and simulate how a user will use the
    system.
   Ease of Use: Is the app easy to learn and use?

Localization:
   Translation: Mistranslated text? Untranslated text? Error messages not translated? Text within bitmaps that
    needs to be translated (if so…oh shit)? Macro language not translated?
   English-only Dependencies: Dependencies on tools or technologies only available in English?
   Cultural Dependencies: Dependencies on concepts or metaphors only understood in English?
   UniCode: Any issues here?
   Currency: Not matching locality (British pound, Japanese yen, etc.)?
   Date/Time: Any problems with different formats for various countries?
   Constants: Any constants that vary with locality (financial / accounting equations, tax rates, etc.)? If so, how
    handle?
   Dialog Contingency: Dialogs can not be resized to 125% if translation requires additional space?


Performance Testing
   General: Slow app? Slow echoing? Poor responsiveness? No type-ahead? No warning that operation will take
    a long time?





   Benchmarks: Are there any discrepancies in performance among similar operations? Are some operations too
    slow? (Do many: for reports, query times, etc. Look for hooks in the app to log out this info. Develop automated
    processes if necessary. Write up results in a table/matrix.)
   Profiling: Are there any significant bottlenecks in performance? (Tool to detect where in code CPU time is going.)
   Modem Considerations: Graphics and Help at 300 baud? Data transfers of large sets kill system?
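Benchmarking similar operations against each other is straightforward with the standard library. In this sketch both lookup routines are hypothetical stand-ins (think indexed vs. unindexed query); note that the two must return identical results before their timings are worth comparing.

```python
import timeit

def indexed_lookup(d, keys):
    """Direct dict access: stand-in for an indexed query."""
    return [d[k] for k in keys]

def scan_lookup(d, keys):
    """Linear scan per key: stand-in for an unindexed query."""
    return [v for k in keys for kk, v in d.items() if kk == k]

data = {i: i * 2 for i in range(1000)}
keys = list(range(0, 1000, 10))

# Correctness first, then timing.
assert indexed_lookup(data, keys) == scan_lookup(data, keys)
t_indexed = timeit.timeit(lambda: indexed_lookup(data, keys), number=50)
t_scan = timeit.timeit(lambda: scan_lookup(data, keys), number=50)
```

Record the numbers in the results matrix across builds; a regression shows up as a trend, not a single slow run.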



Load Testing
   Volume Test: Fail at large size of input/output? (Try batch processing where available to speed up testing.)
   Stress Test: Fail at rapid input? Fail at preemptive input (starts before last input finished)?
   Execution Limit Test: Fail at 2 instances of app running? 5 instances? 10 instances? 40 instances? 80
    instances?
   Window Limit Test: Fail if several apps run (esp. with lots of buttons = windows)? How does the app respond? (It
    should respond with a low-memory error, not a fatal crash.)
   Storage Test: Fail when the hard drive fills up?
   Memory Issues: Fail when limited RAM exists? (Eat RAM to check app response and survivability. Limit
    other resources and run the app.)
   Resource not Returned: Doesn’t indicate when done with device? Doesn’t erase old files from disk? Doesn’t
    return unused memory? Wastes computer time?
   Prioritize Tasks: Is App failing to prioritize tasks? (Should be doing lower priority items during down times. App
    might prioritize, but never get to low priority tasks…need check to force action after XX days elapsed.)
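The stress-test items above share one expectation: under rapid or excessive input the app should degrade gracefully, never fatally. A sketch of that contract with a hypothetical fixed-capacity input buffer that rejects overflow instead of crashing:

```python
from collections import deque

class InputBuffer:
    """Fixed-capacity input queue: overflow is rejected and counted, never fatal."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = deque()
        self.dropped = 0

    def push(self, item):
        if len(self.items) >= self.capacity:
            self.dropped += 1   # graceful degradation instead of a crash
            return False
        self.items.append(item)
        return True

buf = InputBuffer(capacity=100)
accepted = sum(buf.push(i) for i in range(1000))  # rapid input burst
```

The load test then asserts the same things the checklist asks for: nothing crashed, the limit held, and the overflow was accounted for rather than silently lost.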



Error Handling
Error Preventions:
   Disaster Prevention: No backup facility? No undo? No “Are you sure” confirmations? No incremental saves?
   Version Control: Inadequate version control (at startup)?
   Initial State: Inadequate initial state validation? Not check for missing (or outdated) components?
   Input: Inadequate tests of user input? Inadequate protection against corrupt data? Inadequate tests of passed
    parameters?
   O.S. Bugs: Inadequate protection against O.S. bugs?
   Security: Inadequate protection against malicious use?
   Coverage: Errors that were not handled by programmer?

Error Response:
   Appropriateness: Inappropriate messages? Not easily understandable?
   Help: Unable to hit F1, or Help button to get further details? (Big no-no.)
   Error Detection: Ignores overflow? Ignores impossible value(s)? Ignores error flag? Ignores hardware fault or
    error conditions? Ignores data comparisons?
   Error Recovery: Lack of automatic error correction? Failure to report an error? Failure to set error flag?
    Program flow returns to wrong area? Unable to abort errors easily? Poor recovery from hardware problems? No
    escape from a missing disk?
   Error Logging: Unavailable? Not informative: lacks module name, or time/date stamp, or error number and
    name? Unable to toggle on/off easily?
   Review Error Log: Any new errors entered into error log after your testing? (Look for failed assertions and silent
    errors.)
   Audit Trail: Lack of audit trail? Not informative audit trail (history of use): lacks date/time stamps, user name,
    etc? Does it slow down system excessively? If so, is there an option to toggle logging off?
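The error-logging fields the checklist asks for (timestamp, module/logger name, severity, error number, message) map directly onto Python's logging module. The format string and the errno field below are assumptions about what a useful entry looks like, not a standard:

```python
import io
import logging

stream = io.StringIO()  # stands in for the app's log file
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter(
    "%(asctime)s %(name)s %(levelname)s err=%(errno)s %(message)s"))

log = logging.getLogger("app.audit")
log.addHandler(handler)
log.setLevel(logging.ERROR)

# The error number is supplied per-record via `extra`.
log.error("save failed", extra={"errno": 1205})
entry = stream.getvalue()
```

The "toggle on/off easily" requirement falls out for free: raising the logger's level (or removing the handler) silences it without touching the call sites.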

Miscellaneous Error Handling Issues:
   Unnecessary: Is the error even necessary? If not, why hassle the user, add to training expense, add to the workload
    of tech support, increase localization costs, etc.?





   Error Rate: Excessive error rate? (Query the data-validation audit logs. Quantify the records not promoted
    per batch.)
   User Interaction: No user options / filters for error handling? Awkward error-handling interface? No area for user
    to enter comments in response to an error for logging?
   Misc. Nuisances: Inadequate privacy or security? Obsession with security? Cannot hide menus? Cannot
    support standard O/S features? Cannot allow long filenames?

Race Conditions
   Data: Do multiple simultaneous reads/writes kill the server? (Multiple updates compete so that a successor
    begins executing on top of data for which the predecessor operation has not completed.)
   Wrong Assumptions: Assume that one event or task finished before another begins? Assume that input will not
    occur during a brief processing interval? Assume that interrupts won’t occur during brief interval? Assume that a
    person, device, or process will respond quickly?
   Prerequisite Check: Task starts before its prerequisites are met?
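The read-modify-write race described above can be demonstrated with a short sketch; a lock serializes the critical section so a successor never begins on top of an unfinished predecessor:

```python
import threading

counter = 0
lock = threading.Lock()

def worker(iterations):
    """Increment a shared counter via read-modify-write."""
    global counter
    for _ in range(iterations):
        with lock:              # remove the lock to expose the race
            temp = counter      # read
            counter = temp + 1  # write

threads = [threading.Thread(target=worker, args=(50_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With the lock held, the final value is deterministically 100,000; without it, interleaved updates intermittently lose increments, which is exactly the wrong assumption ("one event finished before another begins") flagged above.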




Security
   Logins: Do authorization rules exist? (They should, to monitor violations, etc.)
   Passwords: Are passwords forced to be changed every XX number of weeks? Are passwords forced to be
    more than 4 characters? Do passwords require mixed case? Numerals and special characters?
   Encryption: Should there be encryption at data outputs? Data transmits? Saved document/data repositories?
   Security Violation Plan: Have procedures been established to report and punish violators?
   Off-Hours: Are users limited to specified hours during the day? Should the limitation exist?
   Installation: Are security measures temporarily suspended during installation? If so, can the user stop the install
    midway and access confidential data, alter privileges, etc.?
   Security: Is device use forbidden for the user/caller? Is the wrong privilege level specified for a device?
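The password rules above translate directly into a validation check. A sketch, where the exact thresholds are assumptions taken from the checklist wording (more than 4 characters, mixed case, numerals, special characters):

```python
import re

def password_ok(pw):
    """Enforce the checklist rules: more than 4 characters, mixed
    case, and at least one numeral and one special character."""
    return (len(pw) > 4
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"\d", pw) is not None
            and re.search(r"[^A-Za-z0-9]", pw) is not None)

assert password_ok("Ab1!xyz")        # meets every rule
assert not password_ok("abcdefg1!")  # no uppercase
assert not password_ok("Ab1!")       # not more than 4 characters
```

A test pass would probe each rule independently, as above, rather than only trying one fully compliant password.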



Automated Processes
This section needs further development.
 Contingency Plan: Missing contingency plan in case automation doesn’t work? (For example, app may not be
    integrated soon enough to build automation scripts, or takes too long.)



Back End
Back End Testing is concerned with testing through the back end database system. This is also black box testing, and should be coordinated with
Front End Testing.
Please note that the items in this section are listed in order of priority. Items at the top are higher priority than items at the bottom.
This section is in its infancy, and will require significant modification in the near future.


General
   Object Dependency: Run a SQL Server dependency check on objects that changed to determine the scope of
    impacted objects, then test all impacted objects (sp_depends). Search the Front End for all references to changed
    stored procedures / tables / other objects.
   Data Integrity: Ensure DI is maintained, data validation is used, anticipate errors from manual investigation.
   Data Tracking: All transactions validated (import and output sets manually verified via Excel, etc.)
   Data Cleanup: Temp objects removed, etc.
   Data Recovery: Transaction failure should be recoverable (if rollbacks properly in place).
   Component Maintenance: Automated maintenance scripts for testing various scenarios.
   Component Stress Testing: Subject test server to high loads, etc.
   Security: Permissions, logins, etc.







Structural Back End Tests
Database Schema Tests
   Databases and Devices: Verify the database names. Verify enough space is allocated for each database. Verify
    database options settings.
   Tables, Fields, Constraints, Defaults: Verify tablespace names (tables, rollback segments, etc.). Verify field
    names for each table. Verify field types (esp. the number of characters in varchar and varchar2). Verify whether
    field allows null or not. Verify constraints on fields (min/max/etc.). Verify default values for fields. Verify table
    permissions.
   Keys and Indices: Verify every table has a primary key. Verify foreign key existence where appropriate – index,
    trigger, etc. Verify column data types between foreign keys and primary keys (should be the same). Verify
    indices: unique or not unique, bitmap or b-tree, etc.
   Stored Procedures: Verify stored procedure name. Verify whether stored procedure is installed in database.
    Verify parameter names, types and count of parameters. Run stored procedures from command line to boundary
    check the parms. Verify output of stored procedure. (Does it do what it is supposed to do? Does it avoid what it
    is not supposed to do?)
   Error Messages: Force stored procedure to fail, and check every error message s.p. generates. Are there any
    errors that do not yet have a predefined error message?
   Triggers, Update: Verify trigger name. Verify trigger assigned to correct field of correct table. Verify trigger
    updates child-table FK when SQL alters the parent-table PK. Verify rollback when an error occurs.
   Triggers, Insert: Verify trigger name. Verify trigger assigned to correct field of correct table. Verify trigger inserts
    child table FK when SQL adds parent table PK. Verify rollback when an error occurs. Try to insert record with
    existing PK.
   Triggers, Deletion: Verify trigger name. Verify trigger assigned to correct field of correct table. Verify trigger
    deletes child table FK when SQL deletes parent table PK. Verify rollback when an error occurs. Try to delete
    record with and without existing PK (child records).
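Several of the schema checks above (field names, types, lengths, nullability) can be automated by diffing an expected schema against what the database reports. A sketch over plain dictionaries standing in for an INFORMATION_SCHEMA.COLUMNS query; the table and column names are hypothetical:

```python
# Expected schema for one table, as the spec defines it (hypothetical names).
# Each entry is (type, length, nullable).
EXPECTED = {
    "CustomerID":   ("int", None, False),
    "CustomerName": ("varchar", 50, False),
    "Notes":        ("varchar", 255, True),
}

def verify_table_schema(actual):
    """Compare actual column metadata (e.g. queried from
    INFORMATION_SCHEMA.COLUMNS) against the expected schema.
    Returns a list of human-readable discrepancies."""
    problems = []
    for col, spec in EXPECTED.items():
        if col not in actual:
            problems.append(f"missing column {col}")
        elif actual[col] != spec:
            problems.append(f"{col}: expected {spec}, got {actual[col]}")
    for col in actual:
        if col not in EXPECTED:
            problems.append(f"unexpected column {col}")
    return problems
```

An empty result means the table matches the spec; anything else is a concrete, reportable schema bug (wrong varchar length, wrong nullability, extra or missing column).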

Back End Integration Tests
   General: Look for conflicts between schema, triggers, and stored procedures. Verify the Server setup script
    accommodates both setting up databases from scratch and setting up over the top of existing databases. Verify
    environment variables are defined (DOS), if applicable. Record setup time and issues encountered.

Functional Back End Tests
   Functionality: Verify every feature in backend (from Requirements, Functional Specs, and Design Spec). Verify
    data is inserted, updated, and deleted according to business rules. Look for invalid logic. Check error prevention
    and error handling mechanisms.
   Data Integrity / Data Consistency: Verify data validation before insertion, update, and deletion. Verify data
    security mechanisms are adequate. Verify Dimension or Reference tables have triggers to maintain referential
    integrity. Verify major fields across all tables do not contain invalid data / characters. Try to insert a child record
    before inserting its parent. Try to delete a record that is still referenced by other records in different tables (kill
    parent and leave orphans). Verify updates to PK cascade and update FK fields in different tables.
   Login and User Security: Verify email login security. Verify Oracle login security. Verify NT domain login
    security. Review Roles, Logins, etc. for abnormalities. Check concurrent logins (multiple simultaneous users
    logged on).

Performance
   PerfMon: A tool to monitor performance across all system resources. Set it up and use it.
   Benchmarks: Determine time required for scripts to execute, etc.
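The Benchmarks item can be sketched as a small timing harness; the workload below is a placeholder for the real script under test:

```python
import time

def benchmark(fn, repeats=5):
    """Time a callable over several runs and keep the best run,
    which filters out scheduling noise."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        timings.append(time.perf_counter() - start)
    return min(timings)

# Placeholder workload standing in for the script to be benchmarked.
best = benchmark(lambda: sum(range(100_000)))
print(f"best of 5 runs: {best:.6f}s")
```

Recording the best-of-N figure per build lets you detect performance regressions between drops rather than relying on a single noisy measurement.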






Network Failure
   At Login: Enter a bogus password or login ID. How does the system react? Reset permissions to eliminate the
    tester, then attempt login, etc.
   In Process: Pull out the network connection in the middle of a process, log off the network, etc.
   High Traffic: Set up the app (or coordinate with a busy part of the day) so it runs under high network traffic.

Server Failure
   Kill Server: Shut down server abruptly while client still running. Compare to standard shutdown.
   CPU Activity: Simulate high CPU consumption by having the server execute many other tasks so that it is busy.
   File I/O: Simulate high activity by copying large batch files, or running test util designed for this.
   Services Not Started: Run with services not started on server.
   Silent Failures: Go to server periodically, and review Event Viewer history.



Maintenance
   Walk Thru Tables: Any orphaned temp tables? Any duplicate tables? Any bogus data jumping out?
   Business Rule Enforcement Script: Any nulls in inappropriate fields? Any Foreign Key values in child table
    without matching Primary Key values in parent table? Any calculations that are not correct? (Write up test script
    consisting of multiple queries to check every business rule.)
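The orphaned-foreign-key check above is the kind of query such a script would run. A sketch over in-memory rows (a real test would issue the equivalent SQL against the database); the table and column names are hypothetical:

```python
def find_orphans(child_rows, parent_rows, fk, pk):
    """Return child rows whose foreign-key value has no matching
    primary-key value in the parent table (orphans)."""
    parent_keys = {row[pk] for row in parent_rows}
    return [row for row in child_rows if row[fk] not in parent_keys]

# Hypothetical data: order 103 references a customer that does not exist.
customers = [{"CustomerID": 1}, {"CustomerID": 2}]
orders = [
    {"OrderID": 101, "CustomerID": 1},
    {"OrderID": 102, "CustomerID": 2},
    {"OrderID": 103, "CustomerID": 9},  # orphan
]
orphans = find_orphans(orders, customers, fk="CustomerID", pk="CustomerID")
```

Running one such check per business rule, as the item suggests, turns the maintenance walkthrough into a repeatable script instead of a manual table inspection.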



Integration Testing
Integration Testing is concerned with testing the full application, from installation through all (or most) functionality.
Note that drivers and stubs should be used in place of code that is not yet written. This is standard black box testing.

Prerequisites
   Check-In: All modules should be checked in.
   Executable: The executable should be built. There should be a full setup too. Confirm pull location and build
    number.



Milestone Checks
   Regression: Regression test existing known bugs.
   Full Functionality: Cover as much functionality as possible. Run the horizontal tests (higher priority) of the app,
    moving vertically (deeper tests) where time permits. Run Installation tests, and compatibility and configuration
    tests, etc.
   Objective: Developers want zero defects—meaning that no new high priority bugs are found or high priority
    existing bugs resurface. Testers of course want to find these if they exist.



Unit Testing
Unit Testing is another name for white box or glass box testing. It is concerned with testing the app by walking
through the source code. Note that drivers and stubs should be used in place of code that is not yet written. Also note
that white box testing is typically performed by the developer, unless the project is large enough to accommodate
separate white box testers.






Code Inspections
   Coupling: How much effort is required to interconnect the components during integration (after all separate
    components are written)?
   Maintainability: How much effort is required to locate and fix bugs? Is the code flow organized and logical, and
    not overly complex?
   Comments: Thorough enough? Or excessive, to the point of merely restating the code?


Live Inspections
   Case Coverage: Trace through code one line at a time in If…ELSEIF…ELSE…ENDIF and SELECT
    CASE…CASE…END SELECT statements to check for failures. (Using the debug window to alter condition
    variable/pointer values, and Set Next Statement to repeatedly reset the execution pointer to previously executed
    lines of code, speeds up testing.)
   Error Handling Coverage: Force all errors in the error handler to occur by entering them into the debug window
    at the appropriate lines of code while tracing. (In the debug window, enter Error ### to force the error to occur.)
   Debug.Print: Use this to test values and assertions.
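The force-every-error idea carries over to any language. A sketch that injects each anticipated exception and asserts the handler classifies it; the handler and its error categories are hypothetical:

```python
def handle(error):
    """Hypothetical error handler under test: classify an exception."""
    if isinstance(error, ZeroDivisionError):
        return "MATH_ERROR"
    if isinstance(error, FileNotFoundError):
        return "MISSING_FILE"
    return "UNHANDLED"

def force_error(exc):
    """Inject an exception and route it through the handler,
    mirroring the Error ### debug-window technique."""
    try:
        raise exc
    except Exception as e:
        return handle(e)

# Force each anticipated error, plus one the handler does not cover.
results = {exc.__name__: force_error(exc())
           for exc in (ZeroDivisionError, FileNotFoundError, KeyError)}
```

The "UNHANDLED" case is the interesting one: it flags errors that reach the handler but have no predefined response, the same gap the Error Messages check looks for.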



Design Testing
Design Testing occurs prior to the start of development. This section is located down here because it unfortunately
gets less use than the other sections above. (Also, I perform maintenance testing, so there is negligible design work.)
This section is in its infancy and will require significant modification in the future.

General Design Issues
Systems Overview:
   High-Level Compatibility: Missing / incomplete description of interface with other systems / databases / apps?
    Are all dependencies with external groups clearly defined (i.e.: feedstore hooks, etc.)?
   Systems Structure Chart: Missing / incomplete Systems Breakdown chart? (Should graphically show major
    system components, locations, and descriptions.)
   Major Functional Requirements: Missing / incomplete list of major functional requirements of system?
   Confusing Boundaries: Confusing boundaries between major components of system?
   Manual vs. Automatic: Confusion as to which System components are automated vs. manual? Is description
    incomplete?

Data Overview:
   Data Flow Diagrams: Do they exist? Are they detailed enough? Without a fairly detailed plan, how on God's
    green earth can all of the individual programmers write their separate components and then have them
    seamlessly come together at the end of the project? The first thing a sane person does on a road trip through
    new territory is to consult a road map…and a DFD is one piece of the roadmap.
   Entity-Relationship Diagrams: Do ER diagrams exist showing the layout of the data (all tables, databases, etc.)?
   Data Requirements: Missing / incomplete list of data requirements for each of major systems? (Should have
    System I/O Diagram including names, descriptions, and sizes of data elements.) Any size problems with
    database?
   Data Groupings: Data Groupings not in logical categories? (Should be Static, Historical—no change likely, and
    Transaction related.)
   Standard Naming: Data names non-standard? Spaces in names? Names too long?
   Data Relationships: Primary Keys and Foreign Keys not defined? (Hierarchical relationships.)
   Source Definition: Source of data unclear? (Dept., individual, server name, etc.)
   Test Data Requirements: Has this been ignored?


Spec Review:
   Thorough: Are there any incomplete or missing Spec Sections?



   Glossary: Are there any ambiguous terms (TLAs – Three-Letter Acronyms, etc.) in the docs? (If yes, consider a
    glossary.)
   Business Rules: Missing appendix which clearly defines Business Rules? (Why should programmer / tester
    decipher designer’s / user’s intent when it can be so easily listed in specs?)
   Project Baseline Schedule: Is the project schedule unrealistic? Is the schedule incomplete? Are there insufficient
    milestones? Have any tasks been left improperly hanging (not wrapped into fragnets of like tasks)? Are any
    fragnets improperly left hanging (not terminated into milestones)? Are any milestones left hanging (not eventually
    rolled up into a single end-of-project milestone)? Have responsibilities not been assigned to some tasks? Are
    there any unrealistic durations? Are there any nonsensical predecessor / successor relationships? Has
    contingency time not been built into the durations? Did the scheduler ignore or fail to obtain input from staff?
   Acceptance Criteria: Missing / incomplete Acceptance Strategy? (Needed to clearly define minimum scope of
    work -- when work is complete.)


Feature Overview:
   Useless: Is the feature useless? If so, why add the complexity?
   Duplicate Efforts: Are we reinventing the wheel? Are there similar features or products (i.e.: ‘free’ code—snicker
    snicker) already at MS?
   Competitor Analysis: Are there any similar features in competitors’ apps? If yes, did you fail to research it?
    (Take the good ideas, and take good notes of the bad ideas.)
   Priority: Missing / unclear prioritization of all features? (Very important for crunch time, when determining what
    features to nix.)
   Feature Interactions: Missing details? (For example: undo-able feature, repeatable feature, multi-user impacts, ill
    effects of aborting feature, backward compatibility issues, relationship to other features, etc.)

Other Considerations:
   User Interface Design: Use Front-End Testing checklist section above to test UI Design (menus, dialogs, etc.)



Areas Not Covered
The following areas will not be covered by testing. This list is living, and could change for as long as the project is
still in progress.

   Application Specific Features: This generic checklist does not cover application specific items. User needs to
    write specific test cases to cover these.
   Undocumented Features: Not responsible for bugs appearing in undocumented or non-maintained areas of the
    specification.
   External Product Impact: Not responsible for bugs dependent on released Microsoft products. Any bugs that are
    found will be reported to the respective product group.
   Backend Data Maintenance: SQL Server backup and recovery, etc.



Criteria for Acceptance into Testing
   Development Testing: Each new build has to be tested by the development team before being released to testing.
   Existing Bugs Fixed: Developers must resolve all previously discovered bugs marked for fix.
   Release Document: One must be issued for every new build. It will include the build number, all changes, and new
    features since the previous build. In a component release, it should indicate which areas are ready for testing, and
    which areas are not.
   Instructions: All necessary setup instructions and scripts should be provided.
   Elements List: A list of all elements in the current drop, with version numbers.







Bug Reporting
   Bug Tracking System: Use the bug tracking system provided (RAID, etc.)
   Sample Embed Report: Here is a standard bug report. All bugs should contain this information at a minimum:
    <bug report template here.>




User Environment
SERVER Minimum Requirements:
 P/166 machine
 64MB or more memory
 6GB hard disk space
 Windows NT 4.0
 NT SQL Server 6.5


CLIENT Minimum Requirements:
 486/66 machine
 16MB or more memory
 1200MB or more hard disk space
 Super VGA display
 Win 3.11, Win NT 3.1, Win 95, Win NT 4.0



