User:Bambers Linden/scratchpad
Revision as of 12:59, 24 August 2010 by Bambers Linden (talk | contribs)
New test plan template
Scope
- This test walks through...
- The purpose is to...
- Est. Running time
Set-up
Environment:
- List test environment requirements such as production or non-production grid, platforms, minimum Viewer and Server versions, etc.
Other:
- List other requirements needed to execute the test plan here, e.g.:
- Basic tester account to use as "User B" where specified
- Sandbox, or other area where building is allowed
Test Steps
Example Blurb
This section tests the Viewer's behavior when...
- Run Second Life
- Verify the SL app starts and the correct login screen and image are displayed
- Verify the version number is correct (using the Help > About... menu item)
- Verify the Dev Grids menu opens when the Ctrl-Shift-G key combo is pressed
- Verify the Viewer successfully logs into an online dev grid
- Verify the top menu bar is colored RED to indicate the current grid is a development grid
- Quit Second Life
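The step list above could also be captured as a scripted checklist. The sketch below is a minimal, hypothetical harness: the check functions are stand-ins, not real Second Life Viewer automation hooks, and the version string is an assumed example value.

```python
# Minimal sketch of running test steps as a scripted checklist.
# The check functions are hypothetical stand-ins; a real harness
# would drive the Viewer through its own automation hooks.

def run_steps(steps):
    """Run (description, check) pairs and collect pass/fail results."""
    results = []
    for description, check in steps:
        try:
            passed = bool(check())
        except Exception:
            passed = False  # any exception in a check counts as a failure
        results.append((description, passed))
    return results

# Stub checks standing in for real Viewer verifications.
steps = [
    ("Viewer starts and shows the login screen", lambda: True),
    ("Version number matches the build under test", lambda: "2.1.0" == "2.1.0"),
    ("Top menu bar is RED on a development grid", lambda: "RED" == "RED"),
]

for description, passed in run_steps(steps):
    print(f"{'PASS' if passed else 'FAIL'}: {description}")
```

Structuring steps as (description, check) pairs keeps the written test plan and any future automation in sync: each bullet in the plan maps to one entry in the list.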
Pass/Fail Criteria
- Passes if
- Fails if
Tear Down
- List what must be done to revert the tester's environment to a neutral state
Old test script template
Goals to strive for when writing a test script
- Easy to understand and follow.
- Short enough to be run in 30 minutes or less. If it's too long, break it down into smaller tests.
- Clear description of requirements needed to run the test. Number of users, parcels, etc.
Test script contents
- Requirements (e.g. number of users, a god account, land, objects needed for the test)
- Est. Running time
- Variables to test (e.g. different types of things that generate test permutations)
- Describe the expected behavior and purpose of the new code (or link to the Design Document).
- List any dependencies the new code may have -- what other systems may be affected?
- List any security implications -- does this feature give access to something it should not?
- Detailed plan(s) for testing new functionality, including success and failure cases if possible.
- Test Setup
- Feature Rule to check
- Step
- Step
- Corner case to check
- Step
- Step
- Detailed plan(s) for testing dependent code, including success and failure cases if possible.
- Compare Performance, if applicable
- Requirements for gathering data on existing feature being modified.
- Follow this with requirements for gathering data on the new feature in the new format, etc.
- Explain how to compare the data to ensure the new feature is not worse (e.g. lower framerate, higher bandwidth, more db queries).
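That comparison step could be sketched as a small script. The metric names, values, and 5% tolerance below are illustrative assumptions, not Second Life measurements; the point is that "worse" flips direction depending on whether higher (framerate) or lower (bandwidth, db queries) is better.

```python
# Sketch: flag regressions between baseline and new-feature metrics.
# For metrics where higher is better (framerate), a drop is a regression;
# where lower is better (bandwidth, db queries), an increase is one.

HIGHER_IS_BETTER = {"framerate"}

def find_regressions(baseline, candidate, tolerance=0.05):
    """Return metrics where the candidate is worse than the baseline
    by more than the given relative tolerance."""
    regressions = {}
    for metric, old in baseline.items():
        new = candidate[metric]
        if metric in HIGHER_IS_BETTER:
            worse = new < old * (1 - tolerance)
        else:
            worse = new > old * (1 + tolerance)
        if worse:
            regressions[metric] = (old, new)
    return regressions

# Illustrative numbers only.
baseline  = {"framerate": 60.0, "bandwidth_kbps": 500.0, "db_queries": 40}
candidate = {"framerate": 45.0, "bandwidth_kbps": 510.0, "db_queries": 90}
print(find_regressions(baseline, candidate))
# → {'framerate': (60.0, 45.0), 'db_queries': (40, 90)}
```

A relative tolerance keeps normal run-to-run noise from being reported as a regression; the right threshold depends on how stable each metric is in practice.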