COSMOS QA Criteria
COSMOS Quality Expectations
This has been put together to address Bugzilla ER 214576.
The following terms and acronyms are used throughout this document; the list below defines each term as it is used here.
| Term | Definition |
| --- | --- |
| Quality Expectations | A statement of some behaviour, characteristic or operational facility that a product must exhibit for it to be deemed ‘fit for purpose’. Quality expectations are normally grouped into four main categories: functional/behavioural, operational efficiency, interoperability factors, and admin/management factors (to control TCO). |
| Acceptance Criteria | A quantification of how a quality expectation is to be validated. For functional/behavioural quality expectations this is a simple Boolean test: it either works or it doesn’t. Hence, for most scope documents there is no need to specifically define functional acceptance criteria. However, other types of quality expectations, especially performance-related ones, do require specific acceptance criteria, because the quantification is normally some form of numeric threshold (with an optional margin/tolerance) that states the minimum acceptable level of operational efficiency. |
The purpose of this document is to set the quality expectations of COSMOS (the software) and the matching acceptance criteria that will guide the COSMOS QA team in executing their work.
The QA team will use these criteria as a specification to define and plan their testing efforts for any given iteration. Specific test scenarios, environments and methods are to be documented in the QA iteration activities document (e.g. COSMOS_QA_i9_Activities).
Is COSMOS 1.0 well-formed software?
COSMOS QA considers COSMOS 1.0 to be well-formed software if it is built upon valid use cases and bug-free technology, is accompanied by API documentation, installs easily, and runs on widely used platforms.
| Quality Expectation | Acceptance Criteria | QA Role |
| --- | --- | --- |
| Valid use cases | COSMOS team to define use cases: COSMOS_Use_Cases | Validate ER definitions against the use cases (manual) |
| Comprehensive unit test coverage | COSMOS team must provide unit test cases (manual, TPTP/JUnit) to prove the integrity of code changes; a unit test case walk-through must be provided to QA. 215135 (i9), 208604 (i9) | Validate the scope and validity of the test cases and sign off on all ERs. QA is not required to execute unit tests. |
| API documentation/JavaDoc | COSMOS team must provide API documentation/JavaDoc. 197515 (i9), 216655 (no target) | Manual verification of API documentation/JavaDoc |
| Easy-to-use deployment package | Release Engineering team to provide an easy install: Build_Packaging_for_COSMOS | Validate the ease of use of the package and its accompanying install instructions (manual); the RE process will not be scrutinized |
| Base platform support | COSMOS team to specify the supported platforms: COSMOS M2 Dependencies | QA certifies the release on the specified platforms and runs sanity tests on the latest GA versions (where these differ) |
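To illustrate the "Comprehensive unit test coverage" expectation above, here is a minimal sketch of the kind of unit test the COSMOS team would walk QA through. JUnit is assumed in the real project; plain assertions are used here so the sketch is self-contained, and `CmdbQuery` is a hypothetical stand-in for a COSMOS component under test, not an actual COSMOS class.

```java
// Sketch of a unit test covering one code change: a positive case and a
// negative case. CmdbQuery is hypothetical, used only for illustration.
public class CmdbQueryTest {

    // Hypothetical component: builds a query string for an MDR.
    static class CmdbQuery {
        private final String mdrId;
        CmdbQuery(String mdrId) {
            if (mdrId == null || mdrId.isEmpty()) {
                throw new IllegalArgumentException("mdrId is required");
            }
            this.mdrId = mdrId;
        }
        String toQueryString() {
            return "mdr=" + mdrId;
        }
    }

    public static void main(String[] args) {
        // Positive case: a valid MDR id produces the expected query string.
        CmdbQuery q = new CmdbQuery("sample-mdr");
        assertEquals("mdr=sample-mdr", q.toQueryString());

        // Negative case: a missing MDR id must be rejected.
        boolean rejected = false;
        try {
            new CmdbQuery("");
        } catch (IllegalArgumentException expected) {
            rejected = true;
        }
        assertTrue(rejected);
        System.out.println("all tests passed");
    }

    static void assertEquals(Object expected, Object actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }

    static void assertTrue(boolean condition) {
        if (!condition) {
            throw new AssertionError("expected true");
        }
    }
}
```

A walk-through of a test like this gives QA enough context to judge the scope and validity of the coverage without executing the tests themselves.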
Is COSMOS 1.0 a consumable entity?
COSMOS QA considers COSMOS 1.0 to be consumable (adoptable) software if we can demonstrate its ability to integrate successfully with participating data managers and MDRs, through integration and performance testing reports, code samples and other adopter aids.
Clear documentation and a defined process for fixing COSMOS bugs or supporting future COSMOS enhancements also contribute. Dependencies on third-party software must be specified.
| Quality Expectation | Acceptance Criteria | QA Role |
| --- | --- | --- |
| Successful application integration of COSMOS components | COSMOS team must provide helper applications for integration with user scenarios. 208274 (i9), 209990 (future) | Test the function and effectiveness of the helper tools (manual); recorded as manual TPTP tests |
| COSMOS support across products/data sources | COSMOS team must state the kinds of MDRs that can be integrated and provide samples. 211093 (future), 214766 (i8), 209987 (i9), 212187 (i8), 212189 (i8) | Execute integration tests with these sample MDRs (manual); recorded as manual TPTP tests |
| User documentation | COSMOS team will write manuals. 214805 (i9) | QA validates the information (manual) |
| Additional platforms | COSMOS team must specify them. 216210 (i9), COSMOS M2 Dependencies | QA will certify the product on these platforms (manual) |
| Dependencies on other open source software | COSMOS team to define a minimum version of each dependency. 215609 (no target), COSMOS M2 Dependencies | QA will validate at the minimum versions and sanity test at the latest GA versions where these differ |
| Future enhancements/bug reporting mechanism | COSMOS team to set a process: Bugzilla | QA validates the process (manual) |
Operational Efficiency considerations
The operational efficiency of a system comprises the characteristics of that system when deployed to a production environment, including its reaction to load.
| Quality Characteristic | Acceptance Criteria | QA Role |
| --- | --- | --- |
| Concurrency | How many queries/clients may run simultaneously against the COSMOS components? 216210 (i9) | QA tests that COSMOS components function with 2+ concurrent sessions |
| Data volumes/performance | Are there any restrictions on the amount of data that may be returned by a query? How many queries/clients may run at a time with acceptable performance? 216210 (i9) | QA performs generic load/data volume tests to generate data points for further analysis |
| Scalability/stability | COSMOS 1.0 will support a single Data Manager and Management Domain. How many MDRs and Data Managers may be added? COSMOS components must not crash when subjected to load. 216210 (i9) | QA performs generic load tests with an increasing number of MDRs to ensure stability |
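The concurrency criterion above ("2+ sessions") can be smoke-tested generically along these lines. This is a sketch only: `QueryService` is a hypothetical stand-in for a COSMOS data manager endpoint, and a real QA test would drive the actual component with the harness of record (e.g. TPTP) rather than a plain thread pool.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Generic concurrency smoke test: N client sessions issue queries in
// parallel and every session must complete without error.
public class ConcurrencySmokeTest {

    // Hypothetical service that counts how many queries it has answered.
    static class QueryService {
        private final AtomicInteger served = new AtomicInteger();
        String query(String request) {
            return "result-for-" + request + "-" + served.incrementAndGet();
        }
        int served() {
            return served.get();
        }
    }

    public static void main(String[] args) throws Exception {
        QueryService service = new QueryService();
        int sessions = 4; // the criterion requires 2 or more concurrent sessions
        ExecutorService pool = Executors.newFixedThreadPool(sessions);
        List<Future<String>> results = new ArrayList<>();
        for (int i = 0; i < sessions; i++) {
            final String request = "client-" + i;
            results.add(pool.submit(() -> service.query(request)));
        }
        // Every session must return a non-empty result within a time bound.
        for (Future<String> f : results) {
            String r = f.get(10, TimeUnit.SECONDS);
            if (r == null || r.isEmpty()) {
                throw new AssertionError("empty result");
            }
        }
        pool.shutdown();
        if (service.served() != sessions) {
            throw new AssertionError("expected " + sessions + " queries served");
        }
        System.out.println("served " + service.served() + " concurrent sessions");
    }
}
```

The same skeleton extends to the load and scalability rows: raise the session count, or register an increasing number of sample MDRs, and record the data points for analysis.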
This section lists the tasks required to complete this enhancement.
- Jimmy Mohsin has generated this page to address Bugzilla 214576
- The COSMOS team needs to provide input to this page
- Shivvy, representing the QA team, will review and sign off on these criteria
All reviewer feedback should go in the Talk page for 214576.
A formal review is required to agree the content and detail of the quality expectations laid out in the three ‘quality perspective’ tables. This is a three-step process:
- Review each quality expectation to agree whether it should be included.
- If it is to be included, determine whether the acceptance criteria can be met within the r1.0 timeframe or must be postponed to a subsequent release.
- All acceptance criteria must be linked to an ER that delivers the required quality. This can be shown by linking the acceptance criteria table cell content either to other relevant wiki pages or to an ER. If the link is to an ER, it should also state the ER’s status (e.g. not started / WIP / completed).