
COSMOS QA Criteria

Revision as of 10:07, 25 January 2008 by Strpa05.ca.com (Talk | contribs)

COSMOS Quality Expectations

This has been put together to address Bugzilla ER 214576.


The following terms and acronyms are used throughout this document. Each is defined below as it applies here.

Term: Quality Expectation
Definition: A statement of some behaviour, characteristic or operational facility that a product must exhibit for it to be deemed ‘fit for purpose’. Quality expectations are normally grouped into four main categories: functional/behavioural; operational efficiency; interoperability factors; and admin/management factors (to control TCO).

Term: Acceptance Criteria
Definition: A quantification of how a quality expectation is to be validated. For functional/behavioural quality expectations this is a simple Boolean test – it either works or it doesn’t. Hence, for most scope documents there is no need to specifically define functional acceptance criteria. However, other types of quality expectations – especially performance-related ones – do require specific acceptance criteria, because the quantification is normally some form of numeric threshold (with an optional margin/tolerance) that states the minimum acceptable level of operational efficiency.


We intend to set the quality expectations of COSMOS (the software) and matching acceptance criteria that will serve as a preamble for the COSMOS QA team as they execute their work.

The QA team will use these criteria as a specification to define and plan all their testing efforts.

Is COSMOS 1.0 well-formed software?

COSMOS QA regards COSMOS 1.0 as well-formed software if it is built upon valid use cases and bug-free technology, ships with API documentation, and installs easily on a set of widely used platforms.

Quality perspective 1: Is COSMOS well formed?

Quality Expectation: Valid use cases
Acceptance Criteria: COSMOS team to define use cases (COSMOS_Use_Cases)
QA Role: Validate ERs against the use cases – manual

Quality Expectation: Bug free implementation
Acceptance Criteria: COSMOS team must provide JUnits covering 100% of code; a walk-through of the code and JUnits must be provided to QA. 215135 (i9), 208604 (i9)
QA Role: Run the JUnits and validate their code / ER coverage – TPTP; black box functional testing – manual / SOAPUI, recorded as TPTP manual tests

Quality Expectation: API documentation
Acceptance Criteria: COSMOS team must provide API documentation. 215534 (i9)
QA Role: Manual verification of API documentation

Quality Expectation: Easy to use deployment package
Acceptance Criteria: Release Engineering team to provide an easy install (Build_Packaging_for_COSMOS)
QA Role: Validate the ease of use of the package and accompanying install instructions – manual; the RE process will not be scrutinized

Quality Expectation: Base platforms support
Acceptance Criteria: COSMOS team to specify the supported platforms
QA Role: QA certifies the release on the specified platforms

Quality Expectation: Wiki documentation
Acceptance Criteria: Owners take responsibility for the quality of content. 197652 (no target)
QA Role: QA will not validate wiki content

Is COSMOS 1.0 a consumable entity?

COSMOS QA regards COSMOS 1.0 as consumable (adoptable) software if we can demonstrate its ability to integrate successfully with participating data managers and MDRs, through integration and performance testing reports, code samples and other adopter aids.

Clear documentation, a defined process for fixing COSMOS bugs and supporting future COSMOS enhancements, and a plan for keeping COSMOS current as its dependent software is upgraded also contribute.

Quality perspective 2: Is COSMOS a consumable entity?

Quality Expectation: Successful application integration of COSMOS components
Acceptance Criteria: COSMOS team must provide helper applications for integration with user scenarios. 208274 (i9), 209990 (future)
QA Role: Test the function and effectiveness of the helper tools – manual, recorded as manual TPTP tests

Quality Expectation: COSMOS support across products / data sources
Acceptance Criteria: COSMOS team must state the kinds of MDRs that can be integrated and provide samples. 211093 (future), 214766 (i8), 209987 (i9), 212187 (i8), 212189 (i8)
QA Role: Execute integration tests with these sample MDRs – manual, recorded as TPTP manual tests

Quality Expectation: User documentation
Acceptance Criteria: COSMOS team will write manuals. 214805 (i9)
QA Role: QA validates the information – manual

Quality Expectation: Additional platforms
Acceptance Criteria: COSMOS team must specify the additional platforms
QA Role: QA will certify the product on these platforms – manual

Quality Expectation: Dependencies on other open source software
Acceptance Criteria: COSMOS team to define a process for integrating with newer versions of this dependent software. 215609 (no target)
QA Role: QA will validate the process – manual

Quality Expectation: Future enhancements / bug reporting mechanism
Acceptance Criteria: COSMOS team to set a process (Bugzilla)
QA Role: QA validates the process – manual

Operational Efficiency considerations

Quality Perspective 3: COSMOS Operational Efficiency

Quality Characteristic: Availability
Acceptance Criteria: There are no Availability quality expectations.

Quality Characteristic: Capacity
Acceptance Criteria: There are no Capacity quality expectations.

Quality Characteristic: Concurrency
Acceptance Criteria: How many queries / clients may run simultaneously? Should there be any other concurrency considerations? 216210 (i9)

Quality Characteristic: Data Volumes
Acceptance Criteria: Are there any restrictions on the amount of data that may be returned by a query? How many queries / clients may run at a time? 216210 (i9)

Quality Characteristic: Performance
Acceptance Criteria: The Data Managers and MDRs should not degrade the performance of Data Adapters by more than 15%. 216210 (i9)

Quality Characteristic: Scalability
Acceptance Criteria: COSMOS 1.0 will support a single Data Manager and Management Domain. How many MDRs and Data Managers may be added? Should there be any other scalability considerations? 216210 (i9)

Quality Characteristic: Stability
Acceptance Criteria: There are no Stability quality expectations.

Quality Characteristic: Stress
Acceptance Criteria: There are no Stress quality expectations.

Quality Characteristic: Manageability
Acceptance Criteria: The Monitor Administrator should have best practices, guidelines, and tooling to administer a COSMOS environment. 216210 (i9)
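The Performance criterion above (no more than 15% degradation of Data Adapter performance) can be checked mechanically once baseline and measured response times are available. The following is a minimal sketch only; the class and method names are hypothetical and not part of COSMOS:

```java
// Hypothetical illustration of the "no more than 15% degradation"
// acceptance criterion from the Operational Efficiency table.
public class PerformanceCriteria {

    // Maximum allowed degradation of a Data Adapter's response time
    // when Data Managers and MDRs are attached (15%).
    static final double MAX_DEGRADATION = 0.15;

    // True when the measured time stays within the threshold derived
    // from the baseline, i.e. measured <= baseline * 1.15.
    static boolean withinThreshold(double baselineMs, double measuredMs) {
        return measuredMs <= baselineMs * (1.0 + MAX_DEGRADATION);
    }

    public static void main(String[] args) {
        // A baseline of 200 ms gives a ceiling of 230 ms.
        System.out.println(withinThreshold(200.0, 225.0)); // within budget
        System.out.println(withinThreshold(200.0, 240.0)); // exceeds budget
    }
}
```

A real test run would record baselines with the Data Adapters alone, then repeat the same queries with the Data Managers and MDRs attached, and feed both figures into such a check.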

Task Breakdown

This section includes the tasks required to complete this enhancement.

  1. Jimmy Mohsin has generated this page to address bugzilla 214576
  2. The COSMOS team needs to provide input to this page
  3. Shivvy, representing the QA team, will review and sign off on these criteria

Open Issues/Questions

All reviewer feedback should go in the Talk page for 214576.

A formal review is required to agree the content and detail of the quality expectations laid out in the three ‘quality perspective’ tables. This is a three-step process:

  1. Review each quality expectation to agree whether it should be included.
  2. If it is to be included, determine whether the acceptance criteria can be met within the r1.0 timeframe, or must be postponed to a subsequent release.
  3. Link every acceptance criterion to an ER that delivers the required quality. This can be illustrated by linking the acceptance criteria table cell content either to other relevant wiki pages or to an ER. If the link is to an ER, it should also state the ER’s status (e.g. not started / WIP / completed).