COSMOS QA Criteria
COSMOS Quality Expectations
This document has been put together to address Bugzilla ER 214576.
The following terms and acronyms are used throughout this document; each is defined below as it is used here.
|Quality Expectations||A statement of some behaviour, characteristic, or operational facility that a product must exhibit for it to be deemed ‘fit for purpose’. Quality expectations are normally grouped into four main categories: functional/behavioural; operational efficiency; interoperability factors; and admin/management factors (to control TCO).|
|Acceptance Criteria||This is a quantification of how a quality expectation is to be validated. For functional/behavioural quality expectations this is a simple Boolean test – it either works or it doesn’t. Hence, for most scope docs there is no need to specifically define functional acceptance criteria. However, other types of quality expectations – especially performance related areas – do require specific acceptance criteria because the quantification is normally some form of numeric threshold (with optional margin/tolerance) that states minimum levels of acceptable operational efficiency.|
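The distinction drawn above between Boolean functional checks and numeric thresholds with an optional margin/tolerance can be sketched in code. This is an illustrative Python sketch only; the function name and the example numbers are assumptions, not part of any COSMOS API.

```python
def meets_threshold(measured: float, threshold: float, tolerance: float = 0.0) -> bool:
    """Numeric acceptance criterion: pass if the measured value stays
    within the threshold plus an optional margin/tolerance."""
    return measured <= threshold * (1.0 + tolerance)

# Functional/behavioural expectations are a simple Boolean test:
login_works = True            # it either works or it doesn't
assert login_works

# Performance-style expectations need a numeric threshold, e.g. a
# response time that must stay under 2.0 s, with a 10% tolerance:
assert meets_threshold(measured=2.1, threshold=2.0, tolerance=0.10)
assert not meets_threshold(measured=2.3, threshold=2.0, tolerance=0.10)
```

The tolerance parameter captures the "optional margin" mentioned above: a measurement slightly over the raw threshold can still pass if it falls within the agreed margin.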
We intend to set the quality expectations of COSMOS (the software) and matching acceptance criteria to serve as a preamble for the COSMOS QA team as they execute their work.
The QA team will use these criteria as a specification to define and plan all their testing efforts.
Is COSMOS 1.0 well-formed software?
|Quality Expectation||Acceptance Criteria||QA Role|
|Valid use cases||COSMOS team to define use cases||Validate ERs against the use cases - manual|
|Bug-free implementation||COSMOS team must provide JUnits covering 100% of the code; a walk-through of the code and JUnits must be given to QA||Run the JUnits and validate their code / ER coverage – TPTP; black-box functional testing – manual / SOAPUI, recorded as TPTP manual tests|
|API documentation||COSMOS team must provide API documentation||Manual verification of API documentation|
|Easy-to-use deployment package||Release Engineering team to provide an easy install||Validate the ease of use of the package and accompanying install instructions – manual; the RE process will not be scrutinized|
|Base platforms support||COSMOS team to specify the supported platforms||QA certifies release on specified platforms|
|Wiki documentation||Owners take responsibility of the quality of content||QA will not validate wiki content|
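The "bug-free implementation" row above expects unit tests covering 100% of the code. The real tests are JUnits against the Java code base; the sketch below uses Python's `unittest` purely to illustrate the expected shape of such tests, and `normalize_host` is a hypothetical stand-in for a COSMOS code path, not a real COSMOS function.

```python
import unittest

# Hypothetical function standing in for a COSMOS code path.
def normalize_host(host: str) -> str:
    """Lower-case a host name and strip surrounding whitespace."""
    return host.strip().lower()

class NormalizeHostTest(unittest.TestCase):
    # 100% coverage means every line and branch of the function
    # under test is executed by at least one test case.
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_host("  Example.ORG "), "example.org")

    def test_already_normal(self):
        self.assertEqual(normalize_host("example.org"), "example.org")

if __name__ == "__main__":
    unittest.main(argv=["normalize_host_test"], exit=False)
```

When QA runs the delivered JUnits, a coverage tool (TPTP in this case) reports which lines and branches the suite actually exercised, which is how the 100%-coverage criterion is validated.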
Is COSMOS 1.0 a consumable entity?
|Quality Expectation||Acceptance Criteria||QA Role|
|Successful integration of COSMOS components||COSMOS team must provide helper applications for integration testing with scenarios||Perform integration testing and execute scenarios – manual. Recorded as manual TPTP tests|
|COSMOS stability during production deployments||COSMOS team must state the minimum system requirements for production. Also recommend parameters (number of Data Managers / MDRs that may be added, volume of data that can be queried, etc.) that should be considered for these tests.||Execute performance / scalability / volume / stress / availability testing with the minimum recommended resources|
|COSMOS support across products / data sources||COSMOS team must state the kinds of MDRs that can be integrated and provide samples||Execute integration tests with these samples – manual. Recorded as TPTP manual tests|
|User documentation||COSMOS team will write manuals||QA validates the information - manual|
|Samples / skeleton MDR implementation / any collateral||COSMOS team must provide samples||QA validates the existence and may require assistance from COSMOS Team while using the skeleton implementations – manual. Recorded as TPTP manual tests|
|Additional platforms||COSMOS team must specify||QA will certify the product on these platforms - manual|
|Dependencies on other open source software||COSMOS team to define a process to integrate with the newer versions of these dependent software||QA will validate the process - manual|
|Future enhancements / bug reporting mechanism||COSMOS team to set a process||QA validates the process - manual|
Operational Efficiency considerations
|Quality Characteristic||Acceptance Criteria|
|Availability||There are no Availability quality expectations|
|Capacity||There are no Capacity quality expectations|
|Concurrency||How many queries / clients may run simultaneously? Should there be any other concurrency considerations?|
|Data Volumes||Are there any restrictions on the amount of data that may be returned by a single query?|
|Performance||The Data Managers and MDRs should not degrade the performance of Data Adapters by more than 15%|
|Scalability||COSMOS 1.0 will support a single Data Manager and Management Domain. How many MDRs and Data Managers may be added? Should there be any other scalability considerations?|
|Stability||There are no Stability quality expectations|
|Stress||There are no Stress quality expectations|
|Manageability||System Administrator should have best practices, guidelines, and tooling to administer a COSMOS environment|
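The 15% performance criterion above can be checked mechanically once baseline and instrumented timings are available. A minimal sketch follows (illustrative Python; the timing numbers and function name are assumptions, not COSMOS APIs):

```python
def degradation_pct(baseline_secs: float, measured_secs: float) -> float:
    """Percentage slow-down of a Data Adapter operation relative to its
    baseline run without Data Managers / MDRs attached."""
    return (measured_secs - baseline_secs) / baseline_secs * 100.0

MAX_DEGRADATION_PCT = 15.0  # acceptance threshold from the table above

# Illustrative timings: a baseline query vs. the same query routed
# through a Data Manager and an MDR.
baseline = 1.00
with_managers = 1.12

slowdown = degradation_pct(baseline, with_managers)
assert slowdown <= MAX_DEGRADATION_PCT  # a 12% slow-down passes
```

In practice QA would average several runs of each configuration before computing the percentage, so that run-to-run noise does not push a borderline result over the threshold.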
This section includes the tasks required to complete this enhancement.
- Jimmy Mohsin has generated this page to address Bugzilla 214576
- The COSMOS team needs to provide input to this page
- Shivvy, representing the QA team, will review and sign off on these criteria
All reviewer feedback should go in the Talk page for 214576.
A formal review is required to agree on the content and detail of the quality expectations laid out in the three ‘quality perspective’ tables. This is a three-step process:
1. Review each quality expectation and agree whether it should be included.
2. If it is to be included, determine whether the acceptance criteria can be met within the r1.0 timeframe, or must be postponed to a subsequent release.
3. Link every acceptance criterion to an ER that delivers the required quality. This can be done by linking the acceptance criteria table cell content either to other relevant wiki pages or to an ER. If the link is to an ER, it should also state the ER’s status (e.g. not started / WIP / completed).