COSMOS QA Criteria
Revision as of 04:51, 6 November 2008

COSMOS Quality Expectations

This has been put together to address Bugzilla ER 214576.

Terminologies/Acronyms

The terms and acronyms below are used throughout this document. The list below defines each term as it is used in this document.

Term: Quality Expectation
Definition: A statement of some behaviour, characteristic, or operational facility that a product must exhibit for it to be deemed 'fit for purpose'. Quality expectations are normally grouped into four main categories: functional/behavioural; operational efficiency; interoperability factors; and admin/management factors (to control TCO).

Term: Acceptance Criteria
Definition: A quantification of how a quality expectation is to be validated. For functional/behavioural quality expectations this is a simple Boolean test: it either works or it doesn't. Hence, for most scope docs there is no need to define functional acceptance criteria specifically. However, other types of quality expectations, especially performance-related areas, do require specific acceptance criteria, because the quantification is normally some form of numeric threshold (with an optional margin/tolerance) stating the minimum acceptable level of operational efficiency.
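The numeric-threshold idea behind performance acceptance criteria can be sketched as a small check. The method name and the threshold values here are illustrative assumptions, not actual COSMOS requirements:

```java
// Illustrative acceptance check for a performance quality expectation:
// a measured value passes if it is within the threshold plus tolerance.
public class AcceptanceCheck {
    // Returns true when the measured response time (ms) meets the
    // acceptance criterion: a numeric threshold with an optional margin.
    static boolean meetsCriterion(double measuredMs, double thresholdMs, double toleranceMs) {
        return measuredMs <= thresholdMs + toleranceMs;
    }

    public static void main(String[] args) {
        // Hypothetical criterion: queries complete in 500 ms, with a 50 ms margin.
        System.out.println(meetsCriterion(480.0, 500.0, 50.0)); // within threshold
        System.out.println(meetsCriterion(540.0, 500.0, 50.0)); // within tolerance
        System.out.println(meetsCriterion(600.0, 500.0, 50.0)); // fails the criterion
    }
}
```

A real criterion of this kind would name the measured metric, the threshold, and the tolerance explicitly in the relevant quality perspective table.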

Purpose

The purpose of this document is to set the quality expectations of COSMOS (the software) and the matching acceptance criteria that will guide the COSMOS QA team in executing their work.

The QA team will use these criteria as a specification to define and plan their testing efforts for any given iteration. Specific test scenarios, environments, and methods are to be documented in the QA iteration activities document (e.g. COSMOS_QA_i9_Activities).

Is COSMOS 1.0 well-formed software?

COSMOS QA considers COSMOS 1.0 to be well-formed software if it is built upon valid use cases and bug-free technology, ships with accompanying API documentation, and installs easily on a set of widely used platforms.


Quality perspective 1: Is COSMOS well formed?

  Quality Expectation: Valid use cases
  Acceptance Criteria: COSMOS team to define use cases (COSMOS_Use_Cases)
  QA Role: Validate the ER definition against the use cases (manual)

  Quality Expectation: Comprehensive unit test coverage
  Acceptance Criteria: COSMOS team must provide unit test cases (manual, TPTP/JUnit) to prove the integrity of code changes; a unit test case walk-through must be provided to QA. 215135 (i9), 208604 (i9)
  QA Role: Validate the scope and validity of the test cases and sign off on all ERs. QA is not required to execute unit tests.

  Quality Expectation: API documentation/JavaDoc
  Acceptance Criteria: COSMOS team must provide API documentation/JavaDoc. 197515 (i9), 216655 (no target)
  QA Role: Manual verification of the API documentation/JavaDoc

  Quality Expectation: Easy to use deployment package
  Acceptance Criteria: Release Engineering team to provide an easy install (Build_Packaging_for_COSMOS)
  QA Role: Validate the ease of use of the package and the accompanying install instructions (manual); the RE process will not be scrutinized

  Quality Expectation: Base platform support
  Acceptance Criteria: COSMOS team to specify the supported platforms (COSMOS M2 Dependencies)
  QA Role: QA certifies the release on the specified platforms and runs sanity tests on the latest GA versions (where different)
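As a sketch of the kind of unit test the walk-through above would cover, the following uses a plain-Java check in the style of a JUnit assertion. The class under test and its behaviour are hypothetical stand-ins, not actual COSMOS APIs:

```java
// Minimal unit-test sketch in JUnit style (assertions written as plain
// checks so the example is self-contained). QueryParserTest and parseKey
// are hypothetical; real COSMOS unit tests are manual or TPTP/JUnit.
public class QueryParserTest {
    // Hypothetical unit under test: extracts the key from a "key=value" fragment.
    static String parseKey(String fragment) {
        int eq = fragment.indexOf('=');
        return eq < 0 ? fragment : fragment.substring(0, eq);
    }

    public static void main(String[] args) {
        // Each check mirrors a JUnit assertEquals on one behaviour of the unit.
        if (!parseKey("host=localhost").equals("host")) throw new AssertionError("key case");
        if (!parseKey("bare").equals("bare")) throw new AssertionError("no '=' case");
        System.out.println("all unit checks passed");
    }
}
```

In the walk-through, each such check would be traced back to the code change it protects, which is what QA signs off on.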

Is COSMOS 1.0 a consumable entity?

COSMOS QA considers COSMOS 1.0 consumable (adoptable) software if its capability to integrate successfully with participating data managers and MDRs can be demonstrated through integration and performance testing reports, code samples, and other adopter aids.

Clear documentation and a defined process for fixing COSMOS bugs and supporting future COSMOS enhancements also contribute. Dependencies on third-party software must be specified.

Quality perspective 2: Is COSMOS a consumable entity?

  Quality Expectation: Successful application integration of COSMOS components
  Acceptance Criteria: COSMOS team must provide helper applications for integration with user scenarios. 208274 (i9), 209990 (future)
  QA Role: Test the function and effectiveness of the helper tools (manual); recorded as manual TPTP tests

  Quality Expectation: COSMOS support across products / data sources
  Acceptance Criteria: COSMOS team must state the kinds of MDRs that can be integrated and provide samples. 211093 (future), 214766 (i8), 209987 (i9), 212187 (i8), 212189 (i8)
  QA Role: Execute integration tests with these sample MDRs (manual); recorded as manual TPTP tests

  Quality Expectation: User documentation
  Acceptance Criteria: COSMOS team will write manuals. 214805 (i9)
  QA Role: QA validates the information (manual)

  Quality Expectation: Additional platforms
  Acceptance Criteria: COSMOS team must specify. 216210 (i9), COSMOS M2 Dependencies
  QA Role: QA will certify the product on these platforms (manual)

  Quality Expectation: Dependencies on other open source software
  Acceptance Criteria: COSMOS team to define a minimum version of the dependent software. 215609 (no target), COSMOS M2 Dependencies
  QA Role: QA will validate at the minimum versions and sanity test at the latest GA versions where these differ

  Quality Expectation: Future enhancements / bug reporting mechanism
  Acceptance Criteria: COSMOS team to set a process (Bugzilla)
  QA Role: QA validates the process (manual)

Operational Efficiency considerations

The operational efficiency of a system comprises the characteristics of that system when deployed to a production environment, and its reaction to load.

Quality Perspective 3: COSMOS Operational Efficiency

  Quality Characteristic: Concurrency
  Acceptance Criteria: Execution of multiple processes or operations simultaneously. The COSMOS components are required to be multi-threaded and to support queries/clients running simultaneously. 216210 (i9)
  QA Role: QA will perform basic concurrency testing by running two or more COSMOS clients (command line and GUI) on the same machine and by executing queries simultaneously.

  Quality Characteristic: Data Volumes/Performance
  Acceptance Criteria: Restrictions on the amount of data that may be returned by a query/transaction. Data sets should be kept relatively small in a COSMOS implementation; however, guidelines concerning data volumes and their effect on performance should be determined and made available to adopters for a given scenario. More specific performance thresholds and metrics may be applied after further analysis of testing results. 216210 (i9)
  QA Role: QA to perform generic load/data volume tests to generate data points for further analysis. QA will incrementally increase the load on the Example MDR repository, which is an XML file based repository, and test sample queries to monitor response time, CPU, and memory utilization.

  Quality Characteristic: Scalability/Stability
  Acceptance Criteria: COSMOS 1.0 will support a single Data Broker and Management Domain. The number of Data Managers within a COSMOS environment will not be artificially limited and should not present a practical restriction in a well designed system; however, guidelines concerning the number of Data Managers and their effect on performance should be determined and made available to adopters for a given scenario. Any COSMOS environment must be considered stable, with predictable results under all conditions. Exceptions should be handled gracefully with informative messaging to the client. 216210 (i9)
  QA Role: QA to perform generic load/data volume tests to generate data points for further analysis. QA will incrementally increase the number of MDRs, subject them to a constant transaction load of sample queries, and monitor response time, CPU, and memory utilization.
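The basic concurrency test described above (two or more clients executing queries simultaneously) can be sketched with a thread pool. The query itself is stubbed out with a sleep here; a real test would invoke the COSMOS client instead, and the names below are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

// Sketch of a basic concurrency test: N simulated clients each issue a
// query at the same time, and every response is validated.
public class ConcurrencySketch {
    // Stand-in for a real COSMOS client query; the sleep simulates latency.
    static String runQuery(int clientId) throws InterruptedException {
        Thread.sleep(50);
        return "result-" + clientId;
    }

    // Runs the given number of concurrent clients and returns how many
    // produced a valid result within the timeout.
    static int runClients(int clients) {
        ExecutorService pool = Executors.newFixedThreadPool(clients);
        List<Future<String>> futures = new ArrayList<>();
        for (int i = 0; i < clients; i++) {
            final int id = i;
            futures.add(pool.submit(() -> runQuery(id)));
        }
        int ok = 0;
        for (Future<String> f : futures) {
            try {
                if (f.get(5, TimeUnit.SECONDS).startsWith("result-")) ok++;
            } catch (Exception e) {
                // a failed or timed-out query counts as a concurrency failure
            }
        }
        pool.shutdown();
        return ok;
    }

    public static void main(String[] args) {
        System.out.println(runClients(4) + " of 4 clients completed");
    }
}
```

The load/data-volume tests follow the same shape: instead of fixing the client count, the test incrementally raises the load (or the number of MDRs) while recording response time, CPU, and memory at each step.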

Task Breakdown

This section includes the tasks required to complete this enhancement.

  1. Jimmy Mohsin has generated this page to address bugzilla 214576
  2. The COSMOS team needs to provide input to this page
  3. SrinivasReddy Doma, representing the QA team, is supposed to review and sign-off on these criteria

Open Issues/Questions

All reviewer feedback should go in the Talk page for 214576.

A formal review is required to agree the content and detail of the quality expectations laid out in the three 'quality perspective' tables. This is a three-step process:

  1. Review each quality expectation to agree whether it should be included.
  2. If a quality expectation is to be included, determine whether its acceptance criteria can be met within the r1.0 timeframe or must be postponed to a subsequent release.
  3. All acceptance criteria must be linked to an ER that delivers the required quality. This can be illustrated by linking the acceptance criteria table cell content to either other relevant wiki pages or an ER. If the link is to an ER, it should also state the ER's status (e.g. not started / WIP / completed).