Revision as of 21:35, 21 September 2007
DTP Connectivity Ganymede
Beside each entry, we will use one of the following acronyms:
- MH for Must Have
- LTH for Like To Have
For Ganymede, we will focus on Must Have (MH) items before Like to Have (LTH) items.
Things to Consider...
1) Ability to pre-populate a connection UI with driver definitions. (This already exists with the ability to auto-create a default instance of a particular driver template at startup.) (MH) (BZ entry: 202641)
2) Provide a common user interface for selecting existing connection profiles. Features of this UI should include filtering of the existing connections based on an arbitrary set of attributes, and it should display an extensible set of properties for the selected connection. (This already partly exists with the ability to host the DSE on a dialog page/composite/wizard/property/preference page where necessary.) (MH) (BZ entry: 202643)
3) Provide best practices for prompting users when to create new profiles, select existing profiles, and prompt the user for authentication information. Perhaps create a code sample to help demonstrate best practices. (MH) (BZ entry: 202644)
4) Ability to inject a runtime instance of a profile at startup to populate a profile in the DSE. This may be supported already with the repository code. (MH) (BZ entry: 202647)
5) Filtering needs to become much more extensible and flexible. Perhaps split out catalog-loader (SQL-level) filtering from client-level (viewer-level) filtering. Also provide the capability to use client-level filtering by default, if a db-specific extension did not implement catalog-loader filtering. (MH) (BZ entries: 200140, 177272, 199689)
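The fallback behavior proposed above can be sketched as follows. This is only an illustration, assuming a hypothetical CatalogLoader interface (the real catalog-loader API may differ): when a db-specific loader does not implement catalog-loader (SQL-level) filtering, the same filter is applied at the client (viewer) level instead.

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class FilterFallback {

    // Hypothetical stand-in for a db-specific catalog loader.
    interface CatalogLoader {
        boolean supportsSqlFiltering();
        // A null pattern means "load everything".
        List<String> loadTables(String sqlLikePattern);
    }

    static List<String> loadFiltered(CatalogLoader loader,
                                     String sqlLikePattern,
                                     Predicate<String> clientFilter) {
        if (loader.supportsSqlFiltering()) {
            // Push the filter down into the catalog loader (SQL level).
            return loader.loadTables(sqlLikePattern);
        }
        // Fallback: load everything, then filter at the client (viewer) level.
        return loader.loadTables(null).stream()
                     .filter(clientFilter)
                     .collect(Collectors.toList());
    }
}
```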
6) Provide extensible encryption capabilities for exporting profiles. Perhaps a new extension point to register an encryption method. Need to research different encryption methods. (LTH) (BZ entry: 202648 )
Possibly add a new extension point that points to a class and provides a name and description for the encryption provider. The encryption provider (IProfileEncryptionProvider) would implement two methods (similar to BIRT's IEncryptionHelper):
public String encrypt(String input);
public String decrypt(String output);
The current encryption/decryption method will be moved to this extension point and used as the default method.
When exporting via the Export dialog, the user can select an encryption provider from a drop-down list of the available providers (the drop-down will be disabled if only the default is available).
When importing, the Import routine will look at the file and (somehow) determine which encryption provider was used and decrypt accordingly or pop up a message indicating that the encryption method is not available.
We will need to change the internal ConnectionProfileMgmt class (which handles saving and loading profiles) so that it takes one of these encryption providers to handle encrypting and decrypting the streams.
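The proposed contract and default provider could look like the following sketch. IProfileEncryptionProvider is the name suggested above, not an existing DTP interface, and Base64 merely stands in for the current built-in method (it is an encoding, not real encryption):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ProfileEncryption {

    // The contract proposed above; the name follows the text,
    // this is not an existing DTP interface.
    interface IProfileEncryptionProvider {
        String encrypt(String input);
        String decrypt(String output);
    }

    // Illustrative default provider. Base64 only stands in for whatever the
    // current built-in method is; a real provider would use actual encryption.
    static class DefaultProvider implements IProfileEncryptionProvider {
        public String encrypt(String input) {
            return Base64.getEncoder()
                         .encodeToString(input.getBytes(StandardCharsets.UTF_8));
        }
        public String decrypt(String output) {
            return new String(Base64.getDecoder().decode(output),
                              StandardCharsets.UTF_8);
        }
    }
}
```

A registered provider would be looked up by name from the extension point and handed to ConnectionProfileMgmt when saving or loading.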
7) Provide plug-in wizards to aid driver template and connection profile development. (MH) (BZ entry: 151738)
8) Focus on sorting and see if we can come up with ways to make it work for the various levels of the DSE within the Platform's Common Navigator Framework (check whether any changes are upcoming for Ganymede). (LTH) (BZ entry: 150592)
9) Connection timeout such that connections in the DSE will be closed if unused for a certain amount of time specified in a preference page. This is a WTP feature that needs to be ported to DTP. It will probably need to be redesigned as the DSE in DTP is not exclusive to database connections and it may not make sense to timeout connections to other types of servers. (MH) (BZ entry: 202653)
Perhaps look at it so that each individual server type implements an extension point (perhaps an extension on the CP extension point), listens in a server-specific way for disconnects, and then triggers a reconnect. It will be difficult to handle in the generic case.
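The ported timeout might track a last-used timestamp per connection and close it from a periodic check. The names and structure below are illustrative, not existing DTP or WTP API:

```java
public class IdleTimeout {

    // Minimal stand-in for anything the DSE can close; illustrative only.
    interface ManagedConnection {
        void close();
    }

    static class TimedConnection {
        private final ManagedConnection conn;
        private final long timeoutMillis; // would come from the preference page
        private volatile long lastUsed = System.currentTimeMillis();
        private volatile boolean closed;

        TimedConnection(ManagedConnection conn, long timeoutMillis) {
            this.conn = conn;
            this.timeoutMillis = timeoutMillis;
        }

        // Call whenever the connection is used, to reset the idle clock.
        void touch() {
            lastUsed = System.currentTimeMillis();
        }

        boolean isClosed() {
            return closed;
        }

        // Driven periodically, e.g. by a ScheduledExecutorService; closes the
        // connection once it has been idle for the configured period.
        void closeIfIdle(long now) {
            if (!closed && now - lastUsed >= timeoutMillis) {
                conn.close();
                closed = true;
            }
        }
    }
}
```

Server types that should never time out (the non-database case mentioned above) would simply not be wrapped this way.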
10) Provide enablement support for JNDI connections for getting a pooled JDBC connection through a JNDI service. Check out Apache Tomcat 5.5.x and BEA WebLogic 8.1 as examples. Both are fairly simple to use and configure. However, Tomcat does not support client-side access, which we will need in the DSE. (LTH) (BZ entry: 202640)
-------------------Bugs for Consideration (Start)-------------------
12) Connection profile with commas in its driver file names will fail (LTH) (BZ entry: 202338)
13) Default DB location should be platform-specific (LTH) (BZ entry: 201682)
14) Need a way to access connection information for a table in the DSE (MH) (BZ entry: 198000)
15) Usability items for Driver Management/Editing (LTH) (BZ entry: 164534)
16) Display properties for SQL objects when selected in the DSE (LTH) (BZ entry: 154169)
17) Detect connection profile name collisions only within the repository (MH) (BZ entry: 200774)
18) Connection is already closed when aboutToClose() is called (MH) (BZ entry: 199509)
20) Need the ability to repurpose DSE content/actions for other viewers (MH) (BZ entry: 194859)
21) Preference pages are inconsistent (MH) (BZ entry: 193791)
22) org.eclipse.datatools.connectivity.db.generic has a dependency on UI (MH) (BZ entry: 192828)
23) Need a mechanism for supporting migration of driver definitions (MH) (BZ entry: 184807)
-------------------Bugs for Consideration (End)-------------------
24) Refactor Generic JDBC Connection profile plug-ins so that the Generic JDBC Connection profile can be excluded from the new connection wizard by adopters. (MH) (BZ entry: 203837)
25) The filtering support should handle multiple predicates that are ANDed together so that more complex filters can be specified. (LTH) (BZ entry: 203834)
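Item 25's ANDed predicates can be sketched with standard java.util.function.Predicate composition; the helper below is illustrative, not existing DTP API:

```java
import java.util.List;
import java.util.function.Predicate;

public class CompositeFilter {

    // Combine an arbitrary set of filter predicates into one filter that
    // matches only when every predicate matches (logical AND).
    // An empty list yields a filter that matches everything.
    static Predicate<String> allOf(List<Predicate<String>> predicates) {
        return predicates.stream().reduce(x -> true, Predicate::and);
    }
}
```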
26) There needs to be a strict separation between plug-ins with UI dependencies and those without. Adopters need to access core functionality such as connection profiles, driver templates, connection management, DDL generators, and catalog loaders from command-line tooling or in products without the UI components installed. Use cases include programmatically creating a connection to a database, browsing the catalog metadata, generating DDL, and executing that DDL. As a connectivity requirement, this is specifically focused on the generic JDBC and Derby plug-ins; however, it also has implications for the enablement project. (MH) (BZ entry: 203158)
27) The version decoration that appears on the connection nodes in the DSE should allow vendors to format product and version information in a format that is meaningful to the vendor's customers. The current three-part version number is too restrictive and should be optional. (MH) (BZ entry: 203829)
Open Data Access (ODA)
1) Integrates with Database (JDBC) connection profile category (MH)
- The ODA framework currently integrates with the Connection Profile framework to provide a uniform way to connect to heterogeneous data sources. It does not directly integrate with the Database (JDBC) profile category. This integration project aims to provide out-of-the-box support for the DTP JDBC connection profile types, and to enable an ODA-compliant consumer application to utilize the specialized RDBMS support available in the Enablement projects.
2) Initial Integration with the SQL Query Builder and SQL Query Model for use by ODA consumer applications (MH)
- The SQL Query Builder planned for Ganymede is a powerful tool that allows users the freedom to create queries against a relational data source without detailed knowledge of SQL syntax. The same tool also provides the option to edit the SQL query text directly. Any edit made in either the textual or graphical pane is automatically reflected in the other.
- This integration project plans to provide ODA adapters for the SQL Query Builder and SQL Query Model. With the adapters, an ODA-compliant consumer application can consume the builder, as well as other custom ODA providers, through the same ODA API.
3) Support for multi-dimensional result sets (LTH)
- The ODA framework currently provides support for a consumer application to retrieve and process data in the form of rows and columns. An ODA data provider for a multi-dimensional data source would have to flatten its data into a tabular result set. That works well for those ODA-compliant consumer applications that want to consume tabular data sets. However, there are other types of applications, such as OLAP tools, that would prefer to consume data in a multi-dimensional result set. This project aims to extend the ODA API to provide design-time and runtime support of multi-dimensional result sets.
High-Level Grouping
Bugs: 11, 14, 17, 18
General UI: 2, 16, 21, 27
Migration: 1, 4, 6, 12, 13, 23
New Functionality: 9, 10, 25
Refactoring: 20, 22, 24, 26
Usability: 3, 5, 7, 8, 15, 19
Must Haves (21): 1, 2, 3, 4, 5, 7, 9, 11, 14, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, ODA-1, ODA-2
Like To Haves (9): 6, 8, 10, 12, 13, 15, 16, 25, ODA-3