Re: [Geotools-devel] [udig-devel] Thoughts on catalog api
Jody Garnett wrote:
Justin Deoliveira wrote:
I believe I just did. However, I am referring to the GeoTools catalog,
not the uDig one.
Justin, can we keep these in sync? The last thing I want to do is
maintain two ... right now we have three. Consolidation is good.
Not sure about making them mutable; I would rather they were a nice
view of information already maintained by the "data source" (say in a
header or a metadata table), and such things are not always
changeable - but perhaps an "override" should be allowed? Things like
publisher or abstract come first from GeoServer's needs (think of
GeoServer's FeatureTypeInfo class) and then from the "dublin core"
metadata standard, and are very useful when searching...
So mutable - if you can get me an "info" interface at the same level
as DataStore, then publisher & abstract - heck yes.
Sorry, this rant didn't quite make sense to me. How is the "override"
achieved?
Okay, so I am not sure what you need to do. Let's try again ...
- Do you intend to edit the metadata (and have it stored back to, say,
the oracle metadata tables)?
- Do you intend to override the metadata (and have, say, the bounding
box "override" some incorrect values from a WMS)?
Fair enough, allow me to explain. In GeoServer we have metadata
interfaces for DataStores and FeatureTypes, called FeatureTypeInfo and
DataStoreInfo (I believe you might be familiar with them ;)). They map
very closely to the ServiceInfo and GeoResourceInfo interfaces.
However, all the implementations pull this metadata either from the
datastore, whose interface for metadata is quite limited, or out of
thin air. This doesn't work for GeoServer, since it *is* the publisher
of the data and needs to be able to override this metadata.
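To make the "override" idea concrete, here is a very rough sketch of
what I have in mind - the names are made up, not the real GeoServer or
GeoTools interfaces:

    import java.util.HashMap;
    import java.util.Map;

    // Rough sketch, made-up names: the info reported by the datastore
    // is wrapped, and the publisher (GeoServer) can override fields.
    interface ResourceMetadata {
        String getTitle();
        String getAbstract();
        String getPublisher();
    }

    class OverridingMetadata implements ResourceMetadata {

        private final ResourceMetadata fromDataStore; // what the datastore reports
        private final Map overrides = new HashMap();  // locally configured values

        OverridingMetadata(ResourceMetadata fromDataStore) {
            this.fromDataStore = fromDataStore;
        }

        void override(String field, String value) {
            overrides.put(field, value);
        }

        public String getTitle() {
            return overrides.containsKey("title")
                ? (String) overrides.get("title") : fromDataStore.getTitle();
        }

        public String getAbstract() {
            return overrides.containsKey("abstract")
                ? (String) overrides.get("abstract") : fromDataStore.getAbstract();
        }

        public String getPublisher() {
            return overrides.containsKey("publisher")
                ? (String) overrides.get("publisher") : fromDataStore.getPublisher();
        }
    }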
Does that make it any clearer? And I must say, I am not too sure that
mutability is the answer either; it was just the first logical
solution that came to mind.
The next solution is the "adaptable" idea from Eclipse. Basically I
could adapt into my specific GeoServer interfaces and hack them
directly, initially populating them with the metadata provided by the
datastore. As an aside, I have started this hacking and will need a
code review soon.
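Roughly what I am picturing, borrowing Eclipse's adaptable idea -
again just a sketch with made-up names, not the real Eclipse or uDig
API:

    // Sketch only: an Eclipse-style "adaptable" handle, made-up names.
    interface Adaptable {
        // Returns an adapter of the requested type, or null if this
        // handle cannot adapt to it.
        Object adapt(Class adapter);
    }

    // Stand-in for a GeoServer-specific config interface (the real
    // FeatureTypeInfo has a lot more to it).
    interface EditableInfo {
        String getAbstract();
        void setAbstract(String abs);
    }

    class AdaptableHandle implements Adaptable {

        private final EditableInfo geoserverInfo; // initially populated from the datastore

        AdaptableHandle(EditableInfo geoserverInfo) {
            this.geoserverInfo = geoserverInfo;
        }

        public Object adapt(Class adapter) {
            if (adapter.isInstance(geoserverInfo)) {
                return geoserverInfo;
            }
            return null; // cannot adapt to the requested type
        }
    }

    // Client code then adapts the generic handle and hacks it directly:
    //
    //   EditableInfo info = (EditableInfo) handle.adapt(EditableInfo.class);
    //   if (info != null) {
    //       info.setAbstract("locally configured abstract");
    //   }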
As for how these things are achieved, we can play the same kind of
lazy loading or interceptor bean tricks that Eclipse or Spring do.
Return a GeoResourceInfo object that returns cached or locally
modified values, and compare and contrast with real data fetched in a
lazy fashion from the real data source.
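Something like this sketch (made-up names again) is what I mean by the
lazy loading approach:

    // Sketch, made-up names: return cached or locally modified values,
    // and only go back to the real data source lazily when asked for
    // something we don't already have.
    interface InfoFetcher {
        // Hypothetical callback onto the real data source (a file
        // header, the oracle metadata tables, a remote service, ...).
        String fetchTitle();
    }

    class LazyResourceInfo {

        private final InfoFetcher fetcher;
        private String title;       // cached or locally overridden value
        private boolean titleKnown; // has title been set or fetched yet?

        LazyResourceInfo(InfoFetcher fetcher) {
            this.fetcher = fetcher;
        }

        // Local override: no trip to the data source.
        synchronized void setTitle(String title) {
            this.title = title;
            this.titleKnown = true;
        }

        // Returns the override/cached value if we have one, otherwise
        // fetches from the real data source the first time it is asked.
        synchronized String getTitle() {
            if (!titleKnown) {
                title = fetcher.fetchTitle();
                titleKnown = true;
            }
            return title;
        }
    }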
Note that some datastores already have an "api" for editing metadata
... the oracle datastore and shapefile are the two I know of. There is
no unified solution, but I imagine a mutable Dublin Core or ISO
interface could be provided.
Not sure this rant is much better; get specific, what exactly do you
need? GeoTools used to have a DataStoreInfo (mutable) that was created
from DataStoreFactorySPI and operated "beside"/"at the same level as"
DataStore. The idea being that you may need the metadata before you
need to access the data...
Note that most uDig GeoResourceInfo implementations make use of a
custom DataStore in which the connection pool, or cached capabilities
information, has been made available via a public accessor...
I don't know; something about having to create custom datastores for
my application turns me off.
My use case was more along the lines of searching for handles which
resolve to a specific object. Searching a complex tree of objects like
the catalog screams visitor to me. It allows for more flexibility,
imho.
Afraid we cannot do this one; if it were a local catalog, yes, but
this interface should represent a remote catalog as well. The catalog
spec (like WFS 1.1) allows for a getObject request that we could use
to ask for an *exact* resource via an ID.
Isn't this exactly the reason why we need it? To abstract the fact
that the catalog may be local or remote. I thought even with a remote
catalog you still have a local handle to it? I thought that was the
point of this API: to provide local "handles" which can be resolved
into remote resources. The visitor would simply visit the local
handles, and have the option to resolve if necessary.
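For what it's worth, the kind of thing I was imagining (a sketch with
made-up names) - the visitor walks the local handles and only resolves
to the possibly remote resource when it decides it needs to:

    import java.io.IOException;

    // Sketch, made-up names: local catalog handles accept a visitor,
    // and resolving to the (possibly remote) resource is optional.
    interface CatalogVisitor {
        void visit(CatalogHandle handle);
    }

    interface CatalogHandle {
        String getIdentifier();              // cheap, local information
        void accept(CatalogVisitor visitor); // walk this handle
        Object resolve() throws IOException; // possibly expensive / remote
    }

    // Example: find the resource with a given id, resolving only that one.
    class FindById implements CatalogVisitor {

        private final String id;
        private Object found;

        FindById(String id) {
            this.id = id;
        }

        public void visit(CatalogHandle handle) {
            if (found == null && id.equals(handle.getIdentifier())) {
                try {
                    found = handle.resolve(); // only now do we go remote
                } catch (IOException e) {
                    // a real implementation would report this properly
                }
            }
        }

        Object getFound() {
            return found;
        }
    }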
Perhaps I am missing something on this one. I don't have the same
background knowledge of the catalog requirements you guys do, so
please tell me to shut up if need be :). This one isn't really a
showstopper for me, so I am fine with whatever we come up with; it was
just an idea which might make the API easier to use.
So this is a case where we need to think rather than be flexible.
Jody
--
Justin Deoliveira
The Open Planning Project
jdeolive@xxxxxxxxxxxxx