Re: [udig-devel] loading browser-based map sites in udig as base/background layer?

Hey all,

On Sat, 2008-01-05 at 03:12 -0700, Eric Jarvies wrote:
> On Jan 4, 2008, at 6:21 PM, Jody Garnett wrote:
> > Eric Jarvies wrote:
> >>> A license agreement.

I haven't followed that, but we *know* there must be issues there.

> in Mexico, if you take and view street-level and sat images in hybrid  
> mode, you will see that street-level roads are several meters off in  
> most cases.  

Yes. 

If you want a one-off solution to your problem, you should be able to
play with Jan's rubber-sheeting algorithm to get what you need. It's in
the udig community space last I looked.
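
I don't have Jan's code in front of me, so purely to illustrate the
idea, here is a toy inverse-distance-weighted warp in Java. Every name
and number in it is made up, and real rubber-sheeting implementations
usually triangulate the control points rather than using IDW, so treat
it as a sketch only:

// Minimal rubber-sheeting sketch: inverse-distance-weighted displacement.
// Hypothetical, not Jan's algorithm; coordinates assumed planar (projected).
public class RubberSheet {
    // control[i] = {srcX, srcY, dstX, dstY}
    private final double[][] control;

    public RubberSheet(double[][] control) {
        this.control = control;
    }

    /** Warp a point by the IDW average of the control displacements. */
    public double[] warp(double x, double y) {
        double wSum = 0, dx = 0, dy = 0;
        for (double[] c : control) {
            double d2 = (x - c[0]) * (x - c[0]) + (y - c[1]) * (y - c[1]);
            if (d2 == 0) return new double[] {c[2], c[3]}; // exactly on a control point
            double w = 1.0 / d2; // inverse-square weighting
            wSum += w;
            dx += w * (c[2] - c[0]);
            dy += w * (c[3] - c[1]);
        }
        return new double[] {x + dx / wSum, y + dy / wSum};
    }

    public static void main(String[] args) {
        // Two made-up control points, each shifted ~90 m east.
        RubberSheet rs = new RubberSheet(new double[][] {
            {1000, 1000, 1090, 1000},
            {2000, 2000, 2090, 2000}
        });
        double[] p = rs.warp(1500, 1500);
        System.out.printf("%.1f, %.1f%n", p[0], p[1]); // ~1590.0, 1500.0
    }
}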


If you want more...

One of the first *useful* data structures we could build would be a
'global, piece-wise correction layer', constructed with the
rubber-sheeting algorithm that Jan gave us.
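
For concreteness, one hypothetical shape for such a layer is a sparse
grid of per-cell offset vectors, looked up (and eventually interpolated)
by cell. Nothing below exists in uDig; it is just a sketch of the data
structure:

import java.util.HashMap;
import java.util.Map;

// Sketch of a global, piece-wise correction layer: a sparse grid of
// per-cell offset vectors, keyed by cell index. Entirely hypothetical.
public class CorrectionLayer {
    private final double cellSizeDeg;            // grid resolution in degrees
    private final Map<Long, double[]> offsets = new HashMap<>();

    public CorrectionLayer(double cellSizeDeg) {
        this.cellSizeDeg = cellSizeDeg;
    }

    private long key(double lon, double lat) {
        long col = (long) Math.floor((lon + 180.0) / cellSizeDeg);
        long row = (long) Math.floor((lat + 90.0) / cellSizeDeg);
        return row * 1_000_000L + col; // cheap composite key
    }

    /** Record the measured (dLon, dLat) correction for the cell containing this point. */
    public void put(double lon, double lat, double dLon, double dLat) {
        offsets.put(key(lon, lat), new double[] {dLon, dLat});
    }

    /** Apply the local correction, or none if the cell is unmeasured. */
    public double[] correct(double lon, double lat) {
        double[] off = offsets.getOrDefault(key(lon, lat), new double[] {0, 0});
        return new double[] {lon + off[0], lat + off[1]};
    }

    public static void main(String[] args) {
        CorrectionLayer layer = new CorrectionLayer(0.5);
        layer.put(38.7, 9.0, -0.0008, -0.0004); // made-up Ethiopia-ish offset
        double[] fixed = layer.correct(38.71, 9.01);
        System.out.printf("%.4f, %.4f%n", fixed[0], fixed[1]);
    }
}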

This is a *great* project if we can do it in the spirit of the
OpenStreetMap project, and it has been on my long-term horizon for a
while. I've first been tackling the 'get a working GIS software system'
question.

For example, I have good differential GPS data for this tiny little
corner of Ethiopia which shows me the error in the global imagery data
in that particular location (JPL's Landsat data is off approximately 90
meters in the NW-by-N direction). My data set, because it's
differential, has very high internal consistency, which lets me use lots
of visible features (e.g. road intersections) to estimate the offset. The
data set is all based on a control point whose location *I don't
know*. However, I logged enough points (10,000 or so) at that control
point to get a nearly perfect bell curve whose center can function as my
'control point' and anchor my data. More importantly,
because I have data over a long enough term, I can estimate the accuracy
of my estimated anchor. (Take stratified random subsets of the data to
avoid the weekly auto-correlations, then see how far the different
estimates are from each other.) It appears to be within a river width
which is good enough for ecology.
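
Here is a toy version of that anchor estimate in Java, with entirely
synthetic numbers standing in for my logs: the anchor is the mean of all
fixes, and its stability is gauged by the spread of the means of
week-stratified random subsets:

import java.util.Random;

// Toy sketch of the anchor-accuracy estimate: synthetic GPS fixes around a
// true control point, anchor = overall mean, accuracy = spread of the means
// of week-stratified random subsets. All numbers are made up.
public class AnchorEstimate {
    public static void main(String[] args) {
        Random rnd = new Random(42);
        int weeks = 52, fixesPerWeek = 200;
        double trueLat = 9.0, trueLon = 38.7;
        double[][] lat = new double[weeks][fixesPerWeek];
        double[][] lon = new double[weeks][fixesPerWeek];
        for (int w = 0; w < weeks; w++) {
            // a weekly bias mimics the auto-correlation within a week
            double biasLat = rnd.nextGaussian() * 2e-5;
            double biasLon = rnd.nextGaussian() * 2e-5;
            for (int i = 0; i < fixesPerWeek; i++) {
                lat[w][i] = trueLat + biasLat + rnd.nextGaussian() * 1e-4;
                lon[w][i] = trueLon + biasLon + rnd.nextGaussian() * 1e-4;
            }
        }

        // Anchor: mean of everything.
        double aLat = 0, aLon = 0;
        for (int w = 0; w < weeks; w++)
            for (int i = 0; i < fixesPerWeek; i++) { aLat += lat[w][i]; aLon += lon[w][i]; }
        aLat /= weeks * fixesPerWeek;
        aLon /= weeks * fixesPerWeek;

        // Stability: several subsets, each taking one random fix per week,
        // so no subset leans on any single week's bias.
        int subsets = 20;
        double maxDev = 0;
        for (int s = 0; s < subsets; s++) {
            double mLat = 0, mLon = 0;
            for (int w = 0; w < weeks; w++) {
                int i = rnd.nextInt(fixesPerWeek);
                mLat += lat[w][i]; mLon += lon[w][i];
            }
            mLat /= weeks; mLon /= weeks;
            maxDev = Math.max(maxDev, Math.hypot(mLat - aLat, mLon - aLon));
        }
        System.out.printf("anchor %.6f, %.6f; max subset deviation %.6f deg%n",
                aLat, aLon, maxDev);
    }
}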

We may be able to take a similar approach with consumer-grade equipment.


> as far as messing with accuracy ... who cares, 

Me, because it's not hard, and anything else is 'wanking' as the Brits
say. Accuracy is not so hard in a global community because we have (1)
time and (2) personnel. We can gather positional data with *measurable*
accuracy by logging over the long term (say 2-3 years). We can use
consumer GPS units to accumulate lots of readings for the same
points/trips. 
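
The accumulation side is trivial; something like a running mean per
named point would do (names and structure below are hypothetical), with
the estimate tightening roughly like 1/sqrt(n) for independent fixes:

import java.util.HashMap;
import java.util.Map;

// Sketch: accumulate repeated consumer-GPS readings per named point as a
// running mean. Hypothetical names and structure.
public class PointAccumulator {
    private static class Acc { long n; double lat, lon; }
    private final Map<String, Acc> points = new HashMap<>();

    public void addFix(String pointId, double lat, double lon) {
        Acc a = points.computeIfAbsent(pointId, k -> new Acc());
        a.n++;
        a.lat += (lat - a.lat) / a.n; // incremental mean
        a.lon += (lon - a.lon) / a.n;
    }

    public double[] estimate(String pointId) {
        Acc a = points.get(pointId);
        return a == null ? null : new double[] {a.lat, a.lon};
    }

    public static void main(String[] args) {
        PointAccumulator acc = new PointAccumulator();
        acc.addFix("junction-17", 9.0001, 38.6998); // made-up readings
        acc.addFix("junction-17", 8.9999, 38.7002);
        double[] e = acc.estimate("junction-17");
        System.out.printf("%.5f, %.5f%n", e[0], e[1]); // 9.00000, 38.70000
    }
}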


So if you are ambitious, you could start up a project along those lines.
(If not, wait a decade or so and I may get around to it.) What is needed
is a good protocol for 'how to add a control point to the data set' and
some infrastructure to log the user-generated data. That is, you just need
to build a geographic information system using uDig as one of the
software components.
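
To make the 'protocol' concrete, even an append-only CSV of raw fixes,
one record per reading, would do for a start. The file name and fields
below are hypothetical:

import java.io.FileWriter;
import java.io.IOException;
import java.time.Instant;

// Sketch of the logging side of a control-point protocol: append one raw,
// unprocessed record per fix so estimates can be recomputed later.
public class ControlPointLog {
    private final String path;

    public ControlPointLog(String path) {
        this.path = path;
    }

    public void append(String pointId, double lat, double lon,
                       String device, double hdop) throws IOException {
        try (FileWriter out = new FileWriter(path, true)) { // append mode
            out.write(String.format("%s,%s,%.7f,%.7f,%s,%.1f%n",
                    Instant.now(), pointId, lat, lon, device, hdop));
        }
    }

    public static void main(String[] args) throws IOException {
        ControlPointLog log = new ControlPointLog("control-points.csv");
        log.append("addis-bridge-east", 9.0312345, 38.7654321, "etrex-legend", 2.4);
    }
}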

--adrian


