
Re: [geogig-dev] PostGIS Editing Workflow

Hello Felipe,
thanks for reaching out to us.

I am glad you guys have implemented such a workflow and that it's working for you.

I know it's not optimal in terms of user-friendliness.
Unfortunately, it's not easy to implement an easier workflow without more tools that wrap geogig. Also, from the database point of view, one of geogig's premises is to be unobtrusive: we don't want to force changes to your PostGIS schema (triggers, extra table fields, etc.) just to support the tool that tracks your data's history/lineage.
That doesn't mean it can't be done. We do add triggers and extra tables to the geopackages created with the "gpkg export -i" command, which lets us import only what has changed back into the repository once the field work is done. We don't do the same for PostGIS databases because those are generally more sensitive and DBAs don't like an external tool touching their databases. But potentially it could be done.

Which actually makes me wonder: would you like to try that approach? The field data collectors would use a geopackage with geogig's interchange format enabled to do the field work, then synchronize back to the repository once back at the office.
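Roughly, that round trip would look like the following. The export flag is the one I mentioned above; the exact syntax for pulling the changes back may differ in your geogig version (check the gpkg command help), and the layer/path names are just placeholders, so take this as a sketch:

    # export a layer to a geopackage with the interchange format
    # (audit tables + triggers) enabled
    geogig --repo <repo-uri> gpkg export -i roads roads.gpkg

    # ... field crew edits roads.gpkg in QGIS; the interchange
    # triggers record every edit ...

    # back at the office, import only the audited changes as a commit
    # (options omitted; see the help for setting the commit message)
    geogig --repo <repo-uri> gpkg pull roads.gpkg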

Either way, your workflow seems just fine (albeit unavoidably cumbersome until we manage to provide better tools than the command line).

Question: have you tried geogig 1.2.0 yet? We've made extensive changes to improve the performance of local-to-local (i.e. non-http) remote interactions, which should speed up fetch/clone/push/pull considerably in your case. We'd love to hear back from someone like you using it in real life, especially if you find a problem/defect with it.
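For reference, "local to local" means remotes addressed directly by their repository URI instead of through an http server. For PostgreSQL-backed repos like yours that would be something along these lines (the URI layout follows the postgres storage docs; host, database, and credentials here are made up, adjust to your setup):

    # clone the central repo directly, no geoserver or geogig serve needed
    geogig clone \
      "postgresql://dbserver/geogig/central?user=geogig&password=***" \
      "postgresql://dbserver/geogig/felipe?user=geogig&password=***"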

Hope that helps, best regards and feel free to keep in touch through this list.

Gabriel.


On Fri, Nov 3, 2017 at 6:54 PM, Felipe Diniz <diniz.ime@xxxxxxxxx> wrote:
Hello,

I am quite new to Geogig, and I would like to know if the way I am working with it is optimal (or at least not dumb).

We collect data in the field with QGIS 2.18 + PostGIS, and at the end of the day each user commits to his own repository (a PostgreSQL-backed repository). The commits are made on a branch that only that user uses.
Those repositories are clones of a central repository for the project, which stays at a base.

The commit is done directly using geogig --repo ... commit, so no GeoServer or geogig serve is used in the process. The commit sequence is pg import > add > commit.
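Concretely, each user runs something like this (the database and table names are just an example of ours; host/port/user connection flags omitted):

    geogig --repo <user-repo-uri> pg import --database field_db --table roads
    geogig --repo <user-repo-uri> add
    geogig --repo <user-repo-uri> commit -m "field edits, 2017-11-03"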

When a group of users returns to the base, they use geogig --repo ... push to the central repository. After that, a data manager merges the users' branches into the master branch.
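In commands, that is roughly (branch names are examples):

    # each user, back at the base
    geogig --repo <user-repo-uri> push origin felipe_edits

    # the data manager, on the central repository
    geogig --repo <central-repo-uri> checkout master
    geogig --repo <central-repo-uri> merge felipe_edits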

After all the users have committed and everything is merged, each user does a geogig pull, creates a skeleton database (only the schema, no data), and uses pg export to load the data back into PostGIS.
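That last step looks something like this (pg export takes the feature type path and the target table; connection flags omitted again, and the names are illustrative):

    geogig --repo <user-repo-uri> pull origin

    # skeleton database already created with the schema, no data
    geogig --repo <user-repo-uri> pg export --database skeleton_db roads roads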

There are a lot of steps in this process, and I am sure there is a way to make things easier. It would be really great to hear some advice!

Thanks,
Felipe Diniz
Brazilian Army 

_______________________________________________
geogig-dev mailing list
geogig-dev@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://dev.locationtech.org/mailman/listinfo/geogig-dev



--

Gabriel Roldán
Software Developer | Boundless
groldan@xxxxxxxxxxxxxxxx
@boundlessgeo


