[geogig-dev] Notes about import and sync operations from QGIS plugin
Hi all,
Following the discussion we had yesterday, here are some ideas for
adapting the web API to support the sync process in a robust way. This
is basically how I figured the process would work on the client side,
from the QGIS plugin, and we should incorporate the missing elements to
support alphanumeric feature ids and the few other little things we
talked about yesterday.
I'm writing down all the things that I think are not yet resolved, so
it gives you ideas to work on and we can discuss them.
IMPORTING A LAYER
There are several problems when importing a layer:
- The features in the geopackage have no valid ids. These should be
assigned by geogig after the import and returned, so the geopackage can
be modified to include them. We have to discuss how to add them to the
geopackage (in the layer table, in an additional table...); one
possible layout is sketched after this list.
- The original geopackage has no audit tables and triggers. These can
be created by QGIS itself, or we can find an alternative way. Not
really a showstopper, since we can still have the full workflow
(uploading the full geopackage on each update, instead of just the
changes in the audit tables), but that wouldn't be very efficient. The
sketch after this list also shows how the plugin could create them.
- What happens if the layer to import is in a geopackage with several
tables? Should we upload the full geopackage and tell geogig to use
only one table? Maybe export that table to a second, temporary
geopackage and import that one (while adding the triggers and audit
tables to the original)? A one-line way of doing that export is shown
below as well.
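To make the first two points more concrete, here is a minimal sketch of
what the plugin could do with plain sqlite3 (a geopackage is just an
SQLite file). All the names used here (geogig_fid_mapping, the
<layer>_audit table, the fid column) are placeholders I made up for
illustration, not an agreed format:

    import sqlite3

    def prepare_geopackage(gpkg_path, layer):
        conn = sqlite3.connect(gpkg_path)
        cur = conn.cursor()
        # Side table mapping the local integer fids to the alphanumeric
        # feature ids that geogig assigns and returns after the import.
        cur.execute("""CREATE TABLE IF NOT EXISTS geogig_fid_mapping (
                           table_name TEXT NOT NULL,
                           gpkg_fid   INTEGER NOT NULL,
                           geogig_fid TEXT NOT NULL,
                           PRIMARY KEY (table_name, gpkg_fid))""")
        # Audit table recording local edits, so that syncing can upload
        # just the changes instead of the whole geopackage.
        cur.execute("CREATE TABLE IF NOT EXISTS {0}_audit (fid INTEGER, "
                    "operation TEXT, changed_at TEXT)".format(layer))
        for op in ("INSERT", "UPDATE", "DELETE"):
            ref = "OLD" if op == "DELETE" else "NEW"
            cur.execute("CREATE TRIGGER IF NOT EXISTS {0}_audit_{1} "
                        "AFTER {1} ON {0} BEGIN "
                        "INSERT INTO {0}_audit "
                        "VALUES ({2}.fid, '{1}', datetime('now')); "
                        "END".format(layer, op, ref))
        conn.commit()
        conn.close()

For the third point, exporting a single table to a temporary geopackage
is a one-liner with the GDAL Python bindings, which may be easier than
teaching geogig to pick one table out of many:

    from osgeo import gdal
    gdal.VectorTranslate('/tmp/single_table.gpkg', gpkg_path,
                         format='GPKG', layers=[layer])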
SYNCING A LAYER
This is the current mechanism (not fully working yet, since the
endpoints do not have all the required functionality). Basically, the
process manages the combination of changes before actually uploading,
so it makes sure that the import operation won't cause any data loss. A
sketch of the loop in code follows the list.
1) We have a layer with some changes (so it contains records in its
audit table).
2) The diff between the commit represented by the geopackage (before
the local changes) and the current upstream head is fetched.
3) The diff is compared with the local changes. If a feature has been
modified in both places, a conflict is signaled and has to be fixed
manually. The remote changes are then applied to the local geopackage,
together with the additional changes defined while solving the
conflicts.
4) We go back to 2) until there are no conflicts.
5) If the diff doesn't imply any conflict (no feature has been modified
both upstream and locally), the remote changes are applied to the
geopackage, and the local changes are imported upstream. The import
operation should return the commit id of the head after the import is
done, so the commit id stored in the geopackage can be updated. The
audit table is cleaned.
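In code, the loop above could look roughly like this on the plugin
side. The gpkg object and the fetch_diff/import_changes functions are
hypothetical stand-ins for the web API calls that still have to be
defined; this shows the control flow only, not a real implementation:

    def find_conflicts(local, remote):
        # A feature modified both locally and upstream is a conflict.
        return set(local) & set(remote)

    def sync_layer(gpkg, layer, fetch_diff, import_changes):
        while True:
            local = gpkg.local_changes(layer)                 # step 1
            remote = fetch_diff(layer, since=gpkg.commit_id)  # step 2
            conflicts = find_conflicts(local, remote)         # step 3
            if not conflicts:
                break
            # Conflicts are fixed manually; the resolved values are
            # written to the geopackage together with the remote
            # changes, and we go back to step 2.
            gpkg.resolve_and_apply(conflicts, remote)         # step 4
        gpkg.apply(remote)                        # step 5: no conflicts
        new_head = import_changes(layer, local)   # import local changes
        gpkg.commit_id = new_head    # update the gpkg commit reference
        gpkg.clear_audit_table(layer)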
So, in short, the mechanism just does an import, but making sure in
advance that it won't overwrite changes made by a different user. The
logic for checking this is not in the geogig instance (since we are not
using a push operation, just a normal import), but on the QGIS side
instead.
All of this has to be modified to add the handling of feature ids with
a mapping, as we discussed; a rough idea of what that could look like
is below.
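As a rough idea, and reusing the (made-up) geogig_fid_mapping table
from the import sketch above, translating local fids before uploading
could be as simple as:

    import sqlite3

    def geogig_ids_for(gpkg_path, table, gpkg_fids):
        # Look up the alphanumeric geogig ids for a set of local integer
        # fids; a missing entry (None) means the feature is new locally.
        conn = sqlite3.connect(gpkg_path)
        rows = conn.execute("SELECT gpkg_fid, geogig_fid "
                            "FROM geogig_fid_mapping "
                            "WHERE table_name = ?", (table,)).fetchall()
        conn.close()
        mapping = dict(rows)
        return {fid: mapping.get(fid) for fid in gpkg_fids}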
I'm not sure if there is a way of delegating all of this upstream,
since there is no other endpoint for uploading a layer, as far as I
have seen. There are push/pull operations, but since I do not have a
local repo, just a geopackage, I guess I cannot use them...
Let me know if you need more detail.
Please keep me posted on all the changes you introduce, so I can test
them and adapt the QGIS plugin accordingly.
Cheers
--
Victor Olaya
Software Engineer | Boundless
volaya@xxxxxxxxxxxxxxxx
@boundlessgeo